  • 00:04

    [Studying Automation in Everyday Life Using Surveys and Computational Methods]

  • 00:09

    AARON SMITH: Hello, my name is Aaron Smith. [Aaron Smith, Associate Director, Pew Research Center] I am an associate director for internet and technology research at the Pew Research Center. So in my role at Pew, I wear a number of different hats. I have my own research portfolio. And I also represent the center in the media and in various public venues. As an associate director, I also am

  • 00:32

    AARON SMITH [continued]: responsible for the day-to-day management of our research process. So that involves everything from planning our project calendar, working with our other researchers at various stages of their own projects to make sure that things are progressing OK and that they have the support that they need. And also, working on our longer term strategy with our leadership here.

  • 00:53

    AARON SMITH [continued]: So I wear a lot of different hats that involve both day-to-day research and also management of our team here. In terms of my background, I'm a little bit unique in terms of the folks who typically work here. I have an undergraduate degree in liberal arts. I have a master's degree in public policy. And after my master's degree, wound up doing market research

  • 01:16

    AARON SMITH [continued]: in the private sector. And found that I really enjoyed both the qualitative and quantitative aspect of that type of work. I got to work with numbers, but also go out and talk about them and present them to clients. And found that to be quite engaging. And I followed the work of the Pew Research Center for a long time. And when a position here opened up, I jumped at that.

  • 01:40

    AARON SMITH [continued]: And 11 years later, I am now-- I'm sitting in the position that I'm sitting in. [What is your current research focus?] At the Pew Research Center, my team's research focus is on the social impact of the internet and other digital technologies that influence the ways people live, work, and get information

  • 02:03

    AARON SMITH [continued]: about the world around them. In terms of specific issue areas that we study, one branch of our work looks at access to technology. And specifically, how lack of access to devices or other tools might hinder people in their ability to get news, access information about the world, or engage in important decision making.

  • 02:24

    AARON SMITH [continued]: Another aspect of our work focuses on people's attitudes towards technology. So one example of that is our longstanding privacy portfolio, and examining how Americans' attitudes towards privacy issues have shifted and evolved in a world of increasing surveillance by various parties both online and offline. We also look at issues such as how social media influences

  • 02:46

    AARON SMITH [continued]: people's ability to engage with political issues or the broader political process. And lastly, we try to examine people's attitudes towards emergent or nascent issues in the field of technology. So technologies like automation, artificial intelligence, machine learning, technologies that maybe aren't necessarily

  • 03:09

    AARON SMITH [continued]: in wide use today, but that may become more prominent in the coming years. Understanding people's attitudes towards those emergent issues is something that we try to focus on in our work that we do. Our automation in everyday life project was a study in which we were hoping to learn about people's attitudes towards some emerging issues in the world of automation.

  • 03:31

    AARON SMITH [continued]: And we wanted to gain a pretty broad understanding of how people are viewing some emerging automation technologies that might impact them in their work lives as well as their personal lives. As well as some technologies that are more currently available. As well as those that are a little bit further on the horizon. And so the way that we did that study was we took four sort of archetypal examples

  • 03:55

    AARON SMITH [continued]: of automation and presented them to people and asked them questions about them. Those four issues that we used were autonomous vehicles that can operate without the aid of a human driver, a fully autonomous robot caregiver for older adults, an algorithm that can select candidates for a job based on data about them

  • 04:15

    AARON SMITH [continued]: and their resumes, and lastly, a scenario in which machines and computers are capable of doing many of the jobs that humans currently do today. And what our goal was, ultimately, in that piece of research was to get a broad understanding of people's hopes and concerns about some of these emerging

  • 04:37

    AARON SMITH [continued]: issues. Understand where their pain points are, as well as their expectations for how these technologies might develop. And also get a broader understanding of some of the interventions, whether policy interventions or otherwise, that they might support as these technologies become more prominent in their day-to-day lives. [How does this project build on past survey research?]

  • 05:03

    AARON SMITH [continued]: So the first thing to note is that this particular project built on a relatively long-standing body of work within this specific issue focus for us. Several years ago, we did a study of elites in the technology field, asking them for their own views about the potential impact

  • 05:24

    AARON SMITH [continued]: of automation on the future of work. And we got amazing responses. We could tell that the people we were speaking with were extremely engaged and interested in this particular topic. We got people with incredibly utopian views and also people with incredibly dystopian views. And it made us really realize that there was something there

  • 05:46

    AARON SMITH [continued]: in this topic that needed further insight from us. And so, with this particular project, we wanted to move beyond what elites in the field were thinking about with this issue, and try to get an understanding of what the public itself is thinking about this issue. The extent to which they're paying attention to it, the extent to which it excites them or makes them afraid.

  • 06:09

    AARON SMITH [continued]: And obviously, this is a fairly nascent development. Many of these technologies are not in widespread use today. But getting an early understanding of how ordinary Americans are thinking about this topic was something that we felt could really add some context to all of the debates going on about the role of automation in various aspects of society.

  • 06:30

    AARON SMITH [continued]: So the biggest takeaway from our automation survey was that the public is simply very nervous and reticent about turning human decision-making over to machines in various contexts, especially those that are very important or even potentially life or death. We saw that in a number of ways. Across all of the different scenarios that we presented

  • 06:52

    AARON SMITH [continued]: to people, the public was more likely to express worry than to express enthusiasm about those technologies. They were very likely to support a variety of policy interventions that either limit the use of those technologies or that place humans more firmly in control of those technologies. And particularly in the open-ended responses

  • 07:15

    AARON SMITH [continued]: that we got from people, there was a real sense of value being placed on human decision-making, human ingenuity, human creativity. All of the things that are unique, that humans bring to the table in all of the things that we do. So at a very broad level, this is something that when you present it to people--

  • 07:35

    AARON SMITH [continued]: despite the fact that they do have some positive views in some cases-- is really something that makes people a little bit nervous and that they feel very reticent about. In terms of surprises from this work, I think one of the most surprising things is that it's not necessarily the very far off advances that people are most nervous about.

  • 07:56

    AARON SMITH [continued]: One of the technologies that we asked people about was a computer program that could take someone's resume and some data about them and decide whether or not they should be hired for a job. That's a technology that, in some form or fashion, is being used by any number of companies today. It's quite commonplace. And yet, out of all of the scenarios

  • 08:17

    AARON SMITH [continued]: that we presented people with, that was the one that they had perhaps the most negative response to. They were much more worried than enthusiastic about it. They were much less likely to say that they would apply to a job that used that type of program than they would be even to use a robot caregiver for one of their loved ones. People really had a visceral response

  • 08:39

    AARON SMITH [continued]: to this technology that is, in many ways, something that's being used in many different scenarios as we speak. And so, I think what that speaks to is the fact that for many Americans, despite the fact that they understand this concept at a general level, they don't always recognize the extent to which automation technologies, machine learning, other types of algorithms

  • 09:00

    AARON SMITH [continued]: underlie a lot of the decisions that are even today being made about them in various aspects of their lives. [Why is it important to understand public opinion about emerging computational technologies?] So from our standpoint, the reason it's important to understand public opinion about these types of technologies is that they are currently being used

  • 09:21

    AARON SMITH [continued]: to influence many important aspects of people's lives. I'll give a couple examples of that. 20 years ago, the vast majority of the content that people encountered-- whether it was good content or bad content-- was specifically chosen by a human editor to appear in the venue that they saw it in. Today, as people go about their lives on social media or online more generally,

  • 09:44

    AARON SMITH [continued]: the vast majority of the content that they encounter is not chosen by a human editor but is chosen by an opaque algorithm crunching a lot of data, and serving them up that piece of content for reasons that aren't entirely clear even to the people who created those algorithms. So from the standpoint of the content that people encounter,

  • 10:04

    AARON SMITH [continued]: the information that they're exposed to, it's hugely important to understand how machine learning and other types of algorithmic intelligence influence the content that they see, and their perceptions of how that influences the types of things that they're exposed to online. In addition to the types of information that people are exposed to online,

  • 10:25

    AARON SMITH [continued]: machine learning and other types of algorithms are now playing an increasingly prominent role in all sorts of important decisions in people's lives. From whether they qualify for a loan or not to whether they can access government assistance. Or even whether or not they might qualify for parole after they're convicted of a crime. And obviously there are challenges in humans

  • 10:48

    AARON SMITH [continued]: making those decisions as well. But there's a real concern that algorithms may reinforce a lot of inequities and inequalities under the guise of being outwardly unbiased or neutral. So understanding both people's awareness of the extent to which those algorithms are involved

  • 11:11

    AARON SMITH [continued]: in those decisions, as well as their views over how well those decisions are being made and their understanding of those techniques, is hugely important for both policymakers as well as the informed public to help us understand this issue more broadly. [Where can computational social science answer questions that surveys cannot?]

  • 11:31

    AARON SMITH [continued]: In terms of areas where computational social science can help us answer questions that we can't answer at all using surveys, a great example of that is a study we did last year looking at submissions to the FCC's open comment process around the net neutrality plan. Net neutrality is an incredibly wonky issue.

  • 11:53

    AARON SMITH [continued]: It's very hard to describe. We've never figured out how we can ask people about that in a survey in a way that is clear and non-misleading. It's just a very challenging subject in that venue. So what we were able to do using social science-- computational social science techniques is look at the comments that people were submitting

  • 12:14

    AARON SMITH [continued]: to the FCC, and use that as a way to gauge the opinions of the public. And, in this case, gauge some of the challenges of open comment processes entirely, using a technique that we didn't have in our toolbox as recently as a few years ago. And for which traditional surveys would not have really

  • 12:35

    AARON SMITH [continued]: been a good instrument for gaining an understanding of that issue. That said, some of our best work, I think, is work that utilizes both surveys and big data analytics to complement each other and provide new insights into similar topics. So one very good example of that is a study

  • 12:56

    AARON SMITH [continued]: we did a few years ago looking at conversations about race on social media. So with our survey, we were able to ask people about the types of content around race that they were exposed to, the extent to which people of different races encountered those discussions at different rates. And get a sense of the overall importance of social media

  • 13:18

    AARON SMITH [continued]: when it comes to processing those types of issues that are going on in our culture right now. Simultaneously, we were also able to do a social media-- an analysis of social media data, looking specifically at conversations about race more broadly. But also about the Black Lives Matter hashtag. And track the ways in which sentiment around Black Lives

  • 13:39

    AARON SMITH [continued]: Matter had shifted over time, and the ways in which different issues emerged in the social media conversation in a way that we wouldn't have had an opportunity to do in a traditional survey. So that was a great example of a study where we were able to combine the types of attitudinal data that you can get from a survey to talk about how people are feeling about a particular issue.
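A minimal sketch of the kind of hashtag sentiment tracking described here might look like the following. The tweets, word lists, and function names are hypothetical illustrations of the general technique, not Pew's actual data or pipeline.

```python
# Hypothetical illustration: average monthly sentiment for tweets mentioning a hashtag.
from collections import defaultdict
from datetime import datetime

# Toy sentiment lexicons; real analyses use far richer dictionaries or trained models.
POSITIVE = {"hope", "progress", "support", "justice", "proud"}
NEGATIVE = {"anger", "unfair", "violence", "afraid", "outrage"}

def sentiment_score(text):
    """Crude lexicon score: +1 per positive word, -1 per negative word."""
    words = text.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

def monthly_sentiment(tweets, hashtag="#blacklivesmatter"):
    """Average sentiment per month for tweets whose text contains the hashtag.

    `tweets` is an iterable of dicts like {"created_at": "2015-07-04", "text": "..."}.
    """
    by_month = defaultdict(list)
    for tweet in tweets:
        if hashtag in tweet["text"].lower():
            month = datetime.strptime(tweet["created_at"], "%Y-%m-%d").strftime("%Y-%m")
            by_month[month].append(sentiment_score(tweet["text"]))
    return {month: sum(scores) / len(scores) for month, scores in sorted(by_month.items())}

# Example with made-up tweets:
sample = [
    {"created_at": "2015-07-04", "text": "Marching with hope today #BlackLivesMatter"},
    {"created_at": "2015-08-11", "text": "So much anger over this unfair ruling #BlackLivesMatter"},
]
print(monthly_sentiment(sample))  # {'2015-07': 1.0, '2015-08': -2.0}
```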

  • 14:01

    AARON SMITH [continued]: And combined that with social media data to talk about the broader landscape of the conversations that people are having about a particular issue. [What kinds of challenges do you face in your research work & how do you overcome them?] So I will say that asking people about technology,

  • 14:23

    AARON SMITH [continued]: particularly technology that isn't outwardly visible to people, is an inherent challenge of our work. And one of the things that we spend the most time on is figuring out how to ask people about things in a way that describes a particular behavior that they might engage in, that serves as a proxy for something happening elsewhere.

  • 14:43

    AARON SMITH [continued]: And in many cases, when we're trying to ask about fairly complex issues like that, many times that's how we try to frame things. So we don't ask them about whether-- the types of content that algorithms are serving them. We ask them whether they see content on social media that is angry or happy or makes them feel good or makes them feel bad.

  • 15:05

    AARON SMITH [continued]: And by extension, we can then determine a little bit about whether or not people are being exposed to particular types of content, knowing that an algorithm on the back end is serving them that information. I will also say that framing things in plain language, in a way that uses as little technical jargon as possible,

  • 15:25

    AARON SMITH [continued]: is something that we try to spend a lot of time on. And that can be the biggest challenge in many cases in doing a survey of these issues: either describing technical subjects in a way that people can latch on to, or coming up with a good proxy for a behavior that gets at the thing you're actually trying to get at. And in some cases, such as net neutrality,

  • 15:47

    AARON SMITH [continued]: we just have to throw up our hands and decide that surveys are not the ideal way of capturing attitudes about that behavior. But in many cases you're able to come up with something that works, even if it doesn't exactly use the phrase that the technologists would use. One of the other things that we do quite a bit is we just ask people open-ended questions and have them respond in their own words.

  • 16:09

    AARON SMITH [continued]: One of the most valuable insights that we got in our automation report was simply asking people why or why not they would have used the various scenarios that we presented to them. That gives a chance for people to say in a very unguided, unprompted way what it is about a particular issue that is making them excited

  • 16:31

    AARON SMITH [continued]: or making them feel nervous or worrying them or making them concerned. We make a lot of use of that type of data. We can then code it on the back end. But having those in-their-own-words responses from ordinary people about what it is that grabs them about a particular concept or a particular topic,

  • 16:53

    AARON SMITH [continued]: that's something that we try to build into as many of our studies as we can. Because it really is valuable to have the literal voice of the public in the context of all of the data that we collect. [Conclusion] In terms of why it's important to utilize

  • 17:15

    AARON SMITH [continued]: various types of computational social science in doing the work that we do, I would say the reasons are twofold. One is that obviously, there's a sea of data out there today. And in many cases, what we found is that the amount of data you can collect-- whether that's from Twitter or from a government open comment period or any other source you can think of--

  • 17:38

    AARON SMITH [continued]: is just simply too vast for any human or even a group of humans to go through each of those comments individually. Just to give you a sense of the scope of these issues, when we studied the FCC net neutrality open comment period, there were tens of millions of responses

  • 17:58

    AARON SMITH [continued]: that were ultimately submitted over the course of that process. So there was no possible way that anyone could go through and individually look at each of those responses. So just in that sense, from a very mechanical level, having the ability to train a machine of some kind to identify the things in a very, very large data set

  • 18:20

    AARON SMITH [continued]: that we as researchers are interested in is hugely helpful. And, quite simply, just makes these issues tractable in a way that would not have been possible, no matter how many humans we could throw at them. So just at a very logistical, mechanical level, having these tools at our disposal

  • 18:40

    AARON SMITH [continued]: just lets us do things that we simply wouldn't be able to do using our traditional methodologies, or using just the numbers of bodies that we have at our disposal.
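On that last point, the following is a minimal, hypothetical sketch of what training a machine to find the things researchers care about in a very large data set can look like: a small hand-labeled sample is used to fit a simple text classifier, which is then applied to a corpus far too large to read by hand. The example comments, labels, and choice of scikit-learn tools are assumptions for illustration; the transcript does not describe the actual tooling used for the FCC net neutrality comments.

```python
# Hypothetical illustration: fit a simple text classifier on a researcher-coded sample,
# then apply it to a much larger set of unread comments.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# A tiny hand-labeled sample; in practice this would be thousands of coded comments.
labeled_comments = [
    "Please keep the existing net neutrality rules in place.",
    "These rules are burdensome regulation and should be repealed.",
]
labels = ["supports_rules", "opposes_rules"]

# Bag-of-words features plus a linear model is a common, transparent baseline.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(labeled_comments, labels)

# The fitted model can then label the remaining comments in batches.
new_batch = ["I support keeping the current open internet rules."]
print(model.predict(new_batch))
```

With a larger labeled sample, the same pattern scales to the tens of millions of responses mentioned above.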

Abstract

Aaron Smith, Associate Director, Pew Research Center, discusses his research using surveys and computational methods to study automation in everyday life, including his interest in this type of research, his current research focus, the importance of understanding public opinion about computational technologies, examples of areas where computational social science can answer questions that surveys can't, challenges faced and overcome, and why computational social science is an important tool in this research.
