

  • 00:17

    KATIE METZLER: My name's Katie Metzler. And I'm the head of methods innovation at SAGE Publishing. [Katie Metzler, Head of Methods Innovation, SAGE Publishing] And I'm delighted to be chairing this evening's event, "Putting Big Data to 'Good' Use." So I'm joined tonight by four panelists, who will each speak for about 10 minutes. And then there will be time at the end for questions. So I'm going to start by introducing you to our panel.

  • 00:38

    KATIE METZLER [continued]: Dr. Maria Fasli is professor of computer science and Director of the Institute for Analytics and Data Science at the University of Essex. In 2016, she was awarded the first UNESCO Chair in Analytics and Data Science, with the aim of supporting the development of a research base and skills in data science and analytics internationally,

  • 01:00

    KATIE METZLER [continued]: and in developing and transitioning countries, in particular. Dr. Slava Mikhaylov is a professor of public policy and data science at the University of Essex, holding a joint appointment in the Department of Government and the Institute for Analytics and Data Science. He's Chief Scientific Adviser to Essex County Council and a co-investigator in the ESRC's Consumer Data Research

  • 01:23

    KATIE METZLER [continued]: Centre at UCL. Dr. Jonathan Gray is lecturer in critical infrastructure studies at the Department of Digital Humanities, King's College London, where he is currently writing a book on data worlds and the politics of public information. He's also co-founder of the Public Data Lab and Research Associate at the Digital Methods Initiative

  • 01:45

    KATIE METZLER [continued]: at the University of Amsterdam and the Medialab at Sciences Po in Paris. Ian Mulvany is head of product innovation at SAGE Publishing, and is responsible for supporting the development of tools that can help social science researchers work with big data. He's passionate about creating digital tools that support the research enterprise.

  • 02:06

    KATIE METZLER [continued]: Previously, Ian was head of technology at eLife Sciences, head of product at Mendeley, and a product manager for a number of Nature Publishing Group's online services for researchers. And so, I've introduced them in the order that they're sitting here. So please join me, first, in welcoming our speakers. [APPLAUSE]

  • 02:32

    KATIE METZLER [continued]: So the timing of this event, I think, feels very apt. Big data headlines are appearing daily across our newspapers and magazines, and are spreading through our social media channels. Last week, Facebook, Twitter, and Google were in front of a congressional committee in Washington to answer questions about Russia's attempts to influence last year's US presidential election

  • 02:54

    KATIE METZLER [continued]: by spreading misinformation online. In the UK, an article in The Observer on Brexit tells of a shadowy, global operation involving big data and billionaire friends of Trump, who used microtargeting of political advertising to suppress voter segments and influence the outcome of the referendum vote. A few weeks ago, a Fortune headline

  • 03:16

    KATIE METZLER [continued]: asked, is big data killing democracy? And last week's Economist cover answered, with a smoking gun of a yes. And as if that weren't scary enough, it seems it isn't just democracy that is being threatened. In Cathy O'Neil's recent book, Weapons of Math Destruction,

  • 03:37

    KATIE METZLER [continued]: she gives examples of how big data and predictive, proprietary algorithms are being used to maximize profits and reduce costs for businesses, with damaging effects for whole swaths of the population, but especially for already disadvantaged groups. One example of this from Cathy's book is the predatory targeted advertising carried out

  • 03:59

    KATIE METZLER [continued]: by for-profit universities in the US, which has left thousands of vulnerable students with mountains of debt. Another example that's made the news, and is featured in Cathy's book, is the use of big data and re-offending risk algorithms in the US penal system, which have been written in a way that guarantees black defendants will be inaccurately identified

  • 04:20

    KATIE METZLER [continued]: as future criminals more often than their white counterparts. From these various news stories, and from Cathy's book, some common criticisms of the way big data is being collected and used emerge. Firstly, there's the issue of consent. Though we've all ticked a box at some point to agree to Facebook's terms and conditions, did any of us

  • 04:42

    KATIE METZLER [continued]: read them all the way through? I know I didn't. Or did we know that Cambridge Analytica would come along and use our Facebook data to build a model that helped the Leave Campaign microtarget political ads? There's also an issue with the way in which algorithms are perceived as scientific and objective, despite the fact that human biases can be, and often are, baked into their design.

  • 05:06

    KATIE METZLER [continued]: Related to that, there's a lack of transparency around how some algorithms work, especially when they are proprietary. Cathy O'Neil and many, many others have talked about the danger of the black box algorithm, and the danger of a model which doesn't update or self-correct when new information becomes available. And who's actually regulating these algorithms,

  • 05:26

    KATIE METZLER [continued]: to ensure they aren't racist or sexist, for example? In many cases, it's nearly impossible for an individual to fight back if they find they're unfairly scored by a company's algorithm. Again, in Cathy's book, she talks of teachers fired due to algorithmic scoring, credit denied, jobs not offered.

  • 05:46

    KATIE METZLER [continued]: The consequences can be widespread and destructive, and it seems, especially for those who are already disadvantaged. But, and this is a very important but, big data and mathematical models aren't inherently bad. Big data has the potential to do wonderful things. Big data can be used to support election monitoring

  • 06:08

    KATIE METZLER [continued]: in the global south, to tackle epidemics and cure diseases, to improve the targeting of humanitarian aid to those who need it most. Big data is neither good nor bad, inherently. It depends on the way it's used, by whom, and in the service of what outcomes. I've often wondered, actually, if I'd

  • 06:28

    KATIE METZLER [continued]: be less concerned about Facebook's use of targeted advertising if Clinton had been elected. For many of us in the room tonight, I expect the issue is around who we trust with our data, and what outcomes the use of our data brings about. And so, with that, I'm going to hand over to the panel to talk us through some examples of how big data is being used

  • 06:49

    KATIE METZLER [continued]: for social good, and some of the challenges facing academics, who are striving to put big data to better use, in ways that reduce inequality and improve outcomes for society. So Professor Fasli, over to you.

  • 07:07

    DR. MARIA FASLI: Thank you, Katie. Thank you for the introduction. Good evening, ladies and gentlemen. My name is Maria Fasli. [Dr. Maria Fasli, Director of the Institute for Analytics and Data Science, University of Essex] And I'm the Director of the Institute for Analytics and Data Science at the University of Essex. So I want to start by, first, making a few things clear. You will have heard the terms big data, analytics, data science.

  • 07:29

    DR. MARIA FASLI [continued]: And very often, people are confused. They use this terminology interchangeably, perhaps sometimes in their own way, or they may mean different things. So let's be a bit clear. So data science is the emerging discipline, the science whose purpose is to drive progress and development

  • 07:51

    DR. MARIA FASLI [continued]: in this area, by, basically, developing novel methods for processing, understanding, and drawing insights from data. So it's this new science that is emerging, but very much interdisciplinary. My own personal view is that it's not just computer science or statistics, or it can't

  • 08:12

    DR. MARIA FASLI [continued]: be claimed by social science. It brings a lot of things together. So you will also have heard the term analytics. I'm not entirely sure you can read what my sign says, here. And when we talk about analytics, basically, what we refer to is the use of quantitative methods, artificial intelligence

  • 08:35

    DR. MARIA FASLI [continued]: methods, statistical methods, to mine the data sets that we have. So it's basically the application of different types of methods, quantitative methods, on data. And when we talk about big data, big data is, basically, a huge volume of data.

  • 08:59

    DR. MARIA FASLI [continued]: It may be data that is structured or unstructured. But when we use the terminology big data, we typically refer to data sets that are not that easy to handle with conventional means, as in, put them on an Excel spreadsheet and process them on your laptop

  • 09:19

    DR. MARIA FASLI [continued]: or on a typical PC. So big data represents truly huge volumes of data. And for a company or an organization to claim that they have big data, they really must have large volumes of data. So as I said, typical spreadsheets are not what we would call big data.
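The point about data being too large for a spreadsheet can be made concrete with a small sketch (illustrative only, not from the talk): instead of loading a whole file into memory, one conventional workaround is to stream it line by line and aggregate incrementally. The file name and column layout here are hypothetical.

```python
# Sketch: tally values in one column of a CSV-like stream without ever
# holding the whole data set in memory. Works the same on a small sample
# or on billions of lines read lazily from disk.
from collections import Counter

def count_by_column(lines, column=0, sep=","):
    """Stream lines and count occurrences of each value in `column`."""
    counts = Counter()
    for line in lines:
        fields = line.rstrip("\n").split(sep)
        if len(fields) > column and fields[column]:
            counts[fields[column]] += 1
    return counts

# A tiny in-memory sample; a real run might use a hypothetical
# count_by_column(open("transactions.csv")) on a multi-gigabyte file.
sample = ["UK,100", "UK,250", "FR,80"]
print(count_by_column(sample))  # Counter({'UK': 2, 'FR': 1})
```

The design choice is the streaming loop: memory use stays constant in the number of lines, which is what separates "fits on your laptop" processing from work that needs specialized big data tooling.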

  • 09:40

    DR. MARIA FASLI [continued]: So I hope it's a bit clearer what we are talking about. [(Big) Data Does Not Equal (Big) Knowledge: Data, Information, Knowledge] Big data does not necessarily mean big knowledge. And I have the big, here, in parentheses, because the same is the case with your conventional data set. Just because you have data doesn't

  • 10:00

    DR. MARIA FASLI [continued]: mean that you have knowledge. There is a distinction that we need to make. So data is structured records of transactions. When you go to your bank, and you take money out, or you deposit money, a transaction takes place. That's data. Information-- you have information

  • 10:21

    DR. MARIA FASLI [continued]: when you, basically, process the data that you have. And then you extract some meaning. So information has meaning and shape. And knowledge is something different. Knowledge is, you are trying to understand what is leading to what, cause and effect. You are looking for something deeper,

  • 10:42

    DR. MARIA FASLI [continued]: rather than just processing the data and extracting some information. So I would just like to make this distinction between these three terms, as they are being used, because for me, as a scientist, they are really important. So just because people claim that they have data or big data, it does not necessarily mean that they have the knowledge,

  • 11:03

    DR. MARIA FASLI [continued]: or they know how to extract knowledge. They may be able to partially process data. But it takes an awful lot more to be able, especially, to process big data, and to process big data in an appropriate way. So at this point, I have to say, we take big data, we take data, and we take computational technologies

  • 11:25

    DR. MARIA FASLI [continued]: for granted here, in the developed world. But this is not the case, necessarily, across the world. This is a map that was produced back in 2015. And it shows, basically, the internet users as a percentage of the population in different countries. And as you can see, in Africa-- if you are able to read the percentages there--

  • 11:48

    DR. MARIA FASLI [continued]: Africa, for instance, is one of the regions where internet uptake is, as you might expect, not very high. So just because we take things for granted here, in the developed countries, it doesn't necessarily mean that the technology and the advances that the technological changes have brought to our lives have actually trickled down. So we need to be aware of this.

  • 12:08

    DR. MARIA FASLI [continued]: And it's not just what is called the digital divide. So it's not just the lack of infrastructure that we are talking about. We are also talking about the lack of skills, because you may have the infrastructure in place. But if people don't know what to do with the data that is being produced, then there's not much they can do about it.

  • 12:30

    DR. MARIA FASLI [continued]: So that brings us to the Sustainable Development Goals and data science. And you will all have heard that, yes, the Sustainable Development Goals is an initiative that almost all the countries in the world have signed up for. That was back in 2015.

  • 12:50

    DR. MARIA FASLI [continued]: The 17 Sustainable Development Goals attempt to, basically, alleviate poverty and address issues with climate change and education and a number of other things in the next 15 years. As I said, 17 development goals, and a number of them have to do with skills and infrastructure and education.

  • 13:15

    DR. MARIA FASLI [continued]: So that's where my work as the UNESCO Chair in Analytics and Data Science comes in, because in my view, as I said earlier, we have a digital divide. We also have a skills divide. And we can only bridge this divide if we actually start doing something about it, and we transfer the skills

  • 13:38

    DR. MARIA FASLI [continued]: to the developing countries and the transitioning countries where this is most needed. So the aim of the UNESCO Chair is to work together with international collaborators to try and bridge this skills divide, by making sure that researchers across the world are actually trained in the most up-to-date methods.

  • 13:58

    DR. MARIA FASLI [continued]: They work together with people that work in universities here in the UK, for instance. And we jointly develop projects. And we upskill them. But the aim of the Chair is not just to upskill researchers. It's, in general, to upskill people. And that means upskilling professionals

  • 14:19

    DR. MARIA FASLI [continued]: in developing countries, or even school children, teachers, government officials, because they have to do a lot with data. So in my view, the road to knowledge societies is through education and upskilling. And if we are able to transfer skills, then this is going to lead to economic growth,

  • 14:42

    DR. MARIA FASLI [continued]: because people are going to have the right skills. So they can feed back into the economy. They can create new jobs. They can create new companies. They can become the techno-entrepreneurs of the future. And they can come up with wonderful ideas, especially in developing and transitioning countries, because they have not been accustomed to technology.

  • 15:04

    DR. MARIA FASLI [continued]: And there have not been the same sorts of periods of change that we have been through. So when new technologies are being introduced, they can look at these with completely fresh eyes. And they come up with astonishing ideas. As for examples, I can talk to you later about that, if you're interested to know more.

  • 15:24

    DR. MARIA FASLI [continued]: So education and upskilling can bring in economic growth, which can further lead to improved services for all, because if you upskill, not just the professionals in a country, but government officials, then they can understand the needs of the population better. They can target services better, according

  • 15:47

    DR. MARIA FASLI [continued]: to the needs of the population, which would mean that the services that you deliver are targeted. So for instance, if we talk about health services, then you can target them where they are needed, rather than have surgeries spread, for instance, across the country-- across a specific country-- in the same way. If you are able to identify that there

  • 16:08

    DR. MARIA FASLI [continued]: is a specific need through the data that you have, provided that the people can process the data, and they can extract knowledge, then you can have targeted service delivery. And not only that, but actually, this can lead to transparency and accountability and, in essence, you can have improved governance in these countries,

  • 16:31

    DR. MARIA FASLI [continued]: as well as participation, true participation, in public life. When people understand data, when they can read data, they can hold their governments, organizations, and businesses to account. And this is a good thing. And we should strive for this. So the initiatives that we've been undertaking under the UNESCO Chair involve working together,

  • 16:53

    DR. MARIA FASLI [continued]: as I was saying earlier, with international partners, to upskill researchers, to upskill professionals in developing and transitioning countries. And my firm belief is that, in order to make progress in this area, what is needed is a cross-disciplinary approach. I mentioned at the beginning of my presentation--

  • 17:15

    DR. MARIA FASLI [continued]: I gave you-- well, not a definition, but I did say that data science is an emerging discipline, that it's bringing together different areas. So we really need to have a cross-disciplinary approach. It's not just about developing the novel methods to draw insights from data.

  • 17:36

    DR. MARIA FASLI [continued]: It's about making sure that the methods that we develop, we understand them, so that we don't have the kinds of problems that Katie has been referring to. We don't have algorithms that are producing outcomes that we cannot understand. And I can bring you examples of current machine learning techniques, for instance, deep learning, that

  • 17:56

    DR. MARIA FASLI [continued]: are referred to as black boxes. And they are very often accused of not being able to give you an answer as to why it is that they are giving you a particular prediction. These algorithms may appear to be racist, at times, simply because of the data that you feed them with.
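The point about bias being inherited from training data can be sketched in a few lines (illustrative only, not from the talk): a naive risk model fitted to skewed historical records simply reproduces the skew. All group labels and numbers below are hypothetical.

```python
# Sketch: a base-rate "risk model" learns P(reoffend | group) from
# historical records. If group B was policed more heavily, so that more
# minor incidents were recorded as reoffences, the model dutifully
# "learns" that bias and scores group B as higher risk.
from collections import defaultdict

def train_base_rate_model(records):
    """Learn the recorded reoffence rate per group."""
    counts = defaultdict(lambda: [0, 0])  # group -> [reoffended, total]
    for group, reoffended in records:
        counts[group][0] += int(reoffended)
        counts[group][1] += 1
    return {g: r / t for g, (r, t) in counts.items()}

# Hypothetical training data with a recording bias against group B.
history = [("A", True)] * 10 + [("A", False)] * 90 \
        + [("B", True)] * 30 + [("B", False)] * 70

model = train_base_rate_model(history)
print(model)  # {'A': 0.1, 'B': 0.3} -- the skew in the records becomes the prediction
```

Nothing in the algorithm is "racist"; the disparity comes entirely from what was fed in, which is exactly why scrutinizing training data matters as much as scrutinizing the model.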

  • 18:19

    DR. MARIA FASLI [continued]: And we need to be aware of these. So the people that are developing these technologies and these methods need to work together with the social scientists, because there is an impact on society, but also with the legislators, the policymakers, so that the problems that we are all going to be facing going forward are, basically, addressed

  • 18:40

    DR. MARIA FASLI [continued]: in a joined-up way, rather than trying, after the fact, to change outcomes, and make sure that we don't have problems going forward. And on this note, I'm going to finish. Thank you. [APPLAUSE]

  • 19:00

    DR. MARIA FASLI [continued]: [Dr. Slava Mikhaylov, Professor of Public Policy and DataScience, University of Essex]

  • 19:03

    DR. SLAVA MIKHAYLOV: All right, so thank you very much for inviting me. And thank you for coming here. When Katie put together the panel, the idea was that she would start off scaring us all that big data is really bad, and that big brother is around. And we should be really concerned. And our job was to be uplifting and give some examples where

  • 19:26

    DR. SLAVA MIKHAYLOV [continued]: big data can work. So Maria has done a great job laying the foundation. But thinking about examples where it should work, and being really uplifting, let me talk about climate change. [LAUGHTER] So I don't know how much you can see from the slide. But essentially, on the left-hand side, this is a picture--

  • 19:46

    DR. SLAVA MIKHAYLOV [continued]: well, this is a slide-- from the CDC, the Centers for Disease Control in the US. And it's about the link between climate change and public health. It's about the intersection of the two aspects. So everybody is concerned about public health. We are all concerned about health. This is one issue that we can relate to. Climate change, as we know, is a bit of a questionable topic.

  • 20:11

    DR. SLAVA MIKHAYLOV [continued]: Some people, even on this side of the Atlantic, may be skeptical about climate change. And it's really difficult to drive home the point that climate change is happening. And even the US Government has just released a report that climate change is happening, and that it's human-made. But it's still difficult to convince some people, right? So one of the ideas that a lot of academics had was,

  • 20:32

    DR. SLAVA MIKHAYLOV [continued]: let's link together an issue that everybody can relate to and an issue that is still fundamentally important for us, but really difficult to relate to for some people, through skepticism. So it's relating public health and climate change. What is the effect of climate change on public health? So this Centers for Disease Control slide relates it, essentially, to everything we have.

  • 20:54

    DR. SLAVA MIKHAYLOV [continued]: Anything we can think of about climate change, it has an effect. So this is an example from Public Health England. They did a study quite a few years back, trying to predict what would be the spread of disease in England with an increase in temperature. And the map, since you cannot read any of that, the map is picking up the spread of, potentially,

  • 21:17

    DR. SLAVA MIKHAYLOV [continued]: the spread of malaria. And that's, at the top-- this is a map from-- we're talking about the 19th century, and the cases of malaria in England in the 19th century. And here are several scenarios of development: where we will land if we don't do anything, or where we will land if we still try to do something,

  • 21:39

    DR. SLAVA MIKHAYLOV [continued]: but it's not enough. And as you can see, it is expanding quite a lot, with malaria coming back and, potentially, coming back in such a way that some parts of London are appearing as purple, right, which is really bad, according to the map. Colchester, on the other hand-- University of Essex--

  • 22:01

    DR. SLAVA MIKHAYLOV [continued]: seems to be fine. So you're all welcome. You can come to Colchester. All right, so this is the backdrop. And I'm still trying to be a bit uplifting, there, because Katie gave us a task to be uplifting. So what do we do about that? This is the problem. This is the setting. So one of the things that we can look at is,

  • 22:21

    DR. SLAVA MIKHAYLOV [continued]: how do international leaders think about public health? How do international leaders think about climate change and public health, in particular, and the intersection between them? So as part of our research, we looked at the debates in the United Nations. So this is the debate in the United Nations. At every session opening of the United Nations, the leaders,

  • 22:45

    DR. SLAVA MIKHAYLOV [continued]: country leaders, heads of state, heads of government, or their representatives, they come and speak. So this is usually in September, for the first couple of days. This is called the opening of the General Assembly. And the debate is called the General Debate. Usually, country leaders come and they speak about the most important problems that face their countries. Theresa May spoke about Brexit and about immigration.

  • 23:05

    DR. SLAVA MIKHAYLOV [continued]: Barack Obama spoke about Iran, spoke about North Korea, but also about all kinds of other global issues, like ISIS. And the President of Kiribati, he spoke about climate change, because for Kiribati, this is an issue that is really-- it's important. The country will disappear.

  • 23:26

    DR. SLAVA MIKHAYLOV [continued]: They all face extinction as a country that will have to move. So for him, this is the most important issue. So what we tried to see is, does climate change, and the discussion of climate change in relationship to public health, does it move, not only from the countries most affected, but also, does it appear in speeches from the leaders of the developed countries, Theresa

  • 23:47

    DR. SLAVA MIKHAYLOV [continued]: May, Barack Obama, or Donald Trump this year, right? So this was done as part of a project that is under The Lancet, the medical journal, the premier medical journal The Lancet: the "Lancet Countdown" commission. So this is a report that was commissioned by The Lancet. And the report was specifically to track

  • 24:08

    DR. SLAVA MIKHAYLOV [continued]: the intersection of climate change and public health. The report was just launched on Monday. And it's in the public domain. So anyone can download it, if you wish, and read about it. It's quite a long report. Our job within the report was really tiny. We were one of the working groups, one of the smaller working groups, tracking the engagement, public and

  • 24:30

    DR. SLAVA MIKHAYLOV [continued]: political engagement, with the issue of climate change and public health. So what we did was look at all the speeches over the last 10 years and try to see how many times climate change appears in the context of public health. So when the leaders speak about it, does the context provide for this intersection between the two issues? Because we know

  • 24:51

    DR. SLAVA MIKHAYLOV [continued]: that happens in some countries. But we don't know about other countries. So the top graph, here, just a ballpark figure, is to tell you that it does appear. But it also fluctuates. And one thing that we can see is that it fluctuates. In the run-up to a major climate change summit, politicians talk a lot about this issue, specifically

  • 25:11

    DR. SLAVA MIKHAYLOV [continued]: the intersection of public health and climate change. As soon as the summit is over, they stop talking about it until the next summit. So if we think about, as citizens, if we want to engage, and we want to drive this issue forward, the awareness of climate change and public health, one of the things we can do is put it on the agenda, not with the fluctuations up and down,

  • 25:32

    DR. SLAVA MIKHAYLOV [continued]: but a bit more evenly. So it's always on the agenda. If it's on the agenda, something will have to be done, hopefully. But also, countries speak differently. So the Pacific speaks-- countries from the Pacific region, they speak a lot about the issue. Countries from Western Europe almost never discuss the issue, the intersection of public health and climate change.

  • 25:53

    DR. SLAVA MIKHAYLOV [continued]: So again, it's working in different areas, raising the awareness. So that's the positive spin on that. So we can do something. And we can raise the awareness. We can engage with an issue that seems intractable by itself. But we can engage with it by using some of the data science techniques. So specifically, underneath all of that, it's applying some of the things from natural language

  • 26:14

    DR. SLAVA MIKHAYLOV [continued]: processing, some of the things from machine learning and natural language processing, to identify the trends, analyze the data, convert text into data, and analyze that for public good, here, specifically. And just as a brief overview, these are all the other headline indicators that you can look at in the full report on The Lancet website.
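The kind of analysis described above, counting how often climate change appears in the context of public health across speeches, can be sketched roughly as follows (a minimal illustration, not the authors' actual pipeline; the term lists, window size, and sample speeches are all assumptions).

```python
# Sketch: flag speeches in which a climate-change term and a public-health
# term co-occur within a window of tokens, then compute the share of
# speeches that make the link.
import re

CLIMATE = {"climate", "warming", "emissions"}   # hypothetical term list
HEALTH = {"health", "disease", "malaria", "epidemic"}

def co_mentions(speech, window=25):
    """True if a climate term and a health term appear within
    `window` tokens of each other."""
    tokens = re.findall(r"[a-z]+", speech.lower())
    climate_pos = [i for i, t in enumerate(tokens) if t in CLIMATE]
    health_pos = [i for i, t in enumerate(tokens) if t in HEALTH]
    return any(abs(c - h) <= window for c in climate_pos for h in health_pos)

# Two invented toy "speeches" standing in for the UN General Debate corpus.
speeches = {
    "Kiribati": "Climate change threatens our islands and the health of our people.",
    "UK": "We will deliver Brexit and control immigration.",
}
share = sum(co_mentions(s) for s in speeches.values()) / len(speeches)
print(share)  # 0.5 -- half of this toy sample links the two issues
```

Run over a decade of speeches and grouped by year or region, a count like this is what would produce the fluctuating trend lines and the regional differences the speaker describes.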

  • 26:36

    DR. SLAVA MIKHAYLOV [continued]: And all of them are telling us that unless we do something now, it will be pretty dire. But we can do things. And I think this is the point: we can do things. And we can move forward.

  • 27:03

    DR. JONATHAN GRAY: Great, so I'm going to talk to you, now, about an entity which I've co-founded, called the Public Data Lab. [Jonathan Gray, Lecturer in Critical Infrastructure Studies, King's College London] And specifically, I'm going to look at, not just the analytical capacities of big data technologies and how they can be used, but also

  • 27:26

    DR. JONATHAN GRAY [continued]: the social life of some of these different capacities, and the way in which data's put to work in different sorts of institutional contexts, and the infrastructures and techniques and things that it depends on. And specifically, I'm going to look at some experiments in participation using digital methods and data infrastructures that we've started to do with the Public Data Lab, and which many

  • 27:47

    DR. JONATHAN GRAY [continued]: of its partners in its network have been doing for some time. Before I do that, I'm going to quickly explain a little bit about the way that I come at this, which is through my current work on data worlds. And what I'm particularly interested in in this work is how digital technologies are being used to redistribute different sorts of public data worlds.

  • 28:08

    DR. JONATHAN GRAY [continued]: And by data worlds-- I'm just going to briefly explain. There are three things that I'm particularly interested in. The first is this notion of horizons of intelligibility. So how are things made meaningful or experienceable, or how could we reason with things using data? I think this relates to several of the points

  • 28:28

    DR. JONATHAN GRAY [continued]: that we've seen earlier, like what data is collected about what different sorts of issues, and what different sorts of phenomena in the world, and how. The second, I guess, is this idea of social worlds. So as you were saying earlier, if you look at anything from a national statistics institute to a social media company, there are going to be these whole teams of people

  • 28:49

    DR. JONATHAN GRAY [continued]: who are involved in creating, using, and making meaning with this data. And I think there's this fantastic, classic sociological work called Art Worlds by Howard Becker, where he makes exactly this point very well: that you don't just look at this beautiful Eduardo Paolozzi on the wall. You have to look at all the things, like the frame and all of the different sorts of work that go on, in order to enable us to see this on the wall as a picture,

  • 29:10

    DR. JONATHAN GRAY [continued]: with these various kinds of conventions that we have, that mean this is meaningful as a cultural artifact. The final thing is transnational political projects to reshape the world through data. So how are we seeing different sorts of alliances, different sorts of relationships between not only countries, but also large corporations and research institutes

  • 29:31

    DR. JONATHAN GRAY [continued]: and NGOs and other entities? So to summarize, I think there are different ways of creating and organizing and using data, which enable different sorts of what we might call epistemic, social, and political possibilities, different sorts of ways of knowing the world through data. And the thing that we're interested in with the lab

  • 29:51

    DR. JONATHAN GRAY [continued]: is, how can we make space for reflection and intervention around the different sorts of data worlds, everything from the sort of data we see from large technology companies, through to the SDGs and new sorts of indicators within the public sector and beyond? And so, we're interested in experiments in participation, public imagination, deliberation, and mobilization around these emerging data infrastructures and data

  • 30:13

    DR. JONATHAN GRAY [continued]: worlds, through a combination of digital methods, digital data, participatory design, as well as social and humanistic research, which should be familiar to many of you, if you're here at this event run by SAGE. So one of the things that we're drawing on is this notion from the artificial intelligence researcher Philip Agre called critical technical practice, which Agre

  • 30:35

    DR. JONATHAN GRAY [continued]: describes as the process of having "one foot planted in the craft work of design and the other foot in the reflexive work of critique." And in a similar vein, we're interested in developing what you might consider a critical data practice, where we are combining the craft of working with data with critical reflection on data and the social and historical processes of its making.

  • 30:57

    DR. JONATHAN GRAY [continued]: And this is something, I think, many of our colleagues who are involved in the lab have also been working on, which is not just about analysis, as a colleague, Noortje Marres, says, but also about the different forms of interactivity, the different sorts of social relations which are made possible through data. So with the lab, we have these themes of facilitating research, democratic engagement,

  • 31:18

    DR. JONATHAN GRAY [continued]: and public debate around the future of the data society. And we have a number of different researchers and research networks who are involved. I see there's several slides missing, OK.

  • 31:34

    KATIE METZLER: Oh, no.

  • 31:35

    DR. JONATHAN GRAY: Yeah, there's a few missing. Never mind, so what I'm going to do is, I'm going to tell you. And you have to imagine the pictures about one of our projects, which is "A Field Guide to Fake News," which is what we've been doing with SAGE, and which we're also working-- we're in discussion about this as a potential book project. But the idea, essentially, was to take this hugely mediated and politicized notion

  • 31:59

    DR. JONATHAN GRAY [continued]: of fake news as a site for experimentation around the way in which you can use different forms of data and digital methods and infrastructures in order to understand what fake news is, how people are making meaning with it, how it's being shared, and so on. And there are three things that I was going to show. And I guess, the emphasis of this project

  • 32:20

    DR. JONATHAN GRAY [continued]: was not just to look at how one could instrumentally use data and methods in order to crack down on fake news, but also to give a different picture about what fake news is and what it tells us about the different sorts of media ecosystems and platforms, which are increasingly entangled in many areas of social life.

  • 32:40

    DR. JONATHAN GRAY [continued]: So there are three things I was going to show you. In fact, I could-- no, I'm not going to do it. The temptation is to go on the web and show you. But there are three things that I was going to talk about. The first is just, apart from looking at the volume of sharing, which is a very common thing that you see in many newspapers, like how much fake news was shared, particularly ahead of the US elections.

  • 33:02

    DR. JONATHAN GRAY [continued]: We were, instead, looking at where it was shared, by whom. Who's making meaning with it? What sorts of things is it being used in order to do in the world? And this immediately re-frames the problem, not as something of rogue information or-- there's a lot of these metaphors of the virus that's

  • 33:23

    DR. JONATHAN GRAY [continued]: infecting the host body of these passive publics. Instead, we're looking at the role that people play in using this information, and how it reinforces different sorts of senses of political identity, and people feeling like they're not being heard. And sometimes, there's also satire and all kinds of other things that are going on that I think it's quite important to look at. The second thing that we were looking at

  • 33:43

    DR. JONATHAN GRAY [continued]: is the social life of fake news and lies, so taking a few of these different stories and following them through from where they started, to them being spread by various media sources and groups. And one of the things that we found quite interesting is, if you do this, you can see that one of the most famous claims is Pope supports Trump, right,

  • 34:04

    DR. JONATHAN GRAY [continued]: which is one of the most widely shared pieces of fake news stories in the run-up to the US elections. One of the things that we did, in following this through, was-- you can see, it starts life as satire. So it's very clearly on a satirical website. And it's laundered by what we describe in the project as a laundering agent.

  • 34:25

    DR. JONATHAN GRAY [continued]: And then, it triggers these various responses by fact checkers, by media organizations and others picking it up and referring to it, which then, in effect, perpetuates the life of this thing. And studying that, studying how things move around, gives us a very different sort of picture about what fake news is and how important the way in which it's shared is to it as a phenomenon.

  • 34:47

    DR. JONATHAN GRAY [continued]: It isn't just about not being true. It's also about the media ecosystems and infrastructures which make this phenomenon possible. The final thing that we did in the project, which-- again, I had some pictures, you have to imagine in your head-- is thinking about the techno-commercial underpinnings, you might consider, of fake news.

  • 35:08

    DR. JONATHAN GRAY [continued]: So Obama was said to be extremely concerned, in the final days of his being in office, about Macedonian teenagers. And Macedonian teenagers were said to be spreading this material, and making huge amounts of money on their phones. And it's very lucrative. And the line was, we don't actually care what it is that we're sharing.

  • 35:29

    DR. JONATHAN GRAY [continued]: We've just done some testing with different sorts of groups. And it turns out that Trump groups, apparently, seem to really like this stuff. So we seed it there. And we make some advertising revenue. So what we were doing was investigating those sorts of claims by looking at trackers, or tiny bits of code which are embedded in these different websites, and using that to understand how people were monitoring

  • 35:51

    DR. JONATHAN GRAY [continued]: the flow of these things online in different settings. And we were looking at the difference between mainstream media organizations' use of trackers and the use of trackers on fake news websites, and seeing different sorts of footprints. And by that, you can, then, start to identify these clusters of the media groups,

  • 36:11

    DR. JONATHAN GRAY [continued]: as it were, of how fake news is being shared. The final thing I was going to say was, that is an example, with the lab, of how we're looking at not just the analytical possibilities of big data and data technologies and data infrastructures, but also how they're put to work in different social settings.

  • 36:31

    DR. JONATHAN GRAY [continued]: We're interested in substantive experiments, which involve others outside of research. In this case, we were working with journalists. We worked with BuzzFeed News, The New York Times, NRC, and a number of other media outlets. And of course, as many of you have alluded to, genuinely interdisciplinary work, in this sense, is really hard, because it involves different ways

  • 36:51

    DR. JONATHAN GRAY [continued]: of seeing the world through data and, of course, different sorts of methods, different sorts of priorities. But nevertheless, this is something that we feel is important, if we care about the way in which data is put to work in the world. And it will take hard work. And it will, sure. There's all kinds of policy and institutional work that needs to happen to support this. So the three or four areas that we're looking at next

  • 37:13

    DR. JONATHAN GRAY [continued]: are air pollution, data and cities, data and democracy, and continuing to look at fake news in France and Germany. So if you happen to be interested in any of those topics, then we'd love to hear from you. [APPLAUSE]

  • 37:38

    IAN MULVANY: All right, thank you very much. [Ian Mulvany, Head of Product Innovation, SAGE Publishing] So how many people in the room, here, would associate yourselves with the social sciences? Let's see a show of hands. So there's loads of people in the room. That's great. So thinking back on some of the things we've heard today, we at SAGE, when we think about who gets to look at this data that's being generated, when we get to think about which groups

  • 37:60

    IAN MULVANY [continued]: get to play in this arena, and when we think about who is going to be responsible for building these algorithms that will take control of our lives, at SAGE, we passionately believe that all of you in this room, all of those people who are associated with the social sciences, need to have a seat at that table and need to be involved in those discussions. Even though you might think of us as just being a publisher, what we do is,

  • 38:22

    IAN MULVANY [continued]: we try to build bridges to knowledge. We want to build bridges between the worlds of big data and the social sciences. So what I'm going to talk about today is fairly concrete. I'm going to talk about how we, as a publisher, understood the needs that might exist in this gap, how we picked an area where we could try to effect change

  • 38:42

    IAN MULVANY [continued]: in that gap, and what we've done about that over the last year. So here we go. So SAGE Publishing, we are an independent publisher. We were founded in 1965. And the way we tried to understand the needs gap around big data for the social sciences is, we did a survey of about nine and a half thousand

  • 39:04

    IAN MULVANY [continued]: participants. I have to say, my colleague, Katie, here, was heavily involved in that survey. I came in later. We got a good set of responses. We actually surveyed, I think, about half a million people, and got about nine and a half thousand responses. And then, we crunched that data. We got a lot of results. We did the analysis.

  • 39:25

    IAN MULVANY [continued]: But the results ultimately boiled down to social scientists telling us that they had challenges around access to skills, access to the raw data, access to creating good collaborations, access to good quality software and understanding how to work with that software, and problems of getting credit when working in interdisciplinary domains.

  • 39:47

    IAN MULVANY [continued]: And now, the speakers on our panel, here, are great representatives of people who've been able to get past these challenges. But what we found was that most social scientists are still struggling with one or other of the challenges on the screen. Does this look like something that any of you in the room would recognize as problems that you might have in getting started with big data? Maybe not, OK, so we can keep moving.

  • 40:07

    KATIE METZLER: They nodded.

  • 40:08

    IAN MULVANY: OK, we got some nods. That's good. All right, those are our substantive problems, as we understand them, that social scientists have in getting started with working with this kind of material. But what are we going to do about that? So a little over a year ago, we decided to create, within SAGE, a small innovation incubator

  • 40:30

    IAN MULVANY [continued]: to take this information on board and try to come up with ideas about how we could help in this domain. We came up with a mission. Our mission is "to improve social science by giving every researcher the skills and tools that they need to work effectively with big data and new technology," essentially, the skills and tools that all of you need to work with big data and new technology.

  • 40:52

    IAN MULVANY [continued]: We decided not to tackle the getting-credit part within our incubator group, because we thought that that was something that was being dealt with well by other areas of SAGE's publishing business. So that focused us on these other four key areas of need: skills, data, collaboration, and software. And it was a very interesting process to go through.

  • 41:13

    IAN MULVANY [continued]: We adopted methodologies from Lean Startup and from Lean Product Development, taking leaves out of the playbooks of the companies from Silicon Valley who are actually building this kind of new, big data world that we're coming to live in. These are some of the techniques that we used to try and understand some of our product ideas and concept ideas.

  • 41:34

    IAN MULVANY [continued]: But in effect, what we were doing is, we were running little mini-micro experiments. And those experiments were trying to help us invalidate our ideas as quickly as possible, so we could move on to try and find an idea that would actually be something we could do in a reasonable amount of time, that could provide value for the community. And a lot of this work was just done using Post-it notes, people in a room,

  • 41:54

    IAN MULVANY [continued]: talking to a lot of academics, talking to a lot of social scientists. Eventually, we came up with the first viable idea. And when I say eventually, it was within about a four or five week period of time, which was to target trying to create an area where we could produce skills and give researchers the ability to learn some of these skills themselves, through creating an online set of courses.

  • 42:17

    IAN MULVANY [continued]: We did a little throwaway test by setting up a landing page. At this stage, there was no product behind this. This was just a landing page where we tested the hypothesis of whether people like yourselves would be interested in learning these skills for various price points. And we got quite a bit of traction. We discovered that people would be quite interested in doing this. So we started in March of this year.

  • 42:40

    IAN MULVANY [continued]: And between March and September, we went out and built an online learning platform for social scientists to get some of these skills. These are the courses that we ran. We found that people were interested in learning things like Python, R, visualization, data science methods, and fundamentals of quantitative text analysis.

  • 42:60

    IAN MULVANY [continued]: So that's what we did. The courses, the first set of courses, are in the process of wrapping up. We've gotten really good feedback from the community. But I'll just step back a bit. The reason we've done this is because we're trying to put our weight behind initiatives that can help you as a community get the skills that you

  • 43:21

    IAN MULVANY [continued]: need to work with the kinds of substantive questions that the people on the panel today have been talking about. And this has just been our first initiative. This is something that we've been able to build and put together in about seven months. But when we go back to thinking about the landscape of other, wider needs that people in this community have to be able to work with this kind of materiality and this kind of information, there are dozens of other ideas

  • 43:43

    IAN MULVANY [continued]: that are out there. And if any of you have ideas that you think you are passionate about seeing something like SAGE get involved with, please come up and talk to me afterwards, because we want to continue iterating on our thinking and iterating on the kinds of things that we can offer to you as a community. There are other challenges, challenges around ethics, challenges around industry capturing research.

  • 44:05

    IAN MULVANY [continued]: We're seeing that a lot of people who have these kinds of skills are migrating into industry. Access to data is one that we haven't been able to crack. Questions around co-creation of tools, rather than adoption of tools: are people just picking up tools from other disciplines, because they seem to be easy to use, to work with data, or is it better for you, as a community, to build your own tools, with the understanding

  • 44:26

    IAN MULVANY [continued]: of the theories and grounding of how you want to work with that data in the first place? Also, recreating the wheel: trying to get that balance between taking where you can and inventing where you need to. We created a white paper. We published a white paper on some of our findings. This is getting to that white paper. And those were the things that I wanted to tell you about,

  • 44:48

    IAN MULVANY [continued]: thank you very much. [APPLAUSE]

  • 44:52

    KATIE METZLER: All right, OK, so now is the fun part, where everyone gets to grill the panel. So there should be a microphone coming around. Does anyone want to start us off with a question? Otherwise, I will. All right, while you're all thinking of your excellent questions, I will ask the first one.

  • 45:14

    KATIE METZLER [continued]: So, actually, my first question is back to the introduction that I gave, around how concerned we should be versus how hopeful we should feel. And that's, actually, to each of the panel members. I'd love to hear a little bit more from you about how you read all of these news stories that we're seeing all over the place,

  • 45:35

    KATIE METZLER [continued]: and as people who are, actually, working in the field. [How concerned, or hopeful, should we be for the future, given the number of recent news stories surrounding big data?]

  • 45:45

    DR. MARIA FASLI: So I think you need to maintain a balanced view. Unfortunately, being in the discipline, and understanding how some of these methods work, can make you paranoid. And if you do, indeed, read the Facebook terms and conditions, and they do run into hundreds of pages, you will realize that you are signing quite a lot away.

  • 46:08

    DR. MARIA FASLI [continued]: So you need to be careful when you are uploading photos. But you need to maintain a balanced view on this. You can use data for good. And you can use data-- and I'm talking about using analytics methods and data-- for bad purposes, or you may cause damage, unintentionally.

  • 46:29

    DR. MARIA FASLI [continued]: That's why I said that we need to work together. Sometimes, when you read the news, it can be quite scary. But we have been through other industrial revolutions in the past, when things looked as if they were going to go down a route where people didn't necessarily understand

  • 46:52

    DR. MARIA FASLI [continued]: what was going to happen next. And we've managed to pull through. And if we are able to put together our collective efforts, if you like-- and I'm talking about the disciplines coming together. That's why I was so insistent on this being interdisciplinary. It's not about just developing the methods.

  • 47:13

    DR. MARIA FASLI [continued]: Computer scientists, developers, need to understand there are going to be implications of their methods being used. And if we start talking about other methods, like artificial intelligence methods, for instance, and methods that can profile you, you can understand why people are getting, perhaps

  • 47:33

    DR. MARIA FASLI [continued]: understandably at this moment in time, concerned.But we need to have a balanced view.And we need to have the social sciences perspective,because all of these technological changes, and allof these analytics methods, they do have an impact on our lives.And we do need the policymakers and the legislators

  • 47:55

    DR. MARIA FASLI [continued]: to work together to address this really big problem that we are going to have, because technology is not going to stop. We're not going to stop developing these methods. But we need to make sure that individual rights are protected, and we use these methods for the benefit of society, and for the benefit of the individual.

  • 48:16

    KATIE METZLER: Anyone else want to answer that one?

  • 48:18

    DR. JONATHAN GRAY: Sure. I have this-- I don't know whether it's good or bad-- but I have this completely unshakable and profound optimism, but in the really, really far future. So it's kind of hard to feel optimistic about what's going

  • 48:39

    DR. JONATHAN GRAY [continued]: to happen imminently, right? But I think that if you look far, far, far in the future, maybe things aren't, necessarily, so bad. But I guess, what really interests me at the moment is, actually, in terms of-- so a lot of my research at the moment is not just about using these different sorts of methods, but also about studying how different sorts of data and methods

  • 49:01

    DR. JONATHAN GRAY [continued]: are put to work in the world in different ways. And the thing that interests me a lot at the moment is how do you go beyond the imaginaries, because there's a lot about this. This is not just to do with the infrastructure, not to do with the practices, but how we imagine data to play a role in our societies. There's huge things that this draws on in the world, from modernism and democracy and science

  • 49:23

    DR. JONATHAN GRAY [continued]: and all sorts of other things that we imagine are part of the world and play a particular role in the world. But I think the thing that interests me at the moment is to go beyond the imaginaries we have about transparency, on the one hand, and privacy, on the other hand. And this applies from everything to algorithms. I don't think that, at the moment, just thinking about algorithms in terms of seeing under the hood-- and this

  • 49:44

    DR. JONATHAN GRAY [continued]: is a point that Kate Crawford and Mike Ananny have made very eloquently, recently, which is, we are, actually, part of these systems. It's not as though we can just look inside what's happening at a number of technology companies and have a better understanding of what's going on. We are entangled with them in very fundamental ways. So the way that Google search works, the way that these algorithms are optimized,

  • 50:05

    DR. JONATHAN GRAY [continued]: responds to the social life around them in ways which mean that, when you talk about an algorithm being racist, it's not just what happens at a code level inside the technology company. There's all kinds of other things that are going on there, which I think it's very important to unpack, which leads me to the second point, which is that this isn't just about who has access to what and when. It's also about how these systems are configured,

  • 50:27

    DR. JONATHAN GRAY [continued]: which leads you to very important social, political, and ethical questions about what is measured and how, which is something that anyone who's a social scientist and familiar with the history of methods will be deeply aware of, which is to do with how you configure these apparatuses for knowing the world in different ways, and who shapes them, in a much more substantive way than who

  • 50:51

    DR. JONATHAN GRAY [continued]: just gets access to their fruits. So I think I'll leave it there.

  • 50:57

    DR. SLAVA MIKHAYLOV: Just a couple of points. I'm also optimistic, but in a not-that-far future. I think it's much shorter term. We're not really that far off from a lot of things working at a level that is comfortable for us to use. So Facebook terms and conditions are impenetrable.

  • 51:20

    DR. SLAVA MIKHAYLOV [continued]: And I don't know, maybe Maria is the only person who read them.

  • 51:23

    DR. MARIA FASLI: My husband, actually, not me.

  • 51:25

    DR. SLAVA MIKHAYLOV: That's absolutely fine. But lawmakers and policy makers are catching up with the governance. If you think about it, the third wave of the AI revolution roughly started in 2006, 2007; it is really recent that this abundance of data is pushing the boundaries. And it is developing. But in the meantime, we did come up with a lot of regulation,

  • 51:48

    DR. SLAVA MIKHAYLOV [continued]: already. And May 2018, GDPR, the new data protection regulation, is coming in; more and more things are being put in place. And as a user, as a person who is trying to work with public data-- and this is probably one of the answers to the questionnaire for social scientists-- is access to data.

  • 52:09

    DR. SLAVA MIKHAYLOV [continued]: It is really difficult to gain access to data from the public sector. It is probably easier if you work for a private company and you get access to data within products, so it's probably easier for Google to get access to Google data. But it's almost impossible to get access to public data, because of all the regulations in place. Risk aversion, also-- nobody wants

  • 52:30

    DR. SLAVA MIKHAYLOV [continued]: to be on the headlines of the Daily Mail. But it's part of life, now, that everybody understands the drawbacks. And partly, it is this self-regulating thing that is also at play. So we have bad examples. Some people would think of the bad example of Cambridge Analytica. Some people would really enjoy the fruits of Cambridge Analytica's work. But we have things like Cambridge Analytica.

  • 52:51

    DR. SLAVA MIKHAYLOV [continued]: But we also have GDPR. We have data protection. And we have a lot of governance mechanisms that are already in place. So I'm much more optimistic, and for a much shorter term.

  • 53:02

    DR. JONATHAN GRAY: OK, good.

  • 53:04

    IAN MULVANY: I'm also an optimist. If I think back to my life-- I was born in 1974. The internet didn't exist. The ability for the web to connect people, to create micro-communities, is something that has just transformed my experience of the world. And the world itself, I think, when you look at it at very broad levels, has many indicators that are improving.

  • 53:26

    IAN MULVANY [continued]: Child literacy is increasing. Child mortality is going down. The number of wars is going down. And those are things that we tend to neglect to look at when we think about these questions about existential risk, because we're always thinking about the really flashy, exciting, risky questions. And when you talk about big data and AI, the things that become manifest are: will AIs become self-aware and eat humans, or turn us

  • 53:49

    IAN MULVANY [continued]: all into paperclips? And I just don't-- I don't buy that in any way. I do see regulation increasing. And one of the other things that I think I see happening over the next five to six years is that many of these techniques, which seem a little alien to us today, will become completely commoditized and completely normalized in our life, and very comfortable

  • 54:11

    IAN MULVANY [continued]: for us to work with. And with that comfort will come expectations around extra regulation. So I'm definitely on the optimistic side. On the other hand, every now and again, a story does come along which makes you go, ooh, that is a little bit scary. But I think it's important to keep those in context. [If there was one question you could put to Facebook that would enhance your own research, or that would be good for academic research in general, what would it be?]

  • 54:35

    DR. JONATHAN GRAY: Do you have a plan for making some of the data that you possess available to researchers? And what is the governance mechanism that you would propose for doing that in a way which means that there can be some sort of access to, but also oversight of, the way in which that data is

  • 54:57

    DR. JONATHAN GRAY [continued]: being put to work? I think that's the single thing that I'll ask. [INTERPOSING VOICES]

  • 55:03

    DR. SLAVA MIKHAYLOV: So if you're swinging by any of the other technological companies along the way, when you're talking to Facebook, the same question can be-- exactly the same thing can be asked of any one of them.

  • 55:13

    AUDIENCE: --things that Facebook could do to [INAUDIBLE] focusing--

  • 55:16


  • 55:17

    DR. JONATHAN GRAY: I have a second.

  • 55:18

    AUDIENCE: You have a second one?

  • 55:19

    DR. JONATHAN GRAY: Yeah, yeah. I was just going to say that the thing that, actually, was really fascinating was, one of the big approaches to fake news-- and this is just about fake news-- was that material was starting to disappear. And that, actually, makes it quite difficult for researchers to understand what material is being produced and how it's circulating. It is also a byproduct of their approach, of these flagging mechanisms.

  • 55:40

    DR. JONATHAN GRAY [continued]: So one of the things that, I think, many of us who worked on this project have proposed is, would they consider something like an archive, yeah, like an archive of fake news, as ridiculous as it sounds? That's just one case, but I think it would be tremendously valuable.

  • 55:56

    DR. SLAVA MIKHAYLOV: So if Facebook can fix it, that will be trailblazing for all the other companies. And for social science researchers, that's one thing that-- it often takes one. And, hopefully, it could be Facebook. It takes one and then, with a proper governance mechanism, as an example, it could work. If it works, then the other companies will also follow.

  • 56:18


  • 56:19

    DR. SLAVA MIKHAYLOV: The data access issue, the governance, the data access-- so I think it boils down to one question, yeah. I think it's the big question for us, yeah.

  • 56:27

    DR. MARIA FASLI: Yeah, I would second that. It's the data access, because the wealth of data, the richness of the data that you would have there, from a social science, as well as a computational, point of view, would be absolutely incredible to work with. So if that could be arranged, I think

  • 56:47

    DR. MARIA FASLI [continued]: you would have immense gratitude from a really big community of researchers. [Is Facebook the eternal company that will go on forever, and if not, what is going to bring them down?]

  • 57:02

    DR. MARIA FASLI: It would be fascinating to be able to use data from Facebook to create personal histories of people, and in the years to come, see the evolution of society, of individuals, of institutions. If they were to go on for a very long time, it would be amazing, probably not in my lifetime.

  • 57:23

    DR. MARIA FASLI [continued]: But you never know. Maybe AI is going to come up with some kind of chip that can be put in my brain. And I can keep going on for a while.

  • 57:33

    KATIE METZLER: Are there any other questions? Yeah, great. [What are the individual's rights in relation to data?]

  • 57:44

    KATIE METZLER: I think it's a great question to ask right now, though I have a very strong view on this, which is that it has to be about risk and reward. It has to be about benefits and costs. And I think one of the reasons why SAGE is so keen to support this field of computational social science is because, until we can see lots

  • 58:05

    KATIE METZLER [continued]: of really great examples of how data, our data, is being used to improve outcomes, rather than just to make companies money, I think we are right to be really worried about giving our data away. It turns out, we aren't as worried, maybe, as we even should be. We tick lots of terms and conditions. And then we're upset afterwards, rather than thinking that we'll change our behavior

  • 58:28

    KATIE METZLER [continued]: and, perhaps, not sign up for a Facebook account. But if we all, instead of being afraid of giving our data away, had lots and lots of examples of how, by doing so, we had been able to improve the world, cure diseases, increase benefits for society,

  • 58:52

    KATIE METZLER [continued]: I think that that would be quite a different question. So for me, the question has to be about how we can balance that cost and benefit. And I, actually, think it's very important for social scientists to be actively sharing examples of times and ways in which data is being used for public good,

  • 59:13

    KATIE METZLER [continued]: because I think it will help us overcome those concerns about data sharing.

  • 59:20

    DR. SLAVA MIKHAYLOV: Just wanted to say, so individual rights are absolutely important. But we shouldn't really be bashing Facebook, especially since you guys are going to Facebook quite soon. There are absolutely amazing things done using Facebook as data for public good. So I was recently at a presentation from Bangladesh, from the equivalent of the cabinet office in Bangladesh.

  • 59:43

    DR. SLAVA MIKHAYLOV [continued]: And there, they used Facebook as this mechanism for people in rural areas to access public services-- in a way that it's an easy way to complain about the absence of public services, I would rather say. And things get done. There's feedback. And it's a short feedback mechanism,

  • 01:00:05

    DR. SLAVA MIKHAYLOV [continued]: because it cuts short the usual process of writing a complaint, waiting for it to be-- a letter, waiting for it to be delivered, waiting for it to be reviewed. Now, they have this, the equivalent of the cabinet office, in every province and every district. And citizens can directly refer. And everybody, the officials there, feel accountable.

  • 01:00:25

    DR. SLAVA MIKHAYLOV [continued]: And that was a dramatic change. So talking to people from Bangladesh, they felt that it's a dramatic change. And this was not directly thanks to Facebook, but using Facebook as an infrastructure to deliver public goods, and using data for public good.

  • 01:00:42

    DR. JONATHAN GRAY: I think the way that I would probably address this is to suggest that, I think, a lot of the most common policy responses-- and many of the research and advocacy responses-- focus on different sorts of rights and interests of transparency and privacy. There's a strong digital rights movement, which, I think, is very important in the context of this area.

  • 01:01:06

    DR. JONATHAN GRAY [continued]: But I also think that one of the points that we make a lot with the lab is that we also need what we describe as a kind of infrastructural imagination, which is a very silly phrase. But it's the idea of thinking about what these infrastructures entail, which is that these aren't just technical systems. These are systems which involve many people,

  • 01:01:27

    DR. JONATHAN GRAY [continued]: many different sorts of technologies, many different sorts of standards, many different sorts of committees. Some of these things are internal to the technology companies we talk about, some not, different sorts of conventions and protocols. There's a whole range of different things which are involved. And I think, also, for my taste, as someone who's interested in the history of political thought, the idea

  • 01:01:49

    DR. JONATHAN GRAY [continued]: that something can only have a rights-based solution seems quite optimistic. There are a range of different sorts of policy responses, political responses, but also different sorts of public debate and democratic engagement that are required over a long period of time, to shape these different systems to do different things in society.

  • 01:02:09

    DR. JONATHAN GRAY [continued]: I think it's interesting. A lot of the debates that we're having so far have been very much about technology companies, which is true on the one hand, but also, kind of stepping back and seeing the continuity here, rather than just the novelty. People have been generating, maybe not big data, but quite large-scale data sets about things like climate change or migration or many other things for a long time.

  • 01:02:29

    DR. JONATHAN GRAY [continued]: And I think developing the capacity to have imagination about how we shape these things is going to be really important, which will involve a range of different-- that's not very helpful, but I think, basically, that would be my response: we need to think about not just doing research with data, but also about the way in which data is created in our society.

  • 01:02:51

    IAN MULVANY: It's everything all these people said, but there was one aspect of your comment that I want to just think about a little bit. So you said this data that we have, that we've created, is incredibly valuable. Actually, it's only marginally valuable. For any single individual, for Facebook, it's worth maybe a couple of dollars, maybe a couple of tens of dollars per individual per year.

  • 01:03:11

    IAN MULVANY [continued]: And so, it only becomes incredibly valuable at vast, aggregated scales. And there are a couple of really interesting byproducts of that, I think. One of these byproducts is that most of the data that you produce in your digital footprints is really just sort of exhaust. It's lots of junk, which is really hard to interpret. You've got lots of data, very little insight.

  • 01:03:32

    IAN MULVANY [continued]: And the mechanisms, mechanics, and infrastructure you need to be able to collate that, aggregate it, and collect it into one place, and make any insight from it whatsoever, require the creation and invention of entirely new tooling and methodologies for dealing with data at scale and at pace. And so, I'll just put that idea there.

  • 01:03:53

    IAN MULVANY [continued]: An area where it would clearly be advantageous if we could get our act together around looking at data in the public sphere would be if we could reasonably connect the health data and health profiles of all of the individuals in the NHS. And the government has wasted billions of pounds in trying to create an integrated IT system, which has failed them.

  • 01:04:14

    IAN MULVANY [continued]: And so, I think the pressure from these commercial companies, to be able to extract this minimal value at scale from individuals, has led to systems which may, potentially, be able to revolutionize how things are done in the public sector, to create services that could actually benefit us. So it's kind of a byproduct effect.

  • 01:04:35

    IAN MULVANY [continued]: And we're beginning to see the emergence of interesting pieces of work, like ESTA in London starting to use data science to identify delinquent landlords. But the tooling and methodologies have had to be derived, because our data is only approximately, marginally valuable, and only gains value at massive scale. So yeah, there are lots of questions about risk and reward.

  • 01:04:57

    IAN MULVANY [continued]: How valuable is our data? How much privacy do we need to have control of? How do we then take advantage of the benefits of the systems that have emerged to drive the profits of these companies, but take those systems and put them into good use in the public domain? I think there are lots of interesting questions here. [If the world of big data could gain something from the social sciences, what would it be?]

  • 01:05:18

    KATIE METZLER: One piece of evidence is how many social scientists these companies are hiring. So there are a lot of social scientists with computational skills who are being hired by Facebook and Google and Twitter and Microsoft, which says to me that they do value and need these skills, because ultimately, the data that they are working with

  • 01:05:39

    KATIE METZLER [continued]: is human data. And they want to understand what it tells us about the humans creating the data, and the humans using their products. So that is one example: there is certainly a bit of a brain drain going on, with academics being poached by technology companies, which suggests

  • 01:06:01

    KATIE METZLER [continued]: that they want the skills.

  • 01:06:05

    DR. SLAVA MIKHAYLOV: I think domain expertise-- so with social science being one of the domains, applications to society-- domain expertise will always be important. And if you think about data science as this intersection of computer science, analytical skills, statistics, and domain expertise, that's where social science comes in. And it's a really completely different thing

  • 01:06:27

    DR. SLAVA MIKHAYLOV [continued]: to get a computer scientist who does a lot of machine learning. And they will build absolutely amazing algorithms. It's going to work absolutely fine. But if you want to apply it to a specific problem in society, sooner or later they will have to bring in a domain expert. And that's where social scientists have-- that's where we have value, specifically.

  • 01:06:47

    DR. SLAVA MIKHAYLOV [continued]: So that's how I ended up in the Institute for Analytics and Data Science, because they needed a domain expert on the public sector and the application of data science in the public sector. And it's difficult to get a computer scientist involved in that, purely on that, because you need to understand the workings: how public policy is being done, how the public administration, public management,

  • 01:07:09

    DR. SLAVA MIKHAYLOV [continued]: is being done, if you want to apply that. And I think that's much more general. So domain expertise, and I think that's where social science comes in.

  • 01:07:20

    DR. JONATHAN GRAY: Yes, is the-- [LAUGHS] I think. But there are a couple of things, I guess. The first is, again, understanding the social life of data infrastructures and data practices and data technologies and what they do. There are all kinds of things that can be done, from ethnography

  • 01:07:41

    DR. JONATHAN GRAY [continued]: and even very qualitative approaches, in order to understand how people are making meaning with data and how data is being put to work in many different contexts. Which brings me onto my second point: the huge continuities that we have to always remember, even though the sorts of particular infrastructures and tools might be computationally more advanced.

  • 01:08:06

    DR. JONATHAN GRAY [continued]: Many of the same concepts and metaphors and ways of thinking about social life are-- there are some interesting continuities. So whether it's to do with the influence of graph theory or networks, you can go back to Jacob Moreno and huge traditions of the social sciences, which have gone through physics and biology

  • 01:08:27

    DR. JONATHAN GRAY [continued]: and then come back again, to all other kinds of areas of understanding social life. But also, understanding concepts: so many of these things are so heavily dependent on-- it's not just-- this isn't just technological or computational knowledge which is required to configure these different things and put them to work in these different domains

  • 01:08:48

    DR. JONATHAN GRAY [continued]: that we've been talking about today. There's also an understanding of, conceptually, what's happening, which is hugely important, and which borrows heavily from many areas of the social sciences. So I think my answer would be hugely yes. I mean, yeah.

  • 01:09:04

    IAN MULVANY: So if you think about some of these AI and machine learning algorithms, they've been presented here as these great tools and techniques that can help you drive insight. But the dirty little secret of a lot of these algorithms is that they are hyper fine-tuned. And they require a lot of nudging. And there isn't a science to it.

  • 01:09:24

    IAN MULVANY [continued]: It's sort of an art. And so, to get to a point where you get it to work and become a predictive tool for you, never an explanatory tool, but even just a predictive tool, you often get into a situation where you're not quite sure what's going on. And to get past some of that, you need some insight into the domain knowledge, as these people are saying. But one of the other things that's happening is, I think,

  • 01:09:45

    IAN MULVANY [continued]: there's still an issue of people from physics backgrounds and from computer science backgrounds taking the toolkit that they learned in their domains, applying it to social data sets, and beginning to think that they're making headway, and having a great time, but really just redoing really basic work that the social sciences did 40 years ago, and wasting a lot of time.

  • 01:10:07

    IAN MULVANY [continued]: And so, I think there's absolutely a need to create a kind of marketplace, where these people can have a lot of their efforts short-circuited, so they can get onto the really interesting questions. However, right now, today, such a marketplace does not exist. It's happening in these more ad hoc networks, or through people going into these big companies. And I think we really have to work as a community toward creating

  • 01:10:29

    IAN MULVANY [continued]: a marketplace like that, to help advance the whole field and begin to make better results out of these techniques for society. I think it's really, critically important.

  • 01:10:41

    DR. MARIA FASLI: Absolutely yes, social scientists are needed, in fact desperately, because you can have your computer scientists and your statisticians coming up with new techniques and methods for processing data and extracting insight, as we said. But actually, the social scientists are the ones that are going to help the computer

  • 01:11:04

    DR. MARIA FASLI [continued]: scientists and the statisticians and the developers ask the really interesting questions, because of the way social scientists are being trained to look at data that have to do with people, that have to do with society. And computer scientists, your statisticians, we're

  • 01:11:24

    DR. MARIA FASLI [continued]: not being trained.And I'm talking from experience.I'm a computer scientist by background.So my brain has been trained in a particular way.And I will ask questions around a data set.But when I work with a social scientist,because their perspective is very different to mine,that's when you get to ask the really interesting questions.

  • 01:11:48

    DR. MARIA FASLI [continued]: And then who develops the methods, we can talk about that later. But it's this intersection that is needed. And unless you have the social scientist there, I don't think you're going to be able to get the maximum "value"-- and I'm putting value in quotes-- out of the data. So absolutely, social scientists have an absolutely key role

  • 01:12:09

    DR. MARIA FASLI [continued]: to play.

  • 01:12:10

    KATIE METZLER: Great. OK, so I noticed that wasn't much of a debate at the end, because we are all optimists. And we all agree on social science being a really important part of this. So I apologize if there wasn't enough argument on the panel for you tonight. But I'd just like to thank you all for joining us. And if we can give everyone a round of applause, thank you.

  • 01:12:32



Four panelists talk through examples of how big data is being used for social good, including the challenges facing academics who are striving to put big data to better use in ways that reduce inequality and improve outcomes for society.
