SAGE
  • 00:05

    [Studying Abuse Perception on Facebook Using Android App]

  • 00:09

    SAJEDUL TALUKDER: My name is Sajedul Talukder. [Sajedul Talukder, PhD Candidate, Florida International University] I'm a fourth-year PhD student at Florida International University, and I'm working in the cyber security and privacy research lab there. So, basically, my research is on social networks. We do research on Facebook and similar kinds

  • 00:30

    SAJEDUL TALUKDER [continued]: of social networks. My research is, basically, to detect abuse in online social networks: how abuse occurs among friends, whether there is any way to automatically detect it, and how you can prevent abuse from friends. Our lab mainly works on consumer privacy,

  • 00:53

    SAJEDUL TALUKDER [continued]: so privacy was the first concern. We found that Facebook is the most popular social network in the US and almost all over the world, so people are more willing to have their social network on Facebook rather than on some other social network. And mostly the younger population--

  • 01:15

    SAJEDUL TALUKDER [continued]: they have a tendency to share their information, their photos, and everything on Facebook. And we found that they're not doing it safely. They're sharing everything, not knowing that these things are being shared. And these things can be used against them in the future--people can spy on them,

  • 01:36

    SAJEDUL TALUKDER [continued]: can get the information very easily just by being a friend on Facebook. So we felt that it was very necessary to find a solution to this problem: can we detect this abuse online automatically, and can we build some solution, so that the people who

  • 01:58

    SAJEDUL TALUKDER [continued]: are not aware of these things--maybe some proportion of the population they're awareand they don't share these things, but most of themare not ever and we blindly believe the friends--so this is a significant problem and thatwas our main motivation to work on the abuse

  • 02:18

    SAJEDUL TALUKDER [continued]: direction in Facebook.[How did you collect your data?]It was difficult because in orderto get some Facebook data, you needto have the user study with the real participants.So it's not other kind of research

  • 02:39

    SAJEDUL TALUKDER [continued]: where you get some data set from onlineand you would do the research--you have a big data set, you do some research--so we need to deal with the real people here.And our goal was to collect the data set from the--across the globe.So we could have easily done that in our university,recruited participants from the University students,

  • 03:01

    SAJEDUL TALUKDER [continued]: but that would make a bias of the populationthat we are doing.So our goal was to collect the data set from that world,so we used some crowdsourcing websitesand we posted our jobs in crowdsourcing websites.We paid its participants for taking our interviewsand taking our users studies.

  • 03:22

    SAJEDUL TALUKDER [continued]: And in order to facilitate them to take the user study,we have built Android apps.We uploaded it in Play Store, so thatare people that are taking the userstudy from other countries-- they can download itfrom the Play Store and they can follow the instructionson the app.And the app is going to tell them what to do,

  • 03:43

    SAJEDUL TALUKDER [continued]: what to answer, and what kind of things they'd need to do.[How did you select participants?]There were several because it's very difficultto filter the participants when you are notseeing that person online--in person.

  • 04:04

    SAJEDUL TALUKDER [continued]: So you're-- everything is being done in virtual world.So you don't know who's there doing the userstudy behind the computer.He might be lying.He might be doing some garbage thing.He might be just rushing over the questions to get the money.So we needed to have a lot of filtersto filter out all those bad participants.

  • 04:28

    SAJEDUL TALUKDER [continued]: So initially we had this primary filteringwhere the participants needed to be at least 18 years old.We checked through the birthday in Facebook.And we also asked them specifically how old are they,and they needed to have an well-established Facebookaccount.You cannot just make an account and then participate

  • 04:50

    SAJEDUL TALUKDER [continued]: in the study.And then, you need to have access to the Android devicefor sure, because, otherwise, you cannot take the app.And then we had some attention check questionsat the beginning of that to checkwhether the participants really understand the instructionsand they are actually paying their attention.

  • 05:12

    SAJEDUL TALUKDER [continued]: So if somebody failed at the beginning screening steps,they were refused to participate in the study.And in the study we selected friends from their friends listto ask questions about those friends.And we also created some bogus friends--fake profiles.

  • 05:32

    SAJEDUL TALUKDER [continued]: And we also mixed these fake profiles togetherwith the real ones and observe howthe user is answering the questions for the fake ones.Do they really identify this person is nottheir real friend?And if they were in automated mode--they're just trying to answer everything--

  • 05:53

    SAJEDUL TALUKDER [continued]: they would fail in that check.So that was one of the other checks.And then, we had the timing checkswhere we recorded the time it takesto click on each of the events and to check how many time ittook to answer the questions.And if they were really going fast,we have discarded the data.

  • 06:13

    SAJEDUL TALUKDER [continued]: Because we know that they were not paying attention.So this were the primary filtering things that we did.[How did you collect data from the app?]I already said we had an Android app,we named the app as AbuSniff.So when the user participated in the study,

  • 06:35

    SAJEDUL TALUKDER [continued]: the app would store the data from the responsesin the local storage of the app and then send itover the server to our lab.So the cyber storage was pretty secured.So there was no way somebody could hack and getthe information out there.And we also conducted the study according to the IRB approval.

  • 06:56

    SAJEDUL TALUKDER [continued]: So there was no way that the personally identifyinginformation of the users who are participating the studycould be revealed.So we are pretty cautious on that thing.And at the initial phases we neededto send the data to the server because wehad to do the machine learning and supervised learning

  • 07:19

    SAJEDUL TALUKDER [continued]: based on these data sets.And then we actually built the model based on the datasets that we collected and embedded this model in the app,so there was no need to send the data about the serverbecause it could be-- if you send the data over the serversomebody going over the internet is not safe.So then we prevented this thing in the later studies.

  • 07:41

    SAJEDUL TALUKDER [continued]: So the app would, itself, analyze the data,then show the things, and at the end,send somebody to the server.[How did you select the right machine learning algorithmto pair with the app?]For the machine learning-- we have tested several machinelearning algorithms.So we tested mostly 10 to 12 machine learning algorithms.

  • 08:04

    SAJEDUL TALUKDER [continued]: And the important thing were the decision trees, the supportvector machines, the random forest, the park,multi-class classifier, the net bias,and all those classifiers.And the software that we used for the machine learningis Weka.So it's an open-source software, so you

  • 08:26

    SAJEDUL TALUKDER [continued]: can use-- anybody can use that software,anybody can customize the software based on their needs.So based on that data set that we collected,we identified several mutual activity featuresfrom the participant and their Facebook friends.And we fitted those data in the Weka classifiers

  • 08:47

    SAJEDUL TALUKDER [continued]: for that training purpose.So for the training we built the model after the trainingand then tested the test data sets on the models and then--the outcome was pretty impressive.So we tested all the machine learning algorithmsand then choose the best performing classifierfor each of the predictions.

  • 09:08

    SAJEDUL TALUKDER [continued]: [How did you build the selected model into the app?]Weka has some library fund that youcan embed the model in the Android itself.So when we built the model, we saved the modelin the computer.And when we built the app again, we actually

  • 09:29

    SAJEDUL TALUKDER [continued]: embedded the model inside the app.So there was no need to communicate outside again.So it's the standard machine learning thing.You collect the data set and you identify your classes.

  • 09:52

    SAJEDUL TALUKDER [continued]: If the classes are not balanced, if the data set is imbalanced--so you balance your data set.There are several methods to balance the data set,so you can use anything.And if there is any missing informationin the data set or some bad data--corrupted data set-- you can justfilter out all those things.

  • 10:12

    SAJEDUL TALUKDER [continued]: Give the good data sets--balance all the data sets, and thentrain your classifiers on those.[What were the results of the study?]Basically, our study was pretty impressive.So, in the initial studies, we tried to find--

  • 10:35

    SAJEDUL TALUKDER [continued]: is the abuse in Facebook really a problem in Facebook?We didn't know what is the people's perception of abusein Facebook.So, basically, the first studies were to identify the abuse.And we found that--in the initial studies we had 80 participants

  • 10:56

    SAJEDUL TALUKDER [continued]: and 71 out of 80 participants hada friend whom they believed is either a stranger-- soby stranger we mean the person hasno interaction with that friend in Facebook or in real life.So one-- 71 out of 80 was either a stranger, or a Facebook

  • 11:18

    SAJEDUL TALUKDER [continued]: timeline abuser, or a news feed abuser.So timeline abuser means the friendis posting some abusive material in your timeline.So in your timeline, you have the photos,you have the status updates, you have the check-ins.So the person is putting some bad things there.And the news feed abuser means the person

  • 11:40

    SAJEDUL TALUKDER [continued]: itself is posting some fake news, some malicious link,some malware, and that is being spread in your news feed.And you can accidentally click and thenyou can be attacked by the malware.Or, you can be attacked by the fake news spam.

  • 12:05

    SAJEDUL TALUKDER [continued]: Initially, our approach was pretty much strict,so we took the hard approach.So we identified the abuse and then suggested actionsfor those abusive friends.So the actions ranged from unfriending the friendto the unfollowing.So we suggested to unfollow, to restrict access,

  • 12:28

    SAJEDUL TALUKDER [continued]: or to unfriend.And we found that when we presented them to unfollow,they mostly agreed.When we to-- tell them to restrict the access,they agreed.But, in terms of unfriending, they were confused.Because if they unfriend a friend,

  • 12:50

    SAJEDUL TALUKDER [continued]: they would lose the connection.So even if the friend is abusive,there might be a lot of reasons why you shouldkeep that friend in Facebook.He could be your boss in your office,he could be your colleague, he could be your, maybe,childhood friend.So you need to have that person in Facebook,but you might be the victim of the abuse from that person.

  • 13:15

    SAJEDUL TALUKDER [continued]: And then we introduced a novel action that we call sandboxing.So by sandboxing we mean that the person will stillbe in your friend list, but he will have no communicationwith you in Facebook.So whatever you do in Facebook, youwill not be able to see them.And whatever they do will not be in your news feed.So they will have no way to identify

  • 13:38

    SAJEDUL TALUKDER [continued]: that you have done something with their friendship.So they will still see that you are still his friend,but there is no communication.So this is the sandboxing.And then we offer sandboxing instead of unfriending,and then the result was amazing.People are willing to sandbox rather than unfriending.

  • 14:04

    SAJEDUL TALUKDER [continued]: I feel like most of the things we did was impressive.So I don't feel we did-- we have to do anything if Iam given the chance to revert.But one of the thing was it would be glad--it would be better if we could have done more userstudies with more data sets.

  • 14:25

    SAJEDUL TALUKDER [continued]: Since we are limited to the crowdsourcing sites,we had no way to recruit more participants more than this.And then, since we had to ask the questions for each friend--so we could ask only 30 friends per each participant.And some of the participants had up to 5,000 friends,so we have suggested them for only these 30 friends,

  • 14:49

    SAJEDUL TALUKDER [continued]: so we couldn't see the other list of the 5,000 friends.So if there is any system that can automaticallydo this for all the friends, that would be better.So this is our future direction-- maybe,we will go in that direction.

  • 15:13

    SAJEDUL TALUKDER [continued]: In this future research--we did this for the existing friends, so whoeveris your already friends.And then our next goal is to identify somebodyat the friend request level.So when you get a friend request for someone,you have no way other than seeinghow many mutual friends you have, and the picture,and the name of the--

  • 15:34

    SAJEDUL TALUKDER [continued]: and the short-- maybe short affiliation on that person--but if you had some way that the machine can do that detection,it can analyze your profile and that person's profile,cross match everything, and then dothe machine learning to predict whether this person wouldbe abusive after being friend.

  • 15:56

    SAJEDUL TALUKDER [continued]: Then it can show you the warnings--you should friend this guy or you should unfriend--or not friend this guy--that this is our next project that we are working on.I mean, social network is a huge thing, so--

  • 16:19

    SAJEDUL TALUKDER [continued]: I mean, one person cannot do everything.So we have tons of social medias,so we are only doing the things Facebook.So there are several types of abuses in Twitter--other things.So-- I mean, whatever you feel that there should be a systemto help people because people are generally not aware.

  • 16:41

    SAJEDUL TALUKDER [continued]: You know the Cambridge Analytica,where somebody collected a data set from 87 million users,and then they actually changed the user perceptionin an election.So that is pretty bad.So we should have some system that people are--if they're interested in the research,they should try to find some existingproblems in the system, and then try to find a solution.

  • 17:03

    SAJEDUL TALUKDER [continued]: How can they solve this problem?And then, maybe go for the method that--I should go for this method or that method.So identify the problem, then tried to find a solution,then go for the method.[Why is this project so important?]Our product was intended to be automated system that

  • 17:28

    SAJEDUL TALUKDER [continued]: can be a social network assistant for people whoare using Facebook, and who are not aware of their friends,and who are unlikely to know all of their friends.So this tool can be handy things for themto help them identify the bad ones, and keep the good ones,and monitor if somebody is doing bad things.

  • 17:51

    SAJEDUL TALUKDER [continued]: So we wish we could build this toolyou know in a very good way.And people can use this thing and thenit would be good for the whole community like for Facebookand all the other people out there.[Further Reading, Talukdar, S. & Carbunar, B. (2018).

  • 18:12

    SAJEDUL TALUKDER [continued]: "A study of friend abuse perception in Facebook."ACM Transaction on Internet Technology.]

Abstract

Sajedul Talukder, PhD candidate at Florida International University, discusses his research on friend abuse on Facebook, including data collection, participant selection, app creation, algorithm development, data management, results of the research, and advice for those interested in this type of research.
