How researchers use Facebook ‘likes’ to sway your thinking

Algorithms can predict nuances of your political views more accurately than your loved ones can

Kim Kardashian West: researchers figured out how to tie your interest in Kardashian West to certain personality traits. Photograph: Mike Segar/Reuters

Perhaps at some point in the past few years you've told Facebook that you like, say, Kim Kardashian West. When you hit the thumbs-up button on her page, you probably did it because you wanted to see the reality TV star's posts in your news feed. Maybe you realised that marketers could target advertisements to you based on your interest in her.

What you probably missed is that researchers had figured out how to tie your interest in Kardashian West to certain personality traits, such as how extroverted you are (very), how conscientious (more than most) and how open-minded (only somewhat). And when your fondness for Kardashian West is combined with other interests you’ve indicated on Facebook, researchers believe their algorithms can predict the nuances of your political views with better accuracy than your loved ones.

That is what motivated the consulting firm Cambridge Analytica to collect data from more than 50 million Facebook users, without their consent, to build its own behavioural models to target potential voters in various political campaigns. The company has worked for a political action committee started by John Bolton, who served in the George W Bush administration, as well as for Donald Trump's presidential campaign in 2016. "We find your voters and move them to action," the firm boasts on its website.

Sway elections

Cambridge Analytica now says it has destroyed the user data it collected on Facebook. But raw data suggests the information, or copies of it, may still exist. In either case, specific user information was merely a means to an end, a building block in a far more ambitious construction: a behavioural model powerful enough to manipulate people’s activity and, potentially, sway elections.


The firm adapted its approach to personality modelling from studies conducted by researchers at Stanford University and the Psychometrics Centre at the University of Cambridge. The studies relied on data collected by a Facebook app called myPersonality, a 100-question quiz developed by the Psychometrics Centre that assessed a person's openness, conscientiousness, extroversion, agreeableness and neuroticism, traits commonly referred to in the academic community by the acronym Ocean.

Many respondents who took the quiz through the myPersonality app authorised it to gain access to their Facebook profile data, and that of their friends – access that was allowed by the social network at the time. That allowed researchers to cross-reference the results of the quiz – numeric Ocean scores – with the users’ Facebook “likes”, and build a model from the correlations they found between the two. With that model, the researchers could often make precise guesses about subsequent users’ personalities using only a list of their likes, no 100-question quiz necessary.
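How such a model can be built is easy to sketch in outline. The snippet below is a minimal illustration, not the researchers' actual pipeline: it generates a synthetic binary user-by-page "likes" matrix, compresses it with a truncated singular value decomposition, and fits a linear regression from the compressed likes to one Ocean trait. Every name and number in it is invented for illustration.

```python
import numpy as np
from sklearn.decomposition import TruncatedSVD
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-in data: 5,000 users x 2,000 pages; 1 means "liked".
n_users, n_pages = 5000, 2000
likes = (rng.random((n_users, n_pages)) < 0.02).astype(float)

# Synthetic "openness" scores loosely driven by a handful of pages,
# mimicking the like-to-trait correlations mined from real quiz data.
signal_pages = rng.choice(n_pages, size=50, replace=False)
openness = likes[:, signal_pages].sum(axis=1) + rng.normal(0, 1, n_users)

X_train, X_test, y_train, y_test = train_test_split(
    likes, openness, random_state=0)

# Compress the sparse user-like matrix, then regress the trait score
# on the compressed features.
svd = TruncatedSVD(n_components=100, random_state=0)
model = LinearRegression().fit(svd.fit_transform(X_train), y_train)

# Score accuracy as the correlation between predicted and "true" traits.
predicted = model.predict(svd.transform(X_test))
print(np.corrcoef(predicted, y_test)[0, 1])
```

Once fitted, a model like this needs only a new user's row of likes to produce a trait score, which is why no 100-question quiz is necessary for subsequent users.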

Accuracy

One of the studies the Psychometrics Centre produced, published in 2015 in the Proceedings of the National Academy of Sciences, was built on the "likes" and Ocean scores of more than 70,000 respondents who took the myPersonality quiz on Facebook. It found that a person who liked the movie Fight Club, for example, was far more likely to be open to new experiences than a person who liked American Idol, according to a review of data provided to the New York Times by Michal Kosinski, an author of the 2015 study and a computer science professor at Stanford University.
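As a toy illustration of how a single like becomes a signal, one can compare the average trait scores of the quiz respondents who liked each page; the figures below are invented, not the study's.

```python
import numpy as np

# Invented openness scores (0-1 scale) for respondents who liked each
# page; the 2015 study derived real values from myPersonality data.
openness_fight_club = np.array([0.78, 0.82, 0.71, 0.88, 0.75])
openness_american_idol = np.array([0.41, 0.35, 0.52, 0.44, 0.38])

# The gap in mean scores, measured across tens of thousands of users,
# is the kind of correlation the model exploits.
print(openness_fight_club.mean() - openness_american_idol.mean())
```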

In that study, the researchers compared the accuracy of their model with personality assessments made by the respondents’ friends. The friends were given a 10-question version of the myPersonality quiz and asked to answer based on their knowledge of the respondents’ personalities.

Based on a sample of more than 32,000 participants who were assessed by both the model and one or two friends, the researchers found that the model, using just 10 likes, was more accurate than a work colleague. With 70 likes, it was more accurate than a friend or roommate; with 150, more accurate than a family member; and with 300, more accurate than a spouse.
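That comparison treats the model's output and a friend's short-quiz ratings as rival judgments of the same person, and scores each against the respondent's own full-quiz result. Here is a minimal sketch with invented data, using a Pearson correlation against the self-report as the accuracy measure:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1000  # respondents judged by both the model and a friend

# Invented stand-ins: each respondent's self-reported trait score, the
# model's prediction from their likes, and one friend's rating.
self_report = rng.normal(0, 1, n)
model_judgment = self_report + rng.normal(0, 0.8, n)
friend_judgment = self_report + rng.normal(0, 1.0, n)

def accuracy(judgment):
    # Pearson correlation between a judgment and the self-report.
    return np.corrcoef(judgment, self_report)[0, 1]

print("model: ", accuracy(model_judgment))
print("friend:", accuracy(friend_judgment))
```

With the noise levels chosen here the model wins, just as it did in the study once enough likes were available; in the real data the crossover points were the 10-, 70-, 150- and 300-like thresholds above.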

Substance use

The model, the researchers said, was particularly adept at “predicting life outcomes such as substance use, political attitudes, and physical health”. The real-world efficacy of the approach, however, has been called into question.

When Cambridge Analytica approached the psychometrics centre about using its models, the centre declined. Cambridge Analytica then turned to Aleksandr Kogan, a psychology professor at Cambridge University who was familiar with the centre's work.

Kogan developed a Facebook app called “thisisyourdigitallife”, a quiz similar to myPersonality, and used it to harvest data from more than 50 million Facebook profiles. Of those, 30 million contained enough information to generate personality models. Only 270,000 users authorised Kogan’s app to access their data, and all were told their information was being used for academic research.

Cambridge Analytica then pitched its services to potential political and commercial clients, ranging from Mastercard and the New York Yankees to the US joint chiefs of staff.

Banned

Facebook has now banned Cambridge Analytica from its platform, as well as its parent company and Kogan. In Facebook’s eyes, Kogan’s infraction was not collecting the data, but giving it to Cambridge Analytica. “Although Kogan gained access to this information in a legitimate way and through the proper channels that governed all developers on Facebook at that time, he did not subsequently abide by our rules,” Facebook’s deputy general counsel said on March 16th.

By handing over that information to a private company, Facebook said Kogan violated its terms of service.

Facebook in 2015 changed its policies, including altering rules about how third-party apps can access information about users' friends. But user data collected through such apps over the years probably remains in the wild, not to mention the models that can continue to be used to target people around the world. – New York Times