Michal Kosinski has only just published his latest paper but, later that same day, the Stanford University psychologist and data scientist is already seeing criticism emerge on Twitter.
The article, "Psychological targeting as an effective approach to digital mass persuasion", uses "real-world experiments", in the form of Facebook adverts for genuine products that reached 3.5 million people, to show that micro-targeting users with messages tailored to their individual psychological profiles makes them more likely to click on ads and buy products. Similar research into online psychological persuasion might well go on, less transparently and at a much grander scale still, at the Silicon Valley tech companies just down the road from Stanford – not to mention in Russian troll factories. But the Twitterati's censure is very much focused on Kosinski.
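The statistics behind such experiments are, at heart, a comparison of response rates between ad variants. As a minimal sketch (not the paper's own code or data), the Python below tests whether ads matched to users' inferred personalities out-perform mismatched ones; every count is a made-up placeholder.

```python
# Hedged sketch: compare click-through rates for personality-matched vs
# mismatched ads with a two-proportion z-test. The counts are illustrative
# placeholders, not figures from the paper.
from statsmodels.stats.proportion import proportions_ztest

clicks = [1_200, 1_000]              # clicks on matched vs mismatched ads
impressions = [500_000, 500_000]     # times each variant was shown

stat, p_value = proportions_ztest(clicks, impressions)
ctr_matched = clicks[0] / impressions[0]
ctr_mismatched = clicks[1] / impressions[1]
lift = ctr_matched / ctr_mismatched - 1

print(f"relative lift in click-through rate: {lift:.0%} (p = {p_value:.3g})")
```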
"I'm taking some flak because people [think that] because I showed that this psychological persuasion is effective, suddenly this [work] is the root of the problem," he tells Times Higher Education in his Stanford office. Such critics, he adds, "don't understand that telling people 'Hey, the flu virus is deadly' doesn't make it deadly. The virus is objectively deadly, and I'm trying to warn you guys that it's deadly."
That Kosinski seems a little preoccupied about the reception of his paper is understandable given the whirlwinds set in motion by his previous research into how our intimate traits can be detected through the digital footprints we all leave behind. While at the University of Cambridge, the Polish-born academic – now assistant professor of organisational behaviour at Stanford Graduate School of Business – was lead author on a 2013 paper that demonstrated that people's Facebook likes could be used to "automatically and accurately predict a range of highly sensitive personal attributes", including their personality traits, sexual orientation, intelligence and political views. That paper has led to him being described by some commentators as the man who inadvertently "enabled" the "digital revolution" set in motion by Cambridge Analytica, the political data-mining and strategic communications firm owned mostly by the US billionaire and conservative supporter Robert Mercer that was prominent in Donald Trump's presidential election campaign and that has links to a data firm involved in the pro-Brexit campaign in the UK. The firm was in the headlines last weekend when The Observer alleged that it had made use of a large amount of Facebook data without users' consent in a possible attempt to influence the 2016 US presidential election.
Then, in 2017, a paper co-authored by Kosinski showing that facial recognition technology could be used to detect individuals' sexual orientation generated a huge backlash from gay rights groups. Ashland Johnson, director of public education and research at US LGBTQ rights organisation the Human Rights Campaign, denounced the paper, "Deep neural networks are more accurate than humans at detecting sexual orientation from facial images", as "junk science" that had left the lives of millions "worse and less safe" because of its potential to be used to aid a "brutal regime's efforts to identify and/or persecute people they believed to be gay". Kosinski says that he was targeted with death threats in the aftermath of the paper's publication, resulting in a campus police officer being stationed outside his door.
So is Kosinski really making bombs, as his critics claim? Or are his papers, as he argues, controlled explosions of weaponry already in use by others, and intended to advise us of its pressing dangers? Either way, his work in a cutting-edge field has made an extraordinary impact. It has relevance for all our lives by explaining how our digital footprints expose us to terrifying privacy risks, and it offers insights into the unequal yet significant power relationship between Silicon Valley and academia.

Kosinski was already deputy director of Cambridge's Psychometrics Centre before he had even completed his PhD, in 2013. He had set out as a "traditional social psychologist trained not in computer science but in traditional small-sample research [and] questionnaire research". But he became frustrated that the scientific establishment "refused to accept that the new reality…of the online environment at large has any significance". Kosinski was also convinced that biases could be removed from recruitment processes for jobs and educational courses if applicants' personalities were assessed via the evidence of their digital footprints rather than with the traditional psychometric tests that use a lengthy questionnaire to probe the "big five" personality traits: openness to experience, conscientiousness, extraversion, agreeableness and neuroticism (known collectively by the acronym Ocean).
Psychometrics – psychological measurement – is "such an important field, with potential to greatly ease the lives, especially, of underprivileged people and people who suffer from psychological ailments", he adds.
David Stillwell, now in Kosinski's former position at Cambridge, launched a Facebook app called MyPersonality in 2007. Users could take a traditional Ocean psychometric test but could also opt in to allow researchers to record their Facebook profile (including their likes), as an easily accessible and interpretable form of digital footprint. The app proved popular, and the database contains more than 6 million test results and 4 million Facebook profiles.
The resulting 2013 paper, "Private traits and attributes are predictable from digital records of human behavior", co-authored by Kosinski, Stillwell and Microsoft Research's Thore Graepel, used a dataset of 58,000 volunteers who took psychometric tests and provided their Facebook likes and detailed demographic profiles. The findings were startling: for the openness trait, "observation of the user's Likes is roughly as informative as using their personality test score" gathered from an Ocean questionnaire, the paper found. A detail as intimate as whether a subject's parents stayed together or separated before the subject turned 21 was detectable through their likes. The best predictors of high intelligence included likes for curly fries; low intelligence was indicated by likes including "I Love Being a Mom". Similar personality predictions, the paper suggested, could probably be made using other forms of digital footprint, such as web searches, browsing histories and credit card purchases.
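For readers curious about the mechanics, the pipeline the paper describes reduces an enormous binary user-like matrix to a small number of latent dimensions and then fits a simple linear model per trait. The Python sketch below uses random stand-in data throughout; the sizes, variable names and scikit-learn tooling are illustrative assumptions, not the study's released code.

```python
# Minimal sketch of a likes-to-traits pipeline: truncated SVD on a sparse
# binary user-like matrix, then one linear model per Ocean trait.
import numpy as np
from scipy.sparse import random as sparse_random
from sklearn.decomposition import TruncatedSVD
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Stand-ins: 10,000 users x 50,000 pages; a 1 means the user liked the page.
likes = sparse_random(10_000, 50_000, density=0.001, format="csr", random_state=0)
likes.data[:] = 1.0
openness = rng.normal(size=10_000)   # placeholder questionnaire scores

# Compress the likes matrix into 100 latent components.
components = TruncatedSVD(n_components=100, random_state=0).fit_transform(likes)

X_train, X_test, y_train, y_test = train_test_split(
    components, openness, test_size=0.2, random_state=0
)

# The study judged accuracy by correlating predicted scores with
# questionnaire-measured ones; with random stand-in data this is near zero.
model = LinearRegression().fit(X_train, y_train)
r = np.corrcoef(model.predict(X_test), y_test)[0, 1]
print(f"predicted-vs-measured correlation: {r:.2f}")
```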
Altmetric ranked the paper ninth in its top 100 of the academic papers that received the most attention online in 2013. Within weeks of its publication, Facebook changed users' settings to make likes private by default.
A 2015 paper on which Kosinski was a co-author tested things further. It found that computers' judgements of people's personalities based on their Facebook likes were "more accurate and valid" than those made by their colleagues, friends, family and even spouses. The paper, "Computer-based personality judgments are more accurate than those made by humans", showed that a "simple equation based on a random collection of around 200 likes from a Facebook profile can make those judgements and predictions [about personality] better than your own wife", Kosinski says. "Which is completely mind-blowing because it's a stupid equation based on 200 likes. A 16-year-old person probably leaves more than 200 digital footprints per hour."
This "brings me to the conclusion that going forward…we are not going to have any privacy", he adds.
Privacy is at the heart of concerns about the political applications of such knowledge. Alexander Nix, Cambridge Analytica's CEO, explained in a 2016 presentation that "if you know the personality of the people you are targeting, you can nuance your messaging to resonate more effectively with these kinds of groups". Nix also explained that Cambridge Analytica, then just starting its work on the Trump campaign, centred its method on the Ocean personality model. One of his examples of targeted marketing was a pro-gun rights advert tailored to a "neurotic and conscientious audience", based on the image of a burglary.
The Guardian has reported that it had seen documents showing that "Cambridge Analytica's parent, a London-based company called Strategic Communications Laboratories (SCL), was first introduced to the concept of using social media data to model human personality traits in early 2014 by Dr Aleksandr Kogan", the psychology lecturer at Cambridge at the centre of last weekend's revelations about the misuse of Facebook data.
Meanwhile, an article in the Swiss magazine Das Magazin suggested that Kosinski was approached by Kogan "on behalf of a company that was interested in Kosinski's method, and wanted to access the MyPersonality database. Kogan wasn't at liberty to reveal for what purpose." Kosinski ultimately broke off contact, according to the magazine, when Kogan finally revealed the company's name and Kosinski discovered that one of its focuses was "influencing elections".
Cambridge Analytica told Das Magazin that it "has had no dealings" with Kosinski and "does not use the same methodology" as he did. However, Cambridge Analytica's methods are undeniably similar.
Kosinski says that the "main framing of the [2013] paper is privacy risks". The paper's conclusion refers to how predicting individuals' attributes from their digital footprints might "improve numerous products and services", but might also have "considerable negative implications, because it can easily be applied to large numbers of people without obtaining their individual consent and without their noticing".
When Kosinski started working on the paper, he realised that he was "not the first person who figured it out". Companies such as Netflix and Facebook, he says, were already using the capacities of digital footprints to reveal individual personality traits in more sophisticated ways. Barack Obama's 2008 presidential election campaign had pioneered "psychological micro-targeting", and governments were "tracking people online trying to figure out their underlying, intimate traits – trying to distinguish between a terrorist and a not-so-dangerous fanatic".
Given this level of sophistication, adopting methods similar to those detailed in his 2013 paper would be "stupid", in Kosinski's view. Regarding Cambridge Analytica, he says: "The other thing that shows they were novices who didn't know what they were doing…is that they actually talked about it. If you are in this business, you know that you do not tell people how you make the sausage." Those who worked for Obama and Hillary Clinton had an "ability to micro-target…a few levels higher than whatever Cambridge Analytica could pull off. But they were smart enough to realise that you just don't talk about it."
But, for Kosinski, it is "great that Cambridge Analytica said those things [about its methods] because it brought to public awareness something that I had tried to make people aware of for years". When he published his 2013 paper, "I would go out and say: 'Look guys, there are risks for privacy.' And people would be like: 'Oh, but tell us about how curly fries predict intelligence.'" (Kosinski has admitted that he had "very little idea" about the answer to that last question, but speculated that "some clever people might say they liked quirky things to express their own novelty".)
However, he feels that critics of Cambridge Analytica "worry about the wrong thing" and suggests that the firm is "not such a dangerous actor compared with what governments and institutions and more serious companies can do".
Asked about the connections between his research, Cambridge Analytica, Trump and Brexit, Kosinski says: "I'm not trying to improve the bomb…I'm just saying: 'I will, in my lab, detonate this bomb and show you how much damage it can do.' I'm basically warning you against the negative impact of technologies that are already being used for the very particular purpose I'm warning against."
The rationale for his paper showing that sexual orientation can be predicted from facial images "is exactly the same", he continues. "The technology has been developed for many years: it's being widely used for precisely the purpose [gay rights groups worried about]: detecting crime. That is what I'm warning against."
His mention of crime is presumably a reference to the fact that – as the authors' note to the paper makes clear – homosexuality is illegal in some countries; the note also references a report about the use of facial recognition technology in China to track crime suspects. This use of technology seems rather distinct from profiling people's intimate traits, although Kosinski also links to an article about an Israeli start-up that claims to be able to predict how likely people are to be, for instance, terrorists or paedophiles by analysing their faces.
The paper, co-authored with Stanford colleague Yilun Wang, used deep neural networks – a type of artificial intelligence that "learns" in a way that mimics the human brain – to examine more than 35,000 facial images of self-identified gay and straight people, gathered from a dating website. An algorithm could correctly distinguish between gay and heterosexual men in 81 per cent of cases, and in 71 per cent of cases for women, the study found. The paper says that given that "companies and governments are increasingly using computer vision algorithms to detect people's intimate traits, our findings expose a threat to the privacy and safety of gay men and women".
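The architecture described is, in outline, a pretrained face-recognition network that turns each photograph into a fixed-length embedding, with only a lightweight classifier trained on top. Below is a hedged Python sketch of that final stage; random vectors stand in for the network's outputs, and nothing here reproduces the paper's actual features or data.

```python
# Hedged sketch: logistic regression on top of precomputed face embeddings.
# Random vectors stand in for the outputs of a pretrained deep network.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

n_faces, dim = 5_000, 512                      # illustrative sizes
embeddings = rng.normal(size=(n_faces, dim))   # stand-in network outputs
labels = rng.integers(0, 2, size=n_faces)      # stand-in binary labels

X_train, X_test, y_train, y_test = train_test_split(
    embeddings, labels, test_size=0.2, random_state=0
)

# AUC measures how often the classifier ranks a randomly chosen positive
# case above a negative one; random features score about 0.5.
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
auc = roc_auc_score(y_test, clf.predict_proba(X_test)[:, 1])
print(f"held-out AUC on stand-in data: {auc:.2f}")
```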
Kosinski emphasises how much information humans can already read from faces. "Gender, age, ethnicity, genetic disease are all clearly displayed on the face, and we have no problem, even without any training, with judging those." Yet humans "are not really great at [using facial information] when it comes to, say, sexual orientation or political views, but it seems that computers are".
But in addition to the protests of gay rights groups – which also called on Stanford to distance itself from the research – it was reported that "dozens of academics, scientists and others…picked apart the study in blog posts and Tweet storms". In one such exchange, a mathematician told Kosinski: "What you call academic research, I call weaponised algorithms."
This reaction was not entirely unexpected. "I sat on the sexual orientation paper for a year before I published it," Kosinski admits. He "worried about the hate" and "about keeping my job". If he had stayed quiet, his career would be "easier" and he would be facing fewer death threats. However, he eventually decided that "it's morally inexcusable to keep this…knowledge away from people. Given that Russia, China, America, Germany and other countries are rolling things like this out, I went ahead and published it. But given the reaction that I got, none of my students will ever do a similar thing."

If academia has sometimes been a hostile environment for Kosinski, a lucrative haven would surely await him in the tech industry; Stanford, after all, sits on Silicon Valley's doorstep. However, "being in academia maximises my chances to have a positive impact on the world", he says. "I would not be able to warn people against privacy risks if I worked at a company."
On the brain drain to Silicon Valley, he admits that he "gave up trying to work with computer science students because they always leave me after three months because they've got a seven-digit sign-up bonus with one company or the other". The drain happens "not only because of money" but also "because you can do projects in industry you cannot do in academia", such as "playing with people's newsfeeds, or playing with people's experience on Google search".
Given that it is "difficult to expect that academia will be able to compete with industry when it comes to funding for research and access to data", Kosinski suggests that "we may have to accept that the societal function of educating people will shift from the universities to firms like Facebook. When you are graduating with your bachelor's in computer science, you [then] go to Facebook. After three years [there], you have learned a lot – probably the equivalent of two master's and a PhD."
Kosinski also advocates that tech companies publish more academic journal articles, in order to "share the science they are producing within the walls of the company". One notable case of a tech company doing just this is the 2014 study published by a member of Facebook's core data science team and two Cornell University researchers, detailing the "emotional contagion" seen when individuals' Facebook newsfeeds were manipulated by increasing the level of "positive" or "negative" stories. But, according to press reports at the time, "lawyers, internet activists and politicians" reacted by describing the research as "'scandalous', 'spooky' and 'disturbing'".
"Guess what? [Facebook] will still be doing it," says Kosinski. "They just won't be telling anyone what they are doing. So we, as a society, lost a chance to learn, to have a discussion about potential policies. We just bullied them into silence."
But is there more that could be done to incentivise young computer scientists to remain in academia? Kosinski argues that, apart from "maybe trying to pay people better", universities should allow researchers "to run research more freely" by relaxing privacy rules around data collection.
"We already have companies doing way more invasive research than whatever social scientists could ever do," he says. "The only way for us to catch up as scientists – to try to tell the general public what…those companies might be doing behind closed doors – is [to be given] a bit more ability to run those studies and use those data."

For all his concerns about privacy risks, Kosinski also seems, at times, to be something of a techno-utopian. On the wall of his office is an artwork that shows, in the background, a police officer in riot gear and gas mask, wreathed in tear gas. In the foreground, facing towards the officer and with its back to the viewer, is a figure wearing a western-style gunslinger's belt. The figure's hand is poised to pull out a weapon, but, rather than a pistol, it is the "f" from Facebook's logo.
During Kosinski's early years, Poland was still within the Soviet bloc. "Coming from a country that for 50 years was basically closed off from the rest of the world" showed him how "the attitudes of an entire country, or the established truth, can change in a matter of days or hours," he says. Recalling that his first days at school were just after Poland's first free elections in 1989, he remembers the headmaster coming into class and removing from the wall the white eagle on a red background that, deprived of its pre-communist crown, was the symbol of communist Poland. The poster the headmaster replaced it with had the crown restored. This sudden overturning of the "ideology that was being promoted to people" made Kosinski "really cautious when it comes to accepting well-established truths", and he cites "echo chambers, information bubbles and fake news" as "potential myths we are perpetuating in society".
In an on-stage interview last year at the Computer History Museum, near Palo Alto, Kosinski was questioned about the impact of personalised marketing in politics and whether it might ultimately break down "consensus reality" and democracy. He responded that the Soviet Union was an example of a country that achieved perfect "consensus reality" through propaganda. In a country like the US, he added, people's "information bubbles" are larger than ever before in human history thanks to a combination of expanded social networks, journalism and algorithms that "try to give you a personalised view of the world".
Kosinski's latest paper shows the effectiveness of psychological targeting in influencing "the behaviour of large groups of people". Taking up his flu metaphor again, he explains: "By warning people the flu virus is deadly, inevitably I'm also maybe giving some bad guys some bad ideas…but people forget that [most of] those guys already know that the flu virus is deadly. They spend a lot of time and a lot of resources researching those things.
"I'm just one single computational social scientist working with very limited resources and a few students. Russia, the US and big corporations have buildings filled with people like me – much better paid, much better equipped, without any IRB [institutional review board] control, with much more data – studying not only how to improve people's lives with those technologies but how to take advantage of people, how to affect their well-being."
No amount of legislative resistance can avert the coming "hurricane" around privacy, Kosinski believes. But he hopes that his forecasting, however badly it may be received, will ultimately help to minimise the storm's destructive power. "The sooner we start getting ready for this unpleasant future," he says, "the better protected we are going to be."
POSTSCRIPT:
Print headline: Privacy investigator