Mark Zuckerberg’s 21 March 2018 public post about Cambridge Analytica gives a timeline that starts in 2013, with Cambridge University researcher Aleksandr Kogan creating a personality quiz app. However, our March 2017 Cambridge Analytica article, Fear the Geeks, tells a different story. That article is primarily about “The Reclusive Hedge-Fund Tycoon Behind the Trump Presidency”, but it gives some background from a Das Magazin article on how, in 2012, Dr Michal Kosinski at the Cambridge University Psychometrics Centre used Facebook data to explore a model developed by psychologists in the 1980s. That model assesses human beings based on five personality traits, known as the “Big Five.” The Facebook results were astonishing. The Das Magazin article reported that:

“In 2012, Kosinski proved that on the basis of an average of 68 Facebook “likes” by a user, it was possible to predict their skin color (with 95 percent accuracy), their sexual orientation (88 percent accuracy), and their affiliation to the Democratic or Republican party (85 percent). But it didn’t stop there. Intelligence, religious affiliation, as well as alcohol, cigarette and drug use, could all be determined. From the data it was even possible to deduce whether someone’s parents were divorced.”

Our March 2017 article notes that:

Kosinski had understood that his research could be open to abuse, and he warned that it “could pose a threat to an individual’s well-being, freedom, or even life.” He was alarmed when, in 2015, he came to suspect that the Facebook Big Five measurement approach was being used by an election-influencing firm. Cambridge Analytica denies using Kosinski’s model, claiming it developed its own. Nevertheless, after the Brexit campaigns, Kosinski’s friends and acquaintances held him responsible for the result. The Das Magazin article concludes:

“The world has been turned upside down. Great Britain is leaving the EU, Donald Trump is president of the United States of America. And in Stanford, Kosinski, who wanted to warn against the danger of using psychological targeting in a political setting, is once again receiving accusatory emails. ‘No,’ says Kosinski, quietly and shaking his head. ‘This is not my fault. I did not build the bomb. I only showed that it exists.’”

Mark Zuckerberg says that “In 2015, we learned from journalists at The Guardian that Kogan had shared data from his app with Cambridge Analytica.” Clearly Facebook was not aware of Kosinski’s earlier work or of the Das Magazin prediction. (Or of our Fear the Geeks article.)

The problem is bigger than Facebook. An 8 March 2018 ScienceDaily article from Indiana University, “Researchers call for large-scale scientific investigation into fake news,” notes that “The paper includes estimates that the number of automated ‘bots’ is 60 million on Facebook and up to 48 million on Twitter.” It continues:

“The spreaders of fake news are using increasingly sophisticated methods…. If we don’t have enough quantifiable information about the problem, we’ll never be able to design interventions that work.”

Tags: Mark Zuckerberg Fear the Geeks Das Magazin ScienceDaily

However, in terms of apportioning blame, there is a bigger picture, which is being tackled by “an international team of researchers in fields as diverse as philosophy, engineering and anthropology.” See our December 2017 article, Engineers, philosophers and sociologists release ethical design guidelines for future technology, by Rafael A Calvo, Professor and ARC Future Fellow, University of Sydney, and Dorian Peters, Creative Leader, Positive Computing Lab, University of Sydney. It tackles ethical design from a software engineering perspective.

Tags: Artificial intelligence Morality Ethics Guidelines Technology Moral standards

For an even bigger political perspective, see a 23 March 2018 essay in The Conversation, Post-truth politics and why the antidote isn’t simply ‘fact-checking’ and truth, by John Keane, Professor of Politics, University of Sydney. “This article is part of the Revolutions and Counter Revolutions series, curated by Democracy Futures as a joint global initiative between the Sydney Democracy Network and The Conversation. The project aims to stimulate fresh thinking about the many challenges facing democracies in the 21st century.

It is also part of an ongoing series from the Post-Truth Initiative, a Strategic Research Excellence Initiative at the University of Sydney.”

Tags: Truth Democracy Futures Post-truth Post-Truth Initiative

The essay starts with the caveat: “This essay is much longer than most Conversation articles, so will take some time to read. Enjoy!”
