In the fallout, Facebook faces its toughest test yet on privacy safeguards, and its founder and CEO, Mark Zuckerberg, has been summoned by MPs in the U.K. He faces similar calls from the U.S. Congress and from India, following revelations that Cambridge Analytica worked to influence the 2016 Brexit referendum as well as elections in India, Nigeria and other countries.
U.S. special counsel Robert Mueller is already examining Cambridge Analytica’s ties with the Trump campaign as part of his probe into Russia’s alleged meddling in the 2016 presidential election. Significantly, U.S. billionaire and conservative fundraiser Robert Mercer had helped found Cambridge Analytica with a $15 million investment, and he recruited former Trump advisor Steve Bannon, who has since left the firm. The firm initially sought to steer voters towards presidential candidate Ted Cruz, and after he dropped out of the race, it redirected its efforts to help the Trump campaign.
“We’re experiencing a watershed moment with regard to social media,” said Aral. “People are now beginning to realize that social media is not just either a fun plaything or a nuisance. It can have potentially real consequences in society.”
The Cambridge Analytica scandal underscores how little consumers know about the potential uses of their data, according to Berman. He recalled a scene in the film Minority Report where Tom Cruise enters a mall and sees holograms of personally targeted ads. “Online advertising today has reached about the same level of sophistication, in terms of targeting, and also some level of prediction,” he said. “It’s not only that the advertiser can tell what you bought in the past, but also what you may be looking to buy.”
Nave said the Cambridge Analytica scandal exposes exactly those types of risks, which existed even before the internet era. “Propaganda is not a new invention, and neither is targeted messaging in marketing,” he said. “What this scandal demonstrates, however, is that our online behavior exposes a lot about our personality, fears and weaknesses – and that this information can be used for influencing our behavior.”
In Golbeck’s research projects involving the use of algorithms, she found that people “are really shocked that we’re able to get these insights like what your personality traits are, what your political preferences are, how influenced you can be, and how much of that data we’re able to harvest.”
An Expanding Scandal
Although Cambridge Analytica’s work in using data to influence elections has been controversial for at least three years, the full scale of its impact emerged last Saturday. Christopher Wylie, a whistleblower who had worked at Cambridge Analytica, revealed to The Observer how the firm harvested the profiles of some 50 million Facebook users. The same day, The New York Times detailed Cambridge Analytica’s role in the Trump campaign.
Facebook had allowed Cambridge University researcher Aleksandr Kogan access to data for an innocuous personality quiz, but Kogan had passed it on without authorization to Cambridge Analytica. Wylie told The Observer: “We exploited Facebook to harvest millions of people’s profiles. And built models to exploit what we knew about them and target their inner demons. That was the basis the entire company was built on.”
Meanwhile, the U.K.’s Channel 4 News captured in a video sting the strategies Cambridge Analytica used in its work to “change audience behavior.”
Finding a Solution
Golbeck called for ways to codify how researchers could ethically go about their work using social media data, “and give people some of those rights in a broader space that they don’t have now.” Aral expected the solution to emerge in the form of “a middle ground where we learn to use these technologies ethically in order to enhance our society, our access to information, our ability to cooperate and coordinate with one another, and our ability to spread positive social change in the world.” At the same time, he advocated tightening use requirements for the data, and bringing back “the notion of informed consent and consent in a meaningful way, so that we can realize the promise of social media while avoiding the peril.”
Regulation, such as limiting the data about people that could be stored, could help prevent “mass persuasion” that could lead them to take action against their own best interests, said Nave. “Many times, it is difficult to define what one’s ‘best interest’ is …”
Legitimate Uses of Data
Golbeck worries that in trying to deal with the fallout from the Cambridge Analytica scandal, Facebook might restrict the data it makes available to researchers. “You don’t want this big piece of how society operates just blocked off, accessible only to Facebook and basically the people who are going to help them make money,” she said. “You want academic researchers to be able to study this.” But balancing that with the potential for some academic researchers to misuse the data to make money or gain power is a challenge, she added.
Aral described Cambridge Analytica as “a nefarious actor with a very skewed understanding of what’s morally right and wrong.” He pointed out that there’s an important line to be drawn between the appropriate uses of technology “to produce social welfare” through platforms like Facebook, and the work that Cambridge Analytica did. “It would be a real shame if the outcome was to, in essence, throw the baby out with the bathwater and say that the only solution to this problem is to pull the plug on Facebook and all of these social technologies because you know there’s no way to tell the difference between a bad actor and a good actor.”
All said, sophisticated data analytics “may also be used for generating a lot of good,” said Nave. “Personalized communication may help people to keep up with their long-term goals [such as] exercise or eating healthier, and get products that better match one’s needs. The technology by itself is not evil.”

Source: http://knowledge.wharton.upenn.edu/article/fallout-cambridge-analytica/