A Facebook study that manipulated the news feeds of close to 700,000 users caused considerable outrage a few years ago. At the same time, it provided intriguing data, with a large sample group, about the emotional contagion that can occur through social networks.
Whether you were incensed by Facebook’s secret effort to manipulate emotions or fascinated by the empirical results, the project illustrates how computer scientists can become involved in humanities research – and the issues that can result – said Mary L. Gray, a senior researcher at Microsoft Research, who will speak at the ATL Feb. 19, from 11 a.m. until noon.
“It’s a new world for computer science and engineering to have to think about the social implications and needs created by technology,” said Gray, who will discuss the Facebook experiment during her talk, “When Social Media Companies, Research Ethics, and Human Rights Collide.”
In addition to her role with Microsoft, Gray is a fellow at Harvard University’s Berkman Klein Center for Internet and Society and an associate professor of Informatics, Computing and Engineering at Indiana University.
Historically, Gray said, computer scientists and engineers have primarily built systems and focused on making things more efficient. But over time, more possibilities emerged in the area of human research.
“To me, this is the beginning of a new era of having technology serve society,” she said.
Algorithms based on our past behaviors already affect most of us, whether through the recommendations Netflix makes or the way a credit score is calculated. But computation is also used to make significant decisions about college admissions, foster care placement and criminal court sentencing.
However, the computer science tendency to focus on efficiency can prove problematic when applied to humans.
Nationwide, many superior court judges use a computer program called COMPAS, a risk assessment algorithm, to help guide their sentencing decisions. Yet an algorithm can reflect biases – and lacks the ability to individualize. A critical ProPublica study concluded that COMPAS predicts higher risks of recidivism for African-American defendants than they actually have, while predicting lower risks for white defendants than they actually have.
Meanwhile, major colleges receive tens of thousands of applications for admission (Cal Poly had over 65,000 in 2018), making it necessary to use software as part of the early screening process. But that software might ignore key components of a candidate’s background, achievements and skills.
“The challenge is when we start believing those numbers are an effective stand-in for somebody’s potential,” Gray said.
So while data can certainly help decision makers expedite their choices, it comes with a word of caution.
“In most cases what we need is to see the limitations of data,” Gray said.
From a research perspective, engineering offers the potential for a new paradigm of human data research, Gray said. Yet ethical considerations will arise – as they did when Facebook used its users as unknowing guinea pigs.
Collecting social media data is nothing new, of course. But gathering it to gauge the human condition is.
“When computer science was applied to those domains, it wasn’t with an awareness of what those worlds were creating,” she said.