Major networks such as social media platforms, highway systems, and even our genes contain vast amounts of data hidden in plain sight. However, the techniques scientists devise to learn about the nonlinear relationships within these structures often lead to unintended discrimination against historically disadvantaged groups. These biased results are what electrical engineering and computer science professor Sucheta Soundarajan is trying to avoid by making network algorithms fairer.
Soundarajan has received a National Science Foundation (NSF) CAREER Award for her research on network analysis algorithms. The award, given at most once in a researcher's career, is designed to support Soundarajan's professional development. In addition to funding her research, it will also support a number of education and outreach projects.
“Every time I get a grant, it feels great because it’s an acknowledgment from the larger scientific community,” Soundarajan says. “This one especially, because it is tied to me as an individual and not just to a project. It feels like I’m being validated as a scientist. It means a lot.”
While the award is an individual achievement, it supports research with potential benefits for communities around the world. Increasingly, decisions are informed by network analysis, and scientists are discovering that even though algorithms cannot directly access protected characteristics such as age, disability, gender identity, religion, and national origin, they can still discriminate against the groups those characteristics define.
“What we are seeing is that people from these minority and disadvantaged groups are being unfairly discriminated against at a higher rate,” Soundarajan says. “We want to create algorithms that automatically put people at the center of a network, but in a fair way.”
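The idea of ranking who sits at the "center" of a network while keeping that ranking fair can be sketched in a few lines. The toy graph, the group labels, and the simple alternating pick below are all invented for illustration and are not Soundarajan's actual methods, which are far more sophisticated; here "centrality" is just degree, and fairness is enforced by alternating the top-k picks between groups:

```python
# Toy illustration (not Soundarajan's algorithm): rank nodes by degree
# centrality, but fill the top-k list by alternating between groups so
# neither group is shut out of the "center" of the network.
from collections import defaultdict

edges = [("a", "b"), ("a", "c"), ("a", "d"), ("b", "c"),
         ("e", "f"), ("e", "g"), ("d", "e")]
group = {"a": 0, "b": 0, "c": 0, "d": 0, "e": 1, "f": 1, "g": 1}  # invented labels

degree = defaultdict(int)
for u, v in edges:
    degree[u] += 1
    degree[v] += 1

def fair_top_k(degree, group, k):
    """Pick the k highest-degree nodes, alternating between groups."""
    by_group = defaultdict(list)
    for node in sorted(degree, key=degree.get, reverse=True):
        by_group[group[node]].append(node)
    picked = []
    while len(picked) < k:
        if not any(by_group.values()):
            break  # fewer than k nodes exist
        for g in sorted(by_group):
            if by_group[g] and len(picked) < k:
                picked.append(by_group[g].pop(0))
    return picked

print(fair_top_k(degree, group, 4))  # -> ['a', 'e', 'b', 'f']
```

An unconstrained top-4 on this graph would be a, e, b, c, giving group 0 three of the four "central" slots; the constrained pick splits them evenly while still taking the highest-degree node from each group first.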
Soundarajan says criminal sentencing and lending are two examples of areas where algorithms are used to make critical decisions and where scientists have uncovered possible wrongful discrimination. Another example of a fairness issue is the way we interact with each other on social platforms. Friendship recommendation algorithms can amplify people’s tendency to connect with others who are similar to themselves, a pattern sociologists call homophily.
“Taken to the limit, if people follow these recommendations, people end up in silos where they only connect with people who are similar to them and that’s how you get echo chambers,” Soundarajan says.
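The effect the quote describes can be reproduced on a tiny example. The sketch below, invented for illustration rather than taken from any platform's actual recommender, scores candidates by the standard common-neighbors link-prediction heuristic; because the toy graph is already split into two clusters, the only recommendation points the user deeper into their own cluster:

```python
# Toy friendship-recommendation demo: score non-friends by how many
# friends they share with the user (common-neighbors heuristic).
from collections import defaultdict

# Two tightly knit clusters (a,b,c,d and e,f,g) joined by one bridge edge d-e.
edges = [("a", "b"), ("a", "c"), ("b", "c"), ("b", "d"), ("c", "d"),
         ("d", "e"), ("e", "f"), ("e", "g"), ("f", "g")]

neighbors = defaultdict(set)
for u, v in edges:
    neighbors[u].add(v)
    neighbors[v].add(u)

def recommend(user):
    """Rank non-friends of `user` by shared-friend count, best first."""
    scores = {}
    for candidate in neighbors:
        if candidate == user or candidate in neighbors[user]:
            continue
        scores[candidate] = len(neighbors[user] & neighbors[candidate])
    return [c for c in sorted(scores, key=scores.get, reverse=True)
            if scores[c] > 0]

print(recommend("a"))  # -> ['d']
```

The only node recommended to "a" is "d", a member of its own cluster; accepting it makes the cluster fully connected and leaves "a" with no shared friends across the bridge, which is the silo dynamic Soundarajan describes.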
In addition to her research, Soundarajan will have the opportunity to hire a graduate student to help develop ethics-based modules for computer science courses, in the hope that they will help students develop ethical thinking.
“We’re going to design these labs where we’ll give students a data set and they’ll apply some algorithms to it and then they’ll look at the results and they’ll have to think about whether these results are fair,” Soundarajan says.
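A lab along those lines might resemble the following sketch. The dataset is fabricated for illustration; the check shown, comparing each group's selection rate against the "four-fifths" disparate-impact threshold used in U.S. employment guidelines, is one simple metric students could apply to an algorithm's output:

```python
# Toy fairness audit: given an algorithm's decisions, compare selection
# rates across groups using the four-fifths (80%) rule of thumb.

# Fabricated results: (group label, was this applicant selected?)
results = [("A", True), ("A", True), ("A", False), ("A", True),
           ("B", False), ("B", True), ("B", False), ("B", False)]

def selection_rates(results):
    """Fraction of applicants selected, per group."""
    counts = {}
    for group, selected in results:
        total, hits = counts.get(group, (0, 0))
        counts[group] = (total + 1, hits + selected)
    return {g: hits / total for g, (total, hits) in counts.items()}

rates = selection_rates(results)
ratio = min(rates.values()) / max(rates.values())
print(rates)        # {'A': 0.75, 'B': 0.25}
print(ratio < 0.8)  # True -> flag possible disparate impact
```

Students would then have to reason about what the flag means: whether the disparity reflects the data, the algorithm, or both, and what a fair remedy would look like.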
Soundarajan will also look into developing continuing education for lawyers. She hopes to create lessons that explain how algorithms can produce discriminatory outcomes.
Using her time and talent for something socially meaningful is important to Soundarajan. She views the support she has received throughout her life as a factor in choosing her field of research, and she recognizes that the help she has received from members of her department contributed to her latest achievement.
“So much has been invested in me as a scientist, I feel like I have a moral obligation to do something that benefits everyone,” Soundarajan says. “I’ve been really lucky to be surrounded by people who really want to see me succeed and that was the case at Syracuse University. People have given me their time and spent hours reading the proposal that got me this award, and that means a lot to me.”