👋🏽 I’m an Assistant Professor in the Department of Computer Science at Swarthmore College and a Faculty Affiliate at the Healthy, Equitable, and Responsive Democracy (HEARD) Research Initiative. My research interests lie at the intersection of social computing and usable security & privacy.
At Swarthmore College, I direct the Collective Resilience Lab where we help people resist technologically-mediated harm, ranging from strategic misinformation and hate speech to privacy violations; build sociotechnical systems to repair trust in each other and our institutions; and empower people to advocate for their rights while refusing harmful data and labor practices.
My research has been published in top-tier HCI venues such as ACM CSCW, CHI, DIS, FAccT, WWW, and AAAI HCOMP. My work has also received the HCOMP 2019 Best Demo Award and been featured in or cited by The New York Times, Reuters, Forbes, PolitiFact, and Snopes. My research is supported by an NSF SaTC CRII Award and a Google Award for Inclusion Research, as well as generous support from Swarthmore College’s Lang Center for Civic and Social Responsibility.
With Tim Gorichanaz, I co-founded Speculative Futures Philadelphia to explore the ethical, social, and environmental impacts of emerging technologies. I currently serve on the ACM CSCW Steering Committee and will be the Communications and Outreach Co-Chair for CSCW 2025.
Previously, I was a Postdoctoral Researcher at the University of Washington’s Center for an Informed Public, where I worked with Kate Starbird and Emma Spiro. I received my Ph.D. and M.S. in Computer Science from Virginia Tech, where I was advised by Kurt Luther. In the past, I have interned at Meta (Facebook) and Microsoft Research.
Here is my CV. You can also view a full list of my publications here or on my Google Scholar profile. My email address is sukrit@swarthmore.edu.
Travel and Updates:
November 2024:
- San José, Costa Rica: Lab member Paulina Trifonova will be presenting a poster on a taxonomy of harm caused by deepfakes at CSCW 2024. Anirban Mukhopadhyay will be presenting a paper with Kurt Luther and me on scaling up open source intelligence (OSINT) investigations.
August 2024:
- Philadelphia, PA: Three Collective Resilience Lab members presented work at SOUPS 2024. Aishi Debroy gave a lightning talk on joint work with Callahan Hanson at the SUPA Workshop on eliciting privacy values in regulating deepfakes. Rodrigo Carvajal and Adi Chattopadhyay presented a poster titled Where are Marginalized Communities in Cybersecurity Research?.
- I’ll be serving as an AC for CHI 2025’s Understanding People (Mixed Methods) subcommittee and as the Communications and Outreach Co-Chair for CSCW 2025.
June 2024:
- Excited to have been awarded a $174k NSF SaTC CRII grant to understand and collectively mitigate harm from deepfakes.
May 2024:
- Congratulations to lab alumnus Ziming Yuan for being one of two graduating seniors selected for Swarthmore College’s Lang Award in recognition of outstanding academic accomplishment. Ziming will be starting an MS in Computer Science at CMU in the fall.
January 2024:
- Atlanta, GA: Morgan Wack and I presented our experiment analyzing how deepfakes impact trust in platforms at Google’s Collective and Society-Centered AI Workshop.
November 2023:
- I spoke to The New York Times about recent changes to social media platforms' APIs and how they impact researchers' ability to study mis- and disinformation.
October 2023:
- Minneapolis, MN: Attending CSCW 2023. I’m excited to meet my fellow CSCW Steering Committee Members!
- Honored to receive a $60k Google Award for Inclusion Research with Morgan Wack at Clemson University to study how deepfakes impact trust in institutions and online platforms, and co-design more responsible generative AI tools in South Africa and Kenya.
- A co-authored paper was accepted to CSCW 2024 with Anirban Mukhopadhyay and Kurt Luther at Virginia Tech on designing a framework to scale up open source intelligence (OSINT) investigations.
- CITI Program’s On Campus podcast interviewed me about the ethical and privacy concerns of using generative AI tools in higher education.
September 2023:
- A co-authored paper was accepted to the ACM Journal on Computing and Sustainable Societies. Led by Himanshu Zade and Spencer Williams, with support from Theresa T. Tran, Christina Harrington, Gary Hsieh, and Kate Starbird, the paper found that replies reframed the conversation around a post more than quote retweets did, but retweets from politically opposed accounts twisted the original posters' words.
August 2023:
- I started as an Assistant Professor in the Department of Computer Science at Swarthmore College. Say hi if you’re in the Philadelphia area!
- I spoke to Inside Higher Ed about the ethical concerns of changes to Zoom’s terms of service on using customer data for generative AI features.
July 2023:
- Pittsburgh, PA: I presented a first-author paper at DIS 2023 on a system to enable collaborative capture-the-flag competitions to investigate misinformation.
- My undergraduate mentee, Ashlyn Aske, presented a poster on a networked approach to study the cross-platform spread of visual misinformation at IC2S2 2023.
May 2023:
- Discussed my research on crowdsourced investigations with Emma Spiro’s Problematic Information class.
- ~~Submitted a workshop proposal to CSCW 2023 🤞🏽~~ It wasn’t accepted, but glad to see so many other exciting workshops were!
- Washington, D.C.: Attended an NSF SaTC Aspiring PI Workshop at GWU.
- Blacksburg, VA: Attended commencement — I’m Dr. Venkatagiri! 👨🏽🎓
- Tianjiao Yu, a graduate student co-advisee, presented our paper at WWW 2023 in Austin, TX, analyzing a Twitter community, the Sedition Hunters, who investigated the 2021 U.S. Capitol Attack.
- Joseph Schafer presented our work (with Stephen Prochaska) on the challenges of studying misinformation on video-sharing platforms during crises at the CHI 2023 workshop on Building Credibility, Trust, and Safety on Video-Sharing Platforms.
April 2023:
- Excited to have been awarded $20k from the UW Center for an Informed Public’s Innovation Fund to study photo and video misinformation.
- One first-author paper was accepted at DIS 2023, an abstract at IC2S2 2023, and a workshop paper at CHI 2023. Two co-authored papers were accepted at WWW 2023 and FAccT 2023.