Leading expert in Responsible AI, Kay Firth-Butterfield, joins Lucy Family Institute for Data & Society as visiting scholar

Society will face drastic changes as we begin to use artificial intelligence (AI) programs to perform, more efficiently, tasks previously conducted by humans. From education and healthcare to law enforcement and industry, AI is being developed for every area of our lives. But without guardrails to minimize risk and ensure equitable representation, the rapid advance of machine learning could leave humanity behind.

The University of Notre Dame’s Lucy Family Institute for Data & Society has appointed Kay Firth-Butterfield as a visiting scholar, focusing on the responsible and inclusive development and deployment of AI across diverse applications and use cases.

Kay Firth-Butterfield

“When we consider transparency, bias, or discrimination in AI, we often think in terms of ethics, but ethics can vary considerably from person to person or from culture to culture,” said Firth-Butterfield. “My research focuses on responsible AI and ways to develop broader principles and practices in behaving responsibly while using and developing machine learning models.” 

Firth-Butterfield is collaborating with Notre Dame faculty, including Nitesh Chawla, founding director of the Lucy Family Institute for Data & Society, and Olaf Wiest, the Grace-Rupley Professor of Chemistry and Biochemistry and a member of the Lucy Family Institute's Advisory Committee, to consider responsible policies for AI deployment across several disciplines.

Her work includes projects with the National Science Foundation Center for Computer-Assisted Synthesis (C-CAS). Firth-Butterfield, Chawla, and Wiest, the director of C-CAS, will work with the rest of the C-CAS team to develop principles and practices for the responsible development and deployment of AI in chemistry.

Firth-Butterfield discusses AI governance as part of the Soc(AI)ety Seminars, April 2024 (Photo credit: Angie Hubert/Notre Dame Research)

In April 2024, Firth-Butterfield visited the University as part of the Lucy Family Institute’s ongoing series, The Soc(AI)ety Seminars. The 2023-2024 edition of The Soc(AI)ety Seminars focused on Responsible AI and AI Governance.

“I am thrilled to welcome Kay to Notre Dame as a visiting scholar for the Lucy Family Institute for Data & Society,” said Chawla, who is also the Frank M. Freimann Professor of Computer Science and Engineering. “She is a distinguished scholar and I look forward to her contributions to deepening our commitment to guiding responsible AI innovation for a world in need,” he added.

Firth-Butterfield is the CEO of Good Tech Advisory LLC and served as the inaugural Head of AI and Machine Learning for the World Economic Forum (WEF) from 2017 to 2023. While at the WEF, she pioneered guidelines to cultivate a responsible technology culture in AI and quantum computing that could be adopted by organizations in both the private and public sectors. Firth-Butterfield began her professional career in England as a barrister-at-law and served a part-time appointment as a judge.

In 2024, she received the Time magazine Impact Award and was recognized on the Forbes 50 Over 50 list for her work in AI governance.

To learn more about Kay Firth-Butterfield, please visit her website.

Contact:

Christine Grashorn, Program Director, Engagement and Strategic Storytelling
Lucy Family Institute for Data & Society / University of Notre Dame
cgrashor@nd.edu / 574.631.4856
lucyinstitute.nd.edu / @lucy_institute

About the Lucy Family Institute for Data & Society

Guided by Notre Dame’s Mission, the Lucy Family Institute adventurously collaborates on advancing data-driven and artificial intelligence (AI) convergence research, translational solutions, and education to ethically address society’s wicked problems. As an innovative nexus of academia, industry, and the public, the Institute also fosters data science and AI access to strengthen diverse and inclusive capacity building within communities.