After more than two centuries of steady expansion worldwide, democracy reached a milestone in 2024, with over 1.6 billion people in more than 70 countries voting for national leaders—one of the largest election years in recorded history. However, as artificial intelligence (AI) continues to shape the way voters receive information on social media, policy experts are considering ways to protect voters from targeted advertising campaigns that may manipulate future democratic elections.

In June, Emma Briant, visiting professor at the University of Notre Dame’s Lucy Family Institute for Data & Society, was invited by the European Parliament’s Special Committee on the European Democracy Shield to discuss new geopolitical challenges, digital technologies, and threats to the democratic process within the European Union (EU).
This latest engagement builds on her longstanding involvement in EU policymaking, including a 2019 Safeguarding Democracy Debate focused on Facebook, Cambridge Analytica, and social media regulation, as well as a 2022 panel on the impact of political advertising on European elections.
Briant was joined by Alberto Fernandez Gibaja, head of the Digitalization and Democracy Programme at International IDEA, and Arielle Garcia, chief operating officer at Check My Ads, for a public hearing on the “EU Digital Rulebook and Political Advertising” to share insights into the systemic risks of new technologies and the effectiveness of recent EU policies and enforcement.
During her opening statement, Briant emphasized that an important end goal of the European Democracy Shield should be to “tackle the issue of engagement-based recommender systems and banning the use of targeted advertising.”
Under current EU regulations, targeted political advertising is restricted by the General Data Protection Regulation (GDPR), which requires explicit consent to target individuals based on special categories of sensitive data. These rules are intended to prevent the manipulative use of mental health, race, and other protected sensitive data. Still, Briant argued that recent political advertising transparency and targeting regulations, though well-intentioned, fall short of effectively stopping such abuses.

Briant illustrated how advertisers can use AI to identify and target audiences in sensitive data categories, such as people prone to anxiety, with manipulative messaging without violating the GDPR or the Digital Services Act. She then demonstrated how AI can be used to formulate careful language for the EU’s required legal compliance and transparency statements, suggesting the EU’s new AI Act may not go far enough.
Targeted advertising, Briant suggests, can be used to influence voting behavior on the basis of personal data in ways that are manipulative or that suppress freedom of expression. During her presentation to Parliament, she emphasized that social media platforms use algorithms that “continually assess and determine what we should see” based on profit, not veracity, and thus run the risk of stifling unprofitable voices. “Manipulation occurs when the information imbalance is sufficient to impact audience autonomy, or their freedom to choose, reflect, decide, and deliberate effectively,” she said.
A key area of her work has examined how Cambridge Analytica used social media personality tests to shape U.S. political campaigns, revealing that data from users with traits like anxiety or neuroticism were used to target fear-based messaging on Facebook and amplified through Meta’s lookalike audiences—findings she shared during the public hearing to the EU.
She has conducted extensive research on propaganda campaigns, information warfare, and surveillance capitalism, in which companies collect and monetize user data to predict and influence behavior. Her latest book, The Routledge Handbook of the Influence Industry (with Vian Bakir, 2024), examines the techniques, impacts, and ethics of this evolving ‘Influence Industry’ around the world, as well as its regulation.
The Special Committee on the European Democracy Shield is actively pursuing frameworks to audit, enforce and regulate online political advertising based on discussions from the June engagement.
“Emma Briant’s engagement with the European Parliament exemplifies the Lucy Family Institute’s commitment to addressing the societal impacts of data and emerging technologies,” said Nitesh Chawla, founding director of the Lucy Family Institute for Data & Society and the Frank M. Freimann Professor of Computer Science and Engineering. “Her research is helping shape global conversations about safeguarding democracy in the digital age—offering evidence-based insight into how AI and data-driven advertising are reshaping public discourse, elections, and civil liberties.”
While at Notre Dame, Briant is focusing on finalizing her fourth book, Propaganda Machine, while also developing new projects with University colleagues focused on understanding influence operations, their impacts, and ways to increase ethics, transparency and accountability in online information systems.
Briant’s presentation slides and a full recording of the Special Committee on the European Democracy Shield hearing can be found on the European Parliament website.
To learn more about the Lucy Family Institute for Data & Society, please visit lucyinstitute.nd.edu.
To learn more about the University of Notre Dame’s Democracy Initiative, please visit the Strategic Framework website.
Contact:
Christine Grashorn, Program Director, Engagement and Strategic Storytelling
Lucy Family Institute for Data & Society / University of Notre Dame
cgrashor@nd.edu / 574.631.4856
lucyinstitute.nd.edu / @lucy_institute
About the Lucy Family Institute for Data & Society
Guided by Notre Dame’s Mission, the Lucy Family Institute adventurously collaborates on advancing data-driven and artificial intelligence (AI) convergence research, translational solutions, and education to ethically address society’s vexing problems. As an innovative nexus of academia, industry, and the public, the Institute also fosters data science and AI access to strengthen diverse and inclusive capacity building within communities.