The capabilities of LLM-powered chatbots have been improving on a timescale of months and have recently entered mainstream public awareness and adoption. These tools have been used for a variety of scientific and policy interventions, but these advances call for a significant rethinking of their place in society. Psychological research suggests that "intentionality" is a key factor in persuasion and social norm enforcement, and the proliferation of LLMs represents a significant shock to the "intentionality" contained in text, particularly in immediate, personalized chat. This talk argues that we are in a period of "informational disequilibrium," in which different actors have different levels of awareness of this technological shock. This period may thus represent a golden age for actors aiming to deploy these technologies at scale, toward any number of normative ends. More broadly, the talk suggests that the "ethical" frameworks for evaluating research practices involving LLM-powered chatbots are inadequate to the scale of the current challenge. This is a potentially revolutionary technology that requires thinking in moral and political terms: given the power imbalances involved, it is of paramount importance that chatbots for good do not inadvertently become chatbots for evil.
Bio:
Kevin Munger is the Jeffrey L. Hyde and Sharon D. Hyde and Political Science Board of Visitors Early Career Professor of Political Science and assistant professor of political science and social data analytics at Penn State University. Kevin’s research focuses on the implications of the internet and social media for the communication of political information. His specialty is the economics of online media; his current research models “clickbait media” and uses digital experiments to test the implications of these models for consumers of political information.