Bots increase online user engagement but stifle meaningful discussion, study shows
Last July, Meta introduced AI Studio, a tool that lets users of Meta’s Facebook and Instagram platforms create chatbots powered by artificial intelligence (AI). The bots can be used for specific tasks such as generating captions for posts, or more generally as an “avatar” — engaging directly with platform users via messages and comments. Tools similar to AI Studio have also been rolled out for Snapchat and TikTok.
In an interview with the Financial Times in December, Meta’s vice president of product for generative AI said, “We expect these AIs to actually, over time, exist on our platforms, kind of in the same way that accounts do. … They’ll have bios and profile pictures and be able to generate and share content powered by AI on the platform.”
As AI bots become more prevalent on platforms, especially bots that are able to generate new content, there are risks that these bots will share false information and overwhelm users’ social feeds with automatically generated content.
These developments sparked a discussion about the role of bots on social media platforms. Although Meta removed some of its internally developed AI bots from its platforms, user-created bots remain. Additionally, with heavy investment in generative AI (GenAI) technologies — software programs that use AI to create content and interact conversationally with users — firms may continue to look for ways to increase user engagement on platforms through the use of AI bots.
GenAI bots are not the only bots that can interact with users. Bot accounts on platforms such as Reddit and X follow a series of pre-programmed rules to interact with users or moderate discussion.
Within Reddit communities, those bot accounts profoundly influence human-to-human interactions, according to new research from the University of Notre Dame.
Bots increase user engagement, but at the cost of deeper human-to-human interactions, according to “The Effect of Bots on Human Interaction in Online Communities,” recently published in MIS Quarterly by John Lalor, assistant professor of IT, analytics and operations, and Nicholas Berente, professor of IT, analytics and operations, both at Notre Dame’s Mendoza College of Business, along with Hani Safadi of the University of Georgia.
Recent work has identified a taxonomy of bots — a system of classifying and categorizing different types of bots based on their functionalities, behaviors and operating environments.
Bots can be very simple or very advanced. At one end of the spectrum, rules-based bots perform simple tasks based on specific guidelines. For example, the WikiTextBot account on Reddit replies to posts that contain a Wikipedia link with a summary of the Wikipedia page. The bot’s automated nature allows it to see every post on Reddit via an application programming interface (API) to check each post against its hard-coded rule: “If the post includes a Wikipedia link, scrape the summary from the wiki page and post it as a reply.” These bots are called “reflexive” bots.
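To make the reflexive pattern concrete, here is a minimal sketch of the kind of hard-coded rule such a bot follows. This is not WikiTextBot’s actual code; the helper functions `fetch_wiki_summary` and `post_reply` are hypothetical placeholders standing in for the Wikipedia and Reddit APIs.

```python
import re

# Simplified pattern for a link to a Wikipedia article.
WIKI_LINK = re.compile(r"https?://(?:\w+\.)?wikipedia\.org/wiki/(\S+)")

def handle_post(post_text, fetch_wiki_summary, post_reply):
    """Reflexive rule: if the post links to Wikipedia, reply with a
    summary of the linked page. `fetch_wiki_summary` and `post_reply`
    are hypothetical helpers, not real API calls."""
    match = WIKI_LINK.search(post_text)
    if match is None:
        return  # the rule does not fire; the bot stays silent
    title = match.group(1)
    post_reply(f"Summary of {title}:\n\n{fetch_wiki_summary(title)}")
```

The bot applies this single rule to every post it sees through the platform’s API; it never initiates a conversation on its own.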
Other bots on Reddit moderate conversations in communities by, for example, deleting posts whose content violates community guidelines, based on specifically defined rules. These are known as “supervisory” bots.
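In the same hypothetical style, a supervisory rule differs mainly in its action: instead of adding content, it removes content that matches a defined policy. The banned-term list and the `remove_post` helper below are illustrative assumptions, not any real community’s rules.

```python
# Placeholder policy terms for illustration only.
BANNED_TERMS = {"spamlink.example", "buy followers"}

def moderate_post(post_text, remove_post):
    """Supervisory rule: delete posts that violate the community policy.
    `remove_post` is a hypothetical helper standing in for the
    platform's moderation API."""
    text = post_text.lower()
    if any(term in text for term in BANNED_TERMS):
        remove_post("Removed: violates community guidelines.")
```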
“While these bots are rigid because of their rules-based nature, bots can and will become more advanced as they incorporate generative AI technologies,” said Lalor, who specializes in machine learning and natural language processing. “Therefore, it’s important to understand how the presence of these bots affects human-to-human interactions in these online communities.”
Lalor and his team analyzed a collection of Reddit communities (subreddits) that experienced increased bot activity between 2005 and 2019. They analyzed the social network structure of human-to-human conversations in the communities as bot activity increased.
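The paper’s actual measures are not reproduced here, but a simplified sketch of this kind of analysis — building a reply network among human users and contrasting how many people a user talks to with how often the same pair keeps talking — might look like the following. The comment fields (`author`, `parent_author`) and the known-bot list are assumptions for illustration.

```python
import networkx as nx

def build_human_reply_network(comments, bot_accounts):
    """Undirected graph whose nodes are human users and whose edges
    connect users who replied to one another. `comments` is assumed to
    be a list of dicts with 'author' and 'parent_author' fields."""
    G = nx.Graph()
    for c in comments:
        a, b = c["author"], c["parent_author"]
        if a == b or a in bot_accounts or b in bot_accounts:
            continue  # keep only human-to-human interactions
        G.add_edge(a, b)
    return G

def breadth_vs_depth(comments, bot_accounts):
    """Contrast breadth (distinct humans a user interacts with) with
    depth (repeated exchanges between the same pair of humans)."""
    G = build_human_reply_network(comments, bot_accounts)
    pair_counts = {}
    for c in comments:
        a, b = c["author"], c["parent_author"]
        if a == b or a in bot_accounts or b in bot_accounts:
            continue
        key = tuple(sorted((a, b)))
        pair_counts[key] = pair_counts.get(key, 0) + 1
    avg_degree = sum(dict(G.degree()).values()) / max(G.number_of_nodes(), 1)
    avg_exchanges = sum(pair_counts.values()) / max(len(pair_counts), 1)
    return avg_degree, avg_exchanges
```

In a structure like this, a rising average degree paired with fewer repeated exchanges per pair of users corresponds to the broader-but-shallower pattern described below.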
The team noticed that as the presence of reflexive bots (those that generate and share content) increases, there are more connections between users. The reflexive bots post content that facilitates more opportunities for users to find novel content and engage with others. But this happens at the cost of deeper human-to-human interactions.
“While humans interacted with a wider variety of other humans, their interactions involved more single posts and fewer back-and-forth discussions,” Lalor explained. “If one user posts on Reddit, there is now a higher likelihood that a bot will reply or interject itself into the conversation instead of two human users engaging in a meaningful back-and-forth discussion.”
At the same time, the inclusion of supervisory bots coded to enforce community policies diminished the role of the human moderators who establish and enforce community norms.
With fewer bots, key community members would coordinate with each other and the wider community to establish and enforce norms. With automated moderation, this is less necessary, and those human members are less central to the community.
As AI technology — especially generative AI — improves, bots can be leveraged by users to create new accounts and by firms to coordinate content moderation and push higher levels of engagement on their platforms.
“It is important for firms to understand how such increased bot activity affects how humans interact with each other on these platforms,” Lalor said, “especially with regard to their mission statements — for example, Meta’s statement to ‘build the future of human connection and the technology that makes it possible.’ Firms should also think about whether bots should be considered ‘users’ and how best to present any bot accounts on the platform to human users.”
Contact: John Lalor, 574-631-5104, john.lalor@nd.edu
Originally published by news.nd.edu on January 29, 2025.