A recipe for trustworthy artificial intelligence
Notre Dame researchers share a new model for developing artificial intelligence systems users can trust.
This week, a group of tech industry leaders issued an open letter warning of the looming threats posed by artificial intelligence (AI), comparing them to the risk of "pandemics and nuclear war."
The open letter is just one of many recent attempts to draw attention to situations in which AI cannot be trusted and to raise questions about AI’s potential unfair or harmful effects.
A group of researchers at the University of Notre Dame say it is important to ask a slightly different question: What would it look like to develop artificial intelligence we can trust?
Working alongside technology experts within the U.S. military as well as with researchers at Indiana University – Purdue University Indianapolis (IUPUI) and Indiana University, they are developing a comprehensive, systematic approach to creating trustworthy AI.
Their project, called “Trusted AI,” has identified six widely shared values they call the “dimensions of Trusted AI.” The six dimensions are:

- Explainability - Can we explain how the AI arrives at inferences?
- Safety and robustness - Will the AI work as expected—not just in the lab but in real, live contexts?
- Fairness - Can we ensure the AI will not reproduce patterns of bias and discrimination?
- Privacy - Are we confident that the data the AI uses will be held safely and confidentially?
- Environmental wellbeing - Can the AI be trained and developed with minimal negative environmental impact?
- Accountability and auditability - Can we identify who is responsible, and can we confirm that the AI is working as expected?
The key challenge in developing trustworthy AI, the researchers say, is to ensure that each of the dimensions informs every stage in the process, from the initial collection of data to the output, or “inference,” the AI provides. Only when there is an unbroken “chain of trust” can we be sure the end result is trustworthy.
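To illustrate the idea in rough terms, the chain of trust can be thought of as a checklist applied at every stage of an AI pipeline. The sketch below is purely hypothetical (the stage names, structure, and function names are illustrative, not the project's actual tooling):

```python
from dataclasses import dataclass, field

# The six dimensions of Trusted AI, applied as a checklist at every stage.
DIMENSIONS = [
    "explainability",
    "safety_and_robustness",
    "fairness",
    "privacy",
    "environmental_wellbeing",
    "accountability_and_auditability",
]

@dataclass
class StageReview:
    """Record whether each dimension was reviewed at one pipeline stage."""
    stage: str  # e.g., "data collection", "training", "inference"
    checks: dict = field(default_factory=lambda: {d: False for d in DIMENSIONS})

    def passed(self) -> bool:
        return all(self.checks.values())

def chain_of_trust(reviews: list[StageReview]) -> bool:
    """The chain holds only if every dimension passes at every stage."""
    return all(r.passed() for r in reviews)
```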
The lead principal investigator behind the Trusted AI project is Christopher Sweet, associate director for cyberinfrastructure development at Notre Dame's Center for Research Computing (CRC).
Sweet, who has a concurrent appointment as an assistant research professor in the Department of Computer Science and Engineering, emphasizes that the process of developing Trusted AI is more a cycle than a one-and-done effort.
“It is an iterative process,” Sweet explains. “These technologies are constantly evolving—as are the data sets they depend on and the social contexts in which they are used. It is not about declaring victory. It is about showing that Trusted AI is an ongoing practice that requires that all stakeholders are involved and engaged.”
Charles Vardeman, a computational scientist at the CRC and a research assistant professor in the Department of Computer Science and Engineering, leads a sub-project of Trusted AI. Vardeman says the team is working to prevent harm from AI far beyond the technologies and applications that receive public attention.
“People are aware that AI powers things like Alexa and ChatGPT, but that is really just the tip of the iceberg,” he says. “Most people are interacting with AI regularly without knowing it. It is shaping their purchasing decisions online, and it is even helping to determine the medical care they receive.”
Adam Czajka, an assistant professor in the Department of Computer Science and Engineering, is leading a Trusted AI sub-project that focuses on ways humans and machines, when paired together, can arrive at highly trustworthy decisions. He and his colleagues have developed a way of training AI to recognize fake images by training it to mimic human perception.
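As a rough illustration of the general idea—and not the team's published method—one way to pair machine learning with human perception is to penalize the model when its attention diverges from where human annotators looked while spotting fakes. The function below is a hedged sketch with hypothetical parameter names:

```python
import torch
import torch.nn.functional as F

def saliency_guided_loss(logits, labels, model_saliency, human_saliency, alpha=0.5):
    """Classification loss plus a term that nudges the model's attention
    toward image regions human annotators judged important.
    alpha (hypothetical) balances the two objectives; both saliency maps
    are assumed to be the same shape and normalized to [0, 1]."""
    cls_loss = F.cross_entropy(logits, labels)           # real vs. fake prediction
    attention_loss = F.mse_loss(model_saliency, human_saliency)
    return cls_loss + alpha * attention_loss
```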
Another Trusted AI subproject, led by CRC senior associate director and professor of the practice Paul Brenner, applies the Trusted AI recipe to create technology for the U.S. Navy.
Brenner, who is a faculty affiliate of iNDustry Labs, ND Energy, and the Wireless Institute, explains, “If a mission or weapon system fails, there is often more data available about that failure than any one person could read. New machine learning tools like natural language processing and knowledge graphs could help mine the data to identify the underlying cause of the failure.”
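To give a concrete sense of what such a pipeline could look like—a minimal sketch with made-up example triples, not the project's code—failure reports can be reduced to subject–relation–object statements and stored in a knowledge graph for querying:

```python
import networkx as nx

# Hypothetical triples that an NLP step might extract from failure reports.
triples = [
    ("power supply", "component_of", "radar unit"),
    ("power supply", "exhibited", "voltage drop"),
    ("voltage drop", "preceded", "radar unit failure"),
]

# Build a small knowledge graph of systems, components, and failure modes.
graph = nx.DiGraph()
for subject, relation, obj in triples:
    graph.add_edge(subject, obj, relation=relation)

# Query: which recorded events are linked to the radar unit failure?
print(list(graph.predecessors("radar unit failure")))
```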
The obstacle, Brenner says, is that most commercially available machine learning tools are a “black box”: they produce inferences from large sets of data without explaining how or why they arrived at a particular conclusion.
Brenner’s team is developing a new approach for military applications that goes beyond the “black box.” In collaboration with the U.S. Navy installation near Crane, Indiana (NSWC Crane), Brenner and a group of ten Notre Dame undergraduate student researchers are building machine learning tools trained on specially prepared, pre-labeled data for more accurate and more explainable outcomes.
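One simple way pre-labeled text can yield explainable outcomes—shown here with a generic interpretable model and invented example data, not necessarily the team's approach—is to train a linear classifier whose per-term weights can be inspected:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

# Hypothetical pre-labeled failure reports (1 = hardware-related, 0 = not).
reports = [
    "voltage drop in power supply before shutdown",
    "software timeout during routine diagnostics",
    "corroded connector caused intermittent power loss",
    "operator entered incorrect configuration value",
]
labels = [1, 0, 1, 0]

vectorizer = TfidfVectorizer()
X = vectorizer.fit_transform(reports)
model = LogisticRegression().fit(X, labels)

# Unlike a black box, the linear weights show which terms drove the inference.
terms = vectorizer.get_feature_names_out()
weights = model.coef_[0]
top_terms = sorted(zip(terms, weights), key=lambda t: abs(t[1]), reverse=True)[:5]
print(top_terms)
```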
Brenner emphasizes that, in addition to the new tools and techniques his project develops, it will also have broader impacts that will continue to reverberate through the coming decades.
“We are looking forward to sharing what we learn with a broad group of students,” Brenner says. In addition to training the students directly involved in the research, the team will also educate younger students in the principles of Trusted AI through presentations and by welcoming 40 high school students to Notre Dame’s campus for the CRC’s Summer Scholars program.
“We are developing a new approach to AI that is urgently needed,” Brenner says, “and at the same time, we’re developing people—the future military officers, scholars, and tech industry leaders who will make trusted AI a reality.”
Trusted AI is part of the Scalable Asymmetric Lifecycle Engagement (SCALE) workforce development program funded by the Office of the Under Secretary of Defense for Research and Engineering’s Trusted & Assured Microelectronics program.
About the Center for Research Computing
The Center for Research Computing (CRC) at the University of Notre Dame is an innovative, multidisciplinary research environment that supports collaboration to facilitate discoveries through advanced computation, software engineering, artificial intelligence, and other digital research tools. The Center enhances the University’s innovative applications of cyberinfrastructure, provides support for interdisciplinary research and education, and conducts computational research. Learn more at crc.nd.edu.
About Notre Dame Research
The University of Notre Dame is a private research and teaching university inspired by its Catholic mission. Located in South Bend, Indiana, its researchers are advancing human understanding through research, scholarship, education, and creative endeavor in order to be a repository for knowledge and a powerful means for doing good in the world. For more information, please see research.nd.edu or @UNDResearch.
Contact:
Brett Beasley / Writer and Editorial Program Manager
Notre Dame Research / University of Notre Dame
bbeasle1@nd.edu / +1 574-631-8183
research.nd.edu / @UNDResearch
Originally published by crc.nd.edu on June 02, 2023.