Fosca Giannotti
Scuola Normale Superiore, Pisa & Information Science and Technology Institute “A. Faedo” of CNR., Pisa, Italy
Towards a Synergistic Human-Machine Interaction and Collaboration
XAI and Hybrid Decision-Making Systems: State of the Art and Research Questions
Abstract
The future of AI lies in enabling people to collaborate with machines to solve complex problems. Like any effective collaboration, this requires good communication, trust, clarity, and understanding. Explaining to humans how an AI reasons is only part of the problem: we must also be able to design AI systems that understand and collaborate with humans. Hybrid decision-making systems aim at leveraging the strengths of both human and machine agents to overcome the limitations that arise when either agent operates in isolation.
This lecture provides a reasoned introduction to the work on Explainable AI (XAI) to date and then focuses on paradigms in support of synergistic human-machine interaction and collaboration to improve joint performance in high-stakes decision-making. Three distinct paradigms, characterized by different degrees of human agency, will be discussed: i) human oversight, in which a human expert monitors AI predictions augmented with explanations; ii) learning to defer, in which the machine learning model is given the possibility to abstain from making a prediction when the risk of misprediction on a given instance is too high; iii) collaborative and interactive learning, in which human and AI engage in communication to integrate their distinct knowledge and facilitate the human's ability to make informed decisions.
This lecture is joint work with Clara Punzi, Mattia Setzu, and Roberto Pellungrini.
Bio
Fosca Giannotti is professor of Computer Science at Scuola Normale Superiore, Pisa, and associate at the Information Science and Technology Institute “A. Faedo” of CNR, Pisa, Italy. She co-leads the Pisa KDD Lab – Knowledge Discovery and Data Mining Laboratory, a joint research initiative of the University of Pisa and ISTI-CNR. Her current research focuses on advancing trustworthy AI methods and enabling human-machine collaboration and interaction. She is the PI of the ERC project “XAI – Science and Technology for the Explanation of AI Decision Making”.
She is the author of more than 300 papers and has coordinated dozens of European projects and industrial collaborations. Professor Giannotti is deputy director of SoBigData++, the European research infrastructure on Big Data Analytics and Social Mining, an ecosystem of dozens of cutting-edge European research centers providing an open platform for interdisciplinary data science and data-driven innovation. She is a member of the Human-Centered AI spoke of the national PNRR project on AI, FAIR (Future AI Research).
On March 8, 2019, she was featured as one of the 19 Inspiring Women in AI, Big Data, Data Science, and Machine Learning by KDnuggets.com, a leading site on AI, Data Mining, and Machine Learning. Since February 2020 she has been the Italian Delegate of Cluster 4 (Digital, Industry and Space) in Horizon Europe.
Mark Riedl
Georgia Tech School of Interactive Computing
Dungeons and DQNs
The Serious Quest for Human-AI Interaction via Storytelling and Role-Playing Games
Abstract
As artificial intelligence increases in capability, we might one day hope to introduce agents into open-ended environments. In those environments, we might expect our agents to interact with, and be directed by, human teammates. What can we learn from teaching AI agents to play computer games? For conventional digital computer games, arguably not much. However, language-based role-playing games require agents both to understand the world the way humans do and to exhibit behavior that is understandable to humans. Likewise, story generation presents research challenges in that AI systems need to understand humans and be understood by humans. I present a case for studying human-AI interaction through storytelling and role-playing games, using research projects from my lab to illustrate the potential.