EXPLAINABLE AI IN HEALTHCARE
The next AI Monday Berlin will be a satellite event to the DMEA conference, which runs in parallel. We will therefore focus on the healthcare sector, specifically on the topic of Explainable AI.
Four crisp presentations incl. Q&A. Some speakers will also demo their AI solutions. Followed by networking. Please bring your own snacks and drinks as long as we are virtual.
AI-curious people, change leaders, and businesses with a passion for data and disruption.
Share AI knowledge, exchange ideas, and encourage each other on our change journeys.
Please register below to receive details.
19:15 – Berlin time
No sales pitches. No math lectures or deep tech dives. No shallow consulting or marketing talks.
Dr. Wojciech Samek
Head of the Department of Artificial Intelligence and the Explainable AI Group at Fraunhofer Heinrich Hertz Institute (HHI), AI for Good
Wojciech Samek studied computer science at Humboldt University of Berlin, Heriot-Watt University and the University of Edinburgh and received the Dr. rer. nat. degree with distinction (summa cum laude) from the Technical University of Berlin. After his PhD he founded the Machine Learning Group at Fraunhofer HHI, which he directed until 2020. Dr. Samek is associated faculty at the Berlin Institute for the Foundations of Learning and Data (BIFOLD), the ELLIS Unit Berlin and the DFG Graduate School BIOQIC. He is co-editor of the Springer book “Explainable AI: Interpreting, Explaining and Visualizing Deep Learning” and has organized various sessions, workshops and tutorials on explainable AI, neural network compression, and federated learning.
Dr. Sven Schmeier
Chief Engineer and Associate Head of the Language Technology Lab Berlin at DFKI
The XAINES project aims not only to ensure explainability, but also to provide explanations in the form of narratives. The central question is whether an AI can explain in a single sentence why it acted as it did, or whether it must explain itself interactively to the user. To clarify this, one focus of the project is the exploration of narratives and interactive narratives, which are particularly well suited for humans to assimilate knowledge, in their application with AI systems. To obtain explanatory narratives, speech-labeled sensor data streams and predictive models are used. Sensor information is combined with speech information, from which the AI system develops a so-called scene understanding that is then used to generate explanations.
Dr. Günther Hoffmann
CEO – MedaPlus – AI-powered health analytics platform
Günther Hoffmann is co-founder and CEO of the Berlin-based startup MedaPlus (https://www.medaplus.health), which develops explainable, AI-driven solutions for pulmonary diseases. Günther holds a PhD from Humboldt Universität Berlin and has conducted international research at the Massachusetts Institute of Technology (MIT) and at Duke University in the USA.