Summer School: Trust and Machine Learning

Course/Event Essentials

Event/Course Start
Event/Course End
Event/Course Format
In person
Live (synchronous)

Venue Information

Country: Germany

Training Content and Scope

Level of Instruction
Beginner
Intermediate
Advanced
Other
Sector of the Target Audience
Research and Academia
Industry
Public Sector
HPC Profile of Target Audience
Application Users
Application Developers
Data Scientists
System Administrators
Language of Instruction

Other Information

Organiser
Event/Course Description

The Philosophy of Computational Sciences group is pleased to announce this year’s Summer School on Trust and Machine Learning.

The trustworthiness of machine learning (ML) methods has recently come into focus, not only because of the ongoing ChatGPT hype but also as a result of algorithmic mishaps in self-driving cars, legislative efforts by the EU, and the general sense that there has been a breakthrough in the field.

Trust plays a central role in evaluating possible consequences of ML methods.

This summer school offers three one-day sessions, each engaging with a different aspect of trust and ML.

The first session is dedicated to the ethical and normative aspects of trust in ML contexts. There is a rich literature on trust concepts to navigate, and this session will give participants the means to select and apply a fitting trust concept for their ML problem. It will be taught by Andreas Kaminski and Sebastian Hallensleben.

The second session is a hands-on tutorial in the HLRS training center, where participants will learn how to code a simple ML application. This session will be taught by Lorenzo Zanon, Khatuna Kakhiani and Layal Ali from the HLRS Training and Scalable Algorithms department.
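To give a flavour of what "coding a simple ML application" can look like, here is a minimal, self-contained sketch in plain Python: a perceptron classifier trained on a toy linearly separable dataset. This is purely illustrative and is not the actual tutorial material; the dataset, function names, and hyperparameters are all hypothetical.

```python
# Hypothetical illustration only -- not the HLRS tutorial code.
# A perceptron learns a linear decision boundary by updating its
# weights whenever it misclassifies a training example.

def train_perceptron(samples, labels, epochs=20, lr=0.1):
    """Learn weights and a bias with the classic perceptron update rule."""
    w = [0.0] * len(samples[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            # Predict with the current weights: +1 or -1.
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else -1
            if pred != y:  # update only on mistakes
                w = [wi + lr * y * xi for wi, xi in zip(w, x)]
                b += lr * y
    return w, b

def predict(w, b, x):
    """Classify a point with the learned weights."""
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else -1

# Toy data: points above the diagonal are +1, points below are -1.
X = [(0.0, 1.0), (1.0, 2.0), (2.0, 3.0), (1.0, 0.0), (2.0, 1.0), (3.0, 2.0)]
y = [1, 1, 1, -1, -1, -1]

w, b = train_perceptron(X, y)
print(all(predict(w, b, x) == t for x, t in zip(X, y)))  # prints True
```

Even a model this small raises the school's central questions: on what grounds should a user trust its predictions on unseen data?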

The third session will discuss epistemic aspects of trust. It will connect trust to inductive problems and show how solutions to problems of epistemic trust in inductive methods can be transferred to ML contexts. The session will be taught by Nic Fillion and Tom Sterkenburg.