Attention mechanisms in artificial intelligence
These days, applications of machine learning (where computer algorithms learn from data they have already seen to make predictions about new data) are increasingly finding their way from AI labs into other research and business contexts. Machine learning applications enable, for instance, the prediction of protein function and of consumer preferences. But what comes next?
The AI research community continues to refine and expand the capabilities of artificial intelligence. In time, these developments will unlock the next level of applications in other research fields and industries. So where is AI headed?
About this webinar
In this webinar, AI researchers from UM’s Department of Data Science and Knowledge Engineering will introduce their work on state-of-the-art AI research and discuss potential applications.
On March 15, we will discuss attention mechanisms. These enable AI systems to focus automatically on the most informative parts of the data, and are inspired by the way humans subconsciously focus on things like faces in an image. To give you a grasp of the possibilities, the Affective and Visual Computing Lab will present its own work on video analysis and natural language processing (the processing of human language by computers). We will then address applications in the broader research community, such as biological sequence modelling.
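At its core, an attention mechanism computes a weighted average: each "query" is compared against a set of "keys", the resulting similarity scores are turned into a focus distribution, and that distribution weights the corresponding "values". The sketch below is a minimal, illustrative NumPy implementation of scaled dot-product attention (a common variant); the array names and dimensions are ours, not taken from the speakers' work.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax: subtract the max before exponentiating.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(queries, keys, values):
    """Scaled dot-product attention.

    Each query attends to every key; the output is a weighted sum of the
    values, where the weights reflect query-key similarity.
    """
    d = queries.shape[-1]
    scores = queries @ keys.T / np.sqrt(d)   # similarity of each query to each key
    weights = softmax(scores, axis=-1)       # each row sums to 1: the "focus"
    return weights @ values, weights

# Toy example: 2 queries, 3 key/value pairs, feature dimension 4.
rng = np.random.default_rng(0)
Q = rng.normal(size=(2, 4))
K = rng.normal(size=(3, 4))
V = rng.normal(size=(3, 4))
out, w = attention(Q, K, V)   # out has shape (2, 4); w has shape (2, 3)
```

The attention weights `w` make the mechanism interpretable: inspecting which inputs receive high weight shows what the model "looked at", whether those inputs are video frames, words in a sentence, or positions in a biological sequence.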
The webinar is open to researchers at UM who feel that AI can help them, but are unsure how.
About the speakers
The Affective and Visual Computing Lab is part of the Department of Data Science and Knowledge Engineering (DKE). DKE has a 30-year history in artificial intelligence research and teaching, and currently houses over 60 academic staff members and 800 students.
The Affective and Visual Computing Lab builds techniques that allow machines to combine data from different sources and interpret human behaviour as accurately as possible. The scope of the lab encompasses both fundamental research and research into a wide range of innovative applications.
During the webinar, you will meet Dr. Jan Niehues, Assistant Professor with a focus on natural language processing, and Esam Ghaleb, a research associate on the Horizon 2020 PROCare4Life project who focuses on human behaviour recognition in videos.