SM445 / CS 707: Seminar “Machine Learning and Time” (FSS 2025)
This term's seminar: Machine Learning and Time.
Time plays a crucial role in virtually all aspects of machine learning. This seminar focuses on recent research in relevant areas where machine learning meets time, including time series analysis, temporal knowledge representation, causality and time, online and data stream learning, forecasting, change point detection, and handling distribution shifts.
Organization
- This seminar is organized by Prof. Dr. Rainer Gemulla, Simon Forbat, and Julie Naegelen.
- Available for up to 8 Master students (4 ECTS) and up to 4 Bachelor students (5 ECTS).
- Prerequisites: Solid background in machine learning (MSc students), Einführung in Data Science (BSc students).
Goals
In this seminar, you will
- Read, understand, and explore scientific literature
- Summarize a current research topic in a concise report (10 single-column pages + references)
- Give two presentations about your topic (a 3-minute flash presentation and a 15-minute final presentation)
- Moderate a scientific discussion about a fellow student's topic
- Review a (draft of a) report by a fellow student
Schedule
- Register as described below.
- Attend the kickoff meeting on February 19th (time: TBD, tentative).
- Work individually throughout the semester according to the seminar schedule (tentative).
- Meet your advisor for guidance and feedback.
Registration
Please register via Portal 2 by February 10th.
If you are accepted into the seminar, provide at least 4 topic areas of your preference (your own and/or example topics; see below) by February 16th via email to Julie Naegelen. The actual topic assignment takes place soon afterwards; we will notify you via email. Our goal is to assign one of your preferred topic areas to you.
Topic areas and topics
You will be assigned a topic area in an active, relevant field of machine learning based on your preferences. Your goals in this seminar are to:
- Provide a short, concise overview of this topic area (1/4). A good starting point may be a book chapter, survey paper, or recent research paper. Here you take a bird's-eye view and are expected to discuss the main goals, challenges, and relevance of your topic area. Topic areas are selected at the beginning of the seminar.
- Present a self-selected topic within this area in more detail (3/4). A good starting point is a recent or highly influential research paper. Here you dive deep into one particular topic and are expected to discuss and explain the concrete problem statement, the concrete solution or contribution, and your own thoughts. The actual topic is selected before the first tutor meeting.
You are generally free to propose your topic area of interest as long as it aligns with the overall theme and objectives of the seminar.
Suggested topics, grouped by area:
Time Series Forecasting (TSF)
Linear State-Space Models for TSF
(BSc students preferred)
Book Chapter: Introduction to Time Series and Forecasting, Brockwell and Davis, 2016, Ch. 9
Example paper: Are Transformers Effective for Time Series Forecasting?
Venue: AAAI 2023
Selective State-Space Models for TSF
Example paper: Mamba: Linear-Time Sequence Modeling with Selective State Spaces
Venue: COLM 2024
Graph Neural Networks for TSF
Example paper: FourierGNN: Rethinking Multivariate Time Series Forecasting from a Pure Graph Perspective
Venue: NeurIPS 2023
LLMs for TSF
Example paper: Are Language Models Actually Useful for Time Series Forecasting?
Venue: NeurIPS 2024 (Spotlight)
Zero-Shot and Few-Shot TSF
Example paper: Tiny Time Mixers (TTMs): Fast Pre-trained Models for Enhanced Zero/Few-Shot Forecasting of Multivariate Time Series
Venue: NeurIPS 2024
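As a brief orientation for the linear state-space topic above (a sketch of the standard textbook formulation, not the setup of any particular paper; notation varies across references, including Brockwell and Davis): a linear Gaussian state-space model couples a latent state x_t with the observed series y_t via two linear equations.

```latex
% Linear Gaussian state-space model (standard textbook form).
% x_t: latent state, y_t: observation, F/G: transition and observation
% matrices, v_t/w_t: independent Gaussian noise with covariances Q and R.
\begin{aligned}
x_{t+1} &= F\,x_t + v_t, \qquad v_t \sim \mathcal{N}(0, Q), \\
y_t     &= G\,x_t + w_t, \qquad w_t \sim \mathcal{N}(0, R).
\end{aligned}
```

Forecasting then amounts to inferring the latent state from past observations (e.g., with the Kalman filter) and propagating it forward through the transition equation.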
Time Series Representation Learning
Time Series Representation Learning via Patching
Example paper: Learning to Embed Time Series Patches Independently
Venue: ICLR 2024
Robust Time Series Learning and Adversarial Attacks
Example paper: Black-Box Adversarial Attack on Time Series Classification
Venue: AAAI 2023
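To make the patching topic above concrete, here is a minimal sketch in plain NumPy: it only splits a univariate series into non-overlapping fixed-length patches, the basic input unit of patch-based representation learners; the patch length and toy series are arbitrary choices and do not reflect the cited paper's architecture.

```python
# Split a univariate series into non-overlapping fixed-length patches.
import numpy as np

series = np.arange(96, dtype=float)   # toy series of length 96
patch_len = 16                        # arbitrary patch length
n_patches = len(series) // patch_len
patches = series[: n_patches * patch_len].reshape(n_patches, patch_len)
print(patches.shape)                  # (6, 16): 6 patches of length 16
```

Each patch is then embedded (in the cited paper, independently of the other patches) before any downstream sequence model or task head is applied.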
Online Learning and Concept Drift
Online Continual Learning
Example paper: Forgetting, Ignorance or Myopia: Revisiting Key Challenges in Online Continual Learning
Venue: NeurIPS 2024
Concept-Drift Adaptation
Example paper: RDumb: A simple approach that questions our progress in continual test-time adaptation
Venue: NeurIPS 2023
Online Algorithms
Time Fairness in Online Algorithms
(BSc students preferred)
Example paper: Time Fairness in Online Knapsack Problems
Venue: ICLR 2024
Single Time Points in (Time Series) Context
Anomaly Detection
Example paper: Revisiting VAE for Unsupervised Time Series Anomaly Detection: A Frequency Perspective
Venue: WWW 2024
OR (BSc students preferred):
Example paper: Matrix Profile XXIV: Scaling Time Series Anomaly Detection to Trillions of Datapoints and Ultra-fast Arriving Data Streams
Venue: ACM SIGKDD 2022
Missing Data Imputation
Example paper: CSDI: Conditional Score-based Diffusion Models for Probabilistic Time Series Imputation
Venue: NeurIPS 2021
Time Series Mining and Similarity Measures
(BSc students preferred)
Example paper: Time works well: Dynamic time warping based on time weighting for time series data mining
Journal: Information Sciences 2021
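For the dynamic-time-warping topic above, here is a minimal sketch of the classic DTW dynamic program (the textbook O(nm) formulation; the time-weighted variant proposed in the example paper is not reproduced here).

```python
# Classic dynamic time warping (DTW) distance between two 1-D sequences.
import numpy as np

def dtw_distance(a, b):
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])              # local distance
            cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                 cost[i, j - 1],      # deletion
                                 cost[i - 1, j - 1])  # match
    return cost[n, m]

# Two slightly shifted sine waves align with a small DTW cost.
t = np.linspace(0, 2 * np.pi, 50)
print(dtw_distance(np.sin(t), np.sin(t + 0.3)))
```

Unlike the Euclidean distance, DTW allows non-linear alignments along the time axis, which is why it remains a common baseline similarity measure in time series mining.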
Causality and Time Series
Finding Causality in Time Series Data
Example paper: Causal Discovery from Subsampled Time Series with Proxy Variables
Venue: NeurIPS 2023
Dynamic Temporal Representations
Time-Editable LLMs
Example paper: DyKnow: Dynamically Verifying Time-Sensitive Factual Knowledge in LLMs
Venue: EMNLP Findings 2024
Spatio-Temporal Graph Forecasting
Example paper: Towards Dynamic Spatial-Temporal Graph Learning: A Decoupled Perspective
Venue: AAAI 2024
Supplementary materials and references
- “Giving Conference Talks” by Prof. Dr. Rainer Gemulla
- “Writing for Computer Science” by Justin Zobel, Springer, 2014