Time Series in the Age of Large Models
Workshop at the Conference on Neural Information Processing Systems (NeurIPS) 2024
The first NeurIPS workshop on Time Series in the Age of Large Models will be held at the Vancouver Convention Center on December 15, 2024. We look forward to welcoming you in Vancouver.
The list of accepted papers is available here and the PDFs are available at the linked OpenReview pages.
Introduction
Foundation models have revolutionized the approach to building machine learning models in areas like natural language processing, where models are pretrained on large amounts of diverse data and then adapted to downstream tasks, often in a zero-shot fashion. This approach has begun to gain traction in the time series community. Recent works have developed and open-sourced foundation models for time series tasks, particularly forecasting. Additionally, some studies have shown positive results by either leveraging pretrained models from other modalities, such as text, for time series tasks or enhancing time series analysis through exogenous information from other modalities. These advancements have opened new research directions and challenges related to the development, analysis, evaluation, and real-world applications of large models for time series tasks. This workshop aims to provide a forum for researchers and practitioners to understand the progress made and push the frontier of time series research in the era of large models.
The key topics of this workshop include, but are not limited to:
- Building Time Series Foundation Models
- Analysis of Pretrained Time Series Models
- Critiques of Time Series Foundation Models
- Faster and Better Inference Schemes for Autoregressive Time Series Models
- Leveraging Pretrained Models of Other Modalities for Time Series
- Multimodal Time Series Models
- Large-Scale Time Series Datasets and Benchmarks
- Time Series Evaluation
- Real-World Applications of Large Time Series Models
Please see the Call for Papers for details.
Schedule
Sunday, December 15, 2024, West Meeting Room 220-222, Vancouver Convention Center
Refer to the NeurIPS website for the detailed schedule.
| Time (PST) | Event |
|---|---|
| 08:15 - 08:25 | 🎤 Opening Remarks |
| 08:25 - 09:00 | 🎓 Invited Talk by Tomas Pfister: Multimodal Time Series Modeling |
| 09:00 - 09:12 | 📢 Contributed Oral Talk: Partial Channel Dependence with Channel Masks for Time Series Foundation Model |
| 09:12 - 09:17 | 💡 Contributed Spotlight Talk: Time Series under Temporal Label Noise |
| 09:17 - 09:29 | 📢 Contributed Oral Talk: PaPaGei: Open Foundation Models for Optical Physiological Signals |
| 09:29 - 09:34 | 💡 Contributed Spotlight Talk: TimeSeriesExam: A Time Series Understanding Exam |
| 09:34 - 10:35 | 🖼️ Poster Session 1 |
| 10:35 - 11:10 | 🎓 Invited Talk by Christoph Bergmeir: Fundamental limitations of foundational forecasting models: The need for multimodality and rigorous evaluation |
| 11:10 - 11:15 | 💡 Contributed Spotlight Talk: TimePFN: Effective Multivariate Time Series Forecasting with Synthetic Data |
| 11:15 - 11:50 | 🎓 Invited Talk by Valentina Zantedeschi: Beyond Forecasting: Intelligent Decision Support for Complex Systems |
| 12:00 - 13:00 | 🥗 Lunch Break |
| 13:00 - 14:00 | 🖼️ Poster Session 2 |
| 14:00 - 14:35 | 🎓 Invited Talk by Qingsong Wen: LLM and Foundation Models for Time Series Analysis |
| 14:35 - 14:47 | 📢 Contributed Oral Talk: Towards Time-Series Reasoning with LLMs |
| 14:47 - 14:53 | 💡 Contributed Spotlight Talk: Benchmarking out-of-the-box forecasters of varying scales in biology |
| 14:53 - 15:30 | ☕ Coffee Break |
| 15:30 - 15:42 | 📢 Contributed Oral Talk: Scaling-laws for Large Time-series Models |
| 15:42 - 15:47 | 💡 Contributed Spotlight Talk: Towards Resolution-Aware Retrieval Augmented Zero-Shot Forecasting |
| 15:47 - 16:22 | 🎓 Invited Talk by Mihaela van der Schaar: From Data to Discovery: LLM’s Role in Advancing Science |
| 16:22 - 16:34 | 📢 Contributed Oral Talk: Maven: A Multimodal Foundation Model for Supernova Science |
| 16:34 - 16:39 | 💡 Contributed Spotlight Talk: Mamba4Cast: Efficient Zero-Shot Time Series Forecasting with State Space Models |
| 16:39 - 17:14 | 🎓 Invited Talk by Andrew Gordon Wilson: Why Should We Develop Language Models for Time Series Forecasting? |
| 17:14 - 17:19 | 🎬 Closing Remarks |