LLM/FoundationModels Collection Summary
- Collection key: P5WKAC5Y
- Total items: 114
By Year
| Year | Count |
|---|---|
| (no year) | 3 |
| 2021 | 1 |
| 2022 | 1 |
| 2023 | 1 |
| 2024 | 49 |
| 2025 | 53 |
| 2026 | 6 |
By Type
| Type | Count |
|---|---|
| preprint | 86 |
| journalArticle | 20 |
| conferencePaper | 7 |
| podcast | 1 |
By Theme
| Theme | Count |
|---|---|
| Forecasting | 75 |
| Reasoning/Agent | 16 |
| LLM/TSFM | 11 |
| Multimodal | 6 |
| Other | 6 |
Duplicate Titles
| Title | Count |
|---|---|
| Large Models for Time Series and Spatio-Temporal Data: A Survey and Outlook | 2 |
| Timer-S1: A Billion-Scale Time Series Foundation Model with Serial Scaling | 2 |
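The tallies above (by year, by type, and duplicate titles) can be reproduced mechanically from a Zotero CSV export. A minimal sketch, assuming a hypothetical export file with `Publication Year`, `Item Type`, and `Title` columns (actual column names may differ depending on the export settings):

```python
# Sketch: reproducing the summary tallies from a Zotero CSV export.
# The file name "collection.csv" and the column names below are assumptions,
# not guaranteed to match any particular Zotero export.
import csv
from collections import Counter

def tally(rows):
    """Count items by year and type, and flag case-insensitive duplicate titles."""
    by_year = Counter(r.get("Publication Year") or "(no year)" for r in rows)
    by_type = Counter(r.get("Item Type", "") for r in rows)
    titles = Counter(r.get("Title", "").strip().lower() for r in rows)
    duplicates = {t: n for t, n in titles.items() if n > 1}
    return by_year, by_type, duplicates

def tally_csv(path):
    with open(path, newline="", encoding="utf-8") as f:
        return tally(list(csv.DictReader(f)))
```

Lower-casing titles before counting is what catches near-duplicates such as the two Timer-S1 entries, which differ only in capitalization.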
All Papers
| # | Year | Type | First Author | Title | Venue |
|---|---|---|---|---|---|
| 1 | 2025 | preprint | Li | Language in the Flow of Time: Time-Series-Paired Texts Weaved into a Unified Temporal Narrative | arXiv |
| 2 | 2024 | preprint | Liu | Time-FFM: Towards LM-Empowered Federated Foundation Model for Time Series Forecasting | arXiv |
| 3 | 2024 | preprint | Shi | Time-MoE: Billion-Scale Time Series Foundation Models with Mixture of Experts | arXiv |
| 4 | 2025 | preprint | Tang | LLM-PS: Empowering Large Language Models for Time Series Forecasting with Temporal Patterns and Semantics | arXiv |
| 5 | 2024 | preprint | Dong | Metadata Matters for Time Series: Informative Forecasting with Transformers | arXiv |
| 6 | 2024 | journalArticle | Lee | TESTAM: A Time-Enhanced Spatio-Temporal Attention Model with Mixture of Experts | Zotero |
| 7 | 2024 | preprint | Kowsher | LLM-Mixer: Multiscale Mixing in LLMs for Time Series Forecasting | arXiv |
| 8 | 2024 | preprint | Chattopadhyay | Context Matters: Leveraging Contextual Features for Time Series Forecasting | arXiv |
| 9 | 2024 | preprint | Liu | Time-MMD: A New Multi-Domain Multimodal Dataset for Time Series Analysis | arXiv |
| 10 | 2024 | preprint | Li | FoundTS: Comprehensive and Unified Benchmarking of Foundation Models for Time Series Forecasting | arXiv |
| 11 | 2024 | journalArticle | Ren | Industrial Large Models: Architecture, Key Technologies, and Typical Applications | SCIENTIA SINICA Informationis |
| 12 | 2024 | preprint | Bian | Multi-Patch Prediction: Adapting LLMs for Time Series Representation Learning | Semantic Scholar |
| 13 | 2024 | preprint | Ansari | Chronos: Learning the Language of Time Series | arXiv |
| 14 | 2024 | preprint | Wang | ROSE: Register Assisted General Time Series Forecasting with Decomposed Frequency Learning | arXiv |
| 15 | 2024 | conferencePaper | Liu | Generative Pretrained Hierarchical Transformer for Time Series Forecasting | KDD '24: The 30th ACM SIGKDD Conference on Knowledge Discovery and Data Mining |
| 16 | 2024 | preprint | Liu | Timer-XL: Long-Context Transformers for Unified Time Series Forecasting | arXiv |
| 17 | | journalArticle | Zhao | Retrieval Augmented Generation (RAG) and Beyond: A Comprehensive Survey on How to Make your LLMs use External Data More Wisely | Zotero |
| 18 | 2024 | preprint | Wang | LongLLaVA: Scaling Multi-modal LLMs to 1000 Images Efficiently via Hybrid Architecture | arXiv |
| 19 | 2024 | journalArticle | Palaskar | AutoMixer for Improved Multivariate Time-Series Forecasting on Business and IT Observability Data | Proceedings of the AAAI Conference on Artificial Intelligence |
| 20 | 2024 | preprint | Darlow | DAM: Towards A Foundation Model for Time Series Forecasting | arXiv |
| 21 | 2024 | preprint | Li | OpenCity: Open Spatio-Temporal Foundation Models for Traffic Prediction | arXiv |
| 22 | 2023 | preprint | Jin | Large Models for Time Series and Spatio-Temporal Data: A Survey and Outlook | arXiv |
| 23 | 2024 | preprint | Jin | Position: What Can Large Language Models Tell Us about Time Series Analysis | arXiv |
| 24 | 2024 | preprint | Liu | Timer: Generative Pre-trained Transformers Are Large Time Series Models | arXiv |
| 25 | 2024 | preprint | Woo | Unified Training of Universal Time Series Forecasting Transformers | arXiv |
| 26 | 2024 | preprint | Ren | AIGC for Industrial Time Series: From Deep Generative Models to Large Generative Models | arXiv |
| 27 | 2024 | preprint | Chuang | Understanding Different Design Choices in Training Large Time Series Models | arXiv |
| 28 | 2024 | preprint | Rasul | Lag-Llama: Towards Foundation Models for Probabilistic Time Series Forecasting | arXiv |
| 29 | 2024 | preprint | Liu | AutoTimes: Autoregressive Time Series Forecasters via Large Language Models | arXiv |
| 30 | 2024 | preprint | Ye | A Survey of Time Series Foundation Models: Generalizing Time Series Representation with Large Language Model | arXiv |
| 31 | 2024 | preprint | Liu | CALF: Aligning LLMs for Time Series Forecasting via Cross-modal Fine-Tuning | arXiv |
| 32 | 2024 | preprint | Tan | Are Language Models Actually Useful for Time Series Forecasting? | arXiv |
| 33 | 2024 | preprint | Qin | Lightning Attention-2: A Free Lunch for Handling Unlimited Sequence Lengths in Large Language Models | Semantic Scholar |
| 34 | 2024 | preprint | Jin | Time-LLM: Time Series Forecasting by Reprogramming Large Language Models | arXiv |
| 35 | 2024 | preprint | Kamarthi | Large Pre-trained time series models for cross-domain Time series analysis tasks | arXiv |
| 36 | 2024 | preprint | Zhang | TimeRAF: Retrieval-Augmented Foundation model for Zero-shot Time Series Forecasting | arXiv |
| 37 | 2024 | preprint | Wen | Measuring Pre-training Data Quality without Labels for Time Series Foundation Models | arXiv |
| 38 | 2024 | preprint | Tire | Retrieval Augmented Time Series Forecasting | arXiv |
| 39 | 2024 | preprint | Liu | Moirai-MoE: Empowering Time Series Foundation Models with Sparse Mixture of Experts | arXiv |
| 40 | 2024 | preprint | Liu | iTransformer: Inverted Transformers Are Effective for Time Series Forecasting | arXiv |
| 41 | 2024 | preprint | Gao | UNITS: A Unified Multi-Task Time Series Model | arXiv |
| 42 | 2024 | preprint | Jin | Efficient Multimodal Large Language Models: A Survey | arXiv |
| 43 | 2021 | journalArticle | Jin | Large Models for Time Series and Spatio-Temporal Data: A Survey and Outlook | Zotero |
| 44 | 2024 | preprint | Raposo | Mixture-of-Depths: Dynamically allocating compute in transformer-based language models | Semantic Scholar |
| 45 | 2024 | preprint | Goswami | MOMENT: A Family of Open Time-series Foundation Models | Semantic Scholar |
| 46 | 2025 | preprint | Zhang | TimesBERT: A BERT-Style Foundation Model for Time Series Understanding | arXiv |
| 47 | 2025 | preprint | Liu | Evaluating System 1 vs. 2 Reasoning Approaches for Zero-Shot Time Series Forecasting: A Benchmark and Insights | arXiv |
| 48 | 2024 | preprint | Liu | TimeCMA: Towards LLM-Empowered Multivariate Time Series Forecasting via Cross-Modality Alignment | arXiv |
| 49 | 2025 | preprint | Hu | Context-Alignment: Activating and Enhancing LLM Capabilities in Time Series | arXiv |
| 50 | 2024 | preprint | Zeng | How Much Can Time-related Features Enhance Time Series Forecasting? | arXiv |
| 51 | 2025 | preprint | Feofanov | Mantis: Lightweight Calibrated Foundation Model for User-Friendly Time Series Classification | arXiv |
| 52 | 2025 | preprint | Wiliński | Exploring Representations and Interventions in Time Series Foundation Models | arXiv |
| 53 | 2025 | preprint | Kong | Achieving Time Series Reasoning Requires Rethinking Model Design, Tasks Formulation, and Evaluation | arXiv |
| 54 | 2025 | preprint | Zhong | Time-VLM: Exploring Multimodal Vision-Language Models for Augmented Time Series Forecasting | arXiv |
| 55 | 2024 | preprint | Ma | A Survey on Time-Series Pre-Trained Models | arXiv |
| 56 | 2024 | preprint | Wang | ChatTime: A Unified Multimodal Time Series Foundation Model Bridging Numerical and Textual Data | arXiv |
| 57 | 2024 | preprint | Ekambaram | Tiny Time Mixers (TTMs): Fast Pre-trained Models for Enhanced Zero/Few-Shot Forecasting of Multivariate Time Series | arXiv |
| 58 | 2025 | preprint | Hoo | From Tables to Time: Extending TabPFN-v2 to Time Series Forecasting | arXiv |
| 59 | 2024 | preprint | Yin | Apollo-Forecast: Overcoming Aliasing and Inference Speed Challenges in Language Models for Time Series Forecasting | arXiv |
| 60 | | conferencePaper | Yao | Towards Neural Scaling Laws for Time Series Foundation Models | openreview.net |
| 61 | 2025 | preprint | Kong | Time-MQA: Time Series Multi-Task Question Answering with Context Enhancement | arXiv |
| 62 | 2025 | journalArticle | Ren | Industrial Foundation Model | IEEE Transactions on Cybernetics |
| 63 | 2025 | journalArticle | Ren | Foundation Models for the Process Industry: Challenges and Opportunities | Engineering |
| 64 | 2025 | preprint | Cheng | Can Slow-thinking LLMs Reason Over Time? Empirical Studies in Time Series Forecasting | arXiv |
| 65 | 2024 | conferencePaper | Lu | In-context time series predictor | The Thirteenth International Conference on Learning Representations |
| 66 | 2025 | preprint | Zhao | Less is More: Unlocking Specialization of Time Series Foundation Models via Structured Pruning | arXiv |
| 67 | 2025 | preprint | Xie | ChatTS: Aligning Time Series with LLMs via Synthetic Data for Enhanced Understanding and Reasoning | arXiv |
| 68 | 2025 | preprint | Fang | Unraveling Spatio-Temporal Foundation Models via the Pipeline Lens: A Comprehensive Review | arXiv |
| 69 | 2025 | preprint | Wang | LightGTS: A Lightweight General Time Series Forecasting Model | arXiv |
| 70 | 2025 | preprint | Luo | Time Series Forecasting as Reasoning: A Slow-Thinking Approach with Reinforced LLMs | arXiv |
| 71 | 2025 | preprint | Zhang | TimeMaster: Training Time-Series Multimodal LLMs to Reason via Reinforcement Learning | arXiv |
| 72 | 2025 | preprint | Bumb | Forecasting Time Series with LLMs via Patch-Based Prompting and Decomposition | arXiv |
| 73 | 2025 | preprint | Wang | Output Scaling: YingLong-Delayed Chain of Thought in a Large Pretrained Time Series Forecasting Model | arXiv |
| 74 | 2025 | preprint | Cao | TEMPO: Prompt-based Generative Pre-trained Transformer for Time Series Forecasting | arXiv |
| 75 | 2025 | podcast | Carzaniga | A foundation model with multi-variate parallel attention to generate neuronal activity | |
| 76 | 2022 | preprint | Wu | Rating Quality of Diverse Time Series Data by Meta-learning from LLM Judgment | arXiv |
| 77 | 2025 | journalArticle | 邵泽志 | Large Time Series Forecasting Models in Decision Intelligence | Journal of Command and Control |
| 78 | 2025 | journalArticle | 姜峒伯 | Large Language Models Empowering the Steel Industry: Technology and Application Outlook | 冶金自动化 |
| 79 | 2025 | preprint | Liu | Sundial: A Family of Highly Capable Time Series Foundation Models | arXiv |
| 80 | 2025 | preprint | Wang | ITFormer: Bridging Time Series and Natural Language for Multi-Modal QA with Large-Scale Multitask Dataset | arXiv |
| 81 | 2025 | journalArticle | Wang | MetaIndux-TS: Frequency-Aware AIGC Foundation Model for Industrial Time Series | IEEE Transactions on Neural Networks and Learning Systems |
| 82 | 2025 | conferencePaper | Shao | BLAST: Balanced Sampling Time Series Corpus for Universal Forecasting Models | KDD '25: The 31st ACM SIGKDD Conference on Knowledge Discovery and Data Mining |
| 83 | 2025 | preprint | Tao | From Values to Tokens: An LLM-Driven Framework for Context-aware Time Series Forecasting via Symbolic Discretization | arXiv |
| 84 | 2025 | preprint | Liu | Time-R1: Towards Comprehensive Temporal Reasoning in LLMs | arXiv |
| 85 | 2024 | journalArticle | Ye | DualTime: a dual-adapter language model for time series multimodal representation learning | openreview.net |
| 86 | 2025 | journalArticle | 侯卫锋 | Core Industrial Software System for New Smart Factories in the Process Industry Empowered by Industrial Large Models | SCIENTIA SINICA Informationis |
| 87 | 2025 | journalArticle | Wang | CoLLM: Industrial Large-Small Model Collaboration with Fuzzy Decision-making Agent and Self-Reflection | IEEE Transactions on Fuzzy Systems |
| 88 | 2025 | preprint | Ye | When LLM Meets Time Series: Can LLMs Perform Multi-Step Time Series Reasoning and Inference | arXiv |
| 89 | 2025 | journalArticle | 王宇航 | Research on Engineering Practice of Large AI Models in the Energy Industry | Information and Communications Technology and Policy |
| 90 | 2025 | journalArticle | Tao | From prompt design to iterative generation: Leveraging LLMs in PSE applications | Computers & Chemical Engineering |
| 91 | 2025 | preprint | Zhang | Adapting Large Language Models for Time Series Modeling via a Novel Parameter-efficient Adaptation Method | arXiv |
| 92 | 2024 | preprint | Das | A decoder-only foundation model for time-series forecasting | arXiv |
| 93 | 2025 | preprint | Zhou | CaTS-Bench: Can Language Models Describe Time Series? | arXiv |
| 94 | 2025 | preprint | Qiao | Multi-Scale Finetuning for Encoder-based Time Series Foundation Models | arXiv |
| 95 | 2025 | preprint | Meyer | Rethinking Evaluation in the Era of Time Series Foundation Models: (Un)known Information Leakage Challenges | arXiv |
| 96 | 2025 | preprint | Li | TSFM-Bench: A Comprehensive and Unified Benchmark of Foundation Models for Time Series Forecasting | arXiv |
| 97 | 2025 | preprint | Cohen | This Time is Different: An Observability Perspective on Time Series Foundation Models | arXiv |
| 98 | 2025 | preprint | Zhao | TimeSeriesScientist: A General-Purpose AI Agent for Time Series Analysis | arXiv |
| 99 | 2025 | preprint | Xiao | TimeFound: A Foundation Model for Time Series Forecasting | arXiv |
| 100 | 2025 | preprint | Hou | A Theoretical Analysis of Detecting Large Model-Generated Time Series | arXiv |
| 101 | 2025 | preprint | Wu | SciTS: Scientific Time Series Understanding and Generation with LLMs | arXiv |
| 102 | 2025 | conferencePaper | Jing | Foresail: LLM Sensor Knowledge Empowered Status-guided Network for Multivariate Time-series Classification | ACM Digital Library |
| 103 | 2025 | journalArticle | 陈致蓬 | Exploring a New Paradigm for Building Embodied Intelligent-Control Large Models for Industrial Vertical Domains | Acta Automatica Sinica |
| 104 | 2025 | preprint | Wu | Aurora: Towards Universal Generative Multimodal Time Series Forecasting | arXiv |
| 105 | 2025 | conferencePaper | He | GTM: a general time-series model for enhanced representation learning of time-series data | The Fourteenth International Conference on Learning Representations |
| 106 | 2026 | preprint | Guo | T-LLM: Teaching Large Language Models to Forecast Time Series via Temporal Distillation | arXiv |
| 107 | 2026 | preprint | Guan | TimeOmni-1: Incentivizing Complex Reasoning with Time Series in Large Language Models | arXiv |
| 108 | 2026 | journalArticle | Han | Unica: Unified covariate adaptation for time series foundation model | Zotero |
| 109 | 2026 | preprint | Liu | Timer-S1: A Billion-Scale Time Series Foundation Model with Serial Scaling | arXiv |
| 110 | 2025 | conferencePaper | Liu | Semantic-enhanced time-series forecasting via large language models | The Fourteenth International Conference on Learning Representations |
| 111 | 2026 | journalArticle | Chu | A²RA-NSMTSllm: Adversarially Aligning Retrieval-Augmented LLMs for Nonstationary Multivariate Time Series Forecasting | IEEE Transactions on Industrial Informatics |
| 112 | 2025 | journalArticle | Ye | TS-reasoner: domain-oriented time series inference agents for reasoning and automated analysis | Transactions on Machine Learning Research |
| 113 | 2026 | preprint | Liu | Timer-S1: A Billion-Scale Time Series Foundation Model with Serial Scaling | arXiv |
| 114 | | journalArticle | Shen | Multi-modal view enhanced large vision models for long-term time series forecasting | Zotero |