A billion-scale time-series foundation model (TSFM) utilizing a Mixture-of-Experts (MoE) architecture to handle diverse data distributions and frequencies in zero-shot forecasting.
stars: 943
forks: 110
Time-MoE represents a significant milestone in the shift from domain-specific time-series models to unified foundation models. With nearly 1,000 stars and an ICLR 2025 Spotlight designation, it carries high credibility with both academics and practitioners. Its primary moat is its application of scaling laws to time series: while models like Lag-Llama and Chronos exist, scaling to 2.4B parameters with an MoE architecture (specifically TSMixer experts) is a non-trivial engineering feat that requires substantial compute and curated data. Defensibility rates a 7 because, while the code is open, the cost of training such a model from scratch is a barrier for most teams. However, it faces high platform risk from Google (TimesFM) and Amazon (Chronos), which are incentivized to bake these capabilities directly into their cloud-based AutoML suites. The market for TSFMs is rapidly consolidating around models that can handle zero-shot forecasting across diverse industries, and Time-MoE is currently a front-runner in the open-source MoE niche. The 0.0/hr velocity reflects its status as an academic release, but the fork count (110) indicates active community experimentation and derivative work.
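For context on what zero-shot forecasting looks like in practice, here is a minimal sketch of querying a pretrained TSFM checkpoint through Hugging Face transformers. The checkpoint name "Maple728/TimeMoE-50M" and the normalize-then-generate flow are assumptions for illustration, not taken from this card.

```python
# Minimal zero-shot forecasting sketch (assumed API: Hugging Face transformers
# with trust_remote_code; the checkpoint name below is an assumption).
import torch
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained(
    "Maple728/TimeMoE-50M",   # assumed public checkpoint name
    trust_remote_code=True,   # the repo ships custom model code
)

# Two example series of 64 observations each; no fine-tuning involved.
context = torch.randn(2, 64)

# Standardize each series; TSFMs generally expect normalized inputs.
mean = context.mean(dim=-1, keepdim=True)
std = context.std(dim=-1, keepdim=True)
normed = (context - mean) / std

# Autoregressively generate the next 12 points, then undo the normalization.
horizon = 12
output = model.generate(normed, max_new_tokens=horizon)
forecast = output[:, -horizon:] * std + mean
print(forecast.shape)  # (2, 12)
```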
TECH STACK
INTEGRATION: reference_implementation
READINESS