We released a family of open-weight models today, 4M to 2.5B params.
It's the first time we've scaled time series foundation models to this size and trained them long enough to see performance keep improving with scale.
These were trained on internal Datadog observability data (from monitoring ourselves) and synthetic data.
More details in our blog post (https://www.datadoghq.com/blog/ai/toto-2/); a technical report is coming soon.
Happy to answer questions.