
Question, apologies if slightly off-topic, it's something I'd like to use this project for: Is there an example of how to train GPT-2 on time series, in particular with covariates?

As far as my basic understanding of LLMs goes, they predict the next token from previous tokens, which sounds directionally similar to time series forecasting (perhaps setting aside periodicity).
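
To make the analogy concrete, here is a rough sketch of what I mean: quantize the series into bins and treat the bin indices as tokens for a small GPT-2. The bin count, model sizes, and quantization scheme are just my own assumptions, and covariates are exactly the part I don't know how to wire in.

    # Sketch only: quantize a continuous series into discrete bins so a
    # GPT-2-style model can treat each bin index as a "token".
    # Bin count and model sizes below are arbitrary assumptions.
    import numpy as np
    import torch
    from transformers import GPT2Config, GPT2LMHeadModel

    series = np.sin(np.linspace(0, 20, 512)) + 0.1 * np.random.randn(512)

    n_bins = 256
    edges = np.quantile(series, np.linspace(0, 1, n_bins + 1)[1:-1])
    tokens = np.digitize(series, edges)              # bin indices in [0, n_bins - 1]

    config = GPT2Config(vocab_size=n_bins, n_positions=512,
                        n_embd=128, n_layer=4, n_head=4)
    model = GPT2LMHeadModel(config)

    input_ids = torch.tensor(tokens[None, :])        # (batch=1, seq_len)
    out = model(input_ids=input_ids, labels=input_ids)   # next-token prediction loss
    out.loss.backward()                               # gradients for one training step
    # Covariates would need extra inputs/embeddings, which plain GPT-2 lacks.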



Yes, there have been many attempts at applying transformers to time series forecasting. For instance (there are many more):

- TimeGPT: https://arxiv.org/abs/2310.03589
- Chronos: https://github.com/amazon-science/chronos-forecasting
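
For a sense of what these look like in practice, here is roughly the zero-shot usage from the Chronos README. The checkpoint name, toy context, and horizon are illustrative, and note it takes only the raw series, no covariates.

    # Rough sketch after the chronos-forecasting README; checkpoint and
    # prediction_length are illustrative, not recommendations.
    import torch
    from chronos import ChronosPipeline

    pipeline = ChronosPipeline.from_pretrained(
        "amazon/chronos-t5-small", device_map="cpu", torch_dtype=torch.float32
    )

    context = torch.sin(torch.linspace(0, 20, 200))              # toy 1-D history
    forecast = pipeline.predict(context, prediction_length=24)   # sampled futures
    median = forecast.quantile(0.5, dim=1)                       # point forecast per step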

These kinds of papers tend to overpromise and often lack a proper baseline: they only compare against very simple methods (naive forecast) or untuned models. In my experience a gradient boosting model will solve 95% of your forecasting problems, and trying to get fancy with a transformer (or even just a simple neural net) is more trouble than it is worth.
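
To illustrate, the whole "boring" approach is a few lines: lag the target, join the covariates, fit a tree ensemble. The toy data and feature choices below are made up for the example.

    # Minimal sketch of the gradient-boosting approach:
    # lagged values of the target plus any covariates as plain regression features.
    import numpy as np
    import pandas as pd
    from sklearn.ensemble import HistGradientBoostingRegressor

    rng = np.random.default_rng(0)
    n = 1000
    df = pd.DataFrame({
        "y": np.sin(np.arange(n) / 10) + 0.1 * rng.standard_normal(n),
        "temperature": rng.standard_normal(n),   # example covariate
        "dow": np.arange(n) % 7,                 # example calendar covariate
    })

    # Lag features turn "forecast the next step" into ordinary regression.
    for lag in (1, 2, 3, 7):
        df[f"y_lag{lag}"] = df["y"].shift(lag)
    df = df.dropna()

    X, y = df.drop(columns="y"), df["y"]
    split = int(len(df) * 0.8)                   # time-ordered split, no shuffling

    model = HistGradientBoostingRegressor()
    model.fit(X.iloc[:split], y.iloc[:split])
    preds = model.predict(X.iloc[split:])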


Yes, general-purpose LLMs can be used for time series forecasting:

https://github.com/KimMeen/Time-LLM



