[Time-Series] Added blog-post to tips (#24482)
* [Time-Series] Added blog-post to tips
* added Resources to time series models docs
* removed "with Bert"
parent e16191a8ac
commit fc7ce2ebc5
@@ -29,6 +29,12 @@ The abstract from the paper is the following:
This model was contributed by [elisim](https://huggingface.co/elisim) and [kashif](https://huggingface.co/kashif).
The original code can be found [here](https://github.com/thuml/Autoformer).

## Resources

A list of official Hugging Face and community (indicated by 🌎) resources to help you get started. If you're interested in submitting a resource to be included here, please feel free to open a Pull Request and we'll review it! The resource should ideally demonstrate something new instead of duplicating an existing resource.

- Check out the Autoformer blog post on the HuggingFace blog: [Yes, Transformers are Effective for Time Series Forecasting (+ Autoformer)](https://huggingface.co/blog/autoformer)

## AutoformerConfig

[[autodoc]] AutoformerConfig
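As a quick orientation for the new section, here is a minimal sketch of instantiating the configuration and prediction model; the parameter values are illustrative assumptions, not anything prescribed by the docs:

```python
from transformers import AutoformerConfig, AutoformerForPrediction

# Illustrative values only; choose them to match your dataset and covariates.
config = AutoformerConfig(
    prediction_length=24,  # forecast horizon in time steps
    context_length=48,     # history window the encoder sees
    num_time_features=2,   # e.g. month-of-year plus an "age" feature
)

# Randomly initialised model with a distribution head on top, ready for training.
model = AutoformerForPrediction(config)
```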
@@ -43,4 +49,4 @@ The original code can be found [here](https://github.com/thuml/Autoformer).
## AutoformerForPrediction

[[autodoc]] AutoformerForPrediction
    - forward
@@ -29,7 +29,10 @@ The abstract from the paper is the following:
This model was contributed by [elisim](https://huggingface.co/elisim) and [kashif](https://huggingface.co/kashif).
The original code can be found [here](https://github.com/zhouhaoyi/Informer2020).

Tips:
## Resources

A list of official Hugging Face and community (indicated by 🌎) resources to help you get started. If you're interested in submitting a resource to be included here, please feel free to open a Pull Request and we'll review it! The resource should ideally demonstrate something new instead of duplicating an existing resource.

- Check out the Informer blog post on the HuggingFace blog: [Multivariate Probabilistic Time Series Forecasting with Informer](https://huggingface.co/blog/informer)

## InformerConfig
@@ -29,7 +29,6 @@ The Time Series Transformer model is a vanilla encoder-decoder Transformer for time series forecasting.

Tips:

- Check out the Time Series Transformer blog post on the HuggingFace blog: [Probabilistic Time Series Forecasting with 🤗 Transformers](https://huggingface.co/blog/time-series-transformers)
- Similar to other models in the library, [`TimeSeriesTransformerModel`] is the raw Transformer without any head on top, and [`TimeSeriesTransformerForPrediction`]
adds a distribution head on top of the former, which can be used for time-series forecasting. Note that this is a so-called probabilistic forecasting model, not a
point forecasting model. This means that the model learns a distribution, from which one can sample. The model doesn't directly output values (see the sketch below).
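To make the probabilistic-forecasting point concrete, here is a minimal sketch of sampling forecasts with [`TimeSeriesTransformerForPrediction`]. It assumes the pretrained checkpoint and the small example batch used in the library's own doc examples are still available on the Hub; `point_forecast` is just an illustrative name:

```python
import torch
from huggingface_hub import hf_hub_download
from transformers import TimeSeriesTransformerForPrediction

# Small pre-built batch of monthly tourism data used in the library's doc examples.
file = hf_hub_download(
    repo_id="hf-internal-testing/tourism-monthly-batch", filename="val-batch.pt", repo_type="dataset"
)
batch = torch.load(file)

model = TimeSeriesTransformerForPrediction.from_pretrained(
    "huggingface/time-series-transformer-tourism-monthly"
)

# `generate` draws samples from the learned output distribution rather than
# returning a single point forecast.
with torch.no_grad():
    outputs = model.generate(
        past_values=batch["past_values"],
        past_time_features=batch["past_time_features"],
        past_observed_mask=batch["past_observed_mask"],
        static_categorical_features=batch["static_categorical_features"],
        future_time_features=batch["future_time_features"],
    )

# outputs.sequences has shape (batch_size, num_parallel_samples, prediction_length);
# averaging over the sample dimension gives one possible point estimate.
point_forecast = outputs.sequences.mean(dim=1)
```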
@@ -60,6 +59,12 @@ which is then fed to the decoder in order to make the next prediction (also called autoregressive generation).

This model was contributed by [kashif](https://huggingface.co/kashif).

## Resources

A list of official Hugging Face and community (indicated by 🌎) resources to help you get started. If you're interested in submitting a resource to be included here, please feel free to open a Pull Request and we'll review it! The resource should ideally demonstrate something new instead of duplicating an existing resource.

- Check out the Time Series Transformer blog post on the HuggingFace blog: [Probabilistic Time Series Forecasting with 🤗 Transformers](https://huggingface.co/blog/time-series-transformers)

## TimeSeriesTransformerConfig