Mirror of https://github.com/huggingface/transformers.git, synced 2025-07-28 08:42:23 +06:00
Latest commit (squashed commit messages):

* initial files
* initial model via cli
* typos
* make a start on the model config
* ready with configuation
* remove tokenizer ref.
* init the transformer
* added initial model forward to return dec_output
* require gluonts
* update dep. ver table and add as extra
* fixed typo
* add type for prediction_length
* use num_time_features
* use config
* more config
* typos
* opps another typo
* freq can be none
* default via transformation is 1
* initial transformations
* fix imports
* added transform_start_field
* add helper to create pytorch dataloader
* added inital val and test data loader
* added initial distr head and loss
* training working
* remove TimeSeriesTransformerTokenizer Co-authored-by: NielsRogge <48327001+NielsRogge@users.noreply.github.com>
* Update src/transformers/__init__.py Co-authored-by: NielsRogge <48327001+NielsRogge@users.noreply.github.com>
* Update src/transformers/models/time_series_transformer/__init__.py Co-authored-by: NielsRogge <48327001+NielsRogge@users.noreply.github.com>
* fixed copyright
* removed docs
* remove time series tokenizer
* fixed docs
* fix text
* fix second
* fix default
* fix order
* use config directly
* undo change
* fix comment
* fix year
* fix import
* add additional arguments for training vs. test
* initial greedy inference loop
* fix inference
* comment out token inputs to enc dec
* Use HF encoder/decoder
* fix inference
* Use Seq2SeqTSModelOutput output
* return Seq2SeqTSPredictionOutput
* added default arguments
* fix return_dict true
* scale is a tensor
* output static_features for inference
* clean up some unused bits
* fixed typo
* set return_dict if none
* call model once for both train/predict
* use cache if future_target is none
* initial generate func
* generate arguments
* future_time_feat is required
* return SampleTSPredictionOutput
* removed unneeded classes
* fix when params is none
* fix return dict
* fix num_attention_heads
* fix arguments
* remove unused shift_tokens_right
* add different dropout configs
* implement FeatureEmbedder, Scaler and weighted_average
* remove gluonts dependency
* fix class names
* avoid _variable names
* remove gluonts dependency
* fix imports
* remove gluonts from configuration
* fix docs
* fixed typo
* move utils to examples
* add example requirements
* config has no freq
* initial run_ts_no_trainer
* remove from ignore
* fix output_attentions and removed unsued getters/setters
* removed unsed tests
* add dec seq len
* add test_attention_outputs
* set has_text_modality=False
* add config attribute_map
* make style
* make fix-copies
* add encoder_outputs to TimeSeriesTransformerForPrediction forward
* Improve docs, add model to README
* added test_forward_signature
* More improvements
* Add more copied from
* Fix README
* Fix remaining quality issues
* updated encoder and decoder
* fix generate
* output_hidden_states and use_cache are optional
* past key_values returned too
* initialize weights of distribution_output module
* fixed more tests
* update test_forward_signature
* fix return_dict outputs
* Update src/transformers/models/time_series_transformer/configuration_time_series_transformer.py Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>
* Update src/transformers/models/time_series_transformer/configuration_time_series_transformer.py Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>
* Update src/transformers/models/time_series_transformer/configuration_time_series_transformer.py Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>
* Update src/transformers/models/time_series_transformer/configuration_time_series_transformer.py Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>
* Update src/transformers/models/time_series_transformer/modeling_time_series_transformer.py Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>
* Update src/transformers/models/time_series_transformer/modeling_time_series_transformer.py Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>
* Update src/transformers/models/time_series_transformer/modeling_time_series_transformer.py Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>
* removed commented out tests
* added neg. bin and normal output
* Update src/transformers/models/time_series_transformer/configuration_time_series_transformer.py Co-authored-by: NielsRogge <48327001+NielsRogge@users.noreply.github.com>
* move to one line
* Add docstrings
* Update src/transformers/models/time_series_transformer/configuration_time_series_transformer.py Co-authored-by: NielsRogge <48327001+NielsRogge@users.noreply.github.com>
* add try except for assert and raise
* try and raise exception
* fix the documentation formatting
* fix assert call
* fix docstring formatting
* removed input_ids from DOCSTRING
* Update input docstring
* Improve variable names
* Update order of inputs
* Improve configuration
* Improve variable names
* Improve docs
* Remove key_length from tests
* Add extra docs
* initial unittests
* added test_inference_no_head test
* added test_inference_head
* add test_seq_to_seq_generation
* make style
* one line
* assert mean prediction
* removed comments
* Update src/transformers/models/time_series_transformer/modeling_time_series_transformer.py Co-authored-by: NielsRogge <48327001+NielsRogge@users.noreply.github.com>
* Update src/transformers/models/time_series_transformer/modeling_time_series_transformer.py Co-authored-by: NielsRogge <48327001+NielsRogge@users.noreply.github.com>
* fix order of args
* make past_observed_mask optional as well
* added Amazon license header
* updated utils with new fieldnames
* make style
* cleanup
* undo position of past_observed_mask
* fix import
* typo
* more typo
* rename example files
* remove example for now
* Update docs/source/en/_toctree.yml Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>
* Update src/transformers/models/time_series_transformer/configuration_time_series_transformer.py Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>
* Update src/transformers/models/time_series_transformer/modeling_time_series_transformer.py Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>
* Update src/transformers/models/time_series_transformer/modeling_time_series_transformer.py Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>
* Update modeling_time_series_transformer.py fix style
* fixed typo
* fix typo and grammer
* fix style

Co-authored-by: NielsRogge <48327001+NielsRogge@users.noreply.github.com>
Co-authored-by: NielsRogge <niels.rogge1@gmail.com>
Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>
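The squashed history above outlines the public surface of the model it adds: a `TimeSeriesTransformerForPrediction` head with a distribution output, a sampling `generate` loop that returns `SampleTSPredictionOutput`, and an optional `past_observed_mask`. Below is a minimal, hedged usage sketch; the class and argument names follow the commit messages and model docstrings, but the hyperparameter values, random tensors, and shapes are illustrative assumptions to check against the released API.

```python
import torch
from transformers import TimeSeriesTransformerConfig, TimeSeriesTransformerForPrediction

# Illustrative hyperparameters (assumptions, not values taken from the commit).
config = TimeSeriesTransformerConfig(
    prediction_length=24,   # forecast horizon
    context_length=48,      # encoder context window
    num_time_features=2,    # e.g. an age feature plus one calendar feature
)
model = TimeSeriesTransformerForPrediction(config)

batch_size = 4
# The past window is longer than context_length because the model builds
# lagged subsequences internally from config.lags_sequence.
past_length = config.context_length + max(config.lags_sequence)

past_values = torch.randn(batch_size, past_length)
past_time_features = torch.randn(batch_size, past_length, config.num_time_features)
past_observed_mask = torch.ones(batch_size, past_length)  # optional, per the commit
future_values = torch.randn(batch_size, config.prediction_length)
future_time_features = torch.randn(batch_size, config.prediction_length, config.num_time_features)

# Training-style forward pass: with future_values given, the distribution head
# returns a negative log-likelihood loss.
outputs = model(
    past_values=past_values,
    past_time_features=past_time_features,
    past_observed_mask=past_observed_mask,
    future_values=future_values,
    future_time_features=future_time_features,
)
print(outputs.loss)

# Inference: generate() runs the autoregressive sampling loop and returns a
# SampleTSPredictionOutput whose `sequences` hold Monte Carlo sample paths.
model.eval()
with torch.no_grad():
    forecasts = model.generate(
        past_values=past_values,
        past_time_features=past_time_features,
        past_observed_mask=past_observed_mask,
        future_time_features=future_time_features,
    )
# Shape: (batch_size, config.num_parallel_samples, config.prediction_length)
print(forecasts.sequences.shape)
```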
albert
auto
bart
barthez
bartpho
beit
bert
bert_generation
bert_japanese
bertweet
big_bird
bigbird_pegasus
blenderbot
blenderbot_small
bloom
bort
byt5
camembert
canine
clip
codegen
conditional_detr
convbert
convnext
cpm
ctrl
cvt
data2vec
deberta
deberta_v2
decision_transformer
deformable_detr
deit
detr
distilbert
dit
donut
dpr
dpt
electra
encoder_decoder
ernie
esm
flaubert
flava
fnet
fsmt
funnel
glpn
gpt_neo
gpt_neox
gpt_neox_japanese
gpt2
gptj
groupvit
herbert
hubert
ibert
imagegpt
layoutlm
layoutlmv2
layoutlmv3
layoutxlm
led
levit
longformer
longt5
luke
lxmert
m2m_100
marian
markuplm
maskformer
mbart
mbart50
mctct
megatron_bert
megatron_gpt2
mluke
mobilebert
mobilevit
mpnet
mt5
mvp
nezha
nllb
nystromformer
openai
opt
owlvit
pegasus
pegasus_x
perceiver
phobert
plbart
poolformer
prophetnet
qdqbert
rag
realm
reformer
regnet
rembert
resnet
retribert
roberta
roformer
segformer
sew
sew_d
speech_encoder_decoder
speech_to_text
speech_to_text_2
splinter
squeezebert
swin
swinv2
t5
tapas
tapex
time_series_transformer
trajectory_transformer
transfo_xl
trocr
unispeech
unispeech_sat
van
videomae
vilt
vision_encoder_decoder
vision_text_dual_encoder
visual_bert
vit
vit_mae
vit_msn
wav2vec2
wav2vec2_conformer
wav2vec2_phoneme
wav2vec2_with_lm
wavlm
x_clip
xglm
xlm
xlm_prophetnet
xlm_roberta
xlm_roberta_xl
xlnet
yolos
yoso
__init__.py
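Each directory listed above packages one model family (configuration, modeling, and processing code), and most of them register a `model_type` string in the auto mappings. As a hedged illustration only, the snippet below resolves such a type name to its configuration class; the chosen type string and keyword argument are assumptions for the example, and a few directory names do not match their registered `model_type` exactly (e.g. the `openai` directory registers `openai-gpt`).

```python
from transformers import AutoConfig

# Hedged sketch: AutoConfig.for_model looks the model-type string up in the
# auto config mapping and instantiates that family's config class.
# "time_series_transformer" and prediction_length=24 are assumptions chosen
# to match the commit above; any registered model type works the same way.
config = AutoConfig.for_model("time_series_transformer", prediction_length=24)
print(type(config).__name__)  # TimeSeriesTransformerConfig
```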