Several smaller versions of the models have been trained on the same dataset. BLOOM is available in the following versions; a short loading sketch follows the list:
- [bloom-7b1](https://huggingface.co/bigscience/bloom-7b1)
- [bloom](https://huggingface.co/bigscience/bloom) (176B parameters)
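Any of these checkpoints can be loaded with the usual `from_pretrained` API. The snippet below is a minimal sketch using the `bigscience/bloom-7b1` checkpoint listed above; note that it is still a multi-gigabyte download, and the full 176B `bigscience/bloom` checkpoint generally requires a multi-GPU or offloaded setup.

```python
# Minimal loading sketch for one of the checkpoints listed above.
# bigscience/bloom-7b1 is still a multi-gigabyte download; the full 176B
# bigscience/bloom checkpoint generally needs multi-GPU or offloading.
from transformers import AutoModelForCausalLM, AutoTokenizer

checkpoint = "bigscience/bloom-7b1"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForCausalLM.from_pretrained(checkpoint)
```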
## Resources
A list of official Hugging Face and community (indicated by 🌎) resources to help you get started with BLOOM. If you're interested in submitting a resource to be included here, please feel free to open a Pull Request and we'll review it! The resource should ideally demonstrate something new instead of duplicating an existing resource.
<PipelineTag pipeline="text-generation"/>
- [`BloomForCausalLM`] is supported by this [causal language modeling example script](https://github.com/huggingface/transformers/tree/main/examples/pytorch/language-modeling#gpt-2gpt-and-causal-language-modeling) and [notebook](https://colab.research.google.com/github/huggingface/notebooks/blob/main/examples/language_modeling.ipynb); a minimal generation sketch follows below.
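As a quick complement to the example script and notebook, here is a minimal sketch of greedy generation with [`BloomForCausalLM`]; the checkpoint name, prompt, and generation settings are illustrative choices, not recommendations.

```python
# Minimal causal-generation sketch with BloomForCausalLM.
# Checkpoint, prompt, and generation settings are illustrative only.
from transformers import BloomForCausalLM, BloomTokenizerFast

checkpoint = "bigscience/bloom-7b1"
tokenizer = BloomTokenizerFast.from_pretrained(checkpoint)
model = BloomForCausalLM.from_pretrained(checkpoint)

inputs = tokenizer("BLOOM is a multilingual language model that", return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=30, do_sample=False)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```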
⚡️ Inference
- A blog on [Optimization story: Bloom inference](https://huggingface.co/blog/bloom-inference-optimization).
- A blog on [Incredibly Fast BLOOM Inference with DeepSpeed and Accelerate](https://huggingface.co/blog/bloom-inference-pytorch-scripts).
⚙️ Training
- A blog on [The Technology Behind BLOOM Training](https://huggingface.co/blog/bloom-megatron-deepspeed).
## BloomConfig
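As a rough illustration of the configuration class, the sketch below instantiates a [`BloomConfig`] by hand and builds a small, randomly initialized model from it; the hyperparameter values are arbitrary and do not correspond to any released checkpoint.

```python
# Sketch: build a small, randomly initialized BLOOM model from a hand-written config.
# The hyperparameters below are arbitrary and do not match any released checkpoint.
from transformers import BloomConfig, BloomModel

config = BloomConfig(hidden_size=256, n_layer=4, n_head=8)
model = BloomModel(config)
print(model.config)
```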