transformers/docs/source/en/main_classes
Joao Gante b0f0c61899
Add SynthID (watermarking by Google DeepMind) (#34350)
* Add SynthIDTextWatermarkLogitsProcessor

* Resolving comments.

* Resolving comments.

* Resolving commits.

* Improving SynthIDWatermark tests.

* switch to PT version

* detector as pretrained model + style

* update training + style

* rebase

* Update logits_process.py

* Improving SynthIDWatermark tests.

* Shift detector training to wikitext negatives and stabilize with lower learning rate.

* Clean up.

* in for 7B

* cleanup

* Support Python 3.8.

* README and final cleanup.

* HF Hub upload and initialize.

* Update requirements for synthid_text.

* Adding SynthIDTextWatermarkDetector.

* Detector testing.

* Documentation changes.

* Copyrights fix.

* Fix detector api.

* ironing out errors

* ironing out errors

* training checks

* make fixup and make fix-copies

* docstrings and add to docs

* copyright

* BC

* test docstrings

* move import

* protect type hints

* top level imports

* watermarking example

* direct imports

* tpr fpr meaning

* process_kwargs

* SynthIDTextWatermarkingConfig docstring

* assert -> exception

* example updates

* no immutable dict (cant be serialized)

* pack fn

* einsum equivalent

* import order

* fix test on gpu

* add detector example
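
The watermarking and detector examples added in this PR rest on a common idea: a keyed pseudorandom function scores each token given its n-gram context, so watermarked generations score measurably higher than unwatermarked text. A minimal conceptual sketch of that scoring step is below — it is illustrative only (SynthID's actual tournament-sampling scheme and the `SynthIDTextWatermarkDetector` API differ), and `g_value` / `mean_g_score` are hypothetical helper names, not part of the library:

```python
import hashlib

def g_value(context, key):
    # Hypothetical helper: derive a pseudorandom bit from an
    # n-gram context and a secret integer key via SHA-256.
    data = ",".join(map(str, context)) + "|" + str(key)
    digest = hashlib.sha256(data.encode("utf-8")).digest()
    return digest[0] & 1

def mean_g_score(token_ids, keys, ngram_len=5):
    # Average g-value over sliding n-grams. Unwatermarked token
    # sequences should average near 0.5; a generator that biases
    # sampling toward high-g tokens pushes this average upward,
    # which is what a detector tests for.
    scores = []
    for i in range(ngram_len - 1, len(token_ids)):
        ngram = token_ids[i - ngram_len + 1 : i + 1]
        for key in keys:
            scores.append(g_value(ngram, key))
    return sum(scores) / len(scores)
```

In the real feature, `model.generate(..., watermarking_config=SynthIDTextWatermarkingConfig(keys=..., ngram_len=...))` applies the bias at sampling time and the detector recovers the score from generated token ids.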

---------

Co-authored-by: Sumedh Ghaisas <sumedhg@google.com>
Co-authored-by: Marc Sun <marc@huggingface.co>
Co-authored-by: sumedhghaisas2 <138781311+sumedhghaisas2@users.noreply.github.com>
Co-authored-by: raushan <raushan@huggingface.co>
2024-10-23 21:18:52 +01:00
agent.md Decorator for easier tool building (#33439) 2024-09-18 11:07:51 +02:00
backbones.md doc: fix broken BEiT and DiNAT model links on Backbone page (#32029) 2024-07-17 20:24:10 +01:00
callback.md Update CometCallback to allow reusing of the running experiment (#31366) 2024-07-05 08:13:46 +02:00
configuration.md Migrate doc files to Markdown. (#24376) 2023-06-20 18:07:47 -04:00
data_collator.md Enhancing SFT Training Efficiency Using Packing and FlashAttention2 with Position IDs (#31629) 2024-07-23 15:56:41 +02:00
deepspeed.md [docs] DeepSpeed (#28542) 2024-01-24 08:31:28 -08:00
executorch.md Fix flax failures (#33912) 2024-10-11 14:38:35 +02:00
feature_extractor.md Fixed typos (#26810) 2023-10-16 09:52:29 +02:00
image_processor.md Fast image processor (#28847) 2024-06-11 15:47:38 +01:00
keras_callbacks.md Migrate doc files to Markdown. (#24376) 2023-06-20 18:07:47 -04:00
logging.md Fixed Majority of the Typos in transformers[en] Documentation (#33350) 2024-09-09 10:47:24 +02:00
model.md Speedup model init on CPU (by 10x+ for llama-3-8B as one example) (#31771) 2024-07-16 09:32:01 -04:00
onnx.md Migrate doc files to Markdown. (#24376) 2023-06-20 18:07:47 -04:00
optimizer_schedules.md Fixed Majority of the Typos in transformers[en] Documentation (#33350) 2024-09-09 10:47:24 +02:00
output.md Fixed Majority of the Typos in transformers[en] Documentation (#33350) 2024-09-09 10:47:24 +02:00
pipelines.md Allow FP16 or other precision inference for Pipelines (#31342) 2024-07-05 17:21:50 +01:00
processors.md [docs] fixed links with 404 (#27327) 2023-11-06 19:45:03 +00:00
quantization.md FEAT : Adding BitNet quantization method to HFQuantizer (#33410) 2024-10-09 17:51:41 +02:00
text_generation.md Add SynthID (watermarking by Google DeepMind) (#34350) 2024-10-23 21:18:52 +01:00
tokenizer.md [PretrainedTokenizer] add some of the most important functions to the doc (#27313) 2023-11-06 15:11:00 +01:00
trainer.md Fixed Majority of the Typos in transformers[en] Documentation (#33350) 2024-09-09 10:47:24 +02:00