A PyTorch library of transformer models and components
State-of-the-art transformers, brick by brick
Curated Transformers is a transformer library for PyTorch. It provides state-of-the-art models that are composed from a set of reusable components. The stand-out features of Curated Transformers are:
- Supports dynamic 8-bit and 4-bit quantization of models through the bitsandbytes library, and each model can use the PyTorch meta device to avoid unnecessary allocations and initialization.

Curated Transformers has been production-tested by Explosion and will be used as the default transformer implementation in spaCy 3.7.
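The meta-device trick mentioned above can be seen in plain PyTorch (this sketch uses only torch, not Curated Transformers itself): a module built on the meta device carries shapes and dtypes but no storage, and is later materialized on a real device.

```python
import torch

# Parameters created on the "meta" device have shapes and dtypes but no
# backing storage, so building even a large module allocates no memory.
layer = torch.nn.Linear(4096, 4096, device="meta")
print(layer.weight.is_meta)  # True: no storage has been allocated

# to_empty() materializes the module on a real device with *uninitialized*
# storage, ready to be overwritten by checkpoint weights instead of paying
# for random initialization.
layer = layer.to_empty(device="cpu")
print(layer.weight.is_meta)  # False: storage now exists on the CPU
```

This is why loading a checkpoint through the meta device avoids both the memory spike of a throwaway random initialization and the cost of computing it.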
Supported encoder-only models:
Supported decoder-only models:
Generator wrappers:
All types of models can be loaded from the Hugging Face Hub.
spaCy integration for Curated Transformers is provided by the spacy-curated-transformers package.
```shell
pip install curated-transformers
```
The default Linux build of PyTorch is built with CUDA 11.7 support. You should explicitly install a CUDA build in the following cases:
In both cases, you can install PyTorch with:
```shell
pip install torch --index-url https://download.pytorch.org/whl/cu118
```
```python
>>> import torch
>>> from curated_transformers.generation import AutoGenerator, GreedyGeneratorConfig
>>> generator = AutoGenerator.from_hf_hub(name="tiiuae/falcon-7b-instruct", device=torch.device("cuda"))
>>> generator(["What is Python in one sentence?", "What is Rust in one sentence?"], GreedyGeneratorConfig())
['Python is a high-level programming language that is easy to learn and widely used for web development, data analysis, and automation.',
 'Rust is a programming language that is designed to be a safe, concurrent, and efficient replacement for C++.']
```
You can find more usage examples in the documentation. You can also find example programs that use Curated Transformers in the examples directory.
You can read more about how to use Curated Transformers here:
Curated Transformers supports dynamic 8-bit and 4-bit quantization of models through the bitsandbytes library.
Use the quantization variant to automatically install the necessary dependencies:
```shell
pip install curated-transformers[quantization]
```
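For intuition about what dynamic 8-bit quantization does, here is a minimal pure-Python sketch of the absmax scheme that schemes like bitsandbytes build on: weights are rescaled into the int8 range and a per-tensor scale factor is kept for dequantization. This is an illustration only, not the library's implementation.

```python
def quantize_absmax_8bit(values):
    """Scale float values into the int8 range [-127, 127], keeping the scale."""
    scale = max(abs(v) for v in values) / 127.0
    quantized = [round(v / scale) for v in values]
    return quantized, scale

def dequantize_absmax_8bit(quantized, scale):
    """Recover approximate float values from the int8 codes and the scale."""
    return [q * scale for q in quantized]

weights = [0.5, -1.27, 0.03, 1.0]
q, scale = quantize_absmax_8bit(weights)
print(q)                                  # int8 codes, e.g. [50, -127, 3, 100]
print(dequantize_absmax_8bit(q, scale))   # close to the original weights
```

Storing one byte per weight (plus a scale) instead of four is what lets quantized models fit in a fraction of the memory, at the cost of a small rounding error per weight.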