Huggingface transformers: cannot import BitsAndBytesConfig from transformers
Matthew Barrera
Following the Hugging Face quantization guide, I installed the following:

pip install transformers accelerate bitsandbytes

(It yielded transformers 4.26.0, accelerate 0.16.0, and bitsandbytes 0.37.0, which seems to match the guide's requirements.)
Then I ran the first line of the offload code in Python:

from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

However, it resulted in the following error:

ImportError: cannot import name 'BitsAndBytesConfig' from 'transformers' (/usr/local/lib/python3.10/dist-packages/transformers/__init__.py)
Running grep -r BitsAndBytesConfig /usr/local/lib/python3.10/dist-packages yields nothing.
Is there a step I might have skipped, or a version inconsistency I could work around?
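For reference, a symbol can also be probed without grepping site-packages. This small helper (my own, not from the guide) checks whether a module exposes a given attribute:

```python
import importlib


def module_has_attr(module_name, attr_name):
    """Return True if `attr_name` can be imported from `module_name`."""
    try:
        module = importlib.import_module(module_name)
    except ImportError:
        # The module itself is not installed
        return False
    return hasattr(module, attr_name)


# With transformers 4.26.0 installed, this prints False,
# consistent with the ImportError above
print(module_has_attr("transformers", "BitsAndBytesConfig"))
```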
2 Answers
BitsAndBytesConfig was added to transformers only recently, and the latest PyPI release predates it.
The online documentation is generated from the mdx files in the source repository, so it sometimes references features that have not yet been released. You can try installing transformers from source:
# Install from the original GitHub repository
pip install git+https://github.com/huggingface/transformers

Then the import should work:

from transformers import BitsAndBytesConfig
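To see why the pinned install fails, a quick version comparison helps. As an assumption for illustration, 4.27.0 is used here as the first release shipping BitsAndBytesConfig; the point is only that the installed 4.26.0 predates the feature:

```python
def version_tuple(version):
    """Parse a dotted version string like '4.26.0' into a comparable tuple."""
    return tuple(int(part) for part in version.split("."))


installed = version_tuple("4.26.0")  # version pip resolved from PyPI
cutoff = version_tuple("4.27.0")     # assumed first release with BitsAndBytesConfig

# Tuple comparison is lexicographic, so this orders versions correctly
print(installed < cutoff)  # True: the installed release predates the symbol
```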