Huggingface transformers: cannot import BitsAndBytesConfig from transformers

Asked by Matthew Barrera

Following the Hugging Face quantization guide, I installed the following:

pip install transformers accelerate bitsandbytes

(It yielded transformers 4.26.0, accelerate 0.16.0, bitsandbytes 0.37.0, which seems to match the guide’s requirements.)
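As a sanity check, the installed versions can be compared against a minimum programmatically. This is a minimal sketch using only the standard library; the minimum version used in the example is a hypothetical placeholder, not the guide's stated requirement:

```python
# Sketch: compare dotted version strings without third-party packages.
# The minimum version below is an assumption for illustration only.
def parse_version(v):
    """Turn '4.26.0' into a comparable tuple (4, 26, 0)."""
    return tuple(int(part) for part in v.split("."))

def meets_minimum(installed, minimum):
    """True if the installed version is at least the minimum."""
    return parse_version(installed) >= parse_version(minimum)

print(meets_minimum("4.26.0", "4.27.0"))  # False: 4.26.0 predates 4.27.0
print(meets_minimum("0.37.0", "0.37.0"))  # True: exact match counts
```

Note this naive tuple comparison ignores pre-release suffixes like `4.27.0.dev0`; for real version handling, `packaging.version.parse` is more robust.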

Then ran the first line of the offload code in Python:

from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

However, it resulted in the following error:

ImportError: cannot import name 'BitsAndBytesConfig' from 'transformers' (/usr/local/lib/python3.10/dist-packages/transformers/__init__.py)

Doing grep BitsAndBytesConfig -r /usr/local/lib/python3.10/dist-packages yields nothing.
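A quick programmatic equivalent of that grep is to check whether the installed package actually exposes the symbol. This is a generic sketch (demonstrated on a stdlib module so it runs anywhere); with transformers installed, one would call `has_symbol("transformers", "BitsAndBytesConfig")`:

```python
import importlib

def has_symbol(module_name, symbol):
    """Return True if module_name imports cleanly and exposes symbol."""
    try:
        module = importlib.import_module(module_name)
    except ImportError:
        return False
    return hasattr(module, symbol)

print(has_symbol("math", "sqrt"))          # True: symbol exists
print(has_symbol("math", "no_such_name"))  # False: module lacks the symbol
```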

Is there a step I might have skipped, or a version inconsistency I could work around?

2 Answers

BitsAndBytesConfig was added to the transformers source only recently, after the latest PyPI release was cut, so it is not present in the 4.26.0 package you installed.

The online documentation is generated from the source's .mdx files, so it sometimes references features that have not been released yet. You can try installing from source instead:

pip install git+

Install from the original GitHub repository:

pip install git+
from transformers import BitsAndBytesConfig
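If the code needs to run on machines with both old and new transformers installs, a guarded import avoids a hard crash. A minimal sketch, assuming only that the missing name raises ImportError:

```python
# Sketch: tolerate an older transformers that predates BitsAndBytesConfig.
# ModuleNotFoundError is a subclass of ImportError, so this also covers
# environments where transformers is not installed at all.
try:
    from transformers import BitsAndBytesConfig
except ImportError:
    BitsAndBytesConfig = None

if BitsAndBytesConfig is None:
    print("BitsAndBytesConfig unavailable; install transformers from source")
```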
