How to use zai-org/chatglm2-6b with Transformers:
```python
# Load model directly
from transformers import AutoModel

model = AutoModel.from_pretrained("zai-org/chatglm2-6b", trust_remote_code=True, dtype="auto")
```
Override get_output_embeddings() in the model class so that it returns the actual LM-head module. This is useful for applications that need to locate the LM-head module through the standard accessor.
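A minimal sketch of that override, using lightweight stand-in classes instead of the real model (no weights are loaded). The attribute path `transformer.output_layer` is an assumption about where the ChatGLM2 remote-code class keeps its LM head; check the actual modeling file for your revision before relying on it.

```python
# Pure-Python stand-ins; in the real model these would be nn.Module subclasses.
class Linear:
    """Stand-in for the nn.Linear LM-head projection."""


class ChatGLMModel:
    def __init__(self):
        # Assumed attribute name for the LM head in the ChatGLM2 remote code.
        self.output_layer = Linear()


class ChatGLMForConditionalGeneration:
    def __init__(self):
        self.transformer = ChatGLMModel()

    def get_output_embeddings(self):
        # Override so tooling (weight tying, quantization, head inspection)
        # can find the LM head via the standard Transformers accessor.
        return self.transformer.output_layer


model = ChatGLMForConditionalGeneration()
head = model.get_output_embeddings()
```

With this in place, utilities that call `model.get_output_embeddings()` receive the LM-head module directly instead of `None`, which is what the base implementation returns when the method is not overridden.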