```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("TitanML/tiny-mixtral")
model = AutoModelForCausalLM.from_pretrained("TitanML/tiny-mixtral")
```
This is a tiny, randomly initialized Mixtral model, useful for testing and CI/CD pipelines. It is not trained at all and is not suitable for inference in any real application.
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-generation", model="TitanML/tiny-mixtral")
```
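As a sketch of how this model might be used in a CI smoke test (the prompt and assertions below are illustrative, not part of the model card), one could verify only that generation runs end to end, since the random weights produce meaningless text:

```python
from transformers import pipeline

# Hypothetical CI smoke test: the weights are random, so we check only
# that the pipeline loads and returns text, not that the text is sensible.
pipe = pipeline("text-generation", model="TitanML/tiny-mixtral")
result = pipe("Hello world", max_new_tokens=8)

# pipeline returns a list of dicts with a "generated_text" key
assert isinstance(result, list)
assert isinstance(result[0]["generated_text"], str)
```

Because the checkpoint is tiny, such a test downloads and runs quickly, which is the point of using an untrained model in CI.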