Model Card for Melinoe-14B

Model Information

  • Model Name: Melinoe-14B
  • Base Model: Qwen3-14B
  • Model Type: A causal language model fine-tuned for [e.g., instruction following, dialogue, code generation].
  • License: [e.g., Apache 2.0, MIT]

Intended Use

This model is designed for [briefly describe the primary use case, e.g., 'serving as a conversational chatbot on technology-related topics']. It should not be used for high-stakes decision-making or for generating harmful content, and any important information it produces should be fact-checked.

Limitations

The model may produce factually incorrect or biased information. Its knowledge is limited to its training data, and it can be prone to hallucination.

How to Use

from transformers import AutoModelForCausalLM, AutoTokenizer

# Load model and tokenizer
model_name = "bgg1996/Melinoe-14B"  # or a local path to the downloaded weights
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# Generate text
prompt = "Your prompt here"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=100)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
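
For chat-style prompts, the tokenizer's chat template can format the conversation before generation. A minimal sketch, reusing the model and tokenizer loaded above and assuming the repository ships a chat template (as Qwen3-based models typically do):

# Format a single-turn conversation with the tokenizer's chat template
messages = [{"role": "user", "content": "Summarize what a causal language model is."}]
input_ids = tokenizer.apply_chat_template(
    messages,
    add_generation_prompt=True,  # append the assistant turn so the model starts replying
    return_tensors="pt",
)
outputs = model.generate(input_ids, max_new_tokens=256)
# Decode only the newly generated tokens, skipping the prompt
print(tokenizer.decode(outputs[0][input_ids.shape[-1]:], skip_special_tokens=True))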
Model Details

  • Format: Safetensors
  • Model size: 15B parameters
  • Tensor type: BF16
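
Since the weights are stored in BF16, loading them in that dtype avoids an upcast to FP32 and roughly halves memory use. A minimal sketch, assuming a GPU with bfloat16 support and the accelerate package installed for device_map:

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "bgg1996/Melinoe-14B"
tokenizer = AutoTokenizer.from_pretrained(model_name)
# Load the weights in their native BF16 precision and place them on the available device(s)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype=torch.bfloat16,
    device_map="auto",  # requires the accelerate package
)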

Model Tree

  • Finetuned from: Qwen/Qwen3-14B
  • Quantizations: 2 models