Model Gallery

1 model from 1 repository

salamandra-7b-instruct
A transformer-based, decoder-only language model pre-trained on 7.8 trillion tokens of highly curated data. The pre-training corpus contains text in 35 European languages as well as code. Salamandra comes in three sizes (2B, 7B, and 40B parameters), each with base and instruction-tuned variants. This model card corresponds to the 7B instruction-tuned version.

Repository: localai
License: apache-2.0
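
Once the model is installed from the gallery, it can be queried through LocalAI's OpenAI-compatible chat completions API. The snippet below is a minimal sketch, assuming a LocalAI instance running locally on the default port 8080 and the model available under the name `salamandra-7b-instruct`; adjust the host, port, and model name to match your setup.

```python
# Minimal sketch: querying salamandra-7b-instruct via LocalAI's
# OpenAI-compatible /v1/chat/completions endpoint.
# Assumes LocalAI is running on localhost:8080 with the model installed.
import requests

response = requests.post(
    "http://localhost:8080/v1/chat/completions",
    json={
        "model": "salamandra-7b-instruct",  # gallery model name (assumed)
        "messages": [
            {"role": "user", "content": "Summarise this model in one sentence."}
        ],
        "temperature": 0.7,
    },
    timeout=120,
)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```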