Sarvam-1 is a 2-billion-parameter language model built for Indian languages, with an emphasis on token efficiency and high-quality training data. Trained on 4 trillion tokens from diverse sources with a custom tokenizer, it outperforms larger models on several Indic-language benchmarks while remaining computationally efficient, making it well suited to applications such as translation and deployment on edge devices.
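Since the model and its custom tokenizer are released openly, it can be loaded with the standard Hugging Face transformers workflow. The sketch below is illustrative, not official usage: the repository id "sarvamai/sarvam-1" is an assumption inferred from the sarvamai publisher handle listed on this page, and only standard transformers/torch APIs are used.

```python
# Minimal sketch of loading and prompting Sarvam-1 with Hugging Face transformers.
# The repository id "sarvamai/sarvam-1" is assumed from the publisher handle on this
# page; verify the exact id before use.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "sarvamai/sarvam-1"  # assumed Hugging Face repository id

# The custom Indic tokenizer ships with the model repository, so AutoTokenizer
# picks it up automatically.
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # 2B parameters fit comfortably on a single GPU
    device_map="auto",
)

# Treated here as a plain text-completion model; adjust the prompting style if an
# instruction-tuned variant is used instead.
prompt = "भारत की राजधानी"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=50, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```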
License: CC BY-SA 4.0
Publisher handle: sarvamai
Type: Multilingual Language Model
Access: Open
Organization: Sarvam AI
Sector: Science, Technology and Research
Date: 24/02/25 07:45:49
Contributor: Aashay Sachdeva