IndicBARTSS is a multilingual sequence-to-sequence pre-trained model designed for natural language generation tasks in 11 Indic languages and English.
IndicBARTSS is a multilingual sequence-to-sequence pre-trained model based on the mBART architecture, designed for Indic language generation tasks such as machine translation, summarization, and question generation. Supporting 11 Indic languages and English, it is smaller and more efficient than mBART50 and mT5. It was pre-trained on a corpus of 452 million sentences and operates on each language's native script, so no script mapping is required before fine-tuning, which makes it well suited to low-resource Indic language processing.
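The sketch below shows one way the model might be loaded for generation with the Hugging Face transformers library. The checkpoint name ai4bharat/IndicBARTSS, the <2xx> language tags, and the "sentence </s> <2en>" input format follow the IndicBART convention and are assumptions here, not details taken from this listing.

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

# Assumed Hugging Face checkpoint name for IndicBARTSS (native-script variant).
MODEL_ID = "ai4bharat/IndicBARTSS"

# The tokenizer is SentencePiece-based; use_fast=False and keep_accents=True
# preserve Indic diacritics exactly as seen during pre-training.
tokenizer = AutoTokenizer.from_pretrained(
    MODEL_ID, do_lower_case=False, use_fast=False, keep_accents=True
)
model = AutoModelForSeq2SeqLM.from_pretrained(MODEL_ID)

# IndicBART-style input: source sentence, end-of-sequence tag, then the
# source-language tag (e.g. <2en> for English, <2hi> for Hindi).
inputs = tokenizer(
    "I am a boy </s> <2en>", add_special_tokens=False, return_tensors="pt"
)

# Decoding starts from the target-language tag, here Hindi (<2hi>).
outputs = model.generate(
    inputs.input_ids,
    num_beams=4,
    max_length=32,
    early_stopping=True,
    decoder_start_token_id=tokenizer.convert_tokens_to_ids("<2hi>"),
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Note that the pre-trained checkpoint is intended as a starting point for fine-tuning on downstream tasks; generations from the raw model reflect its denoising pre-training objective rather than task-ready output.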
MIT
Raj Dabre, Himani Shrotriya, Anoop Kunchukuttan, Ratish Puduppully, Mitesh M. Khapra, and Pratyush Kumar
Summarization Model
open
AI4Bharat
Sector Agnostic
21/02/25 13:21:57
Nikhil Narasimhan
0