Article Details

Title
Article: Paras Chopra’s Lossfunk gets AI models to speak Tulu through prompts, not training
Impact Score
5 / 10
AI Summary (Processed Content)

AI research lab Lossfunk has developed a prompting method that enables large language models to generate text in Tulu, a low-resource Indian language, without prior training on it. The technique uses a detailed, multi-layered prompt incorporating grammar rules and negative constraints to avoid words from dominant languages like Kannada.

This approach achieved up to 85% grammatical accuracy across several major AI models, up sharply from an initial 18% accuracy, where output was heavily contaminated by words from dominant languages. The improvement suggests the models are applying the linguistic structure supplied in the prompt rather than relying on memorized training data.

The development could serve as a template for incorporating other underrepresented languages into AI systems, potentially bypassing the need for expensive data collection and specialized model training.
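The article does not publish Lossfunk's actual prompt, but the layered structure it describes (a task layer, a grammar-rule layer, and a negative-constraint layer banning words from dominant languages such as Kannada) can be sketched roughly as follows. All rule text and banned-word entries below are illustrative placeholders, not material from the research:

```python
# Hypothetical sketch of a multi-layered prompt of the kind the article
# describes: grammar rules plus negative constraints barring vocabulary
# from a dominant neighbouring language. Content is illustrative only.

GRAMMAR_RULES = [
    "Tulu follows subject-object-verb word order.",
    "Verbs inflect for tense, person, and number.",
]

# Negative constraints: placeholder stand-ins for dominant-language
# (e.g. Kannada) words the model is told never to emit.
BANNED_WORDS = ["<kannada-word-1>", "<kannada-word-2>"]

def build_prompt(task: str, grammar: list[str], banned: list[str]) -> str:
    """Assemble the prompt from three layers: task, grammar, constraints."""
    layers = [
        f"Task: {task}",
        "Grammar rules to apply:",
        *[f"- {rule}" for rule in grammar],
        "Strict constraints:",
        *[f"- Never use the word '{word}'." for word in banned],
        "Respond only in Tulu.",
    ]
    return "\n".join(layers)

prompt = build_prompt(
    "Translate 'Good morning' into Tulu.", GRAMMAR_RULES, BANNED_WORDS
)
print(prompt)
```

The resulting string would be sent as the system or user message to any of the major models the article mentions; the layering simply makes each constraint explicit and separable for testing.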

Main Topics: AI research, low-resource languages, prompt engineering, Tulu language, grammatical accuracy, AI adoption in India.

Original URL
https://economictimes.indiatimes.com/tech/artificial-intelligence/paras-chopras-lossfunk-gets-ai-models-to-speak-tulu-through-prompts-not-training/articleshow/129432315.cms
Source Feed
Tech-Economic Times
Published Date
2026-03-11 02:37
Fetched Date
2026-03-11 00:30
Processed Date
2026-03-11 00:31
Embedding Status
Present
Cluster ID
Not Clustered
Raw Extracted Content