A Karnataka-based software firm has moved court against US artificial intelligence company Anthropic, alleging that the US company's entry into India has caused customer confusion because the two firms share a name.
Anthropic Software Private Ltd, founded in 2017, has filed a complaint in the Commercial Division of the District Court at Belagavi, accusing the San Francisco-based AI company of causing confusion, misrepresenting itself and weakening the Indian firm's brand in India.
"I am exercising my legal right as it's causing huge confusion to my customers," Mohammad Ayyaz Mulla, founder and director of Anthropic Software, told TechCrunch. He said the company is seeking clarity and recognition rather than a confrontation.
The complaint, filed in January, says that the Indian company has operated under the "Anthropic" name since its inception, while the US AI firm was established in 2021. It seeks recognition of prior use, measures to prevent further confusion, and Rs 90 lakh (roughly $110,000) in damages.
Although the court has issued summons to the US-based firm, it declined to grant an interim injunction in an order dated January 20. The case will be heard next on February 16.
Anthropic Software develops digital platforms across education, connectivity, and safety. Its products include an AI-driven education ERP, a Wi-Fi monetisation platform, and a patented driving safety solution. The company works with government bodies, educational institutions and student communities, particularly in rural and underserved areas.
The dispute coincides with the US company's aggressive expansion in India and South Asia. In October 2024, it announced the opening of its India office and appointed former Microsoft India Managing Director Irina Ghose to lead its local operations.
Anthropic has also disrupted the global IT services sector through rapid advances in its Claude AI platform, putting traditional IT firms, including Indian majors such as Infosys, Wipro and TCS, under pressure from increased automation.