Grok Row: X may be held responsible for the controversial content of the AI tool 'Grok'; government sources make a big claim amid the controversy

Posted on 21st Mar 2025 by Rohit Kumar

Social media platform X (formerly Twitter) may be held responsible for all content created by its artificial intelligence (AI) tool Grok. According to a government source, a legal opinion on the matter will be finalised soon. Recently, some users on X asked Grok questions about Indian leaders, and some of its answers were found to be controversial. The source said the Ministry of Electronics and IT is in talks with X to understand and assess how the tool works.

IT Ministry has not sent any notice to Grok or X

Meanwhile, government sources said the Ministry of Electronics and Information Technology has shared an update on X and Grok: the ministry has not sent any notice to either Grok or X, but it is in talks with them. Ministry officials are speaking with X's officials and examining at what level Indian law may have been violated.

Government's firm stance and earlier cases

Last year, Google's AI tool Gemini made objectionable comments about Prime Minister Narendra Modi, after which the government acted immediately and issued new guidelines on AI content. A government source said guidelines on social media content are already in place and companies must follow them.

X vs Government of India - case in court

Earlier today, X, the platform that hosts the Grok AI tool, filed a case against the Government of India in the Karnataka High Court over Section 79(3) of the IT Act. The company says the government is using this section to block content arbitrarily, which goes against a Supreme Court ruling. X has argued that content should be blocked only under Section 69A, and that the government has instead created a parallel mechanism, which is against the law.

Section 79: Responsibility of social media platforms

Section 79(1) of the IT Act gives social media platforms protection from liability for content posted by their users. Under Section 79(3), however, a platform that does not remove objectionable content despite government orders can face legal action. If a platform does not remove the content within 36 hours, it risks losing this 'safe harbor' protection and can be prosecuted under other laws, including the IPC.
