Supreme Court Observes Judges Exercise Caution in Using AI, Allows Suggestions on Administrative Side
Lexpedia · 4 December 2025, 12:00 am

The Supreme Court, while hearing a PIL seeking guidelines to check the misuse of Artificial Intelligence (AI) in courts, observed that judges exercise extreme caution in using AI. The bench comprising Chief Justice of India Surya Kant and Justice Joymalya Bagchi heard a petition seeking regulation of the “unregulated” use of generative AI in court proceedings.
CJI Rejects Claim of Unregulated AI Usage
The Chief Justice rejected the premise of unregulated use, stating: “There is no question of unregulated use by us. I, my brothers and sisters have spoken on this that we are using it in a very careful manner. We don’t want AI and machine learning to overpower the judicial decision-making process many times we have highlighted.”
Concerns Over AI-Generated Fake Precedents
Senior Advocate Anupam Lal Das, appearing for petitioner Karthikeya Rawal, highlighted instances where advocates cited AI-generated fake precedents. The CJI responded that AI tools had likely produced those fake precedents, which advocates then cited as genuine case law, and emphasized that lawyers must remain alert to such misuse, since reliance on fabricated material is contrary to professional responsibility.
Kerala High Court Policy and White Paper
The counsel referred to the Supreme Court white paper on AI and the Kerala High Court’s policy on responsible AI use. The CJI remarked: “You are speaking as if we do not know what is happening in Kerala High Court,” adding that such policies require thorough consultations with the judiciary.
Petition Withdrawn, Suggestions Allowed
The bench allowed the petitioner to submit suggestions on the administrative side. The order stated: “Sr counsel seeks permission to withdraw the matter, and is permitted to withdraw this petition; however, the petitioner is allowed to submit the suggestions to us on the administrative side.”
Case Title: Karthikeya Rawal v. Union of India, 2025
