Breaching AI policy may see lawyers fired: Singapore law firms
A landmark High Court case recently saw a lawyer ordered to personally pay $800 in costs for citing a hallucinated case.
Law firms here are treating breaches of generative artificial intelligence (GenAI) policies as potentially sackable offences, following a landmark High Court case in which a lawyer was ordered to personally pay $800 in costs for citing a hallucinated case - that is, one generated by AI which does not actually exist.
Several firms approached by The Business Times said violations of their AI guidelines would be treated as serious disciplinary matters, with consequences from restricted tool access to dismissal.
Ms Stephanie Yuen Thio, joint managing partner at TSMP Law, said the key to the firm's AI policy is that lawyers must continue to meet their professional responsibilities.
"AI, like any other software we use, is only a tool; and it is our responsibility to ensure that the final work product is accurate and appropriate," she added.
Lawyers who violate this policy will be in breach of employment terms with "attendant consequences", she said.
At Withers KhattarWong, non-compliance with AI policy may result in disciplinary action, from retraining and restricted access to tools to formal sanctions under its human resource and compliance procedures.
"The principle is clear," said Mr Chenthil Kumarasingam, the firm's regional division leader for dispute resolution in Asia. "Lawyers remain personally accountable for their work, whether AI is used or not."
Drew & Napier will treat AI violations "as any other transgression and (deal with them) according to firm policies, which can result in disciplinary action", said chief technology officer Rakesh Kirpalani.
Fictitious citation
The firms' responses come after a Sept 29 High Court judgment where Assistant Registrar Tan Yu Qing found that counsel for the claimants acted "improperly, unreasonably and negligently" by citing a fictitious authority generated by a GenAI tool in written submissions.
The claimants' counsel Lalwani Anil Mangan from DL Law Corporation initially characterised the error as "clerical" or "typographical" in correspondence with the court. He admitted the case "did not exist" only after questioning during a hearing on July 22.
This is likely to be among the first cases in Singapore where an AI hallucination was cited as a legal precedent.
Mr Lalwani later revealed that a junior lawyer had used an AI app to generate the citation, which he failed to verify before filing the submissions on behalf of his clients.
Assistant Registrar Tan ordered Mr Lalwani to personally pay the defendant $800 in costs - separate from the usual costs of the application - to compensate for the unnecessary time and expense incurred due to his improper conduct.
On the issue of supervisory responsibility highlighted in the case, the law firms' view is that the lead lawyer bears full responsibility for all work output, even if AI-assisted work was first prepared by a junior.
Withers KhattarWong adopts a two-tier oversight system. Junior lawyers may use GenAI tools for efficiency but senior lawyers are responsible for validating outputs and ensuring compliance before anything is filed or shared externally.
The firm is running training modules on prompt engineering, hallucinations and the risks of fabricated citations, emphasising that lawyers must never assume outputs are correct without independent validation, said Mr Kumarasingam.
TSMP holds weekly team meetings and monthly firm-wide meetings to share experiences using AI and problems that have surfaced.
"Legal AI is still in the early stages of development and adoption, so I think responsible law firm leadership requires a hands-on approach to implementation," said Ms Yuen Thio.
At Drew & Napier, lawyers have been expressly instructed to verify all GenAI output and explain the positions they take in their work, said Mr Kirpalani. They are also expected to disclose AI use when asked.
The requirement to file bundles of authorities - cases that lawyers intend to rely on in their submissions - provides an important opportunity to verify all citations, he added.
Safeguards and checks
Allen & Gledhill has custom-built its own AI tool, A&GEL, with safeguards to mitigate the risks of hallucinations, said Mr Stanley Lai, head of the intellectual property practice and co-head of the cyber security and data protection practice.
During development, the firm identified more than 100 potential AI use cases and selected those with the greatest impact and highest likelihood of success, while optimising outputs to be easily verified by lawyers.
"Knowing when and how to utilise AI is a skill in itself," said Mr Lai. "We expect our lawyers to understand that while AI can augment their existing workflows, it cannot replace the nuanced judgment, ethical reasoning and interpersonal skills that define effective legal practice."
Said Ms Yuen Thio: "The bottom line is this: Legal AI tools should be used in the same way we would use the work product of a good intern - we need to check the work and ensure that our professional obligations to our clients are fulfilled."
"Relying on an AI work product without checking is asking for trouble," she added.