Ethics in Brief: Ethical Considerations When Using Artificial Intelligence to Perform Discovery
By Alara T. Chilton
This article discusses key ethical considerations for lawyers when using artificial intelligence (AI)[i], particularly generative AI[ii] tools, to respond to discovery. Consider the following conversation between two friends who are solo practitioners: Miguel, a consumer rights attorney incorporating AI[iii] into his practice, and Maria, a business litigation attorney who has been researching a lawyer’s ethical obligations under the Rules of Professional Conduct and the State Bar Act when using AI.
Miguel: For the last two weeks, I’ve been inundated with answering discovery requests. But I finally found a way to significantly reduce the number of hours I spend on this part of my practice.
Maria: That’s great. How did you do that? Did you hire an associate?
Miguel: No, even better—I decided to use an AI tool to prepare objections to Requests for Admission (RFAs) in a federal case I filed alleging violations of the Fair Debt Collection Practices Act (FDCPA) against a debt collector. The AI tool drafted all the relevant objections, saving me a great deal of time.
Maria: Hmmm . . . Can you explain the steps you took to use the AI tool to draft those objections?
Miguel: First, I asked the AI if it was familiar with responding to RFAs in FDCPA cases filed in the Southern District of California. After it confirmed it was, I provided prompts—instructions specifying exactly what I wanted the AI tool to do. For this case, I instructed the AI to act as the Plaintiff’s attorney and insert appropriate[iv] objections to the RFAs propounded by the debt collector. I then uploaded the RFAs to the AI tool, and in less than a minute, the tool produced a draft response with objections. Of course, I reviewed this draft, made necessary edits, and then finalized the responses with my client.
Maria: AI’s efficiency is impressive, but have you considered your ethical obligations when using it?
Miguel: I try to be mindful of my ethical obligations, but AI is such a fast-moving train, with many lawyers using it to gain a competitive advantage. It’s apparent that if I don’t adopt it, my practice will be at a disadvantage.
Maria: I see your point, but as fast as the AI train is moving, we can’t lose sight of our ethical obligations when using it. One very helpful resource is the State Bar of California’s Practical Guidance For the Use of Generative Artificial Intelligence in the Practice of Law (State Bar’s AI Guide).[v] Although issued in 2023, it should help lawyers ethically navigate the use of AI for years to come, as the State Bar’s website notes it will be periodically updated as the technology evolves.
Miguel: I will definitely read the State Bar’s AI Guide. Based on your understanding of this guide, however, should I have done anything differently when I used the AI tool?
Maria: Unfortunately, yes. Consistent with California Rule 1.6 regarding the duty of confidentiality, the State Bar’s AI Guide states “A lawyer must not input any confidential client information of the client into any generative AI solution that lacks adequate confidentiality and security protections.” [vi] This means before using an AI tool, you should review the AI’s terms of use, privacy policies and security protections. Additionally, “a lawyer must anonymize client information and avoid entering details that can be used to identify the client.” This is important because AI tools may retain and utilize confidential information to train their systems, creating a risk that such information may be subsequently disclosed to other parties who use the AI tool.
Miguel: I used the free version of Grok, which is similar to the free version of ChatGPT. I’m not aware of its confidentiality or security protections. But I was not concerned about violating the duty of confidentiality because the client information I uploaded was already included in the RFAs—this same information became public when I filed the lawsuit.
Maria: The duty of confidentiality also applies to matters of public record.[vii] In fact, the California State Bar Committee on Professional Responsibility and Conduct has stated in an ethics opinion that “client information does not lose its confidential nature merely because it is publicly available.” [viii]
Miguel: That’s a broad and vague standard for confidentiality of client information. At least I didn’t violate the duty of competence, since I reviewed and edited the AI-generated objections to ensure they were appropriate.
Maria: Not necessarily. California Rule of Professional Conduct 1.1, the duty of competence, requires lawyers to “keep abreast of the changes . . . [regarding] the benefits and risks associated with relevant technology.”[ix] Using AI tools likely falls within this rule. Since you’re unaware of Grok’s confidentiality or security protections, it could be argued you are not sufficiently informed of the risks of using it when preparing discovery responses.
Miguel: How can I be well informed about the risks when AI experts do not fully understand how AI works?
Maria: You don’t need to know exactly how AI tools work, just as you don’t need to know exactly how a car’s engine works to understand the risks of driving. The State Bar’s AI Guide acknowledges your point that even experts do not fully understand how AI works.[x] But just as we must follow the rules of the road, we must also follow the California Rules of Professional Conduct and the State Bar Act when using AI tools.
Miguel: Thanks for helping me better understand the risks of using AI. This conversation has made it clear that I—and all lawyers—need to pay close attention to the intersection of AI and our ethical duties, including confidentiality and competence. I recognize there are many more ethical obligations and I will need to research them to protect not just my client’s interests, but also my bar card.
[i] While there is no universal definition of AI, one definition is: “Any artificial system that performs tasks under varying and unpredictable circumstances without significant human oversight, or that can learn from experience and improve performance. . . .” John S. McCain National Defense Authorization Act for Fiscal Year 2019, Pub. L. No. 115-232, § 238(g), 132 Stat. 1636, 1699 (2018) (10 U.S.C. § 2358 note).
[ii] Generative AI has been described as “creat[ing] various types of new content, including text, images, audio, video and software code in response to a user’s prompts and questions. GAI tools that produce new text are prediction tools that generate a statistically probable output when prompted. To accomplish this, these tools analyze large amounts of digital text culled from the internet or proprietary data sources. Some GAI tools are described as ‘self-learning,’ meaning they will learn from themselves as they cull more data.” American Bar Association Standing Committee on Ethics and Professional Responsibility, Formal Opinion 512, “Generative Artificial Intelligence Tools” (2024) p. 1.
[iii] For simplicity, the author of this article will use the term “AI” to refer to generative AI.
[iv] This article does not address the propriety of boilerplate objections, whether or not they are generated by AI.
[v] California State Bar, Board of Trustees, Practical Guidance For the Use of Generative Artificial Intelligence in the Practice of Law (Jan. 2024).
[vi] Id. at p. 2.
[vii] In the Matter of Johnson (Rev. Dept. 2000) 4 Cal. State Bar Ct. Rptr. 179, 189 (lawyer disciplined for disclosing a client’s prior felony conviction, which was a matter of public record).
[viii] Cal. State Bar Formal Opn. No. 2016-195.
[ix] Comment 1 to California Rule 1.1 states “[t]he duties set forth in this rule include the duty to keep abreast of the changes in the law and its practice, including the benefits and risks associated with relevant technology.”
[x] See the Executive Summary of California State Bar, Board of Trustees, Practical Guidance For the Use of Generative Artificial Intelligence in the Practice of Law (Jan. 2024) [“[E]ven for those who create generative AI products, there is a lack of clarity as to how it works.”].