
Managing Litigation and Cybersecurity Risks of AI Notetakers
I am a huge fan of AI-powered notetaking and transcription apps. These tools have streamlined the way organizations capture and share meeting content. I was recently on an introductory call where four distinct AI notetaking apps were present (Fireflies, Fathom, Read, and Gemini). However, these conveniences come with significant litigation discovery and cybersecurity risks that every company must address, especially in the context of board meetings, and that consultants should consider in the context of strategy sessions with their clients.
Litigation Discovery Risks
AI-generated meeting transcripts and recordings are legally recognized records. In litigation, barring an exception (e.g., attorney-client privilege), these materials can be subpoenaed and must be produced during discovery, even if the organization is not a party to the lawsuit. The creation of such records—where previously only handwritten notes or formal minutes existed—means more information may be subject to legal scrutiny. Failure to preserve or produce these records when required can result in legal penalties.
Consider that, even without subpoenas, maintaining recordings or transcripts of meetings can expose participants to risk. Cadet Legal has seen recent lawsuits in which parties factored the existence of favorable recordings of board meetings into their litigation strategy. For our consultant clients, especially those who are not attorneys, consider whether you want full recordings or transcripts of your strategy sessions available in perpetuity.
Cybersecurity and Privacy Risk
AI notetaking apps process sensitive business conversations and personal information, often storing data in cloud platforms with data handling practices that vary by app. This creates risks of unauthorized access, data breaches, and exposure of confidential information, whether by employees of the AI notetaking vendor or by third-party hackers. There is also the potential for accidental sharing of sensitive transcripts; we have seen clients and colleagues inadvertently send private discussions to unintended recipients.
Recommendations
We love AI notetakers and see them as an obvious and practical use case for leveraging AI to make our and our clients’ businesses more efficient. Given the risks, we offer the following common-sense advice:
Stop and Think: Consider whether you want or need your AI notetaking app for a given call. For board meetings for which Cadet Legal serves as corporate secretary, we include our own notetaking app, but ask all other participants (our client and other meeting attendees) to stick to the old-school pen-and-paper approach. When you join a call, take note of whether others have included their own AI notetaking apps and whether you have reason to ask for their removal for that particular session. Keep in mind that some videoconferencing platforms (e.g., Zoom) have their own internal recording and transcription capabilities.
Check Transcripts and Summaries for Accuracy: While AI notetaking apps do not create the same risks as AI used to generate case citations, AI's tendency to hallucinate and get things wrong remains. To the extent that you do use AI summaries or transcripts, review them for accuracy. We have seen AI notetakers butcher clients’ names in egregious and embarrassing ways that a simple proofread would have caught.
Review and Update Records Retention Policies: Ensure that company policies address AI-generated notes, transcripts, and recordings. Specify how long these records should be retained and when they should be deleted. Make sure your AI app settings are consistent with your company policies.
Delete Recordings and Transcripts After Minutes/Summaries Are Drafted: Once official board meeting minutes are approved or, alternatively, you have filed away a summary of your consulting call for your records, consider deleting the raw recordings and AI-generated transcripts to limit litigation exposure and reduce the risk of unauthorized access. This is Cadet Legal’s firm policy, especially for board meetings.
Implement Secure Access Controls: Restrict access to meeting recordings and transcripts to authorized personnel only. Again, check the settings of your app. Many apps default to providing a copy of everything (recordings, transcripts, and summaries) to all participants. Use robust authentication and encryption to protect sensitive data.
Choose AI Tools with Strong Security Features: Opt for solutions that offer end-to-end encryption, on-premises processing, and strict compliance with data protection regulations. If you need assistance reviewing an AI tool’s security, we are happy to help.
Obtain Consent for Recording: Best practices (and the laws of some states) dictate that you inform meeting participants when meetings will be recorded and transcribed, and obtain their consent where required. Some apps, Fathom for example, allow you to send an automatic consent notice with your meeting invites in advance of the meeting.
Educate Staff on Risks and Best Practices: Train employees on the importance of data security, proper handling of AI-generated records, and the risks of accidental sharing.
Through thoughtful implementation, including managing app settings and company records policies, organizations and individual consultants can harness the productivity benefits of AI notetaking while mitigating legal and cybersecurity risks. At Cadet Legal, we are actively using, and advising clients on, AI in various contexts. Please reach out with any questions regarding your or a counterparty’s use of AI.
Disclaimer. The contents of this article should not be construed as legal advice or a legal opinion on any specific facts or circumstances. Your viewing and/or use of the contents of this article do not create an attorney-client relationship with Cadet Legal. The contents are intended for general informational purposes only, and you are urged to consult with counsel concerning your situation and specific legal questions you may have.