If your legal team is exploring how to use generative AI in practice (and let's be honest, most are at least testing the waters), the conversation often jumps straight to tools and outputs. Which chatbot should we use? Can we trust it to draft something real? How do we control hallucinations?
But before you even get there, there's a more fundamental question that many legal departments overlook: What are we feeding the AI in the first place?
In a recent episode of "Notes to My (Legal) Self," Linsey Krolik, a law professor at Santa Clara University and longtime in-house counsel at companies like PayPal and Arm, made a compelling case for what she calls AI literacy. But the bigger insight was between the lines: you can't use AI effectively in a legal setting without understanding the inputs, and that means contracts.
Watch the episode here:
The AI Moment Isn't Coming. It's Already Here.
"We're using generative AI today, whether we want to admit it or not," Linsey said during the interview. "It's happening. So get on board and we can learn together."
That sense of collective learning, and the gap between curiosity and confidence, is something many in-house teams are experiencing firsthand. There's pressure to move quickly, reduce turnaround time, and do more with less. AI promises all of that. But as Linsey pointed out, we need to start with the basics.
She's training law students to build real-world legal documents like terms of service and privacy policies for early-stage startups. These students are already experimenting with AI. They're learning where it helps, where it fails, and how to critically assess its output. They're developing muscle memory not just in drafting, but in understanding why contracts are structured the way they are.
That foundational skill, contract literacy, is what too many practicing teams are missing.
AI Is A Mirror. If Your Contract Data Is A Mess, It Will Show.
When lawyers think about AI tools, it's easy to focus on the output. What can it draft? What questions can it answer?
But what matters just as much is the underlying data. If your team can't easily answer questions like "What are our standard payment terms across all NDAs?" or "Which vendor contracts auto-renew in the next 90 days?" then any AI solution you implement will be searching for patterns in chaos.
Linsey emphasized that in-house teams are increasingly being asked, "Did you use AI for this?" And when the answer is no, the follow-up is often, "Why not?" That pressure to explore and adopt is growing. But AI isn't magic. It won't clean up your contract portfolio for you. It will only surface what's already there, or worse, what's missing.
Contract Literacy Isn't Just Knowing Legal Terms. It's Knowing The Business.
One of the sharpest observations Linsey made during the conversation was about how contract education has evolved. She's moved beyond traditional legal writing assignments to include things like AI-assisted drafting and short business-style presentations.
Why? Because she understands that lawyers today don't just write contracts. They explain them. They negotiate them. They enforce them. And increasingly, they design workflows and data systems around them.
AI can support that work, but only when the lawyer understands what the business needs from the contract. If you can't articulate the difference between what a procurement manager wants to know and what your finance lead needs to see, no AI tool will bridge that gap for you.
The Real Risk Isn't AI. It's Staying Unprepared.
Linsey acknowledged the ethical concerns around AI, including confidentiality, accuracy, and unauthorized reliance, but she also made it clear that the bigger risk is paralysis.
"There's a lot of uncertainty right now," she said. "But I think we need to start being more curious and less scared."
She teaches her students to disclose when they use AI, to reflect on why they used it, and to evaluate the quality of the output. In doing so, they learn how to build trust in the tools and in their own judgment.
That same framework applies to in-house legal teams. Instead of asking whether AI is perfect, start asking whether your team is ready. Can you explain what your standard indemnity clause looks like? Can you audit vendor agreements for renewal triggers? Do you have a structured way to compare terms across contracts?
These are contract literacy questions. And until you can answer them confidently, AI will remain a shiny solution in search of a problem.
Want To Get AI-Ready? Start With Your Contracts.
Linsey Krolik is training the next generation of lawyers to think critically, use emerging tools responsibly, and work directly with the business. If today's law students are learning to draft, structure, and analyze contracts with AI as a partner, then the rest of the legal world needs to catch up fast.
AI readiness begins with knowing what you have, what it means, and how to use it. That starts not with software, but with skill. Not with automation, but with understanding.
Contract literacy isn't the end goal. It's the starting line.
Watch the full interview with Linsey here.
Olga V. Mack is the CEO of TermScout, an AI-powered contract certification platform that accelerates revenue and eliminates friction by certifying contracts as fair, balanced, and market-ready. A serial CEO and legal tech executive, she previously led a company through a successful acquisition by LexisNexis. Olga is also a Fellow at CodeX, The Stanford Center for Legal Informatics, and the Generative AI Editor at law.MIT. She is a visionary executive reshaping how we law: how legal systems are built, trained, and trusted. Olga teaches at Berkeley Law, lectures widely, and advises companies of all sizes, as well as boards and institutions. An award-winning general counsel turned builder, she also leads early-stage ventures including Virtual Gabby (Better Parenting Plan), Product Law Hub, ESI Flow, and Notes to My (Legal) Self, each rethinking the practice and business of law through technology, data, and human-centered design. She has authored The Rise of Product Lawyers, Legal Operations in the Age of AI and Data, Blockchain Value, and Get on Board, with Visual IQ for Lawyers (ABA) forthcoming. Olga is a 6x TEDx speaker and has been recognized as a Silicon Valley Woman of Influence and an ABA Woman in Legal Tech. Her work reimagines people's relationship with law, making it more accessible, inclusive, data-driven, and aligned with how the world actually works. She is also the host of the Notes to My (Legal) Self podcast (streaming on Spotify, Apple Podcasts, and YouTube), and her insights regularly appear in Forbes, Bloomberg Law, Newsweek, VentureBeat, ACC Docket, and Above the Law. She earned her B.A. and J.D. from UC Berkeley. Follow her on LinkedIn and X @olgavmack.