Legal experts slam Bluebook’s new AI citation rule as confusing
The 22nd edition of The Bluebook, released in May, introduces Rule 18.3 for citing AI-generated content, but legal experts are calling the new citation standard fundamentally flawed and confusing. The Bluebook acts as a foundational guide for the legal profession, setting citation best practices. Critics argue the new rule treats AI as a citable authority rather than a research tool, creating more confusion than clarity for legal professionals.

What the rule requires: Authors must save screenshots of AI output as PDFs when citing generative AI content like ChatGPT conversations or Google search results.

  • The rule has three sections covering large language models, search results, and AI-generated content, each with slightly different citation requirements.
  • All AI citations must include specific formatting and preservation requirements.

Why experts are critical: Legal scholars argue the rule misunderstands how AI should be used in legal research and writing.

  • “In 99% of cases, we shouldn’t be citing AI at all. We should cite the verified sources AI helped us find,” wrote Susan Tanner, a University of Louisville law professor who called the rule “bonkers.”
  • Jessica R. Gunder of the University of Idaho College of Law noted that an AI citation should document only what the AI said, not vouch for the truth of what was said.

Technical problems identified: The rule contains internal inconsistencies and unclear distinctions between AI categories.

  • Cullen O’Keefe from the Institute for Law & AI pointed out that the rule differentiates between large language models and “AI-generated content,” even though large language models are a type of AI-generated content.
  • The rule shows inconsistencies about when to use company names with model names and when to require generation dates and prompts.

What appropriate AI citations look like: Experts suggest AI should only be cited in rare cases where the AI’s output itself is the subject of discussion.

  • Tanner’s example: “OpenAI, ChatGPT-4, ‘Explain the hearsay rule in Kentucky’ (Oct. 30, 2024) (conversational artifact on file with author) (not cited for accuracy of content).”
  • Gunder provided another scenario: citing AI output to highlight its unreliability, such as when an AI tool suggested adding glue to pizza recipes to prevent cheese from falling off.

The bottom line: While experts commend The Bluebook editors for addressing AI citations, they argue the rule “lacks the typical precision for which The Bluebook is (in)famous” and may create more confusion than guidance for legal professionals.

