AI Governance Staff Training


Estimated time: 45–60 minutes including knowledge check
1. What AI Does in Your Practice

Your practice uses AI tools that listen to patient visits and generate clinical notes. The AI creates a DRAFT — not a finished record.

Your provider reviews it, corrects errors, and signs it. Only then does it become a legal medical record.

Your role depends on your position, but everyone has a part in making sure AI is used safely and correctly.

Key point: AI generates a draft. Your provider's review and signature are what make it a medical record.
2. What Can Go Wrong

AI can fabricate clinical findings that never happened. Published studies show error rates above 25% across major platforms. Every error in a signed note is a potential liability.

  • Fabricated findings: Exam details you never performed, documented as if they happened.
  • Invented medications: Drugs never prescribed, listed in the patient's record.
  • False history: Past medical history or allergies that don't exist.
  • Wrong codes: Billing codes inflated beyond what the visit supports.

Every hallucination you miss = potential malpractice liability and patient harm.
3. Your Signature = Your Liability

When a provider signs an AI-generated note, it carries the same legal weight as a handwritten note.

The AI vendor is not liable — their contracts explicitly disclaim accuracy. The provider who signed it is liable.

Insurance carriers are now adding AI exclusion endorsements. If your policy has one and an AI-related claim is filed, the claim gets denied. You bear the full cost.

Governance documentation is what carriers, attorneys, and regulators evaluate. This training is part of that documentation.
4. Provider Responsibilities
THE DEPOSITION TEST: If this note was presented in court as evidence against you, could you defend every sentence?
  • Review every note word-for-word — not a skim
  • Verify all exam findings — did you actually perform them?
  • Verify all medications — did you actually prescribe them?
  • Document your verification (amendment, addendum, or signature note)
  • Never sign a note you can't fully defend
5. Staff Responsibilities

Medical Assistant

AI notes are drafts with errors. Flag obvious problems before the provider signs. Make sure patient disclosure is documented. Know how to disable the AI system in an emergency.

Front Desk

Give every patient the AI disclosure notice. Document their consent or opt-out. If they opt out, flag the record immediately.

Billing

Never submit AI-suggested codes without provider verification. AI frequently recommends inflated codes. Penalties can reach $28,619 per false claim under the False Claims Act.

6. The Verification Standard
NOT ENOUGH ✗
  • Skim the note and sign
  • Trust AI to be accurate
  • Assume the EMR captured everything
  • Sign without reading
  • Rely on vendor assurances

DEFENSIBLE ✓
  • Read every sentence carefully
  • Verify findings you performed
  • Check completeness against your exam
  • Amend if you find errors
  • Document your verification process

Can you defend every word in court?
7. Patient Disclosure and Opt-Out

Every patient must be informed AI is used in their care.

1. Notice: Provide the AI disclosure.
2. Consent: Patient decides.
3. Document: Record the decision in the EHR.
4. Review: Provider checks.
5. Proceed: AI or manual, per the patient's choice.
If a patient opts out: Document immediately. Flag the record. Switch to manual documentation. Notify all staff. Patient choice is legally binding.
8. When Something Goes Wrong

Error Correction Protocol:

1. STOP: Don't sign.
2. IDENTIFY: Note the error.
3. CORRECT: Fix it.
4. DOCUMENT: Record the fix.
5. SIGN: Sign the corrected version.
6. REPORT: Report if a pattern emerges.

Kill Switch — disable AI immediately when:

  • A patient harm event occurs
  • Systematic errors are detected
  • A data breach is suspected
  • A regulatory inquiry is received

9. Incident Reporting
REPORT IMMEDIATELY
  • Patient harm or adverse outcome
  • Critical AI system failure
  • Suspected data breach

WITHIN 24 HOURS
  • Caught and corrected error
  • Note hallucination or fabrication
  • Systematic AI inaccuracy
Include in every report: date, time, which AI system, what the error was, what impact it had, what you did about it.
10. Why This Protects You

Following these protocols isn't just about the practice — it protects you personally.

  • Malpractice claims: Documented governance shows you followed proper procedures
  • Insurance reviews: Governance documentation demonstrates responsible AI use
  • Regulatory audits: Your compliance record is already built
  • Personal liability: Following protocols gives you individual evidence that you acted responsibly
Certification under AI Governance Shield™ means your practice's governance has been independently validated. This training is part of that certification. Annual renewal keeps it current as laws and AI tools evolve.

Questions about your practice's certification? Contact us at sentinelriskgrp.com

11. Behavioral Health & Substance Use — Extra Protections

If your practice provides any behavioral health, psychiatric, or substance use disorder (SUD) treatment — or if you see patients via telehealth who receive those services — two additional layers of regulation apply.

42 CFR Part 2 — Federal Confidentiality of SUD Records

Substance use disorder treatment records have stricter federal protections than standard HIPAA. Key points every staff member must know:

  • Separate consent required. A general HIPAA authorization does NOT cover SUD records. Patients must give specific, written consent before SUD information can be disclosed — even to other providers.
  • AI systems don't know the difference. An AI scribe will document substance use history, treatment details, and medication-assisted therapy the same way it documents anything else. If that note is shared without proper Part 2 consent, the practice is in violation.
  • Re-disclosure prohibition. Anyone who receives Part 2 information cannot re-disclose it. If an AI-generated note containing SUD information is sent to a referring provider or insurer without consent, every downstream disclosure is also a violation.
  • Practical rule: Before signing any note involving SUD treatment, verify the patient's Part 2 consent is on file and current. Flag SUD records in your EHR so they are not automatically shared through health information exchanges.
Tennessee SB 1580 — First Private Right of Action for Mental Health AI

Effective July 1, 2026, Tennessee law creates the first statute allowing patients to sue directly over AI misuse in behavioral health:

  • $5,000 per violation: Each improper use of AI in behavioral health treatment is a separate violation.
  • Treble damages: Willful violations mean triple damages, with no annual cap.
  • No independent decisions: AI cannot make therapeutic decisions without direct provider oversight and approval.
  • Telehealth reach: Applies if the patient is in Tennessee, regardless of where the provider is located.
What this means for you: If you treat any patient in Tennessee via telehealth, or if your practice provides behavioral health services anywhere, AI-generated therapy notes, treatment plans, and psychiatric evaluations require the same word-for-word verification as medical notes. You must also confirm the AI did not generate independent clinical recommendations. The provider makes every therapeutic decision; the AI only documents what the provider decided.
Knowledge Check — 12 Questions

You need 10 out of 12 correct (80%) to pass.

Question 1
Before an AI-generated note becomes a legal medical record, what must happen?
Question 2
You notice the AI note includes an exam finding you never performed. What should you do?
Question 3
A patient says they don't want AI used during their visit. What do you do?
Question 4
Who is legally liable for errors in a signed AI-generated note?
Question 5
AI suggests a Level 5 billing code for a routine follow-up. What should happen?
Question 6
What is the "Deposition Test"?
Question 7
An AI note lists a medication you never prescribed. What is your FIRST action?
Question 8
Your practice does telehealth with patients in other states. Which state's AI laws apply?
Question 9
A new patient checks in. What should front desk do regarding AI?
Question 10
When should you activate the kill switch and disable AI?
Question 11
A patient is receiving substance use disorder (SUD) treatment. What must be in place before their AI-generated treatment notes can be shared with another provider?
Question 12
Under Tennessee SB 1580, what is AI prohibited from doing in behavioral health treatment?