
What "HIPAA Compliant" Actually Means When an AI Vendor Says It

10 min read

Every AI documentation tool claims to be "HIPAA compliant." Most therapists assume that means their data is protected. Here is what HIPAA actually requires, what that phrase really means, and what you should look for before trusting any AI tool with clinical content.


If you have spent any time looking at AI tools for clinical documentation, you have seen the phrase "HIPAA compliant" on almost every product page. It is the magic phrase, the two-word shorthand that is supposed to tell you: your data is safe, you are covered, do not worry about it.

The problem is that "HIPAA compliant" is not a certification. There is no HIPAA seal of approval. No government agency certifies products as HIPAA compliant. There is no test you pass, no badge you earn, no registry you appear on. When a vendor says their product is "HIPAA compliant," they are making a self-declaration — and unless you understand what HIPAA actually requires, you have no way to evaluate whether that claim means anything.

This guide is not legal advice. It is a practical breakdown of what HIPAA requires when you use AI tools for clinical work, written for therapists who want to understand the rules well enough to make informed decisions.


What HIPAA Actually Is (And What It Is Not)

HIPAA — the Health Insurance Portability and Accountability Act of 1996 — is a US federal law that sets standards for protecting sensitive patient health information. The parts that matter for therapists using AI tools are primarily the Privacy Rule and the Security Rule.

The Privacy Rule establishes who can access protected health information (PHI), under what circumstances, and what rights patients have over their own data. It covers any individually identifiable health information — which includes virtually everything in a clinical note: names, dates, diagnoses, treatment content, session summaries, and anything that could identify a specific person.

The Security Rule sets technical and administrative requirements for protecting electronic PHI (ePHI). This is where the specifics get relevant for AI tools: encryption, access controls, audit logs, and breach notification procedures.

What HIPAA is not is a product specification. HIPAA does not say "use this encryption algorithm" or "store data on this type of server." It sets standards and requires covered entities — which includes you, as a healthcare provider — to implement "reasonable and appropriate" safeguards. What counts as reasonable depends on the context, the sensitivity of the data, and the state of available technology.


The BAA: Why It Matters and What It Actually Does

If there is one concrete HIPAA requirement that therapists should understand when evaluating AI tools, it is the Business Associate Agreement (BAA).

Under HIPAA, any third party that handles PHI on behalf of a covered entity is a business associate. If you use an AI tool to process clinical notes, and that tool receives, stores, or transmits PHI, the company behind that tool is your business associate. HIPAA requires a signed BAA before you share any PHI with them.

A BAA is a legal contract that obligates the business associate to:

  • Use and disclose PHI only as permitted by the agreement
  • Implement appropriate safeguards to prevent unauthorised use or disclosure
  • Report security incidents and breaches to the covered entity
  • Make PHI available for patient access requests
  • Return or destroy PHI when the agreement ends

Without a BAA, sharing PHI with an AI vendor is a HIPAA violation. Full stop. It does not matter how good their encryption is, how reassuring their marketing copy sounds, or how many times the word "secure" appears on their website. If there is no BAA, you are not covered.

This is worth emphasising because most consumer AI tools — including ChatGPT's free tier, Claude's free tier, and Google's Gemini — do not offer BAAs to individual users. Some offer them through enterprise plans, but those typically require organisational contracts, not individual practitioner sign-ups.


What "HIPAA Compliant" Actually Requires

When you see an AI vendor claim HIPAA compliance, here is what that claim should mean in practice. Use this as a checklist — not to take the vendor's word for it, but to ask the right questions.

1. A signed BAA is available

This is non-negotiable. If the vendor does not offer a BAA, the conversation ends. Ask for it specifically. Read it. If they say "our terms of service cover that," they probably do not have a real BAA.

2. Encryption in transit and at rest

The HIPAA Security Rule treats encryption of ePHI as an "addressable" specification: in practice, you must encrypt data both when it is being transmitted (in transit) and when it is stored (at rest), or document why an equivalent safeguard is reasonable. For AI tools, this means:

  • All data sent between your browser and the server should use TLS/HTTPS encryption
  • Any stored data — including conversation logs, generated documents, and uploaded files — should be encrypted on the server

This is a minimum standard, and most reputable cloud services meet it. But encryption in transit and at rest has an important gap: it does not protect data during processing. When an AI model processes your clinical notes, the data is decrypted in server memory to perform the computation. During that window, anyone with access to the server — the cloud provider, the AI company's engineers, a compromised insider — can theoretically access the unencrypted data.
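The processing gap is easiest to see in code. The sketch below uses a toy XOR cipher purely for illustration (real systems use AES or similar, never this): the note is unreadable while stored, but the moment the server does anything useful with it, the plaintext must exist in memory.

```python
import os

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # Toy XOR "encryption" for illustration only -- NOT real cryptography.
    # XORing twice with the same key returns the original bytes.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

key = os.urandom(16)
note = b"Client reports improved sleep; continue CBT plan."

stored = xor_cipher(note, key)   # "at rest": ciphertext on disk
assert stored != note            # unreadable while stored

# To do anything useful with the note -- summarise it, generate a
# progress note -- the server must decrypt it back into memory.
# During this window, anyone with server access can read it.
in_memory = xor_cipher(stored, key)
assert in_memory == note         # plaintext exists during processing
```

Confidential computing, discussed later, narrows exactly this window by keeping the decrypted data inside a hardware-secured enclave.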

3. Access controls

The vendor should implement role-based access controls that limit who within their organisation can access your data. Ask: who at the company can see my clinical content? Under what circumstances? Is access logged?

4. Audit logging

HIPAA requires that access to ePHI be logged. The vendor should maintain audit trails showing who accessed what data, when, and why. This is important for breach investigations and for your own records if you are ever audited.

5. Breach notification

If the vendor experiences a data breach involving your clients' PHI, HIPAA requires them to notify you — and you are then required to notify affected clients. The BAA should specify the notification timeline (HIPAA requires notification within 60 days of discovery, though many BAAs specify shorter windows).

6. Data retention and disposal

What happens to your data when you stop using the service? HIPAA requires that PHI be returned or securely destroyed when the business relationship ends. Many AI tools retain conversation data indefinitely for model training or product improvement. This is a direct conflict with HIPAA requirements unless the BAA specifically addresses it.


The Three Biggest Misconceptions

1. "HIPAA compliant" is a certification you can verify

It is not. There is no HIPAA certification body, no official audit program, and no government registry of compliant products. Some vendors undergo third-party audits (SOC 2, HITRUST) that assess security controls relevant to HIPAA, and those are meaningful — but they are not HIPAA certification. They are independent assessments that evaluate whether a company's practices align with security standards.

When a vendor says "HIPAA compliant," they are saying "we believe our practices meet HIPAA requirements." That may be true. It may also be marketing. The only way to evaluate the claim is to understand the requirements yourself and ask specific questions.

2. If the vendor is compliant, you are covered

HIPAA compliance is not something you can outsource entirely. You are the covered entity. You are responsible for ensuring that your use of any tool meets HIPAA requirements. That means:

  • You need a signed BAA with every vendor that handles PHI
  • You need to conduct a risk assessment of the tools you use
  • You need to verify that the vendor's practices actually meet the standards they claim
  • You need to train your staff (even if that staff is just you) on proper use of the tools

A vendor's compliance does not automatically make your use of their product compliant. If you paste an entire session transcript into a tool that has a BAA but also trains on user data, you may still have a problem.

3. De-identifying data makes HIPAA irrelevant

Some therapists believe they can avoid HIPAA concerns by removing client names before pasting notes into AI tools. The HIPAA Safe Harbor method for de-identification requires removing 18 specific identifiers — not just names, but dates, ages, locations, and any other information that could be combined to identify a person.

In clinical documentation, true de-identification is nearly impossible without stripping the content of its clinical value. "A 45-year-old woman in Denver going through a custody dispute involving allegations of substance abuse" is identifying even without a name. The combination of age, location, legal situation, and clinical details can narrow the pool of possible individuals to one.

If you are relying on de-identification as your HIPAA strategy for AI tools, you are almost certainly not de-identifying thoroughly enough.
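A quick illustration of why name-stripping falls short. The snippet below (hypothetical note text, naive regex redaction) removes the name but leaves the age, city, and legal situation intact, which is exactly the combination that can re-identify someone:

```python
import re

note = ("Jane Doe, a 45-year-old woman in Denver, is going through a "
        "custody dispute involving allegations of substance abuse.")

# Naive "de-identification": strip the name only.
redacted = re.sub(r"Jane Doe", "[NAME]", note)

# The age, city, and legal situation all survive. Safe Harbor requires
# removing all 18 identifier categories, not just names.
print(redacted)
```

Running this leaves "[NAME], a 45-year-old woman in Denver..." in the output: still identifying, just less obviously so.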


The AI-Specific Gaps in HIPAA

HIPAA was written in 1996 and updated in 2013. It was not designed with AI in mind, and there are several areas where its requirements do not cleanly map onto how AI tools actually work.

Data used for model training

Many AI companies use customer data to train and improve their models. Under HIPAA, using PHI for purposes beyond what is specified in the BAA is a violation. But the line between "providing the service" and "improving the service through training" is not always clear in practice. Check whether the vendor's terms allow them to use your data for model training, and whether the BAA specifically prohibits it.

Processing security gap

As noted above, standard encryption protects data in transit and at rest, but not during processing. HIPAA does not specifically require encryption during processing because when the law was written, the concept barely existed. But the Security Rule does require "reasonable and appropriate" safeguards, and as technology evolves, what counts as reasonable evolves with it.

Technologies like confidential computing — which uses hardware-secured enclaves to protect data during processing itself — represent the current frontier of what "reasonable" protection looks like. The evolution from physical security to encrypted enclaves reflects how the standard of care has changed as threats have become more sophisticated.

Subpoena and legal process

HIPAA permits disclosure of PHI in response to a court order or subpoena under certain conditions. But the interaction between HIPAA and AI vendor privacy policies creates a gap: if the vendor's terms of service broadly permit disclosure to government authorities, a court may find — as Judge Rakoff ruled in February 2026 — that data shared with that vendor has no privilege protection at all. HIPAA's permitted disclosures and the vendor's contractual disclosures can combine in ways that leave client data more exposed than either framework alone would suggest.


A Practical Evaluation Framework

When evaluating any AI tool for clinical documentation, here are the specific questions to ask. You do not need to be a security expert to use this framework — you just need to ask and expect clear answers.

Before you sign up

  • Do you offer a BAA for individual practitioners? Not just enterprise clients — individual therapists in private practice. If the answer is no, or "not yet," or "our terms of service are sufficient," stop here.
  • Where is my data stored, and for how long? Look for specific answers: which cloud provider, which region, what retention period. "Our secure cloud infrastructure" is not a specific answer.
  • Is my data used for model training? This should be a clear yes or no. If yes, that is a potential HIPAA issue and you need to understand exactly what data is used and how.
  • Who at your company can access my clinical content? The ideal answer is "nobody" or "only under specific, logged circumstances." If they cannot answer this clearly, that is a red flag.

Before you paste clinical content

  • Have you signed the BAA? Not "is it available" — have you actually signed it? A BAA sitting on a website that you have not executed is not a BAA.
  • Does this tool meet the minimum necessary standard? HIPAA's minimum necessary rule says you should share only the minimum PHI needed for the purpose. Do you need to paste a full session transcript, or can you achieve the same result with de-identified key phrases?
  • Have you documented your risk assessment? HIPAA requires covered entities to conduct and document a risk assessment of the tools they use. This does not need to be a 200-page report. A written record of what tool you use, what data you share with it, what safeguards it provides, and what risks remain is sufficient for most solo practitioners.

What Good Looks Like

Not all AI tools handle these issues the same way. The landscape ranges from consumer chatbots with no healthcare consideration at all to purpose-built clinical tools with meaningful security architecture. Here is what to look for at each level:

Baseline (minimum acceptable):

  • Signed BAA available for individual practitioners
  • TLS encryption in transit, AES-256 encryption at rest
  • No use of clinical data for model training
  • Clear data retention and deletion policies
  • Access controls and audit logging

Better:

  • Everything above, plus independent security audit (SOC 2, HITRUST)
  • Data residency controls (you choose where data is stored)
  • Minimal data retention (data deleted after processing or short retention window)
  • Transparent privacy policy written in plain language

Best available:

  • Everything above, plus encryption during processing (confidential computing)
  • Hardware-secured enclaves that prevent even the vendor from accessing clinical content
  • Cryptographic attestation that processing occurred in a secure environment
  • Architecture designed so that compliance is structural, not just policy-based

The distinction between policy-based and structural protection is worth understanding. A policy says "we will not look at your data." A structural safeguard means they cannot look at your data, even if they wanted to. As recent security research has shown, the specific architecture matters — not all "secure AI" implementations provide the same level of protection.


Your Obligations as a Practitioner

Regardless of which tools you use, HIPAA places certain obligations on you as the covered entity. These are not optional, and using a compliant tool does not eliminate them.

  • Conduct a risk assessment. Document the tools you use, the data you share, and the safeguards in place. Review this annually or when you change tools.
  • Maintain signed BAAs. Keep copies of every BAA you sign. Know what each one says.
  • Apply the minimum necessary standard. Share only the PHI needed for the task. If you can generate a useful progress note from key phrases rather than a full transcript, do that.
  • Train yourself and your staff. Even in a solo practice, document your HIPAA training. Know the rules for breach notification and patient access requests.
  • Have a breach response plan. If a vendor experiences a breach, you need to know what to do: who to notify, within what timeframe, and what documentation to maintain.

The Bottom Line

"HIPAA compliant" on a product page tells you almost nothing. It is a self-declaration with no standard definition and no external verification. The phrase has become so overused in healthcare AI marketing that it has lost most of its meaning.

What matters is whether the vendor can answer specific questions about BAAs, encryption, data handling, access controls, and breach procedures — and whether those answers satisfy your obligations as a covered entity under HIPAA.

You do not need to become a security expert. But you do need to ask better questions than "are you HIPAA compliant?" Because the answer is always yes, and the answer is not always true.


This post is for informational purposes only and does not constitute legal advice. Mental health practitioners should consult their own legal advisors regarding HIPAA compliance, BAA requirements, and the use of AI tools in their specific practice context.


References

  • U.S. Department of Health and Human Services. (2013). Summary of the HIPAA Security Rule. HHS.gov.
  • U.S. Department of Health and Human Services. (2013). Business Associate Contracts: Sample Business Associate Agreement Provisions. HHS.gov.
  • U.S. Department of Health and Human Services. (2012). Guidance Regarding Methods for De-identification of Protected Health Information. HHS.gov.
  • National Institute of Standards and Technology. (2008). An Introductory Resource Guide for Implementing the Health Insurance Portability and Accountability Act (HIPAA) Security Rule. NIST SP 800-66 Rev. 1.

ConfideAI is a documentation tool built for mental health professionals. All clinical content is processed inside hardware-secured enclaves using confidential computing. Learn more at confideai.ai.
