Short answer
Avoid uploading identifiable health information (PHI) to consumer AI apps. Use only institution‑approved tools for clinical data, or keep processing local/offline. If you must use an external tool, remove identifiers and share the least information needed for a general explanation.
What HIPAA/GDPR do and don’t do
- HIPAA (US): Protects PHI handled by covered entities (clinics, insurers) and their business associates. Many consumer AI apps are not covered. A BAA (business associate agreement) is required for compliant processing.
- GDPR (EU): Treats health data as sensitive; requires lawful basis, minimization, and rights to access/erasure. Compliance still depends on the app’s infrastructure and contracts.
- Bottom line: If your clinic didn’t approve the app, assume your data may be stored, logged, or used for model improvement.
Safer alternatives
- Use your clinic’s portal tools or institution‑approved AI assistants if available.
- Local summaries: Copy non‑identifiable text into a local editor to draft questions for your visit (no uploads).
- De‑identify first: Remove names, dates of birth, addresses, MRNs, precise dates (or shift them), and rare conditions that could identify you (a rough automated first pass is sketched after this list).
- Minimal sharing: Ask for a general explanation of a concept (e.g., what “LVEF 45%” means) without attaching your full report.
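If you want to pre-scrub text before drafting your questions, here is a minimal Python sketch. It only catches identifiers with a predictable format (dates, MRN-style numbers, phone numbers, emails); names, addresses, and rare conditions will not match simple patterns, so always re-read the output yourself before sharing anything. The patterns and placeholder labels are illustrative assumptions, not a validated de-identification tool.

```python
import re

# Illustrative patterns only: a regex pass cannot catch every identifier.
# Names, addresses, and rare conditions still require manual review.
PATTERNS = {
    "DATE": re.compile(r"\b\d{1,2}/\d{1,2}/\d{2,4}\b"),           # e.g. 03/14/2021
    "MRN": re.compile(r"\bMRN[:#]?\s*\d{5,10}\b", re.IGNORECASE),  # e.g. MRN: 0012345
    "PHONE": re.compile(r"\b\d{3}[-.\s]\d{3}[-.\s]\d{4}\b"),       # e.g. 555-123-4567
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),           # e.g. jane@example.com
}

def redact(text: str) -> str:
    """Replace each matched identifier with a bracketed placeholder."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text

if __name__ == "__main__":
    sample = "Seen 03/14/2021, MRN: 0012345, call 555-123-4567."
    print(redact(sample))  # Seen [DATE], [MRN], call [PHONE].
```

Run it locally (no uploads), then review the result by hand: automated scrubbing is a helper for the "least information needed" rule above, not a substitute for it.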
Red flags (don’t upload)
- No privacy policy or unclear data retention.
- App says it may use your content to train models without an opt‑out.
- No way to delete data or export logs.
- Requests full documents when a short excerpt would do.
How to talk with your clinician
- Bring a one‑page summary of your questions and key findings.
- Ask whether the clinic supports any AI tools for patient education or translation.
- If you prepared content with AI, label it as AI‑assisted and verify facts together.