What is Confidentiality in Healthcare in the Age of AI?
Confidentiality is the core promise we make to every patient: their health information is private and secure. This trust is essential for good patient care. However, the rapid rise of Artificial Intelligence (AI) is changing how that promise must be kept.
AI systems are quickly being built into our hospitals and clinics to help with everything from diagnoses to scheduling. Because AI works by analysing huge amounts of sensitive data, we must urgently rethink how we manage patient privacy to keep that fundamental promise intact. So what is confidentiality in healthcare in the new age of AI?
The privacy shift in specialised healthcare
In healthcare disciplines such as physical therapy, occupational therapy, and psychology, your work is built on deep personal trust, and the data you collect, from progress notes to therapy session details, is highly sensitive Protected Health Information (PHI). Historically, keeping this confidential meant locking your filing cabinets and not engaging in office gossip.
Today, with digital health records, telehealth platforms, and the rise of AI tools that check symptoms or draft documentation, privacy is far more complex. It's now about data governance, security protocols, and strict regulatory compliance. The core ethical rule remains the same: your patient's health information must be completely private and secure.
However, the methods for achieving this must now account for sophisticated AI tools that process, store, and often share data to function effectively. This raises urgent questions: how can small practices remain compliant, ensure data is truly anonymous, and prevent the high-risk issue of re-identification?
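To make re-identification risk concrete, here is a minimal Python sketch of the kind of check a data team might run. The field names (postcode, birth_year, gender) are hypothetical quasi-identifiers chosen for illustration: a patient who is unique on those fields could potentially be re-identified even after their name is removed.

```python
# A minimal re-identification risk check, assuming a simple list-of-dicts
# dataset. The quasi-identifier fields below are hypothetical examples.
from collections import Counter

def k_anonymity(records, quasi_identifiers):
    """Return the size of the smallest group of records sharing the same
    quasi-identifier values. A result of 1 means at least one patient is
    unique in the dataset and could be re-identified."""
    groups = Counter(
        tuple(record[field] for field in quasi_identifiers)
        for record in records
    )
    return min(groups.values())

records = [
    {"postcode": "5000", "birth_year": 1984, "gender": "F"},
    {"postcode": "5000", "birth_year": 1984, "gender": "F"},
    {"postcode": "5067", "birth_year": 1991, "gender": "M"},  # unique: k = 1
]

print(k_anonymity(records, ["postcode", "birth_year", "gender"]))  # -> 1
```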
Key rules for privacy with AI in therapy and specialty practices
1. Hiding personal details
AI algorithms, whether used for scheduling, clinical decision support, or drafting notes, work best by analysing vast amounts of data. To use your patients' PHI for training or research without breaking confidentiality, personal identifiers must be carefully removed or hidden (de-identified).
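As a rough illustration of the idea (not a production tool), here is a minimal Python sketch that redacts a few common direct identifiers from a progress note. The patterns are illustrative only; real de-identification requires a vetted, purpose-built system.

```python
# A minimal de-identification sketch using Python's standard re module.
# These few patterns are for illustration; production de-identification
# needs a vetted tool, not a handful of regexes.
import re

PATTERNS = {
    "[DATE]": r"\b\d{1,2}/\d{1,2}/\d{2,4}\b",
    "[PHONE]": r"\b(?:\+?61|0)\d{9}\b",
    "[EMAIL]": r"\b[\w.+-]+@[\w-]+\.[\w.]+\b",
}

def redact(note: str) -> str:
    """Replace common direct identifiers in a progress note with tags."""
    for tag, pattern in PATTERNS.items():
        note = re.sub(pattern, tag, note)
    return note

note = "Saw patient on 12/03/2024, follow up via jane.doe@example.com."
print(redact(note))
# -> "Saw patient on [DATE], follow up via [EMAIL]."
```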
2. Following the law (Global Privacy Regulations)
Established privacy laws (like HIPAA in the US, the GDPR in Europe, and the Australian Privacy Principles (APPs)) remain the absolute foundation.
Applying them to AI is tricky, especially since specialty practitioners frequently use multiple digital systems for billing, scheduling, and note-taking, which means data is constantly moving. Privacy rules must be built into the AI system and your clinical workflow, not bolted on as an afterthought.
3. Checking AI partners and data rules
Most specialty healthcare practices rely on third-party software (AI platforms, electronic health record systems, cloud services). You must carefully vet these partners to ensure their security and privacy policies meet or exceed the legal requirements.
Partners such as splose encrypt data in transit via SSL and at rest with industry-standard AES-256 encryption. We create hourly backups of your data, encrypt them, and store them securely off-site. All data is stored redundantly across multiple AWS data centres to ensure availability.
splose is GDPR compliant and follows the Australian Privacy Principles (APPs), so users can trust the platform to protect client privacy.
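For readers curious what AES-256 encryption at rest looks like in practice, here is a generic Python sketch using the open-source cryptography library. It illustrates the technique only; it is not splose's actual implementation.

```python
# A generic sketch of AES-256 authenticated encryption at rest, using the
# widely used cryptography library (pip install cryptography). This is an
# illustration of the technique, not splose's actual implementation.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)  # a 256-bit key = AES-256
aesgcm = AESGCM(key)

nonce = os.urandom(12)  # a fresh random nonce per record is essential
record = b"Progress note: patient reports reduced shoulder pain."

ciphertext = aesgcm.encrypt(nonce, record, None)     # encrypt + authenticate
plaintext = aesgcm.decrypt(nonce, ciphertext, None)  # verify + decrypt
assert plaintext == record
```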
4. Transparency and honesty
The relationship with your patient is your greatest asset. Patients have a fundamental right to know how their sensitive data, especially detailed therapy notes, is being used, particularly when AI is involved.
Being completely clear and transparent about your data handling, and offering easy ways for patients to opt out (where possible), is the simplest way to maintain the ethical trust that is central to all these disciplines.
splose offers region-specific downloadable guides that you can share with your patients to make them feel more at ease about the safety of their data.
Secure your practice, protect your patients
Managing patient data securely and ensuring complete compliance in a digital-first, AI-driven environment is complex. That's why we built splose - practice management software designed specifically for healthcare providers.
splose is engineered with state-of-the-art security and compliance at its core, meeting all relevant government standards to protect your patients' data.
Our built-in AI tools are designed for efficiency and are deployed within our secure, compliant environment, meaning you get the benefit of AI innovation without ever compromising the confidentiality that your patients depend on.
Don't let data security be a hurdle to your practice's success.
Frequently asked questions

Does OpenAI keep or train on my data?

No. OpenAI does not keep or use your business data. splose has a Business Associate Agreement (BAA) with OpenAI that enforces a zero data retention policy.

This means your data is never used to train their AI models or stored for any length of time.
Is using AI in healthcare a privacy risk?

No. If we follow the right security steps, hide personal details, and stick to the law (like HIPAA or GDPR), AI can be used safely and privately. The danger lies in poor implementation, or in partners who don't follow the rules.
What is the role of AI in my practice?

AI acts as a highly efficient, secure digital scribe or assistant.
Its main purpose is to dramatically reduce the time you spend on repetitive paperwork, allowing you to focus completely on your patient during the session and leave work on time.