# User Privacy & Confidentiality

Mental-health data is among the most sensitive information an app can handle. Users must feel safe, supported, and confident that their personal wellbeing information is protected. therappai is built privacy-first, and your application plays an important role in ensuring that confidentiality is respected at every step.

This guide explains what data is collected, how it should be handled, and best practices for maintaining user trust.

***

## **Core Privacy Principles**

therappai is designed around five essential privacy commitments:

1. **User control**\
   Users decide what information they share and can update or remove it.
2. **Confidentiality**\
   AI therapy messages, moods, tasks, and content interactions are *private*.
3. **No human review**\
   therappai does not use humans to read or evaluate user messages.
4. **No sharing with employers or external parties**\
   Workplace partners only receive **aggregate adoption metrics** (never individual data).
5. **Secure, encrypted transport**\
   All data is encrypted in transit (HTTPS) and stored securely on our servers.

***

## **What Data Is Considered Sensitive**

Your app must treat the following categories as highly confidential:

* AI therapy messages
* Mood logs and emotional notes
* Daily tasks and completion data
* Content interactions
* Emergency contacts
* Personal profile details
* Any data that implies emotional, psychological, or wellbeing context

This information **must not** be logged, exposed, or shared with third parties.
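One practical way to honour the "must not be logged" rule is to mask sensitive fields before any record reaches a logger. The sketch below is a minimal example; the field names in `SENSITIVE_KEYS` are hypothetical and should be mapped to your own data model.

```typescript
// Hypothetical field names for illustration; replace with your own schema.
const SENSITIVE_KEYS = new Set([
  "message",
  "moodNote",
  "taskNote",
  "emergencyContact",
  "profile",
]);

// Return a copy of a record that is safe to log: sensitive fields are
// masked, everything else passes through unchanged.
function redactForLogging(
  record: Record<string, unknown>,
): Record<string, unknown> {
  const safe: Record<string, unknown> = {};
  for (const [key, value] of Object.entries(record)) {
    safe[key] = SENSITIVE_KEYS.has(key) ? "[REDACTED]" : value;
  }
  return safe;
}
```

Routing every log call through a helper like this makes accidental leaks a code-review question rather than a runtime surprise.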

***

## **What Employers and Partners Can See**

If you’re integrating therappai for a workplace or enterprise:

#### Partners can see:

* Total active users
* Usage rates
* Content categories used (aggregate only)
* High-level wellbeing insights (anonymous)
* Basic engagement metrics

#### Partners cannot see:

* Messages
* Specific sessions
* Mood logs
* Daily tasks
* Individual behaviour
* Crisis Buddy contacts
* Any identifiable wellbeing data

This ensures employees can use the tool safely and privately.

***

## **Storing User Data in Your App**

Your application may store some information locally for UI purposes (chat history, mood calendar visuals, etc.), but it must be handled carefully.

#### **DO store safely:**

* mood summaries
* content progress
* cached videos (temporarily)
* chat history (optional, encrypted if stored locally)

#### **DO NOT store:**

* raw tokens in plaintext
* sensitive data in logs
* therapy messages in unprotected storage
* data in third-party analytics tools

If in doubt, **store less**, not more.
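If you do keep chat history locally, it should be encrypted before it touches disk. A minimal sketch using Node's built-in AES-256-GCM follows; key management (e.g. an OS keychain) is out of scope here, and nothing below is a prescribed therappai API.

```typescript
import { createCipheriv, createDecipheriv, randomBytes } from "node:crypto";

// Encrypt a chat entry with AES-256-GCM. Output packs IV + auth tag +
// ciphertext into one base64 string for simple storage.
function encryptChat(plaintext: string, key: Buffer): string {
  const iv = randomBytes(12); // unique IV per message
  const cipher = createCipheriv("aes-256-gcm", key, iv);
  const ciphertext = Buffer.concat([
    cipher.update(plaintext, "utf8"),
    cipher.final(),
  ]);
  const tag = cipher.getAuthTag();
  return Buffer.concat([iv, tag, ciphertext]).toString("base64");
}

// Reverse the packing above and verify the auth tag while decrypting.
function decryptChat(payload: string, key: Buffer): string {
  const raw = Buffer.from(payload, "base64");
  const iv = raw.subarray(0, 12);
  const tag = raw.subarray(12, 28);
  const ciphertext = raw.subarray(28);
  const decipher = createDecipheriv("aes-256-gcm", key, iv);
  decipher.setAuthTag(tag);
  return Buffer.concat([
    decipher.update(ciphertext),
    decipher.final(),
  ]).toString("utf8");
}
```

GCM's authentication tag means tampered ciphertext fails loudly on decryption instead of yielding garbage silently.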

***

## **Recommended Privacy Practices**

These practices help protect users and reduce risk.

***

### **1. Avoid storing sensitive data client-side**

For mobile apps:

* use encrypted, OS-backed storage (Keychain on iOS, Keystore on Android)
* avoid AsyncStorage and plain local text files

For web apps:

* avoid localStorage for confidential content
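For web apps, the simplest alternative to `localStorage` is keeping confidential content in memory only, cleared when the session ends. A sketch of that pattern (the class name is illustrative, not part of any therappai SDK):

```typescript
// Hold confidential content in memory only; nothing is ever persisted,
// so closing the tab or logging out discards it.
class SessionOnlyStore {
  private data = new Map<string, string>();

  set(key: string, value: string): void {
    this.data.set(key, value);
  }

  get(key: string): string | undefined {
    return this.data.get(key);
  }

  // Call on logout or session end.
  clear(): void {
    this.data.clear();
  }
}
```

The trade-off is that content doesn't survive a refresh, which is usually acceptable for therapy-session UI state.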

***

### **2. Do not forward therapy messages to analytics tools**

Never send:

* AI messages
* user messages
* emotional notes\
  to Mixpanel, Google Analytics, Sentry, or similar tools.

Instead, only track:

* screen views
* feature toggles
* strictly non-sensitive UI events
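An event allowlist enforces this at one choke point: anything not explicitly approved never reaches the analytics SDK. The event names below are examples, not a required taxonomy.

```typescript
// Allowlist of non-sensitive UI events; event names are illustrative.
const ALLOWED_EVENTS = new Set([
  "screen_view",
  "feature_toggled",
  "onboarding_completed",
]);

// Gatekeeper in front of your analytics SDK. Returns whether the event
// was forwarded, so tests can verify sensitive events are dropped.
function trackEvent(
  name: string,
  props: Record<string, string | number | boolean> = {},
): boolean {
  if (!ALLOWED_EVENTS.has(name)) return false; // silently drop everything else
  // analyticsSdk.track(name, props); // forward to Mixpanel/GA/etc. here
  return true;
}
```

Denying by default means a new feature can't accidentally ship a sensitive event; someone has to consciously add it to the allowlist.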

***

### **3. Respect user opt-in**

If your app includes:

* journaling
* notifications
* data sharing\
  they should be optional and clearly explained.
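"Optional" is easiest to enforce when privacy-relevant features default to off in code. A minimal sketch (the preference shape is an assumption for illustration):

```typescript
// Privacy-relevant features default to OFF; the user must opt in explicitly.
interface PrivacyPreferences {
  journalingEnabled: boolean;
  notificationsEnabled: boolean;
  dataSharingEnabled: boolean;
}

function defaultPreferences(): PrivacyPreferences {
  return {
    journalingEnabled: false,
    notificationsEnabled: false,
    dataSharingEnabled: false,
  };
}
```

Pairing each flag with a short in-app explanation at the moment of opt-in keeps the choice informed rather than buried in settings.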

***

### **4. Provide clear privacy messaging in-app**

Users engage more when they know:

* messages are private
* data isn’t shared with employers
* no humans read their conversations
* data can be deleted any time

You can display this before starting a session.

***

### **5. Secure all API communication**

Use:

* HTTPS only
* secure storage for tokens
* short-lived access tokens
* server-side protection for refresh tokens
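The HTTPS-only rule can be made unskippable by validating the scheme before any request is built. A small guard, usable in front of whatever HTTP client you prefer:

```typescript
// Reject any non-HTTPS endpoint before a request is ever constructed.
// Substitute your real API base URL; the example host is a placeholder.
function assertHttps(url: string): URL {
  const parsed = new URL(url);
  if (parsed.protocol !== "https:") {
    throw new Error(
      `Refusing insecure transport: ${parsed.protocol}//${parsed.host}`,
    );
  }
  return parsed;
}
```

Failing fast here is preferable to relying on every call site remembering the rule; token storage and refresh-token handling still belong on the server side, as listed above.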

***

### **6. Be transparent about Crisis Buddy**

Make it clear that:

* therappai doesn’t contact emergency services
* crisis contacts are stored but *not* contacted automatically
* users control who they add or remove

***

## **Privacy in Special Contexts**

#### **Workplace integrations**

Employee anonymity is absolute: employers never see individual data.

#### **Coaching apps**

Coaches should not have automatic access to therapy messages.

#### **Consumer apps**

Explain upfront what data is stored and how.

***

## **Summary**

To maintain user trust and safety:

* Treat all wellbeing data as highly sensitive
* Avoid storing or logging unnecessary information
* Never expose therapy messages or moods to analytics
* Protect tokens and enforce secure session flows
* Communicate clearly that user data is private and not shared

therappai handles the AI and safety layers — you are responsible for ethical, secure presentation in your own application.
