When you use the OLLM Confidential AI Gateway, 3 things happen:
1. Your request runs in a hardware-isolated Trusted Execution Environment
2. The TEE generates a cryptographic proof
3. You can independently verify that proof
Here's how we verify your data's confidentiality:
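As a rough illustration of step 3, here is a minimal Python sketch of what an independent check can look like: verify the TEE vendor's signature over the attestation report, then confirm the reported enclave measurement matches a published value. The field names, the `EXPECTED_MEASUREMENT` constant, and the key handling are illustrative assumptions, not OLLM's actual attestation format; a real check would follow the TEE vendor's report layout and certificate chain.

```python
# Hedged sketch of step 3: independently checking an attestation proof.
# Field names, the expected measurement, and the key handling are
# illustrative assumptions, not OLLM's actual attestation format.

import json

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec

# Placeholder: measurement of the audited enclave image, published out of band.
EXPECTED_MEASUREMENT = "a3f1c2..."


def verify_attestation(report_json: str, signature: bytes,
                       vendor_pubkey: ec.EllipticCurvePublicKey) -> bool:
    """Check the vendor signature over the report, then the enclave measurement."""
    try:
        # 1. The signature must come from the TEE vendor's attestation key.
        vendor_pubkey.verify(signature, report_json.encode("utf-8"),
                             ec.ECDSA(hashes.SHA256()))
    except InvalidSignature:
        return False
    # 2. The code measurement must match the value you expect to be running.
    report = json.loads(report_json)
    return report.get("measurement") == EXPECTED_MEASUREMENT


if __name__ == "__main__":
    # Simulate the TEE side so the sketch runs end to end; in practice the
    # report and signature come from the gateway's response, and the public
    # key from the TEE vendor.
    vendor_key = ec.generate_private_key(ec.SECP256R1())
    report = json.dumps({"measurement": EXPECTED_MEASUREMENT, "nonce": "abc123"})
    sig = vendor_key.sign(report.encode("utf-8"), ec.ECDSA(hashes.SHA256()))
    print("attestation valid:",
          verify_attestation(report, sig, vendor_key.public_key()))
```

The point of the check is that it needs nothing from the gateway operator beyond the proof itself: the measurement and the vendor key are obtained out of band, so you don't have to take the operator's word for what ran.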