Beta

Protect your OpenAI key with a fully managed proxy

Say goodbye to complicated cloud functions and building your own backend.

Get started with our free plan. No credit card required.
How it works. OpenAI requests from your app go through our proxy server using split key encryption and DeviceCheck for added security. Learn more
Setup in minutes. Integrating with AIProxy is dead simple. Setup only takes a few minutes so you can stay focused on building your app. View integration video
Swift Integration. AIProxy is built into SwiftOpenAI. Make a small change to your initialization code, and all OpenAI requests are proxied through AIProxy. Learn more
Juanjo Valiño
Wrapfast
"I see developers on Twitter every day struggling with how to secure their API keys. AIProxy makes it simple, so you no longer have to waste time overthinking a backend."
Shihab Mehboob
Bulletin
"I’ve been using AIProxy for the past few months and appreciate the simplicity and great customer support that I’ve received. Suggestions I’ve made have been added with lightning fast speed."
James Rochabrun
SwiftOpenAI
"AIProxy is an amazing contribution to the @OpenAI Swift library, thanks @louzell_ and @toddham for the great job! Open source ❤️."
Sam McGarry
Cue
"The set up was a breeze. What could have been a confusing multi-step process felt simple. I was able to get back to building immediately, while knowing my API key was safe."

Security, observability and control.

Key security and DeviceCheck.

AIProxy uses a combination of split key encryption and DeviceCheck to prevent your key and endpoint from being stolen or abused.

Device Check

Monitor usage on our dashboard.

Our dashboard helps you keep an eye on your usage and get a deeper understanding of how users are interacting with AI in your app.

Usage metrics

Hot swap models and proxy rules.

Want to change from GPT-3.5 to GPT-4? No problem! You can change models and parameters right from the dashboard without updating your app.

Option to switch models

Test API calls with live console.

Use the live console to test your OpenAI calls from your app. Find errors and get a better understanding of performance.

Live console

Alerts to keep you alert.

Get alerts when there's suspicious activity so you can take quick action.

Alert notifications

Built to scale.

Built on AWS, our service horizontally scales to meet demands.

AWS Logo

Frequently Asked Questions

Do you store my OpenAI key?
We don't actually store any customer OpenAI keys. Instead, we encrypt your key and store one part of that encrypted result in our database. On its own, this part can't be reversed into your secret key. The other part of the encrypted message is sent up with requests from your app. When the two pieces are married, we derive your secret key and fulfill the request to OpenAI.
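The idea behind split-key encryption can be illustrated with a toy sketch. This uses a simple XOR (one-time-pad) split — an assumption for illustration only, not AIProxy's actual scheme — where neither share alone reveals the key, but combining both recovers it:

```python
import secrets

def split_key(api_key: str) -> tuple[bytes, bytes]:
    """Split a secret into two shares; neither share alone reveals the key."""
    key_bytes = api_key.encode()
    pad = secrets.token_bytes(len(key_bytes))             # share 1: random bytes (stored server-side)
    share = bytes(a ^ b for a, b in zip(key_bytes, pad))  # share 2: key XOR pad (shipped with the app)
    return pad, share

def join_key(pad: bytes, share: bytes) -> str:
    """Recombine the two shares to recover the original key."""
    return bytes(a ^ b for a, b in zip(pad, share)).decode()

pad, share = split_key("sk-example-123")
assert join_key(pad, share) == "sk-example-123"  # only both halves together yield the key
```

Because each share on its own is indistinguishable from random bytes, a database leak or an app binary dump exposes nothing by itself.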

How do you prevent my endpoint from being abused?
We have multiple mechanisms in place to restrict endpoint abuse:
1. Your AIProxy project comes with proxy rules that you configure. You can enable only the endpoints that your app depends on in the proxy rules section. For example, if your app depends on /v1/chat/completions, then you would permit the proxying of requests to that endpoint and block all others. This makes your endpoint less desirable to attackers.

2. We use Apple's DeviceCheck service to ensure that requests to AIProxy originated from your app running on legitimate Apple hardware.

3. We guarantee that DeviceCheck tokens are only used once, which prevents an attacker from replaying a token that they sniffed from the network.
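Mechanisms 1 and 3 above can be sketched in a few lines. This is an illustrative model with hypothetical names, not AIProxy's actual server code, and it skips mechanism 2 (validating the DeviceCheck token with Apple's servers):

```python
# Proxy rules configured in the dashboard: only enabled endpoints are proxied.
ALLOWED_ENDPOINTS = {"/v1/chat/completions"}

# Tokens already consumed. In production this would be shared storage, not a local set.
used_tokens: set[str] = set()

def authorize(path: str, device_check_token: str) -> bool:
    """Allow a request only for enabled endpoints and unused DeviceCheck tokens."""
    if path not in ALLOWED_ENDPOINTS:
        return False                        # rule 1: endpoint not enabled in proxy rules
    if device_check_token in used_tokens:
        return False                        # rule 3: replayed token is rejected
    used_tokens.add(device_check_token)     # mark the token as consumed (single use)
    return True

assert authorize("/v1/chat/completions", "tok-1") is True
assert authorize("/v1/chat/completions", "tok-1") is False   # replay blocked
assert authorize("/v1/images/generations", "tok-2") is False # endpoint blocked
```

Marking each token as consumed on first use is what defeats replay: a sniffed token is worthless once the original request has gone through.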

How does AIProxy scale?
The proxy is deployed on AWS Lambda, meaning we can effortlessly scale horizontally behind a load balancer.

How do I integrate AIProxy into my app?
Upon configuring your project in the developer dashboard, you'll receive initialization code to drop into the SwiftOpenAI client. Alternatively, you can use a bootstrap product like WrapFast.