API key protection and endpoint security.

Protect any API in minutes with a fully managed proxy.

Get started with our free plan. No credit card required.
Trusted by 100+ developers
Integrate. Requests from your app go through our proxy using split key encryption, DeviceCheck, and certificate pinning.
Configure. You can configure your endpoints, rate limits, models, and alerts to keep your API fully protected.
Monitor. Monitor requests and view the live console in our dashboard to get a better understanding of API usage.
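To make the integration step concrete, here is a minimal sketch of a chat completion call routed through a proxy using plain URLSession. The URL, header name, and key value are illustrative placeholders rather than AIProxy's actual wire format; your dashboard supplies the real values, and the AIProxySwift client also attaches the DeviceCheck token and pins certificates for you.

    import Foundation

    // Minimal sketch: a chat completion request routed through a proxy.
    // The URL and header name are illustrative placeholders only.
    func sendChatCompletion(prompt: String) async throws -> Data {
        var request = URLRequest(url: URL(string: "https://api.aiproxy.pro/v1/chat/completions")!)
        request.httpMethod = "POST"
        request.setValue("application/json", forHTTPHeaderField: "Content-Type")
        // The client half of the split key; useless to an attacker on its own.
        request.setValue("YOUR_PARTIAL_KEY", forHTTPHeaderField: "aiproxy-partial-key")
        let body: [String: Any] = [
            "model": "gpt-4o",
            "messages": [["role": "user", "content": prompt]]
        ]
        request.httpBody = try JSONSerialization.data(withJSONObject: body)
        let (data, _) = try await URLSession.shared.data(for: request)
        return data
    }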
Protect the APIs your app depends on.

Security, observability and control.

Security

AIProxy uses a combination of split key encryption, DeviceCheck and certificate pinning to prevent your key and endpoint from being stolen or abused.
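For readers unfamiliar with the technique, certificate pinning means the app refuses TLS connections whose server certificate doesn't match a copy it already trusts, which defeats interception with a spoofed certificate. The sketch below is a generic illustration with hypothetical names (a bundled certificate passed to PinnedSessionDelegate), not AIProxy's internal implementation; the AIProxySwift client performs pinning for you.

    import Foundation
    import Security

    // Generic certificate pinning: reject any TLS handshake whose leaf
    // certificate does not byte-match the certificate bundled with the app.
    final class PinnedSessionDelegate: NSObject, URLSessionDelegate {
        private let pinnedCertData: Data

        init(pinnedCertData: Data) {
            self.pinnedCertData = pinnedCertData
        }

        func urlSession(_ session: URLSession,
                        didReceive challenge: URLAuthenticationChallenge,
                        completionHandler: @escaping (URLSession.AuthChallengeDisposition, URLCredential?) -> Void) {
            guard challenge.protectionSpace.authenticationMethod == NSURLAuthenticationMethodServerTrust else {
                completionHandler(.performDefaultHandling, nil)
                return
            }
            guard let trust = challenge.protectionSpace.serverTrust,
                  let chain = SecTrustCopyCertificateChain(trust) as? [SecCertificate],
                  let leaf = chain.first,
                  SecCertificateCopyData(leaf) as Data == pinnedCertData else {
                // Pin mismatch: fail closed rather than trust an unknown certificate.
                completionHandler(.cancelAuthenticationChallenge, nil)
                return
            }
            completionHandler(.useCredential, URLCredential(trust: trust))
        }
    }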

DeviceCheck

Monitoring

Our dashboard helps you keep an eye on your usage and get a deeper understanding of how users are interacting with AI in your app.

Usage metrics

Rate Limits

Want to change your API calls from gpt-3.5-turbo to gpt-4o? No problem! You can change models and rate limits right from the dashboard without updating your app.

Option to switch models

Live Console

Use the live console to test your OpenAI calls from your app. Find errors and get a better understanding of performance.

Live console

Notifications

Get alerts when there's suspicious activity so you can take quick action. We'll keep you informed so that you can stay protected.

Alert notifications

Built to Scale

Built on AWS, our service horizontally scales to meet demands. You can be confident that your proxy will continue to run no matter what.


What customers say...

"Using AIProxySwift has been a game-changer for me. It’s incredibly easy to use and install, and since I started using it, I’ve had no more suspicious activity on my OpenAI account. Highly recommend for anyone looking to optimize their AI solutions with a real support."
Mark Evans
CEO App Launchers
"I've loved the product! It's given me the peace of mind that I wouldn't have gotten without having to spin up my own remote server. It's brilliant! The service works so well that I often forget I'm even using it."
Shihab Mehboob
Bulletin
"I’ve been using AI Proxy for the past few months and have been loving the tool. It does exactly what it says on the tin, and without any complications. I appreciate the simplicity and great customer support that I’ve received. Any questions I had have been answered immediately, and any suggestions I’ve made have been added with lightning fast speed."
Juanjo Valiño
Wrapfast
"I see developers on Twitter every day struggling with how to secure their API keys. AIProxy makes it simple, so you no longer have to waste time overthinking a backend."
Hidde van der Ploeg
Helm
"AIProxy lets us enable pro users to use AI without using their own OpenAI key. This creates a smoother user experience that aligns with our product."
James Rochabrun
SwiftOpenAI
"AIProxy is an amazing contribution to the @OpenAI Swift library, thanks @louzell_ and @toddham for the great job! Open source ❤️."
Tirupati Balan
Amigo Finance
"AIProxy has significantly improved our backend operations at Amigo AI, ensuring our GPT key and all OpenAI requests are securely managed. This security gives us peace of mind and lets us focus on delivering excellent financial management AI. The founders have been incredibly supportive and responsive throughout. We're very pleased with their service."
Sam McGarry
Cue
"The set up was a breeze. What could have been a confusing multi-step process felt simple. I was able to get back to building immediately, while knowing my API key was safe."
Luca Lupo
iirc_ai
"I thought I was safe as IIRC is small but that didn't matter; somehow they targeted my app and were able to spoof the API key by looking at network traffic (I was sending it over a Firebase remote config). I then decided to migrate to AI Proxy, it took literally 10 minutes and the new API key has been safe ever since."
Emin Grbo
@r0black
"AIProxy is really a straightforward solution, LOVIN it! "
"With AIProxy you just need to know Swift and SwiftUI, no need to learn other languages or frameworks. Additionally, users will benefit as we can eliminate the need to authenticate them. Perfect for developers and users."
Arjun
@dotarjun
"Just discovered http://aiproxy.pro by @louzell_ all thanks to Tech Twitter. It's one of those rare innovative AI product which isn't just a ChatGPT wrapper."

About Us

Lou Zell

Engineer

Lou has 15 years of industry experience building systems that scale. He previously designed the distributed system that processed all telematics data at Lyft.

Follow on 𝕏

Todd Hamilton

Design Engineer

Todd is a design engineer who spent the last 10 years at Meta. He specializes in product design, front-end development, and prototyping.

Follow on 𝕏

Protect your OpenAI key today!

Integrate with your app in minutes.
Get started with our free plan

FAQs

Have more questions?

Do you store my API keys?

No, we don't store any customer API keys. Instead, we encrypt your key and store one part of the encrypted result in our database. On its own, that part can't be reversed into your secret key. The other part of the encrypted message is sent up with requests from your app. When the two pieces are combined, we derive your secret key and fulfill the request.

Why is it safe to hardcode the key you give me in my client?

The key we provide is useless on its own and can be hardcoded in your client. When you add an OpenAI key in our dashboard, we don't store it on our backend. We encrypt your key, store only one half, and give you the other half to use in your client. We combine the two halves and decrypt only when a request is made.
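As a toy illustration of the split-key idea, assume a simple XOR split (AIProxy's actual scheme is its own): neither half on its own reveals anything about the key, but combining both recovers it.

    import Foundation

    // Toy XOR-based key split. This is NOT AIProxy's actual algorithm; it only
    // shows why one half, by itself, reveals nothing about the original key.
    func splitKey(_ secret: Data) -> (serverHalf: Data, clientHalf: Data) {
        // The client half is a random pad the same length as the secret.
        let pad = Data((0..<secret.count).map { _ in UInt8.random(in: 0...255) })
        // The server half is secret XOR pad; alone, it looks like random noise.
        let serverHalf = Data(zip(secret, pad).map { $0.0 ^ $0.1 })
        return (serverHalf, pad)
    }

    func combineKey(serverHalf: Data, clientHalf: Data) -> Data {
        // XOR the halves back together at request time to recover the secret.
        Data(zip(serverHalf, clientHalf).map { $0.0 ^ $0.1 })
    }

    // The secret is only recoverable when the two halves meet.
    let secret = Data("sk-example-openai-key".utf8)
    let (serverHalf, clientHalf) = splitKey(secret)
    assert(combineKey(serverHalf: serverHalf, clientHalf: clientHalf) == secret)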

How do you prevent abuse of my endpoint?

We have multiple mechanisms in place to restrict endpoint abuse:
1. Your AIProxy project comes with proxy rules that you configure. In the proxy rules section, you enable only the endpoints your app depends on. For example, if your app depends on /v1/chat/completions, you would permit proxying of requests to that endpoint and block all others. This makes your endpoint less desirable to attackers.

2. We use Apple's DeviceCheck service to ensure that requests to AIProxy originate from your app running on legitimate Apple hardware (a client-side sketch follows this list).

3. We guarantee that DeviceCheck tokens are only used once, which prevents an attacker from replaying a token that they sniffed from the network.
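For point 2 above, here is roughly what generating a DeviceCheck token looks like on the client, using Apple's DCDevice API. Attaching the token to the proxied request is handled by the AIProxySwift client; the error values below are illustrative.

    import DeviceCheck
    import Foundation

    // Generate a DeviceCheck token on the device. The proxy can then verify
    // the token with Apple before fulfilling the request, and each token is
    // accepted only once.
    func deviceCheckToken() async throws -> Data {
        guard DCDevice.current.isSupported else {
            // Simulators and some older devices don't support DeviceCheck.
            throw NSError(domain: "DeviceCheckUnsupported", code: 1)
        }
        return try await withCheckedThrowingContinuation { continuation in
            DCDevice.current.generateToken { token, error in
                if let token {
                    continuation.resume(returning: token)
                } else {
                    continuation.resume(throwing: error ?? NSError(domain: "DeviceCheck", code: 2))
                }
            }
        }
    }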

How does the service scale?

The proxy is deployed on AWS Lambda, meaning we can effortlessly scale horizontally behind a load balancer.

How do I integrate AIProxy into my app?

Upon configuring your project in the developer dashboard, you'll receive initialization code to drop into the AIProxySwift client.
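The generated snippet is specific to your project; the sketch below only shows its general shape, with placeholder values. Treat the identifiers as illustrative and copy the exact code from your dashboard and the AIProxySwift README.

    import AIProxy

    // Illustrative shape of the dashboard-generated initialization code.
    // The partial key and service URL are placeholders for your project's values.
    let openAIService = AIProxy.openAIService(
        partialKey: "v2|your-partial-key",                    // client half of the split key
        serviceURL: "https://api.aiproxy.pro/your-project-id"
    )

    // Requests then flow through the proxy instead of hitting OpenAI directly, e.g.:
    // let response = try await openAIService.chatCompletionRequest(body: ...)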