Free OpenAI API Access Arrives via GitHub Tool as ChatGPT Ads Remain Limited Globally
Photo by Levart_Photographer (unsplash.com/@siva_photography) on Unsplash
A free tool published on GitHub lets users route requests to OpenAI models through a local proxy authenticated with their ChatGPT OAuth tokens, eliminating the need for an API key, even as OpenAI's ChatGPT advertising rollout remains limited to select markets worldwide.
Key Facts
- Key company: OpenAI
The new openai‑oauth package, published on GitHub by developer Evan Zhou, provides a locally hosted proxy that forwards requests to chatgpt.com using the user's OAuth tokens instead of a traditional API key [GitHub repo]. The tool can be invoked with a single npx command, which spawns an endpoint at http://127.0.0.1:10531/v1 that mimics the official OpenAI API surface. By default the proxy discovers every Codex‑compatible model attached to the caller's ChatGPT account, such as gpt‑5.4 and gpt‑5.3‑codex, so developers can query the same models they see in the web UI without paying for a separate API subscription. The package also ships a Vercel AI SDK provider variant, allowing serverless functions to call createOpenAIOAuth and generate text over the same token‑based transport [GitHub repo].
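Assuming the proxy faithfully mirrors the official chat completions surface under its /v1 base, a request against it can be sketched with nothing but the standard library. The helper names and the /chat/completions path here are illustrative, not taken from the repo:

```python
import json
import urllib.request

# Default endpoint spawned by the proxy (assumed to be running locally via
# the single npx command); the path mirrors the official OpenAI API.
PROXY_BASE = "http://127.0.0.1:10531/v1"

def build_chat_request(model: str, prompt: str) -> tuple[str, bytes]:
    """Assemble an OpenAI-compatible chat completions request for the proxy."""
    url = f"{PROXY_BASE}/chat/completions"
    body = json.dumps({
        "model": model,  # e.g. a Codex-compatible model the account exposes
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return url, body

def send(url: str, body: bytes) -> dict:
    """POST the request; no API key header is set, the proxy injects auth."""
    req = urllib.request.Request(
        url, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

# Usage (requires a running proxy and a signed-in ChatGPT account):
#   reply = send(*build_chat_request("gpt-5.3-codex", "Say hello."))
#   print(reply["choices"][0]["message"]["content"])
```

Because the payload is the standard chat completions shape, any client library that accepts a custom base URL should be able to point at the proxy the same way.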
The package's configuration options expose the underlying transport details: developers can bind the proxy to a custom host or port, restrict the model list with a comma‑separated allowlist, or override the upstream base URL and OAuth client ID used for token refresh [GitHub repo]. That flexibility means the tool can be adapted for internal testing, rapid prototyping, or integration into larger AI‑agent frameworks that expect the OpenAI‑compatible endpoint format. Because the proxy reuses the user's existing ChatGPT session, it sidesteps the per‑token billing attached to standard API keys, effectively turning any logged‑in ChatGPT account into a free API gateway bounded only by that account's own ChatGPT usage limits.
OpenAI's decision to keep its ChatGPT advertising rollout limited to a handful of markets, as reported by BleepingComputer, underscores a strategic divergence between monetization and developer outreach [BleepingComputer]. While the company experiments with ad‑supported experiences in select regions, community tooling like the OAuth proxy is lowering the barrier to entry for developers who want to embed its models in apps or services. The proxy thus arrives at a moment when official API pricing remains a friction point for many hobbyist and early‑stage projects, especially those that cannot justify the $0.02–$0.12 per‑1K‑token rates that apply to GPT‑4 and newer models.
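That pricing friction is easy to make concrete. Using the per‑1K‑token bounds quoted above (actual pricing varies by model and direction of traffic), a rough calculator shows what a modest project would pay on the metered API:

```python
def api_cost_usd(tokens: int, rate_per_1k: float) -> float:
    """Cost of usage at a given per-1K-token rate. The rates below come
    from the article's quoted $0.02-$0.12 range, used here illustratively."""
    return tokens / 1000 * rate_per_1k

# A hobbyist project burning 2M tokens a month lands between these bounds:
low = api_cost_usd(2_000_000, 0.02)   # 40.0 USD
high = api_cost_usd(2_000_000, 0.12)  # 240.0 USD
```

Even at the low end, tens of dollars a month is real money for a side project, which is the gap the free proxy fills.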
Industry observers have noted that similar “free‑access” workarounds have surfaced before; TechCrunch documented a developer who exploited an API flaw to obtain unrestricted GPT‑4 usage [TechCrunch]. OpenAI’s open‑source proxy does not rely on a vulnerability but on the legitimate authentication flow of ChatGPT itself, which the platform already permits for interactive use. Nonetheless, the approach raises questions about how OpenAI will enforce usage caps, prevent abuse, and reconcile token‑based access with its broader revenue model that includes both paid API consumption and emerging ad‑supported tiers.
The timing of the release also dovetails with OpenAI’s broader push on AI agents, highlighted in recent Ars Technica coverage of the company’s new developer API that promises “agents that will join the workforce by 2025” [Ars Technica agent]. By offering a free, locally hosted gateway that speaks the same OpenAI‑compatible schema, openai‑oauth could accelerate experimentation with those agent capabilities, allowing developers to prototype multi‑step workflows without incurring API costs. If the tool gains traction, it may pressure OpenAI to formalize token‑based billing or to expand its ad‑supported offerings beyond the current limited rollout, balancing the need for revenue with the community’s appetite for unrestricted model access.
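A multi‑step workflow against any OpenAI‑compatible endpoint ultimately comes down to folding each model reply back into the message list before the next call. This is a generic sketch of that loop step, not code from the proxy or from OpenAI's agent API:

```python
def extend_conversation(messages: list[dict], assistant_reply: str,
                        next_user_turn: str) -> list[dict]:
    """One step of a multi-turn agent loop: record the model's reply,
    then append the next instruction, so the full context rides along
    with every subsequent OpenAI-compatible request."""
    return messages + [
        {"role": "assistant", "content": assistant_reply},
        {"role": "user", "content": next_user_turn},
    ]

# Two-step prototype: plan first, then act on the plan.
history = [{"role": "user", "content": "List three test cases."}]
history = extend_conversation(history, "1. empty input ...",
                              "Now write the code for case 1.")
```

Each iteration would hand `history` to the local endpoint, making cost‑free repetition of exactly this loop the experimentation the proxy enables.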
Reporting based on verified sources and public filings. Sector HQ editorial standards require multi-source attribution.