Is it possible to configure Postman to use a self-hosted, OpenAI-API-compatible AI endpoint? I'm excited to see that it offers AI assistance, but I found no such preference in the settings.
Hi there,
At the moment, Postman doesn't provide a built-in option to configure a self-hosted, OpenAI-compatible endpoint for its AI features. The AI assistant is tightly integrated with Postman's own cloud service and expects an API key managed through Postman's environment; it isn't designed to point at arbitrary endpoints.
That said, a few points to keep in mind:
- Official support is limited: currently there's no exposed setting in Postman's UI or configuration files to override the AI endpoint.
- Workarounds are not officially supported: some advanced users try proxies or local API wrappers that mimic OpenAI responses, but this can break the AI features and isn't guaranteed to work.
- Feature requests: Postman has a feedback forum and GitHub where users can suggest this capability. Given the growing interest in self-hosted LLMs, submitting a feature request could encourage Postman to add support for custom endpoints in the future.
For now, the only supported approach is to use Postman's AI tools through its cloud-managed OpenAI integration. If fully offline or self-hosted usage is critical, you may need to build your own local workflow: run the models you have and call them from Postman via pre-request scripts, test scripts, or external API calls, instead of relying on the built-in AI assistant.
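As a sketch of that do-it-yourself route, a pre-request script can call any HTTP endpoint with `pm.sendRequest`. The helper below builds a request body for a hypothetical self-hosted, OpenAI-compatible server (the URL and model name are assumptions, not anything Postman provides):

```javascript
// Builds the request object a Postman pre-request script could pass to
// pm.sendRequest to call a self-hosted OpenAI-compatible server.
// The endpoint URL and model name are placeholders for your own setup.
function buildChatRequest(prompt) {
  return {
    url: "http://localhost:8000/v1/chat/completions", // hypothetical local server
    method: "POST",
    header: { "Content-Type": "application/json" },
    body: {
      mode: "raw",
      raw: JSON.stringify({
        model: "local-model", // whatever model your server exposes
        messages: [{ role: "user", content: prompt }],
      }),
    },
  };
}

// Inside a Postman pre-request script, this could be used as:
//
//   pm.sendRequest(buildChatRequest("Draft test assertions for this endpoint"),
//     (err, res) => {
//       if (!err) {
//         pm.environment.set("ai_output",
//           res.json().choices[0].message.content);
//       }
//     });
```

This doesn't replace the built-in assistant's UI integration, but it does let you pipe a local model's output into variables your requests and tests can use.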