Getting Started with AI Gateway

The AI Gateway in WSO2 API Manager simplifies the integration of AI services into applications by providing a seamless way to manage and expose AI APIs. With built-in support for leading AI vendors such as OpenAI, Azure OpenAI, and Mistral, as well as the flexibility to configure custom AI providers, AI Gateway enables organizations to adopt AI securely and efficiently.

AI Gateway gives you the ability to create AI APIs, which serve as a bridge between your application and AI service providers. These AI APIs allow you to interact with AI models, send requests, and retrieve AI-generated responses.

Note

This Getting Started guide walks you through creating an OpenAI-based AI API.

Create an AI API

  1. Log in to the Publisher Portal (https://<hostname>:9443/publisher).

  2. Create an AI/LLM API by clicking on Create AI/LLM API.

    Select AI API

  3. Select the desired provider and version. Then, click Next.

    Select AI Service Provider and Version

    Tip

    The built-in AI service providers and versions appear in the relevant dropdowns. In addition to these default vendors, you can add custom AI vendors by following the custom AI vendor integration documentation.

  4. Fill in the AI API details and click Create.

    Field      Sample value
    -----      ------------
    Name       OpenAIAPI
    Context    openaiapi
    Version    2.3.0

    The API context is used by the Gateway to identify the API. Therefore, the API context must be unique. This context is the API's root context when invoking the API through the Gateway.

    You can define the API's version as a parameter of its context by adding {version} to the context. For example, {version}/openaiapi. The API Manager internally assigns the actual version of the API to the {version} parameter. For example, https://localhost:8243/2.3.0/openaiapi. Note that in this case the version appears before the context, allowing you to group your APIs based on their versions. (The resulting Gateway URLs are sketched after these steps.)

    Create OpenAI API

    The overview page of the newly created API appears.
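
To make the context and version mapping concrete, the sketch below shows the Gateway base URLs that would result from the sample values above. It assumes the default Gateway HTTPS port 8243; adjust the host and port to match your deployment.

    # Context "openaiapi" (no {version} parameter): the version follows the context
    https://localhost:8243/openaiapi/2.3.0

    # Context "{version}/openaiapi": the version replaces {version} and appears before the context
    https://localhost:8243/2.3.0/openaiapi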

Configure Backend Security

Now that the AI API has been created, the next step is to configure backend security so that the Gateway can access the AI provider. Follow the steps below; a quick way to verify the key itself is sketched after them. For detailed steps, see AI Backend Security.

  1. Create an API key to access the OpenAI API.
  2. Navigate to API Configurations > Endpoints.
  3. Edit Default Production Endpoint and add the API key obtained from step 1. Then, click on Update.
  4. Repeat step 3 for Default Sandbox Endpoint.
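
Before wiring the key into the endpoints, you may want to confirm that it works directly against the OpenAI API. The following curl call is only a quick sanity check and not part of the AI Gateway configuration; it assumes your key is exported as the OPENAI_API_KEY environment variable.

    # Call the OpenAI chat completions endpoint directly with your key
    curl https://api.openai.com/v1/chat/completions \
      -H "Authorization: Bearer $OPENAI_API_KEY" \
      -H "Content-Type: application/json" \
      -d '{
            "model": "o3-mini",
            "messages": [{"role": "user", "content": "Say this is a test!"}]
          }'

A JSON response containing a choices array indicates the key is valid.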

Deploy, Test and Publish your AI API

Following the successful AI API creation and backend security configuration, you can proceed to deploy, test, and publish the AI API.

Invoke AI API

  1. Log in to the Developer Portal (https://<hostname>:9443/devportal) and click the OpenAIAPI that you just published.
  2. Click the Try Out option available under the Overview tab.
  3. Click Get Test Key to generate a test key.
  4. Expand the POST /chat/completions method and click the Try it out button.
  5. Replace the request body with the following:

    {
        "model": "o3-mini",
        "messages": [{"role": "user", "content": "Say this is a test!"}]
    }
    
  6. Note the successful response for the API invocation. An equivalent command-line invocation is sketched after these steps.

    AI API Invocation Success
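
As an alternative to the Developer Portal console, you can invoke the deployed AI API directly from the command line. The curl sketch below assumes the sample context openaiapi and version 2.3.0 from above, the default Gateway URL pattern https://<hostname>:8243/<context>/<version>, and a test key exported as TEST_KEY; adjust these values to match your deployment. The -k flag skips certificate verification for the default self-signed Gateway certificate.

    # Invoke the AI API through the Gateway with the generated test key
    curl -k -X POST "https://localhost:8243/openaiapi/2.3.0/chat/completions" \
      -H "Authorization: Bearer $TEST_KEY" \
      -H "Content-Type: application/json" \
      -d '{
            "model": "o3-mini",
            "messages": [{"role": "user", "content": "Say this is a test!"}]
          }'

A successful invocation returns the model's completion, matching the response you saw in the Try Out console.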

Now, you have successfully created, deployed, published and invoked an AI API.