AI Configuration
Before configuring MCP AI in your IO application, ensure you have your API token ready. This topic walks you through obtaining your token and setting up AI configuration correctly.
Getting Your API Token
Follow these steps to retrieve your API token:
Log in to your IO application.
Navigate to Settings from the main navigation panel.
If you don’t have a token yet, click Create New Token.
Copy the generated token and store it securely—you will need it for AI configuration.
Note
To get your LLM API key (for OpenAI, for example), log in to your provider’s portal (such as https://platform.openai.com), create a new secret key, and save it separately.
Setting Up AI in Virtana IO
This topic shows you exactly how to configure AI-powered queries in your Virtana IO platform using the Settings interface.

Navigate to Settings > AI Configuration.
The AI Configuration page opens, where you can set up your AI provider.
Connect to OpenAI (Basic Setup)
The Virtana MCP servers require access to a Large Language Model (LLM) to ensure accurate responses to tool requests.
By default, the system uses OpenAI models.
Enter your OpenAI API key in the masked field. This key automatically connects Virtana to OpenAI’s models using your credentials. Keys are encrypted and stored securely.
If you see “AI Connection Successful” with a green checkmark, your setup is working.
Customize Your AI Provider (Advanced Settings)
To change the model provider, select a different model, or adjust model parameters, open the advanced settings.
Change this only if you are:
Using a custom AI proxy (for example, a LiteLLM proxy), or
Connecting to a different OpenAI-compatible provider, such as Azure OpenAI Service or OpenRouter.
Most users should leave this value unchanged.
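If you do point the advanced settings at a custom provider, the client simply sends OpenAI-compatible requests to your base URL instead of OpenAI's. The sketch below builds (but does not send) such a request; the base URL, key, and model name are placeholder assumptions, not values from Virtana IO.

```python
import json
import urllib.request

# Hypothetical values -- substitute your own provider details.
base_url = "https://litellm-proxy.example.com/v1"  # default provider would use https://api.openai.com/v1
api_key = "sk-example"                             # your provider's API key
model = "gpt-4o-mini"                              # any model your provider exposes

# Build an OpenAI-compatible chat-completions request against the custom base URL.
payload = {"model": model, "messages": [{"role": "user", "content": "ping"}]}
request = urllib.request.Request(
    url=f"{base_url}/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    },
    method="POST",
)
```

Because the request shape is identical, swapping providers is only a matter of changing the base URL and key.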
Proxy Configuration
If your network requires proxy access to reach external AI providers or the IO Server, enable proxy support in the MCP extension settings.
Select the “Access via Proxy” option.
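Conceptually, enabling “Access via Proxy” routes outbound AI traffic through an intermediary. A minimal Python sketch of the same idea, with a hypothetical proxy address, looks like this:

```python
import urllib.request

# Hypothetical proxy address -- use the proxy your network requires.
proxies = {"https": "http://proxy.internal.example:3128"}

# Build an opener routed through the proxy; any request made with this
# opener reaches external AI providers (or the IO Server) via the proxy.
handler = urllib.request.ProxyHandler(proxies)
opener = urllib.request.build_opener(handler)
```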
Action Buttons
At the bottom of the page, you have four options:
Save: Stores your current configuration.
Test Connection: Validates your API key and connection.
Delete Configuration: Removes all AI settings.
Reset: Returns all fields to default values.
Once configured, you can ask natural-language questions about your infrastructure.
Configuring the IO MCP Extension
To enable communication between your AI tool and the IO Server, configure the MCP service as an HTTP extension. This setup allows the AI component to interact securely with underlying IO APIs.
Refer to Virtana’s Model Context Protocol Server documentation for instructions on installing the Goose client and other details.
Note
The Goose client and OpenAI LLM provider are optional components; you can choose any MCP-compatible client and LLM provider that suits your environment.
Add the Virtana IO Extension
To add the Virtana MCP extension, perform the following steps:

In the client application (for example, Goose client), open the left navigation and select Extensions.
Click Add Custom Extension.
Enter the extension name and set the Type to Streamable HTTP.
Note
When you select Streamable HTTP as the type, the Endpoint options become available.
Enter Endpoint / MCP URL as: https://<io_server>/api/sdk/mcp
Navigate to Request Headers and add the following header:
Header Name: Authorization
Value: Bearer <Your_IO_API_Token>
To generate client-id and client-secret, see Generate OAuth credentials for the MCP client.
Optionally, give the extension a meaningful name, such as Virtana IO_Server, so users can recognize it easily.
Click Add Extension or Save to keep the configuration.
When successfully activated, you will see all enabled extensions.
Start Testing in Chat.
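The endpoint and header configured in the steps above amount to a plain HTTPS request with a Bearer token. The sketch below builds that request in Python so you can see exactly what the extension sends; the host and token values are placeholders you would replace with your own.

```python
import urllib.request

# Hypothetical values -- replace with your IO Server host and API token.
io_server = "io.example.com"
io_api_token = "your-io-api-token"

# The same endpoint and Authorization header the extension settings use.
request = urllib.request.Request(
    url=f"https://{io_server}/api/sdk/mcp",
    headers={"Authorization": f"Bearer {io_api_token}"},
)
```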
Connection Details
The following table outlines the required connection settings for configuring the MCP extension:
| Setting | Description | Example / Recommended Value |
|---|---|---|
| Extension Name | Defines the identifier for the IO server connection. | io_server |
| Type | Specifies the communication protocol. | Streamable HTTP |
| Endpoint | The MCP service endpoint on your IO Server. | https://<io_server>/api/sdk/mcp |
| Timeout | Maximum response wait time for extension requests. | 300 seconds (recommended) |
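The connection settings in the table can be gathered into a single structure. This sketch mirrors the table only; the key names are hypothetical and do not reflect an actual client configuration schema.

```python
# Hypothetical key names -- a mirror of the connection-settings table,
# not an actual client configuration file format.
io_server = "io.example.com"  # replace with your IO Server host

mcp_extension = {
    "name": "io_server",
    "type": "streamable_http",
    "endpoint": f"https://{io_server}/api/sdk/mcp",
    "timeout_seconds": 300,  # recommended maximum wait for agent responses
    "headers": {"Authorization": "Bearer <Your_IO_API_Token>"},
}
```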
Implementation Summary
The MCP extension provides intelligent, AI-driven access to your IO Server data. Unlike standard API wrappers, it uses AI agents to interpret requests, perform reasoning, and return contextual results.
| Item | Description |
|---|---|
| Primary Endpoint | https://<io_server>/api/sdk/mcp |
| Agent Intelligence | MCP endpoints function as intelligent agents that interpret and process user requests using AI reasoning. Some operations may take longer than typical API calls due to contextual processing. |
| Capabilities | The AI extension supports schema queries, entity and metric retrieval, and exploration of basic relationships within the IO environment. |
| Usage Warning | Because this feature leverages AI, always manually validate generated responses and data interpretations for accuracy before applying the results operationally. |
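Under the hood, MCP clients talk to the endpoint using JSON-RPC 2.0; for example, tool discovery uses the standard `tools/list` method from the MCP specification. The sketch below shows a minimal request body of that shape (the `id` value is arbitrary).

```python
import json

# A minimal MCP "tools/list" request body (JSON-RPC 2.0) -- the kind of
# message an MCP client sends to discover the tools the server exposes.
body = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/list",
}
wire_message = json.dumps(body)
```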
Troubleshooting
Use the table below to diagnose and resolve common issues when configuring or using the MCP extension.
Note
Because AI agents perform reasoning to generate responses, occasional delays are normal. If timeouts persist, increase the timeout value or contact Virtana Support.
| Issue | Potential Cause | Resolution |
|---|---|---|
| 401 Unauthorized | Missing or invalid IO API Token | Verify that the Authorization header in the Extension settings contains a valid Bearer token. |
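A quick way to rule out the most common 401 cause is to check the shape of the Authorization header before suspecting the token itself. The helper below is a hypothetical diagnostic sketch, not part of the IO product.

```python
def diagnose_auth_header(headers):
    """Return a hint for a 401 based on the Authorization header (sketch)."""
    value = headers.get("Authorization")
    if value is None:
        return "missing Authorization header"
    if not value.startswith("Bearer "):
        return "Authorization header is not a Bearer token"
    return "header looks well-formed; verify the token itself is valid"
```

If the header is present and well-formed but the 401 persists, regenerate the IO API token and update the extension settings.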