AI Configuration

Before configuring MCP AI in your IO application, ensure you have your API token ready. This topic walks you through obtaining your token and setting up AI configuration correctly.

Getting Your API Token

Follow these steps to retrieve your API token:

  1. Log in to your IO application.

  2. Navigate to Settings from the main navigation panel.

  3. If you don’t have a token yet, click Create New Token.

  4. Copy the generated token and store it securely; you will need it for AI configuration.

Note

To get your LLM API key (for example, from OpenAI), log in to your provider's portal (such as https://platform.openai.com), create a new secret key, and save it separately.

Setting Up AI in Virtana IO

This topic shows you exactly how to configure AI-powered queries in your Virtana IO platform using the Settings interface.

Figure: AI Configuration page (ai_conf.png)

  1. Navigate to Settings > AI Configuration.

  2. The AI Configuration page opens, where you can set up your AI provider.

Connect to OpenAI (Basic Setup)

The Virtana MCP servers require access to a Large Language Model (LLM) to ensure accurate responses to tool requests.

By default, the system uses OpenAI models.

Enter your OpenAI API key in the masked field. This key automatically connects Virtana to OpenAI’s models using your credentials. Keys are encrypted and stored securely.

If you see “AI Connection Successful” with a green checkmark, your setup is working.
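If you want to confirm that an OpenAI key works independently of the Virtana UI, you can call OpenAI's public model-listing endpoint directly. A minimal sketch (the `OPENAI_API_KEY` environment variable and the fallback key value are placeholders; the request is built but not sent):

```python
import os
import urllib.request

# Build a GET request against OpenAI's model-listing endpoint.
# A 200 response means the key is valid; a 401 means it is missing or revoked.
def build_models_request(api_key: str) -> urllib.request.Request:
    return urllib.request.Request(
        "https://api.openai.com/v1/models",
        headers={"Authorization": f"Bearer {api_key}"},
    )

req = build_models_request(os.environ.get("OPENAI_API_KEY", "sk-placeholder"))
# urllib.request.urlopen(req)  # uncomment to actually send the request
```

This checks only that the key authenticates; the Test Connection button in the UI remains the authoritative check for the Virtana-side configuration.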

Customize Your AI Provider (Advanced Settings)

To change the model provider, select a different model, or adjust model parameters, expand the advanced settings.

  • Change these settings only if you are:

    • Using a custom AI proxy (for example, a LiteLLM proxy), or

    • Connecting to a different OpenAI-compatible provider, such as Azure OpenAI Service or OpenRouter.

    Most users should leave this value unchanged.
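Switching to an OpenAI-compatible provider or proxy usually comes down to swapping the base URL while keeping the same request paths. A minimal sketch of that idea (the LiteLLM proxy hostname below is hypothetical):

```python
from urllib.parse import urljoin

# OpenAI-compatible providers expose the same paths (chat/completions,
# models, ...) under a different base URL, so changing providers is mostly
# a matter of changing the base URL and the API key.
def endpoint(base_url: str, path: str) -> str:
    return urljoin(base_url.rstrip("/") + "/", path.lstrip("/"))

# Default OpenAI base vs. a hypothetical in-house LiteLLM proxy:
print(endpoint("https://api.openai.com/v1", "chat/completions"))
print(endpoint("https://litellm.internal.example/v1/", "/chat/completions"))
```

The trailing-slash normalization matters: `urljoin` drops the last path segment of a base URL that does not end in `/`, which is a common source of broken provider overrides.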

Proxy Configuration

If your network requires proxy access to reach external AI providers or the IO Server, enable proxy support in the MCP extension settings.

  • Select the “Access via Proxy” option.
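Outside the UI, most HTTP clients (including Python's urllib and common CLI tools) pick up proxy settings from the standard environment variables. A sketch, with a hypothetical proxy host:

```python
import os
import urllib.request

# HTTPS_PROXY/HTTP_PROXY are honored by most tools; NO_PROXY lists hosts
# that should be reached directly. The proxy address below is a placeholder.
os.environ["HTTPS_PROXY"] = "http://proxy.internal.example:3128"
os.environ["NO_PROXY"] = "localhost,127.0.0.1"

# urllib reads these variables automatically when opening URLs.
print(urllib.request.getproxies()["https"])
```

This is useful for confirming what proxy your environment would apply before enabling the equivalent option in the MCP extension settings.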

Action Buttons

At the bottom of the page, you have four options:

  • Save: Stores your current configuration.

  • Test Connection: Validates your API key and connection.

  • Delete Configuration: Removes all AI settings.

  • Reset: Returns all fields to default values.

Once configured, you can ask natural-language questions about your infrastructure.

Configuring the IO MCP Extension

To enable communication between your AI tool and the IO Server, configure the MCP service as an HTTP extension. This setup allows the AI component to interact securely with underlying IO APIs.

Refer to Virtana's Model Context Protocol Server documentation for installing the Goose client and other details.

Note

The Goose client and OpenAI LLM provider are optional components; you can choose any MCP-compatible client and LLM provider that suits your environment.

Add the Virtana IO Extension

To add the Virtana MCP extension, perform the following steps:

Figure: IO MCP extension configuration (io_mcp_conf.png)

  1. In the client application (for example, Goose client), open the left navigation and select Extensions.

  2. Click Add Custom Extension.

  3. Enter the extension name and set the Type to Streamable HTTP.

    Note

    When you select Streamable HTTP as the type, the Endpoint options become available.

  4. Enter Endpoint / MCP URL as: https://<io_server>/api/sdk/mcp

  5. Navigate to Request Headers and add the following header:

    • Header Name: Authorization

    • Value: Bearer <Your_IO_API_Token>

    To generate the client-id and client-secret, see Generate OAuth credentials for the MCP client.

  6. Optionally, give the extension a meaningful name, such as Virtana IO_Server, so users can recognize it easily.

  7. Click Add Extension or Save to keep the configuration.

  8. When the extension is successfully activated, it appears in the list of enabled extensions.

  9. Start Testing in Chat.
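To sanity-check the extension settings outside the client, you can build the same request an MCP client sends first: a POST to the MCP endpoint with the Bearer token and a JSON-RPC `initialize` payload. This is a sketch under the assumption that the server follows the standard MCP Streamable HTTP transport; the host and token values are placeholders:

```python
import json
import urllib.request

IO_SERVER = "io.example.com"        # placeholder for your IO Server host
IO_API_TOKEN = "YOUR_IO_API_TOKEN"  # placeholder for the token from Settings

# JSON-RPC "initialize" is the first call an MCP client makes over the
# Streamable HTTP transport.
payload = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2025-03-26",
        "capabilities": {},
        "clientInfo": {"name": "manual-check", "version": "0.1"},
    },
}

req = urllib.request.Request(
    f"https://{IO_SERVER}/api/sdk/mcp",
    data=json.dumps(payload).encode(),
    headers={
        "Authorization": f"Bearer {IO_API_TOKEN}",
        "Content-Type": "application/json",
        "Accept": "application/json, text/event-stream",
    },
    method="POST",
)
# urllib.request.urlopen(req, timeout=300)  # uncomment to send; a 401 here
# means the Bearer token is missing or invalid.
```

A successful response from a working server confirms that the endpoint URL and the Authorization header match what the extension expects.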

Connection Details

The following table outlines the required connection settings for configuring the MCP extension:

Table 84. Connection Details

| Setting        | Description                                         | Example / Recommended Value       |
| -------------- | --------------------------------------------------- | --------------------------------- |
| Extension Name | Defines the identifier for the IO Server connection. | io_server                         |
| Type           | Specifies the communication protocol.               | Streamable HTTP                   |
| Endpoint       | The MCP service endpoint on your IO Server.         | https://<io_server>/api/sdk/mcp   |
| Timeout        | Maximum response wait time for extension requests.  | 300 seconds (recommended)         |



Implementation Summary

The MCP extension provides intelligent, AI-driven access to your IO Server data. Unlike standard API wrappers, it uses AI agents to interpret requests, perform reasoning, and return contextual results.

Table 85. Implementation Summary

| Item               | Description                                                                                                                                                       |
| ------------------ | ----------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| Primary Endpoint   | https://<io_server>/api/sdk/mcp                                                                                                                                    |
| Agent Intelligence | MCP endpoints function as intelligent agents that interpret and process user requests using AI reasoning. Some operations may take longer than typical API calls due to contextual processing. |
| Capabilities       | The AI extension supports schema queries, entity and metric retrieval, and exploration of basic relationships within the IO environment.                           |
| Usage Warning      | Because this feature leverages AI, always manually validate generated responses and data interpretations for accuracy before applying the results operationally.   |



Troubleshooting

Use the table below to diagnose and resolve common issues when configuring or using the MCP extension.

Note

Because AI agents perform reasoning to generate responses, occasional delays are normal. If timeouts persist, increase the timeout value or contact Virtana Support.

Table 86. Troubleshooting

| Issue            | Potential Cause                  | Resolution                                                                                    |
| ---------------- | -------------------------------- | --------------------------------------------------------------------------------------------- |
| 401 Unauthorized | Missing or invalid IO API Token | Verify that the Authorization header in the Extension settings contains a valid Bearer token. |