Virtana’s Model Context Protocol Server

In today’s AI-driven world, teams expect to engage with their infrastructure as naturally as they interact with modern AI assistants. As environments grow more distributed, multi-cloud, and complex, traditional dashboards and search-based interfaces often become limiting.

To bridge this gap, Virtana introduces the Virtana MCP Server, an extension to the Hybrid Observability platform that integrates a native Model Context Protocol (MCP)–compatible server into your existing ecosystem.

MCP Clients are chat-based interfaces that connect to remote MCP servers over streamable HTTP, allowing users to access data, perform actions, and receive responses through natural conversational queries instead of navigating complex tools. Together, the Virtana MCP Server and MCP Clients enable AI agents and assistants to retrieve alerts, entities, relationships, and metrics from Virtana-monitored environments, delivering clean, contextual, and human-friendly results.

Note

The examples in this guide were tested with the Goose client, which is an MCP-compatible chat client, and the OpenAI GPT5.1 model. Goose connects to Virtana’s Alerts MCP server to provide real-time alert data through interactive, natural-language queries. However, you can use other compatible clients and models in your environment.

Prerequisites

Ensure that you have the following:

  • Virtana account with API access enabled.

  • API authentication token. For more information, see Get Bearer Token.

  • Admin privileges to install software on your system.

What is MCP

Model Context Protocol (MCP) is an open standard that enables AI agents to interact with external platforms and tools in a structured, consistent way.

How MCP Works

The Model Context Protocol enables AI agents to interact with external tools and data through a standardized client-server architecture over streamable HTTP. MCP acts as a secure bridge, allowing large language models (LLMs) to discover, call, and execute tools provided by MCP servers without custom integrations.
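Under the hood, each MCP interaction is a JSON-RPC 2.0 message posted to the server's HTTP endpoint. The sketch below builds (but does not send) two typical requests. The endpoint URL comes from the configuration later in this guide; the tool name `query_alerts` and its arguments are purely illustrative assumptions, not documented Virtana tool names.

```python
import json

# MCP messages are JSON-RPC 2.0 objects sent as HTTP POST bodies.
# The URL comes from this guide; "query_alerts" is a hypothetical tool name.
MCP_URL = "https://app.cloud.virtana.com/mcp"

def jsonrpc_request(method, params, req_id=1):
    """Build a JSON-RPC 2.0 request envelope for an MCP call."""
    return {
        "jsonrpc": "2.0",
        "id": req_id,
        "method": method,
        "params": params,
    }

# 1. Discover the tools the server exposes.
list_tools = jsonrpc_request("tools/list", {})

# 2. Call one of the discovered tools (name and arguments are illustrative).
call_tool = jsonrpc_request(
    "tools/call",
    {"name": "query_alerts", "arguments": {"window": "24h"}},
    req_id=2,
)

print(json.dumps(list_tools, indent=2))
```

The LLM never sees this wire format directly; the MCP client translates the model's tool-call decisions into these envelopes and relays the structured results back.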

Key Components

  • MCP Client: Client that sends the user’s natural language query to the LLM. Examples include Goose or other chat clients.

  • LLM: Large Language Model, such as OpenAI or Claude. LLMs interpret the query and decide which tools to call from the MCP server.

  • MCP Server: Server that hosts registered tools and APIs, executes tool calls based on LLM requests, and returns structured results.

  • APIs/Tools: The actual functions or endpoints exposed by the MCP server, such as Virtana Alerts API for real-time data.

  • Response Flow: Results from the MCP server are sent back to the LLM, which synthesizes a natural-language response and returns it to the client.

    flowchart.png
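The response flow above can be sketched as a toy round trip. Every component here is mocked: no real LLM, MCP server, or Virtana API is contacted, and the tool name, arguments, and alert count are made-up illustrations.

```python
# Toy simulation of the MCP response flow: client -> LLM -> server -> LLM -> client.
# All components are mocked; nothing is sent over the network.

def mcp_server(tool_name, arguments):
    """Mock MCP server: executes a registered tool and returns a structured result."""
    tools = {"query_alerts": lambda args: {"alerts": 3, "window": args["window"]}}
    return tools[tool_name](arguments)

def llm(user_query):
    """Mock LLM: decides which tool to call, then synthesizes a reply."""
    result = mcp_server("query_alerts", {"window": "24h"})  # tool call chosen by the model
    return f"There were {result['alerts']} alerts in the last {result['window']}."

def mcp_client(user_query):
    """Mock chat client: forwards the query and returns the LLM's answer."""
    return llm(user_query)

answer = mcp_client("Summarize the alerts in the last 24 hours.")
print(answer)  # → There were 3 alerts in the last 24h.
```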

Installation and Setup

The setup process involves installing the Goose client, configuring your preferred LLM provider (such as OpenAI), and adding the Virtana MCP extension. After completing the setup, we recommend running a test chat to verify the integration.

Note

The Goose client and OpenAI LLM provider are optional components; you can choose any MCP-compatible client and LLM provider that suits your environment.

Install Goose Client

To install the Goose Client, perform the following steps:

  1. Download Goose Client v1.12.0 from: Goose Client.

    Goose Client v1.12.0 is recommended for the best compatibility with Virtana's Alerts MCP server and can be installed on different operating systems.

  2. Extract the archive if required, then run the installer executable.

  3. Select OTHER Providers, and then choose OpenAI as the provider type (LLM backend).

    welcome_to_goose.png

Configure OpenAI Provider

Note

Monitor your LLM token/cost usage during MCP interactions. Some prompts generate multiple tool calls and back-and-forth requests, which can result in higher-than-expected token consumption. Track usage through your LLM provider's dashboard.

To configure the OpenAI provider in Goose, perform the following steps:

  1. Launch Goose.

  2. In the setup or configuration screen, go to the Providers section.

  3. Enter your OpenAI API Key in the API key field.

  4. If you do not have an API key, contact your Virtana administrator or Virtana support to request one.

  5. Save the configuration to complete the provider setup.

    Configure_open_API.png

Add the Virtana MCP Extension

To add the Virtana MCP extension, perform the following steps:

  1. In the Goose client, open the left navigation and select Extensions.

  2. Click Add Custom Extension.

  3. Configure the extension:

    • Type: Streamable HTTP.

    • Endpoint / MCP URL: https://app.cloud.virtana.com/mcp

  4. Navigate to Request Headers and add the following header:

    • Header Name: Authorization

    • Header Value: Bearer <YOUR_TOKEN>. Replace <YOUR_TOKEN> with the bearer token obtained from Virtana client credentials.

  5. Optionally, give the extension a meaningful name such as Virtana Alerts MCP so users can recognize it easily.

  6. Click Add Extension or Save to keep the configuration.

    mcp_extension.png

    When successfully activated, you will see all enabled extensions.

    visible_extensions.png

When this configuration is complete, Goose treats the Virtana Alerts MCP server as an extension that exposes tools such as alert query, alert summarization, or other APIs implemented by the Virtana Alerts service.
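If you want to sanity-check the endpoint and header outside Goose, the following Python sketch assembles the same authenticated POST request the client issues. The URL and the `Authorization: Bearer <token>` header format match the configuration above; the JSON-RPC body and the placeholder token value are assumptions for illustration only.

```python
from urllib import request

# Build (but do not send) the authenticated request Goose issues when it
# connects to the Virtana MCP endpoint. The token value is a placeholder.
MCP_URL = "https://app.cloud.virtana.com/mcp"
TOKEN = "YOUR_TOKEN"  # replace with the bearer token from your Virtana client credentials

req = request.Request(
    MCP_URL,
    data=b'{"jsonrpc": "2.0", "id": 1, "method": "tools/list", "params": {}}',
    headers={
        "Authorization": f"Bearer {TOKEN}",
        "Content-Type": "application/json",
    },
    method="POST",
)

print(req.get_header("Authorization"))
# To actually test connectivity, send it with:
#   with request.urlopen(req) as resp:
#       print(resp.status, resp.read()[:200])
```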

Token Expiration Handling

The Virtana MCP server token is time-bound. If it expires, regenerate a new token from your Virtana platform and reconfigure it using the Goose settings.

When a token expires, the Goose client may fail to load the Virtana MCP extension on startup, or the extension may appear as "deactivated" in the available extensions list.

If this happens, generate a new token in the Virtana platform and update the extension's Authorization header in Goose.

deactivated.png
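For scripted access to the MCP endpoint, a rough sketch is to treat an HTTP 401 as the signal to regenerate the token. The status-code mapping below is an assumption about typical bearer-token behavior, not a documented Virtana error contract.

```python
# Hypothetical helper: map an HTTP status from the MCP endpoint to an action.
# 401 for an expired/invalid bearer token is an assumption, not a documented
# Virtana response code.

def diagnose(status_code):
    if status_code == 401:
        return "token expired or invalid: regenerate it in the Virtana platform"
    if status_code == 200:
        return "ok"
    return f"unexpected status {status_code}"

print(diagnose(401))
```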

Start Testing in Chat

To start testing the integration in chat, perform the following steps:

  1. Navigate to the Chat section in Goose.

  2. Make sure the Virtana Alerts MCP extension is enabled for the current session, alongside the OpenAI provider.

  3. Ask alert-related questions, such as:

    • “Summarize the alerts in the last 24 hours.”

    • “Which sources produced the most alerts?”

    • “What is the issue with this entity?” (for a specific entity in Virtana Global View).

  4. Observe how the MCP server responds:

    • Goose sends your query to the LLM (OpenAI).

    • The LLM uses the MCP tools exposed by the Virtana Alerts server to fetch real-time data.

    • The response is returned to the chat with contextual information derived from Virtana’s alert APIs.

Monitoring LLM Token Usage and Costs

Goose provides a built-in indicator to help you monitor LLM token consumption and approximate costs per conversation. At the bottom of the Goose window, you can hover over the token/cost icon to see a breakdown of:

token_cost.png

  • Input tokens and cost

  • Output tokens and cost

Use this view to regularly check how many tokens your current chat or recipe run is using and to identify prompts that drive unusually high usage.

Note

Virtana is not responsible for LLM/MCP client token mishandling or sudden cost spikes. Monitor token usage at the session level through your LLM provider's dashboard.