MCP Reporting Server overview

The OpenLM MCP Reporting Server brings your software license management data directly into your preferred AI assistant. Built on the open Model Context Protocol (MCP) standard, the server lets you query tenant-specific reporting data using natural language.

Instead of manually running reports or building filters in a business intelligence (BI) tool, you ask your AI assistant a question in plain language. The MCP server translates your question into a structured query against your OpenLM database through GraphQL, retrieves the data, and returns the results in your conversation.

How it works

The MCP server acts as a bridge between your AI assistant and your OpenLM reporting database. The following steps describe the flow:

  1. You ask a question. Type a natural language query in your AI assistant, such as "Show me the top 10 most-used license features this month."
  2. The AI sends a request to the MCP server. Your AI client forwards your query to the OpenLM MCP server using the MCP standard protocol.
  3. The MCP server queries your data. The server translates your request into a GraphQL API call against your tenant-specific OpenLM reporting database.
  4. Results return to you. The data flows back through the MCP server to your AI assistant, which formats it as text, tables, charts, or dashboards.
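Under the hood, steps 2 and 3 ride on MCP's JSON-RPC 2.0 message format. The following is a minimal sketch of the kind of `tools/call` request an MCP client could send; the tool name `run_report` and its `query` argument are hypothetical placeholders, since the actual tools exposed by the OpenLM MCP server may be named differently:

```python
import json

def build_tool_call(question: str, request_id: int = 1) -> str:
    """Build a JSON-RPC 2.0 tools/call request wrapping a natural-language query."""
    payload = {
        "jsonrpc": "2.0",           # MCP is built on JSON-RPC 2.0
        "id": request_id,
        "method": "tools/call",     # standard MCP method for invoking a server tool
        "params": {
            "name": "run_report",   # hypothetical tool name for illustration
            "arguments": {"query": question},
        },
    }
    return json.dumps(payload)

request = build_tool_call("Show me the top 10 most-used license features this month")
print(request)
```

Your AI client constructs and sends messages like this for you; you never write JSON-RPC by hand when using the conversational flow described above.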

Key capabilities

With the OpenLM MCP Reporting Server connected, you can:

  • Query license usage data. Ask about license consumption, feature utilization, peak usage times, and trends across your organization.
  • Generate reports and summaries. Request executive summaries, detailed breakdowns, or custom reports on demand without navigating the OpenLM dashboard.
  • Get license optimization suggestions. Receive AI-powered recommendations for unused licenses, underutilized seats, and cost-saving opportunities.
  • Create visual dashboards. Higher-tier AI subscriptions can generate interactive HTML dashboards, charts, and graphs directly in the conversation.
  • Ask follow-up questions. Refine your queries conversationally. Start broad and drill down into specifics without starting over.

Supported AI clients

The OpenLM MCP Reporting Server works with a wide range of AI platforms. You can connect through any of the following clients:

| Client | Type | Notes |
| --- | --- | --- |
| Claude Desktop | Desktop app | Requires Claude Pro subscription or higher |
| Claude.ai | Web app | Requires Claude Pro subscription or higher |
| Claude Code | CLI | Terminal-based access through the npx command |
| Cursor | Desktop app | IDE with built-in MCP support |
| Windsurf | Desktop app | IDE with MCP Marketplace integration |
| ChatGPT | Web app | Requires ChatGPT Plus subscription; Developer Mode required |
| Gemini CLI | CLI | Terminal-based access; OAuth authentication |
| LibreChat | Web app | Open-source; self-hosted; Streamable HTTP transport |
Warning: Connecting the MCP server to an AI platform requires a paid subscription, such as Claude Pro, ChatGPT Plus, or equivalent. Free-tier plans do not support adding custom MCP connectors.
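Some desktop clients register MCP servers through a local JSON configuration file, and remote Streamable HTTP servers are commonly bridged through the `mcp-remote` npm package. The following is a hedged sketch only: the server name `openlm-reporting` is arbitrary, and your client's config file location and exact schema may differ, so consult your client's MCP documentation.

```json
{
  "mcpServers": {
    "openlm-reporting": {
      "command": "npx",
      "args": ["-y", "mcp-remote", "https://cloud-us.openlm.com/mcp"]
    }
  }
}
```

Replace the URL with the one for your region, as listed under Deployed environments.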

Deployed environments

The OpenLM MCP Reporting Server is available across the following environments. Use the URL that matches your region:

| Environment | MCP server URL |
| --- | --- |
| Prod US | https://cloud-us.openlm.com/mcp |
| Prod EU | https://cloud-eu.openlm.com/mcp |
Note: Most customers use either the Prod US or the Prod EU URL. Select the environment that corresponds to where your OpenLM tenant is hosted. The QA and Dev environments are for internal testing only.
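If you use Claude Code, the terminal-based route from the client table above can register a remote server directly. This is a sketch assuming the `claude mcp add` command syntax at the time of writing; the server name `openlm` is an arbitrary label you choose:

```
# Register the OpenLM MCP Reporting Server (US region) over HTTP transport
claude mcp add --transport http openlm https://cloud-us.openlm.com/mcp

# List configured servers to verify the connection
claude mcp list
```

On first use, the OAuth flow described under Authentication below prompts you to sign in with your OpenLM credentials.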

Authentication

The MCP server uses OAuth authentication. When you connect your AI client for the first time, you are redirected to the OpenLM login page in your browser. After you sign in with your OpenLM credentials, the MCP server establishes a secure, tenant-specific session. All subsequent queries are scoped to your organization's data.

Output depends on your AI subscription tier

The type of output you receive depends on the subscription tier of the AI client you use:

| Subscription tier | Capabilities |
| --- | --- |
| Basic paid plans | Text-based responses, tables, and data summaries |
| Higher-tier plans, such as Pro and Plus | Interactive HTML dashboards, charts, graphs, and executive-level reports |

Data accuracy and validation

AI models can occasionally produce inaccurate results. While the MCP server provides your real reporting data, the AI assistant's interpretation and presentation of that data might contain errors.

Warning: Verify AI-generated insights, especially license optimization recommendations, against your OpenLM Platform BI reports and dashboards before making business decisions.

Next steps