MCP Reporting Server overview
The OpenLM MCP Reporting Server brings your software license management data directly into your preferred AI assistant. Built on the open Model Context Protocol (MCP) standard, the server lets you query tenant-specific reporting data using natural language.
Instead of manually running reports or building filters in a business intelligence (BI) tool, you ask your AI assistant a question in plain language. The MCP server translates your question into a GraphQL query against your tenant's OpenLM database, retrieves the data, and returns the results in your conversation.
How it works
The MCP server acts as a bridge between your AI assistant and your OpenLM reporting database. The following steps describe the flow:
- You ask a question. Type a natural language query in your AI assistant, such as "Show me the top 10 most-used license features this month."
- The AI sends a request to the MCP server. Your AI client forwards your query to the OpenLM MCP server using the MCP standard protocol.
- The MCP server queries your data. The server translates your request into a GraphQL API call against your tenant-specific OpenLM reporting database.
- Results return to you. The data flows back through the MCP server to your AI assistant, which formats it as text, tables, charts, or dashboards.
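The flow above can be sketched at the protocol level. MCP clients invoke server-side tools with a JSON-RPC 2.0 `tools/call` request. The snippet below is illustrative only: the tool name `query_license_usage` and its arguments are hypothetical, not OpenLM's actual API.

```python
import json

def build_tool_call(tool_name, arguments, request_id=1):
    """Construct an MCP 'tools/call' request (JSON-RPC 2.0) as a JSON string."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    })

# A request an AI client might send for "top 10 most-used features this month".
# Tool name and argument keys are hypothetical examples.
request = build_tool_call(
    "query_license_usage",
    {"metric": "feature_usage", "top": 10, "period": "this_month"},
)
print(request)
```

The server answers with a matching JSON-RPC response whose result carries the queried data, which the AI assistant then formats for you.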
Key capabilities
With the OpenLM MCP Reporting Server connected, you can:
- Query license usage data. Ask about license consumption, feature utilization, peak usage times, and trends across your organization.
- Generate reports and summaries. Request executive summaries, detailed breakdowns, or custom reports on demand without navigating the OpenLM dashboard.
- Get license optimization suggestions. Receive AI-powered recommendations for unused licenses, underutilized seats, and cost-saving opportunities.
- Create visual dashboards. On higher-tier AI subscriptions, generate interactive HTML dashboards, charts, and graphs directly in the conversation.
- Ask follow-up questions. Refine your queries conversationally. Start broad and drill down into specifics without starting over.
Supported AI clients
The OpenLM MCP Reporting Server works with a wide range of AI platforms. You can connect through any of the following clients:
| Client | Type | Notes |
|---|---|---|
| Claude Desktop | Desktop app | Requires Claude Pro subscription or higher |
| Claude.ai | Web app | Requires Claude Pro subscription or higher |
| Claude Code | CLI | Terminal-based access through npx command |
| Cursor | Desktop app | IDE with built-in MCP support |
| Windsurf | Desktop app | IDE with MCP Marketplace integration |
| ChatGPT | Web app | Requires ChatGPT Plus subscription; Developer Mode required |
| Gemini CLI | CLI | Terminal-based access; OAuth authentication |
| LibreChat | Web app | Open-source; self-hosted; Streamable HTTP transport |
Connecting the MCP server to AI platforms requires a paid subscription, such as Claude Pro, ChatGPT Plus, or equivalent. Free-tier plans do not support adding custom MCP connectors.
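For desktop clients that read a JSON configuration file, such as Claude Desktop, the connection is typically declared under `mcpServers`. The snippet below is a sketch: the server name `openlm` is arbitrary, and `mcp-remote` is one common way to bridge a remote HTTP MCP server for clients that launch local commands. Check your client's documentation for its exact configuration format.

```json
{
  "mcpServers": {
    "openlm": {
      "command": "npx",
      "args": ["mcp-remote", "https://cloud-us.openlm.com/mcp"]
    }
  }
}
```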
Deployed environments
The OpenLM MCP Reporting Server is available across the following environments. Use the URL that matches your region:
| Environment | MCP server URL |
|---|---|
| Prod US | https://cloud-us.openlm.com/mcp |
| Prod EU | https://cloud-eu.openlm.com/mcp |
Select the URL that corresponds to the region where your OpenLM tenant is hosted. OpenLM also maintains QA and Dev environments, but these are for internal testing only and are not listed here.
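As an illustration, a CLI client can register the server directly from the terminal. This sketch assumes Claude Code's `claude mcp add` command with its `--transport http` flag; substitute the EU URL if your tenant is hosted in Europe, and verify the exact syntax against your client's documentation.

```shell
# Register the OpenLM MCP server over HTTP (sketch; "openlm" is an
# arbitrary local name for the connection)
claude mcp add --transport http openlm https://cloud-us.openlm.com/mcp
```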
Authentication
The MCP server uses OAuth authentication. When you connect your AI client for the first time, you are redirected to the OpenLM login page in your browser. After you sign in with your OpenLM credentials, the MCP server establishes a secure, tenant-specific session. All subsequent queries are scoped to your organization's data.
Output depends on your AI subscription tier
The type of output you receive depends on the subscription tier of the AI client you use:
| Subscription tier | Capabilities |
|---|---|
| Basic paid plans | Text-based responses, tables, and data summaries |
| Higher-tier plans, such as Pro and Plus | Interactive HTML dashboards, charts, graphs, and executive-level reports |
Data accuracy and validation
AI models can occasionally produce inaccurate results. While the MCP server provides your real reporting data, the AI assistant's interpretation and presentation of that data might contain errors.
Verify AI-generated insights, especially license optimization recommendations, against your OpenLM Platform BI reports and dashboards before making business decisions.
Next steps
- Set up your client. Follow the step-by-step instructions for your preferred AI platform.
- Explore available tools. See the full list of supported queries and examples.
- Get help. Check the FAQ and troubleshooting guide if you run into issues.