Set up your AI client
Follow the instructions for your AI client to connect it to the OpenLM MCP Reporting Server.
Before you begin
Before you set up your AI client, verify these prerequisites:
- An active OpenLM account with access to a cloud-hosted tenant (US or EU).
- A paid AI subscription, such as Claude Pro, ChatGPT Plus, or equivalent. Free-tier plans do not support custom MCP connectors.
- Your environment URL. Identify the correct MCP server URL for your region from the following table.
| Environment | MCP server URL |
|---|---|
| Prod US | https://cloud-us.openlm.com/mcp |
| Prod EU | https://cloud-eu.openlm.com/mcp |
Replace the MCP server URL in the configurations that follow with the URL that corresponds to your target environment.
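As a quick sanity check before pasting the URL into a configuration, you can build it from your region in the shell. This is a minimal sketch; the `REGION` variable is illustrative and not part of any OpenLM tooling.

```shell
# Pick your region per the table above: "us" or "eu".
REGION="us"
MCP_URL="https://cloud-${REGION}.openlm.com/mcp"
echo "$MCP_URL"
```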
Claude Desktop
Desktop app
- Download and install Claude Desktop from the official website.
- Open Claude Desktop. Select your profile icon, then select Settings → Developer → Edit Config.
- In the `claude_desktop_config.json` file that opens, paste the following JSON configuration:
```json
{
  "mcpServers": {
    "openlm-reporting": {
      "command": "npx",
      "args": [
        "-y",
        "mcp-remote@latest",
        "https://cloud-us.openlm.com/mcp",
        "9877"
      ]
    }
  }
}
```
- Save the file and restart Claude Desktop.
- You are redirected to the browser for authentication. Sign in with your OpenLM credentials.
- After successful login, the MCP tools icon appears in the chat input area. The server is ready to use.
After successful connection, you can start querying your OpenLM reporting data directly from Claude Desktop.
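A malformed `claude_desktop_config.json` is a common reason the MCP tools icon never appears after a restart. Before restarting, you can confirm that the configuration you are about to save is valid JSON with the expected top-level key. This is an illustrative check, not part of Claude Desktop itself:

```python
import json

# Paste the configuration you intend to save and verify it parses
# and contains the "mcpServers" entry Claude Desktop looks for.
config_text = """
{
  "mcpServers": {
    "openlm-reporting": {
      "command": "npx",
      "args": ["-y", "mcp-remote@latest", "https://cloud-us.openlm.com/mcp", "9877"]
    }
  }
}
"""

config = json.loads(config_text)  # raises json.JSONDecodeError on a typo
print("openlm-reporting" in config["mcpServers"])  # → True
```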
Cursor
Desktop app
- Download and install Cursor from the official website.
- Open Cursor. Go to Cursor Settings → Tools and MCP.
- Select Add a custom MCP server. In the `mcp.json` file that opens, paste the following JSON configuration:
```json
{
  "mcpServers": {
    "openlm-reporting": {
      "url": "https://cloud-us.openlm.com/mcp"
    }
  }
}
```
- After successful login, the MCP server name and the total number of available tools appear.
- Press Ctrl + L to start a new chat and begin querying.
After successful connection, you can start querying your OpenLM reporting data directly from Cursor.
Windsurf
Desktop app
- Download and install Windsurf from the official website.
- Open Windsurf. Go to Windsurf Settings → Open MCP Marketplace.
- Select Add Custom MCP. In the `mcp_config.json` file that opens, paste the following JSON configuration:
```json
{
  "mcpServers": {
    "openlm-reporting": {
      "command": "npx",
      "args": [
        "-y",
        "mcp-remote@latest",
        "https://cloud-us.openlm.com/mcp",
        "9877"
      ]
    }
  }
}
```
- After successful login, the MCP server name and tool count appear in the MCP Marketplace.
- Open the chat panel and start querying.
After successful connection, you can start querying your OpenLM reporting data directly from Windsurf.
LibreChat
Web app
- Set up LibreChat from GitHub and open it in your browser.
- On the left side panel, go to MCP Settings.
- Select the + icon to add a new MCP server.
- Provide a name for the MCP server and enter the MCP server URL for your environment.
- Select Streamable HTTP as the transport type, check I trust the application, and select Create.
- You are redirected to the browser for authentication. After successful login, the MCP server is connected.
After successful connection, you can start querying your OpenLM reporting data directly from LibreChat.
Gemini CLI
CLI
- Install Gemini CLI globally by running:

```shell
npm install -g @google/gemini-cli@latest
```
- Run `gemini` in your terminal to initialize it for the first time.
- Open the settings file at `~/.gemini/settings.json` and paste the following configuration:
```json
{
  "mcpServers": {
    "openlm-reporting": {
      "command": "npx",
      "args": [
        "-y",
        "mcp-remote@latest",
        "https://cloud-us.openlm.com/mcp",
        "9877"
      ],
      "timeout": 30000
    }
  },
  "security": {
    "auth": {
      "selectedType": "oauth-personal"
    }
  }
}
```
- Save the file and run `gemini` again. The message `openlm-reporting is connected` appears in the terminal, confirming the connection.
- Start querying tenant-specific data directly from the Gemini CLI terminal.
After successful connection, you can start querying your OpenLM reporting data directly from Gemini CLI.
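If you already have a `~/.gemini/settings.json` with other servers or settings, pasting the whole block above would overwrite them. The following is a hedged sketch of merging only the `openlm-reporting` entry into an existing file; the helper name and the relative path in the example are illustrative:

```python
import json
from pathlib import Path

def add_openlm_server(settings_path: Path) -> None:
    """Merge the openlm-reporting MCP entry into an existing settings file,
    leaving any other mcpServers entries and settings untouched."""
    settings = json.loads(settings_path.read_text()) if settings_path.exists() else {}
    settings.setdefault("mcpServers", {})["openlm-reporting"] = {
        "command": "npx",
        "args": ["-y", "mcp-remote@latest", "https://cloud-us.openlm.com/mcp", "9877"],
        "timeout": 30000,
    }
    settings_path.write_text(json.dumps(settings, indent=2))

# Example; for the real file use Path.home() / ".gemini" / "settings.json".
add_openlm_server(Path("settings.json"))
```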
Claude Code
CLI
- Open your terminal and run the following command to add the MCP server:

```shell
npx -y mcp-remote@latest https://cloud-us.openlm.com/mcp 9877
```
- After successful authentication, the MCP server is connected and ready to use.
- Start querying tenant-specific reporting data directly from Claude Code.
After successful connection, you can start querying your OpenLM reporting data directly from Claude Code.
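The trailing `9877` in the command above is a local port that `mcp-remote` uses during authentication; if another process already holds it, sign-in can stall. A quick, illustrative check that the port is free (this assumes `9877` is indeed used as a local callback port):

```python
import socket

def port_is_free(port: int) -> bool:
    """Return True if nothing is listening on 127.0.0.1:port."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        # connect_ex returns 0 only when something accepts the connection.
        return s.connect_ex(("127.0.0.1", port)) != 0

print(port_is_free(9877))
```

If the port is taken, pass a different free port in the command and the check above.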
ChatGPT
Web app
- Open ChatGPT in your browser and sign in to your account.
- Go to Settings → Apps → Create Apps.
- Provide a name for the MCP server and paste the MCP server URL for your environment.
- Select OAuth as the authorization type, then check the confirmation checkbox and select Create.
- You are redirected to the browser for authentication. After successful login, the OpenLM MCP server is connected.
- Start a new chat, select the + icon, go to More Options, and select the OpenLM MCP server to begin querying.
After successful connection, you can start querying your OpenLM reporting data directly from ChatGPT.
Note: Developer Mode must be turned on in ChatGPT settings to access the Create Apps option.
Claude.ai
Web app
- Open claude.ai in your browser and sign in to your account.
- Select your profile icon and go to Settings → Connectors.
- Select Add custom connector.
- Provide a name for the MCP server and paste the MCP server URL for your environment.
- You are redirected to the browser for authentication. After successful login, the OpenLM MCP server is connected as a custom connector.
- The MCP server is available as a connector and can be used directly in any Claude.ai conversation.
After successful connection, you can start querying your OpenLM reporting data directly from Claude.ai.