MCP Server
Supported by: OpenAILMInvoker
What is MCP Server?
The MCP server tool is a native tool that allows the language model to call remote MCP servers, giving the model access to new capabilities. When it is enabled, MCP calls are stored in the outputs attribute of the LMOutput object and can be accessed via the mcp_calls property.
The MCP server tool can be enabled in either of the following ways:
from gllm_inference.lm_invoker import OpenAILMInvoker
from gllm_inference.model import OpenAILM
from gllm_inference.schema import NativeTool

SERVER_URL = "https://mcp.deepwiki.com/mcp"
SERVER_NAME = "deepwiki"

# Option 1: as a dictionary (additional options can be passed as extra keys)
mcp_server_tool = {"type": "mcp_server", "url": SERVER_URL, "name": SERVER_NAME}

# Option 2: as a native tool object (additional options can be passed as keyword arguments)
mcp_server_tool = NativeTool.mcp_server(url=SERVER_URL, name=SERVER_NAME)

lm_invoker = OpenAILMInvoker(OpenAILM.GPT_5_NANO, tools=[mcp_server_tool])

Let's try using the MCP server through our LM invoker!
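The sketch below shows one way an invocation might look end to end. It assumes the invoker exposes an async invoke method and that the LMOutput it returns carries the mcp_calls property described above; treat the method names and the prompt as illustrative rather than the library's exact API.

import asyncio

async def main():
    # Ask a question the model can answer by calling the remote MCP server.
    # (`invoke` is assumed here; check your installed gllm-inference version.)
    result = await lm_invoker.invoke("What does the deepwiki MCP server provide?")

    # MCP calls made during the invocation are stored in the output
    # and exposed via the mcp_calls property.
    for mcp_call in result.mcp_calls:
        print(mcp_call)

asyncio.run(main())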
Output: