Web Search


Supported by: OpenAILMInvoker, XAILMInvoker

Web search is a native tool that allows the language model to search the web for relevant information. When web search is enabled, the resulting citations are stored in the outputs attribute of the LMOutput object and can be accessed via the citations property.

Web search can be enabled by passing the tool as a string, an enum, or a native tool object:

import asyncio
from gllm_inference.lm_invoker import OpenAILMInvoker
from gllm_inference.model import OpenAILM
from gllm_inference.schema import NativeTool, NativeToolType

# Option 1: as string
web_search_tool = "web_search"
# Option 2: as enum
web_search_tool = NativeToolType.WEB_SEARCH
# Option 3: as native tool object (useful for providing custom kwargs)
web_search_tool = NativeTool.web_search(**kwargs)

lm_invoker = OpenAILMInvoker(OpenAILM.GPT_5_NANO, tools=[web_search_tool])

Let's try using it to find information about a movie that came out after the LM's cutoff date!
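
The snippet below is a minimal sketch of that call. It reuses the lm_invoker built above and assumes an asynchronous invoke method; the prompt is only a placeholder, and the citation access follows the citations property described earlier.

async def main():
    # Ask about something past the model's knowledge cutoff so it has to
    # rely on web search results (placeholder prompt).
    result = await lm_invoker.invoke(
        "Give me a short summary of the most recently released blockbuster movie."
    )
    print(result)

    # Web search citations are collected in the outputs and exposed
    # through the citations property.
    for citation in result.citations:
        print(citation)

asyncio.run(main())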

Output:
