Audience: Developers

Customization

There are two ways to customize GL Open DeepResearch capabilities:

1. Create New Research Profiles

What it is: Create custom profiles with different configurations (models, timeouts, depth, tools) using existing research engines.

When to use: When you want to customize behavior without changing the underlying research engine. For example:

  • Use a different LLM model (e.g., Claude instead of GPT-4)

  • Adjust research depth or timeout settings

  • Configure different tool sets

  • Create profiles optimized for specific use cases

How: Use the Profiles API to create, update, and manage profiles. Profiles are stored in the database and can be referenced by name in research requests.
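
As a minimal sketch, the snippet below creates a profile over HTTP and then references it by name in a research request. The base URL, endpoint paths, and field names (`model`, `timeout_seconds`, `max_depth`, `tools`) are illustrative assumptions, not the exact Profiles API schema; check the Profiles API reference for the real request format.

```python
import requests

# Hypothetical base URL -- replace with your GL Open DeepResearch deployment.
BASE_URL = "http://localhost:8000"

# Example profile that swaps the LLM and tightens limits.
# Field names below are assumptions for illustration only.
profile = {
    "name": "claude-fast",
    "provider": "tongyi",          # which research engine the profile targets
    "model": "claude-sonnet",      # use Claude instead of GPT-4
    "timeout_seconds": 300,        # shorter research timeout
    "max_depth": 2,                # shallower research depth
    "tools": ["web_search"],       # restricted tool set
}

# Create the profile via the (assumed) Profiles API endpoint.
resp = requests.post(f"{BASE_URL}/profiles", json=profile)
resp.raise_for_status()

# Later, reference the stored profile by name in a research request.
request = {"query": "Open-source deep research agents", "profile": "claude-fast"}
resp = requests.post(f"{BASE_URL}/research", json=request)
print(resp.json())
```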

See: Research profiles for detailed guide and examples.


2. Integrate New Open Source Deep Research Providers

What it is: Add support for new open-source deep research engines by implementing the adapter protocol.

When to use: When you want to use a different research engine that isn't currently supported (e.g., a new open-source deep research library).

Current providers: GL Open DeepResearch currently supports two open-source deep research engines:

  • Tongyi Deep Research — Multi-turn ReAct agent

  • GPT-Researcher — Automated research workflow

How: Implement the OrchestratorAdapter protocol and register the adapter in the Orchestrator Registry. See Core components and Core design principles for details.
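
For orientation, here is a sketch of what an adapter and its registration might look like. The method name, signature, and the registry shown as a simple name-to-adapter mapping are assumptions for illustration; the actual OrchestratorAdapter protocol and Orchestrator Registry interface are defined in Core components.

```python
from typing import Any, Protocol


class OrchestratorAdapter(Protocol):
    """Illustrative shape of an adapter; not the exact protocol definition."""

    def run_research(self, query: str, profile: dict[str, Any]) -> dict[str, Any]:
        """Execute a research run for the query using the given profile settings."""
        ...


class MyEngineAdapter:
    """Adapter wrapping a hypothetical new open-source research engine."""

    def run_research(self, query: str, profile: dict[str, Any]) -> dict[str, Any]:
        # Translate the profile into the engine's own configuration, run the
        # engine, and normalize its output into a shared result format.
        report = f"Report for: {query}"  # placeholder for the real engine call
        return {"query": query, "report": report, "sources": []}


# Registration sketch: a name -> adapter mapping standing in for the
# Orchestrator Registry, so profiles can select the engine by provider name.
ORCHESTRATOR_REGISTRY: dict[str, OrchestratorAdapter] = {}


def register_adapter(name: str, adapter: OrchestratorAdapter) -> None:
    ORCHESTRATOR_REGISTRY[name] = adapter


register_adapter("my-engine", MyEngineAdapter())
```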

See: Available Open Source Deep Research for provider details and Tongyi / GPT-Researcher for implementation examples.


Quick Comparison

| Aspect | Research Profiles | New Providers |
| --- | --- | --- |
| Complexity | Low: API-based configuration | High: requires code changes |
| Use case | Customize existing engines | Add new research engines |
| Storage | Database (via API) | Code (adapter implementation) |
| Examples | Different models, timeouts, tools | Tongyi, GPT-Researcher |

For most use cases, creating new profiles is the recommended approach. Only integrate new providers when you need a research engine that isn't currently supported.