Profiles

Audience: Developers

Research Profiles

Overview

Research profiles provide a convenient way to pre-configure provider settings and parameters for different research scenarios. Profiles are stored in the database and managed via the Profiles API (create, read, update, delete). The profile format is JSON in all API requests and responses. You must specify a profile name in each research request; the system loads that profile from the database and uses its provider and parameters for the run.

On first-time deployment, database migrations create the profiles table and seed four default profiles: INTERNAL, ESSENTIAL, COMPREHENSIVE, and MOCK. The default profile (e.g. ESSENTIAL) is configured via environment variable and is used when a request does not specify a profile name (where applicable). You can add more profiles at any time via the Profiles API (master API key required).


User Guide

This section is for users who want to use existing profiles in their research requests.

What are Profiles?

A profile is a named configuration that includes:

  • Provider: Which research provider to use (tongyi or gptr)

  • Parameters: Provider-specific settings (LLM models, timeouts, research depth, etc.)

  • Description: Human-readable description of the profile's purpose

Benefits

  • Simplified Requests: Use a profile name instead of specifying all parameters

  • Consistent Configuration: Ensure consistent settings across requests

  • Easy Switching: Switch between different research configurations easily

List Available Profiles

Profiles are read from the database. To list all available profiles, use the Profiles API (master API key required). See API Contract — Profiles for request/response format.

Using a Profile with the SDK

Use the GL Open DeepResearch SDK (gl-odr-sdk) for research requests. Include the profile name in each request.

Create a task group
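A minimal sketch of creating a task group. Only the request-building helper runs here; the SDK client call is shown as a comment because the exact package name, client constructor, and `create()` signature are assumptions based on this guide, not a confirmed API. Check the gl-odr-sdk reference for the real shapes.

```python
# Sketch: create a task group with gl-odr-sdk. Only the request-building
# helper executes; the client call below it is hypothetical.

def build_taskgroup_request(queries, profile):
    """A task group runs one research task per query, all under one profile."""
    return {"queries": list(queries), "profile": profile}

payload = build_taskgroup_request(
    queries=[
        "What are the latest trends in grid-scale battery storage?",
        "How do operators forecast peak electricity demand?",
    ],
    profile="ESSENTIAL",  # must match a profile name stored in the database
)

# Roughly equivalent SDK call (hypothetical signature):
#   client = gl_odr_sdk.Client(api_key="YOUR_API_KEY")
#   group = client.taskgroups.create(**payload)
```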

Create a single task
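The single-task variant carries one query instead of a list. Again, only the request body is built and checked locally; the client call is a comment because the exact SDK signature is an assumption.

```python
# Sketch: create a single research task. The request body is built locally;
# the SDK call is hypothetical.

def build_task_request(query, profile):
    return {"query": query, "profile": profile}

payload = build_task_request(
    query="Summarize recent advances in solid-state batteries",
    profile="COMPREHENSIVE",  # seeded deep-research profile
)

# Hypothetical SDK call:
#   task = client.tasks.create(**payload)
```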

Examples

Example 1: Use an existing profile

Use a profile that is already in the database (e.g. one of the seeded defaults: INTERNAL, ESSENTIAL, COMPREHENSIVE, MOCK). No setup required beyond a valid API key.

See Quick Start Guide.

Example 2: Create a new profile and use it

Create a new profile via the Profiles API (master API key required); see API Contract — Create Profile. Then run research with that profile using the SDK.

Step 1 — Create the profile

Use the Profiles API; see API Contract — Create Profile for the endpoint and request body. Example (create the profile via HTTP):
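A sketch using Python's standard library. The `POST /v1/profiles` endpoint comes from this guide; the host name, the `Authorization` header scheme, and the profile name/params shown are assumptions for illustration. See API Contract — Create Profile for the authoritative request shape.

```python
# Sketch: create a profile over HTTP. The request is constructed and
# inspected but not sent; host, auth header scheme, and example values
# are assumptions.
import json
import urllib.request

profile = {
    "name": "FAST_TONGYI",  # hypothetical profile name
    "provider": "tongyi",
    "description": "Quick Tongyi runs for smoke tests",
    "params": {"model": "alibaba/tongyi-deepresearch-30b-a3b"},
}

req = urllib.request.Request(
    "https://your-deployment.example.com/v1/profiles",
    data=json.dumps(profile).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        "Authorization": "Bearer YOUR_MASTER_API_KEY",  # assumed scheme
    },
    method="POST",
)
# urllib.request.urlopen(req) would send it; not executed here.
```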

Step 2 — Run research with the new profile (SDK)
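A sketch of Step 2, reusing the hypothetical profile name from Step 1. The client construction and `create()` signature are assumptions based on this guide's `client.tasks.create(query=..., profile=...)` description.

```python
# Step 2 sketch: run research with the newly created profile. Only the
# request body runs here; the SDK call is hypothetical.

def build_task_request(query, profile):
    # The server loads the named profile from the database and takes the
    # provider and params from it.
    return {"query": query, "profile": profile}

payload = build_task_request(
    query="Compare open-source deep-research frameworks",
    profile="FAST_TONGYI",  # hypothetical name used in Step 1
)

# Hypothetical SDK call:
#   client = gl_odr_sdk.Client(api_key="YOUR_API_KEY")
#   task = client.tasks.create(**payload)
```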

For more profile options and the full API contract, see Profiles API, Tasks API, Taskgroups API, and the Developer Guide below.

API Endpoints

List Profiles

Endpoint: GET /v1/profiles

Response: List of all available profiles with their configurations
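A stdlib sketch of calling this endpoint. The host name and `Authorization` header scheme are assumptions; a master API key is required.

```python
# Sketch: list profiles via GET /v1/profiles. The request is built but not
# sent; host and auth scheme are assumptions.
import json
import urllib.request

req = urllib.request.Request(
    "https://your-deployment.example.com/v1/profiles",
    headers={"Authorization": "Bearer YOUR_MASTER_API_KEY"},  # assumed scheme
)
# profiles = json.load(urllib.request.urlopen(req))  # not executed here
```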

Run research with a profile

Use the GL Open DeepResearch SDK (gl-odr-sdk): client.tasks.create(...) or client.taskgroups.create(...) with query and profile. For HTTP endpoint details, see Tasks API and Taskgroups API.


Developer Guide

This section is for developers who need to maintain, create, or modify profiles.

Where Profiles Are Stored

Profiles are stored in the database (PostgreSQL). They are not read from YAML or config files at runtime. The application loads profiles via the ProfileService from the database; the ConfigResolver and ProfileRegistry use this data for every research request.

Default Profiles on First Deploy

When you deploy GL Open DeepResearch for the first time, run the database migrations. A data migration seeds the four default profiles (INTERNAL, ESSENTIAL, COMPREHENSIVE, and MOCK) into the database. The three research profiles are configured as follows:

  • INTERNAL (provider: tongyi): Tongyi Deep Research via OpenRouter; model alibaba/tongyi-deepresearch-30b-a3b; requires OPENROUTER_API_KEY.

  • ESSENTIAL (provider: gptr): GPT-Researcher with Smart Search; quick research report.

  • COMPREHENSIVE (provider: gptr): GPT-Researcher with Smart Search; deep research (breadth/depth/concurrency).

The default profile name (e.g. ESSENTIAL) is set via the DEFAULT_PROFILE environment variable. At least one profile must exist in the database for the application to start.

Profile Structure

Each profile is represented as JSON in the database and in all API requests and responses. A profile object contains:

  • name: Unique identifier for the profile (string)

  • provider: Provider type, "tongyi" or "gptr" (string)

  • params: Object of provider-specific parameters (JSON object)

  • description: Optional description (string)
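Putting the four fields together, a profile object looks like the sketch below. The field names come from the list above; the params content is illustrative (the "report_type" value is an assumption), so see Provider-Specific Parameters below for the real shapes.

```json
{
  "name": "ESSENTIAL",
  "provider": "gptr",
  "params": { "report_type": "research_report" },
  "description": "GPT-Researcher with Smart Search; quick research report."
}
```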

Inline MCP injection (allow_inject_mcp_server)

Research requests may send an mcp_config object whose keys are either registered MCP config UUIDs or custom server names (inline MCP). For security, inline keys (non-UUID) are allowed only when params.allow_inject_mcp_server is boolean true on the profile. If that flag is missing or false, a request that includes inline MCP keys is rejected with a validation error. UUID keys (references to tools registered via the Tools API) are still allowed. Set allow_inject_mcp_server to true only on profiles where you intentionally allow callers to supply arbitrary MCP endpoints in the request body.
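The rule above can be sketched as a small validator: UUID keys always pass, while non-UUID (inline) keys require the flag. The function name and error text are illustrative, not the service's actual code.

```python
# Sketch of the inline-MCP rule: non-UUID keys in mcp_config are rejected
# unless the profile sets params.allow_inject_mcp_server to boolean true.
import uuid

def validate_mcp_config(mcp_config, profile_params):
    allow_inline = profile_params.get("allow_inject_mcp_server") is True
    for key in mcp_config:
        try:
            uuid.UUID(key)        # UUID key: registered MCP tool, always allowed
        except ValueError:
            if not allow_inline:  # inline (custom-named) server: needs the flag
                raise ValueError(f"inline MCP server {key!r} not allowed")

# UUID keys pass regardless of the flag:
validate_mcp_config({"7f9c2a4e-1b3d-4c5e-8f6a-0123456789ab": {}}, {})
```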

Provider-Specific Parameters

When creating or updating profiles via the API, send a JSON body with the structure below. The seeded INTERNAL, ESSENTIAL, and COMPREHENSIVE profiles follow these shapes.

Tongyi profile (JSON)

Example request body for provider tongyi:
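A sketch of such a body. The parameter names are taken from the environment-variable fallback list later in this guide; the values are illustrative, not recommended defaults.

```json
{
  "name": "INTERNAL",
  "provider": "tongyi",
  "description": "Tongyi Deep Research via OpenRouter",
  "params": {
    "model": "alibaba/tongyi-deepresearch-30b-a3b",
    "max_tokens": 8192,
    "max_retries": 3,
    "temperature": 0.7,
    "top_p": 0.9,
    "presence_penalty": 0.0
  }
}
```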

base_url and api_key are resolved from the profile's backend setting. Use OPENROUTER_API_KEY for OpenRouter or LLM_BASE_URL/LLM_API_KEY for self-hosted.

GPT-Researcher profile (JSON)

Example for a quick research profile (ESSENTIAL style):
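A sketch of a quick profile. The "report_type" value and the config_data keys shown are assumptions modeled on GPT-Researcher's own config names; check the Profiles API contract for the authoritative shape.

```json
{
  "name": "ESSENTIAL",
  "provider": "gptr",
  "description": "GPT-Researcher with Smart Search; quick research report",
  "params": {
    "report_type": "research_report",
    "config_data": {
      "SMART_LLM": "openai:gpt-4o",
      "RETRIEVER": "tavily"
    }
  }
}
```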

For a deep research profile (COMPREHENSIVE style), use "report_type": "deep" and add to config_data: DEEP_RESEARCH_BREADTH, DEEP_RESEARCH_DEPTH, DEEP_RESEARCH_CONCURRENCY.
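The deep-research additions alone would look like this; the keys come from the sentence above, while the numeric values are purely illustrative.

```json
{
  "report_type": "deep",
  "config_data": {
    "DEEP_RESEARCH_BREADTH": 4,
    "DEEP_RESEARCH_DEPTH": 2,
    "DEEP_RESEARCH_CONCURRENCY": 4
  }
}
```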

Environment Variable Fallback

Profile parameters can reference environment variables. If a parameter is empty or missing, the system falls back to environment variables:

Tongyi Environment Variables:

  • MODEL_NAME → model

  • MAX_TOKENS → max_tokens

  • MAX_RETRIES → max_retries

  • TEMPERATURE → temperature

  • TOP_P → top_p

  • PRESENCE_PENALTY → presence_penalty

  • PLANNING_PORT → planning_port

GPT-Researcher Environment Variables:

  • OPENAI_API_KEY → Used for OpenAI API calls
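The fallback rule can be sketched as follows: an explicit, non-empty profile parameter wins; otherwise the mapped environment variable is used. The helper name and this partial mapping are illustrative, not the service's actual resolver.

```python
# Sketch of parameter-to-environment fallback for Tongyi params.
import os

ENV_FALLBACKS = {"model": "MODEL_NAME", "max_tokens": "MAX_TOKENS"}

def resolve_param(params, name):
    value = params.get(name)
    if value not in (None, ""):
        return value                  # explicit profile value wins
    env_name = ENV_FALLBACKS.get(name)
    return os.environ.get(env_name) if env_name else None

os.environ["MODEL_NAME"] = "alibaba/tongyi-deepresearch-30b-a3b"
model = resolve_param({}, "model")    # empty params: falls back to MODEL_NAME
```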

Creating New Profiles

New profiles are created via the Profiles API (POST /v1/profiles) and stored in the database. This requires a master API key. For a full workflow example (create a profile, then run research with it), see Example 2: Create a new profile and use it in the Examples section above.

For the full API contract and request/response shapes, see the Profiles API documentation.

Profile Resolution Logic

When a research request is made:

  1. Profile is required: The request must include a profile name.

  2. Load from database: The profile is loaded from the database by name (via ProfileRegistry / ProfileService).

  3. Use provider and params: The profile's provider and params determine which research adapter runs (Tongyi or GPT-Researcher) and with which settings.
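The three steps above can be sketched with a plain dict standing in for ProfileRegistry/ProfileService; the function name, error messages, and the sample params are illustrative.

```python
# Sketch of profile resolution: require a name, load by name, return
# provider + params. PROFILES stands in for the database table.

PROFILES = {
    "ESSENTIAL": {"provider": "gptr", "params": {"report_type": "research_report"}},
}

def resolve(request):
    name = request.get("profile")
    if not name:                      # 1. profile is required
        raise ValueError("research request must include a profile name")
    profile = PROFILES.get(name)      # 2. load from database by name
    if profile is None:
        raise ValueError(f"unknown profile: {name}")
    return profile["provider"], profile["params"]  # 3. provider and params

provider, params = resolve({"query": "example", "profile": "ESSENTIAL"})
```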

Best Practices

  1. Use Profiles for Common Scenarios: Create profiles in the database for frequently used configurations.

  2. Descriptive Names: Use clear, descriptive profile names (e.g. ESSENTIAL, COMPREHENSIVE).

  3. Document Parameters: Include descriptions when creating/updating profiles so users know when to use each.

  4. Environment Variables: Use environment variables for sensitive data (API keys, URLs); do not store secrets in profile params.

  5. Validate Before Deploying: Test new or updated profiles before relying on them in production.

  6. Default Profile: Ensure DEFAULT_PROFILE (env) matches a profile that exists in the database so the app can start and defaults work as expected.
