Deep Research Pipeline with Google Drive Connector

Overview

This section demonstrates Pipeline orchestration with additional data sources, using Google Drive as an example.

In this setup:

  • the Pipeline controls routing and execution flow

  • Google Drive access is provided via an MCP connector

  • Deep research is invoked as an encapsulated step with additional data available during execution

See the complete code on GitHub.

You can either:

  1. Refer to the guide whenever you need an explanation or want to clarify how each part works.

  2. Follow along with each step to recreate the files yourself while learning about the components and how to integrate them.

Both options will work—choose based on whether you prefer speed or learning by doing!

Prerequisites

This example specifically requires:

  1. Completion of all setup steps listed on the Prerequisites page.

You should also be familiar with:

  1. Pipeline and orchestration

  2. Event emitting: Event Emitter

Installation

Project Setup

1. Clone the repository

2. Set UV authentication and install dependencies

Unix-based systems (Linux, macOS):

For Windows:
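The exact commands are not shown in this excerpt. A minimal sketch, assuming uv is used for dependency management and that index credentials are passed through uv's standard environment variables (the index URL below is a placeholder, not the real one):

```shell
# Sketch for Unix-based systems (Linux, macOS). The index URL is a
# placeholder; substitute the credentials from the Prerequisites page.
# On Windows, replace `export` with `set` (cmd) or `$env:NAME = "..."`
# (PowerShell).
export UV_EXTRA_INDEX_URL="https://user:token@example.com/simple"
echo "extra index configured: ${UV_EXTRA_INDEX_URL}"

# Then install the project dependencies declared in pyproject.toml:
# uv sync
```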

3. Prepare the .env file with Google Drive authentication

Get the auth token by following the OpenAI Connector Guide.

When generating the token, make sure to enable the following scopes:

  • userinfo.email

  • userinfo.profile

  • drive.readonly

Add to .env:
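The variable name below is a placeholder — use whichever name the repository's code expects for the Google Drive OAuth token:

```shell
# Hypothetical variable name; replace the value with your generated token.
GOOGLE_DRIVE_AUTH_TOKEN="<paste-your-oauth-token-here>"
```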

Implementation

In this example, Google Drive is made available to deep research through an MCP connector, while the Pipeline determines when deep research should be invoked.

The Google Drive connector extends the data available during research, but does not change the execution flow or reasoning strategy of deep research itself.
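The framework's actual class names are not shown in this excerpt, so the sketch below uses hypothetical names (`Pipeline`, `DeepResearchStep`, `MCPConnector`) purely to illustrate the shape of the integration: the pipeline owns routing and decides when deep research runs, while the connector only extends the data available to that step.

```python
from dataclasses import dataclass, field


@dataclass
class MCPConnector:
    """Hypothetical stand-in for an MCP connector configuration."""
    server: str                           # e.g. the Google Drive MCP endpoint
    auth_token: str                       # OAuth token loaded from .env
    scopes: list = field(default_factory=list)


@dataclass
class DeepResearchStep:
    """Encapsulated deep-research step; connectors extend its context only."""
    connectors: list = field(default_factory=list)

    def run(self, query: str) -> str:
        sources = [c.server for c in self.connectors]
        # A real implementation would invoke the deep research model here;
        # its reasoning strategy is unchanged by the extra data sources.
        return f"research({query!r}) with sources {sources}"


class Pipeline:
    """The pipeline controls routing: it decides *when* research is invoked."""

    def __init__(self, research: DeepResearchStep):
        self.research = research

    def run(self, query: str) -> str:
        if "research" in query.lower():   # trivial routing rule for the sketch
            return self.research.run(query)
        return f"answered directly: {query}"


drive = MCPConnector(
    server="google_drive",
    auth_token="<token-from-env>",
    scopes=["userinfo.email", "userinfo.profile", "drive.readonly"],
)
pipeline = Pipeline(DeepResearchStep(connectors=[drive]))
print(pipeline.run("Research our Q3 planning docs"))
```

The key design point mirrored here is the separation of concerns: swapping in or removing a connector changes only the research step's available context, never the pipeline's routing logic.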

Run the script

Benefits:

  • Make documents stored in Google Drive available as research context

  • Combine private documents with public information during research

  • Integrate external data sources without changing research execution logic

Next Step

  1. Combine these components into an end-to-end RAG pipeline by following the guides in Your First RAG Pipeline

  2. Explore deep researcher subclasses and features on the API reference page
