
Gemini CLI & Code Assist

This guide explains how to connect Google Gemini CLI and Gemini Code Assist to DX by automatically processing log data from your Google Cloud project and exporting it directly to DX. It describes a unified approach that handles ingestion of both Gemini CLI data (via OpenTelemetry) and Gemini Code Assist data (via logs) into DX.

Note: This guide is still in beta.

Prerequisites

  • This guide assumes basic familiarity with Google Cloud.
  • Google Cloud project: Access to the project where Gemini Code Assist and/or Gemini CLI is active.
  • Google Cloud billing enabled: While ongoing costs are low, BigQuery and Cloud Run require a billing account.
  • DX Data Cloud API key: The API token provided by DX for ingestion.
  • Gemini CLI and/or Gemini Code Assist enabled in your Google Cloud console.

Setup instructions

Step 1: Configure data sources

First, ensure that Gemini CLI and/or Gemini Code Assist are configured to send telemetry and log data back to Google Cloud.

Gemini Code Assist

By default, enterprise logging is disabled for privacy. You must explicitly enable it.

  1. Navigate to Google Cloud console > Admin for Gemini.
  2. Toggle Logging for Code Assist metadata to ENABLED.

Gemini CLI

Configure your developers’ Gemini CLI environments to point to Google Cloud for telemetry. Follow the Gemini CLI docs for instructions.
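As a rough sketch, telemetry is typically enabled per developer in `~/.gemini/settings.json`; treat the exact keys below as an assumption and verify them against the current Gemini CLI docs:

```json
{
  "telemetry": {
    "enabled": true,
    "target": "gcp"
  }
}
```

Developers will also generally need the relevant Google Cloud project set in their environment (for example via `GOOGLE_CLOUD_PROJECT`) so the telemetry lands in the project with the log sink.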

Step 2: Set up service account

Set up a dedicated service account to run this integration. Using a single service account simplifies permission management and follows Google Cloud best practices.

  1. Navigate to Google Cloud console > IAM & Admin > Service Accounts.
  2. Click Create service account.
  3. Set the Service account name to dx-gemini-export.
  4. Click Create and continue.
  5. Add the following roles to the newly created service account:
    • BigQuery Data Viewer – to read data from the log sink dataset
    • BigQuery User – to run export queries
    • Secret Manager Secret Accessor – to read the DX API key
    • Cloud Run Invoker – to allow Cloud Scheduler to trigger the function

You will use this service account in Steps 7 and 9 when configuring Cloud Run and Cloud Scheduler.
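If you prefer the command line, the same setup can be sketched with gcloud (assumes `PROJECT_ID` is set in your shell; role bindings may need adjusting to your organization's policies):

```shell
# Create the dedicated service account for the integration.
gcloud iam service-accounts create dx-gemini-export --project="$PROJECT_ID"

# Bind the four roles listed above to it.
SA="dx-gemini-export@${PROJECT_ID}.iam.gserviceaccount.com"
for role in roles/bigquery.dataViewer roles/bigquery.user \
            roles/secretmanager.secretAccessor roles/run.invoker; do
  gcloud projects add-iam-policy-binding "$PROJECT_ID" \
    --member="serviceAccount:${SA}" --role="$role"
done
```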

Step 3: Create BigQuery dataset

  1. Navigate to Google Cloud console > BigQuery > BigQuery Studio.
  2. In the Explorer panel (left side), click the three dots menu next to your project ID.
  3. Select Create dataset.
  4. Set Dataset ID to dx_gemini_observability.
  5. Set Location type to Multi-region (recommended) or select a specific Region.
  6. Leave all other settings at their defaults.
  7. Click Create dataset.
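Equivalently, as a sketch, the dataset can be created with the `bq` CLI (assumes `PROJECT_ID` is set; choose the location that matches your needs):

```shell
# Create the dataset that the log sink will write into.
bq mk --dataset --location=US "${PROJECT_ID}:dx_gemini_observability"
```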

Step 4: Create Log Router sink

  1. Navigate to Google Cloud console > Logging > Log Router and click Create sink.
  2. Name the sink: dx-gemini-sink.
  3. Set the Destination to the BigQuery dataset created earlier (dx_gemini_observability).
  4. Check Use Partitioned Tables (required for cost efficiency).
  5. Set the Inclusion Filter to the snippet below:
    logName:"cloudaicompanion" OR logName:"gemini_cli"
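As a sketch, the same sink can be created with gcloud (assumes `PROJECT_ID` is set):

```shell
# Create a partitioned-table BigQuery sink with the same inclusion filter.
gcloud logging sinks create dx-gemini-sink \
  "bigquery.googleapis.com/projects/${PROJECT_ID}/datasets/dx_gemini_observability" \
  --use-partitioned-tables \
  --log-filter='logName:"cloudaicompanion" OR logName:"gemini_cli"'
```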

Step 5: Grant write permissions

After creating the sink, you may need to manually grant the sink’s Writer Identity permission to write to BigQuery.

  1. On the Log Router page, find your sink (dx-gemini-sink).
  2. Click the three dots menu next to the sink and select View sink details.
  3. Copy the Writer identity service account email (e.g., service-123...@gcp-sa-logging.iam.gserviceaccount.com).
  4. Navigate to IAM & Admin > IAM.
  5. Click Grant Access (at the top of the page).
  6. In the New principals field, paste the service account email.
  7. In the Select a role field, choose BigQuery Data Editor.
  8. Click Save.
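From the command line, the grant can be sketched in two commands (assumes `PROJECT_ID` is set; the writer identity returned by gcloud already includes the `serviceAccount:` prefix):

```shell
# Look up the sink's writer identity, then grant it BigQuery Data Editor.
WRITER=$(gcloud logging sinks describe dx-gemini-sink --format='value(writerIdentity)')
gcloud projects add-iam-policy-binding "$PROJECT_ID" \
  --member="$WRITER" --role=roles/bigquery.dataEditor
```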

Step 6: Add DX Data Cloud API key to Secret Manager

  1. Navigate to Google Cloud console > Secret Manager.
  2. Create a new secret named dx-api-key and paste your DX Data Cloud API key as the value. API keys can be managed from the Data Cloud API settings.
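Alternatively, as a sketch from the CLI (assumes the key is in the `DX_API_KEY` shell variable; `printf` avoids storing a trailing newline in the secret):

```shell
# Store the DX API key as a Secret Manager secret.
printf '%s' "$DX_API_KEY" | gcloud secrets create dx-api-key --data-file=-
```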

Step 7: Create Cloud Run function

  1. Navigate to Google Cloud console > Cloud Run Functions and create a new Python function.
  2. Select Use an inline editor to create a function.
  3. Set Service name to dx-gemini-export.
  4. Set Runtime to Python 3.14.
  5. Select Require authentication.
  6. Expand Containers, Networking, Security.
  7. Under Security, select the dx-gemini-export service account created in Step 2.
  8. Under Variables & Secrets, add the following:
    • DX_API_URL: Your DX API URL found in the Data Cloud API settings.
    • GCP_PROJECT: Your Google Cloud Project ID.
    • DATASET_ID: dx_gemini_observability.
  9. Under Variables & Secrets, click Reference a secret and mount the dx-api-key secret as an environment variable named DX_API_KEY.
  10. Click Next to proceed to the code editor.

Step 8: Deploy function code

  1. Set Function entry point to export_gemini_metrics.
  2. In the file browser, select requirements.txt and replace its contents with:
    functions-framework==3.*
    google-cloud-bigquery>=3.10.0
    requests>=2.31.0
  3. Select main.py and replace its contents with the provided example script.
  4. Click Save and redeploy and wait for the function to deploy successfully.
  5. Once deployed, copy the Function URL from the function details page—you will need this in the next step.
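The example script itself is provided by DX, but as a hypothetical sketch of the shape such an export handler takes (the helper names, payload format, table name, and endpoint behavior below are illustrative assumptions, not the actual DX contract):

```python
# Hypothetical sketch of an export handler -- NOT the provided DX script.
# Payload shape, table name, and field names are illustrative assumptions.
import json
import os
import urllib.request


def batch(rows, size=500):
    """Yield successive chunks so large exports stay under request-size limits."""
    for i in range(0, len(rows), size):
        yield rows[i:i + size]


def post_to_dx(rows):
    """POST one batch of records to the DX ingestion endpoint."""
    req = urllib.request.Request(
        os.environ["DX_API_URL"],
        data=json.dumps({"records": rows}).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {os.environ['DX_API_KEY']}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status


def export_gemini_metrics(request):
    """Entry point: read new log rows from BigQuery, ship them in batches."""
    from google.cloud import bigquery  # imported lazily; needs the Cloud Run env

    client = bigquery.Client(project=os.environ["GCP_PROJECT"])
    # "your_log_table" is a placeholder for the sink's log table name.
    query = f"SELECT * FROM `{os.environ['DATASET_ID']}.your_log_table`"
    rows = [dict(row) for row in client.query(query).result()]
    for chunk in batch(rows):
        post_to_dx(chunk)
    return f"Success: {len(rows)} records exported"
```

Batching keeps individual HTTP requests small even when the hourly window contains many log rows.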

Step 9: Configure Cloud Scheduler

  1. Navigate to Google Cloud console > Cloud Scheduler and click Create Job.
  2. Name the job: gemini-export-job.
  3. Set Frequency to 0 * * * * (Hourly) or 0 0 * * * (Daily).
  4. Set Target type to HTTP.
  5. Set the URL to the Cloud Run Function URL (copied from Step 8).
  6. Add an Auth Header by selecting Add OIDC Token.
  7. Select the dx-gemini-export service account created in Step 2.
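The same job can be sketched with gcloud (assumes `PROJECT_ID`, `REGION`, and `FUNCTION_URL` are set in your shell):

```shell
# Create an hourly HTTP job that calls the function with an OIDC token.
gcloud scheduler jobs create http gemini-export-job \
  --location="$REGION" \
  --schedule="0 * * * *" \
  --uri="$FUNCTION_URL" \
  --oidc-service-account-email="dx-gemini-export@${PROJECT_ID}.iam.gserviceaccount.com"
```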

Step 10: Test Cloud Run function

  1. Navigate to Google Cloud console > Cloud Scheduler.
  2. Find the row for gemini-export-job.
  3. Click the three dots menu and select Force run.
  4. Navigate to Google Cloud console > Cloud Run Functions.
  5. Click on the name of your function.
  6. Click the Logs tab.
  7. Look for a log entry with Severity INFO that says: Success: X records exported.
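The same check can be sketched from the CLI (assumes `REGION` is set; `gcloud logging read` filter syntax shown here is a reasonable starting point, adjust as needed):

```shell
# Force-run the job, then read recent logs from the Cloud Run service.
gcloud scheduler jobs run gemini-export-job --location="$REGION"
gcloud logging read \
  'resource.type="cloud_run_revision" AND resource.labels.service_name="dx-gemini-export"' \
  --limit=20
```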

Step 11: Verify data in DX

If Step 10 completed without errors, you can verify the import by querying the AI tool metrics data:

  1. Go to DX > Data Studio and use the following query to look for AI tool metrics records where the tool is either Gemini CLI or Gemini Code Assist:
SELECT
    batdm.tool AS tool_name,
    batdm.email,
    COUNT(batdm.id) AS metric_count,
    MIN(batdm.date) AS earliest_date,
    MAX(batdm.date) AS latest_date,
    COUNT(CASE WHEN batdm.is_active = true THEN 1 END) AS active_days
FROM bespoke_ai_tool_daily_metrics batdm
WHERE (batdm.tool ILIKE '%gemini%'
    OR batdm.tool ILIKE '%gemini cli%'
    OR batdm.tool ILIKE '%gemini code assist%')
GROUP BY batdm.tool, batdm.email
HAVING COUNT(batdm.id) > 0
ORDER BY metric_count DESC, tool_name, email
  2. If successful, the results will show active users for Gemini CLI and/or Gemini Code Assist. Note: it can take up to an hour for Gemini CLI and Gemini Code Assist metrics to start showing up in reports outside of Data Studio.

Successful metrics query