Conversation

@iozfiliz

fix: forward fetch and headers options to AI SDK providers

Fixes #1296

why

When using AI SDK provider models (e.g., openai/gpt-4o-mini), custom fetch and headers options from ClientOptions are silently ignored, even though:

  • The ClientOptions TypeScript interface includes them
  • The underlying AI SDK providers support them
  • The baseURL option IS forwarded (inconsistent behavior)

This blocks important use cases:

  • Proxy authentication - Adding custom auth headers for LLM proxies
  • Request logging - Intercepting requests for monitoring/debugging
  • Custom retry logic - Application-specific retry behavior
  • Rate limiting - Custom rate limiting implementations

Users currently receive no error when these options are ignored, making this bug difficult to discover.
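
For example, a configuration like the one sketched below (field names mirror the example later in this PR; the proxy URL and header values are placeholders) would run without any error, yet only baseURL actually reached the provider before this fix:

import { Stagehand } from "@browserbasehq/stagehand";

// Sketch only: before this fix, headers and fetch were accepted by the
// ClientOptions types but never forwarded to the AI SDK provider.
const stagehand = new Stagehand({
  model: {
    modelName: "openai/gpt-4o-mini",
    baseURL: "https://my-proxy.example/v1", // forwarded
    headers: { "x-proxy-auth": "proxy-token-123" }, // silently ignored
    fetch: (url: RequestInfo | URL, init?: RequestInit) => {
      console.log("LLM request:", url); // never called
      return fetch(url, init);
    },
  },
});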

what changed

Modified: packages/core/lib/v3/llm/LLMProvider.ts

  1. Added ExtendedClientOptions interface for type-safe property access
  2. Updated getAISDKLanguageModel() function signature to accept headers and fetch parameters
  3. Added these options to providerConfig object (forwarded to AI SDK provider)
  4. Updated call site in getClient() to pass the options with type assertions

Changes:

  • Added ExtendedClientOptions interface for type-safe property access (lines 20-23)
  • Function signature: Added 2 optional parameters (lines 101-107)
  • Provider config: Added headers/fetch to config type and conditionals (lines 118-132)
  • Call site: Passed new parameters with type assertions (lines 169-170)
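
Roughly, the updated helper builds one optional config object and always goes through the AI SDK creator functions. A minimal sketch of that shape, assuming a creator lookup keyed by sub-provider name (names here are illustrative, not the literal diff):

import { createOpenAI } from "@ai-sdk/openai";
import { createAnthropic } from "@ai-sdk/anthropic";

const creators = {
  openai: createOpenAI,
  anthropic: createAnthropic,
};

function getAISDKLanguageModel(
  subProvider: keyof typeof creators,
  subModelName: string,
  apiKey?: string,
  baseURL?: string,
  headers?: Record<string, string>,
  fetchFn?: typeof fetch,
) {
  // Only include fields the caller supplied; the creator functions fall back
  // to environment variables (e.g. OPENAI_API_KEY) when apiKey is omitted.
  const providerConfig: {
    apiKey?: string;
    baseURL?: string;
    headers?: Record<string, string>;
    fetch?: typeof fetch;
  } = {};
  if (apiKey) providerConfig.apiKey = apiKey;
  if (baseURL) providerConfig.baseURL = baseURL;
  if (headers) providerConfig.headers = headers;
  if (fetchFn) providerConfig.fetch = fetchFn;

  const provider = creators[subProvider](providerConfig);
  return provider(subModelName);
}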

Type Safety:

  • Used ExtendedClientOptions interface to avoid @typescript-eslint/no-explicit-any errors
  • Type assertion is minimal and localized to 2 lines
  • Both OpenAI and Anthropic SDKs support these properties at runtime
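
For reference, the assumed shape of that interface and the localized assertion (illustrative only; the real definition in LLMProvider.ts may differ slightly):

// Assumed shape; kept separate so the public ClientOptions type is untouched.
interface ExtendedClientOptions {
  headers?: Record<string, string>;
  fetch?: typeof fetch;
}

// Call site (sketch): the assertion is confined to the AI SDK branch.
// const opts = clientOptions as ExtendedClientOptions;
// getAISDKLanguageModel(subProvider, subModelName, apiKey, baseURL, opts.headers, opts.fetch);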

Compatibility:

  • ✅ Backward compatible (new parameters are optional)
  • ✅ No breaking changes to existing code
  • ✅ Follows same pattern as existing baseURL parameter

test plan

Manual Testing

Run the provided test example:

# Set up environment
export OPENAI_API_KEY="your-key-here"

# Build Stagehand
pnpm run build

# Run test
pnpm run example test-custom-fetch

Expected output:

✅ Custom fetch called (1 times)
   URL: https://api.openai.com/v1/responses
   Custom header: x-custom-header: test-value
   Custom header: x-custom-proxy-auth: proxy-token-123
✅ SUCCESS: Custom fetch was called 1 times

Runtime Verification ✅

Verified with actual OpenAI API call:

Initializing Stagehand with custom fetch and headers...
Making a simple LLM call via act()...

✅ Custom fetch called (1 times)
   URL: https://api.openai.com/v1/responses
   Custom header: x-custom-header: test-value
   Custom header: x-custom-proxy-auth: proxy-token-123

=== Test Results ===
✅ SUCCESS: Custom fetch was called 1 times
✅ Custom headers detected: x-custom-header: test-value, x-custom-proxy-auth: proxy-token-123

This confirms:

  • Custom fetch function is called for all LLM requests
  • Custom headers are properly forwarded
  • End-to-end flow works correctly

Test File

Added examples/test-custom-fetch.ts which:

  • Creates a custom fetch function that logs when called
  • Adds custom headers to the request
  • Verifies both fetch and headers are properly forwarded
  • Reports success/failure clearly
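
A condensed sketch of the wrapper idea behind the example (approximated here, not the exact file contents):

// Hypothetical sketch: a fetch wrapper that counts calls and checks that the
// custom headers supplied via ClientOptions made it onto the outgoing request.
let callCount = 0;

const loggingFetch: typeof fetch = async (input, init) => {
  callCount++;
  const url =
    typeof input === "string" ? input :
    input instanceof URL ? input.href : input.url;
  console.log(`✅ Custom fetch called (${callCount} times)`);
  console.log(`   URL: ${url}`);
  const headers = new Headers(init?.headers);
  for (const name of ["x-custom-header", "x-custom-proxy-auth"]) {
    const value = headers.get(name);
    if (value) console.log(`   Custom header: ${name}: ${value}`);
  }
  return fetch(input, init);
};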

Real-World Use Case

This fix enables our production LLM proxy integration:

// Now works correctly with this fix
const stagehand = new Stagehand({
  model: {
    modelName: "openai/gpt-4o-mini",
    baseURL: "https://my-proxy.com/v1",
    fetch: async (url, options) => {
      // Inject authentication token
      const headers = new Headers(options?.headers);
      headers.set('X-LLM-Request-Token', await getProxyToken());
      return fetch(url, { ...options, headers });
    }
  }
});

Code Quality

  • ✅ Follows existing code style (minimal comments, matches patterns)
  • ✅ Type-safe (proper TypeScript types with ExtendedClientOptions)
  • ✅ Consistent with existing parameter handling (same pattern as baseURL)
  • ✅ No additional dependencies
  • ✅ Preserves all existing functionality
  • ✅ Passes lint and build checks

Existing Tests

All existing tests continue to pass (no breaking changes).


Context

We currently use a runtime patch in production that modifies the compiled dist/index.js to work around this bug. This PR provides a proper source-code fix that:

  • Modifies TypeScript source (not compiled output)
  • Is type-safe and maintainable
  • Has been verified with runtime testing
  • Will eliminate the need for runtime patching

The fix enables LLM proxy authentication, which is critical for production deployments where all LLM requests are routed through an authenticated proxy for billing, monitoring, and security.

Happy to help test and refine this fix!


Additional Notes:

  • This is a simple bug fix (< 30 lines changed)
  • Low risk (optional parameters, follows existing patterns)
  • High value (enables important use cases for production deployments)
  • Includes changeset for CHANGELOG

@changeset-bot

changeset-bot bot commented Nov 20, 2025

🦋 Changeset detected

Latest commit: c438746

The changes in this PR will be included in the next version bump.

This PR includes changesets to release 2 packages:

  • @browserbasehq/stagehand: Patch
  • @browserbasehq/stagehand-evals: Patch


@greptile-apps
Contributor

greptile-apps bot commented Nov 20, 2025

Greptile Overview

Greptile Summary

Fixed custom fetch and headers options being silently ignored when using AI SDK provider models (e.g., openai/gpt-4o-mini). Modified getAISDKLanguageModel() to accept and forward these options to the underlying AI SDK providers, enabling proxy authentication, request logging, and custom retry logic.

Key Changes:

  • Extended function signature with headers and fetch parameters
  • Refactored to always use creator functions with optional config object (works for all users including those relying on environment variables)
  • Added ExtendedClientOptions interface for type-safe property access
  • Updated call site to pass custom options using type assertions

Impact:

  • Enables production use cases requiring LLM proxy authentication
  • Backward compatible (all new parameters are optional)
  • Consistent with existing baseURL parameter handling

Confidence Score: 5/5

  • Safe to merge - well-tested bug fix with backward compatibility
  • Simple, focused fix that addresses a clear bug. Includes test verification, follows existing patterns, maintains backward compatibility, and has been refined through follow-up commits to handle edge cases
  • No files require special attention

Important Files Changed

File Analysis

  • packages/core/lib/v3/llm/LLMProvider.ts (5/5): Added support for forwarding custom fetch and headers options to AI SDK providers, enabling proxy authentication and request interception
  • packages/core/examples/test-custom-fetch.ts (5/5): Test script that verifies custom fetch and headers are properly forwarded to AI SDK providers

Sequence Diagram

sequenceDiagram
    participant User
    participant Stagehand
    participant LLMProvider
    participant getAISDKLanguageModel
    participant AISDKCreator
    participant AIProvider

    User->>Stagehand: new Stagehand({model: {modelName, fetch, headers}})
    User->>Stagehand: act() / extract() / observe()
    Stagehand->>LLMProvider: getClient(modelName, clientOptions)
    
    alt modelName contains "/"
        LLMProvider->>LLMProvider: Parse subProvider and subModelName
        LLMProvider->>getAISDKLanguageModel: call(subProvider, subModelName, apiKey, baseURL, headers, fetch)
        
        getAISDKLanguageModel->>getAISDKLanguageModel: Build providerConfig object
        Note over getAISDKLanguageModel: Add optional fields:<br/>apiKey, baseURL, headers, fetch
        
        getAISDKLanguageModel->>AISDKCreator: creator(providerConfig)
        Note over getAISDKLanguageModel: Type assertion: providerConfig as {apiKey: string}
        
        AISDKCreator->>AIProvider: Initialize with config
        AIProvider-->>getAISDKLanguageModel: provider instance
        
        getAISDKLanguageModel->>AIProvider: provider(subModelName)
        AIProvider-->>getAISDKLanguageModel: languageModel
        
        getAISDKLanguageModel-->>LLMProvider: languageModel
        LLMProvider->>LLMProvider: new AISdkClient({model: languageModel})
    else predefined model
        LLMProvider->>LLMProvider: Create provider-specific client
    end
    
    LLMProvider-->>Stagehand: LLMClient
    
    Note over Stagehand,AIProvider: When AI SDK makes API requests,<br/>custom fetch and headers are used

@greptile-apps greptile-apps bot left a comment

Additional Comments (1)

  1. packages/core/lib/v3/llm/LLMProvider.ts, lines 136-145

    logic: the else path (when no apiKey is provided) doesn't receive headers or fetch options; users without explicit apiKeys will have their options silently ignored

2 files reviewed, 1 comment


@iozfiliz force-pushed the fix/forward-fetch-headers branch from 6e684de to 8384272 on November 20, 2025 at 19:27:
Updated getAISDKLanguageModel() to always use creator functions with
optional config object. This ensures custom fetch/headers work for
ALL users, including those relying on environment variables.

Changes:
- Removed if/else branching (addresses bot feedback)
- Build provider config with optional fields only
- Creator functions automatically use env vars when apiKey not provided
- Custom fetch/headers now forwarded in all scenarios

Testing:
- Verified with real website without explicit apiKey
- Custom fetch called successfully
- All custom headers forwarded correctly
- Environment variable fallback works as expected

Fixes browserbase#1296
@iozfiliz
Author

@greptileai

@greptile-apps greptile-apps bot left a comment

2 files reviewed, no comments
