
NeuroClientOptions

optional apiKey?: string;

OpenAI API key. Node.js only. Constructing a client with apiKey in a browser throws, to prevent leaking secrets. In the browser, use tokenProvider or proxyUrl instead.


optional proxyUrl?: string;

URL of a backend you control that proxies requests to OpenAI. The SDK POSTs { functionId, prompt, args, instanceData, signatureHint, model } and expects the LLM result back as JSON.
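The documented payload can be sketched as a TypeScript type with a runtime guard your proxy can run before forwarding to OpenAI. This is a sketch: only the field names come from this page; their exact types (and the type of instanceData) are assumptions.

```typescript
// Shape of the JSON body the SDK POSTs to your proxy (field names as documented above;
// the individual field types are assumptions).
interface ProxyRequestBody {
  functionId: string;
  prompt: string;
  args: unknown[];
  instanceData: unknown;
  signatureHint: string;
  model: string;
}

// Minimal runtime guard to validate an incoming request body before forwarding.
// instanceData is intentionally left unchecked, since its shape is SDK-internal.
function isProxyRequestBody(value: unknown): value is ProxyRequestBody {
  if (typeof value !== "object" || value === null) return false;
  const v = value as Record<string, unknown>;
  return (
    typeof v.functionId === "string" &&
    typeof v.prompt === "string" &&
    Array.isArray(v.args) &&
    typeof v.signatureHint === "string" &&
    typeof v.model === "string"
  );
}
```

Your handler would validate the body with this guard, attach your server-side API key, call OpenAI, and return the LLM result as JSON.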


optional tokenProvider?: TokenProvider;

Async function that returns a short-lived (ephemeral) API key. Called once per request, so cache and refresh tokens in your implementation as needed. A browser-safe alternative to apiKey.


optional model?: string;

Default chat model; can be overridden per call.


optional baseURL?: string;

Custom base URL for OpenAI-compatible endpoints.


optional temperature?: number;

Sampling temperature (default 0.2, favoring near-deterministic simulation).


optional maxTokens?: number;

Max output tokens (default 1024).


optional fetchOptions?: CustomFetchOptions;

Extra fetch options for proxyUrl mode.


optional dangerouslyAllowBrowser?: boolean;

When true, allows apiKey to be used in the browser. ⚠️ DANGEROUS: the key is exposed to anyone who can inspect the page. Prefer tokenProvider or proxyUrl.
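Putting the options together, a browser-safe configuration and a server-side one might look like this. Only the option names come from this page; the proxy URL, model name, and placeholder key are hypothetical examples, and the interface below is a local mirror for illustration rather than the SDK's actual export.

```typescript
// Local mirror of the options documented above (illustration only).
interface NeuroClientOptions {
  apiKey?: string;
  proxyUrl?: string;
  tokenProvider?: () => Promise<string>;
  model?: string;
  baseURL?: string;
  temperature?: number;
  maxTokens?: number;
  dangerouslyAllowBrowser?: boolean;
}

// Browser-safe: no key in the client; every request goes through your backend.
const browserOptions: NeuroClientOptions = {
  proxyUrl: "https://api.example.com/neuro-proxy", // hypothetical endpoint
  model: "gpt-4o-mini", // example model name; overridable per call
  temperature: 0.2,
  maxTokens: 1024,
};

// Node.js: a server-side key is acceptable. Placeholder shown here;
// load the real key from an environment variable or secret manager.
const serverOptions: NeuroClientOptions = {
  apiKey: "sk-placeholder",
  temperature: 0.2,
};
```

The key design point: exactly one of apiKey (Node.js), tokenProvider, or proxyUrl should supply credentials, and only the first of these ever embeds a long-lived secret.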