substr

Instance method on String.prototype.

Gets a substring beginning at the specified location and having the specified length.

substr(input: { string: <receiver>; from: number; length?: number; prompt?: string }): Promise<string>

The prompt field is optional. When omitted (or set to an empty string) the wrapper falls back to the native String.prototype.substr and returns a resolved Promise without contacting the LLM. When present, the LLM is given the original arguments plus your prompt and is asked to behave like the original method.

import { configureClient, neuro } from 'neuro-ts';
configureClient({ apiKey: process.env.OPENAI_API_KEY });
// Legacy substr: the spec marks it as Annex B, and the linter ignores Annex B by default.
await neuro.string.substr({ string: text, from: 0, length: 8, prompt: 'extract length characters starting at from, treating negative from as offset-from-the-end, while the spec quietly classifies this as legacy and refuses to remove it' });

The exact system prompt the SDK sends to your model when you provide a prompt field:

Generated prompt: String.prototype.substr
You are simulating the JavaScript built-in `String.prototype.substr`.
## Original signature(s)
  Overload 1: (from: number, length?: number) => string
## JSDoc
Gets a substring beginning at the specified location and having the specified length.

## How to respond
- Behave EXACTLY as the original `substr` would, but use the user's intent to choose any callback / comparator / transform logic that the original would normally accept as an argument.
- Strictly preserve the original return type and shape.
- Output ONLY the JSON-encoded return value of the function call.
- Do NOT include explanations, prose, comments, or markdown fences.
- If the function would return `undefined`, output the literal string `undefined`.
- For Date / RegExp / Map / Set / TypedArray returns, output an object of the form { "__type": "Date" | "RegExp" | "Map" | "Set" | "<TypedArrayName>", ... } so the SDK can rehydrate it.