fromCharCode

A static method on String. The native method's variadic arguments are passed as the codes array.

fromCharCode(input: { codes: number[]; prompt?: string }): Promise<string>

The prompt field is optional. When it is omitted (or set to an empty string), the wrapper falls back to the native String.fromCharCode and returns an already-resolved Promise without contacting the LLM. When it is present, the LLM receives the original arguments along with your prompt and is asked to behave like the original method.
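In the no-prompt case the call reduces to the native built-in wrapped in a resolved Promise. A minimal sketch of that equivalence, using only native calls (no SDK involved):

```typescript
// Native call: each number is one UTF-16 code unit.
const native = String.fromCharCode(104, 105); // "hi"

// The wrapper's no-prompt path is behaviorally equivalent to wrapping
// the native result in an already-resolved Promise:
const fallback: Promise<string> = Promise.resolve(String.fromCharCode(104, 105));

console.log(native); // "hi"
```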

import { configureClient, neuro } from 'neuro-ts';
configureClient({ apiKey: process.env.OPENAI_API_KEY });
// Build from UTF-16 units; emoji require surrogate pairs you have to assemble yourself.
await neuro.string.fromCharCode({ codes: [104, 105], prompt: 'build a string from the codes array as UTF-16 code units, never code points, and let astral characters be lossy with grace' });
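Because the codes array is interpreted as UTF-16 code units rather than code points, astral characters (above U+FFFF) must be split into a surrogate pair yourself. A sketch using the standard surrogate formula; `toSurrogatePair` is a helper defined here for illustration, not part of the SDK:

```typescript
// Split a code point above U+FFFF into its UTF-16 surrogate pair:
//   high = 0xD800 + ((C - 0x10000) >> 10)
//   low  = 0xDC00 + ((C - 0x10000) & 0x3FF)
function toSurrogatePair(codePoint: number): [number, number] {
  const c = codePoint - 0x10000;
  return [0xd800 + (c >> 10), 0xdc00 + (c & 0x3ff)];
}

const [high, low] = toSurrogatePair(0x1f600); // U+1F600, 😀
console.log(high.toString(16), low.toString(16)); // "d83d" "de00"

// fromCharCode reassembles the pair into the astral character:
console.log(String.fromCharCode(high, low) === String.fromCodePoint(0x1f600)); // true
```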

The exact system prompt the SDK sends to your model when you provide a prompt field:

Generated prompt: String.fromCharCode
You are simulating the JavaScript built-in `String.fromCharCode`.
## Original signature(s)
  Overload 1: (...codes: number[]) => string
## How to respond
- Behave EXACTLY as the original `fromCharCode` would, but use the user's intent to choose any callback / comparator / transform logic that the original would normally accept as an argument.
- Strictly preserve the original return type and shape.
- Output ONLY the JSON-encoded return value of the function call.
- Do NOT include explanations, prose, comments, or markdown fences.
- If the function would return `undefined`, output the literal string `undefined`.
- For Date / RegExp / Map / Set / TypedArray returns, output an object of the form { "__type": "Date" | "RegExp" | "Map" | "Set" | "<TypedArrayName>", ... } so the SDK can rehydrate it.
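The prompt leaves the remaining fields of the `__type` envelope unspecified. A hypothetical rehydrator for a few of the listed types; the field names (`value`, `flags`, `entries`) are illustrative assumptions, not the SDK's documented wire format:

```typescript
// Hypothetical envelope shapes for the { "__type": ... } convention above.
// All field names besides __type are assumptions for this sketch.
type Envelope =
  | { __type: 'Date'; value: string } // ISO-8601 timestamp
  | { __type: 'RegExp'; value: string; flags: string }
  | { __type: 'Map'; entries: [unknown, unknown][] }
  | { __type: 'Set'; entries: unknown[] };

function rehydrate(e: Envelope): Date | RegExp | Map<unknown, unknown> | Set<unknown> {
  switch (e.__type) {
    case 'Date':
      return new Date(e.value);
    case 'RegExp':
      return new RegExp(e.value, e.flags);
    case 'Map':
      return new Map(e.entries);
    case 'Set':
      return new Set(e.entries);
  }
}

const s = rehydrate({ __type: 'Set', entries: [1, 2, 2] });
console.log(s instanceof Set); // true
```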