
JavaScript / TypeScript SDK

opensearch-genai-sdk instruments Node.js LLM applications using standard OpenTelemetry. It configures the OTEL pipeline in one call, provides wrapper functions for tracing your application logic, and emits evaluation scores through the same OTLP exporter.

npm install opensearch-genai-sdk

For auto-instrumentation of LLM providers, install the relevant instrumentor packages:

npm install @traceloop/instrumentation-openai
npm install @traceloop/instrumentation-anthropic
npm install @traceloop/instrumentation-langchain
import { register, traceWorkflow, traceAgent, traceTool, score } from "opensearch-genai-sdk";

register({ endpoint: "http://localhost:4318/v1/traces", projectName: "my-app" });

const getWeather = traceTool("get_weather", (city: string) => {
  return { city, temp: 22, condition: "sunny" };
}, { description: "Fetch current weather for a city" });

const assistant = traceAgent("weather_assistant", (query: string) => {
  const data = getWeather("Paris");
  return `${data.condition}, ${data.temp}C`;
});

const run = traceWorkflow("weather_pipeline", (query: string) => {
  return assistant(query);
});

const result = run("What's the weather?");

score({ name: "relevance", value: 0.95, traceId: "...", source: "llm-judge" });

register() configures the OTEL tracing pipeline. Call it once at startup, before any tracing occurs.

import { register } from "opensearch-genai-sdk";

register({
  endpoint: "http://localhost:4318/v1/traces",
  projectName: "my-app",
});
| Option | Type | Default | Description |
|---|---|---|---|
| endpoint | string | http://localhost:21890/opentelemetry/v1/traces | OTLP endpoint URL. Reads OPENSEARCH_OTEL_ENDPOINT if not set. |
| projectName | string | "default" | Attached to all spans as service.name. Reads OTEL_SERVICE_NAME. |
| auth | string | "auto" | "auto" detects AWS endpoints and enables SigV4. "sigv4" always signs. "none" never signs. |
| batch | boolean | true | true uses BatchSpanProcessor (production). false uses SimpleSpanProcessor (debugging). |
| autoInstrument | boolean | true | Discovers and activates installed OTel instrumentor packages. |
| exporter | SpanExporter | (none) | Custom exporter. Overrides endpoint and auth. |

Self-hosted:

register({ projectName: "my-app" });

AWS OpenSearch Ingestion:

register({
  endpoint: "https://pipeline.us-east-1.osis.amazonaws.com/v1/traces",
  projectName: "my-app",
  auth: "sigv4",
});
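For local debugging, you can also disable batching so each span is exported as soon as it ends. This is a sketch built from the options documented above; the endpoint shown is an assumption for a local collector:

```typescript
import { register } from "opensearch-genai-sdk";

// batch: false selects SimpleSpanProcessor, which exports spans
// immediately rather than buffering them; useful while debugging,
// too slow for production traffic.
register({
  endpoint: "http://localhost:4318/v1/traces",
  projectName: "my-app",
  batch: false,
});
```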

Four higher-order functions trace application logic as OTEL spans with GenAI semantic convention attributes. All four support sync and async functions. Errors are recorded as span status ERROR with an exception event and re-thrown.

flowchart TD
    A["traceWorkflow  — SpanKind.INTERNAL"] --> B["traceAgent  — SpanKind.CLIENT"]
    B --> C["traceTool  — SpanKind.INTERNAL"]
    B --> D["LLM call  (auto-instrumented)"]
traceWorkflow(name, fn, options?)
traceTask(name, fn, options?)
traceAgent(name, fn, options?)
traceTool(name, fn, options?)
| Option | Type | Description |
|---|---|---|
| version | number | Stored as gen_ai.agent.version or gen_ai.entity.version. |
| description | string | Tool description stored as gen_ai.tool.description. (traceTool only) |
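The error semantics described above (record the failure on the span, then re-throw) can be illustrated with a dependency-free sketch. This mimics the documented behavior but is not the SDK's implementation; `traceLike`, `FakeSpan`, and `recordedSpans` are hypothetical names used only here:

```typescript
// Stand-in for an OTEL span, recording just status and exception events.
interface FakeSpan {
  name: string;
  status: "OK" | "ERROR";
  events: string[];
}

const recordedSpans: FakeSpan[] = [];

// Higher-order wrapper supporting both sync and async functions,
// mirroring the traceWorkflow/traceTask/traceAgent/traceTool shape.
function traceLike<A extends unknown[], R>(
  name: string,
  fn: (...args: A) => R
): (...args: A) => R {
  return (...args: A): R => {
    const span: FakeSpan = { name, status: "OK", events: [] };
    recordedSpans.push(span);
    const fail = (err: unknown) => {
      span.status = "ERROR";
      span.events.push(`exception: ${String(err)}`);
    };
    try {
      const result = fn(...args);
      if (result instanceof Promise) {
        // Async path: record the rejection, then re-throw to the caller.
        return result.catch((err) => {
          fail(err);
          throw err;
        }) as R;
      }
      return result;
    } catch (err) {
      // Sync path: record the exception, then re-throw to the caller.
      fail(err);
      throw err;
    }
  };
}
```

The real wrappers additionally set the GenAI semantic convention attributes listed in the sections below.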

traceWorkflow wraps top-level orchestration. Sets gen_ai.operation.name = "workflow".

const runPipeline = traceWorkflow("qa_pipeline", (query: string) => {
  const plan = planSteps(query);
  return execute(plan);
});

Span attributes: gen_ai.operation.name, gen_ai.agent.name, gen_ai.entity.input, gen_ai.entity.output.

traceTask wraps a discrete unit of work. Same attributes and defaults as traceWorkflow.

const planSteps = traceTask("plan_steps", (query: string) => {
  return llm.generate(`Plan steps for: ${query}`);
});

traceAgent wraps autonomous decision-making logic. It defaults to SpanKind.CLIENT, and the span name is prefixed: invoke_agent <name>.

const research = traceAgent("research_agent", async (query: string) => {
  const result = await searchTool(query);
  return summarize(result);
}, { version: 2 });

traceTool wraps a function invoked by an agent. The span name is prefixed: execute_tool <name>.

const search = traceTool("web_search", (query: string): string[] => {
  return searchApi.query(query);
}, { description: "Search the web for documents" });

Additional attributes: gen_ai.tool.name, gen_ai.tool.type ("function"), gen_ai.tool.description, gen_ai.tool.call.arguments, gen_ai.tool.call.result.


score() submits an evaluation score as an OTEL span. Scores flow through the same OTLP pipeline as traces.

// Span-level score: targets a specific span within a trace
score({
  name: "accuracy",
  value: 0.95,
  traceId: "abc123",
  spanId: "def456",
  explanation: "Answer matches ground truth",
  source: "heuristic",
});

// Trace-level score: targets the whole trace
score({
  name: "relevance",
  value: 0.92,
  traceId: "abc123",
  source: "llm-judge",
});

// Session-level score: targets a conversation
score({
  name: "user_satisfaction",
  value: 0.88,
  conversationId: "session-123",
  label: "satisfied",
  source: "human",
});
| Parameter | Type | Description |
|---|---|---|
| name | string | Metric name, e.g. "relevance", "factuality". |
| value | number | Numeric score. |
| traceId | string | Trace being scored. |
| spanId | string | Span being scored (span-level). |
| conversationId | string | Session ID (session-level). |
| label | string | Human-readable label. |
| explanation | string | Evaluator rationale. Truncated to 500 characters. |
| responseId | string | LLM completion ID for correlation. |
| source | string | "sdk", "human", "llm-judge", "heuristic". |
| metadata | Record<string, unknown> | Arbitrary key-value metadata. |

register() attempts to activate any installed OTel instrumentor packages. Install the package for your LLM provider and its calls are traced automatically.

| Provider | Package |
|---|---|
| OpenAI | @traceloop/instrumentation-openai |
| Anthropic | @traceloop/instrumentation-anthropic |
| LangChain | @traceloop/instrumentation-langchain |
| HTTP | @opentelemetry/instrumentation-http |
| Fetch | @opentelemetry/instrumentation-fetch |

To disable:

register({ autoInstrument: false });

| Variable | Description | Default |
|---|---|---|
| OPENSEARCH_OTEL_ENDPOINT | OTLP endpoint URL | http://localhost:21890/opentelemetry/v1/traces |
| OTEL_SERVICE_NAME | Service name for all spans | "default" |
| OPENSEARCH_PROJECT | Project name (falls back to OTEL_SERVICE_NAME) | "default" |
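The precedence this table implies can be sketched as follows. This is an assumed resolution order (explicit option, then environment variable, then built-in default), not the SDK's actual code; `resolveEndpoint` and `resolveProjectName` are hypothetical helper names:

```typescript
// Explicit option wins, then the environment variable, then the default.
function resolveEndpoint(explicit?: string): string {
  return (
    explicit ??
    process.env.OPENSEARCH_OTEL_ENDPOINT ??
    "http://localhost:21890/opentelemetry/v1/traces"
  );
}

// OPENSEARCH_PROJECT falls back to OTEL_SERVICE_NAME per the table above.
function resolveProjectName(explicit?: string): string {
  return (
    explicit ??
    process.env.OPENSEARCH_PROJECT ??
    process.env.OTEL_SERVICE_NAME ??
    "default"
  );
}
```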

  • Python SDK — Python equivalent
  • Agent Traces — viewing traces in OpenSearch Dashboards
  • Send Data — OTLP pipeline and collector setup
  • FAQ — common questions
  • npm — package page