New scale capabilities for OpenTelemetry on Azure Monitor

Microsoft has released a set of updates to the Azure Monitor OpenTelemetry Exporter packages for .NET, Node.js and Python applications. New features include: OpenTelemetry metric export to Azure Monitor Application Insights (AMAI), sampling rate control for traces and spans, and cache-and-retry delivery of telemetry data to Azure Monitor Application Insights during a temporary outage.

Azure Monitor is a set of tools for collecting, analyzing, and responding to infrastructure data and application telemetry from cloud and on-premises environments. AMAI is one of the tools in Azure Monitor, and it provides application performance monitoring (APM) to its users. In addition, Azure Monitor Application Insights supports distributed tracing, one of the pillars of the monitoring paradigm, across multiple applications.

OpenTelemetry is a framework that provides vendor-agnostic APIs, SDKs and tools to ingest, transform and export telemetry data to an observability backend. In a 2021 blog post, Microsoft outlined its roadmap for integrating OpenTelemetry with its broader Azure Monitor ecosystem. The immediate focus was to create direct exports from OpenTelemetry-based applications to AMAI, as opposed to the de facto OpenTelemetry path of OTLP exports to Azure Monitor via the OpenTelemetry Collector.


A sample live export from a Node.js application with OpenTelemetry monitoring in place would be:

const { AzureMonitorTraceExporter } = require("@azure/monitor-opentelemetry-exporter");
const { NodeTracerProvider } = require("@opentelemetry/sdk-trace-node");
const { BatchSpanProcessor } = require("@opentelemetry/sdk-trace-base");
const { Resource } = require("@opentelemetry/resources");
const { SemanticResourceAttributes } = require("@opentelemetry/semantic-conventions");

const provider = new NodeTracerProvider({
  resource: new Resource({
    [SemanticResourceAttributes.SERVICE_NAME]: "basic-service",
  }),
});

// Create an exporter instance
const exporter = new AzureMonitorTraceExporter();

// Add the exporter to the provider
provider.addSpanProcessor(
  new BatchSpanProcessor(exporter, {
    bufferTimeout: 15000,
    bufferSize: 1000,
  })
);
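The two BatchSpanProcessor settings above control when a buffered batch of spans is flushed to the exporter: when the buffer reaches its size limit, or when the oldest buffered span has waited past the timeout. As a rough, illustrative sketch of that batching rule (plain JavaScript, not the OpenTelemetry implementation):

```javascript
// Illustrative batching rule: flush when the buffer is full or stale.
class BatchBuffer {
  constructor(onFlush, { bufferTimeout = 15000, bufferSize = 1000 } = {}, now = Date.now) {
    this.onFlush = onFlush;         // receives each flushed batch
    this.bufferTimeout = bufferTimeout;
    this.bufferSize = bufferSize;
    this.now = now;                 // injectable clock for testing
    this.items = [];
    this.oldest = null;             // timestamp of the oldest buffered item
  }
  add(span) {
    if (this.oldest === null) this.oldest = this.now();
    this.items.push(span);
    this.maybeFlush();
  }
  maybeFlush() {
    const stale = this.oldest !== null && this.now() - this.oldest >= this.bufferTimeout;
    if (this.items.length >= this.bufferSize || stale) {
      this.onFlush(this.items.splice(0));  // hand over and empty the buffer
      this.oldest = null;
    }
  }
}

// Usage: a size-3 buffer flushes exactly once after the third span.
const flushed = [];
const buffer = new BatchBuffer((b) => flushed.push(b), { bufferSize: 3, bufferTimeout: 60000 });
["a", "b", "c"].forEach((s) => buffer.add(s));
```

The real processor also bounds the queue and exports asynchronously; the sketch only shows the size-or-timeout trigger that the two parameters configure.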

With the release of the new updates to the Azure Monitor OpenTelemetry Exporter packages, exporting metrics to AMAI is now possible, as shown below:

const { MeterProvider, PeriodicExportingMetricReader } = require("@opentelemetry/sdk-metrics");
const { Resource } = require("@opentelemetry/resources");
const { AzureMonitorMetricExporter } = require("@azure/monitor-opentelemetry-exporter");

// Add the exporter into the MetricReader and register it with the MeterProvider
const provider = new MeterProvider();
const exporter = new AzureMonitorMetricExporter({
  connectionString: "", // your Application Insights connection string
});
const metricReaderOptions = {
  exporter: exporter,
};
const metricReader = new PeriodicExportingMetricReader(metricReaderOptions);
provider.addMetricReader(metricReader);
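Once the reader is registered, metrics recorded through the provider's meters are collected and pushed on the reader's interval. As a rough, vendor-free sketch of that periodic-export pattern (the class and method names here are illustrative, not the OpenTelemetry API):

```javascript
// Illustrative periodic exporting reader: collector callbacks are drained
// on a fixed interval and the resulting batch is handed to an exporter.
class PeriodicReader {
  constructor(exporter, intervalMs) {
    this.exporter = exporter;
    this.intervalMs = intervalMs;
    this.collectors = [];
  }
  addCollector(fn) {
    this.collectors.push(fn);
  }
  // One collection cycle: snapshot current values and export them.
  collectAndExport() {
    const batch = this.collectors.map((fn) => fn());
    this.exporter(batch);
    return batch;
  }
  start() {
    this.timer = setInterval(() => this.collectAndExport(), this.intervalMs);
  }
  stop() {
    clearInterval(this.timer);
  }
}

// Usage: a counter whose current value is snapshotted each cycle.
let requests = 0;
const exported = [];
const reader = new PeriodicReader((batch) => exported.push(batch), 60000);
reader.addCollector(() => ({ name: "requests.count", value: requests }));
requests += 3;
reader.collectAndExport(); // one manual cycle instead of waiting for the timer
```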

To manage the volume of telemetry data sent to Application Insights, the packages now include a sampler that controls the percentage of traces sent. For the Node.js trace example from earlier, this would look like the following:

import { ApplicationInsightsSampler, AzureMonitorTraceExporter } from "@azure/monitor-opentelemetry-exporter";
import { NodeTracerProvider } from "@opentelemetry/sdk-trace-node";
import { Resource } from "@opentelemetry/resources";
import { SemanticResourceAttributes } from "@opentelemetry/semantic-conventions";

// Sampler expects a sample rate of between 0 and 1 inclusive
// A rate of 0.75 means roughly 75% of traces are sent
const aiSampler = new ApplicationInsightsSampler(0.75);
const provider = new NodeTracerProvider({
  resource: new Resource({
    [SemanticResourceAttributes.SERVICE_NAME]: "basic-service",
  }),
  sampler: aiSampler,
});
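A rate-based sampler of this kind typically keeps the decision deterministic per trace: the trace ID is hashed onto a fixed range and compared against the configured rate, so every span of a given trace gets the same verdict. A simplified, illustrative sketch of that technique (not the actual ApplicationInsightsSampler implementation):

```javascript
// Illustrative rate-based sampling: hash the trace ID to a stable value
// in [0, 100) and keep the trace when it falls under the sample rate.
function hashToPercent(traceId) {
  let hash = 0;
  for (const ch of traceId) {
    hash = (hash * 31 + ch.charCodeAt(0)) >>> 0; // unsigned 32-bit rolling hash
  }
  return (hash % 10000) / 100; // stable value in [0, 100)
}

function shouldSample(traceId, sampleRate) {
  // sampleRate is between 0 and 1 inclusive, as with ApplicationInsightsSampler
  return hashToPercent(traceId) < sampleRate * 100;
}

// The decision is stable: the same trace ID always yields the same answer,
// so a trace is never half-sampled across its spans.
const keep = shouldSample("4bf92f3577b34da6a3ce929d0e0e4736", 0.75);
```

Hashing rather than random sampling is what makes the percentage meaningful across distributed services: each service, given the same trace ID and rate, reaches the same keep/drop decision independently.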

Finally, in the event of a failure to contact AMAI, the direct exporters persist their payloads to local storage and periodically attempt redelivery for up to 48 hours. These settings can be configured on exporter instantiation, as shown below:

const exporter = new AzureMonitorTraceExporter({
    storageDirectory: "C:\\SomeDirectory",    // your desired location
    disableOfflineStorage: false              // enabled by default, set to true to disable
});
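The retry behaviour follows a familiar pattern: a failed export is stored with a timestamp, and a retry loop re-sends anything still inside the 48-hour retention window while discarding what has expired. A simplified in-memory sketch of that pattern (illustrative, not the exporter's actual implementation, which writes to disk):

```javascript
// Illustrative cache-and-retry buffer: failed payloads are kept with a
// timestamp and retried until they fall outside the retention window.
const RETENTION_MS = 48 * 60 * 60 * 1000; // 48-hour window, as in the exporter

class OfflineBuffer {
  constructor(now = Date.now) {
    this.now = now;      // injectable clock for testing
    this.pending = [];
  }
  store(payload) {
    this.pending.push({ payload, storedAt: this.now() });
  }
  // Retry everything still inside the window; drop what has expired.
  // `send` returns true on successful delivery.
  retry(send) {
    const cutoff = this.now() - RETENTION_MS;
    const fresh = this.pending.filter((e) => e.storedAt >= cutoff);
    this.pending = fresh.filter((e) => !send(e.payload));
  }
}

// Usage: the first delivery attempt fails, the next one succeeds.
let online = false;
const buffer = new OfflineBuffer();
buffer.store({ spans: 12 });
buffer.retry(() => online); // still offline: payload stays buffered
online = true;
buffer.retry(() => online); // back online: payload delivered and removed
```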
