
[DSM] Add technology pages for DSM documentation #28943


Closed
wants to merge 26 commits into from

Changes from all commits (26 commits)
973b3aa
add recommendation for RabbitMQ integration
robcarlan-datadog Apr 24, 2025
5a0f754
dashboards -> graphs
robcarlan-datadog Apr 24, 2025
2a0edf1
use shortcode for rabbitmq info
robcarlan-datadog Apr 24, 2025
d9e51d8
updating
cswatt Apr 24, 2025
2ce5c7b
also fixing link
cswatt Apr 24, 2025
0f0b6e2
last change
cswatt Apr 24, 2025
7e47268
[DSM] Add technology tiles that go to dead file locations
lkretvix Apr 23, 2025
4610cb9
[DSM] Add java and go for kafka
lkretvix Apr 24, 2025
40c8441
[DSM] Add other libraries for kafka, add callout for kafka streams
lkretvix Apr 24, 2025
8b3f1a8
[DSM] Update disclaimer and title
lkretvix Apr 24, 2025
4268ed4
[DSM] Test adding custom table styling
lkretvix Apr 24, 2025
b7ad58b
[DSM] Use html for table structure
lkretvix Apr 24, 2025
438e559
[DSM] Add rabbitmq, sqs, sns, and kinesis technology files in data_st…
lkretvix Apr 24, 2025
2747605
[DSM] Provide links to language pages from technology pages
lkretvix Apr 24, 2025
6039853
[DSM] Update links on docs pages
lkretvix Apr 24, 2025
ef4358a
[DSM] Add shortcodes for monitoring-xxx content in data streams
lkretvix Apr 24, 2025
411e3fa
[DSM] Update more shortcodes in data_streams
lkretvix Apr 24, 2025
d04ee62
[DSM] Add technology files for google pubsub, azure service bus, and …
lkretvix Apr 24, 2025
e5ba4ee
[DSM] Update missing shortcodes and minor typos in data streams docs
lkretvix Apr 24, 2025
cc16cc0
[DSM] Add supported languages/technologies to main menu
lkretvix Apr 25, 2025
3c7f74a
[DSM] Use relative url in azure service bus .net link
lkretvix Apr 25, 2025
b07d2a2
[DSM] Convert docs links to relative links, update azure service bus …
lkretvix Apr 25, 2025
ef6c352
[DSM] Move rabbitmq shortcode to different directory
lkretvix Apr 25, 2025
81b3aae
[DSM] Rename Technology column to Library
lkretvix Apr 25, 2025
58ab19e
Merge branch 'master' of github.com:DataDog/documentation into lucas.…
lkretvix Apr 25, 2025
7274205
[DSM] Delete unused file
lkretvix Apr 25, 2025
18 changes: 14 additions & 4 deletions config/_default/menus/main.en.yaml
@@ -4355,26 +4355,36 @@ menu:
identifier: data_streams
parent: apm_heading
weight: 50000
- name: Supported Languages
url: data_streams/#setup
identifier: data_streams_supported_languages
parent: data_streams
weight: 1
- name: Supported Technologies
url: data_streams/#setup
identifier: data_streams_supported_technologies
parent: data_streams
weight: 2
- name: Schema Tracking
url: data_streams/schema_tracking
identifier: data_streams_schema_tracking
parent: data_streams
weight: 1
weight: 3
- name: Live Messages
url: data_streams/live_messages
identifier: data_streams_live_messages
parent: data_streams
weight: 2
weight: 4
- name: Data Pipeline Lineage
url: data_streams/data_pipeline_lineage
identifier: data_streams_pipeline_lineage
parent: data_streams
weight: 3
weight: 5
- name: Guide
url: data_streams/guide
identifier: data_streams_guide
parent: data_streams
weight: 4
weight: 6
- name: Data Jobs Monitoring
url: data_jobs/
pre: data-jobs-monitoring
8 changes: 6 additions & 2 deletions content/en/data_streams/_index.md
@@ -51,6 +51,10 @@ For installation instructions and lists of supported technologies, choose your l

{{< partial name="data_streams/setup-languages.html" >}}

or choose your technology to see which languages and libraries are supported:

{{< partial name="data_streams/setup-technologies.html" >}}

<br/>

## Explore Data Streams Monitoring
@@ -86,7 +90,7 @@ Alternatively, click a service to open a detailed side panel and view the **Path

Slowdowns caused by high consumer lag or stale messages can lead to cascading failures and increase downtime. With out-of-the-box alerts, you can pinpoint where bottlenecks occur in your pipelines and respond to them right away. For supplementary metrics, Datadog provides additional integrations for message queue technologies like [Kafka][4] and [SQS][5].

Through Data Streams Monitoring's out-of-the-box monitor templates, you can set up monitors on metrics like consumer lag, throughput, and latency in one click.

{{< img src="data_streams/add_monitors_and_synthetic_tests.png" alt="Datadog Data Streams Monitoring Monitor Templates" style="width:100%;" caption="Click 'Add Monitors and Synthetic Tests' to view monitor templates" >}}

@@ -98,7 +102,7 @@ Click on the **Throughput** tab on any service or queue in Data Streams Monitori

By filtering to a single Kafka, RabbitMQ, or Amazon SQS cluster, you can detect changes in incoming or outgoing traffic for all detected topics or queues running on that cluster:

### Quickly pivot to identify root causes in infrastructure, logs, or traces

Datadog automatically links the infrastructure powering your services and related logs through [Unified Service Tagging][3], so you can easily localize bottlenecks. Click the **Infra**, **Logs** or **Traces** tabs to further troubleshoot why pathway latency or consumer lag has increased.

22 changes: 6 additions & 16 deletions content/en/data_streams/dotnet.md
@@ -32,35 +32,26 @@ environment:
- DD_DATA_STREAMS_ENABLED: "true"
```

### Monitoring Kafka Pipelines
Data Streams Monitoring uses message headers to propagate context through Kafka streams. If `log.message.format.version` is set in the Kafka broker configuration, it must be set to `0.11.0.0` or higher. Data Streams Monitoring is not supported for versions lower than this.
{{% data_streams/monitoring-kafka-pipelines %}}
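The `log.message.format.version` requirement noted above boils down to a tuple comparison of dotted version strings: record headers, which DSM uses to carry context, only exist from message format 0.11.0.0 onward. An illustrative sketch in Python (the helper names are hypothetical, not part of any Datadog library):

```python
def parse_version(v: str) -> tuple:
    """Parse a dotted Kafka version string like "0.11.0.0" into a tuple of ints."""
    return tuple(int(part) for part in v.split("."))

def supports_dsm_headers(log_message_format_version: str) -> bool:
    """DSM propagates context via record headers, which require message
    format 0.11.0.0 or higher when log.message.format.version is set."""
    return parse_version(log_message_format_version) >= (0, 11, 0, 0)

print(supports_dsm_headers("0.10.2.0"))  # False: headers not supported
print(supports_dsm_headers("3.6.0"))     # True: (3, 6, 0) >= (0, 11, 0, 0)
```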

### Monitoring SQS pipelines
Data Streams Monitoring uses one [message attribute][2] to track a message's path through an SQS queue. As Amazon SQS has a maximum limit of 10 message attributes allowed per message, all messages streamed through the data pipelines must have 9 or fewer message attributes set, allowing the remaining attribute for Data Streams Monitoring.
{{% data_streams/monitoring-sqs-pipelines %}}
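The SQS constraint described above is simple arithmetic: SQS allows at most 10 message attributes per message, and DSM needs one of those slots for its context attribute. A hypothetical helper sketching the budget check (illustrative only, not a Datadog API):

```python
MAX_SQS_MESSAGE_ATTRIBUTES = 10  # hard limit imposed by Amazon SQS
DSM_RESERVED_ATTRIBUTES = 1      # DSM adds one attribute to carry its context

def can_attach_dsm_context(message_attributes: dict) -> bool:
    """Return True if the message leaves room for the DSM context attribute."""
    return len(message_attributes) <= MAX_SQS_MESSAGE_ATTRIBUTES - DSM_RESERVED_ATTRIBUTES

attrs = {f"attr{i}": {"DataType": "String", "StringValue": "x"} for i in range(9)}
print(can_attach_dsm_context(attrs))  # True: 9 attributes leave one slot free
attrs["attr9"] = {"DataType": "String", "StringValue": "x"}
print(can_attach_dsm_context(attrs))  # False: all 10 slots already used
```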

{{% data-streams-monitoring/monitoring-rabbitmq-pipelines %}}
{{% data_streams/monitoring-rabbitmq-pipelines %}}

### Monitoring SNS-to-SQS pipelines
To monitor a data pipeline where Amazon SNS talks directly to Amazon SQS, you must enable [Amazon SNS raw message delivery][12].
{{% data_streams/monitoring-sns-to-sqs-pipelines %}}
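For reference, raw message delivery is a subscription attribute on the SNS-to-SQS link. The sketch below builds the arguments such a call would take, assuming boto3's `subscribe` parameter names; the helper itself is hypothetical and the sketch never calls AWS:

```python
def raw_delivery_subscribe_args(topic_arn: str, queue_arn: str) -> dict:
    """Hypothetical helper: arguments for an sns_client.subscribe(...) call
    that subscribes an SQS queue with raw message delivery enabled."""
    return {
        "TopicArn": topic_arn,
        "Protocol": "sqs",
        "Endpoint": queue_arn,
        # Without raw delivery, SNS wraps the payload in a JSON envelope and
        # drops the message attributes DSM relies on for context propagation.
        "Attributes": {"RawMessageDelivery": "true"},
    }

args = raw_delivery_subscribe_args(
    "arn:aws:sns:us-east-1:123456789012:orders",
    "arn:aws:sqs:us-east-1:123456789012:orders-queue",
)
print(args["Attributes"]["RawMessageDelivery"])  # true
```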

### Monitoring Azure Service Bus

Setting up Data Streams Monitoring for Azure Service Bus applications requires additional configuration for the instrumented application.

1. Either set the environment variable `AZURE_EXPERIMENTAL_ENABLE_ACTIVITY_SOURCE` to `true`, or in your application code set the `Azure.Experimental.EnableActivitySource` context switch to `true`. This instructs the Azure Service Bus library to generate tracing information. See [Azure SDK documentation][11] for more details.
2. Set the `DD_TRACE_OTEL_ENABLED` environment variable to `true`. This instructs the .NET auto-instrumentation to listen to the tracing information generated by the Azure Service Bus Library and enables the inject and extract operations required for Data Streams Monitoring.
{{% data_streams/monitoring-azure-service-bus %}}

### Monitoring connectors

#### Confluent Cloud connectors
{{% dsm_confluent_connectors %}}
{{% data_streams/dsm-confluent-connectors %}}

## Further reading

{{< partial name="whats-next/whats-next.html" >}}

[1]: /agent
[2]: https://docs.aws.amazon.com/AWSSimpleQueueService/latest/SQSDeveloperGuide/sqs-message-metadata.html
[3]: https://www.nuget.org/packages/Confluent.Kafka
[4]: https://www.nuget.org/packages/RabbitMQ.Client
[5]: https://www.nuget.org/packages/AWSSDK.SQS
@@ -70,4 +61,3 @@ Setting up Data Streams Monitoring for Azure Service Bus applications requires a
[9]: #monitoring-azure-service-bus
[10]: https://www.nuget.org/packages/Azure.Messaging.ServiceBus
[11]: https://github.com/Azure/azure-sdk-for-net/blob/main/sdk/core/Azure.Core/samples/Diagnostics.md#enabling-experimental-tracing-features
[12]: https://docs.aws.amazon.com/sns/latest/dg/sns-large-payload-raw-message-delivery.html
9 changes: 5 additions & 4 deletions content/en/data_streams/go.md
@@ -27,8 +27,9 @@ To start with Data Streams Monitoring, you need recent versions of the Datadog A

### Installation

### Monitoring Kafka Pipelines
Data Streams Monitoring uses message headers to propagate context through Kafka streams. If `log.message.format.version` is set in the Kafka broker configuration, it must be set to `0.11.0.0` or higher. Data Streams Monitoring is not supported for versions lower than this.
{{% data_streams/monitoring-kafka-pipelines %}}

{{% data_streams/monitoring-rabbitmq-pipelines %}}

{{% data-streams-monitoring/monitoring-rabbitmq-pipelines %}}

@@ -128,15 +129,15 @@ if ok {
### Monitoring connectors

#### Confluent Cloud connectors
{{% dsm_confluent_connectors %}}
{{% data_streams/dsm-confluent-connectors %}}

## Further reading

{{< partial name="whats-next/whats-next.html" >}}

[1]: /agent/
[2]: https://github.com/DataDog/dd-trace-go
[3]: https://docs.datadoghq.com/tracing/trace_collection/library_config/go/
[3]: /tracing/trace_collection/library_config/go/
[4]: https://datadoghq.dev/orchestrion/
[5]: https://datadoghq.dev/orchestrion/docs/getting-started/
[6]: https://github.com/DataDog/dd-trace-go/blob/main/datastreams/propagation.go#L37
11 changes: 5 additions & 6 deletions content/en/data_streams/java.md
@@ -70,8 +70,9 @@ Use Datadog's Java tracer, [`dd-trace-java`][6], to collect information from you
1. [Add the `dd-java-agent.jar` file][7] to your Kafka Connect workers. Ensure that you are using `dd-trace-java` [v1.44+][8].
1. Modify your Java options to include the Datadog Java tracer on your worker nodes. For example, on Strimzi, modify `STRIMZI_JAVA_OPTS` to add `-javaagent:/path/to/dd-java-agent.jar`.

### Monitoring SQS pipelines
Data Streams Monitoring uses one [message attribute][3] to track a message's path through an SQS queue. As Amazon SQS has a maximum limit of 10 message attributes allowed per message, all messages streamed through the data pipelines must have 9 or fewer message attributes set, allowing the remaining attribute for Data Streams Monitoring.
{{% data_streams/monitoring-sqs-pipelines %}}

{{% data_streams/monitoring-rabbitmq-pipelines %}}

{{% data-streams-monitoring/monitoring-rabbitmq-pipelines %}}

@@ -100,16 +101,15 @@ Enable [Amazon SNS raw message delivery][1].
{{% /tab %}}
{{< /tabs >}}

### Monitoring Kinesis pipelines
There are no message attributes in Kinesis to propagate context and track a message's full path through a Kinesis stream. As a result, Data Streams Monitoring's end-to-end latency metrics are approximated based on summing latency on segments of a message's path, from the producing service through a Kinesis Stream, to a consumer service. Throughput metrics are based on segments from the producing service through a Kinesis Stream, to the consumer service. The full topology of data streams can still be visualized through instrumenting services.
{{% data_streams/monitoring-kinesis-pipelines %}}
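The Kinesis approximation described above amounts to summing latencies measured on each segment of a message's path, since Kinesis has no message attributes in which to carry a single end-to-end timestamp. A toy illustration (hypothetical helper, not a Datadog API):

```python
def approximate_end_to_end_latency_ms(segment_latencies_ms: list[float]) -> float:
    """Approximate end-to-end latency by summing per-segment latencies,
    e.g. producer -> Kinesis stream, then stream -> consumer."""
    return sum(segment_latencies_ms)

# producer -> Kinesis stream took 40 ms, stream -> consumer took 85 ms
print(approximate_end_to_end_latency_ms([40.0, 85.0]))  # 125.0
```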

### Manual instrumentation
Data Streams Monitoring propagates context through message headers. If you are using a message queue technology that is not supported by DSM, a technology without headers (such as Kinesis), or Lambdas, use [manual instrumentation to set up DSM][5].
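To illustrate the header-based propagation that manual instrumentation builds on, here is a toy inject/extract pair: the producer attaches context to outgoing message headers and the consumer recovers it. The header name and payload format are invented for illustration and do not match the real DSM wire format:

```python
import json

DSM_HEADER = "dd-pathway-ctx"  # illustrative header name, not the real wire format

def inject_context(headers: dict, pathway: dict) -> dict:
    """Producer side: attach the pathway context to the outgoing message headers."""
    headers[DSM_HEADER] = json.dumps(pathway).encode("utf-8")
    return headers

def extract_context(headers: dict):
    """Consumer side: recover the pathway context, or None if the producer
    was not instrumented (the pathway starts fresh at this service)."""
    raw = headers.get(DSM_HEADER)
    return json.loads(raw.decode("utf-8")) if raw is not None else None

headers = inject_context({}, {"hash": 1234, "start_ms": 1714000000000})
ctx = extract_context(headers)
print(ctx["hash"])  # 1234
```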

### Monitoring connectors

#### Confluent Cloud connectors
{{% dsm_confluent_connectors %}}
{{% data_streams/dsm-confluent-connectors %}}

#### Self-hosted Kafka connectors

@@ -123,7 +123,6 @@ Data Streams Monitoring can collect information from your self-hosted Kafka conn

[1]: /agent
[2]: /tracing/trace_collection/dd_libraries/java/
[3]: https://docs.aws.amazon.com/AWSSimpleQueueService/latest/SQSDeveloperGuide/sqs-message-metadata.html
[4]: /agent/remote_config/?tab=configurationyamlfile#enabling-remote-configuration
[5]: /data_streams/manual_instrumentation/?tab=java
[6]: https://github.com/DataDog/dd-trace-java
18 changes: 6 additions & 12 deletions content/en/data_streams/nodejs.md
@@ -36,27 +36,23 @@ environment:
- DD_DATA_STREAMS_ENABLED: "true"
```

### Monitoring Kafka Pipelines
Data Streams Monitoring uses message headers to propagate context through Kafka streams. If `log.message.format.version` is set in the Kafka broker configuration, it must be set to `0.11.0.0` or higher. Data Streams Monitoring is not supported for versions lower than this.
{{% data_streams/monitoring-kafka-pipelines %}}

### Monitoring SQS pipelines
Data Streams Monitoring uses one [message attribute][4] to track a message's path through an SQS queue. As Amazon SQS has a maximum limit of 10 message attributes allowed per message, all messages streamed through the data pipelines must have 9 or fewer message attributes set, allowing the remaining attribute for Data Streams Monitoring.
{{% data_streams/monitoring-sqs-pipelines %}}

{{% data-streams-monitoring/monitoring-rabbitmq-pipelines %}}
{{% data_streams/monitoring-rabbitmq-pipelines %}}

### Monitoring SNS-to-SQS pipelines
To monitor a data pipeline where Amazon SNS talks directly to Amazon SQS, you must enable [Amazon SNS raw message delivery][8].
{{% data_streams/monitoring-sns-to-sqs-pipelines %}}

### Monitoring Kinesis pipelines
There are no message attributes in Kinesis to propagate context and track a message's full path through a Kinesis stream. As a result, Data Streams Monitoring's end-to-end latency metrics are approximated based on summing latency on segments of a message's path, from the producing service through a Kinesis Stream, to a consumer service. Throughput metrics are based on segments from the producing service through a Kinesis Stream, to the consumer service. The full topology of data streams can still be visualized through instrumenting services.
{{% data_streams/monitoring-kinesis-pipelines %}}

### Manual instrumentation
Data Streams Monitoring propagates context through message headers. If you are using a message queue technology that is not supported by DSM, a technology without headers (such as Kinesis), or Lambdas, use [manual instrumentation to set up DSM][7].

### Monitoring connectors

#### Confluent Cloud connectors
{{% dsm_confluent_connectors %}}
{{% data_streams/dsm-confluent-connectors %}}

## Further reading

@@ -65,8 +61,6 @@ Data Streams Monitoring propagates context through message headers. If you are u
[1]: /agent
[2]: /tracing/trace_collection/dd_libraries/nodejs
[3]: https://pypi.org/project/confluent-kafka/
[4]: https://docs.aws.amazon.com/AWSSimpleQueueService/latest/SQSDeveloperGuide/sqs-message-metadata.html
[5]: https://www.npmjs.com/package/amqplib
[6]: https://www.npmjs.com/package/rhea
[7]: /data_streams/manual_instrumentation/?tab=nodejs
[8]: https://docs.aws.amazon.com/sns/latest/dg/sns-large-payload-raw-message-delivery.html
18 changes: 6 additions & 12 deletions content/en/data_streams/python.md
@@ -36,27 +36,23 @@ environment:
- DD_DATA_STREAMS_ENABLED: "true"
```

### Monitoring Kafka Pipelines
Data Streams Monitoring uses message headers to propagate context through Kafka streams. If `log.message.format.version` is set in the Kafka broker configuration, it must be set to `0.11.0.0` or higher. Data Streams Monitoring is not supported for versions lower than this.
{{% data_streams/monitoring-kafka-pipelines %}}

### Monitoring SQS Pipelines
Data Streams Monitoring uses one [message attribute][4] to track a message's path through an SQS queue. As Amazon SQS has a maximum limit of 10 message attributes allowed per message, all messages streamed through the data pipelines must have 9 or fewer message attributes set, allowing the remaining attribute for Data Streams Monitoring.
{{% data_streams/monitoring-sqs-pipelines %}}

{{% data-streams-monitoring/monitoring-rabbitmq-pipelines %}}
{{% data_streams/monitoring-rabbitmq-pipelines %}}

### Monitoring Kinesis pipelines
There are no message attributes in Kinesis to propagate context and track a message's full path through a Kinesis stream. As a result, Data Streams Monitoring's end-to-end latency metrics are approximated based on summing latency on segments of a message's path, from the producing service through a Kinesis Stream, to a consumer service. Throughput metrics are based on segments from the producing service through a Kinesis Stream, to the consumer service. The full topology of data streams can still be visualized through instrumenting services.
{{% data_streams/monitoring-kinesis-pipelines %}}

### Monitoring SNS-to-SQS pipelines
To monitor a data pipeline where Amazon SNS talks directly to Amazon SQS, you must enable [Amazon SNS raw message delivery][7].
{{% data_streams/monitoring-sns-to-sqs-pipelines %}}

### Manual instrumentation
Data Streams Monitoring propagates context through message headers. If you are using a message queue technology that is not supported by DSM, a technology without headers (such as Kinesis), or Lambdas, use [manual instrumentation to set up DSM][6].

### Monitoring connectors

#### Confluent Cloud connectors
{{% dsm_confluent_connectors %}}
{{% data_streams/dsm-confluent-connectors %}}

## Further reading

@@ -65,7 +61,5 @@ Data Streams Monitoring propagates context through message headers. If you are u
[1]: /agent
[2]: /tracing/trace_collection/dd_libraries/python
[3]: https://pypi.org/project/confluent-kafka/
[4]: https://docs.aws.amazon.com/AWSSimpleQueueService/latest/SQSDeveloperGuide/sqs-message-metadata.html
[5]: https://pypi.org/project/kombu/
[6]: /data_streams/manual_instrumentation/?tab=python
[7]: https://docs.aws.amazon.com/sns/latest/dg/sns-large-payload-raw-message-delivery.html
32 changes: 32 additions & 0 deletions content/en/data_streams/technologies/azure_service_bus.md
@@ -0,0 +1,32 @@
---
title: Azure Service Bus for Data Streams Monitoring
---

### Prerequisites

* [Datadog Agent v7.34.0 or later][1]

{{% data_streams/monitoring-azure-service-bus %}}

### Support for Azure Service Bus in Data Streams Monitoring

<table>
> **Author comment:** decided to use HTML table styling for these for consistency. Some of the tech tables have cells that span multiple columns (Kafka's Go language has multiple libraries, for example).

<thead>
<tr>
<th>Language</th>
<th>Library</th>
<th>Minimal tracer version</th>
<th>Recommended tracer version</th>
</tr>
</thead>
<tbody>
<tr>
<td><a href="/data_streams/dotnet">.NET</a></td>
<td><a href="https://www.nuget.org/packages/Azure.Messaging.ServiceBus">Azure.Messaging.ServiceBus</a></td>
<td>2.53.0</td>
<td>2.53.0 or later</td>
</tr>
</tbody>
</table>

[1]: /agent
36 changes: 36 additions & 0 deletions content/en/data_streams/technologies/google_pubsub.md
@@ -0,0 +1,36 @@
---
title: Google Pub/Sub for Data Streams Monitoring
---

### Prerequisites

* [Datadog Agent v7.34.0 or later][1]

### Support for Google Pub/Sub in Data Streams Monitoring

<table>
<thead>
<tr>
<th>Language</th>
<th>Library</th>
<th>Minimal tracer version</th>
<th>Recommended tracer version</th>
</tr>
</thead>
<tbody>
<tr>
<td><a href="/data_streams/java">Java</a></td>
<td><a href="https://mvnrepository.com/artifact/com.google.cloud/google-cloud-pubsub">google-cloud/pubsub</a></td>
<td>1.25.0</td>
<td>1.42.2 or later</td>
</tr>
<tr>
<td><a href="/data_streams/nodejs">Node.js</a></td>
<td><a href="https://www.npmjs.com/package/@google-cloud/pubsub">google-cloud/pubsub</a></td>
<td>5.25.0 or 4.49.0</td>
<td>5.25.0 or later</td>
</tr>
</tbody>
</table>

[1]: /agent
30 changes: 30 additions & 0 deletions content/en/data_streams/technologies/ibm_mq.md
@@ -0,0 +1,30 @@
---
title: IBM MQ for Data Streams Monitoring
---

### Prerequisites

* [Datadog Agent v7.34.0 or later][1]

### Support for IBM MQ in Data Streams Monitoring

<table>
<thead>
<tr>
<th>Language</th>
<th>Library</th>
<th>Minimal tracer version</th>
<th>Recommended tracer version</th>
</tr>
</thead>
<tbody>
<tr>
<td><a href="/data_streams/dotnet">.NET</a></td>
<td><a href="https://www.nuget.org/packages/IBMMQDotnetClient">IBMMQDotnetClient</a></td>
<td>2.49.0</td>
<td>2.49.0 or later</td>
</tr>
</tbody>
</table>

[1]: /agent