From 03ad1ce7673bb9e46e826901a2c02459b84d07d6 Mon Sep 17 00:00:00 2001
From: Ida Adjivon <65119712+iadjivon@users.noreply.github.com>
Date: Tue, 20 May 2025 18:34:46 -0400
Subject: [PATCH 01/15] added the sync-cli file to the account management guide
and started the doc.
---
content/en/account_management/guide/_index.md | 4 +++
.../en/account_management/guide/sync-cli.md | 27 +++++++++++++++++++
2 files changed, 31 insertions(+)
create mode 100644 content/en/account_management/guide/sync-cli.md
diff --git a/content/en/account_management/guide/_index.md b/content/en/account_management/guide/_index.md
index a09355b138d..d91dfbeb957 100644
--- a/content/en/account_management/guide/_index.md
+++ b/content/en/account_management/guide/_index.md
@@ -19,3 +19,7 @@ cascade:
{{< nextlink href="account_management/guide/manage-datadog-with-terraform" >}}Manage Datadog with Terraform{{< /nextlink >}}
{{< nextlink href="account_management/guide/access-your-support-ticket" >}}Access your Support ticket{{< /nextlink >}}
{{< /whatsnext >}}
+
+{{< whatsnext desc="Transferring your data" >}}
+ {{< nextlink href="account_management/guide/sync-cli" >}}Sync your resources across Datadog organizations{{< /nextlink >}}
+{{< /whatsnext >}}
\ No newline at end of file
diff --git a/content/en/account_management/guide/sync-cli.md b/content/en/account_management/guide/sync-cli.md
new file mode 100644
index 00000000000..b71dbe334b9
--- /dev/null
+++ b/content/en/account_management/guide/sync-cli.md
@@ -0,0 +1,27 @@
+---
+title: Sync resources across Datadog organizations
+disable_toc: false
+further_reading:
+- link: "/agent/guide/"
+ tag: "Documentation"
+ text: "Agent guides"
+---
+
+## Overview
+Use the `datadog-sync-cli` tool to copy your dashboards, monitors and other configurations from your primary Datadog account to your secondary Datadog account.
+
+You can determine the frequency and timing of syncing based on your business requirements. However, regular syncing is essential to ensure that your secondary account is up-to-date in the event of an outage.
+
+Datadog recommends performing this operation on a daily basis.
+
+
+ The datadog-sync-cli tool is used to migrate resources across organizations, regardless of datacenter. It cannot, nor is it intended to, transfer any ingested data, such as logs, metrics etc. The source organization will not be modified, but the destination organization will have resources created and updated by the sync command.
+
+## Setup
+
+The `datadog-sync-cli` tool can be installed from source, from Releases, or from using Docker and building an image.
+
+### Installing from source
+
+### Installing from releases
+
+###
\ No newline at end of file
From 5e8a8d4001b2fcfe1a358a5469cda3cfd26f8d32 Mon Sep 17 00:00:00 2001
From: Ida Adjivon <65119712+iadjivon@users.noreply.github.com>
Date: Tue, 20 May 2025 18:37:51 -0400
Subject: [PATCH 02/15] wip
---
content/en/account_management/guide/sync-cli.md | 5 ++++-
1 file changed, 4 insertions(+), 1 deletion(-)
diff --git a/content/en/account_management/guide/sync-cli.md b/content/en/account_management/guide/sync-cli.md
index b71dbe334b9..7b1a56d09c5 100644
--- a/content/en/account_management/guide/sync-cli.md
+++ b/content/en/account_management/guide/sync-cli.md
@@ -24,4 +24,7 @@ The `datadog-sync-cli` tool can be installed from source, from Releases, or from
### Installing from releases
-###
\ No newline at end of file
+### Installing using Docker
+
+
+## Usage
\ No newline at end of file
From d77524de6ec6673604d1dc7c1b4e804cf9aa18dc Mon Sep 17 00:00:00 2001
From: Ida Adjivon <65119712+iadjivon@users.noreply.github.com>
Date: Wed, 21 May 2025 16:01:45 -0400
Subject: [PATCH 03/15] added to the usage section of the document. The flow
still needs work.
---
.../en/account_management/guide/sync-cli.md | 191 +++++++++++++++++-
1 file changed, 187 insertions(+), 4 deletions(-)
diff --git a/content/en/account_management/guide/sync-cli.md b/content/en/account_management/guide/sync-cli.md
index 7b1a56d09c5..3f85cf1dce1 100644
--- a/content/en/account_management/guide/sync-cli.md
+++ b/content/en/account_management/guide/sync-cli.md
@@ -5,6 +5,14 @@ further_reading:
- link: "/agent/guide/"
tag: "Documentation"
text: "Agent guides"
+- link: "https://docs.datadoghq.com/account_management/multi_organization/"
+ tag: "Blog"
+ text: "Best practices for managing Datadog organizations at scale"
+- link: "/account_management/saml/"
+ tag: "Documentation"
+ text: "Configure SAML for your Datadog account"
+- link: "/account_management/billing/usage_details"
+ tag: "Documentation"
---
## Overview
@@ -12,19 +20,194 @@ Use the `datadog-sync-cli` tool to copy your dashboards, monitors and other conf
You can determine the frequency and timing of syncing based on your business requirements. However, regular syncing is essential to ensure that your secondary account is up-to-date in the event of an outage.
-Datadog recommends performing this operation on a daily basis.
+Datadog recommends syncing your accounts on a daily basis.
- The datadog-sync-cli tool is used to migrate resources across organizations, regardless of datacenter. It cannot, nor is it intended to, transfer any ingested data, such as logs, metrics etc. The source organization will not be modified, but the destination organization will have resources created and updated by the sync command.
+ Note: The datadog-sync-cli tool is used to migrate resources across organizations, regardless of datacenter. It cannot, nor is it intended to, transfer any ingested data, such as logs, metrics etc. The source organization will not be modified, but the destination organization will have resources created and updated by the sync command.
## Setup
-The `datadog-sync-cli` tool can be installed from source, from Releases, or from using Docker and building an image.
+The `datadog-sync-cli` tool can be installed from:
+- [source](#installing-from-source)
+- [releases](#installing-from-releases)
+- [using Docker and building an image](#installing-using-docker)
### Installing from source
+Installing from source requires Python `v3.9+`
+
+1. Clone the project repo and CD into the directory
+
+ ```shell
+ git clone https://github.com/DataDog/datadog-sync-cli.git
+ cd datadog-sync-cli
+ ```
+
+
+2. Install `datadog-sync-cli` tool using pip
+
+ ```shell
+ pip install .
+ ```
+
+3. Invoke the CLI tool using
+ ```shell
+ datadog-sync
+ ```
+
### Installing from releases
+{{< tabs >}}
+{{% tab "MacOS and Linux" %}}
+
+1. Download the executable from the [Releases page][1]
+
+2. Grant the file executable permissions
+ ```shell
+ chmod +x datadog-sync-cli-{system-name}-{machine-type}
+ ```
+
+3. Move the executable to your bin directory
+ ```shell
+ sudo mv datadog-sync-cli-{system-name}-{machine-type} /usr/local/bin/datadog-sync
+ ```
+
+4. Invoke the CLI tool using
+ ```shell
+ datadog-sync
+ ```
+
+[1]: https://github.com/DataDog/datadog-sync-cli/releases
+{{% /tab %}}
+
+{{% tab "Windows" %}}
+
+1. Download the executable with extension `.exe` from the [Releases page][1]
+
+2. Add the directory containing the exe file to your [path][2]
+
+3. Invoke the CLI tool in `cmd/powershell` using the file name and omitting the extension:
+ ```shell
+ datadog-sync-cli-windows-amd64
+ ```
+
+[1]: https://github.com/DataDog/datadog-sync-cli/releases
+[2]: https://learn.microsoft.com/en-us/windows-server/administration/windows-commands/path
+{{% /tab %}}
+{{< /tabs >}}
+
### Installing using Docker
+1. Clone the project repo and CD into the directory
+
+ ```shell
+ git clone https://github.com/DataDog/datadog-sync-cli.git
+ cd datadog-sync-cli
+ ```
+
+2. Build the provided Dockerfile
+
+ ``` shell
+ docker build . -t datadog-sync
+ ```
+
+3. Run the Docker image using the entrypoint below:
+ ```shell
+ docker run --rm -v <PATH_TO_WORKING_DIR>:/datadog-sync:rw \
+ -e DD_SOURCE_API_KEY=<DD_SOURCE_API_KEY> \
+ -e DD_SOURCE_APP_KEY=<DD_SOURCE_APP_KEY> \
+ -e DD_SOURCE_API_URL=<DD_SOURCE_API_URL> \
+ -e DD_DESTINATION_API_KEY=<DD_DESTINATION_API_KEY> \
+ -e DD_DESTINATION_APP_KEY=<DD_DESTINATION_APP_KEY> \
+ -e DD_DESTINATION_API_URL=<DD_DESTINATION_API_URL> \
+ datadog-sync:latest
+ ```
+The docker run command mounts a specified working directory to the container.
+
+
+## Usage
+
+These are the Available URL's for the source and destination API URLs when syncing your organization:
+
+| Site | Site URL | Site Parameter | Location |
+|---------|-----------------------------|---------------------|----------|
+| US1 | `https://app.datadoghq.com` | `datadoghq.com` | US |
+| US3 | `https://us3.datadoghq.com` | `us3.datadoghq.com` | US |
+| US5 | `https://us5.datadoghq.com` | `us5.datadoghq.com` | US |
+| EU1 | `https://app.datadoghq.eu` | `datadoghq.eu` | EU (Germany) |
+| US1-FED | `https://app.ddog-gov.com` | `ddog-gov.com` | US |
+| AP1 | `https://ap1.datadoghq.com` | `ap1.datadoghq.com` | Japan |
+
+
+For all available regions, see [Getting Started with Datadog Sites][1].
+
+
+### Syncing your resources
+
+1. Run the `import` command to read the specified resources from the source organization and store them locally into JSON files in the directory `resources/source`.
+
+2. Run the `sync` command which will use the stored files from previous `import` command to create/modify the resources on the destination organization. The pushed resources are saved in the directory resources/destination.
+ - (unless `--force-missing-dependencies flag is passed`)(`WHAT IS THIS REFERENING?`)
+
+3. The migrate command will run an `import` followed immediately by a `sync`.
+
+4. The reset command will delete resources at the destination; however, by default it backs up those resources first and fails if it cannot. You can (but probably shouldn't) skip the backup by using the --do-not-backup flag.
+
+ Note: The tool uses the resources directory as the source of truth for determining what resources need to be created and modified. Hence, this directory should not be removed or corrupted.
+
+Example usage:
+
+```shell
+# Import resources from parent organization and store them locally
+$ datadog-sync import \
+ --source-api-key="..." \
+ --source-app-key="..." \
+ --source-api-url="https://api.datadoghq.com" # this is an example of a source url, yours may be different
+
+> 2024-03-14 14:53:54,280 - INFO - Starting import...
+> ...
+> 2024-03-14 15:00:46,100 - INFO - Finished import
+
+# Check diff output to see what resources will be created/modified
+$ datadog-sync diffs \
+ --destination-api-key="..." \
+ --destination-app-key="..." \
+ --destination-api-url="https://api.datadoghq.eu" #this is an example of a destination url, yours may be different
+
+> 2024-03-14 15:46:22,014 - INFO - Starting diffs...
+> ...
+> 2024-03-14 14:51:15,379 - INFO - Finished diffs
+
+# Sync the resources to the child organization from locally stored files and save the output locally
+$ datadog-sync sync \
+ --destination-api-key="..." \
+ --destination-app-key="..." \
+ --destination-api-url="https://api.datadoghq.eu"
+
+> 2024-03-14 14:55:56,535 - INFO - Starting sync...
+> ...
+> 2024-03-14 14:56:00,797 - INFO - Finished sync: 1 successes, 0 errors
+```
+
+## Further Reading
+
+{{< partial name="whats-next/whats-next.html" >}}
+
+[1]: https://docs.datadoghq.com/getting_started/site/
-## Usage
\ No newline at end of file
From afd85885992ceffaac89dd6d66b53c26f01bd228 Mon Sep 17 00:00:00 2001
From: Ida Adjivon
Date: Fri, 13 Jun 2025 17:12:03 -0400
Subject: [PATCH 04/15] wip
---
.../en/account_management/guide/sync-cli.md | 53 +++++++++++++++++--
1 file changed, 48 insertions(+), 5 deletions(-)
diff --git a/content/en/account_management/guide/sync-cli.md b/content/en/account_management/guide/sync-cli.md
index 3f85cf1dce1..5fee5b51f26 100644
--- a/content/en/account_management/guide/sync-cli.md
+++ b/content/en/account_management/guide/sync-cli.md
@@ -22,7 +22,7 @@ You can determine the frequency and timing of syncing based on your business req
Datadog recommends syncing your accounts on a daily basis.
- Note: The datadog-sync-cli tool is used to migrate resources across organizations, regardless of datacenter. It cannot, nor is it intended to, transfer any ingested data, such as logs, metrics etc. The source organization will not be modified, but the destination organization will have resources created and updated by the sync command.
+ Note: The datadog-sync-cli tool is used to migrate resources across organizations, regardless of datacenter. It cannot, nor is it intended to, transfer any ingested data, such as logs, metrics etc. The source organization will not be modified, but the destination organization will have resources created and updated by the sync command.
## Setup
@@ -32,8 +32,11 @@ The `datadog-sync-cli` tool can be installed from:
- [using Docker and builing an image](#installing-using-docker)
### Installing from source
+
+{{% collapse-content title="Source installation steps" level="h5" expanded=true id="id-for-anchoring" %}}
Installing from source requires Python `v3.9+`
+
1. Clone the project repo and CD into the directory
```shell
@@ -52,10 +55,13 @@ Installing from source requires Python `v3.9+`
```shell
datadog-sync
```
+{{% /collapse-content %}}
### Installing from releases
+{{% collapse-content title="Releases installation steps" level="h5" expanded=true id="id-for-anchoring" %}}
+
{{< tabs >}}
{{% tab "MacOS and Linux" %}}
@@ -78,7 +84,7 @@ Installing from source requires Python `v3.9+`
[1]: https://github.com/DataDog/datadog-sync-cli/releases
{{% /tab %}}
-
+
{{% tab "Windows" %}}
1. Download the executable with extension `.exe` from the [Releases page][1]
@@ -94,8 +100,12 @@ Installing from source requires Python `v3.9+`
[2]: https://learn.microsoft.com/en-us/windows-server/administration/windows-commands/path
{{% /tab %}}
{{< /tabs >}}
+{{% /collapse-content %}}
### Installing using Docker
+
+{{% collapse-content title="Docker installation steps" level="h5" expanded=true id="id-for-anchoring" %}}
+
1. Clone the project repo and CD into the directory
```shell
@@ -121,6 +131,7 @@ Installing from source requires Python `v3.9+`
datadog-sync:latest
```
The docker run command mounts a specified working directory to the container.
+{{% /collapse-content %}}
## Usage
@@ -145,15 +156,15 @@ For all available regions, see [Getting Started with Datadog Sites][1].
1. Run the `import` command to read the specified resources from the source organization and store them locally into JSON files in the directory `resources/source`.
2. Run the `sync` command which will use the stored files from previous `import` command to create/modify the resources on the destination organization. The pushed resources are saved in the directory resources/destination.
- - (unless `--force-missing-dependencies flag is passed`)(`WHAT IS THIS REFERENING?`)
+ - (unless `--force-missing-dependencies flag is passed`)(`WHAT IS THIS REFERENCING?`)
3. The migrate command will run an `import` followed immediately by a `sync`.
4. The reset command will delete resources at the destination; however, by default it backs up those resources first and fails if it cannot. You can (but probably shouldn't) skip the backup by using the --do-not-backup flag.
- Note: The tool uses the resources directory as the source of truth for determining what resources need to be created and modified. Hence, this directory should not be removed or corrupted.
+ Note:
+The sync-cli tool uses the resources directory as the source of truth for determining what resources need to be created and modified. This directory should not be removed or corrupted.
-Example usage:
+### Example usage
```shell
# Import resources from parent organization and store them locally
@@ -187,15 +198,47 @@ $ datadog-sync sync \
> 2024-03-14 14:56:00,797 - INFO - Finished sync: 1 successes, 0 errors
```
+
+
+
+## Filtering the data collected
+Filtering is done on two levels, at top resources level and per individual resource level using `--resources` and `--filter` respectively.
+
+### Top resources level filtering
+By default all resources are imported, synced, etc. If you would like to perform actions on a specific top level resource, or subset of resources, use `--resources` option.
+For example, the command `datadog-sync import --resources="dashboard_lists,dashboards"` will import ALL dashboards and dashboard lists in your Datadog organization.
+### Per resource level filtering
+Individual resources can be further filtered using the `--filter `flag. For example, the following command `datadog-sync import --resources="dashboards,dashboard_lists" --filter='Type=dashboard_lists;Name=name;Value=My custom list'`, will import ALL dashboards and ONLY dashboard lists with the `name` attribute equal to `My custom list`.
+
+
+Filter option (`--filter`) accepts a string made up of `key=value` pairs separated by `;`.
+
+```
+--filter 'Type=<resource>;Name=<attribute_key>;Value=<attribute_value_regex>;Operator=<operator>'
+```
+
+### Available keys:
+`Type`
+: Resource such as Monitors, Dashboards, and more. [required]
+`Name`
+: Attribute key to filter on. This can be any attribute represented in dot notation (such as attributes.user_count). [required]
+`Value`
+: Regex to filter attribute value by. Note: special regex characters need to be escaped if filtering by raw string. [required]
+`Operator`
+: Available operators are below. All invalid operator's default to ExactMatch.
+- `Not`: Match not equal to Value.
+- `SubString` (_Deprecated_): Sub string matching. (This operator will be removed in future releases. See SubString and ExactMatch Deprecation section.)
+- `ExactMatch` (_Deprecated_): Exact string match. (This operator will be removed in future releases. See SubString and ExactMatch Deprecation section.)
+By default, if multiple filters are passed for the same resource, OR logic is applied to the filters. This behavior can be adjusted using the --filter-operator option.
From 8f2083a9f4e0b5b219154d5a047f7c3c1d09ece7 Mon Sep 17 00:00:00 2001
From: Ida Adjivon
Date: Mon, 30 Jun 2025 17:08:08 -0400
Subject: [PATCH 05/15] additional changes to the sync-cli doc
---
.../en/account_management/guide/sync-cli.md | 223 +++++++++++++-----
1 file changed, 166 insertions(+), 57 deletions(-)
diff --git a/content/en/account_management/guide/sync-cli.md b/content/en/account_management/guide/sync-cli.md
index 5fee5b51f26..64bb271e521 100644
--- a/content/en/account_management/guide/sync-cli.md
+++ b/content/en/account_management/guide/sync-cli.md
@@ -24,16 +24,78 @@ Datadog recommends syncing your accounts on a daily basis.
Note: The datadog-sync-cli tool is used to migrate resources across organizations, regardless of datacenter. It cannot, nor is it intended to, transfer any ingested data, such as logs, metrics etc. The source organization will not be modified, but the destination organization will have resources created and updated by the sync command.
-## Setup
+## Supported resources and site URLs
+
+Before you begin, confirm if both the **resource** you are migrating and the **source/destination API URLs** you are using are supported for the `sync-cli` tool:
+
+{{% collapse-content title="List of supported resources" level="h5" expanded=true id="id-for-resources" %}}
+
+| Resource | Description |
+|--------------------------------------- |-----------------------------------------------------------|
+| authn_mappings | Sync Datadog authn mappings. |
+| dashboard_lists | Sync Datadog dashboard lists. |
+| dashboards | Sync Datadog dashboards. |
+| downtime_schedules | Sync Datadog downtimes. |
+| downtimes (**deprecated**) | Sync Datadog downtimes. |
+| host_tags | Sync Datadog host tags. |
+| logs_archives | Sync Datadog logs archives. Requires GCP, Azure, or AWS integration.|
+| logs_archives_order | Sync Datadog logs archives order. |
+| logs_custom_pipelines (**deprecated**) | Sync Datadog logs custom pipelines. |
+| logs_indexes | Sync Datadog logs indexes. |
+| logs_indexes_order | Sync Datadog logs indexes order. |
+| logs_metrics | Sync Datadog logs metrics. |
+| logs_pipelines | Sync Datadog logs OOTB integration and custom pipelines. |
+| logs_pipelines_order | Sync Datadog logs pipelines order. |
+| logs_restriction_queries | Sync Datadog logs restriction queries. |
+| metric_percentiles | Sync Datadog metric percentiles. |
+| metric_tag_configurations | Sync Datadog metric tags configurations. |
+| metrics_metadata | Sync Datadog metric metadata. |
+| monitors | Sync Datadog monitors. |
+| notebooks | Sync Datadog notebooks. |
+| powerpacks | Sync Datadog powerpacks. |
+| restriction_policies | Sync Datadog restriction policies. |
+| roles | Sync Datadog roles. |
+| sensitive_data_scanner_groups | Sync Sensitive Data Scanner (SDS) groups. |
+| sensitive_data_scanner_groups_order | Sync SDS groups order. |
+| sensitive_data_scanner_rules | Sync SDS rules. |
+| service_level_objectives | Sync Datadog SLOs. |
+| slo_corrections | Sync Datadog SLO corrections. |
+| spans_metrics | Sync Datadog spans metrics. |
+| synthetics_global_variables | Sync Datadog synthetic global variables. |
+| synthetics_private_locations | Sync Datadog synthetic private locations. |
+| synthetics_tests | Sync Datadog synthetic tests. |
+| teams | Sync Datadog teams (excluding permissions). |
+| team_memberships | Sync Datadog team memberships. |
+| users | Sync Datadog users. |
+
+**Note:** `logs_custom_pipelines` resource has been deprecated in favor of `logs_pipelines` resource which supports both logs OOTB integration and custom pipelines. To migrate to the new resource, rename the existing state files from `logs_custom_pipelines.json` to `logs_pipelines.json` for both source and destination files.
+{{% /collapse-content %}}
+
+{{% collapse-content title="List of source and destination API URLs" level="h5" expanded=true id="id-for-resources" %}}
+These are the Available URL's for the source and destination API URLs when syncing your organization:
+
+| Site | Site URL | Site Parameter | Location |
+|---------|-----------------------------|---------------------|----------|
+| US1 | `https://app.datadoghq.com` | `datadoghq.com` | US |
+| US3 | `https://us3.datadoghq.com` | `us3.datadoghq.com` | US |
+| US5 | `https://us5.datadoghq.com` | `us5.datadoghq.com` | US |
+| EU1 | `https://app.datadoghq.eu` | `datadoghq.eu` | EU (Germany) |
+| US1-FED | `https://app.ddog-gov.com` | `ddog-gov.com` | US |
+| AP1 | `https://ap1.datadoghq.com` | `ap1.datadoghq.com` | Japan |
-The `datadog-sync-cli` tool can be installed from:
-- [source](#installing-from-source)
-- [releases](#installing-from-releases)
-- [using Docker and building an image](#installing-using-docker)
-### Installing from source
+For all available regions, see [Getting Started with Datadog Sites][1].
+[1]: https://docs.datadoghq.com/getting_started/site/
+{{% /collapse-content %}}
+
+
+
+## Install the datadog-sync-cli tool
+
+The `datadog-sync-cli` tool can be installed from [source](#installing-from-source), [releases](#installing-from-releases), and [using Docker and building an image](#installing-using-docker):
-{{% collapse-content title="Source installation steps" level="h5" expanded=true id="id-for-anchoring" %}}
+
+{{% collapse-content title="Installing from source" level="h5" expanded=true id="id-for-anchoring" %}}
Installing from source requires Python `v3.9+`
@@ -58,9 +120,7 @@ Installing from source requires Python `v3.9+`
{{% /collapse-content %}}
-### Installing from releases
-
-{{% collapse-content title="Releases installation steps" level="h5" expanded=true id="id-for-anchoring" %}}
+{{% collapse-content title="Installing from releases" level="h5" expanded=true id="id-for-anchoring" %}}
{{< tabs >}}
{{% tab "MacOS and Linux" %}}
@@ -102,9 +162,8 @@ Installing from source requires Python `v3.9+`
{{< /tabs >}}
{{% /collapse-content %}}
-### Installing using Docker
-{{% collapse-content title="Docker installation steps" level="h5" expanded=true id="id-for-anchoring" %}}
+{{% collapse-content title="Installing using Docker" level="h5" expanded=true id="id-for-anchoring" %}}
1. Clone the project repo and CD into the directory
@@ -133,30 +192,16 @@ Installing from source requires Python `v3.9+`
The docker run command mounts a specified working directory to the container.
{{% /collapse-content %}}
+
-## Usage
+## How to use the datadog-sync-cli tool
-These are the Available URL's for the source and destination API URLs when syncing your organization:
-
-| Site | Site URL | Site Parameter | Location |
-|---------|-----------------------------|---------------------|----------|
-| US1 | `https://app.datadoghq.com` | `datadoghq.com` | US |
-| US3 | `https://us3.datadoghq.com` | `us3.datadoghq.com` | US |
-| US5 | `https://us5.datadoghq.com` | `us5.datadoghq.com` | US |
-| EU1 | `https://app.datadoghq.eu` | `datadoghq.eu` | EU (Germany) |
-| US1-FED | `https://app.ddog-gov.com` | `ddog-gov.com` | US |
-| AP1 | `https://ap1.datadoghq.com` | `ap1.datadoghq.com` | Japan |
-
-
-For all available regions, see [Getting Started with Datadog Sites][1].
-
-
-### Syncing your resources
+### Steps to sync your resources
1. Run the `import` command to read the specified resources from the source organization and store them locally into JSON files in the directory `resources/source`.
2. Run the `sync` command which will use the stored files from previous `import` command to create/modify the resources on the destination organization. The pushed resources are saved in the directory resources/destination.
- - (unless `--force-missing-dependencies flag is passed`)(`WHAT IS THIS REFERENCING?`)
+ - (unless `--force-missing-dependencies` flag is passed)(`WHAT IS THIS REFERENCING?`)
3. The migrate command will run an `import` followed immediately by a `sync`.
@@ -166,6 +211,8 @@ For all available regions, see [Getting Started with Datadog Sites][1].
### Example usage
+This example uses US1 as the source URL and the EU as the destination URL. Your source and destinations may be different. See the list of [supported source and destination API URLs](#supported-resources-and-site-urls) for more information.
+
```shell
# Import resources from parent organization and store them locally
$ datadog-sync import \
@@ -198,54 +245,116 @@ $ datadog-sync sync \
> 2024-03-14 14:56:00,797 - INFO - Finished sync: 1 successes, 0 errors
```
-
-
-
-## Filtering the data collected
-Filtering is done on two levels, at top resources level and per individual resource level using `--resources` and `--filter` respectively.
+### Filter the data collected
+By default all resources are imported and synced. You can use the filtering option to specify what resources are migrated. Filtering is done on two levels:
+- [top resources level filtering](#top-resources-level-filtering) (`--resources` filter option)
+- [individual resource level filtering](#per-resource-level-filtering) (`--filter` filter option)
-
-### Top resources level filtering
+#### Top resources level filtering
By default all resources are imported, synced, etc. If you would like to perform actions on a specific top level resource, or subset of resources, use `--resources` option.
For example, the command `datadog-sync import --resources="dashboard_lists,dashboards"` will import ALL dashboards and dashboard lists in your Datadog organization.
-### Per resource level filtering
-Individual resources can be further filtered using the `--filter `flag. For example, the following command `datadog-sync import --resources="dashboards,dashboard_lists" --filter='Type=dashboard_lists;Name=name;Value=My custom list'`, will import ALL dashboards and ONLY dashboard lists with the `name` attribute equal to `My custom list`.
+#### Individual resource level filtering
+Individual resources can be further filtered using the `--filter `flag. For example, the following command will import ALL dashboards and ONLY dashboard lists with the `name` attribute equal to `My custom list`:
+```shell
+datadog-sync import --resources="dashboards,dashboard_lists" --filter='Type=dashboard_lists;Name=name;Value=My custom list'
+```
-Filter option (`--filter`) accepts a string made up of `key=value` pairs separated by `;`.
+The filter option `--filter` accepts a string made up of `key=value` pairs separated by `;` as seen in the example here:
-```
+```shell
--filter 'Type=<resource>;Name=<attribute_key>;Value=<attribute_value_regex>;Operator=<operator>'
```
-### Available keys:
+##### SubString and ExactMatch Deprecation
-`Type`
-: Resource such as Monitors, Dashboards, and more. [required]
+In future releases the `SubString` and `ExactMatch` Operator will be removed in favor of the `Value` key. This is because the `Value` key supports regex so both of these scenarios are covered by just writing the appropriate regex. Below is an example:
-`Name`
-: Attribute key to filter on. This can be any attribute represented in dot notation (such as attributes.user_count). [required]
-
-`Value`
-: Regex to filter attribute value by. Note: special regex characters need to be escaped if filtering by raw string. [required]
-
-`Operator`
-: Available operators are below. All invalid operator's default to ExactMatch.
-- `Not`: Match not equal to Value.
-- `SubString` (_Deprecated_): Sub string matching. (This operator will be removed in future releases. See SubString and ExactMatch Deprecation section.)
-- `ExactMatch` (_Deprecated_): Exact string match. (This operator will be removed in future releases. See SubString and ExactMatch Deprecation section.)
-
-By default, if multiple filters are passed for the same resource, OR logic is applied to the filters. This behavior can be adjusted using the --filter-operator option.
+Let's take the scenario where you would like to filter for monitors that have the `filter test` in the `name` attribute:
+| Operator | Command |
+| ----------------| ---------------------------------------------------------------------- |
+| `SubString` | `--filter 'Type=monitors;Name=name;Value=filter test;Operator=SubString'` |
+| Using `Value` | `--filter 'Type=monitors;Name=name;Value=.*filter test.*` |
+| `ExactMatch` | `--filter 'Type=monitors;Name=name;Value=filter test;Operator=ExactMatch'` |
+| Using `Value` | `--filter 'Type=monitors;Name=name;Value=^filter test$` |
+### Available keys
+**Type** `required`
+: Resource such as Monitors, Dashboards, and more.
+`Name` [required]
+: Attribute key to filter on. This can be any attribute represented in dot notation such as `attributes.user_count`.
+`Value` [required]
+: Regex to filter attribute value by. Note: special regex characters need to be escaped if filtering by raw string.
+`Operator`
+: All invalid operator's default to ExactMatch. Available operators are:
+: - `Not`: Match not equal to Value.
+: - `SubString` (_Deprecated_): Sub string matching. This operator will be removed in future releases. See SubString and ExactMatch Deprecation section.
+: - `ExactMatch` (_Deprecated_): Exact string match. This operator will be removed in future releases. See SubString and ExactMatch Deprecation section.
+
+By default, if multiple filters are passed for the same resource, OR logic is applied to the filters. This behavior can be adjusted using the `--filter-operator` option.
+
+
+
+
+
+
+## Best practices
+
+Many Datadog resources are interdependent. For example, some Datadog resource can reference `roles` and `dashboards`, which includes widgets that may use Monitors or Synthetics data. The datadog-sync tool syncs these resources in order to ensure dependencies are not broken.
+
+When importing or syncing a subset of resources, ensure that dependent resources are imported and synced as well.
+
+See the list of potential resource dependencies below.
+
+{{% collapse-content title="Potential resources dependencies" level="h5" expanded=true id="id-for-resources" %}}
+
+| Resource | Dependencies |
+|----------------------------------------|------------------------------------------------------------------|
+| authn_mappings | roles, teams |
+| dashboard_lists | dashboards |
+| dashboards | monitors, roles, powerpacks, service_level_objectives |
+| downtime_schedules | monitors |
+| downtimes (**deprecated**) | monitors |
+| host_tags | - |
+| logs_archives | - (Requires manual setup of AWS, GCP or Azure integration) |
+| logs_archives_order | logs_archives |
+| logs_custom_pipelines (**deprecated**) | - |
+| logs_indexes | - |
+| logs_indexes_order | logs_indexes |
+| logs_metrics | - |
+| logs_pipelines | - |
+| logs_pipelines_order | logs_pipelines |
+| logs_restriction_queries | roles |
+| metric_percentiles | - |
+| metric_tag_configurations | - |
+| metrics_metadata | - |
+| monitors | roles, service_level_objectives |
+| notebooks | - |
+| powerpacks | monitors, service_level_objectives |
+| restriction_policies | dashboards, service_level_objectives, notebooks, users, roles |
+| roles | - |
+| sensitive_data_scanner_groups | - |
+| sensitive_data_scanner_groups_order | sensitive_data_scanner_groups |
+| sensitive_data_scanner_rules | sensitive_data_scanner_groups |
+| service_level_objectives | monitors, synthetics_tests |
+| slo_corrections | service_level_objectives |
+| spans_metrics | - |
+| synthetics_global_variables | synthetics_tests |
+| synthetics_private_locations | - |
+| synthetics_tests | synthetics_private_locations, synthetics_global_variables, roles |
+| teams | - |
+| team_memberships | teams, users |
+| users | roles |
+{{% /collapse-content %}}
## Further Reading
From 79e7ea4db46fa2fd5c0af67182ddf7b0df72f3ca Mon Sep 17 00:00:00 2001
From: Ida Adjivon
Date: Tue, 1 Jul 2025 18:39:55 -0400
Subject: [PATCH 06/15] completed migration of content to vscode
---
.../en/account_management/guide/sync-cli.md | 65 +++++++++++++++----
1 file changed, 51 insertions(+), 14 deletions(-)
diff --git a/content/en/account_management/guide/sync-cli.md b/content/en/account_management/guide/sync-cli.md
index 64bb271e521..69b6c36a503 100644
--- a/content/en/account_management/guide/sync-cli.md
+++ b/content/en/account_management/guide/sync-cli.md
@@ -245,6 +245,7 @@ $ datadog-sync sync \
> 2024-03-14 14:56:00,797 - INFO - Finished sync: 1 successes, 0 errors
```
+
### Filter the data collected
By default all resources are imported and synced. You can use the filtering option to specify what resources are migrated. Filtering is done on two levels:
- [top resources level filtering](#top-resources-level-filtering) (`--resources` filter option)
@@ -268,32 +269,35 @@ The filter option `--filter` accepts a string made up of `key=value` pairs separ
```shell
--filter 'Type=<resource>;Name=<attribute_key>;Value=<attribute_value_regex>;Operator=<operator>'
```
+
##### SubString and ExactMatch Deprecation
In future releases the `SubString` and `ExactMatch` Operator will be removed in favor of the `Value` key. This is because the `Value` key supports regex so both of these scenarios are covered by just writing the appropriate regex. Below is an example:
-Let's take the scenario where you would like to filter for monitors that have the `filter test` in the `name` attribute:
+For example, if you would like to filter for monitors that have the `filter test` in the `name` attribute:
+
+| Operator | Command |
+| -------------------| ------------------------------------------------------------------------------|
+| `SubString` | `--filter 'Type=monitors;Name=name;Value=filter test;Operator=SubString'` |
+| Using `Value` | `--filter 'Type=monitors;Name=name;Value=.*filter test.*` |
+| `ExactMatch` | `--filter 'Type=monitors;Name=name;Value=filter test;Operator=ExactMatch'` |
+| Using `Value` | `--filter 'Type=monitors;Name=name;Value=^filter test$` |
-| Operator | Command |
-| ----------------| ---------------------------------------------------------------------- |
-| `SubString` | `--filter 'Type=monitors;Name=name;Value=filter test;Operator=SubString'` |
-| Using `Value` | `--filter 'Type=monitors;Name=name;Value=.*filter test.*` |
-| `ExactMatch` | `--filter 'Type=monitors;Name=name;Value=filter test;Operator=ExactMatch'` |
-| Using `Value` | `--filter 'Type=monitors;Name=name;Value=^filter test$` |
+
### Available keys
-**Type** `required`
+Type `REQUIRED`
: Resource such as Monitors, Dashboards, and more.
-`Name` [required]
+Name `REQUIRED`
: Attribute key to filter on. This can be any attribute represented in dot notation such as `attributes.user_count`.
-`Value` [required]
+Value `REQUIRED`
: Regex to filter attribute value by. Note: special regex characters need to be escaped if filtering by raw string.
-`Operator`
+Operator
: All invalid operator's default to ExactMatch. Available operators are:
: - `Not`: Match not equal to Value.
: - `SubString` (_Deprecated_): Sub string matching. This operator will be removed in future releases. See SubString and ExactMatch Deprecation section.
@@ -302,19 +306,52 @@ Let's take the scenario where you would like to filter for monitors that have th
By default, if multiple filters are passed for the same resource, OR logic is applied to the filters. This behavior can be adjusted using the `--filter-operator` option.
+## Additional configurations
+### Using custom configuration instead of options
+You can use a custom configuration text file in place of using `options`. This is an example config file for a `US1` source URL and `EU` destination URL:
+
+```shell
+destination_api_url="https://api.datadoghq.eu"
+destination_api_key=""
+destination_app_key=""
+source_api_key=""
+source_app_key=""
+source_api_url="https://api.datadoghq.com"
+filter=["Type=Dashboards;Name=title;Value=Test screenboard", "Type=Monitors;Name=tags;Value=sync:true"]
+```
+Then, run:
+```shell
+datadog-sync import --config config
+```
+### Using the cleanup flag to sync resource deletions from the source organization
+The tool's `sync` command provides a cleanup flag (`--cleanup`). Passing the cleanup flag will delete resources from the destination organization which have been removed from the source organization. The resources to be deleted are determined based on the difference between the [state files](#) of source and destination organization.
-## Best practices
+For example, say `ResourceA` and `ResourceB` are imported and synced, and `ResourceA` is then deleted from the source organization. Running the `import` command updates the source organization's state file to include only `ResourceB`. A subsequent `sync --cleanup=Force` command then deletes `ResourceA` from the destination organization, as sketched below.
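+
+A minimal sketch of that flow (keys are redacted, and the URLs are examples):
+
+```shell
+# Re-import so the source state file reflects the deletion
+datadog-sync import \
+  --source-api-key="..." \
+  --source-app-key="..." \
+  --source-api-url="https://api.datadoghq.com"
+
+# Sync with cleanup to remove the deleted resource from the destination
+datadog-sync sync --cleanup=Force \
+  --destination-api-key="..." \
+  --destination-app-key="..." \
+  --destination-api-url="https://api.datadoghq.eu"
+```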
-Many Datadog resources are interdependent. For example, some Datadog resource can reference `roles` and `dashboards`, which includes widgets that may use Monitors or Synthetics data. The datadog-sync tool syncs these resources in order to ensure dependencies are not broken.
+
+### Verify your Datadog disaster recovery (DDR) status
+
+By default, all commands check the Datadog Disaster Recovery (DDR) status of both the source and destination organizations before running. This behavior is controlled by the boolean flag `--verify-ddr-status` or the environment variable `DD_VERIFY_DDR_STATUS`.
+
+
+### State files: avoiding data duplication while keeping data separation
+
+By default, a `resources` directory is generated in the current working directory of the user. This directory contains the JSON mapping of resources between the source and destination organizations. To avoid duplication and loss of these mappings, retain this directory between runs of the tool. To override the default paths, use the `--source-resources-path` and `--destination-resources-path` options.
+
+When running against multiple destination organizations, use a separate working directory for each destination to ensure separation of data.
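+
+For example, one way to keep the mappings separate when syncing one source to two destination organizations is to run the tool from a dedicated working directory per destination (a sketch; the directory and config file names are illustrative):
+
+```shell
+mkdir -p org-a org-b
+(cd org-a && datadog-sync import --config ../config-org-a && datadog-sync sync --config ../config-org-a)
+(cd org-b && datadog-sync import --config ../config-org-b && datadog-sync sync --config ../config-org-b)
+```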
+
+## Best practices
+### Resource subsets must be migrated with their dependencies
+Many Datadog resources are interdependent. For example, some Datadog resources can reference `roles`, and `dashboards` include widgets that may use monitors or synthetics data. The datadog-sync tool syncs these resources in an order that ensures dependencies are not broken.
When importing or syncing a subset of resources, ensure that dependent resources are imported and synced as well.
See the list of potential resource dependencies below.
-{{% collapse-content title="Potential resources dependencies" level="h5" expanded=true id="id-for-resources" %}}
+{{% collapse-content title="List of potential resources dependencies" level="h5" expanded=true id="id-for-resources" %}}
| Resource | Dependencies |
|----------------------------------------|------------------------------------------------------------------|
From 54ea4636fccf2d5c15d454a584b8dbe3643aa9af Mon Sep 17 00:00:00 2001
From: Ida Adjivon
Date: Thu, 3 Jul 2025 14:26:07 -0400
Subject: [PATCH 07/15] overview edit
---
content/en/account_management/guide/sync-cli.md | 6 +++---
1 file changed, 3 insertions(+), 3 deletions(-)
diff --git a/content/en/account_management/guide/sync-cli.md b/content/en/account_management/guide/sync-cli.md
index 69b6c36a503..d2771ee6b0c 100644
--- a/content/en/account_management/guide/sync-cli.md
+++ b/content/en/account_management/guide/sync-cli.md
@@ -26,7 +26,7 @@ Datadog recommends syncing your accounts on a daily basis.
## Supported resources and site URLs
-Before you begin, confirm if both the **resource** you are migrating and the **source/destination API URLs** you are using are supported for the `sync-cli` tool:
+Before you begin, confirm that both the **resource** you are migrating and the **source/destination API URLs** you are using are supported by the `sync-cli` tool:
{{% collapse-content title="List of supported resources" level="h5" expanded=true id="id-for-resources" %}}
@@ -68,7 +68,7 @@ Before you begin, confirm if both the **resource** you are migrating and the **s
| team_memberships | Sync Datadog team memberships. |
| users | Sync Datadog users. |
-**Note:** `logs_custom_pipelines` resource has been deprecated in favor of `logs_pipelines` resource which supports both logs OOTB integration and custom pipelines. To migrate to the new resource, rename the existing state files from `logs_custom_pipelines.json` to `logs_pipelines.json` for both source and destination files.
+**Note:** The `logs_custom_pipelines` resource has been deprecated in favor of the `logs_pipelines` resource, which supports both out-of-the-box (OOTB) integration pipelines and custom pipelines. To migrate to the new resource, rename the existing state files from `logs_custom_pipelines.json` to `logs_pipelines.json` for both the source and destination.
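+
+A minimal sketch of that rename, assuming the default `resources/source` and `resources/destination` directories:
+
+```shell
+mv resources/source/logs_custom_pipelines.json resources/source/logs_pipelines.json
+mv resources/destination/logs_custom_pipelines.json resources/destination/logs_pipelines.json
+```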
{{% /collapse-content %}}
{{% collapse-content title="List of source and destination API URLs" level="h5" expanded=true id="id-for-resources" %}}
@@ -286,7 +286,7 @@ For example, if you would like to filter for monitors that have the `filter test
-### Available keys
+### List of available keys
Type `REQUIRED`
: Resource such as Monitors, Dashboards, and more.
From 9e69e39f7f9b799f8fb4e3d1303cc4db0d1ef703 Mon Sep 17 00:00:00 2001
From: Ida Adjivon
Date: Thu, 3 Jul 2025 14:29:10 -0400
Subject: [PATCH 08/15] supported resources
---
content/en/account_management/guide/sync-cli.md | 3 ++-
1 file changed, 2 insertions(+), 1 deletion(-)
diff --git a/content/en/account_management/guide/sync-cli.md b/content/en/account_management/guide/sync-cli.md
index d2771ee6b0c..17624b186fb 100644
--- a/content/en/account_management/guide/sync-cli.md
+++ b/content/en/account_management/guide/sync-cli.md
@@ -72,7 +72,7 @@ Before you begin, confirm that both the **resource** you are migrating and the *
{{% /collapse-content %}}
{{% collapse-content title="List of source and destination API URLs" level="h5" expanded=true id="id-for-resources" %}}
-These are the Available URL's for the source and destination API URLs when syncing your organization:
+These are the supported URL's for the source and destination API URLs when syncing your organization:
| Site | Site URL | Site Parameter | Location |
|---------|-----------------------------|---------------------|----------|
@@ -85,6 +85,7 @@ These are the Available URL's for the source and destination API URLs when synci
For all available regions, see [Getting Started with Datadog Sites][1].
+
[1]: https://docs.datadoghq.com/getting_started/site/
{{% /collapse-content %}}
From 001564fd2474990e88a466ffa391ee52dfc1c9a1 Mon Sep 17 00:00:00 2001
From: Ida Adjivon
Date: Thu, 3 Jul 2025 14:55:32 -0400
Subject: [PATCH 09/15] how to use edit
---
content/en/account_management/guide/sync-cli.md | 14 ++++++++------
1 file changed, 8 insertions(+), 6 deletions(-)
diff --git a/content/en/account_management/guide/sync-cli.md b/content/en/account_management/guide/sync-cli.md
index 17624b186fb..9604c4f9da9 100644
--- a/content/en/account_management/guide/sync-cli.md
+++ b/content/en/account_management/guide/sync-cli.md
@@ -190,25 +190,27 @@ Installing from source requires Python `v3.9+`
-e DD_DESTINATION_API_URL=<DD_DESTINATION_API_URL> \
datadog-sync:latest
```
-The docker run command mounts a specified working directory to the container.
+The docker run command mounts the specified `<PATH_TO_WORKING_DIR>` working directory to the container.
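+
+For example, an import run through the container might look like the following sketch (the mounted path and credentials are placeholders):
+
+```shell
+docker run --rm -v <PATH_TO_WORKING_DIR>:/datadog-sync:rw \
+    -e DD_SOURCE_API_KEY=<DD_SOURCE_API_KEY> \
+    -e DD_SOURCE_APP_KEY=<DD_SOURCE_APP_KEY> \
+    -e DD_SOURCE_API_URL="https://api.datadoghq.com" \
+    datadog-sync:latest import
+```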
{{% /collapse-content %}}
## How to use the datadog-sync-cli tool
+ Note: The sync-cli tool uses the resources directory as the source of truth for determining what resources need to be created and modified. This directory should not be removed or corrupted.
+
### Steps to sync your resources
1. Run the `import` command to read the specified resources from the source organization and store them locally into JSON files in the directory `resources/source`.
-2. Run the `sync` command which will use the stored files from previous `import` command to create/modify the resources on the destination organization. The pushed resources are saved in the directory resources/destination.
- - (unless `--force-missing-dependencies` flag is passed)(`WHAT IS THIS REFERENCING?`)
+2. Run the `sync` command to use the stored files from the previous `import` command (unless the `--force-missing-dependencies` flag is passed) to create or modify the resources on the destination organization. The pushed resources are saved in the directory `resources/destination`.
+ - If the `--force-missing-dependencies` flag is passed, the sync also imports and pushes resources that the requested resources depend on, even if those dependencies were not part of the stored import.
-3. The migrate command will run an `import` followed immediately by a `sync`.
+3. The `migrate` command runs an `import` followed immediately by a `sync`.
-4. The reset command will delete resources at the destination; however, by default it backs up those resources first and fails if it cannot. You can (but probably shouldn't) skip the backup by using the --do-not-backup flag.
+4. The `reset` command deletes resources at the destination; however, by default it backs up those resources first and fails if it cannot back them up.
+ - **NOT recommended:** The backup step can be skipped by using the `--do-not-backup` flag.
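+
+For example, a single end-to-end `migrate` run against both organizations might look like the following sketch (keys are redacted, and the URLs are examples):
+
+```shell
+datadog-sync migrate \
+  --source-api-key="..." \
+  --source-app-key="..." \
+  --source-api-url="https://api.datadoghq.com" \
+  --destination-api-key="..." \
+  --destination-app-key="..." \
+  --destination-api-url="https://api.datadoghq.eu"
+```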
- Note:
-The sync-cli tool uses the resources directory as the source of truth for determining what resources need to be created and modified. This directory should not be removed or corrupted.
### Example usage
From 82e1c0ed14edada1b035955df912d35464ca2588 Mon Sep 17 00:00:00 2001
From: Ida Adjivon
Date: Thu, 3 Jul 2025 15:08:12 -0400
Subject: [PATCH 10/15] changes the site URL to API URL
---
.../en/account_management/guide/sync-cli.md | 22 +++++++++----------
1 file changed, 11 insertions(+), 11 deletions(-)
diff --git a/content/en/account_management/guide/sync-cli.md b/content/en/account_management/guide/sync-cli.md
index 9604c4f9da9..09f56bba021 100644
--- a/content/en/account_management/guide/sync-cli.md
+++ b/content/en/account_management/guide/sync-cli.md
@@ -71,17 +71,17 @@ Before you begin, confirm that both the **resource** you are migrating and the *
**Note:** `logs_custom_pipelines` resource has been deprecated in favor of `logs_pipelines` resource which supports both OOTB integration and custom pipelines. To migrate to the new resource, rename the existing state files from `logs_custom_pipelines.json` to `logs_pipelines.json` for both source and destination files.
{{% /collapse-content %}}
-{{% collapse-content title="List of source and destination API URLs" level="h5" expanded=true id="id-for-resources" %}}
-These are the supported URL's for the source and destination API URLs when syncing your organization:
+{{% collapse-content title="List of supported API URLs" level="h5" expanded=true id="id-for-resources" %}}
+These are the supported source and destination API URLs when syncing your organization:
-| Site | Site URL | Site Parameter | Location |
-|---------|-----------------------------|---------------------|----------|
-| US1 | `https://app.datadoghq.com` | `datadoghq.com` | US |
-| US3 | `https://us3.datadoghq.com` | `us3.datadoghq.com` | US |
-| US5 | `https://us5.datadoghq.com` | `us5.datadoghq.com` | US |
-| EU1 | `https://app.datadoghq.eu` | `datadoghq.eu` | EU (Germany) |
-| US1-FED | `https://app.ddog-gov.com` | `ddog-gov.com` | US |
-| AP1 | `https://ap1.datadoghq.com` | `ap1.datadoghq.com` | Japan |
+| Site | API URL | Site Parameter | Location |
+|---------|---------------------------------|---------------------|----------|
+| US1 | `https://api.datadoghq.com` | `datadoghq.com` | US |
+| US3 | `https://api.us3.datadoghq.com` | `us3.datadoghq.com` | US |
+| US5 | `https://api.us5.datadoghq.com` | `us5.datadoghq.com` | US |
+| EU1 | `https://api.datadoghq.eu` | `datadoghq.eu` | EU (Germany) |
+| US1-FED | `https://api.ddog-gov.com` | `ddog-gov.com` | US |
+| AP1 | `https://api.ap1.datadoghq.com` | `ap1.datadoghq.com` | Japan |
For all available regions, see [Getting Started with Datadog Sites][1].
@@ -214,7 +214,7 @@ The docker run command mounts a specified `` working direct
### Example usage
-This example uses US1 as the source URL and the EU as the destination URL. Your source and destinations may be different. See the list of [supported source and destination API URLs](#supported-resources-and-site-urls) for more information.
+This example uses the **US1 site URL** as the source URL and the **EU site URL** as the destination URL. Your source and destinations may be different. See the list of [supported source and destination API URLs](#supported-resources-and-site-urls) for more information.
```shell
# Import resources from parent organization and store them locally
From 0515bc43bc4ba655e326e11c3e6b2e82ab2ba773 Mon Sep 17 00:00:00 2001
From: Ida Adjivon
Date: Thu, 3 Jul 2025 15:16:12 -0400
Subject: [PATCH 11/15] example usage edit
---
content/en/account_management/guide/sync-cli.md | 4 ++--
1 file changed, 2 insertions(+), 2 deletions(-)
diff --git a/content/en/account_management/guide/sync-cli.md b/content/en/account_management/guide/sync-cli.md
index 09f56bba021..95b9e28f68c 100644
--- a/content/en/account_management/guide/sync-cli.md
+++ b/content/en/account_management/guide/sync-cli.md
@@ -214,7 +214,7 @@ The docker run command mounts a specified `` working direct
### Example usage
-This example uses the **US1 site URL** as the source URL and the **EU site URL** as the destination URL. Your source and destinations may be different. See the list of [supported source and destination API URLs](#supported-resources-and-site-urls) for more information.
+This example uses the **US1 API URL** for the source and the **EU API URL** for the destination of this data migration. Your source and destination may be different. See the list of [supported source and destination API URLs](#supported-resources-and-site-urls) for more information.
```shell
# Import resources from parent organization and store them locally
@@ -237,7 +237,7 @@ $ datadog-sync diffs \
> ...
> 2024-03-14 14:51:15,379 - INFO - Finished diffs
-# Sync the resources to the child organization from locally stored files and save the output locally
+# Sync the resources to the destination organization from locally stored files and save the output locally
$ datadog-sync sync \
--destination-api-key="..." \
--destination-app-key="..." \
From 65a81dd10e8551f262a30098a2627b8d800f3903 Mon Sep 17 00:00:00 2001
From: Ida Adjivon
Date: Thu, 3 Jul 2025 16:24:57 -0400
Subject: [PATCH 12/15] first revision completed
---
.../en/account_management/guide/sync-cli.md | 88 +++++++++++--------
1 file changed, 52 insertions(+), 36 deletions(-)
diff --git a/content/en/account_management/guide/sync-cli.md b/content/en/account_management/guide/sync-cli.md
index 95b9e28f68c..861ec37d3f8 100644
--- a/content/en/account_management/guide/sync-cli.md
+++ b/content/en/account_management/guide/sync-cli.md
@@ -24,6 +24,8 @@ Datadog recommends syncing your accounts on a daily basis.
Note: The datadog-sync-cli tool is used to migrate resources across organizations, regardless of datacenter. It cannot, nor is it intended to, transfer any ingested data, such as logs, metrics etc. The source organization will not be modified, but the destination organization will have resources created and updated by the sync command.
+
+
## Supported resources and site URLs
Before you begin, confirm that both the **resource** you are migrating and the **source/destination API URLs** you are using are supported by the `sync-cli` tool:
@@ -214,7 +216,7 @@ The docker run command mounts a specified `` working direct
### Example usage
-This example uses the **US1 API URL** for the source and the **EU API URL** for the destination of this data migration. Your source and destination may be different. See the list of [supported source and destination API URLs](#supported-resources-and-site-urls) for more information.
+This example uses the `US1` API URL for the source and the `EU1` API URL for the destination of this data migration. Your source and destination URLs may be different. See the list of [supported source and destination API URLs](#supported-resources-and-site-urls) for more information.
```shell
# Import resources from parent organization and store them locally
@@ -247,50 +249,36 @@ $ datadog-sync sync \
> ...
> 2024-03-14 14:56:00,797 - INFO - Finished sync: 1 successes, 0 errors
```
+
+## Filter the data collected
+By default, all resources are imported and synced. You can use the filtering options to specify which resources are migrated. Filtering can be done on two levels:
+- [Top resources level](#top-resources-level-filtering)
+- [Individual resource level](#individual-resource-level-filtering)
-### Filter the data collected
-By default all resources are imported and synced. You can use the filtering option to specify what resources are migrated. Filtering is done on two levels:
-- [top resources level filtering](#top-resources-level-filtering) (`--resources` filter option)
-- [individual resource level filtering](#per-resource-level-filtering) (`--filter` filter option)
-
-#### Top resources level filtering
-By default all resources are imported, synced, etc. If you would like to perform actions on a specific top level resource, or subset of resources, use `--resources` option.
-
-For example, the command `datadog-sync import --resources="dashboard_lists,dashboards"` will import ALL dashboards and dashboard lists in your Datadog organization.
+### Top resources level filtering
+To perform actions on a specific top-level resource, or a subset of resources, use the `--resources` option. For example, this command imports **ALL** dashboards and dashboard lists in your Datadog organization:
+```shell
+datadog-sync import --resources="dashboard_lists,dashboards"
+```
-#### Individual resource level filtering
-Individual resources can be further filtered using the `--filter `flag. For example, the following command will import ALL dashboards and ONLY dashboard lists with the `name` attribute equal to `My custom list`:
+### Individual resource level filtering
+Individual resources can be further filtered using the `--filter` flag. For example, the following command imports **ALL** dashboards and **ONLY** dashboard lists with the `name` attribute equal to `My custom list`:
```shell
datadog-sync import --resources="dashboards,dashboard_lists" --filter='Type=dashboard_lists;Name=name;Value=My custom list'
```
-The filter option `--filter` accepts a string made up of `key=value` pairs separated by `;` as seen in the example here:
+The `--filter` option accepts a string of `key=value` pairs separated by `;` as seen in this example:
```shell
--filter 'Type=<resource>;Name=<attribute_key>;Value=<attribute_value_regex>;Operator=<operator>'
```
-##### SubString and ExactMatch Deprecation
-
-In future releases the `SubString` and `ExactMatch` Operator will be removed in favor of the `Value` key. This is because the `Value` key supports regex so both of these scenarios are covered by just writing the appropriate regex. Below is an example:
-
-For example, if you would like to filter for monitors that have the `filter test` in the `name` attribute:
-
-| Operator | Command |
-| -------------------| ------------------------------------------------------------------------------|
-| `SubString` | `--filter 'Type=monitors;Name=name;Value=filter test;Operator=SubString'` |
-| Using `Value` | `--filter 'Type=monitors;Name=name;Value=.*filter test.*` |
-| `ExactMatch` | `--filter 'Type=monitors;Name=name;Value=filter test;Operator=ExactMatch'` |
-| Using `Value` | `--filter 'Type=monitors;Name=name;Value=^filter test$` |
-
-
-
-### List of available keys
-
+#### List of available keys
+
-By default, if multiple filters are passed for the same resource, OR logic is applied to the filters. This behavior can be adjusted using the `--filter-operator` option.
+| Key | Description |
+| ------------------------|---------------------------------------------------------------|
+|**Type** (required) |Resource such as Monitors, Dashboards, and more. |
+|**Name** (required) |Attribute key to filter on. This can be any attribute represented in dot notation such as `attributes.user_count`. |
+|**Value** (required) |Regex to filter attribute value by. Note: special regex characters need to be escaped if filtering by raw string. |
+|**Operator** |Any invalid operator defaults to `ExactMatch`. Available operators are:
- `Not`: Match not equal to Value.
- `SubString` (_Deprecated_): Sub string matching. This operator will be removed in future releases. See the [SubString and ExactMatch Deprecation](#substring-and-exactmatch-deprecation) section.
- `ExactMatch` (_Deprecated_): Exact string match. This operator will be removed in future releases. See the [SubString and ExactMatch Deprecation](#substring-and-exactmatch-deprecation) section. |
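+
+For example, assuming the `Not` operator behaves as described above, a filter like the following would import only the monitors whose `name` does not match `filter test`:
+```shell
+datadog-sync import --resources="monitors" --filter='Type=monitors;Name=name;Value=filter test;Operator=Not'
+```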
+
+By default, if multiple filters are passed for the same resource, **OR** logic is applied to the filters. This behavior can be adjusted using the `--filter-operator` option.
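+
+For example, assuming `AND` is an accepted value for `--filter-operator`, a command like the following would import only the dashboard lists whose `name` matches both patterns:
+```shell
+datadog-sync import --resources="dashboard_lists" \
+  --filter='Type=dashboard_lists;Name=name;Value=^My' \
+  --filter='Type=dashboard_lists;Name=name;Value=custom list$' \
+  --filter-operator="AND"
+```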
+
+
+
+
+#### SubString and ExactMatch Deprecation
+
+In a future release, the `SubString` and `ExactMatch` operators will be removed in favor of the `Value` key. Because the `Value` key supports regex, both scenarios can be covered by writing the appropriate regex.
+
+This example shows the difference in syntax when using `Value` to filter for monitors that have `filter test` in the `name` attribute:
+
+| Operator | Command |
+| -------------------| ------------------------------------------------------------------------------|
+| `SubString` | `--filter 'Type=monitors;Name=name;Value=filter test;Operator=SubString'` |
+| Using `Value` | `--filter 'Type=monitors;Name=name;Value=.*filter test.*'` |
+| `ExactMatch` | `--filter 'Type=monitors;Name=name;Value=filter test;Operator=ExactMatch'` |
+| Using `Value` | `--filter 'Type=monitors;Name=name;Value=^filter test$'` |
+
+
+
## Additional configurations
### Using custom configuration instead of options
-You can use a custom configuration text file in place of using `options`. This is an example config file for a `US1` source URL and `EU` destination URL:
+You can use a custom configuration text file in place of using filtering options. This is an example config file for a `US1` source URL and `EU` destination URL:
```shell
destination_api_url="https://api.datadoghq.eu"
@@ -330,9 +344,9 @@ datadog-sync import --config config
```
### Using the cleanup flag to sync changes from the source to the destination
-The tool's `sync` command provides a cleanup flag (`--cleanup`). Passing the cleanup flag will delete resources from the destination organization which have been removed from the source organization. The resources to be deleted are determined based on the difference between the [state files](#) of source and destination organization.
+The `sync` command provides a cleanup flag (`--cleanup`). Passing the cleanup flag ensures deleted resources from the source are also removed from the destination organization. The resources to be deleted are determined by the differences in the [state files](#state-files---how-to-avoid-data-duplication-while-keeping-data-separation) of source and destination organizations.
-For example, `ResourceA` and `ResourceB` are imported and synced, followed by deleting `ResourceA` from the source organization. Running the `import` command will update the source organizations state file to only include `ResourceB`. The following `sync --cleanup=Force` command will now delete `ResourceA` from the destination organization.
+For example, suppose `ResourceA` and `ResourceB` are imported and synced, and `ResourceA` is then deleted from the source organization. Running the `import` command updates the source organization's state file to include only `ResourceB`. Running the `sync --cleanup=Force` command then deletes `ResourceA` from the destination organization.
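+
+A minimal sketch of that workflow:
+```shell
+# Re-import from the source organization to refresh the source state file
+datadog-sync import
+
+# Delete resources that no longer exist at the source from the destination organization
+datadog-sync sync --cleanup=Force
+```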
### Verify your Datadog disaster recovery (DDR) status
@@ -340,12 +354,14 @@ For example, `ResourceA` and `ResourceB` are imported and synced, followed by de
By default, all commands check the Datadog Disaster Recovery (DDR) status of both the source and destination organizations before running. This behavior is controlled by the boolean flag `--verify-ddr-status` or the environment variable `DD_VERIFY_DDR_STATUS`.
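+
+For example, assuming the flag and environment variable accept standard boolean values, either of the following would skip the DDR status check:
+```shell
+# Disable the check with the CLI flag
+datadog-sync sync --verify-ddr-status=false
+
+# Or disable it with the environment variable
+export DD_VERIFY_DDR_STATUS=false
+datadog-sync sync
+```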
-### State files [Avoid data duplication while keeping data seperation]
+### State files - how to avoid data duplication while keeping data separation
By default, a `resources` directory is generated in the current working directory of the user. This directory contains `json` mapping of resources between the source and destination organization. To avoid duplication and loss of mapping, this directory should be retained between tool usage. To override these directories use the `--source-resources-path` and `--destination-resource-path`.
When running against multiple destination organizations, use a separate working directory for each to ensure separation of data.
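+
+For example, assuming these options are accepted by each command, you could keep a separate working directory per destination organization (paths are illustrative):
+```shell
+datadog-sync import \
+  --source-resources-path="./org-a/resources/source" \
+  --destination-resource-path="./org-a/resources/destination"
+```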
+
+
## Best practices
### Resource subsets must be migrated with their dependencies
Many Datadog resources are interdependent. For example, some Datadog resource can reference `roles` and `dashboards`, which includes widgets that may use monitors or synthetics data. The datadog-sync tool syncs these resources in order to ensure dependencies are not broken.
From 216602bfb73e634295a23a3f9d59e2c0537dbe31 Mon Sep 17 00:00:00 2001
From: Ida Adjivon
Date: Thu, 3 Jul 2025 16:28:14 -0400
Subject: [PATCH 13/15] changes
---
content/en/account_management/guide/sync-cli.md | 4 ++--
1 file changed, 2 insertions(+), 2 deletions(-)
diff --git a/content/en/account_management/guide/sync-cli.md b/content/en/account_management/guide/sync-cli.md
index 861ec37d3f8..639268e21bd 100644
--- a/content/en/account_management/guide/sync-cli.md
+++ b/content/en/account_management/guide/sync-cli.md
@@ -302,7 +302,7 @@ Operator
|**Value** (required) |Regex to filter attribute value by. Note: special regex characters need to be escaped if filtering by raw string. |
|**Operator** |Any invalid operator defaults to `ExactMatch`. Available operators are:
- `Not`: Match not equal to Value.
- `SubString` (_Deprecated_): Sub string matching. This operator will be removed in future releases. See the [SubString and ExactMatch Deprecation](#substring-and-exactmatch-deprecation) section.
- `ExactMatch` (_Deprecated_): Exact string match. This operator will be removed in future releases. See the [SubString and ExactMatch Deprecation](#substring-and-exactmatch-deprecation) section. |
-By default, if multiple filters are passed for the same resource, the **OR** logic is applied to the filters. This behavior can be adjusted using the `--filter-operator` option. (`DO WE HAVE AN EXAMPLE OF THIS USAGE`)
+If multiple filters are passed for the same resource, **OR** logic is applied to the filters by default. This behavior can be adjusted using the `--filter-operator` option.
@@ -344,7 +344,7 @@ datadog-sync import --config config
```
### Using the cleanup flag to sync changes from the source to the destination
-The `sync` command provides a cleanup flag (`--cleanup`). Passing the cleanup flag ensures deleted resources from the source are also removed from the destination organization. The resources to be deleted are determined by the differences in the [state files](#state-files---avoid-data-duplication-while-keeping-data-seperation) of source and destination organizations.
+The `sync` command provides a `--cleanup` flag. Passing the cleanup flag ensures deleted resources from the source are also removed from the destination organization. The resources to be deleted are determined by the differences in the [state files](#state-files---how-to-avoid-data-duplication-while-keeping-data-separation) of source and destination organizations.
For example, let's take a `ResourceA` and `ResourceB` that are imported and synced. After a deletion of `ResourceA` from the source organization, running the `import` command updates the source organization's state file to only include `ResourceB`. Running the `sync --cleanup=Force` command deletes `ResourceA` from the destination organization.
From f6a05b4bb9814906a99adde4d99dda40a84dcc9a Mon Sep 17 00:00:00 2001
From: Ida Adjivon
Date: Thu, 3 Jul 2025 17:14:18 -0400
Subject: [PATCH 14/15] changes
---
content/en/account_management/guide/sync-cli.md | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/content/en/account_management/guide/sync-cli.md b/content/en/account_management/guide/sync-cli.md
index 639268e21bd..65c6914c580 100644
--- a/content/en/account_management/guide/sync-cli.md
+++ b/content/en/account_management/guide/sync-cli.md
@@ -346,7 +346,7 @@ datadog-sync import --config config
### Using the cleanup flag to sync changes from the source to the destination
The `sync` command provides a `--cleanup` flag. Passing the cleanup flag ensures deleted resources from the source are also removed from the destination organization. The resources to be deleted are determined by the differences in the [state files](#state-files---how-to-avoid-data-duplication-while-keeping-data-separation) of source and destination organizations.
-For example, let's take a `ResourceA` and `ResourceB` that are imported and synced. After a deletion of `ResourceA` from the source organization, running the `import` command updates the source organization's state file to only include `ResourceB`. Running the `sync --cleanup=Force` command deletes `ResourceA` from the destination organization.
+For example, let's take a *ResourceA* and *ResourceB* that are imported and synced. After a deletion of *ResourceA* from the source organization, running the `import` command updates the source organization's state file to only include *ResourceB* . Running the `sync --cleanup=Force` command deletes *ResourceA* from the destination organization.
### Verify your Datadog disaster recovery (DDR) status
From b83310ba300358867da431ace100435bae4286cf Mon Sep 17 00:00:00 2001
From: Ida Adjivon
Date: Tue, 8 Jul 2025 15:27:30 -0400
Subject: [PATCH 15/15] additional read through
---
.../en/account_management/guide/sync-cli.md | 42 +++++++++----------
1 file changed, 19 insertions(+), 23 deletions(-)
diff --git a/content/en/account_management/guide/sync-cli.md b/content/en/account_management/guide/sync-cli.md
index 65c6914c580..2d1f4090374 100644
--- a/content/en/account_management/guide/sync-cli.md
+++ b/content/en/account_management/guide/sync-cli.md
@@ -11,8 +11,6 @@ further_reading:
- link: "/account_management/saml/"
tag: "Documentation"
text: "Configure SAML for your Datadog account"
-- link: "/account_management/billing/usage_details"
- tag: "Documentation"
---
## Overview
@@ -74,7 +72,7 @@ Before you begin, confirm that both the **resource** you are migrating and the *
{{% /collapse-content %}}
{{% collapse-content title="List of supported API URLs" level="h5" expanded=true id="id-for-resources" %}}
-These are the supported URLs for the source and destination API URLs when syncing your organization:
+These are the supported source and destination API URLs when syncing your organization:
| Site | API URL | Site Parameter | Location |
|---------|---------------------------------|---------------------|----------|
@@ -99,7 +97,7 @@ The `datadog-sync-cli` tool can be installed from [source](#installing-from-sour
{{% collapse-content title="Installing from source" level="h5" expanded=true id="id-for-anchoring" %}}
-Installing from source requires Python `v3.9+`
+Installing from source requires Python `v3.9` or later.
1. Clone the project repo and `cd` into the directory
@@ -116,7 +114,7 @@ Installing from source requires Python `v3.9+`
pip install .
```
-3. Invoke the cli tool using
+3. Invoke the CLI tool:
```shell
datadog-sync
```
@@ -168,7 +166,7 @@ Installing from source requires Python `v3.9+`
{{% collapse-content title="Installing using Docker" level="h5" expanded=true id="id-for-anchoring" %}}
-1. Clone the project repo and CD into the directory
+1. Clone the project repo and `cd` into the directory
```shell
git clone https://github.com/DataDog/datadog-sync-cli.git
@@ -181,7 +179,7 @@ Installing from source requires Python `v3.9+`
docker build . -t datadog-sync
```
-3. Run the Docker image using entrypoint below:
+3. Run the Docker image using the entrypoint below:
```
docker run --rm -v <WORKING_DIRECTORY>:/datadog-sync:rw \
-e DD_SOURCE_API_KEY=<SOURCE_API_KEY> \
@@ -192,14 +190,14 @@ Installing from source requires Python `v3.9+`
-e DD_DESTINATION_API_URL=<DESTINATION_API_URL> \
datadog-sync:latest
```
-The docker run command mounts a specified `` working directory to the container.
+The `docker run` command mounts the specified working directory to the container.
{{% /collapse-content %}}
## How to use the datadog-sync-cli tool
- Note: The sync-cli tool uses the resources directory as the source of truth for determining what resources need to be created and modified. This directory should not be removed or corrupted.
+ Note: The sync-cli tool uses the resources directory as the source of truth for determining what resources need to be created and modified. This directory should not be removed or corrupted.
### Steps to sync your resources
@@ -252,19 +250,19 @@ $ datadog-sync sync \
## Filter the data collected
-By default all resources are imported and synced. You can use the filtering option to specify what resources are migrated. Filtering can be done on two levels:
+By default, all resources are imported and synced. To specify which resources are migrated, use the filtering option. Filtering can be done on two levels:
- [Top resources level](#top-resources-level-filtering)
- [Individual resource level](#individual-resource-level-filtering)
### Top resources level filtering
-To perform actions on a specific top level resource, or subset of resources, use `--resources` option. For example, this command imports **ALL** dashboards and dashboard lists in your Datadog organization:
+To perform actions on a specific top-level resource, or a subset of resources, use the `--resources` option. For example, this command imports **ALL** dashboards and **ALL** dashboard lists in your Datadog organization:
```shell
datadog-sync import --resources="dashboard_lists,dashboards"
```
### Individual resource level filtering
-Individual resources can be further filtered using the `--filter ` flag. For example, the following command imports **ALL** dashboards and **ONLY** dashboard lists with the `name` attribute equal to `My custom list`:
+Individual resources can be further filtered using the `--filter` flag. For example, the following command imports **ALL** dashboards and **ONLY** dashboard lists where the `name` attribute is equal to `My custom list`:
```shell
datadog-sync import --resources="dashboards,dashboard_lists" --filter='Type=dashboard_lists;Name=name;Value=My custom list'
@@ -299,7 +297,7 @@ Operator
| ------------------------|---------------------------------------------------------------|
|**Type** (required) |Resource such as Monitors, Dashboards, and more. |
|**Name** (required) |Attribute key to filter on. This can be any attribute represented in dot notation such as `attributes.user_count`. |
-|**Value** (required) |Regex to filter attribute value by. Note: special regex characters need to be escaped if filtering by raw string. |
+|**Value** (required) |Regex to filter attribute value by. <br>**Note**: Special regex characters need to be escaped if filtering by a raw string. |
|**Operator** |Any invalid operator defaults to `ExactMatch`. Available operators are:
- `Not`: Match not equal to Value.
- `SubString` (_Deprecated_): Sub string matching. This operator will be removed in future releases. See the [SubString and ExactMatch Deprecation](#substring-and-exactmatch-deprecation) section.
- `ExactMatch` (_Deprecated_): Exact string match. This operator will be removed in future releases. See the [SubString and ExactMatch Deprecation](#substring-and-exactmatch-deprecation) section. |
If multiple filters are passed for the same resource, **OR** logic is applied to the filters by default. This behavior can be adjusted using the `--filter-operator` option.
@@ -309,7 +307,7 @@ If multiple filters are passed for the same resource, the **OR** logic is applie
#### SubString and ExactMatch Deprecation
-In future releases (`IN WHICH RELEASES IS THIS RELEASED?`) the `SubString` and `ExactMatch` Operator will be removed in favor of the `Value` key. This is because the `Value` key supports regex so both of these scenarios are covered by just writing the appropriate regex.
+In a future release, the `SubString` and `ExactMatch` operators will be removed in favor of the `Value` key. Because the `Value` key supports regex, both scenarios can be covered by writing the appropriate regex.
This example shows the difference in syntax when using `Value` to filter for monitors that have `filter test` in the `name` attribute:
@@ -325,7 +323,7 @@ This example shows the difference in syntax when using `Value` to filter for mon
## Additional configurations
### Using custom configuration instead of options
-You can use a custom configuration text file in place of using filtering options. This is an example config file for a `US1` source URL and `EU` destination URL:
+You can use a custom configuration text file in place of using filtering options. This is an example config file for a `US1` source and `EU` destination:
```shell
destination_api_url="https://api.datadoghq.eu"
@@ -344,9 +342,9 @@ datadog-sync import --config config
```
### Using the cleanup flag to sync changes from the source to the destination
-The `sync` command provides a `--cleanup` flag. Passing the cleanup flag ensures deleted resources from the source are also removed from the destination organization. The resources to be deleted are determined by the differences in the [state files](#state-files---avoid-data-duplication-while-keeping-data-seperation) of source and destination organizations.
+The `sync` command provides a `--cleanup` flag. Passing the cleanup flag ensures deleted resources from the source are also removed from the destination organization. The resources to be deleted are determined by the differences in the [state files](#state-files---how-to-avoid-data-duplication-while-keeping-data-separation) of the source and destination organizations.
-For example, let's take a *ResourceA* and *ResourceB* that are imported and synced. After a deletion of *ResourceA* from the source organization, running the `import` command updates the source organization's state file to only include *ResourceB* . Running the `sync --cleanup=Force` command deletes *ResourceA* from the destination organization.
+For example, suppose *ResourceA* and *ResourceB* are imported and synced, and *ResourceA* is then deleted from the source organization. Running the `import` command updates the source organization's state file to include only *ResourceB*. Running the `sync --cleanup=Force` command then deletes *ResourceA* from the destination organization.
### Verify your Datadog disaster recovery (DDR) status
@@ -356,7 +354,7 @@ By default all commands check the Datadog Disaster Recovery (DDR) status of both
### State files - how to avoid data duplication while keeping data separation
-By default, a `resources` directory is generated in the current working directory of the user. This directory contains `json` mapping of resources between the source and destination organization. To avoid duplication and loss of mapping, this directory should be retained between tool usage. To override these directories use the `--source-resources-path` and `--destination-resource-path`.
+By default, a `resources` directory is generated in the current working directory of the user. This directory contains a `json` mapping of the resources between the source and destination organization. To avoid duplication and loss of mapping, retain this directory between runs of the tool. To override these paths, use the `--source-resources-path` and `--destination-resource-path` options.
When running against multiple destination organizations, use a separate working directory for each to ensure separation of data.
@@ -364,11 +362,11 @@ When running againts multiple destination organizations, a seperate working dire
## Best practices
### Resource subsets must be migrated with their dependencies
-Many Datadog resources are interdependent. For example, some Datadog resource can reference `roles` and `dashboards`, which includes widgets that may use monitors or synthetics data. The datadog-sync tool syncs these resources in order to ensure dependencies are not broken.
+If importing or syncing a subset of resources, ensure that dependent resources are imported and synced as well, because many Datadog resources are interdependent.
-If importing/syncing subset of resources, users should ensure that dependent resources are imported and synced as well.
+For example, some Datadog resources can reference `roles` and `dashboards`, which include widgets that may use monitors or synthetics data. The `datadog-sync-cli` tool syncs these resources to ensure dependencies are not broken.
-See [Supported resources](#supported-resources) section below for potential resource dependencies.
+See the list below for potential resource dependencies.
{{% collapse-content title="List of potential resources dependencies" level="h5" expanded=true id="id-for-resources" %}}
@@ -417,5 +415,3 @@ See [Supported resources](#supported-resources) section below for potential reso
{{< partial name="whats-next/whats-next.html" >}}
[1]: https://docs.datadoghq.com/getting_started/site/
-
-