
Commit 864cdc4

buraizudo and domalessi authored
Docs11112/storage monitoring restructure (#29706)
* [DOCS-11112] Add cloud storage monitoring folder
* [DOCS-11112] Add tile partials
* [DOCS-11112] Add tile partials, edit style
* [DOCS-11112] Revert header changes
* [DOCS-11112] Incorporate feedback
* [DOCS-11112] Add AWS doc link
* Apply suggestions from code review
* Incorporate feedback
* Fix link
* [DOCS-11112] Add existing inventory tab
* [DOCS-11112] Fix link

Co-authored-by: domalessi <111786334+domalessi@users.noreply.github.com>
1 parent 8d2afce commit 864cdc4

File tree

2 files changed: +179 −57 lines

content/en/integrations/guide/storage-monitoring-setup.md

Lines changed: 150 additions & 57 deletions
@@ -9,51 +9,76 @@ private: true
 
 Storage Monitoring for Amazon S3, Google Cloud Storage, and Azure Blob Storage provides deep, prefix-level analytics to help you understand exactly how your storage is being used. It detects potential issues before they impact operations, and helps you make data-driven decisions about storage optimization. Use these insights to track storage growth, investigate access patterns, and optimize costs.
 
-This guide explains how to configure Storage Monitoring in Datadog for your S3 buckets, GCS buckets, and Azure Storage Accounts.
+This guide explains how to configure Storage Monitoring in Datadog for your Amazon S3 buckets, Google Cloud Storage buckets, and Azure storage accounts. Access your Storage Monitoring data by navigating to **Infrastructure > Storage Monitoring**.
 
-Access your Storage Monitoring data by navigating to **Infrastructure > Storage Monitoring**.
+Select your cloud storage service to access setup instructions.
 
-## Setup for Amazon S3
+{{< partial name="cloud_storage_monitoring/storage-monitoring-setup.html" >}}
 
-### Installation
+## Setup for Amazon S3
 
 {{< tabs >}}
 {{% tab "Recommended: Storage Monitoring UI" %}}
 
-The fastest way to set up Storage Monitoring is going to **Infrastructure > Storage Monitoring > [Add Buckets][1]**. On the Add Buckets page, you can configure multiple S3 buckets for Storage Monitoring in one go.
+The fastest way to configure Storage Monitoring is through the [Add Buckets][501] page in Datadog, where you can set up multiple S3 buckets at the same time.
 
-1. Go to Datadog > Infrastructure > Storage Monitoring.
-
-2. Click [Add Buckets][1].
+1. Go to Datadog > **Infrastructure** > **Storage Monitoring**.
+2. Click [Add Buckets][501].
 
 {{< img src="integrations/guide/storage_monitoring/add-buckets.png" alt="Select buckets for enabling Storage Monitoring" responsive="true">}}
 
 3. Enable Amazon S3 Integration and Resource collection for all the AWS accounts you want to monitor.
 
-**Note**: For each AWS account that has the S3 buckets you want to monitor, make sure your Datadog IAM roles include the following permissions: `s3:GetObject`, `s3:ListObjects`, and `s3:PutInventoryConfiguration`.
+1. **Allow Datadog to read from your destination buckets.** Add the following permissions to the Datadog IAM integration role for the account that owns the destination buckets:
+   - `s3:GetObject`
+   - `s3:ListBucket`
+
+   Scope these read-only permissions to only the destination buckets containing your S3 inventory files.
+
+1. **Allow source buckets to write to destination buckets.** The destination buckets must include a policy that allows the source buckets to write inventory data. See [Creating a destination bucket policy][502] in the AWS documentation for details.
+
+   Example source-bucket policy:
+
+   ```json
+   {
+     "Version": "2012-10-17",
+     "Statement": [
+       {
+         "Sid": "AllowListInventoryBucket",
+         "Effect": "Allow",
+         "Action": "s3:ListBucket",
+         "Resource": "arn:aws:s3:::storage-monitoring-s3-inventory-destination"
+       },
+       {
+         "Sid": "AllowGetInventoryObjects",
+         "Effect": "Allow",
+         "Action": "s3:GetObject",
+         "Resource": "arn:aws:s3:::storage-monitoring-s3-inventory-destination/*"
+       }
+     ]
+   }
+   ```
 
 4. Select the S3 buckets you want to monitor with Storage Monitoring. You can select buckets from multiple AWS accounts at once.
 
 {{< img src="integrations/guide/storage_monitoring/step-2.png" alt="Select buckets for enabling Storage Monitoring" responsive="true">}}
 
 5. Assign a destination bucket per region to store S3 inventory reports from the source buckets. This can be an existing AWS bucket or a new one.
 
-- Source bucket: The S3 bucket you want to monitor with Storage Monitoring
-- Destination bucket: Used to store inventory reports (one per AWS region, can be reused)
-
+- Source bucket: The S3 bucket you want to monitor with Storage Monitoring
+- Destination bucket: Used to store inventory reports (one per AWS region, can be reused)
 6. Complete the configuration. The inventory generation process will start within AWS within 24 hours of the first report.
-
 7. Return to **Infrastructure > Storage Monitoring** to see your bucket(s) appear.
 
-[1]: https://app.datadoghq.com/storage-monitoring?mConfigure=true
-
+[501]: https://app.datadoghq.com/storage-monitoring?mConfigure=true
+[502]: https://docs.aws.amazon.com/AmazonS3/latest/userguide/configure-inventory.html#configure-inventory-destination-bucket-policy
 {{% /tab %}}
 {{% tab "CloudFormation" %}}
 
 You can also set up Storage Monitoring using the provided CloudFormation templates. This process involves two steps:
 
-#### Step 1: Configure inventory generation
-
+### Step 1: Configure inventory generation
 
 This template configures your existing S3 bucket to generate inventory reports, which Datadog uses to generate detailed metrics about your bucket prefixes.

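The destination-bucket read policy shown in the hunk above has a fixed two-statement shape. As an illustrative sketch (Python is used here purely for templating; the function name is hypothetical, and the bucket name comes from the example above), the same statements can be generated for any destination bucket:

```python
import json

def datadog_inventory_read_policy(destination_bucket: str) -> dict:
    """Template the two read-only statements from the example policy
    for an arbitrary inventory destination bucket. Hypothetical helper,
    shown only to illustrate the policy's shape."""
    arn = f"arn:aws:s3:::{destination_bucket}"
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "AllowListInventoryBucket",
                "Effect": "Allow",
                "Action": "s3:ListBucket",
                "Resource": arn,  # ListBucket is granted on the bucket itself
            },
            {
                "Sid": "AllowGetInventoryObjects",
                "Effect": "Allow",
                "Action": "s3:GetObject",
                "Resource": f"{arn}/*",  # GetObject is granted on the objects
            },
        ],
    }

print(json.dumps(
    datadog_inventory_read_policy("storage-monitoring-s3-inventory-destination"),
    indent=2,
))
```

Note the split between the two resources: `s3:ListBucket` applies to the bucket ARN, while `s3:GetObject` applies to `<bucket ARN>/*`, which is why the policy needs two statements.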
@@ -81,7 +106,7 @@ This template configures your existing S3 bucket to generate inventory reports,
 **Note:** This CloudFormation template can be rolled back, but rolling back doesn't delete the created resources. This is to ensure the existing bucket doesn't get deleted. You can manually delete the inventory configurations by going on the **Management** tab in the bucket view.
 
 **Note:** Review [Amazon S3 pricing][106] for costs related to inventory generation.
-#### Step 2: Configure required permissions
+### Step 2: Configure required permissions
 
 This template creates two IAM policies:
 - A policy to allow Datadog to read inventory files from the destination bucket
@@ -100,12 +125,11 @@ This template creates two IAM policies:
 - **SourceBucketPrefix**: This parameter limits the inventory generation to a specific prefix in the source bucket
 - **DestinationBucketPrefix**: If you want to reuse an existing bucket as the destination, this parameter allows the inventory files to be shipped to a specific prefix in that bucket. Ensure that any prefixes do not include trailing slashes (`/`)
 
-
 {{< img src="integrations/guide/storage_monitoring/bucket_policy_stack_details.png" alt="Stack parameters for bucket policy" responsive="true" style="width:90%;" >}}
 
 6. On the **Review and create** step, verify the parameters have been entered correctly, and click **Submit**.
 
-#### Post-setup steps
+### Post-setup steps
 
 After completing the CloudFormation setup, fill out the [post-setup form][105] with the following required information:
 1. Name of the destination bucket holding the inventory files.
@@ -122,22 +146,91 @@ After completing the CloudFormation setup, fill out the [post-setup form][105] w
 [105]: https://forms.gle/L97Ndxr2XLen1GBs7
 [106]: https://aws.amazon.com/s3/pricing/
 {{% /tab %}}
+
+{{% tab "Terraform" %}}
+
+You can use the Terraform [aws_s3_bucket_inventory][403] resource to set up Storage Monitoring.
+
+The following example shows how to enable daily inventory on an S3 bucket for Datadog monitoring. To use this example:
+
+- Replace `<MY_MONITORED_BUCKET>` with the name of the bucket to be monitored.
+- Replace `<MY_INVENTORY_DESTINATION>` with the name of the bucket that receives your inventory files.
+- Replace `<DESTINATION_ACCOUNT_ID>` with the AWS account ID that owns the destination bucket.
+
+```tf
+resource "aws_s3_bucket" "monitored" {
+  bucket = "<MY_MONITORED_BUCKET>"
+}
+
+resource "aws_s3_bucket" "inventory_destination" {
+  bucket = "<MY_INVENTORY_DESTINATION>"
+}
+
+resource "aws_s3_bucket_inventory" "daily_inventory" {
+  bucket = aws_s3_bucket.monitored.id
+  name   = "datadog-daily-inventory"
+
+  included_object_versions = "All"
+
+  schedule {
+    frequency = "Daily"
+  }
+
+  destination {
+    bucket {
+      account_id = "<DESTINATION_ACCOUNT_ID>"
+      bucket_arn = aws_s3_bucket.inventory_destination.arn
+      format     = "CSV"
+      prefix     = "datadog-inventory/"
+    }
+  }
+
+  optional_fields = [
+    "Size",
+    "StorageClass",
+    "LastModifiedDate"
+  ]
+}
+```
+
+**Notes**:
+
+- The destination bucket can be your source bucket, but for security and logical separation, many organizations use a separate bucket.
+- The `optional_fields` section is recommended for Datadog prefix metrics.
+
+### Post-setup steps
+
+Once the inventory configuration is set up and your inventory files begin appearing in the destination bucket, fill out [this form][401] to provide your S3 configuration details. This allows Datadog to begin generating prefix metrics for your storage.
+
+### Use modules for complex setups
+
+If you need to manage multiple buckets, complex inventory policies, encryption, or cross-account setups, you can use the [terraform-aws-s3-bucket module][402].
+
+### Troubleshooting
+
+- S3 Inventory files are delivered daily, and may take up to 24 hours to appear after setup.
+- Ensure IAM permissions allow S3 to write inventory files to your destination bucket.
+- If cross-account access is needed, confirm that the inventory destination prefix (`datadog-inventory/` in the example) is correct and accessible to Datadog.
+
+[401]: https://docs.google.com/forms/d/e/1FAIpQLScd0xor8RQ76N6BemvvMzg9UU7Q90svFrNGY8n83sMF2JXhkA/viewform
+[402]: https://github.com/terraform-aws-modules/terraform-aws-s3-bucket/tree/master/examples/s3-inventory
+[403]: https://registry.terraform.io/providers/hashicorp/aws/latest/docs/resources/s3_bucket_inventory
+{{% /tab %}}
+
 {{% tab "AWS Console" %}}
 
 To manually set up the required [Amazon S3 Inventory][206] and related configuration, follow these steps:
 
-#### Step 1: Create a destination bucket
+### Step 1: Create a destination bucket
 
 1. [Create an S3 bucket][201] to store your inventory files. This bucket acts as the central location for inventory reports. **Note**: You must only use one destination bucket for all inventory files generated in an AWS account.
 2. Create a prefix within the destination bucket (optional).
 
-#### Step 2: Configure the bucket and integration role policies
+### Step 2: Configure the bucket and integration role policies
 
 1. Follow the steps in the [Amazon S3 user guide][202] to add a bucket policy to your destination bucket allowing write access (`s3:PutObject`) from your source buckets.
 
 2. Ensure the Datadog AWS integration role has `s3:GetObject` and `s3:ListObjects` permissions on the destination bucket. These permissions allow Datadog to read the generated inventory files.
 
-#### Step 3: Configure Inventory generation
+### Step 3: Configure inventory generation
 
 For each bucket you want to monitor:
 1. Go to the [Amazon S3 buckets page][203] in the AWS console, and select the bucket.
@@ -160,7 +253,7 @@ For each bucket you want to monitor:
 
 **Note**: Review [Amazon S3 pricing][204] for costs related to inventory generation.
 
-#### Post-setup steps
+### Post-setup steps
 
 After completing the above steps, fill out the [post-setup form][205] with the following required information:

@@ -178,6 +271,18 @@ After completing the above steps, fill out the [post-setup form][205] with the f
 [205]: https://forms.gle/L97Ndxr2XLen1GBs7
 [206]: https://docs.aws.amazon.com/AmazonS3/latest/userguide/configure-inventory.html
 {{% /tab %}}
+
+{{% tab "Existing S3 inventory" %}}
+
+If you have already configured S3 inventory for the buckets you want to monitor, choose **one** of the following options:
+
+- Fill out [this form][601] to share your configurations with Datadog
+- [Reach out to us][602] to use an API for setting up multiple buckets
+
+[601]: https://forms.gle/dhDbSxTvCUDXg1QR7
+[602]: mailto:storage-monitoring@datadoghq.com
+{{% /tab %}}
+
 {{< /tabs >}}
 
 ### Validation
@@ -188,28 +293,26 @@ To verify your setup:
 - Confirm the Datadog integration can access the files:
 - Navigate to **Infrastructure -> Storage Monitoring -> Installation Recommendations** to see if the bucket you configured is showing in the list
 
-
 ### Troubleshooting
+
 If you encounter any issues or need assistance:
 - Make sure to use only one destination bucket for all inventory files per AWS account
 - Verify all permissions are correctly configured
 - If you're still encountering issues, [reach out][1] with your bucket details, AWS account ID, and Datadog org ID
 
 ## Setup for Google Cloud Storage
 
-### Installation
-
 The process involves the following steps:
 
-#### Step 1: Install the GCP integration and enable resource collection
+#### Step 1: Install the Google Cloud integration and enable resource collection
 
-To collect GCP Storage metrics from your GCP project, install the GCP integration in Datadog. Enable Resource Collection for the project containing the buckets you want to monitor. Resource Collection allows Datadog to associate your buckets' labels with the metrics collected through storage monitoring.
+To collect Google Cloud Storage metrics from your Google Cloud project, install the Google Cloud integration in Datadog. Enable Resource Collection for the project containing the buckets you want to monitor. Resource Collection allows Datadog to associate your buckets' labels with the metrics collected through storage monitoring.
 
 **Note**: While you can disable specific metric namespaces, keep the Cloud Storage namespace (gcp.storage) enabled.
 
 #### Step 2: Enable the Storage Insights API
 
-Enable the [Storage Insights][2] API in your GCP project.
+Enable the [Storage Insights][2] API in your Google Cloud project.
 
 #### Step 3: Grant service agent permissions

@@ -226,21 +329,21 @@ You can create an inventory report configuration in multiple ways. The quickest
 
 1. Includes these metadata fields: `"bucket", "name", "project", "size", "updated", "storageClass"`
 2. Generates CSV reports with `'\n'` as the delimiter and `','` as the separator
-3. Uses this destination path format: `<Bucket>/{{date}}`, where `<Bucket>` is the monitored bucket-name
+3. Uses this destination path format: `<BUCKET>/{{date}}`, where `<BUCKET>` is the monitored bucket name
 
 {{< tabs >}}
 {{% tab "Google Cloud CLI" %}}
 
 Use the [Google Cloud CLI][301] to run the following command:
 
 ```
-gcloud storage insights inventory-reports create <source_bucket_url> \
+gcloud storage insights inventory-reports create <SOURCE_BUCKET_URL> \
 --no-csv-header \
 --display-name=datadog-storage-monitoring \
---destination=<gs://my_example_destination_bucket/source_bucket_name/{{date}}> \
+--destination=gs://<DESTINATION_BUCKET>/<SOURCE_BUCKET>/{{date}} \
 --metadata-fields=project,bucket,name,size,updated,storageClass \
 --schedule-starts=<YYYY-MM-DD> \
---schedule-repeats=<daily|weekly> \
+--schedule-repeats=<DAILY|WEEKLY> \
 --schedule-repeats-until=<YYYY-MM-DD>
 ```
246349

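The inventory reports configured above are headerless CSVs whose columns follow the `--metadata-fields` list. To illustrate how prefix-level figures can be derived from such files, here is a rough sketch with hypothetical sample rows (the aggregation is illustrative only, not Datadog's actual pipeline):

```python
import csv
import io
from collections import defaultdict

# Hypothetical inventory rows, in the configured field order:
# project, bucket, name, size, updated, storageClass (no CSV header)
rows = """\
my-project,my-bucket,logs/2024/app.log,1048576,2024-01-02T00:00:00Z,STANDARD
my-project,my-bucket,logs/2024/db.log,2097152,2024-01-03T00:00:00Z,STANDARD
my-project,my-bucket,images/cat.png,524288,2024-01-05T00:00:00Z,NEARLINE
"""

# Sum object sizes by top-level prefix of the object name
bytes_by_prefix: dict = defaultdict(int)
for project, bucket, name, size, updated, storage_class in csv.reader(io.StringIO(rows)):
    prefix = name.split("/", 1)[0]
    bytes_by_prefix[prefix] += int(size)

print(dict(bytes_by_prefix))  # {'logs': 3145728, 'images': 524288}
```

Grouping on deeper prefixes (for example, `logs/2024/`) works the same way by splitting on more path segments.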
@@ -249,7 +352,7 @@ gcloud storage insights inventory-reports create <source_bucket_url> \
 {{% /tab %}}
 {{% tab "Terraform" %}}
 
-Copy the following Terraform template, substitute the necessary arguments, and apply it in the GCP project that contains your bucket.
+Copy the following Terraform template, substitute the necessary arguments, and apply it in the Google Cloud project that contains your bucket.
 
 <!-- vale off -->
 {{% collapse-content title="Terraform configuration for inventory reports" level="h4" expanded=true %}}
@@ -372,8 +475,8 @@ After completing the setup steps, fill out the [post-setup][3] form with the fol
 2. Name of the service account with the granted permissions
 3. Prefix where the files are stored in the destination bucket (if any)
 4. Name of the source bucket you want to monitor (the bucket producing inventory files)
-5. GCP location of the destination bucket holding the inventory files
-6. GCP ProjectID containing the buckets
+5. Google Cloud location of the destination bucket holding the inventory files
+6. Google Cloud ProjectID containing the buckets
 7. Datadog org ID
 
 ### Validation
@@ -385,38 +488,32 @@ To verify your setup:
 4. Navigate to Infrastructure -> Storage Monitoring -> Installation Recommendations to see if your configured bucket appears in the list
 
 ### Troubleshooting
+
 If you encounter any issues or need assistance:
-- Use only one destination bucket for all inventory files per GCP project
+- Use only one destination bucket for all inventory files per Google Cloud project
 - Verify all permissions are correctly configured
-- If issues persist, [reach out][1] with your bucket details, GCP Project ID, and Datadog org ID
+- If issues persist, [reach out][1] with your bucket details, Google Cloud Project ID, and Datadog org ID
 
 [1]: mailto:storage-monitoring@datadoghq.com
 [2]: https://cloud.google.com/storage/docs/insights/using-inventory-reports#enable_the_api
 [3]: https://forms.gle/c7b8JiLENDaUEqGk8
 
-
-
 ## Setup for Azure Blob Storage
 
-### Installation
-
-To set up Storage Monitoring for Azure Blob Storage, follow these steps:
-
 {{< tabs >}}
 {{% tab "Azure CLI" %}}
 
-To enable inventories for the selected storage accounts in each subscription, run the following script in your [Azure Cloud Shell][301]:
+Enable inventories for the selected storage accounts in each subscription by running the following script in your [Azure Cloud Shell][301]:
 
 ```shell
 curl https://datadogstoragemonitoring.blob.core.windows.net/scripts/install.sh \
-| bash -s -- <client_id> <subscription_id> <comma_separated_storage_account_names>
+| bash -s -- <CLIENT_ID> <SUBSCRIPTION_ID> <COMMA_SEPARATED_STORAGE_ACCOUNT_NAMES>
 ```
 
 Before running the script, set your [shell environment][302] to Bash and replace the various placeholder inputs with the correct values:
-- `<client_id>`: The client ID of an App Registration already set up using the [Datadog Azure integration][302]
-- `<subscription_id>`: The subscription ID of the Azure subscription containing the storage accounts
-- `<comma_separated_storage_account_names>`: A comma-separated list of the storage accounts you want to monitor. For example, `storageaccount1,storageaccount2`
-
+- `<CLIENT_ID>`: The client ID of an App Registration already set up using the [Datadog Azure integration][302]
+- `<SUBSCRIPTION_ID>`: The subscription ID of the Azure subscription containing the storage accounts
+- `<COMMA_SEPARATED_STORAGE_ACCOUNT_NAMES>`: A comma-separated list of the storage accounts you want to monitor (for example, `storageaccount1,storageaccount2`)
 
 [301]: https://shell.azure.com
 [302]: /integrations/azure/#setup
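The install script receives the storage account list as a single comma-separated argument, so a malformed list only fails at run time. A small illustrative sketch of pre-validating the list (the helper is hypothetical and not part of the install script; Azure storage account names are 3 to 24 lowercase letters and digits):

```python
import re

# Azure storage account names: 3-24 characters, lowercase letters and digits only
ACCOUNT_NAME = re.compile(r"[a-z0-9]{3,24}")

def parse_account_list(value: str) -> list:
    """Split the comma-separated argument and reject invalid account names.
    Hypothetical pre-check, not part of the Datadog install script."""
    names = [name.strip() for name in value.split(",") if name.strip()]
    invalid = [name for name in names if not ACCOUNT_NAME.fullmatch(name)]
    if invalid:
        raise ValueError(f"invalid storage account names: {invalid}")
    return names

print(parse_account_list("storageaccount1,storageaccount2"))  # ['storageaccount1', 'storageaccount2']
```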
@@ -427,8 +524,7 @@ Before running the script, set your [shell environment][302] to Bash and replace
 
 For Each Storage Account you wish to monitor, follow all of the steps here:
 
-
-#### Create a blob inventory policy
+### Create a blob inventory policy
 1. In the Azure portal, navigate to your Storage Account.
 2. Go to **Data management** > **Blob inventory**.
 3. Click **Add**.
@@ -451,7 +547,7 @@ For Each Storage Account you wish to monitor, follow all of the steps here:
 - Exclude prefix: datadog-storage-monitoring
 5. Click **Add**.
 
-#### Add the role assignment
+### Add the role assignment
 1. In the Azure portal, navigate to your Storage Account.
 2. Go to **Data storage** > **Containers**.
 3. Click on the **datadog-storage-monitoring** container.
@@ -472,6 +568,3 @@ For Each Storage Account you wish to monitor, follow all of the steps here:
 After you finish with the above steps, fill out the [post-setup form][310].
 
 [310]: https://forms.gle/WXFbGyBwWfEo3gbM7
-
-
-