Enable Logpush to Datadog
Cloudflare Logpush supports pushing logs directly to Datadog via the Cloudflare dashboard or via API.
Manage via the Cloudflare dashboard
Enable Logpush to Datadog via the dashboard.
To enable the Cloudflare Logpush service:
1. Log in to the Cloudflare dashboard.
2. Select the Enterprise account or domain you want to use with Logpush.
3. Go to Analytics & Logs > Logs.
4. Click Add Logpush job. A modal window opens where you will need to complete several steps.
5. Select the dataset you want to push to a storage service.
6. Select the data fields to include in your logs. You can add or remove fields later by modifying your settings in Logs > Logpush.
7. Select Datadog.
8. Enter or select the following destination information:
   - Datadog URL Endpoint, which can be either of the endpoints below. Refer to the Datadog API reference for the difference between them.
     - https://http-intake.logs.datadoghq.com/v1/input
     - https://http-intake.logs.datadoghq.com/api/v2/logs
   - Datadog API Key, which can be retrieved by following these steps.
9. Click Validate access.
10. Click Save and Start Pushing to finish enabling Logpush.
Once connected, Cloudflare lists Datadog as a connected service under Logs > Logpush. Edit or remove connected services from here.
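If Validate access fails, or you simply want to sanity-check the Datadog API key outside the dashboard, one option is to send a single test log straight to the Datadog intake endpoint. The request below is a minimal sketch against the v2 endpoint; the service and message values are placeholders you can change freely.

curl -s -X POST "https://http-intake.logs.datadoghq.com/api/v2/logs" \
  -H "DD-API-KEY: <DATADOG_API_KEY>" \
  -H "Content-Type: application/json" \
  -d '[{"ddsource": "cloudflare", "service": "logpush-test", "message": "Logpush destination test"}]'

A 202 response means Datadog accepted the test log with the provided key.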
Manage via API
To set up a Datadog Logpush job:
- Create a job with the appropriate endpoint URL and authentication parameters.
- Enable the job to begin pushing logs.
1. Create a job
To create a job, make a POST request to the Logpush jobs endpoint with the following fields:
- name (optional) - Use your domain name as the job name.
- destination_conf - A log destination consisting of an endpoint URL, an authorization header, and zero or more optional parameters that Datadog supports, in the string format below (a populated example follows this list).
  - <DATADOG_ENDPOINT_URL>: The Datadog HTTP logs intake endpoint, which can be either of the endpoints below. Refer to the Datadog API reference for the difference between them.
    - https://http-intake.logs.datadoghq.com/v1/input
    - https://http-intake.logs.datadoghq.com/api/v2/logs
  - <DATADOG_API_KEY>: The Datadog API token, which can be retrieved by following these steps. For example, 20e6d94e8c57924ad1be3c29bcaee0197d.
  - ddsource: Set to cloudflare.
  - service, host, ddtags: Optional parameters allowed by Datadog.

  "datadog://<DATADOG_ENDPOINT_URL>?header_DD-API-KEY=<DATADOG_API_KEY>&ddsource=cloudflare&service=<SERVICE>&host=<HOST>&ddtags=<TAGS>"
- dataset - The category of logs you want to receive. Refer to Log fields for the full list of supported datasets.
- logpull_options (optional) - To configure fields, sample rate, and timestamp format, refer to API configuration options.
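For illustration, a fully populated destination_conf might look like the string below, using the example API key from above and hypothetical service, host, and tag values:

"datadog://http-intake.logs.datadoghq.com/api/v2/logs?header_DD-API-KEY=20e6d94e8c57924ad1be3c29bcaee0197d&ddsource=cloudflare&service=cdn&host=example.com&ddtags=env:prod,team:edge"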
Example request using cURL:
curl -s -X POST \
  https://api.cloudflare.com/client/v4/zones/<ZONE_ID>/logpush/jobs \
  -H "X-Auth-Email: <EMAIL>" \
  -H "X-Auth-Key: <API_KEY>" \
  -d '{"name":"<DOMAIN_NAME>","destination_conf":"datadog://<DATADOG_ENDPOINT_URL>?header_DD-API-KEY=<DATADOG_API_KEY>&ddsource=cloudflare&service=<SERVICE>&host=<HOST>&ddtags=<TAGS>","logpull_options":"fields=ClientIP,ClientRequestHost,ClientRequestMethod,ClientRequestURI,EdgeEndTimestamp,EdgeResponseBytes,EdgeResponseStatus,EdgeStartTimestamp,RayID&timestamps=rfc3339","dataset":"http_requests"}' | jq .
Response:
{ "errors": [], "messages": [], "result": { "id": 100, "dataset": "http_requests", "enabled": false, "name": "<DOMAIN_NAME>", "logpull_options": "fields=ClientIP,ClientRequestHost,ClientRequestMethod,ClientRequestURI,EdgeEndTimestamp,EdgeResponseBytes,EdgeResponseStatus,EdgeStartTimestamp,RayID×tamps=rfc3339", "destination_conf": "datadog://<DATADOG_ENDPOINT_URL>?header_DD-API-KEY=<DATADOG_API_KEY>", "last_complete": null, "last_error": null, "error_message": null }, "success": true
}
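To confirm the job was created, you can list the zone's Logpush jobs and look for the new entry. A quick check is a GET request against the same jobs endpoint:

curl -s -X GET \
  https://api.cloudflare.com/client/v4/zones/<ZONE_ID>/logpush/jobs \
  -H "X-Auth-Email: <EMAIL>" \
  -H "X-Auth-Key: <API_KEY>" | jq .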
2. Enable (update) a job
To enable a job, make a PUT request to the Logpush jobs endpoint. You will use the job ID returned from the previous step in the URL and send {"enabled": true} in the request body.
Example request using cURL:
curl -s -X PUT \
  https://api.cloudflare.com/client/v4/zones/<ZONE_ID>/logpush/jobs/100 \
  -H "X-Auth-Email: <EMAIL>" \
  -H "X-Auth-Key: <API_KEY>" \
  -d '{"enabled": true}' | jq .
Response:
{ "errors": [], "messages": [], "result": { "id": 100, "dataset": "http_requests", "enabled": true, "name": "<DOMAIN_NAME>", "logpull_options": "fields=ClientIP,ClientRequestHost,ClientRequestMethod,ClientRequestURI,EdgeEndTimestamp,EdgeResponseBytes,EdgeResponseStatus,EdgeStartTimestamp,RayID×tamps=rfc3339", "destination_conf": "datadog://<DATADOG_ENDPOINT_URL>?header_DD-API-KEY=<DATADOG_API_KEY>", "last_complete": null, "last_error": null, "error_message": null }, "success": true
}
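Once the job is enabled, Logpush begins sending batches of logs to Datadog. One way to monitor the job from the Cloudflare side is to fetch its details and inspect the last_complete and last_error fields shown in the responses above, for example:

curl -s -X GET \
  https://api.cloudflare.com/client/v4/zones/<ZONE_ID>/logpush/jobs/100 \
  -H "X-Auth-Email: <EMAIL>" \
  -H "X-Auth-Key: <API_KEY>" | jq '.result | {enabled, last_complete, last_error}'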