Answers: DPN Support Specialist Certification Exam


Final Score: 3825/4000


Multiple Choice
1)
A common issue that users experience with log monitoring is that they can see
their logs appear in the Livetail, but those logs are not showing in the Log
Explorer. This means they cannot perform any analysis on those logs.

What are the four common causes?

The log timestamp is outside of the allowed range.
The logs are submitted with reserved attributes.
There are exclusion filters.
The daily quota has been reached.
The user does not have the privilege to view the logs.
The log contains sensitive information such as passwords and credit card numbers.
Score: 75.00
Correct answer(s):
• The log timestamp is outside of the allowed range.
• The logs are submitted with reserved attributes.
• There are exclusion filters.
• The daily quota has been reached.
The user does not have the privilege to view the logs.
The log contains sensitive information such as passwords and credit card numbers.
Single Choice
2)
A metric is a generic term and is identified by name only. It differs from a
datapoint, which is more specific and has various properties.

Which of the following is NOT a property of a datapoint? (Choose one).

type
name
value
• size
timestamp
tag(s)
Score: 100.00
Single Choice
3)
Conf.d is the configuration directory that contains all the configuration files for
the Agent checks.

All config files are in what format?

.json
.toml
• .yaml
.cfg
.ini
Score: 100.00
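For illustration, an Agent check config in conf.d typically follows this YAML
shape; the http_check layout below is a sketch, and the exact keys vary per check:

```yaml
# conf.d/http_check.d/conf.yaml — illustrative only
init_config:

instances:
  - name: example-site
    url: "https://example.com"
```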
Single Choice
4)
There are timestamp restrictions on the logs submitted to Datadog. If logs sent to
Datadog have a timestamp that is too old, or the timestamp is too far in the
future, the logs will not be processed.

The timestamp of logs needs to be within what range?

Between 12 hours in the past and 12 hours in the future.
Between 18 hours in the past and 8 hours in the future.
Between 6 hours in the past and 6 hours in the future.
Between 8 hours in the past and 2 hours in the future.
• Between 18 hours in the past and 2 hours in the future.
Score: 100.00
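As a sketch, the accepted window can be checked like this in Python;
`is_timestamp_accepted` is a hypothetical helper, not part of any Datadog library:

```python
from datetime import datetime, timedelta, timezone

def is_timestamp_accepted(log_time, now=None):
    """Return True if a log timestamp falls inside the window described
    above: no older than 18 hours, no more than 2 hours in the future."""
    now = now or datetime.now(timezone.utc)
    return now - timedelta(hours=18) <= log_time <= now + timedelta(hours=2)
```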
Multiple Choice
5)
The Trace Agent is built and packaged with the Datadog Agent. It is responsible for
various tasks around APM data like keeping track of traces received from the
tracing client and sending those traces to the Datadog platform.

When troubleshooting the Datadog Agent, what information should you collect?
(Choose two).

Tracing Client Startup logs
Debug or trace level flare
Environment description
Score: 50.00
Correct answer(s):
Tracing Client Startup logs
• Debug or trace level flare
• Environment description
Single Choice
6)
Facets allow you to perform analysis on the logs you have indexed. In the Log
Explorer, you can use them to search your logs, define log patterns and perform log
analytics. You can create facets for any attributes you need.

Choose the correct statement.

• You need to be careful when creating a facet because it will only apply to new
logs and historical logs will not appear in searches and analysis.
You need to be careful when creating a facet because it will apply retroactively to
historical logs and they will appear in searches and analysis.
Score: 100.00
Multiple Choice
7)
A flare can be very useful for troubleshooting issues. However, a flare won’t help
solve all issues.

A flare can help you to identify: (Choose two)

Issues with the UI
• Issues with checks run by the Agent
Issues with installing the Agent
• Issues with the Agent
Issues with cloud integrations / crawler issues
Score: 100.00
Multiple Choice
8)
The SAML configuration in Datadog provides a feature called SAML Strict Mode. When
SAML strict mode is enabled, it ensures that all users with access to Datadog must
have valid credentials in your company’s identity provider/directory service to
access their Datadog account. You notice users are not able to log into their
Datadog account.

Why? (Choose two)

• The Identity Provider (IdP) is down.
• Users must login using SSO.
Users are typing their passwords incorrectly.
Score: 100.00
Single Choice
9)
When graphing a metric, you can normally choose the metric name from the metric
field dropdown list. Sometimes, the metric name does not appear in the list and you
need to type in the name.

Why doesn’t the metric name always appear in the list?

• Only metrics collected within the past 24 hours appear in the list.
The list only shows recently accessed metrics.
The list is dynamic and only shows popular metrics used in dashboards and monitors.
The list is alphabetical and only shows the top 10 metrics for each letter.
Score: 100.00
Single Choice
10)
The flush interval is the period of time during which the DogStatsD client
aggregates multiple data points for a unique metric into a single data point.

By default, how long is this interval?

15 seconds
30 seconds
5 seconds
• 10 seconds
5 minutes
60 seconds
Score: 100.00
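The aggregation behaviour can be sketched with a toy class; `TinyAggregator` is
illustrative only and is not the real DogStatsD client:

```python
class TinyAggregator:
    """Toy sketch of DogStatsD-style aggregation: data points received for
    the same metric inside one flush interval (10 s by default) collapse
    into a single data point at flush time."""

    def __init__(self, flush_interval=10.0):
        self.flush_interval = flush_interval
        self.counters = {}

    def increment(self, metric, value=1):
        # Accumulate within the current interval instead of sending immediately.
        self.counters[metric] = self.counters.get(metric, 0) + value

    def flush(self):
        # One aggregated data point per unique metric, then reset.
        points, self.counters = self.counters, {}
        return points
```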
Multiple Choice
11)
A flare contains all of the Agent’s configuration files and logs packaged into an
archive file. However, before asking for a flare, there are many troubleshooting
steps you can do without a flare.

Choose the five steps.

• Check installer log
• Check the infrastructure list for integration errors
• Check the Event Stream for Agent restarts
• Check the infrastructure list for the value of the ntp.offset metric
• Use a Notebook with bar graphs to confirm gaps in metric submission
Check the agent.log and status.log
Score: 100.00
Multiple Choice
12)
When collecting logs, there are some technical specifications to consider, which
will help prevent issues of logs not appearing in the Datadog UI.

Which of the following statements are true? (Choose three)

There is an API rate limit of 100GB of log events that can be submitted per hour.
• Datadog recommends a log event should not exceed 256KB in size.
There must be direct internet connectivity to Datadog from the host sending logs.
• When using the Datadog TCP or HTTP API directly, log events up to 1MB are
accepted.
• When using the Datadog Agent, log events greater than 256KB are split into
several entries.
Score: 100.00
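The splitting behaviour described above can be sketched as follows;
`split_log_event` is a hypothetical helper, not actual Agent code:

```python
MAX_EVENT_BYTES = 256 * 1024  # the 256KB recommended limit from the answer above

def split_log_event(event: bytes, limit: int = MAX_EVENT_BYTES):
    """Split an oversized log event into several entries, mimicking how
    the Agent is described as handling events larger than 256KB."""
    return [event[i:i + limit] for i in range(0, len(event), limit)] or [b""]
```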
Single Choice
13)
When troubleshooting APM issues at the Datadog Agent stage, there are different log
levels you can set depending on the level of detail you need.

Which of the following three levels provides the most detailed information?

• TRACE
DEBUG
INFO
Score: 100.00
Single Choice
14)
Each part of the Datadog Agent has its own log file such as the agent.log, process-
agent.log and trace-agent.log. These logs rollover when they reach a certain size
and become a backup copy. If there is already a previous backup, it is overwritten
with the new copy.

By default, log rollover occurs when the log reaches what size?

• 10MB
1GB
1MB
10GB
100MB
Score: 100.00
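The same rollover pattern (rotate at a size cap, keep one backup that is
overwritten on each roll) can be reproduced with Python's standard library, as a
rough analogy to the Agent's behaviour:

```python
import logging
from logging.handlers import RotatingFileHandler

# Rotate at 10MB with a single backup copy, mirroring the defaults
# described above; delay=True avoids creating the file until first write.
handler = RotatingFileHandler(
    "agent-style.log",
    maxBytes=10 * 1024 * 1024,
    backupCount=1,
    delay=True,
)
logger = logging.getLogger("demo")
logger.addHandler(handler)
```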
Single Choice
15)
Before a user can access a Datadog organization for the first time, you should
check they have been invited to that organization and the user account is active.
Inviting a user will generate an email with a unique link for them to join.

Where in Datadog can you confirm who invited the user and when?

Security Signals
• Event stream
Watchdog
Log Explorer
Score: 100.00
Single Choice
16)
Datadog’s Live Processes gives you real-time visibility into the processes running
on your infrastructure. You can break down resource consumption by host/containers
at the process level and query for processes running on a specific host, in a
specific zone, or running a specific workload.

This Live Process monitoring is enabled by default.

True
• False
Score: 100.00
Single Choice
17)
Special characters in logs are not searchable in log search. To search for
these special characters, it is recommended to parse them into attributes with
the grok parser, and then search for logs that contain the attribute.

False
• True
Score: 100.00
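As a Python stand-in for a grok rule, the idea is to capture the value
containing special characters into a named attribute so it becomes searchable;
the pattern and attribute name below are illustrative, not real grok syntax:

```python
import re

# A raw log line whose value contains special characters (/, ?, =).
LINE = "request_path=/api/v1/users?limit=50"

# Named capture group plays the role of the extracted attribute.
match = re.search(r"request_path=(?P<path>\S+)", LINE)
attributes = match.groupdict() if match else {}
```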
Single Choice
18)
For APM investigations, it is useful to know you can change the log level of the
flare to match your needs, based on where you are in the investigation.

Which log level provides more context for what is happening by adding <loglevel>
logs to your log files?

• DEBUG
TRACE
INFO
Score: 100.00
Single Choice
19)
Live Process Monitors are health checks that return the status of matching
processes. This is useful for monitoring the number of processes running on a host
or on multiple hosts. When configuring Live Process Monitors, it’s important to
keep in mind how long the live process data is retained for.

How long is the default retention period?

15 minutes
8 hours
• 36 hours
24 hours
Score: 100.00
Multiple Choice
20)
When collecting logs for analysis, it is common that not all logs are useful or
need to be indexed. Exclusion filters can be used to control which logs flowing
into your index should be removed.

Choose the correct two statements.

Excluded logs are discarded from indexes and do not flow through Livetail.
Excluded logs cannot be used to generate metrics and cannot be archived.
• Excluded logs can be used to generate metrics and can be archived.
• Excluded logs are discarded from indexes, but still flow through Livetail.
Score: 100.00
Single Choice
21)
There are many concepts to understand when it comes to Application Performance
Monitoring. To be able to troubleshoot APM issues effectively, you need to
understand what a Span is.

A span represents a logical unit of work in the system for a given period of time.
Each span consists of a span.name, start time, duration, and span tags.

Choose the correct statement.

Spans cannot be nested within each other.
A span contains only one trace.
• A span contains one or more traces.
Spans and traces are unrelated.
Score: 100.00
Single Choice
22)
For Log monitoring investigations, it is useful to know the components that affect
logs and how they appear in the UI.

Which component applies a list of sequential processors to a filtered subset of
incoming logs?

Processor
Attributes
Indexes
• Pipeline
Facets
Score: 100.00
Single Choice
23)
When troubleshooting APM issues at the Datadog Agent stage, you may need to ask for
more details via a flare.

Which level flare is usually the most useful?

TRACE
• DEBUG
INFO
Score: 100.00
Single Choice
24)
The Usage Attribution feature gives visibility into what’s driving product usage.
This is useful when you need to know the contribution of usage for chargeback
purposes, or need to monitor the daily usage to control usage spikes and trends.
You can report on this usage by tags, such as teams, business units, applications,
services, environments etc.

How many tags can you use simultaneously?

2
• 3
5
6
4
Score: 100.00
Single Choice
25)
One of the most important attributes of a log is the log status, such as INFO,
WARN, ERROR etc. Sometimes the default status can be incorrect in Datadog but this
can easily be remapped.

What is ONE reason for this status being incorrect?

The status string needs to be lower case.
• Only the first character of the status string is used to determine the log
status.
The status string needs to be upper case.
The status string is not in English.
All the characters of the status string are used to determine the log status (e.g.
warn ≠ warning).
Score: 100.00
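The first-character behaviour can be sketched like this; the mapping table is
illustrative, not Datadog's exact remapping rules:

```python
# First character of the status string decides the log status, so
# "warn" and "warning" both land on WARN. Table is a made-up example.
FIRST_CHAR_STATUS = {"e": "error", "w": "warn", "i": "info", "d": "debug"}

def remap_status(raw: str) -> str:
    """Map a raw status string to a log status by its first character,
    falling back to 'info' for anything unrecognised."""
    return FIRST_CHAR_STATUS.get(raw[:1].lower(), "info")
```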
Multiple Choice
26)
Issues with collecting metrics from integrations are commonly due to the
configuration. Before commencing detailed troubleshooting, make sure the
integration is enabled and configured correctly.

What should you do next? (Choose three).

• Ask for a flare.
• Confirm the integration is an Agent Check and not a crawler integration.
Uninstall and reinstall the Datadog Agent.
• Confirm the config file is correctly formatted.
Contact the 3rd party vendor support for the integration.
Score: 100.00
Single Choice
27)
All requests to Datadog’s API must be authenticated.

Choose the correct statement.

Requests that write data require both API key and APP key.
Requests that write data require an APP key.
Requests that write data require an API key.
Score: 0.00
Correct answer(s):
Requests that write data require both API key and APP key.
Requests that write data require an APP key.
• Requests that write data require an API key.
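A minimal sketch of an authenticated write request, assuming the public metrics
endpoint and the DD-API-KEY header; treat the URL and payload shape as
assumptions to verify against the current API reference:

```python
import json
import urllib.request

def build_submit_request(api_key: str, series: list) -> urllib.request.Request:
    """Build a metrics-submission request. Write requests need only an
    API key, carried in the DD-API-KEY header; no APP key is required."""
    return urllib.request.Request(
        "https://api.datadoghq.com/api/v2/series",  # assumed endpoint
        data=json.dumps({"series": series}).encode(),
        headers={
            "Content-Type": "application/json",
            "DD-API-KEY": api_key,
        },
        method="POST",
    )
```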
Single Choice
28)
It is common to compare the MAX/MIN/SUM metric values in CloudWatch to those seen
in Datadog. The values are likely to be different and this is expected.

Why?

• CloudWatch will display the raw MAX/MIN/SUM value, while Datadog will show the
MAX/MIN/SUM of the AVERAGE values received.
Values shown in Datadog are less accurate and have a degree of variance because our
method of extracting the values is different.
There are conflicting duplicate metrics with the same metric name but different
values.
Score: 100.00
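A quick numeric illustration with made-up samples shows why the two values
diverge: the raw MAX sees every sample, while the MAX of per-interval AVERAGE
values only sees the smoothed points.

```python
# Two collection intervals of raw samples (invented numbers).
intervals = [[1.0, 9.0], [4.0, 4.0]]

# CloudWatch-style: the raw maximum across all samples.
raw_max = max(v for interval in intervals for v in interval)

# Datadog-style (per the answer above): max of the per-interval averages.
max_of_averages = max(sum(i) / len(i) for i in intervals)
```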
Single Choice
29)
Using graphs is a useful way to check for gaps in metrics and confirm there is an
NTP offset issue. However, when the NTP offset is too large, it can be difficult or
impossible to confirm this in the Datadog UI.

What is the best way to confirm there’s a NTP offset issue with the Agent?

Create a dashboard using the Top List widget showing the NTP offset metric.
• Run the Agent Status command to see the NTP offset.
Extend the graphs start time and end time so the graph shows a larger window.
Score: 100.00
Single Choice
30)
As DogStatsD receives data, it aggregates multiple data points for each unique
metric into a single data point over a period of time.

What is this period of time called?

Buffer interval
• Flush interval
Queue interval
Submission interval
Score: 100.00
Multiple Choice
31)
AWS has extensive CloudWatch metrics. Datadog does not collect all statistics
(e.g. Min, Max, Sum, Avg, Count, p99, etc.) for all CloudWatch metrics. The AWS
documentation is checked for the recommended statistics, and if it is not
specified there, we collect the AVERAGE.

Choose two reasons why Datadog does not collect all statistics.

Only the common statistics requested by our customers are collected.
• To significantly reduce the amount of data collected and overall traffic between
Datadog and AWS.
• Only certain statistics are valuable for what a metric represents.
To reduce the work of our software engineers.
Score: 100.00
Single Choice
32)
For Log monitoring investigations, it is useful to know the components that affect
logs and how they appear in the UI.

Which component executes within a pipeline to complete a data-structuring action
on a log?

Attributes
• Processor
Pipeline
Facets
Indexes
Score: 100.00
Single Choice
33)
Roles categorize users and define what account permissions those users have, such
as what data they can read or what account assets they can modify.

By default, how many roles does Datadog offer?

2
4
• 3
5
6
Score: 100.00
Single Choice
34)
The Agent config is the datadog.yaml file. Depending on the operating system, the
datadog.yaml file will contain only the explicit variables, OR the full yaml file
including the default values.

Which operating system’s datadog.yaml file shows only the explicitly set variables?

• Linux
Windows
Score: 100.00
Single Choice
35)
A multi-org account consists of a single parent organization and multiple child
organizations. Submitting metrics and events to these accounts requires a valid API
key.

Choose the correct statement.

The API key used by all child organizations is the same and you only need one valid
API key.
• API keys are unique to organizations and not interchangeable.
The API key used needs to match the parent organization only.
Score: 100.00
Multiple Choice
36)
Datadog’s cloud provider integrations (AWS, Azure, GCP, etc) are crawler based.

Choose the two correct statements.

A crawler is simply the Datadog Agent that interacts with an API.
• A crawler is a program that runs at a repeated interval that interacts with an
API.
• A crawler pulls in metrics via APIs.
You do not need to configure anything on the cloud provider’s end.
Score: 100.00
Multiple Choice
37)
The Agent’s NTP Offset needs to be accurate. Any significant offset can have
undesired effects.

Which effects are caused by large NTP Offsets? (Choose three).

• Metric delays
• Incorrect alert triggers
Agent buffer overload
Agent crashloops
• Gaps in graphs of metrics
Score: 100.00
Single Choice
38)
Just in time (JIT) provisioning allows a user to be created within Datadog the
first time they try to log in. This eliminates the need for administrators to
manually create user accounts one at a time.

Organizations can configure multiple email domains to enable JIT provisioning for
all users of those domains.

False
• True
Score: 100.00
Single Choice
39)
For Log monitoring investigations, it is useful to know the components that affect
logs and how they appear in the UI.

Which component is a user-defined tag or attribute from your indexed logs, meant
for qualitative or quantitative data analysis?

Pipeline
Processor
Indexes
• Facets
Attributes
Score: 100.00
Single Choice
40)
Roles categorize users and define what account permissions those users have, such
as what data they can read or what account assets they can modify.

By default, which are the roles that Datadog offers?

Admin, Super User, Read Only
• Admin, Standard, Read Only
Admin, Power User, Standard, Read Only
Admin, Modify, Write, Read
Score: 100.00
