
Databricks-Certified-Professional-Data-Engineer Exam Topic 1 Question 9 Discussion:
Question #: 9
Topic #: 1

A new data engineer notices that a critical field was omitted from an application that writes its Kafka source to Delta Lake. Although the field was present in the Kafka source, it is also missing from the data written to dependent, long-term storage. The retention threshold on the Kafka service is seven days, and the pipeline has been in production for three months.

Which describes how Delta Lake can help to avoid data loss of this nature in the future?


A. The Delta log and Structured Streaming checkpoints record the full history of the Kafka producer.

B. Delta Lake schema evolution can retroactively calculate the correct value for newly added fields, as long as the data was in the original source.

C. Delta Lake automatically checks that all fields present in the source data are included in the ingestion layer.

D. Data can never be permanently dropped or deleted from Delta Lake, so data loss is not possible under any circumstance.

E. Ingesting all raw data and metadata from Kafka to a bronze Delta table creates a permanent, replayable history of the data state.
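
The bronze-layer pattern described in option E can be illustrated with a minimal PySpark Structured Streaming sketch. The broker address, topic, table name, and checkpoint path below are hypothetical placeholders rather than details from the question, and `spark` is assumed to be the ambient SparkSession as in a Databricks notebook.

from pyspark.sql import functions as F

# Read the raw Kafka stream; placeholder broker and topic values.
raw = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "events")
    .option("startingOffsets", "earliest")
    .load()
)

# Keep the full, unparsed payload plus Kafka metadata so every field can be
# replayed later, including ones the downstream logic does not yet extract.
bronze = raw.select(
    F.col("key").cast("string"),
    F.col("value").cast("string"),   # store the raw message body as-is
    "topic", "partition", "offset", "timestamp",
)

# Append to a bronze Delta table; placeholder checkpoint path and table name.
(
    bronze.writeStream
    .format("delta")
    .option("checkpointLocation", "/mnt/checkpoints/bronze_kafka_raw")
    .outputMode("append")
    .toTable("bronze_kafka_raw")
)

Because the value column is stored unparsed, a field that later turns out to be needed can be backfilled by reprocessing the bronze table, even after Kafka's seven-day retention window has passed.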


