In Avro, removing or adding a field that has a default is a fully compatible schema evolution.
Correct Answer:
A
Clients with the new schema will be able to read records saved with the old schema, and clients with the old schema will be able to read records saved with the new schema.
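As a sketch of why this works, consider the schema below (the record and field names are hypothetical). Because the added field carries a default, a reader using the new schema fills in the default when the field is absent from old records, and a reader using the old schema simply ignores the extra field in new records:

```json
{
  "type": "record",
  "name": "Customer",
  "fields": [
    {"name": "id", "type": "long"},
    {"name": "email", "type": ["null", "string"], "default": null}
  ]
}
```

Removing the `email` field again would be equally safe, since readers of records written without it can always fall back to the default.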
You are building a consumer application that processes events from a Kafka topic. What is the most important metric to monitor to ensure real-time processing?
Correct Answer:
B
This metric shows the current lag, i.e. the number of messages the consumer is behind the end of the partition log on the broker.
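As a rough sketch of what the lag metric measures (function names here are illustrative, not Kafka API calls): per-partition lag is the log end offset minus the consumer's current position, and the reported metric is typically the maximum over the assigned partitions.

```python
def partition_lag(log_end_offset: int, consumer_position: int) -> int:
    """Lag for one partition: messages published but not yet consumed."""
    return log_end_offset - consumer_position

def max_lag(lags: list[int]) -> int:
    """Worst-case lag across all partitions assigned to the consumer."""
    return max(lags)

# Example: three partitions at different stages of catch-up
lags = [partition_lag(1000, 990), partition_lag(500, 500), partition_lag(80, 35)]
print(max_lag(lags))  # -> 45
```

If this value grows over time, the consumer is falling behind and cannot keep up with the producer's rate, which is exactly what threatens real-time processing.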
What is true about Kafka brokers and clients from version 0.10.2 onwards?
Correct Answer:
C
Kafka's bidirectional client compatibility, introduced in 0.10.2, allows this. Read more here: https://www.confluent.io/blog/upgrading-apache-kafka-clients-just-got-easier/
You are doing complex calculations using a machine learning framework on records fetched from a Kafka topic. It takes about 6 minutes to process a record batch, and the consumer group enters a rebalance even though the consumer is still running. How can you improve this scenario?
Correct Answer:
A
Here, we need to double the max.poll.interval.ms setting (default 300000, i.e. 5 minutes) so that Kafka considers the consumer dead only if it hasn't called poll() within 10 minutes instead of 5.
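A minimal consumer configuration sketch for this fix (the value is illustrative; tune it to your actual worst-case batch processing time):

```properties
# Default is 300000 (5 minutes); doubled so a ~6-minute batch
# does not cause the consumer to be evicted from the group.
max.poll.interval.ms=600000
```

Note that raising this value also delays how long a genuinely stuck consumer holds its partitions before a rebalance, so it should not be set arbitrarily high.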
A consumer is configured with enable.auto.commit=false. What happens when close() is called on the consumer object?
Correct Answer:
B
Calling close() on the consumer immediately triggers a partition rebalance, as the consumer will no longer be available. With enable.auto.commit=false, any offsets not explicitly committed before close() are not committed on shutdown.