In Kafka, every broker... (select three)
B, E, F
Kafka topics are divided into partitions and spread across brokers. Each broker knows about all the
cluster metadata and every broker can act as a bootstrap broker, but only one of them is elected controller.
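Because any broker can serve as a bootstrap broker, a client only needs one reachable address to discover the whole cluster. A minimal sketch (the broker address and topic name are illustrative, not from the question):

```shell
# Connecting through any single broker is enough; the client fetches the
# full cluster metadata from it and then talks to partition leaders directly.
kafka-console-consumer --bootstrap-server broker1:9092 \
  --topic my-topic --from-beginning
```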
To continuously export data from Kafka into a target database, I should use
Kafka Connect Sink is used to export data from Kafka to external databases and Kafka Connect Source
is used to import from external databases into Kafka.
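As a sketch of the sink direction, a connector can be registered through the Kafka Connect REST API. The connector name, JDBC URL, and topic below are illustrative assumptions; the connector class is the Confluent JDBC sink:

```shell
# Register a JDBC sink connector that continuously exports the "sales"
# topic into a target database (illustrative names and URL).
curl -X POST http://localhost:8083/connectors \
  -H "Content-Type: application/json" \
  -d '{
    "name": "sales-db-sink",
    "config": {
      "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
      "topics": "sales",
      "connection.url": "jdbc:postgresql://db:5432/analytics",
      "auto.create": "true"
    }
  }'
```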
A Zookeeper configuration has tickTime of 2000, initLimit of 20 and syncLimit of 5. What's the
timeout value for followers to connect to Zookeeper?
tickTime is 2000 ms, and initLimit is the setting that applies when followers initially connect and
sync to Zookeeper, so the timeout is 2000 * 20 = 40000 ms = 40 s.
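The arithmetic can be checked directly from the question's values:

```shell
# zoo.cfg values from the question:
tickTime=2000    # ms per tick
initLimit=20     # ticks allowed for a follower's initial connect/sync
syncLimit=5      # ticks a follower may lag once synced (not used here)

# follower connect timeout = tickTime * initLimit
echo $((tickTime * initLimit))   # 40000 ms = 40 s
```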
In Avro, adding an element to an enum without a default is a __ schema evolution
Since Confluent 5.4.0, Avro 1.9.1 is used. Since a default value was added to the enum complex type,
schema resolution changed:

(< 1.9.1) if both are enums: if the writer's symbol is not present in the reader's enum, then an error
is signalled.

(>= 1.9.1) if both are enums: if the writer's symbol is not present in the reader's enum and the reader
has a default value, then that value is used; otherwise an error is signalled.
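An enum with a default looks like the fragment below (the type and symbol names are illustrative). A reader using this schema falls back to "NEW" when the writer's data contains a symbol it doesn't know:

```json
{
  "type": "enum",
  "name": "OrderStatus",
  "symbols": ["NEW", "SHIPPED", "DELIVERED"],
  "default": "NEW"
}
```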
There are five brokers in a cluster, a topic with 10 partitions and a replication factor of 3, and a quota of
producer_byte_rate of 1 MB/s has been specified for the client. What is the maximum throughput
allowed for the client?
The quota is enforced per broker, so the client is allowed to produce at 1 MB/s to each broker. With
the topic's partitions spread across all 5 brokers, the maximum aggregate throughput is 5 * 1 MB/s = 5 MB/s.
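The per-broker reasoning reduces to a one-line calculation:

```shell
# Client quotas are applied independently by each broker the client
# produces to, so the aggregate cap scales with the broker count.
brokers=5
quota_mb_per_sec=1
echo $((brokers * quota_mb_per_sec))   # maximum aggregate MB/s = 5
```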
A topic "sales" is being produced to in the Americas region. You are mirroring this topic using Mirror
Maker to the European region. From there, you are only reading the topic for analytics purposes.
What kind of mirroring is this?
This is active-passive, as the replicated topic is used for read-only purposes.
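A minimal MirrorMaker (v1) sketch for this kind of active-passive setup; the properties file names are illustrative, with consumer.properties pointing at the Americas (source) cluster and producer.properties at the European (target) cluster:

```shell
# Mirror only the "sales" topic from the source cluster to the target
# cluster; the target copy is consumed for analytics only.
kafka-mirror-maker \
  --consumer.config consumer.properties \
  --producer.config producer.properties \
  --whitelist "sales"
```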
What is true about replicas?
Replicas are passive: they don't handle produce or consume requests. Produce and consume requests
are sent to the broker hosting the partition leader.
If I want to send binary data through the REST proxy, it needs to be base64 encoded. Which
component needs to encode the binary data into base 64?
The REST Proxy requires that the data it receives over REST is already base64 encoded, hence it is the
responsibility of the producer (the HTTP client) to encode it.
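A sketch of what the client-side encoding looks like; the payload, topic name, and proxy URL are illustrative assumptions:

```shell
# The HTTP client base64-encodes the binary payload before sending it;
# the REST Proxy decodes it before writing the raw bytes to Kafka.
encoded=$(printf 'some-binary-payload' | base64)
echo "$encoded"   # c29tZS1iaW5hcnktcGF5bG9hZA==

# The encoded value is then embedded in the JSON request body, e.g.:
# curl -X POST http://localhost:8082/topics/my-topic \
#   -H "Content-Type: application/vnd.kafka.binary.v2+json" \
#   -d "{\"records\":[{\"value\":\"$encoded\"}]}"
```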
What is true about Kafka brokers and clients from version 0.10.2 onwards?
Kafka's bidirectional client compatibility, introduced in 0.10.2, allows newer clients to talk to older
brokers and older clients to talk to newer brokers.
How will you set the retention for the topic named "my-topic" to 1 hour?
retention.ms can be configured at the topic level, either when creating the topic or by altering it
afterwards. It shouldn't be set at the broker level (log.retention.ms), as that would impact all the
topics in the cluster, not just the one we are interested in.
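Both topic-level approaches can be sketched as follows (1 hour = 3,600,000 ms; the bootstrap address and partition/replication counts are illustrative):

```shell
# Alter an existing topic:
kafka-configs --bootstrap-server localhost:9092 --alter \
  --entity-type topics --entity-name my-topic \
  --add-config retention.ms=3600000

# Or set it at creation time:
kafka-topics --bootstrap-server localhost:9092 --create \
  --topic my-topic --partitions 3 --replication-factor 1 \
  --config retention.ms=3600000
```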