
Kafka consumer not consuming messages from secured kafka cluster



I am using kafka-console-consumer and it works fine. Here is the config file (admin.conf):

security.protocol=SASL_SSL
sasl.mechanism=SCRAM-SHA-256
sasl.jaas.config=org.apache.kafka.common.security.scram.ScramLoginModule required username="admin" password="admin-secret";
ssl.truststore.location=F:\\temp\\kafka.server.truststore.jks
ssl.truststore.password=serversecret
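Since the CLI consumer works with admin.conf, one low-risk way to rule out transcription mistakes in the Java client is to load that same file into a Properties object instead of re-typing each key. A minimal sketch (a subset of the admin.conf lines is inlined via StringReader so the snippet is self-contained; in real use you would pass a FileReader over the file path above):

```java
import java.io.StringReader;
import java.util.Properties;

public class LoadAdminConf {
    public static void main(String[] args) throws Exception {
        // A subset of the admin.conf lines above; in real use:
        // props.load(new java.io.FileReader("F:\\temp\\admin.conf"));
        String conf = String.join("\n",
                "security.protocol=SASL_SSL",
                "sasl.mechanism=SCRAM-SHA-256",
                "ssl.truststore.password=serversecret");
        Properties props = new Properties();
        props.load(new StringReader(conf));
        // These entries can then be merged into the consumer's Properties,
        // guaranteeing the Java client and the CLI use identical settings.
        System.out.println(props.getProperty("security.protocol"));
    }
}
```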

The console-consumer command and its output:

PS F:\Soft\confluent-5.1.0-2.11\confluent-5.1.0\bin\windows> ./kafka-console-consumer.bat --bootstrap-server prd-app-kafka-0:9093,prd-app-kafka-1:9093,prd-app-kafka-2:9093 --from-beginning --topic test_topic3 --consumer.config admin.conf
dgdsfgdf
sfgsfg
rherht
errger
11111111111
aaaaaaaaaaaa
cvvvc
2222222222222222
aaaaa
csdacv
2222222222222222222222
12121121212
35364646346
Processed a total of 13 messages

But the Java client consumer is not consuming any messages. Here is my Kafka consumer Java code:

    import java.time.Duration;
    import java.util.Arrays;
    import java.util.Properties;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;

    //System.setProperty("java.security.auth.login.config", "F:\\temp\\jaas.conf");

    Properties props = new Properties();
    props.put("bootstrap.servers", "prd-app-kafka-0:9093,prd-app-kafka-1:9093,prd-app-kafka-2:9093");
    props.put("sasl.jaas.config", "org.apache.kafka.common.security.scram.ScramLoginModule required username=\"admin\" password=\"admin-secret\";");
    props.put("group.id", "demo-consumer");
    props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
    props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
    props.put("auto.offset.reset", "earliest");
    props.put("security.protocol", "SASL_SSL");
    props.put("sasl.mechanism", "SCRAM-SHA-256");
    props.put("ssl.truststore.location", "F:\\temp\\kafka.server.truststore.jks");
    props.put("ssl.truststore.password", "serversecret");

    KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props);
    System.out.println("Consumer created...");
    consumer.subscribe(Arrays.asList("test_topic3"));
    try {
        while (true) {
            System.out.println("Consumer polling started....");
            ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(10000));
            System.out.println("records.count():: " + records.count());
            for (ConsumerRecord<String, String> record : records)
                System.out.println(record.key() + ": " + record.value());
        }
    } finally {
        consumer.close();
    }
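One frequent source of silent authentication problems is the inline sasl.jaas.config string: it must be a single line, with escaped quotes around the credentials and a trailing semicolon. A small sketch that assembles it with String.format (a hypothetical helper, not part of the original code) makes the required shape easy to verify:

```java
public class JaasConfigString {
    // Builds the exact one-line value the client expects for sasl.jaas.config:
    // org.apache.kafka.common.security.scram.ScramLoginModule required username="..." password="...";
    static String scramJaas(String user, String pass) {
        return String.format(
            "org.apache.kafka.common.security.scram.ScramLoginModule required username=\"%s\" password=\"%s\";",
            user, pass);
    }

    public static void main(String[] args) {
        // Prints the value used in the consumer code above, quotes and semicolon included.
        System.out.println(scramJaas("admin", "admin-secret"));
    }
}
```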

Here are the Maven dependencies for the Kafka client libraries:

    <dependency>
        <groupId>org.apache.kafka</groupId>
        <artifactId>kafka-clients</artifactId>
        <version>2.4.1</version>
    </dependency>

    <dependency>
        <groupId>org.apache.kafka</groupId>
        <artifactId>kafka-streams</artifactId>
        <version>2.4.1</version>
    </dependency>

Here is my jaas.conf file.

KafkaClient {
    org.apache.kafka.common.security.scram.ScramLoginModule required
    username="admin123"
    password="admin-secret";
};
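Note that this static file only takes effect when the JVM is pointed at it via java.security.auth.login.config (the System.setProperty line that is commented out in the code above); when sasl.jaas.config is set in the consumer properties, the Kafka client uses that value and ignores the static file. Also note that this file uses username "admin123" while the inline property uses "admin". A minimal sketch of enabling the static file, only needed if the inline property is removed:

```java
public class StaticJaasFile {
    public static void main(String[] args) {
        // Point the JVM at the static JAAS file; the Kafka client falls back to it
        // only when no sasl.jaas.config property is supplied on the consumer.
        System.setProperty("java.security.auth.login.config", "F:\\temp\\jaas.conf");
        System.out.println(System.getProperty("java.security.auth.login.config"));
    }
}
```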

Here is my output:

2020-04-07 17:12:45 INFO  ConsumerConfig - ConsumerConfig values: 
	allow.auto.create.topics = true
	auto.commit.interval.ms = 5000
	auto.offset.reset = earliest
	bootstrap.servers = [prd-app-kafka-0:9093, prd-app-kafka-1:9093, prd-app-kafka-2:9093]
	check.crcs = true
	client.dns.lookup = default
	client.id = 
	client.rack = 
	connections.max.idle.ms = 540000
	default.api.timeout.ms = 60000
	enable.auto.commit = true
	exclude.internal.topics = true
	fetch.max.bytes = 52428800
	fetch.max.wait.ms = 500
	fetch.min.bytes = 1
	group.id = demo-consumer
	group.instance.id = null
	heartbeat.interval.ms = 3000
	interceptor.classes = []
	internal.leave.group.on.close = true
	isolation.level = read_uncommitted
	key.deserializer = class org.apache.kafka.common.serialization.StringDeserializer
	max.partition.fetch.bytes = 1048576
	max.poll.interval.ms = 300000
	max.poll.records = 500
	metadata.max.age.ms = 300000
	metric.reporters = []
	metrics.num.samples = 2
	metrics.recording.level = INFO
	metrics.sample.window.ms = 30000
	partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor]
	receive.buffer.bytes = 65536
	reconnect.backoff.max.ms = 1000
	reconnect.backoff.ms = 50
	request.timeout.ms = 30000
	retry.backoff.ms = 100
	sasl.client.callback.handler.class = null
	sasl.jaas.config = [hidden]
	sasl.kerberos.kinit.cmd = /usr/bin/kinit
	sasl.kerberos.min.time.before.relogin = 60000
	sasl.kerberos.service.name = null
	sasl.kerberos.ticket.renew.jitter = 0.05
	sasl.kerberos.ticket.renew.window.factor = 0.8
	sasl.login.callback.handler.class = null
	sasl.login.class = null
	sasl.login.refresh.buffer.seconds = 300
	sasl.login.refresh.min.period.seconds = 60
	sasl.login.refresh.window.factor = 0.8
	sasl.login.refresh.window.jitter = 0.05
	sasl.mechanism = SCRAM-SHA-256
	security.protocol = SASL_SSL
	security.providers = null
	send.buffer.bytes = 131072
	session.timeout.ms = 10000
	ssl.cipher.suites = null
	ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
	ssl.endpoint.identification.algorithm = https
	ssl.key.password = null
	ssl.keymanager.algorithm = SunX509
	ssl.keystore.location = null
	ssl.keystore.password = null
	ssl.keystore.type = JKS
	ssl.protocol = TLS
	ssl.provider = null
	ssl.secure.random.implementation = null
	ssl.trustmanager.algorithm = PKIX
	ssl.truststore.location = F:\temp\kafka.server.truststore.jks
	ssl.truststore.password = [hidden]
	ssl.truststore.type = JKS
	value.deserializer = class org.apache.kafka.common.serialization.StringDeserializer

2020-04-07 17:12:45 DEBUG KafkaConsumer - [Consumer clientId=consumer-demo-consumer-1, groupId=demo-consumer] Initializing the Kafka consumer
2020-04-07 17:12:45 INFO  AbstractLogin - Successfully logged in.
2020-04-07 17:12:46 DEBUG SslEngineBuilder - Created SSL context with keystore null, truststore SecurityStore(path=F:\temp\kafka.server.truststore.jks, modificationTime=Mon Apr 06 18:56:42 GMT+03:00 2020), provider SunJSSE.
2020-04-07 17:12:46 INFO  AppInfoParser - Kafka version: 5.4.1-ccs
2020-04-07 17:12:46 INFO  AppInfoParser - Kafka commitId: bd7407ab4c5d30c1
2020-04-07 17:12:46 INFO  AppInfoParser - Kafka startTimeMs: 1586268766652
2020-04-07 17:12:46 DEBUG KafkaConsumer - [Consumer clientId=consumer-demo-consumer-1, groupId=demo-consumer] Kafka consumer initialized
Consumer created...
2020-04-07 17:12:46 INFO  KafkaConsumer - [Consumer clientId=consumer-demo-consumer-1, groupId=demo-consumer] Subscribed to topic(s): test_topic3
Consumer polling started....
2020-04-07 17:12:46 DEBUG AbstractCoordinator - [Consumer clientId=consumer-demo-consumer-1, groupId=demo-consumer] Sending FindCoordinator request to broker prd-app-kafka-0:9093 (id: -1 rack: null)
2020-04-07 17:12:46 DEBUG NetworkClient - [Consumer clientId=consumer-demo-consumer-1, groupId=demo-consumer] Initiating connection to node prd-app-kafka-0:9093 (id: -1 rack: null) using address prd-app-kafka-0/10.10.54.72
2020-04-07 17:12:47 DEBUG SaslClientAuthenticator - Set SASL client state to SEND_APIVERSIONS_REQUEST
2020-04-07 17:12:47 DEBUG SaslClientAuthenticator - Creating SaslClient: client=null;service=kafka;serviceHostname=prd-app-kafka-0;mechs=[SCRAM-SHA-256]
2020-04-07 17:12:47 DEBUG ScramSaslClient - Setting SASL/SCRAM_SHA_256 client state to SEND_CLIENT_FIRST_MESSAGE
2020-04-07 17:12:47 DEBUG Selector - [Consumer clientId=consumer-demo-consumer-1, groupId=demo-consumer] Created socket with SO_RCVBUF = 65536, SO_SNDBUF = 131072, SO_TIMEOUT = 0 to node -1
2020-04-07 17:12:47 DEBUG NetworkClient - [Consumer clientId=consumer-demo-consumer-1, groupId=demo-consumer] Completed connection to node -1. Fetching API versions.
records.count():: 0
Consumer polling started....
records.count():: 0
Consumer polling started....
records.count():: 0
Consumer polling started....
2020-04-07 17:13:16 DEBUG AbstractCoordinator - [Consumer clientId=consumer-demo-consumer-1, groupId=demo-consumer] Coordinator discovery failed, refreshing metadata
records.count():: 0
Consumer polling started....
2020-04-07 17:13:26 DEBUG AbstractCoordinator - [Consumer clientId=consumer-demo-consumer-1, groupId=demo-consumer] Sending FindCoordinator request to broker prd-app-kafka-0:9093 (id: -1 rack: null)
records.count():: 0
Consumer polling started....
records.count():: 0
Consumer polling started....
2020-04-07 17:13:56 DEBUG AbstractCoordinator - [Consumer clientId=consumer-demo-consumer-1, groupId=demo-consumer] Coordinator discovery failed, refreshing metadata
records.count():: 0
Consumer polling started....
2020-04-07 17:13:56 DEBUG AbstractCoordinator - [Consumer clientId=consumer-demo-consumer-1, groupId=demo-consumer] Sending FindCoordinator request to broker prd-app-kafka-0:9093 (id: -1 rack: null)
records.count():: 0
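The loop in the log above, a FindCoordinator request followed by "Coordinator discovery failed, refreshing metadata" with zero records, means the client never obtains a usable group coordinator from the brokers. Before digging into SASL or ACL issues, one cheap thing to rule out from the same JVM is host-name resolution for each broker in bootstrap.servers (a debugging sketch, not part of the original code; the host names are taken from the configuration above):

```java
import java.net.InetAddress;

public class ResolveBrokers {
    public static void main(String[] args) {
        String[] hosts = {"prd-app-kafka-0", "prd-app-kafka-1", "prd-app-kafka-2"};
        for (String host : hosts) {
            try {
                // Every advertised broker host must resolve from the client machine,
                // not just the bootstrap node the first connection goes to.
                System.out.println(host + " -> " + InetAddress.getByName(host).getHostAddress());
            } catch (Exception e) {
                System.out.println(host + " -> unresolvable: " + e.getMessage());
            }
        }
    }
}
```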

Please let me know if I am missing any required configuration.

huangapple
  • Published on 2020-04-07 22:52:01
  • Please retain this link when reposting: https://java.coder-hub.com/61082874.html