NodeJS : KafkaJSProtocolError: The group member's supported protocols are incompatible with those of existing members
I am trying to capture data from Kafka using the MongoDB Debezium connector, but I get an error when I try to read it with KafkaJS:
KafkaJSProtocolError: The group member's supported protocols are incompatible with those of existing members
I am using Docker images to capture the data.
Here are the steps I am following:
Start Zookeeper
docker run -it --rm --name zookeeper -p 2181:2181 -p 2888:2888 -p 3888:3888 debezium/zookeeper:latest
Start Kafka
docker run -it --rm --name kafka -p 9092:9092 --link zookeeper:zookeeper debezium/kafka:latest
I already have MongoDB running in replica set mode.
Start Debezium Kafka Connect
docker run -it --rm --name connect -p 8083:8083 -e GROUP_ID=1 -e CONFIG_STORAGE_TOPIC=my_connect_configs -e OFFSET_STORAGE_TOPIC=my_connect_offsets -e STATUS_STORAGE_TOPIC=my_connect_statuses --link zookeeper:zookeeper --link kafka:kafka debezium/connect:latest
Then post the MongoDB connector configuration:
curl -i -X POST -H "Accept:application/json" -H "Content-Type:application/json" localhost:8083/connectors/ -d '{ "name": "mongodb-connector", "config": { "connector.class": "io.debezium.connector.mongodb.MongoDbConnector", "mongodb.hosts": "rs0/abc.com:27017", "mongodb.name": "fullfillment", "collection.whitelist": "mongodev.test", "mongodb.user": "kafka", "mongodb.password": "kafka01" } }'
With this in place, if I run a watcher Docker container, I can see the data in JSON format in the console:
docker run -it --name watchermongo --rm --link zookeeper:zookeeper --link kafka:kafka debezium/kafka:0.9 watch-topic -a -k fullfillment.mongodev.test
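The messages that watch-topic prints are Debezium change events that a consumer has to unwrap. As a rough sketch of that parsing (the envelope below is a simplified, hypothetical insert event, not a verbatim Debezium message; with the MongoDB connector the after field carries the new document as a JSON string):

```javascript
// Simplified, hypothetical Debezium MongoDB insert event. Real events carry
// more fields (source metadata, and a schema block if schemas are enabled).
const rawMessageValue = JSON.stringify({
  payload: {
    op: 'c',                                 // c = create/insert
    after: '{"_id": 1, "name": "widget"}',   // new document, as a JSON string
    ts_ms: 1553670000000
  }
})

// Parsing happens in two steps: the envelope, then the embedded document.
const event = JSON.parse(rawMessageValue)
const doc = event.payload.after ? JSON.parse(event.payload.after) : null

console.log(event.payload.op)  // 'c'
console.log(doc.name)          // 'widget'
```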
But I want to capture this data in an application so that I can manipulate it, process it, and push it to Elasticsearch. For that I am using
https://github.com/tulios/kafkajs
When I run the consumer code, I get the error. Here is the code:
//'use strict';
// clientId=connect-1, groupId=1
const { Kafka } = require('kafkajs')

const kafka = new Kafka({
  clientId: 'connect-1',
  brokers: ['localhost:9092', 'localhost:9093']
})

// Consuming
const consumer = kafka.consumer({ groupId: '1' })

const consumeMessage = async () => {
  await consumer.connect()
  await consumer.subscribe({ topic: 'fullfillment.mongodev.test' })
  await consumer.run({
    eachMessage: async ({ topic, partition, message }) => {
      console.log({
        value: message.value.toString(),
      })
    },
  })
}

consumeMessage();
KafkaJSProtocolError: The group member's supported protocols are incompatible with those of existing members
docker apache-kafka apache-kafka-connect debezium kafkajs
What consumer code are you running? The fact that you get data from watch-topic shows that the Debezium/Kafka bit is working just fine. The error that you've got comes from KafkaJS and how you're using that.
– Robin Moffatt Mar 27 at 11:04
Also, for writing to Elasticsearch, use Kafka Connect with kafka-connect-elasticsearch. Hook that up to a Kafka topic to which you write your processed data (or directly to the topic from Debezium, if you just want to mirror MongoDB to Elasticsearch).
– Robin Moffatt Mar 27 at 11:06
Thanks @RobinMoffatt, I have updated the code for consuming data using a Node.js app. I tried kafka-connect-elasticsearch as well, but I am not able to install it on my VM.
– Mahajan344 Mar 27 at 12:23
Can you try varying groupId: '1' for groupId: 'foobar'? That error message suggests there are other consumers in a group of the same name.
– Robin Moffatt Mar 27 at 14:29
asked Mar 27 at 7:24 by Mahajan344, edited Mar 27 at 12:22
1 Answer
You should not be using the same groupId in both Connect and your KafkaJS consumer. If you do, they will be part of the same consumer group, which means that messages would only be consumed by one or the other, if it even worked at all.
If you change the groupId of your KafkaJS consumer to something unique, it should work.
Note that by default a new KafkaJS consumer group will start consuming from the latest offset, so it won't consume already produced messages. You can override this behavior with the fromBeginning flag in the consumer.subscribe call. See https://kafka.js.org/docs/consuming#from-beginning
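A minimal sketch of the fix (assumptions: the broker and topic names from the question, and 'mongo-consumer' as an arbitrary new group name; the guard function is illustrative, not part of kafkajs):

```javascript
// The Connect worker was started with -e GROUP_ID=1, so the KafkaJS consumer
// must NOT also use groupId '1'. A tiny guard makes the collision explicit.
function consumerGroupConfig(connectWorkerGroupId, groupId) {
  if (String(groupId) === String(connectWorkerGroupId)) {
    throw new Error(`groupId "${groupId}" collides with the Connect worker group`)
  }
  return { groupId: String(groupId) }
}

// With kafkajs, this would be wired up roughly as:
//
//   const { Kafka } = require('kafkajs')
//   const kafka = new Kafka({ clientId: 'mongo-consumer', brokers: ['localhost:9092'] })
//   const consumer = kafka.consumer(consumerGroupConfig('1', 'mongo-consumer'))
//   await consumer.connect()
//   await consumer.subscribe({
//     topic: 'fullfillment.mongodev.test',
//     fromBeginning: true,   // replay events produced before this group existed
//   })

console.log(consumerGroupConfig('1', 'mongo-consumer').groupId)  // 'mongo-consumer'
```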
So that means even though I didn't pass any client id or group id to the Zookeeper and Kafka Docker images, I can use any random client id and group id while consuming?
– Mahajan344 Mar 28 at 9:03
Thanks @tommy for your help. I just changed the group id from 1 to 2 and I am able to see the data in the console through this Node.js app. Thanks a lot. Any suggestion on how to sink this data into Elasticsearch? Or should a custom Node app do this dirty work?
– Mahajan344 Mar 28 at 9:20
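If the kafka-connect-elasticsearch route isn't workable, a Node consumer can build the request body for Elasticsearch's _bulk API itself. A minimal sketch (the index name 'mongodev-test' and the document shape are assumptions; the actual HTTP call is left out):

```javascript
// Elasticsearch's _bulk endpoint takes newline-delimited JSON: an action line
// followed by the document line, for each document, ending with a newline.
function toBulkBody(indexName, docs) {
  return docs
    .map(doc => {
      const action = JSON.stringify({ index: { _index: indexName, _id: doc._id } })
      return action + '\n' + JSON.stringify(doc)
    })
    .join('\n') + '\n'
}

const body = toBulkBody('mongodev-test', [
  { _id: 1, name: 'widget' },
  { _id: 2, name: 'gadget' },
])

console.log(body)
// {"index":{"_index":"mongodev-test","_id":1}}
// {"_id":1,"name":"widget"}
// {"index":{"_index":"mongodev-test","_id":2}}
// {"_id":2,"name":"gadget"}
```

This body would then be POSTed to http://localhost:9200/_bulk with Content-Type: application/x-ndjson.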
answered Mar 28 at 9:01 by Tommy Brunn, edited Mar 28 at 10:23