
Kafka JsonConverter schema with nullable field and topic with forward slash


We are trying to set up a source connector from MQTT to Landoop Kafka (later we will also need a sink to InfluxDB).
The payload is in JSON format and we can't change that.
Some of the fields are nullable strings.
We also use a forward slash to separate subtopics (e.g. `Test/1`, `Test/2`, etc.).

We have tried the `JsonConverterWithSchemaEvolution`, but it has problems with the forward slashes.
`JsonSimpleConverter`, on the other hand, does not seem to support null values, and registering the auto-generated schema from the first converter results in an "incompatible schema" error.

What can we do?



MQTT DataSource configuration:




{
  "name": "DataSource",
  "config": {
    "connector.class": "com.datamountaineer.streamreactor.connect.mqtt.source.MqttSourceConnector",
    "connect.mqtt.username": USER,
    "connect.mqtt.password": PASS,
    "tasks.max": "1",
    "connect.mqtt.kcql": "INSERT INTO TestJson SELECT * FROM Test/1 WITHCONVERTER=`com.datamountaineer.streamreactor.connect.converters.source.JsonSimpleConverter`",
    "connect.mqtt.service.quality": "2",
    "connect.mqtt.hosts": "tcp://mqtt-broker:1883"
  }
}
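Note that Kafka topic names may only contain letters, digits, `.`, `_` and `-`, so a slash-separated MQTT topic such as `Test/1` cannot be used verbatim as a Kafka topic name. In the KCQL above the records already land in a plain topic (`TestJson`); if per-subtopic Kafka topics were ever needed, a mapping along these lines would be required (a minimal sketch with a hypothetical helper name, not part of the connector):

```python
import re

# Kafka topic names are restricted to [a-zA-Z0-9._-]; MQTT topic
# separators ("/") must therefore be replaced before use.
ILLEGAL = re.compile(r"[^a-zA-Z0-9._-]")

def to_kafka_topic(mqtt_topic, sep="_"):
    """Map a slash-separated MQTT topic onto a legal Kafka topic name."""
    return ILLEGAL.sub(sep, mqtt_topic)

assert to_kafka_topic("Test/1") == "Test_1"
assert to_kafka_topic("plant/line2/sensor", ".") == "plant.line2.sensor"
```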




Sample JSON message:




{
   "TimeStamp": "24/07/2018 14:38:00.2650000",
   "unit": "U3",
   "Acc1": 36.0,
   "PPR": null
}
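The root of the null problem can be seen with a small stdlib-only sketch (the `conforms` helper below is illustrative, not the converter's actual code): the sample message satisfies the nullable union schema, but the null value of `PPR` cannot satisfy the plain `string` type that a converter without null support would infer.

```python
import json

# Sample message from the question; PPR is null.
message = json.loads("""
{
   "TimeStamp": "24/07/2018 14:38:00.2650000",
   "unit": "U3",
   "Acc1": 36.0,
   "PPR": null
}
""")

# Field types as in the auto-generated (nullable) schema.
nullable_schema = {
    "TimeStamp": ["null", "string"],
    "unit": ["null", "string"],
    "Acc1": ["null", "double"],
    "PPR": ["null", "string"],
}

# Field types as a converter without null support would infer them.
plain_schema = {"TimeStamp": "string", "unit": "string",
                "Acc1": "double", "PPR": "string"}

PY_TYPES = {"string": str, "double": float}

def conforms(value, avro_type):
    """Check one value against an Avro type name or union (hypothetical helper)."""
    if isinstance(avro_type, list):          # union, e.g. ["null", "string"]
        return any(conforms(value, t) for t in avro_type)
    if avro_type == "null":
        return value is None
    return isinstance(value, PY_TYPES[avro_type])

# The nullable schema accepts every field, including the null PPR...
assert all(conforms(message[f], t) for f, t in nullable_schema.items())
# ...but a plain non-nullable "string" cannot hold null.
assert not conforms(message["PPR"], plain_schema["PPR"])
```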



Auto-generated value schema:




{
   "type": "record",
   "name": "TestJson",
   "fields": [
      {
         "name": "TimeStamp",
         "type": ["null", "string"],
         "default": null
      },
      {
         "name": "unit",
         "type": ["null", "string"],
         "default": null
      },
      {
         "name": "Acc1",
         "type": ["null", "double"],
         "default": null
      },
      {
         "name": "PPR",
         "type": ["null", "string"],
         "default": null
      }
   ],
   "connect.name": "TestJson"
}



Exception when using auto-generated schema and JsonSimpleConverter:



org.apache.kafka.connect.errors.DataException: TestJson
    at io.confluent.connect.avro.AvroConverter.fromConnectData(AvroConverter.java:77)
    at org.apache.kafka.connect.runtime.WorkerSourceTask.sendRecords(WorkerSourceTask.java:253)
    at org.apache.kafka.connect.runtime.WorkerSourceTask.execute(WorkerSourceTask.java:219)
    at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:170)
    at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:214)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)

Caused by: org.apache.kafka.common.errors.SerializationException: Error registering Avro schema: "type":"record","name":"TestJson","fields":["name":"TimeStamp","type":"string","name":"unit","type":"string","name":"Acc1","type":"double","name":"PPR","type":"string"],"connect.name":"TestJson"

Caused by: io.confluent.kafka.schemaregistry.client.rest.exceptions.RestClientException: Schema being registered is incompatible with an earlier schema; error code: 409
    at io.confluent.kafka.schemaregistry.client.rest.RestService.sendHttpRequest(RestService.java:203)
    at io.confluent.kafka.schemaregistry.client.rest.RestService.httpRequest(RestService.java:229)
    at io.confluent.kafka.schemaregistry.client.rest.RestService.registerSchema(RestService.java:320)
    at io.confluent.kafka.schemaregistry.client.rest.RestService.registerSchema(RestService.java:312)
    at io.confluent.kafka.schemaregistry.client.rest.RestService.registerSchema(RestService.java:307)
    at io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.registerAndGetId(CachedSchemaRegistryClient.java:115)
    at io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.register(CachedSchemaRegistryClient.java:154)
    at io.confluent.kafka.serializers.AbstractKafkaAvroSerializer.serializeImpl(AbstractKafkaAvroSerializer.java:79)
    at io.confluent.connect.avro.AvroConverter$Serializer.serialize(AvroConverter.java:116)
    at io.confluent.connect.avro.AvroConverter.fromConnectData(AvroConverter.java:75)
    at org.apache.kafka.connect.runtime.WorkerSourceTask.sendRecords(WorkerSourceTask.java:253)
    at org.apache.kafka.connect.runtime.WorkerSourceTask.execute(WorkerSourceTask.java:219)
    at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:170)
    at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:214)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
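The 409 in the trace means the Schema Registry already holds an earlier schema for the `TestJson-value` subject (apparently the non-nullable one produced via `JsonSimpleConverter`), and the new nullable schema fails that subject's compatibility check. While iterating, one pragmatic workaround is to relax the subject's compatibility level through the registry's REST API. The sketch below only *builds* the request with Python's stdlib and assumes a registry at `localhost:8081` (adjust to your environment); uncomment the last line to actually send it, or issue the equivalent call with curl.

```python
import json
import urllib.request

# Hypothetical registry location; adjust to your environment.
REGISTRY = "http://localhost:8081"
SUBJECT = "TestJson-value"

def compatibility_request(registry, subject, level):
    """Build (but do not send) the Schema Registry request that sets a
    subject's compatibility level, e.g. to "NONE" so a changed schema
    can be registered."""
    body = json.dumps({"compatibility": level}).encode()
    return urllib.request.Request(
        f"{registry}/config/{subject}",
        data=body,
        method="PUT",
        headers={"Content-Type": "application/vnd.schemaregistry.v1+json"},
    )

req = compatibility_request(REGISTRY, SUBJECT, "NONE")
# urllib.request.urlopen(req)  # uncomment to actually send it
```

Relaxing compatibility (or deleting a throwaway test subject) is only appropriate in development; in production the better fix is to make sure the first schema registered is already the nullable one.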



































  • It would be valuable if you also posted the exception that appears in the logs

    – wardziniak
    Mar 28 at 16:43

















json apache-kafka mqtt nullable jsonconvert






edited Apr 3 at 7:21

asked Mar 28 at 16:39
elabard














