


java.lang.StackOverflowError when doing Spark Streaming


I'm using Spark Streaming to parse some Kafka messages in real time. Before parsing the messages, I read some local files and construct two objects, GridMatrix GM and LinkMatcher LM, which are needed for parsing. Here is the code that gives me java.lang.StackOverflowError when I submit it with spark-submit xxx.jar:



public class Stream implements Serializable {
    GridMatrix GM = GridMatrixConstructor.init_Grid_Matrix(0.001);
    LinkMatcher LM = new LinkMatcher();

    public void parse_rdd_record(String[] fields) {
        try {
            System.out.println(InetAddress.getLocalHost().getHostName() + "---->" + Thread.currentThread());
        } catch (Exception e) {
            e.printStackTrace();
        }
        System.out.println(LM.GF.toString());
        System.out.println(GM.topleft_x);
    }

    public void Streaming_process() throws Exception {
        SparkConf conf = new SparkConf()
                .setAppName("SparkStreaming")
                .setMaster("local[*]");
        conf.set("spark.serializer", "org.apache.spark.serializer.KryoSerializer");
        conf.registerKryoClasses(new Class<?>[]{
                Class.forName("Streaming.Stream")
        });

        JavaSparkContext sc = new JavaSparkContext(conf);
        sc.setLogLevel("WARN");
        JavaStreamingContext ssc = new JavaStreamingContext(sc, new Duration(2000));

        Map<String, Object> kafkaParams = new HashMap<>();
        kafkaParams.put("bootstrap.servers", "xxx.xx.xx.xx:20103,xxx.xx.xx.xx:20104,xxx.xx.xx.xx:20105");
        kafkaParams.put("key.deserializer", StringDeserializer.class);
        kafkaParams.put("value.deserializer", StringDeserializer.class);
        kafkaParams.put("group.id", "use_a_separate_group_id_for_each_stream");
        kafkaParams.put("auto.offset.reset", "latest");
        kafkaParams.put("enable.auto.commit", false);

        Collection<String> topics = Arrays.asList("nc_topic_gis_test");
        JavaInputDStream<ConsumerRecord<String, String>> GPS_DStream =
                KafkaUtils.createDirectStream(
                        ssc,
                        LocationStrategies.PreferConsistent(),
                        ConsumerStrategies.<String, String>Subscribe(topics, kafkaParams)
                );

        JavaPairDStream<String, String> GPS_DStream_Pair = GPS_DStream.mapToPair(
                (PairFunction<ConsumerRecord<String, String>, String, String>) record ->
                        new Tuple2<>("GPSValue", record.value()));

        GPS_DStream_Pair.foreachRDD(PairRDD -> PairRDD.foreach(rdd -> {
            String[] fields = rdd._2.split(",");
            this.parse_rdd_record(fields);
        }));

        ssc.start();
        ssc.awaitTermination();
    }

    public static void main(String[] args) throws Exception {
        new Stream().Streaming_process();
    }
}




It gives me the following error:



Exception in thread "streaming-job-executor-0" java.lang.StackOverflowError
at java.io.Bits.putDouble(Bits.java:121)
at java.io.ObjectStreamClass$FieldReflector.getPrimFieldValues(ObjectStreamClass.java:2168)
at java.io.ObjectStreamClass.getPrimFieldValues(ObjectStreamClass.java:1389)
at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1533)
at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1509)
at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432)
at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)
at java.io.ObjectOutputStream.writeArray(ObjectOutputStream.java:1378)
at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1174)
at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1548)
at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1509)
at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432)
at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)
at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1548)
at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1509)
at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432)
at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)
at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1548)
at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1509)
at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432)
at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)
at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1548)
at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1509)
at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432)
at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)
at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:348)
at java.util.HashMap.internalWriteEntries(HashMap.java:1790)
at java.util.HashMap.writeObject(HashMap.java:1363)
at sun.reflect.GeneratedMethodAccessor37.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at java.io.ObjectStreamClass.invokeWriteObject(ObjectStreamClass.java:1140)
at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1496)
at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432)
at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)
at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1548)
at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1509)
at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432)
at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)
at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:348)
at java.util.HashMap.internalWriteEntries(HashMap.java:1790)
at java.util.HashMap.writeObject(HashMap.java:1363)
at sun.reflect.GeneratedMethodAccessor37.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)


However, if I change GM and LM to static variables, it runs fine. That is, changing lines 2 and 3 to:



private static final GridMatrix GM = GridMatrixConstructor.init_Grid_Matrix(0.001);
private static final LinkMatcher LM = new LinkMatcher();


Could someone tell me why it doesn't work with non-static variables?










Tags: java, apache-spark, static, spark-streaming






asked Mar 25 at 7:08 by YQ.Wang






















1 Answer






































The difference between the static and non-static versions is that with non-static fields Spark sends GM and LM to all of the workers as part of the Stream closure; with static fields it does not by default, unless a static field is used directly inside one of the streaming lambdas, which is not the case here.
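
To make the capture concrete, here is the foreachRDD call from the question again, with comments added (nothing new, just an annotated restatement of the original lines):

    GPS_DStream_Pair.foreachRDD(PairRDD -> PairRDD.foreach(rdd -> {
        String[] fields = rdd._2.split(",");
        // parse_rdd_record is an instance method, so the lambda closes over `this`;
        // serializing the closure therefore means serializing the whole Stream
        // object, including the GM and LM fields.
        this.parse_rdd_record(fields);
    }));

With static fields, the Stream instance no longer carries GM and LM as part of its state, so they are left out of the serialized closure and each JVM that loads the class simply initializes its own copy.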



While shipping the closure to the workers, Spark has to serialize the object, and that serialization is what fails, as the stack trace shows. The most likely reason is that those structures contain cyclic or very deeply nested references internally, so they cannot be serialized properly and Java serialization recurses until the stack overflows.
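
If the two structures really cannot be made cleanly serializable, one common workaround, sketched here only as an illustration (it is not part of the original answer, and it assumes GridMatrixConstructor can rebuild the data on whichever machine runs the task, which is trivially true in local[*] mode), is to keep them out of serialization altogether by declaring the fields transient and rebuilding them lazily:

    public class Stream implements Serializable {

        // transient: excluded from Java serialization, so shipping the closure no
        // longer tries to walk these deeply nested structures.
        private transient GridMatrix GM;
        private transient LinkMatcher LM;

        // After deserialization the transient fields are null, so they are
        // rebuilt on first use inside the task.
        private GridMatrix gm() {
            if (GM == null) {
                GM = GridMatrixConstructor.init_Grid_Matrix(0.001);
            }
            return GM;
        }

        private LinkMatcher lm() {
            if (LM == null) {
                LM = new LinkMatcher();
            }
            return LM;
        }

        public void parse_rdd_record(String[] fields) {
            System.out.println(lm().GF.toString());
            System.out.println(gm().topleft_x);
        }

        // Streaming_process() and main() stay exactly as in the question.
    }

Keeping the fields static, as the question already discovered, achieves the same effect, since static state is never part of an object's serialized form. Note that a broadcast variable would not help here, because broadcasting also requires the object to be serializable; and the lazy initialization above is not synchronized, so you may want to guard it if tasks in the same executor run concurrently.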






answered Mar 25 at 16:37 by Volodymyr Zubariev




























