net.jpounz.lz4 exception when reading from kafka with spark streaming


I am using Spark 2.4.0 with Python, reading data from Kafka (kafka_2.11-2.0.0, the binary release, not source). I submit the job with spark-submit --jars spark-streaming-kafka-0-8-assembly_2.11-2.4.0.jar script.py, and the error below appears. If anyone can help, thanks :)



19/03/25 13:48:53 ERROR Utils: Uncaught exception in thread stdout writer for python
java.lang.NoSuchMethodError: net.jpountz.lz4.LZ4BlockInputStream.<init>(Ljava/io/InputStream;Z)V
at org.apache.spark.io.LZ4CompressionCodec.compressedInputStream(CompressionCodec.scala:122)
at org.apache.spark.serializer.SerializerManager.wrapForCompression(SerializerManager.scala:163)
at org.apache.spark.serializer.SerializerManager.wrapStream(SerializerManager.scala:124)
at org.apache.spark.shuffle.BlockStoreShuffleReader$$anonfun$3.apply(BlockStoreShuffleReader.scala:50)
at org.apache.spark.shuffle.BlockStoreShuffleReader$$anonfun$3.apply(BlockStoreShuffleReader.scala:50)
at org.apache.spark.storage.ShuffleBlockFetcherIterator.next(ShuffleBlockFetcherIterator.scala:453)
at org.apache.spark.storage.ShuffleBlockFetcherIterator.next(ShuffleBlockFetcherIterator.scala:64)
at scala.collection.Iterator$$anon$12.nextCur(Iterator.scala:435)
at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:441)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:409)
at org.apache.spark.util.CompletionIterator.hasNext(CompletionIterator.scala:30)
at org.apache.spark.InterruptibleIterator.hasNext(InterruptibleIterator.scala:37)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:409)
at scala.collection.Iterator$class.foreach(Iterator.scala:891)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1334)
at org.apache.spark.api.python.PythonRDD$.writeIteratorToStream(PythonRDD.scala:224)
at org.apache.spark.api.python.PythonRunner$$anon$2.writeIteratorToStream(PythonRunner.scala:557)
at org.apache.spark.api.python.BasePythonRunner$WriterThread$$anonfun$run$1.apply(PythonRunner.scala:345)
at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1945)
at org.apache.spark.api.python.BasePythonRunner$WriterThread.run(PythonRunner.scala:194)
Exception in thread "stdout writer for python" java.lang.NoSuchMethodError: net.jpountz.lz4.LZ4BlockInputStream.<init>(Ljava/io/InputStream;Z)V
at org.apache.spark.io.LZ4CompressionCodec.compressedInputStream(CompressionCodec.scala:122)
at org.apache.spark.serializer.SerializerManager.wrapForCompression(SerializerManager.scala:163)
at org.apache.spark.serializer.SerializerManager.wrapStream(SerializerManager.scala:124)
at org.apache.spark.shuffle.BlockStoreShuffleReader$$anonfun$3.apply(BlockStoreShuffleReader.scala:50)
at org.apache.spark.shuffle.BlockStoreShuffleReader$$anonfun$3.apply(BlockStoreShuffleReader.scala:50)
at org.apache.spark.storage.ShuffleBlockFetcherIterator.next(ShuffleBlockFetcherIterator.scala:453)
at org.apache.spark.storage.ShuffleBlockFetcherIterator.next(ShuffleBlockFetcherIterator.scala:64)
at scala.collection.Iterator$$anon$12.nextCur(Iterator.scala:435)
at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:441)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:409)
at org.apache.spark.util.CompletionIterator.hasNext(CompletionIterator.scala:30)
at org.apache.spark.InterruptibleIterator.hasNext(InterruptibleIterator.scala:37)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:409)
at scala.collection.Iterator$class.foreach(Iterator.scala:891)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1334)
at org.apache.spark.api.python.PythonRDD$.writeIteratorToStream(PythonRDD.scala:224)
at org.apache.spark.api.python.PythonRunner$$anon$2.writeIteratorToStream(PythonRunner.scala:557)
at org.apache.spark.api.python.BasePythonRunner$WriterThread$$anonfun$run$1.apply(PythonRunner.scala:345)
at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1945)
at org.apache.spark.api.python.BasePythonRunner$WriterThread.run(PythonRunner.scala:194)
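
For context, script.py is not included in the post. A minimal sketch of the kind of Kafka 0.8-style consumer being submitted might look like the following (the topic name, consumer group, and ZooKeeper address are assumptions, not from the original post):

from pyspark import SparkContext
from pyspark.streaming import StreamingContext
from pyspark.streaming.kafka import KafkaUtils

sc = SparkContext(appName="kafka-test")
ssc = StreamingContext(sc, 5)  # 5-second micro-batches

# createStream is the Kafka 0.8 receiver API, matching the 0-8 assembly jar
stream = KafkaUtils.createStream(ssc, "localhost:2181", "test-group", {"test": 1})
stream.map(lambda kv: kv[1]).pprint()  # messages arrive as (key, value) pairs

ssc.start()
ssc.awaitTermination()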


The pom.xml bundled inside the jar file spark-streaming-kafka-0-8-assembly_2.11-2.4.0.jar:






<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <parent>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-parent_2.11</artifactId>
    <version>2.4.0</version>
    <relativePath>../../pom.xml</relativePath>
  </parent>

  <artifactId>spark-streaming-kafka-0-8-assembly_2.11</artifactId>
  <packaging>jar</packaging>
  <name>Spark Project External Kafka Assembly</name>
  <url>http://spark.apache.org/</url>

  <properties>
    <sbt.project.name>streaming-kafka-0-8-assembly</sbt.project.name>
  </properties>

  <dependencies>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-streaming-kafka-0-8_${scala.binary.version}</artifactId>
      <version>${project.version}</version>
    </dependency>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-streaming_${scala.binary.version}</artifactId>
      <version>${project.version}</version>
      <scope>provided</scope>
    </dependency>
    <!--
      Demote already included in the Spark assembly.
    -->
    <dependency>
      <groupId>commons-codec</groupId>
      <artifactId>commons-codec</artifactId>
      <scope>provided</scope>
    </dependency>
    <dependency>
      <groupId>commons-lang</groupId>
      <artifactId>commons-lang</artifactId>
      <scope>provided</scope>
    </dependency>
    <dependency>
      <groupId>com.google.protobuf</groupId>
      <artifactId>protobuf-java</artifactId>
      <scope>provided</scope>
    </dependency>
    <dependency>
      <groupId>org.lz4</groupId>
      <artifactId>lz4-java</artifactId>
      <scope>provided</scope>
    </dependency>
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-client</artifactId>
      <scope>provided</scope>
    </dependency>
    <dependency>
      <groupId>org.apache.avro</groupId>
      <artifactId>avro-mapred</artifactId>
      <classifier>${avro.mapred.classifier}</classifier>
      <scope>provided</scope>
    </dependency>
    <dependency>
      <groupId>org.apache.curator</groupId>
      <artifactId>curator-recipes</artifactId>
      <scope>provided</scope>
    </dependency>
    <dependency>
      <groupId>org.apache.zookeeper</groupId>
      <artifactId>zookeeper</artifactId>
      <scope>provided</scope>
    </dependency>
    <dependency>
      <groupId>log4j</groupId>
      <artifactId>log4j</artifactId>
      <scope>provided</scope>
    </dependency>
    <dependency>
      <groupId>org.scala-lang</groupId>
      <artifactId>scala-library</artifactId>
      <scope>provided</scope>
    </dependency>
    <dependency>
      <groupId>org.slf4j</groupId>
      <artifactId>slf4j-api</artifactId>
      <scope>provided</scope>
    </dependency>
    <dependency>
      <groupId>org.slf4j</groupId>
      <artifactId>slf4j-log4j12</artifactId>
      <scope>provided</scope>
    </dependency>
    <dependency>
      <groupId>org.xerial.snappy</groupId>
      <artifactId>snappy-java</artifactId>
      <scope>provided</scope>
    </dependency>
  </dependencies>

  <build>
    <outputDirectory>target/scala-${scala.binary.version}/classes</outputDirectory>
    <testOutputDirectory>target/scala-${scala.binary.version}/test-classes</testOutputDirectory>
    <plugins>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-shade-plugin</artifactId>
        <configuration>
          <shadedArtifactAttached>false</shadedArtifactAttached>
          <artifactSet>
            <includes>
              <include>*:*</include>
            </includes>
          </artifactSet>
          <filters>
            <filter>
              <artifact>*:*</artifact>
              <excludes>
                <exclude>META-INF/*.SF</exclude>
                <exclude>META-INF/*.DSA</exclude>
                <exclude>META-INF/*.RSA</exclude>
              </excludes>
            </filter>
          </filters>
        </configuration>
        <executions>
          <execution>
            <phase>package</phase>
            <goals>
              <goal>shade</goal>
            </goals>
            <configuration>
              <transformers>
                <transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
                <transformer implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
                  <resource>reference.conf</resource>
                </transformer>
                <transformer implementation="org.apache.maven.plugins.shade.resource.DontIncludeResourceTransformer">
                  <resource>log4j.properties</resource>
                </transformer>
                <transformer implementation="org.apache.maven.plugins.shade.resource.ApacheLicenseResourceTransformer"/>
                <transformer implementation="org.apache.maven.plugins.shade.resource.ApacheNoticeResourceTransformer"/>
              </transformers>
            </configuration>
          </execution>
        </executions>
      </plugin>
    </plugins>
  </build>
</project>
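
Note that lz4-java is scoped as provided here, so at runtime the LZ4 classes come from whatever else is on the classpath. The stack trace above shows Spark 2.4 calling the LZ4BlockInputStream(InputStream, boolean) constructor, which exists in the newer org.lz4:lz4-java artifact but not in the older net.jpountz.lz4:lz4 one; an assembly jar passed via --jars that bundles the old classes will shadow the new ones and produce exactly this NoSuchMethodError. A quick way to check what a given jar actually bundles (a sketch; the jar path is an assumption):

import zipfile

# List the LZ4-related classes packed into the assembly jar passed to --jars.
# If net/jpountz/lz4/*.class entries from an old lz4 release appear here, they
# can shadow the lz4-java version that Spark 2.4 itself ships with.
with zipfile.ZipFile("spark-streaming-kafka-0-8-assembly_2.11-2.4.0.jar") as jar:
    for name in jar.namelist():
        if "lz4" in name.lower() and name.endswith(".class"):
            print(name)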












Tags: pyspark, apache-kafka, spark-streaming, lz4, spark-streaming-kafka






asked Mar 25 at 14:13, edited Mar 25 at 16:56 by Soufiane Benkhaldoun








1 Answer

I have solved this. I was originally executing my code with the following command:






          bin/spark-submit --jars external/kafka-assembly/target/scala-*/spark-streaming-kafka-assembly-*.jar Path/kafka_test.py localhost:2181 test





When I ran it with this command instead, it worked:






          spark-submit --packages org.apache.spark:spark-streaming-kafka-0-8_2.11:2.4.0 Path/kafka_test.py localhost:2181 test
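
With --packages, spark-submit resolves org.apache.spark:spark-streaming-kafka-0-8_2.11:2.4.0 and its transitive dependencies from Maven, so the driver and executors get a consistent set of jars instead of a hand-picked assembly that may carry an older LZ4. If in doubt, one can ask the JVM which jar a class was actually loaded from; a hedged sketch using PySpark's py4j gateway (sc._jvm is an internal handle, and this checks the driver side only, not the executors):

from pyspark import SparkContext

sc = SparkContext(appName="lz4-check")

# Ask the driver JVM where net.jpountz.lz4.LZ4BlockInputStream was loaded from.
cls = sc._jvm.java.lang.Class.forName("net.jpountz.lz4.LZ4BlockInputStream")
print(cls.getProtectionDomain().getCodeSource().getLocation().toString())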








answered Apr 5 at 9:39 by Soufiane Benkhaldoun








