NoSuchElementException while using toDF from Spark / Livy


I am attempting to produce a Spark DataFrame from within Spark, which has been initialised using Apache Livy.



I first noticed this issue on this more complicated HBase call:



import spark.implicits._

...

spark.sparkContext
  .newAPIHadoopRDD(
    conf,
    classOf[TableInputFormat],
    classOf[ImmutableBytesWritable],
    classOf[Result]
  )
  .toDF()


But I found I could get the same thing to occur on a simple:



 import spark.implicits._

...

val filtersDf = filters.toDF()


Here, filters is just a sequence of case classes.



The common factor is *.toDF(); however, the error also occurs with *.toDS(), which makes me think that the implicit resolution from import spark.implicits._ is not working. The underlying objects to be converted to DataFrames do have data.
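If implicit resolution itself is the suspect, one way to take import spark.implicits._ out of the picture is to pass an Encoder explicitly. The sketch below is illustrative, not the question's actual code: the Filter case class, the object name, and the local-mode session are all assumptions made for the example.

```scala
import org.apache.spark.sql.{Encoders, SparkSession}

// Hypothetical stand-in for the question's case classes;
// it must be top-level for Encoders.product to derive an encoder.
case class Filter(field: String, value: String)

object ExplicitEncoderExample {
  def run(): Seq[Filter] = {
    val spark = SparkSession.builder()
      .master("local[1]")
      .appName("explicit-encoder")
      .getOrCreate()
    try {
      // Pass the encoder explicitly instead of relying on
      // the implicits brought in by `import spark.implicits._`.
      val ds = spark.createDataset(Seq(Filter("colour", "red")))(Encoders.product[Filter])
      ds.collect().toSeq
    } finally spark.stop()
  }
}
```

If this form fails with the same reflection error while the implicit form also fails, the problem is in encoder derivation itself rather than in implicit lookup.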



The error stack looks like it relates to runtime implicit resolution using Scala runtime reflection.



Note that I have checked, and both Spark and the compiled code use the same version of Scala (2.11).
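A quick way to confirm which Scala library the driver actually loaded at runtime (as opposed to what the build declares) is to ask the standard library itself; this check needs nothing beyond plain Scala:

```scala
object ScalaVersionCheck {
  // Reports the version of the scala-library jar found on the classpath at runtime.
  def runtimeScalaVersion: String = scala.util.Properties.versionNumberString

  def main(args: Array[String]): Unit =
    println(s"Runtime Scala library: $runtimeScalaVersion")
}
```

Running this inside the Livy session and comparing it to the build's declared Scala version rules out a runtime/compile-time mismatch.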



The exception I get is:



java.lang.RuntimeException: java.util.NoSuchElementException: head of empty list
scala.collection.immutable.Nil$.head(List.scala:420)
scala.collection.immutable.Nil$.head(List.scala:417)
scala.collection.immutable.List.map(List.scala:277)
scala.reflect.internal.Symbols$Symbol.parentSymbols(Symbols.scala:2117)
scala.reflect.internal.SymbolTable.openPackageModule(SymbolTable.scala:301)
scala.reflect.internal.SymbolTable.openPackageModule(SymbolTable.scala:341)
scala.reflect.runtime.SymbolLoaders$LazyPackageType$$anonfun$complete$2.apply$mcV$sp(SymbolLoaders.scala:74)
scala.reflect.runtime.SymbolLoaders$LazyPackageType$$anonfun$complete$2.apply(SymbolLoaders.scala:71)
scala.reflect.runtime.SymbolLoaders$LazyPackageType$$anonfun$complete$2.apply(SymbolLoaders.scala:71)
scala.reflect.internal.SymbolTable.slowButSafeEnteringPhaseNotLaterThan(SymbolTable.scala:263)
scala.reflect.runtime.SymbolLoaders$LazyPackageType.complete(SymbolLoaders.scala:71)
scala.reflect.internal.Symbols$Symbol.info(Symbols.scala:1514)
scala.reflect.runtime.SynchronizedSymbols$SynchronizedSymbol$$anon$1.scala$reflect$runtime$SynchronizedSymbols$SynchronizedSymbol$$super$info(SynchronizedSymbols.scala:174)
scala.reflect.runtime.SynchronizedSymbols$SynchronizedSymbol$$anonfun$info$1.apply(SynchronizedSymbols.scala:127)
scala.reflect.runtime.SynchronizedSymbols$SynchronizedSymbol$$anonfun$info$1.apply(SynchronizedSymbols.scala:127)
scala.reflect.runtime.Gil$class.gilSynchronized(Gil.scala:19)
scala.reflect.runtime.JavaUniverse.gilSynchronized(JavaUniverse.scala:16)
scala.reflect.runtime.SynchronizedSymbols$SynchronizedSymbol$class.gilSynchronizedIfNotThreadsafe(SynchronizedSymbols.scala:123)
scala.reflect.runtime.SynchronizedSymbols$SynchronizedSymbol$$anon$1.gilSynchronizedIfNotThreadsafe(SynchronizedSymbols.scala:174)
scala.reflect.runtime.SynchronizedSymbols$SynchronizedSymbol$class.info(SynchronizedSymbols.scala:127)
scala.reflect.runtime.SynchronizedSymbols$SynchronizedSymbol$$anon$1.info(SynchronizedSymbols.scala:174)
scala.reflect.internal.Types$TypeRef.thisInfo(Types.scala:2194)
scala.reflect.internal.Types$TypeRef.baseClasses(Types.scala:2199)
scala.reflect.internal.tpe.FindMembers$FindMemberBase.<init>(FindMembers.scala:17)
scala.reflect.internal.tpe.FindMembers$FindMember.<init>(FindMembers.scala:219)
scala.reflect.internal.Types$Type.scala$reflect$internal$Types$Type$$findMemberInternal$1(Types.scala:1014)
scala.reflect.internal.Types$Type.findMember(Types.scala:1016)
scala.reflect.internal.Types$Type.memberBasedOnName(Types.scala:631)
scala.reflect.internal.Types$Type.member(Types.scala:600)
scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:48)
scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:66)
scala.reflect.internal.Mirrors$RootsBase.staticPackage(Mirrors.scala:204)
scala.reflect.runtime.JavaMirrors$JavaMirror.staticPackage(JavaMirrors.scala:82)
scala.reflect.internal.Mirrors$RootsBase.init(Mirrors.scala:263)
scala.reflect.runtime.JavaMirrors$class.scala$reflect$runtime$JavaMirrors$$createMirror(JavaMirrors.scala:32)
scala.reflect.runtime.JavaMirrors$$anonfun$runtimeMirror$1.apply(JavaMirrors.scala:49)
scala.reflect.runtime.JavaMirrors$$anonfun$runtimeMirror$1.apply(JavaMirrors.scala:47)
scala.reflect.runtime.Gil$class.gilSynchronized(Gil.scala:19)
scala.reflect.runtime.JavaUniverse.gilSynchronized(JavaUniverse.scala:16)
scala.reflect.runtime.JavaMirrors$class.runtimeMirror(JavaMirrors.scala:46)
scala.reflect.runtime.JavaUniverse.runtimeMirror(JavaUniverse.scala:16)
scala.reflect.runtime.JavaUniverse.runtimeMirror(JavaUniverse.scala:16)


My working assumption is that I am missing a dependency or import, and that this is some kind of Scala-ism.



I have yet to find any other references to this issue. Ultimately I think it is probably down to imports/dependencies, but so far I can't quite see what it is. Any help greatly appreciated. I'm keen to know ways to fix the issue, or alternatively to create DataFrames via less magical approaches than toDF().
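For completeness, one "less magical" route is to build the DataFrame from an RDD[Row] plus a hand-written StructType, which avoids the encoder/reflection machinery entirely. A minimal sketch, assuming illustrative column names and a local-mode session (neither is from the question):

```scala
import org.apache.spark.sql.{Row, SparkSession}
import org.apache.spark.sql.types.{IntegerType, StringType, StructField, StructType}

object ExplicitSchemaExample {
  def run(): Long = {
    val spark = SparkSession.builder()
      .master("local[1]")
      .appName("explicit-schema")
      .getOrCreate()
    try {
      // Declare the schema by hand instead of deriving it via runtime reflection.
      val schema = StructType(Seq(
        StructField("field", StringType, nullable = false),
        StructField("value", IntegerType, nullable = false)))
      val rows = spark.sparkContext.parallelize(Seq(Row("colour", 1), Row("size", 2)))
      val df = spark.createDataFrame(rows, schema)
      df.count()
    } finally spark.stop()
  }
}
```

Because createDataFrame(rdd, schema) takes the schema as data rather than deriving it, it should sidestep the scala.reflect call path shown in the stack trace.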



Spark info:



Spark version 2.3.2.0-mapr-1901

Using Scala version 2.11.8 (OpenJDK 64-Bit Server VM, Java 1.8.0_191)









  • Can you check the count of the rdd before toDF() and check the results? Like, spark.sparkContext.newAPIHadoopRDD(conf,classOf[TableInputFormat],classOf[ImmutableBytesWritable],classOf[Result]).count()

    – Apurba Pandey
    Mar 25 at 5:46











  • I have checked and the RDD has data. But I don't think it is the RDD itself; it's more related to the toDF call, which comes via an implicit. If you look closely at the error, it is a Scala reflection error.

    – ZenMasterZed
    Mar 25 at 7:12

















Tags: scala apache-spark reflection hbase livy






edited Mar 25 at 8:58







ZenMasterZed

















asked Mar 24 at 23:33









ZenMasterZed
