Spark Connector / SPARK-127

Fix Java bean schema inference in Scala 2.10 for Spark 2.2.0

    • Type: Task
    • Resolution: Fixed
    • Priority: Major - P3
    • Fix Version/s: 2.2.0
    • Affects Version/s: None
    • Component/s: None
    • Labels: None

      Currently, Scala 2.10 throws the following error when compiling the wrapped inferSchema method:

      [error] == Enclosing template or block ==
      [error] 
      [error] DefDef( // def reflectSchema[T](beanClass: Class[T]): org.apache.spark.sql.types.StructType in object MongoInferSchema
      [error]   <method> <triedcooking>
      [error]   "reflectSchema"
      [error]   []
      [error]   // 1 parameter list
      [error]   ValDef( // beanClass: Class[T]
      [error]     <param> <triedcooking>
      [error]     "beanClass"
      [error]     <tpt> // tree.tpe=Class
      [error]     <empty>
      [error]   )
      [error]   <tpt> // tree.tpe=org.apache.spark.sql.types.StructType
      [error]   Block( // tree.tpe=org.apache.spark.sql.types.StructType
      [error]     Apply( // def invoked(id: Int,dataDir: String): Unit in object Invoker
      [error]       "scoverage"."Invoker"."invoked" // def invoked(id: Int,dataDir: String): Unit in object Invoker
      [error]       // 2 arguments
      [error]       2522
      [error]       "/home/travis/build/rozza/mongo-spark/target/scala-2.10/scoverage-data"
      [error]     )
      [error]     Apply( // final def asInstanceOf[T0](): T0 in class Any, tree.tpe=org.apache.spark.sql.types.StructType
      [error]       TypeApply( // final def asInstanceOf[T0](): T0 in class Any, tree.tpe=()org.apache.spark.sql.types.StructType
      [error]         org.apache.spark.sql.catalyst.JavaTypeInference.inferDataType(beanClass)._1()."asInstanceOf" // final def asInstanceOf[T0](): T0 in class Any, tree.tpe=[T0]()T0
      [error]         <tpt> // tree.tpe=org.apache.spark.sql.types.StructType
      [error]       )
      [error]       Nil
      [error]     )
      [error]   )
      [error] )
      [error] 
      [error] == Expanded type of tree ==
      [error] 
      [error] TypeRef(
      [error]   TypeSymbol(final class DataFrameWriter[T] extends Object)
      [error]   normalize = PolyType(
      [error]     typeParams = List(TypeParam(T))
      [error]     resultType = TypeRef(
      [error]       TypeSymbol(final class DataFrameWriter[T] extends Object)
      [error]       args = List(TypeParamTypeRef(TypeParam(T)))
      [error]     )
      [error]   )
      [error] )
      [error] 
      [error] uncaught exception during compilation: scala.reflect.internal.Types$TypeError
      scala.reflect.internal.Types$TypeError: bad symbolic reference. A signature in JavaTypeInference.class refers to term reflect
      in package com.google.common which is not available.
      It may be completely missing from the current classpath, or the version on
      the classpath might be incompatible with the version used when compiling JavaTypeInference.class
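
      The method that triggers this can be read off the AST dump above. A minimal sketch of what scalac 2.10 is being asked to compile (the exact connector source may differ):

          import org.apache.spark.sql.catalyst.JavaTypeInference
          import org.apache.spark.sql.types.StructType

          object MongoInferSchema {
            // Wraps Spark's Java bean type inference; inferDataType returns a
            // (DataType, nullable) pair, and for a bean class the DataType
            // is a StructType.
            def reflectSchema[T](beanClass: Class[T]): StructType =
              JavaTypeInference.inferDataType(beanClass)._1.asInstanceOf[StructType]
          }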
      

      This appears at compile time: Scala 2.10 is not happy with how Spark shades the Guava (com.google.common) library, so resolving JavaTypeInference's signatures fails because the referenced com.google.common.reflect types are not on the classpath.
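
      One possible workaround (an untested sketch, not necessarily the fix that shipped) is to call inferDataType through Java runtime reflection, so scalac never has to resolve the Guava-referencing signatures in JavaTypeInference.class:

          import org.apache.spark.sql.types.StructType

          def reflectSchema[T](beanClass: Class[T]): StructType = {
            // Look up the JavaTypeInference singleton at runtime; the compiler
            // then never loads the .class signatures that reference
            // com.google.common.
            val moduleClass = Class.forName("org.apache.spark.sql.catalyst.JavaTypeInference$")
            val module = moduleClass.getField("MODULE$").get(null)
            val inferDataType = moduleClass.getMethod("inferDataType", classOf[Class[_]])
            // inferDataType returns a (DataType, nullable: Boolean) tuple;
            // for a bean class the DataType is a StructType.
            inferDataType.invoke(module, beanClass)
              .asInstanceOf[(AnyRef, Any)]._1.asInstanceOf[StructType]
          }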

            Assignee: Unassigned
            Reporter: Ross Lawley (ross@mongodb.com)
            Votes: 0
            Watchers: 1

              Created:
              Updated:
              Resolved: