When executing a traversal of an HBase-backed graph using SparkGraphComputer, I get the following error. This is based on titan-1.0.0 with hadoop-2 and hbase-1.0.
```
06:47:57.278 [Executor task launch worker-0] ERROR org.apache.spark.executor.Executor - Exception in task 0.0 in stage 0.0 (TID 0)
java.lang.ClassCastException: org.apache.hadoop.hbase.mapreduce.TableInputFormatBase$1 cannot be cast to org.apache.hadoop.hbase.mapreduce.TableRecordReader
	at com.thinkaurelius.titan.hadoop.formats.hbase.HBaseBinaryInputFormat.createRecordReader(HBaseBinaryInputFormat.java:47) ~[titan-hadoop-core-1.1.0-SNAPSHOT.jar:na]
	at com.thinkaurelius.titan.hadoop.formats.util.GiraphInputFormat.createRecordReader(GiraphInputFormat.java:53) ~[titan-hadoop-core-1.1.0-SNAPSHOT.jar:na]
	at org.apache.spark.rdd.NewHadoopRDD$$anon$1.<init>(NewHadoopRDD.scala:151) ~[spark-core_2.10-1.5.2.jar:1.5.2]
	at org.apache.spark.rdd.NewHadoopRDD.compute(NewHadoopRDD.scala:124) ~[spark-core_2.10-1.5.2.jar:1.5.2]
	at org.apache.spark.rdd.NewHadoopRDD.compute(NewHadoopRDD.scala:65) ~[spark-core_2.10-1.5.2.jar:1.5.2]
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:300) ~[spark-core_2.10-1.5.2.jar:1.5.2]
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:264) ~[spark-core_2.10-1.5.2.jar:1.5.2]
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38) ~[spark-core_2.10-1.5.2.jar:1.5.2]
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:300) ~[spark-core_2.10-1.5.2.jar:1.5.2]
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:264) ~[spark-core_2.10-1.5.2.jar:1.5.2]
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:73) ~[spark-core_2.10-1.5.2.jar:1.5.2]
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:41) ~[spark-core_2.10-1.5.2.jar:1.5.2]
	at org.apache.spark.scheduler.Task.run(Task.scala:88) ~[spark-core_2.10-1.5.2.jar:1.5.2]
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214) ~[spark-core_2.10-1.5.2.jar:1.5.2]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [na:1.8.0_60]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [na:1.8.0_60]
	at java.lang.Thread.run(Thread.java:745) [na:1.8.0_60]
```
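The `TableInputFormatBase$1` in the exception message is the JVM's name for an anonymous inner class: the HBase `createRecordReader` factory returns an anonymous `RecordReader` wrapper rather than a concrete `TableRecordReader`, so the downcast at `HBaseBinaryInputFormat.java:47` fails at runtime. The sketch below reproduces that mechanism with simplified, hypothetical stand-in classes (the real HBase signatures are more involved); it is an illustration of the failure mode, not Titan's actual code.

```java
// Simplified stand-ins for the HBase types involved (names reused for
// illustration only; these are NOT the real org.apache.hadoop.hbase classes).
abstract class RecordReader {
    abstract Object next();
}

class TableRecordReader extends RecordReader {
    Object next() { return "row"; }
}

class TableInputFormatBase {
    // Like HBase 1.0's TableInputFormatBase, this returns an anonymous
    // subclass of RecordReader -- compiled as TableInputFormatBase$1 --
    // rather than a TableRecordReader instance.
    RecordReader createRecordReader() {
        return new RecordReader() {
            Object next() { return "row"; }
        };
    }
}

public class CastDemo {
    public static void main(String[] args) {
        RecordReader rr = new TableInputFormatBase().createRecordReader();

        // The anonymous wrapper is not a TableRecordReader, so an
        // unconditional downcast (what HBaseBinaryInputFormat does)
        // throws ClassCastException at runtime.
        System.out.println(rr instanceof TableRecordReader); // false

        try {
            TableRecordReader trr = (TableRecordReader) rr;
        } catch (ClassCastException e) {
            System.out.println("ClassCastException, as in the stack trace");
        }
    }
}
```

Because the static type of `rr` is the common supertype `RecordReader`, the compiler accepts the cast and the mismatch only surfaces when the Spark task first constructs the reader, which is why the job starts and then fails inside the executor.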