OP, a quick question: how do you spark-submit your program? I also wrote a Kafka streaming program. It runs fine in standalone local mode, but in standalone cluster mode it fails with: Caused by: java.lang.NoClassDefFoundError: org/apache/spark/streaming/kafka/KafkaUtils$. [mw_shl_code=shell,true]spark-submit --master spark://h005:6066 \
--class FromKafkaToPhoenix \
--deploy-mode cluster \
--supervise \
--executor-memory 2G \
--total-executor-cores 10 \
--conf "spark.driver.extraClassPath=hbase-protocol-0.98.12.1-hadoop2.jar" \
--conf "spark.executor.extraClassPath=hbase-protocol-0.98.12.1-hadoop2.jar" \
--jars spark-streaming-kafka-assembly_2.10-1.4.1.jar,guava-12.0.1.jar,hbase-client-0.98.12.1-hadoop2.jar,hbase-common-0.98.12.1-hadoop2.jar,hbase-protocol-0.98.12.1-hadoop2.jar,hbase-server-0.98.12.1-hadoop2.jar,htrace-core-2.04.jar,jdom-2.0.5.jar,jruby-complete-1.6.8.jar,mysql-connector-java-5.0.8.jar,phoenix-core-4.4.0-HBase-0.98.jar,phoenix-spark-4.4.0-HBase-0.98.jar,spark-assembly-1.4.1-hadoop2.2.0.jar \
testkafka.jar h001:9092 testkafka[/mw_shl_code]
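For what it's worth, a common cause of this exact error: with --deploy-mode cluster the driver process is launched on one of the worker nodes, so every jar passed by relative path in --jars (and the application jar itself) must exist at that same local path on every worker, not just on the machine running spark-submit. A hedged sketch of two usual workarounds, assuming the jars are staged somewhere cluster-visible (the hdfs://h001:8020/libs/ path below is a hypothetical location, not from the original post):[mw_shl_code=shell,true]# Option 1: reference jars via a path visible to all nodes, e.g. HDFS
# (hdfs://h001:8020/libs/ is a placeholder - adjust to your cluster layout)
spark-submit --master spark://h005:6066 \
  --class FromKafkaToPhoenix \
  --deploy-mode cluster \
  --jars hdfs://h001:8020/libs/spark-streaming-kafka-assembly_2.10-1.4.1.jar \
  hdfs://h001:8020/libs/testkafka.jar h001:9092 testkafka

# Option 2: build an uber/assembly jar (e.g. with sbt-assembly or
# maven-shade-plugin) that bundles spark-streaming-kafka and the other
# dependencies into testkafka.jar, so KafkaUtils is always on the
# driver's classpath regardless of where the driver starts.[/mw_shl_code]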