Last edited by ltne on 2018-8-23 11:26
I'm submitting a Python program with spark-submit that imports the pywt module. pywt is installed in the Python environment on every cluster node: running python on each node and doing `import pywt` works fine, and importing it from the pyspark shell also works, but when the job runs through spark-submit it reports that the module cannot be found.
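For what it's worth, a quick way to confirm which interpreter actually sees pywt is to run a small check script with each candidate binary (the system python vs. the Anaconda one the executors use). A minimal sketch, Python 3 syntax shown (on Python 2 you'd use `imp.find_module` instead of `importlib.util.find_spec`); the script name `check.py` is just an example:

```python
import importlib.util
import sys

def interpreter_has(module_name):
    """True if the interpreter running this script can import module_name."""
    return importlib.util.find_spec(module_name) is not None

# Invoke with each candidate interpreter, e.g.:
#   /usr/bin/python check.py   vs.   ./anaconda2/anaconda2/bin/python2 check.py
print(sys.executable)                              # which binary is running
print("pywt available:", interpreter_has("pywt"))  # does it see the module?
```

If the interpreter spark-submit ends up launching prints False here, the "module not found" error is about that interpreter's site-packages, not about the cluster nodes.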
I then tried packaging the whole Python environment (Anaconda2) with `zip -r anaconda2.zip anaconda2/`, uploaded the archive to HDFS, and submitted with:
./spark-submit --master yarn --deploy-mode cluster --archives 'hdfs://master:9000/user/hadoop/anaconda2.zip#anaconda2' --conf 'spark.yarn.appMasterEnv.PYSPARK_PYTHON=./anaconda2/anaconda2/bin/python2' /home/hadoop/lt/platform/feature.py
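One thing worth checking is the entry paths inside the archive: YARN extracts the zip under the `#anaconda2` alias, so `./anaconda2/` plus an archive entry must equal the `PYSPARK_PYTHON` path. When `zip -r anaconda2.zip anaconda2/` is run from the parent directory, entries start with `anaconda2/`, giving `./anaconda2/anaconda2/bin/python2`; zipping from inside the folder would drop that extra level. A toy sketch using Python's `zipfile` in place of `zip`/`unzip -l` (the `demo/` paths are hypothetical):

```python
import os
import zipfile

# Build a toy archive the same way `zip -r anaconda2.zip anaconda2/` does
# when run from the parent directory: entry names keep the top-level folder.
os.makedirs("demo/anaconda2/bin", exist_ok=True)
open("demo/anaconda2/bin/python2", "w").close()

with zipfile.ZipFile("demo/anaconda2.zip", "w") as zf:
    for root, _dirs, files in os.walk("demo/anaconda2"):
        for name in files:
            path = os.path.join(root, name)
            # arcname is relative to "demo", so it starts with "anaconda2/"
            zf.write(path, arcname=os.path.relpath(path, "demo"))

# List the entries -- YARN unpacks these under the '#anaconda2' alias,
# so './anaconda2/' + entry is what PYSPARK_PYTHON must point at.
with zipfile.ZipFile("demo/anaconda2.zip") as zf:
    print(zf.namelist())  # -> ['anaconda2/bin/python2']
```

Running `unzip -l anaconda2.zip` on the real archive shows the same entry list and is the fastest way to verify the layout matches the configured interpreter path.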
At runtime it fails with:
18/08/23 11:15:49 INFO ApplicationMaster: Final app status: FAILED, exitCode: 13, (reason: User class threw exception: java.io.IOException: Cannot run program "./anaconda2/anaconda2/bin/python2": error=2, No such file or directory
at java.lang.ProcessBuilder.start(ProcessBuilder.java:1048)
at org.apache.spark.deploy.PythonRunner$.main(PythonRunner.scala:91)
at org.apache.spark.deploy.PythonRunner.main(PythonRunner.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$4.run(ApplicationMaster.scala:706)
Caused by: java.io.IOException: error=2, No such file or directory
at java.lang.UNIXProcess.forkAndExec(Native Method)
at java.lang.UNIXProcess.<init>(UNIXProcess.java:247)
at java.lang.ProcessImpl.start(ProcessImpl.java:134)
at java.lang.ProcessBuilder.start(ProcessBuilder.java:1029)
I'm not sure whether the problem is how the zip was created or something else.