+1, thanks for sharing.
Solved. I downloaded a fresh source package and tried again.
[DEBUG] (f) versionsPropertyName = maven.project.dependencies.versions
[DEBUG] -- end configuration --
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop Main ................................. SUCCESS [ 1.215 s]
[INFO] Apache Hadoop Project POM .......................... FAILURE [ 0.156 s]
[INFO] Apache Hadoop Annotations .......................... SKIPPED
[INFO] Apache Hadoop Assemblies ........................... SKIPPED
[INFO] Apache Hadoop Project Dist POM ..................... SKIPPED
[INFO] Apache Hadoop Maven Plugins ........................ SKIPPED
[INFO] Apache Hadoop MiniKDC .............................. SKIPPED
[INFO] Apache Hadoop Auth ................................. SKIPPED
[INFO] Apache Hadoop Auth Examples ........................ SKIPPED
[INFO] Apache Hadoop Common ............................... SKIPPED
[INFO] Apache Hadoop NFS .................................. SKIPPED
[INFO] Apache Hadoop KMS .................................. SKIPPED
[INFO] Apache Hadoop Common Project ....................... SKIPPED
[INFO] Apache Hadoop HDFS ................................. SKIPPED
[INFO] Apache Hadoop HttpFS ............................... SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal .............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................. SKIPPED
[INFO] Apache Hadoop HDFS Project ......................... SKIPPED
[INFO] hadoop-yarn ........................................ SKIPPED
[INFO] hadoop-yarn-api .................................... SKIPPED
[INFO] hadoop-yarn-common ................................. SKIPPED
[INFO] hadoop-yarn-server ................................. SKIPPED
[INFO] hadoop-yarn-server-common .......................... SKIPPED
[INFO] hadoop-yarn-server-nodemanager ..................... SKIPPED
[INFO] hadoop-yarn-server-web-proxy ....................... SKIPPED
[INFO] hadoop-yarn-server-applicationhistoryservice ....... SKIPPED
[INFO] hadoop-yarn-server-resourcemanager ................. SKIPPED
[INFO] hadoop-yarn-server-tests ........................... SKIPPED
[INFO] hadoop-yarn-client ................................. SKIPPED
[INFO] hadoop-yarn-applications ........................... SKIPPED
[INFO] hadoop-yarn-applications-distributedshell .......... SKIPPED
[INFO] hadoop-yarn-applications-unmanaged-am-launcher ..... SKIPPED
[INFO] hadoop-yarn-site ................................... SKIPPED
[INFO] hadoop-yarn-registry ............................... SKIPPED
[INFO] hadoop-yarn-project ................................ SKIPPED
[INFO] hadoop-mapreduce-client ............................ SKIPPED
[INFO] hadoop-mapreduce-client-core ....................... SKIPPED
[INFO] hadoop-mapreduce-client-common ..................... SKIPPED
[INFO] hadoop-mapreduce-client-shuffle .................... SKIPPED
[INFO] hadoop-mapreduce-client-app ........................ SKIPPED
[INFO] hadoop-mapreduce-client-hs ......................... SKIPPED
[INFO] hadoop-mapreduce-client-jobclient .................. SKIPPED
[INFO] hadoop-mapreduce-client-hs-plugins ................. SKIPPED
[INFO] Apache Hadoop MapReduce Examples ................... SKIPPED
[INFO] hadoop-mapreduce ................................... SKIPPED
[INFO] Apache Hadoop MapReduce Streaming .................. SKIPPED
[INFO] Apache Hadoop Distributed Copy ..................... SKIPPED
[INFO] Apache Hadoop Archives ............................. SKIPPED
[INFO] Apache Hadoop Rumen ................................ SKIPPED
[INFO] Apache Hadoop Gridmix .............................. SKIPPED
[INFO] Apache Hadoop Data Join ............................ SKIPPED
[INFO] Apache Hadoop Ant Tasks ............................ SKIPPED
[INFO] Apache Hadoop Extras ............................... SKIPPED
[INFO] Apache Hadoop Pipes ................................ SKIPPED
[INFO] Apache Hadoop OpenStack support .................... SKIPPED
[INFO] Apache Hadoop Amazon Web Services support .......... SKIPPED
[INFO] Apache Hadoop Client ............................... SKIPPED
[INFO] Apache Hadoop Mini-Cluster ......................... SKIPPED
[INFO] Apache Hadoop Scheduler Load Simulator ............. SKIPPED
[INFO] Apache Hadoop Tools Dist ........................... SKIPPED
[INFO] Apache Hadoop Tools ................................ SKIPPED
[INFO] Apache Hadoop Distribution ......................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 4.479 s
[INFO] Finished at: 2015-11-22T09:44:24+08:00
[INFO] Final Memory: 46M/362M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (create-testdirs) on project hadoop-project: Error executing ant tasks: /usr/local/hadoop-2.6.0-src/hadoop-project/target/antrun/build-main.xml (No such file or directory) -> [Help 1]
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (create-testdirs) on project hadoop-project: Error executing ant tasks: /usr/local/hadoop-2.6.0-src/hadoop-project/target/antrun/build-main.xml (No such file or directory)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:212)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:153)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:145)
    at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:116)
    at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:80)
    at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build(SingleThreadedBuilder.java:51)
    at org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:128)
    at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:307)
    at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:193)
    at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:106)
    at org.apache.maven.cli.MavenCli.execute(MavenCli.java:863)
    at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:288)
    at org.apache.maven.cli.MavenCli.main(MavenCli.java:199)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:289)
    at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:229)
    at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:415)
    at org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:356)
Caused by: org.apache.maven.plugin.MojoExecutionException: Error executing ant tasks: /usr/local/hadoop-2.6.0-src/hadoop-project/target/antrun/build-main.xml (No such file or directory)
    at org.apache.maven.plugin.antrun.AntRunMojo.execute(AntRunMojo.java:360)
    at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:134)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:207)
    ... 20 more
Caused by: java.io.FileNotFoundException: /usr/local/hadoop-2.6.0-src/hadoop-project/target/antrun/build-main.xml (No such file or directory)
    at java.io.FileOutputStream.open(Native Method)
    at java.io.FileOutputStream.<init>(FileOutputStream.java:221)
    at java.io.FileOutputStream.<init>(FileOutputStream.java:110)
    at org.codehaus.plexus.util.FileUtils.fileWrite(FileUtils.java:470)
    at org.apache.maven.plugin.antrun.AntRunMojo.writeTargetToProjectFile(AntRunMojo.java:608)
    at org.apache.maven.plugin.antrun.AntRunMojo.execute(AntRunMojo.java:263)
    ... 22 more
[ERROR]
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluen ... oExecutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-project
whx@whx-desktop:/usr/local/hadoop-2.6.0-src$ ls -l
total 108
-rw-r--r--  1 root root 12091 11月 21 21:27 BUILDING.txt
drwxr-xr-x  2 root root  4096 11月 21 21:27 dev-support
drwxr-xr-x  3 root root  4096 11月 21 21:27 hadoop-assemblies
drwxr-xr-x  2 root root  4096 11月 21 21:27 hadoop-client
drwxr-xr-x 10 root root  4096 11月 21 21:27 hadoop-common-project
drwxr-xr-x  2 root root  4096 11月 21 21:27 hadoop-dist
drwxr-xr-x  6 root root  4096 11月 21 21:27 hadoop-hdfs-project
drwxr-xr-x  9 root root  4096 11月 21 21:27 hadoop-mapreduce-project
drwxr-xr-x  3 root root  4096 11月 21 21:27 hadoop-maven-plugins
drwxr-xr-x  2 root root  4096 11月 21 21:27 hadoop-minicluster
drwxr-xr-x  3 root root  4096 11月 22 09:30 hadoop-project
drwxr-xr-x  2 root root  4096 11月 22 09:04 hadoop-project-dist
drwxr-xr-x 15 root root  4096 11月 21 21:27 hadoop-tools
drwxr-xr-x  3 root root  4096 11月 21 21:27 hadoop-yarn-project
-rw-r--r--  1 root root 15429 11月 21 21:27 LICENSE.txt
-rw-r--r--  1 root root   101 11月 21 21:27 NOTICE.txt
-rw-r--r--  1 root root 18081 11月 21 21:27 pom.xml
-rw-r--r--  1 root root  1366 11月 21 21:27 README.txt
whx@whx-desktop:/usr/local/hadoop-2.6.0-src$ cd hadoop-project
hadoop-project/       hadoop-project-dist/
whx@whx-desktop:/usr/local/hadoop-2.6.0-src$ cd hadoop-project/src/
whx@whx-desktop:/usr/local/hadoop-2.6.0-src/hadoop-project/src$ cd ..
whx@whx-desktop:/usr/local/hadoop-2.6.0-src/hadoop-project$ ls -l
total 92
-rw-r--r-- 1 root root 43367 11月 22 09:30 pom.xml
-rw-r--r-- 1 root root 43503 11月 22 09:29 pom.xml~
drwxr-xr-x 3 root root  4096 11月 21 21:27 src
whx@whx-desktop:/usr/local/hadoop-2.6.0-src/hadoop-project$

Could the OP please advise how to solve this problem? Many thanks.
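Not the OP, but one hint from the paste itself: the ls -l output shows the whole source tree owned by root while mvn is being run as the whx user, so the antrun plugin most likely cannot create hadoop-project/target/antrun/ to write build-main.xml. A minimal sketch of what I would try (the username/group and the build flags below are assumptions taken from your paste and from the usual BUILDING.txt instructions, adjust as needed):

    # give your own user ownership of the source tree instead of root
    sudo chown -R whx:whx /usr/local/hadoop-2.6.0-src
    cd /usr/local/hadoop-2.6.0-src
    # re-run the build; these are the commonly used flags from BUILDING.txt
    mvn clean package -Pdist -DskipTests -Dtar

Running the whole build with sudo would also get past the permission problem, but fixing the ownership keeps later rebuilds and IDE imports working as a normal user.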
Nice post, learned something.
Thanks, OP. After importing the source code following your steps, I get the problem below. JDK version 1.7_49, on Mac OS....
pig2 posted on 2015-5-11 23:28
First of all, thank you very much for the reply. I will follow your approach as I keep studying the source code. I would also like to ask: in your day-to-day work, which parts of the Hadoop ecosystem do you mainly use, and what kind of work do you do with them?
tang posted on 2015-5-11 19:51
Pick one concrete thread and read while you debug. For example, when you upload a file, which parts of the Hadoop source actually get called? Which code runs when you submit a MapReduce job? Don't just read the code cold; that gets tedious. A sketch of how to set this up is below.
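To make the "debug while you read" approach concrete, here is a minimal sketch (assuming a pseudo-distributed Hadoop 2.x cluster on localhost and the hadoop/hdfs scripts on the PATH; the port number and breakpoint classes are just examples): suspend the HDFS client under a remote debugger while uploading a file, then step from the shell command into the client-side write path.

    # make the client JVM wait for a debugger before running the upload
    export HADOOP_CLIENT_OPTS="-agentlib:jdwp=transport=dt_socket,server=y,suspend=y,address=8000"
    hdfs dfs -put ./test.txt /tmp/test.txt
    # in the IDE, attach a remote debugger to localhost:8000 with breakpoints in, e.g.,
    # org.apache.hadoop.fs.FsShell, org.apache.hadoop.hdfs.DistributedFileSystem#create
    # and org.apache.hadoop.hdfs.DFSOutputStream, then step through to see what an upload touches
    unset HADOOP_CLIENT_OPTS   # drop the debug flag once you are done

Doing the same thing around a small MapReduce job, with breakpoints in org.apache.hadoop.mapreduce.Job#submit and JobSubmitter, shows the job submission path in the same way.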
OP, how should one go about reading the Hadoop 2.x source code? There is so much of it. Concretely, how should I approach it?