
Exception when writing logs to HDFS with stock Apache Flume

gm100861 posted on 2016-2-29 11:32:57
I'm running CentOS 7 with stock Apache Flume 1.5.0.

If I install Flume on a machine that also has Hadoop, it writes logs to HDFS just fine. If I install it on a machine without Hadoop, writing fails with an error. From the log I can roughly tell that some jar can't be found, but I don't know which jars the HDFS sink actually depends on. Could someone take a look for me?

The configuration file is as follows:
[mw_shl_code=applescript,true]a1.sources = r1
a1.sinks = k1
a1.channels = c1

# Describe/configure the source
a1.sources.r1.type = avro
a1.sources.r1.port = 44446
a1.sources.r1.bind = 0.0.0.0
a1.sources.r1.channels = c1

# Describe the sink
a1.sinks.k1.type = hdfs
a1.sinks.k1.channel = c1
a1.sinks.k1.hdfs.path = hdfs://192.168.92:9000/user/flume/syslogtcp
a1.sinks.k1.hdfs.filePrefix = Syslog
a1.sinks.k1.hdfs.round = true
a1.sinks.k1.hdfs.roundValue = 10
a1.sinks.k1.hdfs.roundUnit = minute

# Use a channel which buffers events in memory
a1.channels.c1.type = memory
a1.channels.c1.capacity = 1000
a1.channels.c1.transactionCapacity = 100

# Bind the source and sink to the channel
a1.sources.r1.channels = c1
a1.sinks.k1.channel = c1[/mw_shl_code]

The startup command is as follows:
[mw_shl_code=applescript,true][root@localhost apache-flume-1.5.0-bin]# ./bin/flume-ng agent --conf conf --conf-file ./conf/hdfs_sink.properties --name a1 -Dflume.root.logger=INFO,LOGFILE[/mw_shl_code]

The log output is as follows:
[mw_shl_code=applescript,true]29 Feb 2016 05:57:10,301 INFO  [lifecycleSupervisor-1-0] (org.apache.flume.node.PollingPropertiesFileConfigurationProvider.start:61)  - Configuration provider starting
29 Feb 2016 05:57:10,308 INFO  [conf-file-poller-0] (org.apache.flume.node.PollingPropertiesFileConfigurationProvider$FileWatcherRunnable.run:133)  - Reloading configuration file:./conf/hdfs_sink.properties
29 Feb 2016 05:57:10,312 INFO  [conf-file-poller-0] (org.apache.flume.conf.FlumeConfiguration$AgentConfiguration.addProperty:1016)  - Processing:k1
29 Feb 2016 05:57:10,313 INFO  [conf-file-poller-0] (org.apache.flume.conf.FlumeConfiguration$AgentConfiguration.addProperty:1016)  - Processing:k1
29 Feb 2016 05:57:10,313 INFO  [conf-file-poller-0] (org.apache.flume.conf.FlumeConfiguration$AgentConfiguration.addProperty:930)  - Added sinks: k1 Agent: a1
29 Feb 2016 05:57:10,313 INFO  [conf-file-poller-0] (org.apache.flume.conf.FlumeConfiguration$AgentConfiguration.addProperty:1016)  - Processing:k1
29 Feb 2016 05:57:10,314 INFO  [conf-file-poller-0] (org.apache.flume.conf.FlumeConfiguration$AgentConfiguration.addProperty:1016)  - Processing:k1
29 Feb 2016 05:57:10,314 INFO  [conf-file-poller-0] (org.apache.flume.conf.FlumeConfiguration$AgentConfiguration.addProperty:1016)  - Processing:k1
29 Feb 2016 05:57:10,314 INFO  [conf-file-poller-0] (org.apache.flume.conf.FlumeConfiguration$AgentConfiguration.addProperty:1016)  - Processing:k1
29 Feb 2016 05:57:10,314 INFO  [conf-file-poller-0] (org.apache.flume.conf.FlumeConfiguration$AgentConfiguration.addProperty:1016)  - Processing:k1
29 Feb 2016 05:57:10,322 INFO  [conf-file-poller-0] (org.apache.flume.conf.FlumeConfiguration.validateConfiguration:140)  - Post-validation flume configuration contains configuration for agents: [a1]
29 Feb 2016 05:57:10,323 INFO  [conf-file-poller-0] (org.apache.flume.node.AbstractConfigurationProvider.loadChannels:150)  - Creating channels
29 Feb 2016 05:57:10,330 INFO  [conf-file-poller-0] (org.apache.flume.channel.DefaultChannelFactory.create:40)  - Creating instance of channel c1 type memory
29 Feb 2016 05:57:10,336 INFO  [conf-file-poller-0] (org.apache.flume.node.AbstractConfigurationProvider.loadChannels:205)  - Created channel c1
29 Feb 2016 05:57:10,337 INFO  [conf-file-poller-0] (org.apache.flume.source.DefaultSourceFactory.create:39)  - Creating instance of source r1, type avro
29 Feb 2016 05:57:10,354 INFO  [conf-file-poller-0] (org.apache.flume.sink.DefaultSinkFactory.create:40)  - Creating instance of sink: k1, type: hdfs
29 Feb 2016 05:57:10,372 ERROR [conf-file-poller-0] (org.apache.flume.node.PollingPropertiesFileConfigurationProvider$FileWatcherRunnable.run:145)  - Failed to start agent because dependencies were not found in classpath. Error follows.
java.lang.NoClassDefFoundError: org/apache/commons/configuration/Configuration
        at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.<init>(DefaultMetricsSystem.java:38)
        at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.<clinit>(DefaultMetricsSystem.java:36)
        at org.apache.hadoop.security.UserGroupInformation$UgiMetrics.create(UserGroupInformation.java:108)
        at org.apache.hadoop.security.UserGroupInformation.<clinit>(UserGroupInformation.java:210)
        at org.apache.flume.sink.hdfs.HDFSEventSink.authenticate(HDFSEventSink.java:553)
        at org.apache.flume.sink.hdfs.HDFSEventSink.configure(HDFSEventSink.java:272)
        at org.apache.flume.conf.Configurables.configure(Configurables.java:41)
        at org.apache.flume.node.AbstractConfigurationProvider.loadSinks(AbstractConfigurationProvider.java:418)
        at org.apache.flume.node.AbstractConfigurationProvider.getConfiguration(AbstractConfigurationProvider.java:103)
        at org.apache.flume.node.PollingPropertiesFileConfigurationProvider$FileWatcherRunnable.run(PollingPropertiesFileConfigurationProvider.java:140)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
        at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
        at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.ClassNotFoundException: org.apache.commons.configuration.Configuration
        at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
        ... 17 more
29 Feb 2016 06:21:20,766 INFO  [agent-shutdown-hook] (org.apache.flume.lifecycle.LifecycleSupervisor.stop:79)  - Stopping lifecycle supervisor 11
29 Feb 2016 06:21:20,778 INFO  [agent-shutdown-hook] (org.apache.flume.node.PollingPropertiesFileConfigurationProvider.stop:83)  - Configuration provider stopping[/mw_shl_code]

The jars under Flume's lib directory are as follows:
[mw_shl_code=applescript,true][root@localhost lib]# ls
apache-log4j-extras-1.1.jar    flume-hdfs-sink-1.5.0.jar                 flume-tools-1.5.0.jar                   log4j-1.2.17.jar
async-1.4.0.jar                flume-irc-sink-1.5.0.jar                  flume-twitter-source-1.5.0.jar          mapdb-0.9.9.jar
asynchbase-1.5.0.jar           flume-jdbc-channel-1.5.0.jar              gson-2.2.2.jar                          mina-core-2.0.4.jar
avro-1.7.3.jar                 flume-jms-source-1.5.0.jar                guava-11.0.2.jar                        netty-3.5.12.Final.jar
avro-ipc-1.7.3.jar             flume-ng-configuration-1.5.0.jar          hadoop-common-2.5.2.jar                 paranamer-2.3.jar
commons-cli-1.2.jar            flume-ng-core-1.5.0.jar                   hadoop-mapreduce-client-core-2.5.2.jar  protobuf-java-2.5.0.jar
commons-codec-1.8.jar          flume-ng-elasticsearch-sink-1.5.0.jar     httpclient-4.2.1.jar                    servlet-api-2.5-20110124.jar
commons-collections-3.2.1.jar  flume-ng-embedded-agent-1.5.0.jar         httpcore-4.2.1.jar                      slf4j-api-1.6.1.jar
commons-dbcp-1.4.jar           flume-ng-hbase-sink-1.5.0.jar             irclib-1.10.jar                         slf4j-log4j12-1.6.1.jar
commons-io-2.1.jar             flume-ng-log4jappender-1.5.0.jar          jackson-core-asl-1.9.3.jar              snappy-java-1.0.4.1.jar
commons-lang-2.5.jar           flume-ng-morphline-solr-sink-1.5.0.jar    jackson-mapper-asl-1.9.3.jar            twitter4j-core-3.0.3.jar
commons-logging-1.1.1.jar      flume-ng-node-1.5.0.jar                   jetty-6.1.26.jar                        twitter4j-media-support-3.0.3.jar
commons-pool-1.5.4.jar         flume-ng-sdk-1.5.0.jar                    jetty-util-6.1.26.jar                   twitter4j-stream-3.0.3.jar
derby-10.8.2.2.jar             flume-scribe-source-1.5.0.jar             joda-time-2.1.jar                       velocity-1.7.jar
flume-avro-source-1.5.0.jar    flume-spillable-memory-channel-1.5.0.jar  jsr305-1.3.9.jar                        zookeeper-3.3.6.jar
flume-file-channel-1.5.0.jar   flume-thrift-source-1.5.0.jar             libthrift-0.7.0.jar[/mw_shl_code]
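
For reference, here is a quick way to check whether any of these jars actually contains the class named in the stack trace. This is just a diagnostic sketch; the install path below is an assumption, so adjust it to wherever apache-flume-1.5.0-bin lives on your machine.
[mw_shl_code=bash,true]# Search every jar under Flume's lib for the missing commons-configuration class.
cd /opt/apache-flume-1.5.0-bin/lib   # assumed install location, adjust as needed
for j in *.jar; do
  unzip -l "$j" 2>/dev/null \
    | grep -q 'org/apache/commons/configuration/Configuration.class' \
    && echo "found in $j"
done
# No output means nothing on Flume's classpath provides commons-configuration,
# which matches the NoClassDefFoundError in the log above.[/mw_shl_code]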

On the machine where Hadoop is installed, the exact same configuration writes to HDFS without problems. Surely I don't have to deploy a full Hadoop installation on every agent machine? Please help, it's urgent...

gm100861 posted on 2016-2-29 12:48:36
hadoop-hdfs
hadoop-mapreduce-client-core
hadoop-common
hadoop-auth
commons-configuration

Those are the jars that are needed. I found them by trial and error, one at a time, until it worked. Hope this helps anyone who runs into the same problem later.
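
A minimal sketch of pulling those jars out of an existing Hadoop 2.5.2 installation and dropping them into Flume's lib directory. Both paths are assumptions (adjust to your layout), and find is used because the subdirectories under share/hadoop vary between Hadoop versions:
[mw_shl_code=bash,true]# Assumed locations, change to match your machines.
HADOOP_HOME=/opt/hadoop-2.5.2
FLUME_LIB=/opt/apache-flume-1.5.0-bin/lib

# Copy the five jars listed above, skipping test jars. The loose name patterns may
# pick up an extra hadoop-hdfs-* helper jar or two, which is harmless.
for name in hadoop-hdfs hadoop-mapreduce-client-core hadoop-common hadoop-auth commons-configuration; do
  find "$HADOOP_HOME/share/hadoop" -name "${name}-*.jar" ! -name '*-tests.jar' \
       -exec cp -v {} "$FLUME_LIB/" \;
done[/mw_shl_code]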


wscl1213 posted on 2016-2-29 11:58:56
If Flume needs the Hadoop jars, they have to be present on that machine.
In theory a remote-invocation approach would avoid this, but Flume doesn't seem to support that yet.
What's missing is most likely the Hadoop configuration jar; you can also test further yourself to pin down exactly which jars are needed.
Copying the relevant jars from Hadoop's share directory over should be enough; the official docs don't seem to cover this.
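
If you would rather not mix the Hadoop jars into Flume's own lib directory, another option is to stage them in a separate directory and point Flume at it via the classpath. A sketch, assuming (as I recall) that the flume-ng launcher picks up FLUME_CLASSPATH from conf/flume-env.sh, and using /opt/hadoop-client-jars as a hypothetical staging directory for the copied jars:
[mw_shl_code=bash,true]# conf/flume-env.sh (copy conf/flume-env.sh.template first if the file does not exist yet)
# A trailing /* wildcard is accepted in Java classpath entries, so one entry covers the directory.
FLUME_CLASSPATH="/opt/hadoop-client-jars/*"[/mw_shl_code]
This is probably also why the agent works when it runs on the Hadoop node itself: the flume-ng script seems to detect a local hadoop installation and append its classpath automatically, so all of these jars are already visible there.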
