liuzhixin137 posted on 2016-4-15 17:47:34

Sqoop import from MySQL into Hive keeps failing with a database connection error (solved)

I saw in this thread that switching to an IP address should fix it, but mine still fails: http://www.aboutyun.com/thread-9302-1-1.html

The error output is as follows:
# sqoop import --connect jdbc:mysql://192.168.56.101:3306/sessionanalysis --username root --table t_customer_access_log_20141022 --hive-import --table t_customer_access_log_20141022 -m 1
Warning:does not exist! HBase imports will fail.
Please set $HBASE_HOME to the root of your HBase installation.
Warning:does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning:does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
16/04/09 19:26:17 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6
16/04/09 19:26:18 INFO tool.BaseSqoopTool: Using Hive-specific delimiters for output. You can override
16/04/09 19:26:18 INFO tool.BaseSqoopTool: delimiters with --fields-terminated-by, etc.
16/04/09 19:26:18 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
16/04/09 19:26:18 INFO tool.CodeGenTool: Beginning code generation
16/04/09 19:26:18 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `t_customer_access_log_20141022` AS t LIMIT 1
16/04/09 19:26:18 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `t_customer_access_log_20141022` AS t LIMIT 1
16/04/09 19:26:18 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/local/hadoop
Note: /tmp/sqoop-root/compile/eb667ad013b9dc31d5dd017c3ec40c6c/t_customer_access_log_20141022.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
16/04/09 19:26:20 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-root/compile/eb667ad013b9dc31d5dd017c3ec40c6c/t_customer_access_log_20141022.jar
16/04/09 19:26:20 WARN manager.MySQLManager: It looks like you are importing from mysql.
16/04/09 19:26:20 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
16/04/09 19:26:20 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
16/04/09 19:26:20 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
16/04/09 19:26:20 WARN manager.CatalogQueryManager: The table t_customer_access_log_20141022 contains a multi-column primary key. Sqoop will default to the column id only for this job.
16/04/09 19:26:20 WARN manager.CatalogQueryManager: The table t_customer_access_log_20141022 contains a multi-column primary key. Sqoop will default to the column id only for this job.
16/04/09 19:26:20 INFO mapreduce.ImportJobBase: Beginning import of t_customer_access_log_20141022
16/04/09 19:26:20 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/04/09 19:26:20 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
16/04/09 19:26:21 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
16/04/09 19:26:21 INFO client.RMProxy: Connecting to ResourceManager at sparkproject1/192.168.56.101:8032
16/04/09 19:26:23 INFO db.DBInputFormat: Using read commited transaction isolation
16/04/09 19:26:23 INFO mapreduce.JobSubmitter: number of splits:1
16/04/09 19:26:24 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1460176692400_0007
16/04/09 19:26:24 INFO impl.YarnClientImpl: Submitted application application_1460176692400_0007
16/04/09 19:26:24 INFO mapreduce.Job: The url to track the job: http://sparkproject1:8088/proxy/application_1460176692400_0007/
16/04/09 19:26:24 INFO mapreduce.Job: Running job: job_1460176692400_0007
16/04/09 19:26:32 INFO mapreduce.Job: Job job_1460176692400_0007 running in uber mode : false
16/04/09 19:26:32 INFO mapreduce.Job: map 0% reduce 0%
16/04/09 19:26:39 INFO mapreduce.Job: Task Id : attempt_1460176692400_0007_m_000000_0, Status : FAILED
Error: java.lang.RuntimeException: java.lang.RuntimeException: java.sql.SQLException: Access denied for user 'root'@'sparkproject2' (using password: NO)
      at org.apache.sqoop.mapreduce.db.DBInputFormat.setConf(DBInputFormat.java:167)
      at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:73)
      at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
      at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:746)
      at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
      at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
      at java.security.AccessController.doPrivileged(Native Method)
      at javax.security.auth.Subject.doAs(Subject.java:415)
      at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1642)
      at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
Caused by: java.lang.RuntimeException: java.sql.SQLException: Access denied for user 'root'@'sparkproject2' (using password: NO)
      at org.apache.sqoop.mapreduce.db.DBInputFormat.getConnection(DBInputFormat.java:220)
      at org.apache.sqoop.mapreduce.db.DBInputFormat.setConf(DBInputFormat.java:165)
      ... 9 more
Caused by: java.sql.SQLException: Access denied for user 'root'@'sparkproject2' (using password: NO)
      at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:998)
      at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3847)
      at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3783)
      at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:871)
      at com.mysql.jdbc.MysqlIO.secureAuth411(MysqlIO.java:4292)
      at com.mysql.jdbc.MysqlIO.doHandshake(MysqlIO.java:1259)
      at com.mysql.jdbc.ConnectionImpl.coreConnect(ConnectionImpl.java:2249)
      at com.mysql.jdbc.ConnectionImpl.connectOneTryOnly(ConnectionImpl.java:2280)
      at com.mysql.jdbc.ConnectionImpl.createNewIO(ConnectionImpl.java:2079)
      at com.mysql.jdbc.ConnectionImpl.<init>(ConnectionImpl.java:794)
      at com.mysql.jdbc.JDBC4Connection.<init>(JDBC4Connection.java:44)
      at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
      at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
      at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
      at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
      at com.mysql.jdbc.Util.handleNewInstance(Util.java:400)
      at com.mysql.jdbc.ConnectionImpl.getInstance(ConnectionImpl.java:399)
      at com.mysql.jdbc.NonRegisteringDriver.connect(NonRegisteringDriver.java:325)
      at java.sql.DriverManager.getConnection(DriverManager.java:571)
      at java.sql.DriverManager.getConnection(DriverManager.java:215)
      at org.apache.sqoop.mapreduce.db.DBConfiguration.getConnection(DBConfiguration.java:302)
      at org.apache.sqoop.mapreduce.db.DBInputFormat.getConnection(DBInputFormat.java:213)
      ... 10 more

16/04/09 19:26:43 INFO mapreduce.Job: Task Id : attempt_1460176692400_0007_m_000000_1, Status : FAILED
16/04/09 19:26:47 INFO mapreduce.Job: Task Id : attempt_1460176692400_0007_m_000000_2, Status : FAILED
(Both retries failed with the same "Access denied for user 'root'@'sparkproject2' (using password: NO)" stack trace shown above for attempt _0.)

16/04/09 19:26:53 INFO mapreduce.Job: map 100% reduce 0%
16/04/09 19:26:53 INFO mapreduce.Job: Job job_1460176692400_0007 failed with state FAILED due to: Task failed task_1460176692400_0007_m_000000
Job failed as tasks failed. failedMaps:1 failedReduces:0

16/04/09 19:26:53 INFO mapreduce.Job: Counters: 8
      Job Counters
                Failed map tasks=4
                Launched map tasks=4
                Other local map tasks=4
                Total time spent by all maps in occupied slots (ms)=13781
                Total time spent by all reduces in occupied slots (ms)=0
                Total time spent by all map tasks (ms)=13781
                Total vcore-seconds taken by all map tasks=13781
                Total megabyte-seconds taken by all map tasks=14111744
16/04/09 19:26:53 WARN mapreduce.Counters: Group FileSystemCounters is deprecated. Use org.apache.hadoop.mapreduce.FileSystemCounter instead
16/04/09 19:26:53 INFO mapreduce.ImportJobBase: Transferred 0 bytes in 32.143 seconds (0 bytes/sec)
16/04/09 19:26:53 WARN mapreduce.Counters: Group org.apache.hadoop.mapred.Task$Counter is deprecated. Use org.apache.hadoop.mapreduce.TaskCounter instead
16/04/09 19:26:53 INFO mapreduce.ImportJobBase: Retrieved 0 records.
16/04/09 19:26:53 ERROR tool.ImportTool: Error during import: Import job failed!

I have been struggling with this for a long time and still cannot get it to work. Any help would be appreciated. Thanks!

Alkaloid0515 posted on 2016-4-15 17:58:00

Check the following (a sketch of these checks follows this list):
1. Has access been granted?
GRANT ALL PRIVILEGES ON *.* TO 'root'@'sparkproject2' IDENTIFIED BY 'root' WITH GRANT OPTION;

2. Is the MySQL JDBC driver correct?
3. Is a password being supplied?
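
A minimal sketch of the MySQL-side checks, run in the mysql client on the database host 192.168.56.101 (the password 'root' is just the value from the example grant above; substitute your real credentials):

GRANT ALL PRIVILEGES ON *.* TO 'root'@'sparkproject2' IDENTIFIED BY 'root' WITH GRANT OPTION;
-- or, if other worker nodes may also run the map tasks, grant to any host:
GRANT ALL PRIVILEGES ON *.* TO 'root'@'%' IDENTIFIED BY 'root' WITH GRANT OPTION;
FLUSH PRIVILEGES;
-- confirm what the worker node is now allowed to do:
SHOW GRANTS FOR 'root'@'sparkproject2';

The error names 'root'@'sparkproject2' because the map task runs on the worker node sparkproject2, so that host, not the machine where the sqoop command is typed, is the one that needs the grant.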




liuzhixin137 posted on 2016-4-15 17:49:21

@pig2, could you take a look?

liuzhixin137 posted on 2016-4-15 17:50:21

@Riordon Wang Xiaolong.

liuzhixin137 posted on 2016-4-15 19:10:22

Thanks a lot. I found it: it was exactly the missing grant. Thanks again!
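
For anyone hitting the same error: the log also shows "(using password: NO)", so the password has to be passed to Sqoop as well. A rough sketch of a complete invocation once the grant is in place (the password 'root' matches the example grant and is only illustrative; -P can be used instead of --password to prompt for it, and --hive-table names the target Hive table):

sqoop import \
  --connect jdbc:mysql://192.168.56.101:3306/sessionanalysis \
  --username root \
  --password root \
  --table t_customer_access_log_20141022 \
  --hive-import \
  --hive-table t_customer_access_log_20141022 \
  -m 1

Connectivity can also be checked from the worker node first, assuming the mysql client is installed there: mysql -h 192.168.56.101 -P 3306 -u root -p sessionanalysis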

liuzhixin137 posted on 2016-4-15 19:11:00

Alkaloid0515 posted on 2016-4-15 17:58
Check the following:
1. Has access been granted?
GRANT ALL PRIVILEGES ON *.* TO 'root'@'sparkproject2' IDENTIFIED BY 'root' ...

Thanks a lot!

a530491093 posted on 2016-4-21 13:44:10

Just passing by!!!!