/usr/local/service/sqoop/bin/sqoop-import \
--connect jdbc:mysql://$mysqlIP/test \
--username root \
-P --table sqoop_test \
-m 4 \
--target-dir /sqoop \
--fields-terminated-by '\t'
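To see what the import produces, here is a minimal sketch of one line of the resulting HDFS file (the file name and row values are hypothetical, but the layout follows from `--fields-terminated-by '\t'`: one MySQL row per line, columns separated by tabs):

```shell
# One hypothetical row as Sqoop would write it under /sqoop with a tab delimiter:
printf '1\tuser_a\t2020-10-21 16:02:29\n' > part-m-00000
# Count the tab-separated columns in the first line:
cols=$(awk -F'\t' 'NR==1{print NF}' part-m-00000)
echo "columns per row: $cols"   # prints "columns per row: 3"
```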
/usr/local/service/sqoop/bin/sqoop-export \
--connect jdbc:mysql://$mysqlIP/test \
--username root \
-P --table sqoop_test \
-m 4 \
--export-dir /sqoop \
--input-fields-terminated-by '\t'
Note: in incremental mode, data imported into HDFS cannot be automatically matched to partitions, which leads to partition data skew. Prefer a full re-import whenever possible.
/usr/local/service/sqoop/bin/sqoop-import \
--connect jdbc:mysql://$mysqlIP/test \
--username root \
-P --table sqoop_test \
--check-column id --incremental append --last-value 3 \
--target-dir /sqoop \
--fields-terminated-by '\t'
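The effect of append mode can be sketched with a filter: only rows whose check column exceeds `--last-value` are pulled on this run (the source rows below are hypothetical):

```shell
# Simulated source table rows in (id,name) form -- hypothetical data:
printf '1,a\n2,b\n3,c\n4,d\n5,e\n' > source_rows
# With --check-column id and --last-value 3, append mode imports only
# rows whose id is greater than 3:
new_rows=$(awk -F',' '$1 > 3 {n++} END {print n+0}' source_rows)
echo "rows imported this run: $new_rows"   # prints "rows imported this run: 2"
```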
/usr/local/service/sqoop/bin/sqoop-import \
--connect jdbc:mysql://$mysqlIP/test \
--username root \
-P --table sqoop_test \
--check-column time --incremental lastmodified --merge-key id --last-value '2020-10-21 16:02:29' \
--target-dir /sqoop \
--fields-terminated-by '\t'
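The merge step of lastmodified mode can be sketched as follows: rows already in HDFS and freshly re-imported rows are combined on `--merge-key id`, and for any id present in both sets the newer copy wins (the `(id,value)` rows below are hypothetical):

```shell
# Rows already in HDFS, and rows re-imported because their time column
# moved past --last-value -- hypothetical data:
printf '1,old\n2,old\n' > existing
printf '2,new\n3,new\n' > delta
# Merge on the first field (id); later rows (the delta) overwrite earlier ones:
merged=$(cat existing delta | awk -F',' '{v[$1]=$2} END {for (k in v) print k","v[k]}' | sort)
echo "$merged"
```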
/usr/local/service/sqoop/bin/sqoop-import \
--connect jdbc:mysql://$mysqlIP/test \
--username root -P --table sqoop_test \
--hive-import --hive-database db_sqoop_test --hive-table sqoop_test
If the Hive table's storage format is not text, you must use the following approach instead:
/usr/local/service/sqoop/bin/sqoop-import \
--connect jdbc:mysql://$mysqlIP/test \
--username root -P --table sqoop_test \
--hcatalog-database test_dlm --hcatalog-table table_name \
--hive-partition-key dt --hive-partition-value 201905
If the Hive table is partitioned, you must pass the --hive-partition-key and --hive-partition-value parameters.
The column names of the MySQL table and the Hive table must match exactly.
When exporting via HDFS files, make sure the delimiter splits the content correctly; otherwise the job fails.
/usr/local/service/sqoop/bin/sqoop-export \
--connect jdbc:mysql://$mysqlIP/test --username root -P \
--table table_from_hive \
--export-dir /usr/hive/warehouse/hive_to_sqoop.db/hive_test
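One delimiter pitfall worth checking before running the export above: Hive text tables use Ctrl-A (`\001`) as their default field delimiter, not a comma, so the data files under the warehouse directory look like the sketch below (rows are hypothetical). If the export misreads them, passing `--input-fields-terminated-by '\001'` is the usual fix:

```shell
# Two hypothetical (id,name) rows as a Hive text table stores them,
# with the default Ctrl-A (\001) field delimiter:
printf '1\001alice\n2\001bob\n' > hive_datafile
# Split on \001 and count the columns in the first line:
d=$(printf '\001')
cols=$(awk -v FS="$d" 'NR==1{print NF}' hive_datafile)
echo "columns with the \\001 delimiter: $cols"   # prints "columns with the \001 delimiter: 2"
```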
/usr/local/service/sqoop/bin/sqoop-export \
--connect jdbc:mysql://$mysqlIP/test --username root -P \
--table table_from_hive \
--hcatalog-database hive_to_sqoop --hcatalog-table hive_test \
--hive-partition-key dt --hive-partition-value 201905
If the job fails, check the error logs in the YARN ResourceManager UI. For example:
Error: java.io.IOException: Can't export data, please check failed map task logs
    at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:122)
    at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:39)
    at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:146)
    at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:175)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1844)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:169)
Caused by: java.lang.RuntimeException: Can't parse input data: '50.00'
    at bill_detail_song_201905_sqoop_cos.__loadFromFields(bill_detail_song_201905_sqoop_cos.java:2925)
    at bill_detail_song_201905_sqoop_cos.parse(bill_detail_song_201905_sqoop_cos.java:2410)
    at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:89)
    ... 10 more
Caused by: java.lang.NumberFormatException: For input string: "50.00"
    at java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
    at java.lang.Long.parseLong(Long.java:589)
    at java.lang.Long.valueOf(Long.java:803)
    at bill_detail_song_201905_sqoop_cos.__loadFromFields(bill_detail_song_201905_sqoop_cos.java:2616)
    ... 12 more
Investigation showed that the bizkey field itself contains a comma, which conflicts with the field delimiter; re-importing with an explicit delimiter fixes it. As a rule of thumb, use | as the delimiter, or || for complex data, and avoid common characters such as commas, semicolons, spaces, and tabs.
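The failure is easy to reproduce with a one-line sketch (the row values are hypothetical): when a field value contains the delimiter, the row splits into an extra field and a fragment like "50.00" lands in a numeric column, which is exactly the NumberFormatException above.

```shell
# bizkey contains a comma ("song,50.00"), so splitting on ',' yields an
# extra field and misaligns the columns:
bad=$(printf '1001,song,50.00,2020-10-21\n' | awk -F',' '{print NF}')
echo "fields seen with ',' delimiter: $bad"    # prints "fields seen with ',' delimiter: 4"
# With '|' as the delimiter, the same three columns split cleanly:
good=$(printf '1001|song,50.00|2020-10-21\n' | awk -F'|' '{print NF}')
echo "fields seen with '|' delimiter: $good"   # prints "fields seen with '|' delimiter: 3"
```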