(4) Import the data (execution time: about 20 seconds)

Note that we have already exited MySQL and returned to the Shell prompt, so the import can now be run:
cd /usr/local/sqoop
./bin/sqoop export --connect jdbc:mysql://localhost:3306/website --username root --password 123456 --table user_action --export-dir '/user/hive/warehouse/website.db/user_action' --fields-terminated-by '\t'  # import command
Explanation of the arguments:
./bin/sqoop export                                           # copy data from Hive into MySQL
--connect jdbc:mysql://localhost:3306/website                # JDBC URL of the target MySQL database
--username root                                              # MySQL login user
--password 123456                                            # MySQL login password
--table user_action                                          # target MySQL table the data is imported into
--export-dir '/user/hive/warehouse/website.db/user_action'   # HDFS directory of the Hive table being exported
--fields-terminated-by '\t'                                  # field separator used in the exported Hive files
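Conceptually, for every line in the export directory, `sqoop export` splits the record on the `--fields-terminated-by` separator and turns it into an INSERT against the target table. A minimal sketch of that per-record mapping (the column names and sample record are illustrative assumptions, not read from the real `user_action` schema):

```python
# Sketch of what `sqoop export` does per record: split each line of the
# HDFS file on the field separator and build the equivalent INSERT.
# Column names below are illustrative only.
COLUMNS = ["id", "uid", "item_id", "behavior_type", "visit_date"]

def line_to_insert(line: str, table: str = "user_action", sep: str = "\t") -> str:
    """Map one exported TSV record to the equivalent INSERT statement."""
    fields = line.rstrip("\n").split(sep)
    if len(fields) != len(COLUMNS):
        # A field-count mismatch is a common cause of failed exports.
        raise ValueError(f"expected {len(COLUMNS)} fields, got {len(fields)}")
    values = ", ".join(f"'{v}'" for v in fields)
    return f"INSERT INTO {table} ({', '.join(COLUMNS)}) VALUES ({values});"

print(line_to_insert("1\t10001\t285259775\t1\t2014-12-08"))
```

In practice Sqoop batches these writes across map tasks rather than issuing one statement at a time, but the field-to-column mapping is the same.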
16/11/28 09:18:12 INFO mapreduce.Job: Job job_local1006738657_0001 completed successfully
16/11/28 09:18:12 INFO mapreduce.Job: Counters: 20
File System Counters
FILE: Number of bytes read=72216458
FILE: Number of bytes written=73973600
FILE: Number of read operations=0
FILE: Number of large read operations=0
FILE: Number of write operations=0
HDFS: Number of bytes read=38989562
HDFS: Number of bytes written=0
HDFS: Number of read operations=78
HDFS: Number of large read operations=0
HDFS: Number of write operations=0
Map-Reduce Framework
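If the export fails instead of completing, the usual culprit is a record whose field count does not match the table. It can help to confirm that every line of the data file has the same number of tab-separated fields; a quick sketch using a made-up two-line sample (the file path and records are assumptions, not the real data set):

```shell
# Hypothetical sanity check: write a tiny sample in the same TSV format,
# then count the fields per line using '\t' as the separator.
printf '1\t10001\t2014-12-08\n2\t10002\t2014-12-09\n' > /tmp/user_action_sample.txt
awk -F'\t' '{print NF}' /tmp/user_action_sample.txt   # each line should report the same field count
```

Running the same `awk` over (a head of) the real file in the HDFS export directory, after copying it locally, gives the same kind of check.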