user@master ~ $ hadoop fs -rmr mysql_data

Otherwise, when sqoop create-hive-table or sqoop import is run later, the following message will appear:
13/12/22 14:14:55 INFO hive.HiveImport: Hive import complete.
13/12/22 14:14:55 INFO hive.HiveImport: Export directory is not empty, keeping it.
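The rule behind that message can be illustrated with a small local stand-in (a hypothetical /tmp path, not the real HDFS directory): Sqoop keeps a non-empty export directory rather than overwriting it, so the old directory must be removed before re-running the import.

```shell
# Hypothetical local stand-in for the HDFS export directory "mysql_data".
mkdir -p /tmp/mysql_data && touch /tmp/mysql_data/part-m-00000

# Local equivalent of `hadoop fs -rmr mysql_data`:
rm -r /tmp/mysql_data

# Confirm the directory is gone before re-running sqoop import.
ls /tmp/mysql_data 2>/dev/null || echo "export directory removed"
```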
user@master ~ $ export DBID=your_account
user@master ~ $ sqoop create-hive-table --connect "jdbc:mysql://master.3du.me/$DBID" --table mysql_data --username $DBID -P --hive-table my_$DBID
Enter password: (enter your password)
13/12/22 13:44:43 INFO tool.BaseSqoopTool: Using Hive-specific delimiters for output. You can override
13/12/22 13:44:43 INFO tool.BaseSqoopTool: delimiters with --fields-terminated-by, etc.
13/12/22 13:44:43 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
13/12/22 13:44:44 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `mysql_data` AS t LIMIT 1
13/12/22 13:44:44 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `mysql_data` AS t LIMIT 1
... (output omitted) ...
user@master ~ $ hadoop fs -ls /user/hive/warehouse
Found 3 items
drwx------   - jazz  supergroup          0 2013-12-17 00:24 /user/hive/warehouse/dummy
drwx------   - $DBID supergroup          0 2013-12-22 12:25 /user/hive/warehouse/my_$DBID
user@master ~ $ hadoop fs -ls /user/hive/warehouse/my_$DBID
Next, run sqoop import with the --hive-import --hive-table my_$DBID parameters:
user@master ~ $ export DBID=your_account
user@master ~ $ sqoop import --connect "jdbc:mysql://master.3du.me/$DBID" --table mysql_data --username $DBID -P --hive-import --hive-table my_$DBID
Enter password: (enter your password)
13/12/22 13:51:58 INFO tool.BaseSqoopTool: Using Hive-specific delimiters for output. You can override
13/12/22 13:51:58 INFO tool.BaseSqoopTool: delimiters with --fields-terminated-by, etc.
13/12/22 13:51:58 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
13/12/22 13:51:58 INFO tool.CodeGenTool: Beginning code generation
13/12/22 13:51:58 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `mysql_data` AS t LIMIT 1
13/12/22 13:51:58 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `mysql_data` AS t LIMIT 1
... (output omitted) ...
13/12/22 13:54:59 INFO hive.HiveImport: OK
13/12/22 13:54:59 INFO hive.HiveImport: Time taken: 5.73 seconds
13/12/22 13:55:00 INFO hive.HiveImport: Loading data to table default.mysql_data
13/12/22 13:55:00 INFO hive.HiveImport: OK
13/12/22 13:55:00 INFO hive.HiveImport: Time taken: 1.693 seconds
13/12/22 13:55:00 INFO hive.HiveImport: Hive import complete.
user@master ~ $ hadoop fs -ls /user/hive/warehouse/my_$DBID
Found 2 items
-rw-------   3 user user          0 2013-12-22 13:54 /user/hive/warehouse/mysql_data/_SUCCESS
-rw-------   3 user user         16 2013-12-22 13:54 /user/hive/warehouse/mysql_data/part-m-00000
user@master ~ $ hadoop fs -cat /user/hive/warehouse/my_$DBID/part-m-00000
No encryption was performed by peer.
13/12/22 14:00:36 INFO util.NativeCodeLoader: Loaded the native-hadoop library
1Hello
2World
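The rows print as "1Hello" and "2World" because Hive's default field delimiter is the non-printing Ctrl-A byte (\001), which is exactly the delimiter the earlier Sqoop log offered to override with --fields-terminated-by. A small local sketch (using a hypothetical /tmp path, not HDFS) makes the hidden byte visible:

```shell
# Recreate a file with Hive's default field delimiter, \001 (Ctrl-A).
printf '1\001Hello\n2\001World\n' > /tmp/part-m-00000

# Plain cat hides the control byte, so the two fields look glued together:
cat /tmp/part-m-00000

# cat -v prints control characters visibly; \001 is rendered as ^A:
cat -v /tmp/part-m-00000
```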