How do you write data into HDFS and then load it into a Hive table?

1. Using commands directly
With this approach you have to upload the file to HDFS first.

[root@cdh01 ~]# ll |grep test
-rw-r--r--  1 root root       527 May  9 13:47 test.txt

[root@cdh01 ~]# cat test.txt 
{"name":"testName1", "age":21}
{"name":"testName2", "age":22}
{"name":"testName3", "age":23}
{"name":"testName4", "age":24}
{"name":"testName5", "age":25}
{"name":"testName6", "age":26}
{"name":"testName7", "age":27}
{"name":"testName8", "age":28}
{"name":"testName9", "age":29}
{"name":"testName10","age":30}
{"name":"testName11","age":31}
{"name":"testName12","age":32}
{"name":"testName13","age":33}
{"name":"testName14","age":34}
{"name":"testName15","age":35}
{"name":"testName16","age":36}
{"name":"testName17","age":37}

[root@cdh01 ~]# hadoop fs -put /root/test.txt /user/test.txt
# Note that you need permission on the target directory: /user/hive, for example, is owned by the hive user, and without authorization even root cannot write to it (a quick permission check is sketched after the output below).
[root@cdh01 ~]# hadoop fs -cat /user/test.txt
{"name":"testName1", "age":21}
{"name":"testName2", "age":22}
{"name":"testName3", "age":23}
{"name":"testName4", "age":24}
{"name":"testName5", "age":25}
{"name":"testName6", "age":26}
{"name":"testName7", "age":27}
{"name":"testName8", "age":28}
{"name":"testName9", "age":29}

While we are at it, let's load the data into Hive.
Create the table (JSON-based):

CREATE TABLE test_table(name string, age int)
ROW FORMAT SERDE 'org.apache.hive.hcatalog.data.JsonSerDe'
STORED AS TEXTFILE;
-- the path here is an HDFS path, not a local filesystem path
load data inpath '/user/test.txt' into table test_table;
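
On some Hive installations the JsonSerDe class used above is not on the default classpath and the CREATE TABLE fails with a ClassNotFoundException. If that happens, adding the hive-hcatalog-core jar first usually resolves it; the path below is the typical CDH parcel location and is only an assumption, so adjust it to wherever the jar lives on your cluster.

-- only needed if Hive cannot find org.apache.hive.hcatalog.data.JsonSerDe
ADD JAR /opt/cloudera/parcels/CDH/lib/hive-hcatalog/share/hcatalog/hive-hcatalog-core.jar;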

Query the table to check the result:


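Since the original screenshot is not reproduced here, a minimal check would be a query such as the following (standard HiveQL, nothing specific to this setup):

SELECT * FROM test_table LIMIT 5;
-- the sample file above has 17 records, so this should report 17
SELECT COUNT(*) FROM test_table;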

After the load, the original file is moved (not copied) from its upload location into the table's directory under the Hive warehouse.


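You can confirm the move from the command line. The warehouse path below assumes the default hive.metastore.warehouse.dir of /user/hive/warehouse and the default database; depending on permissions you may need to run the listing as the hive or hdfs user.

# the file is gone from where it was uploaded
[root@cdh01 ~]# hadoop fs -ls /user/test.txt
# and now sits under the table's warehouse directory
[root@cdh01 ~]# hadoop fs -ls /user/hive/warehouse/test_table/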