This continues the previous article, 《新增访客数量MR统计之MR数据输出到MySQL》 (New Visitor Count MR Statistics: Writing MR Output to MySQL).
Hive 1.2.1 can directly map an HBase table that already exists.
If you want to create a table in Hive when the corresponding HBase table does not yet exist, and still set up the mapping, use the recompiled hive-1.2.1-hbase build.
1. Create an external table in Hive and map it to HBase
CREATE EXTERNAL TABLE event_log_20180728 (
key string,
pl string,
ver string,
s_time string,
u_ud string,
u_sd string,
en string)
STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,info:pl,info:ver,info:s_time,info:u_ud,info:u_sd,info:en")
TBLPROPERTIES ("hbase.table.name" = "event_log_20180728");
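As a quick sanity check that the column mapping works, you can pull a few rows through the mapped table (the column names are those defined above; the LIMIT value is arbitrary):
SELECT key, pl, en FROM event_log_20180728 LIMIT 10;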
Count how many new users there are:
select count(*) from event_log_20180728 where en = "e_l";
2. Extract the data, apply initial filtering, and save the result to a temporary table
Create the temporary table
CREATE TABLE stats_hourly_tmp01 (
pl string,
ver string,
s_time string,
u_ud string,
u_sd string,
en string,
`date` string,
hour int
);
Extract the raw data into the temporary table
INSERT OVERWRITE TABLE stats_hourly_tmp01
SELECT pl, ver, s_time, u_ud, u_sd, en,
from_unixtime(cast(s_time/1000 as int), 'yyyy-MM-dd'),
hour(from_unixtime(cast(s_time/1000 as int), 'yyyy-MM-dd HH:mm:ss'))
FROM event_log_20180728
WHERE en = "e_l" or en = "e_pv";
SELECT from_unixtime(cast(s_time/1000 as int), 'yyyy-MM-dd'),
from_unixtime(cast(s_time/1000 as int), 'yyyy-MM-dd HH:mm:ss')
FROM event_log_20180728;
Check the result
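As a worked example of the conversion: s_time is in milliseconds, so dividing by 1000 yields epoch seconds. The timestamp below is a hypothetical value, not taken from the dataset, and from_unixtime formats it in the session's time zone, so the exact output depends on your cluster settings:
-- 1532736000000 ms / 1000 = 1532736000 s, which is 2018-07-28 00:00:00 UTC
SELECT from_unixtime(cast(1532736000000 / 1000 as int), 'yyyy-MM-dd HH:mm:ss');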
3. Analysis of the specific KPIs
Create a temporary table to hold the results
CREATE TABLE stats_hourly_tmp02 (
pl string,
ver string,
`date` string,
kpi string,
hour int,
value int
);
Count users: there are as many users as there are distinct u_ud values (filtered to en = "e_l", this gives the hourly new-install-users KPI).
The platform dimension is (name, version).
INSERT OVERWRITE TABLE stats_hourly_tmp02
SELECT pl, ver, `date`, 'hourly_new_install_users' as kpi, hour, COUNT(distinct u_ud) as v
FROM stats_hourly_tmp01
WHERE en = "e_l"
GROUP BY pl, ver, `date`, hour;
Check the result:
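For instance (LIMIT chosen arbitrarily):
SELECT * FROM stats_hourly_tmp02 WHERE kpi = 'hourly_new_install_users' LIMIT 10;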
Compute the session-length KPI.
Session length = timestamp of the last record in a session - timestamp of the first record = maxtime - mintime
Steps:
1. Compute each session's length, grouping by u_sd (the inner query below)
2. Sum the session lengths within each time bucket (the outer query)
The platform dimension is (name, version).
INSERT INTO TABLE stats_hourly_tmp02
SELECT pl, ver, `date`, 'hourly_session_length' as kpi, hour, sum(s_length)/1000 as v
FROM (
SELECT pl, ver, `date`, hour, u_sd, (max(s_time) - min(s_time)) as s_length
FROM stats_hourly_tmp01
GROUP BY pl, ver, `date`, hour, u_sd
) tmp
GROUP BY pl, ver, `date`, hour;
Check the result
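As a worked example of the arithmetic (hypothetical s_time values): a session whose earliest record is at 1532750000000 ms and latest at 1532750120000 ms has s_length = 120000 ms, and sum(s_length)/1000 converts the total to seconds:
-- (max - min) in milliseconds, divided by 1000 to get seconds
SELECT (1532750120000 - 1532750000000) / 1000;  -- 120.0 seconds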
Convert the tmp02 data to match the structure of the MySQL table.
Narrow table to wide table => save the converted result into a temporary table
CREATE TABLE stats_hourly_tmp03 (
pl string, ver string, `date` string, kpi string,
hour00 int, hour01 int, hour02 int, hour03 int,
hour04 int, hour05 int, hour06 int, hour07 int,
hour08 int, hour09 int, hour10 int, hour11 int,
hour12 int, hour13 int, hour14 int, hour15 int,
hour16 int, hour17 int, hour18 int, hour19 int,
hour20 int, hour21 int, hour22 int, hour23 int
);
INSERT OVERWRITE TABLE stats_hourly_tmp03
SELECT pl, ver, `date`, kpi,
max(case when hour = 0 then value else 0 end) as h0,
max(case when hour = 1 then value else 0 end) as h1,
max(case when hour = 2 then value else 0 end) as h2,
max(case when hour = 3 then value else 0 end) as h3,
max(case when hour = 4 then value else 0 end) as h4,
max(case when hour = 5 then value else 0 end) as h5,
max(case when hour = 6 then value else 0 end) as h6,
max(case when hour = 7 then value else 0 end) as h7,
max(case when hour = 8 then value else 0 end) as h8,
max(case when hour = 9 then value else 0 end) as h9,
max(case when hour = 10 then value else 0 end) as h10,
max(case when hour = 11 then value else 0 end) as h11,
max(case when hour = 12 then value else 0 end) as h12,
max(case when hour = 13 then value else 0 end) as h13,
max(case when hour = 14 then value else 0 end) as h14,
max(case when hour = 15 then value else 0 end) as h15,
max(case when hour = 16 then value else 0 end) as h16,
max(case when hour = 17 then value else 0 end) as h17,
max(case when hour = 18 then value else 0 end) as h18,
max(case when hour = 19 then value else 0 end) as h19,
max(case when hour = 20 then value else 0 end) as h20,
max(case when hour = 21 then value else 0 end) as h21,
max(case when hour = 22 then value else 0 end) as h22,
max(case when hour = 23 then value else 0 end) as h23
FROM stats_hourly_tmp02
GROUP BY pl, ver, `date`, kpi;
select hour14, hour15, hour16 from stats_hourly_tmp03;
Result:
Convert the dimension attribute values to IDs, using UDFs for the conversion
1. Put all the custom Hive UDFs from the udf folder into the project
2. Package them by running maven install
3. Upload the resulting jar file to the /jar folder on HDFS
4. Create the custom functions in Hive with the following commands:
create function dateconverter as 'com.xlgl.wzy.hive.udf.DateDimensionConverterUDF' using jar 'hdfs://master:9000/jar/transformer-0.0.1.jar';
create function kpiconverter as 'com.xlgl.wzy.hive.udf.KpiDimensionConverterUDF' using jar 'hdfs://master:9000/jar/transformer-0.0.1.jar';
create function platformconverter as 'com.xlgl.wzy.hive.udf.PlatformDimensionConverterUDF' using jar 'hdfs://master:9000/jar/transformer-0.0.1.jar';
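To spot-check the registrations, the functions can be called directly; each should return the corresponding dimension id. The kpi and date arguments below come from the tables above, while the (pl, ver) pair is a hypothetical sample value:
select dateconverter('2018-07-28', 'day');
select kpiconverter('hourly_new_install_users');
select platformconverter('website', '1.0');  -- hypothetical (pl, ver) pair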
Create the final table in Hive matching the MySQL table structure
CREATE TABLE stats_hourly (
platform_dimension_id int,
date_dimension_id int,
kpi_dimension_id int,
hour00 int, hour01 int, hour02 int, hour03 int,
hour04 int, hour05 int, hour06 int, hour07 int,
hour08 int, hour09 int, hour10 int, hour11 int,
hour12 int, hour13 int, hour14 int, hour15 int,
hour16 int, hour17 int, hour18 int, hour19 int,
hour20 int, hour21 int, hour22 int, hour23 int
);
INSERT OVERWRITE TABLE stats_hourly
SELECT
platformconverter(pl, ver), dateconverter(`date`, 'day'), kpiconverter(kpi),
hour00, hour01, hour02, hour03,
hour04, hour05, hour06, hour07,
hour08, hour09, hour10, hour11,
hour12, hour13, hour14, hour15,
hour16, hour17, hour18, hour19,
hour20, hour21, hour22, hour23
FROM stats_hourly_tmp03;
Export from Hive to MySQL with Sqoop ('\001' is Hive's default field delimiter, hence the --input-fields-terminated-by value):
bin/sqoop export \
--connect jdbc:mysql://master:3306/test \
--username root \
--password 123456 \
--table stats_hourly \
--export-dir /user/hive/warehouse/log_lx.db/stats_hourly \
-m 1 \
--input-fields-terminated-by '\001'
Query MySQL to check the exported data
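For example (column names as defined above; LIMIT chosen arbitrarily):
SELECT platform_dimension_id, date_dimension_id, kpi_dimension_id, hour00, hour12, hour23
FROM stats_hourly
LIMIT 10;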