Guiding questions:
1. How do we rank the Top 10 products by daily sales?
2. How do we compute the monthly order payment rate?
3. How do we set up a funnel analysis for data visualization?
4. How do we report nationwide product sales?
Chapter 11 Requirement 8: Daily Top 10 Products by Sales
11.1 DWS Layer
Use the user purchase detail wide table (dws_sale_detail_daycount) as the DWS source.
11.2 ADS Layer
11.2.1 Table DDL
drop table if exists ads_goods_order_count_day;
create external table ads_goods_order_count_day(
    dt string comment 'statistics date',
    sku_id string comment 'product (SKU) id',
    order_count bigint comment 'number of orders'
) COMMENT 'daily top 10 products by order count'
stored as parquet
location '/warehouse/gmall/ads/ads_goods_order_count_day/';
11.2.2 Loading Data
insert into table ads_goods_order_count_day
select
    '2019-02-10',
    sku_id,
    sum(order_count) order_total_count
from
    dws_sale_detail_daycount
where
    dt='2019-02-10'
group by
    sku_id
order by
    order_total_count desc
limit 10;
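The query above is a group-sum-sort-limit: aggregate order counts per sku_id, sort the totals descending, keep ten rows. The same shape can be sketched as a shell pipeline over made-up tab-separated rows (sku_id, order_count); the data and pipeline are illustrative only, not part of the warehouse:

```shell
# Made-up rows: sku_id <TAB> order_count. Group-sum per sku, sort by the
# summed count descending, keep the top 10 -- the same shape as the query.
printf '1\t5\n2\t9\n1\t3\n3\t1\n2\t4\n' |
awk -F'\t' '{sum[$1] += $2} END {for (k in sum) print k "\t" sum[k]}' |
sort -t$'\t' -k2,2nr |
head -n 10
```

Here sku 2 totals 13 and comes first, mirroring the `order by ... desc limit 10` in the HQL.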
11.2.3 Data Load Script
1) Create the script ads_goods_order_count_day.sh in the /home/kgg/bin directory
[kgg@hadoop102 bin]$ vim ads_goods_order_count_day.sh
Fill the script with the following content:
#!/bin/bash
# Define variables for easy modification
APP=gmall
hive=/opt/module/hive/bin/hive
hadoop=/opt/module/hadoop-2.7.2/bin/hadoop

# Use the date passed in as $1; if none was given, default to yesterday
if [ -n "$1" ] ;then
    do_date=$1
else
    do_date=`date -d "-1 day" +%F`
fi

echo "=== log date: $do_date ==="
sql="
insert into table "$APP".ads_goods_order_count_day
select
    '$do_date',
    sku_id,
    sum(order_count) order_total_count
from
    "$APP".dws_sale_detail_daycount
where
    dt='$do_date'
group by
    sku_id
order by
    order_total_count desc
limit 10;
"
$hive -e "$sql"
2) Make the script executable
[kgg@hadoop102 bin]$ chmod 777 ads_goods_order_count_day.sh
3) Run the script
[kgg@hadoop102 module]$ ads_goods_order_count_day.sh 2019-02-20
4) Check the result
hive (gmall)> select * from ads_goods_order_count_day;
5) Script execution time
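The date handling used by this load script (and the ones in later chapters) takes the date from $1 if given and otherwise defaults to yesterday via GNU date. A minimal sketch of that fallback in isolation, with pick_date as a hypothetical helper name, not part of the project:

```shell
# Date fallback used by the load scripts: take $1 if given, otherwise
# yesterday in YYYY-MM-DD form (GNU date -d).
pick_date() {
    if [ -n "$1" ]; then
        echo "$1"
    else
        date -d "-1 day" +%F
    fi
}
pick_date 2019-02-20    # prints the argument unchanged
pick_date               # prints yesterday's date
```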
Chapter 12 Requirement 9: Monthly Order Payment Rate
12.1 DWS Layer
Use the user behavior wide table (dws_user_action) as the DWS source.
12.2 ADS Layer
12.2.1 Table DDL
drop table if exists ads_order2pay_mn;
create external table ads_order2pay_mn (
    `dt` string COMMENT 'statistics date',
    `order_u_count` bigint COMMENT 'number of users who placed an order',
    `payment_u_count` bigint COMMENT 'number of users who paid',
    `order2payment_convert_ratio` decimal(10,2) COMMENT 'order-to-payment conversion ratio'
) COMMENT 'monthly order-to-payment conversion'
row format delimited fields terminated by '\t'
location '/warehouse/gmall/ads/ads_order2pay_mn/';
12.2.2 Loading Data
insert into table ads_order2pay_mn
select
'2019-02-10',
ua.order_count,
ua.payment_count,
cast(ua.payment_count/ua.order_count as decimal(10,2)) order2payment_convert_ratio
from
(
select
dt,
sum(if(order_count>0,1,0)) order_count,
sum(if(payment_count>0,1,0)) payment_count
from
dws_user_action
where
date_format(dt,'yyyy-MM')= date_format('2019-02-10','yyyy-MM')
group by dt
)ua;
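The subquery counts a user once toward order_count if they placed at least one order that day, and once toward payment_count if they paid at least once; the outer query divides the two and rounds to two decimal places. A sketch of the same conditional count and division over made-up per-user rows (user_id, order_count, payment_count); the data is illustrative only:

```shell
# Made-up per-user rows: user_id <TAB> order_count <TAB> payment_count.
# A user counts toward orders if order_count > 0 and toward payments if
# payment_count > 0; the ratio is payments / orders rounded to 2 places.
printf 'u1\t2\t1\nu2\t1\t0\nu3\t3\t2\nu4\t0\t0\n' |
awk -F'\t' '
    $2 > 0 {orders++}
    $3 > 0 {payments++}
    END {printf "%d %d %.2f\n", orders, payments, payments / orders}
'
```

Three users ordered and two paid, so the ratio comes out as 0.67, matching the `cast(... as decimal(10,2))` rounding in the HQL.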
12.2.3 Data Load Script
1) Create the script ads_order2pay_mn.sh in the /home/kgg/bin directory
[kgg@hadoop102 bin]$ vim ads_order2pay_mn.sh
Fill the script with the following content:
#!/bin/bash
# Define variables for easy modification
APP=gmall
hive=/opt/module/hive/bin/hive
hadoop=/opt/module/hadoop-2.7.2/bin/hadoop

# Use the date passed in as $1; if none was given, default to yesterday
if [ -n "$1" ] ;then
    do_date=$1
else
    do_date=`date -d "-1 day" +%F`
fi

echo "=== log date: $do_date ==="
sql="
insert into table "$APP".ads_order2pay_mn
select
    '$do_date',
    ua.order_count,
    ua.payment_count,
    cast(ua.payment_count/ua.order_count as decimal(10,2)) order2payment_convert_ratio
from
(
    select
        dt,
        sum(if(order_count>0,1,0)) order_count,
        sum(if(payment_count>0,1,0)) payment_count
    from
        "$APP".dws_user_action
    where
        date_format(dt,'yyyy-MM')=date_format('$do_date','yyyy-MM')
    group by dt
)ua;
"
$hive -e "$sql"
2) Make the script executable
[kgg@hadoop102 bin]$ chmod 777 ads_order2pay_mn.sh
3) Run the script
[kgg@hadoop102 module]$ ads_order2pay_mn.sh 2019-02-20
4) Check the result
hive (gmall)> select * from ads_order2pay_mn;
5) Script execution time
Chapter 13 Data Visualization
13.1 Creating Tables in MySQL
13.1.1 Daily Active Users
1) Create the ads_uv_count table in MySQL
DROP TABLE IF EXISTS `ads_uv_count`;
CREATE TABLE `ads_uv_count` (
    `dt` varchar(255) DEFAULT NULL COMMENT 'statistics date',
    `day_count` bigint(200) DEFAULT NULL COMMENT 'daily active user count',
    `wk_count` bigint(200) DEFAULT NULL COMMENT 'weekly active user count',
    `mn_count` bigint(200) DEFAULT NULL COMMENT 'monthly active user count',
    `is_weekend` varchar(200) CHARACTER SET utf8 COLLATE utf8_general_ci DEFAULT NULL COMMENT 'Y/N: whether the date is a weekend, used to pick the final weekly result',
    `is_monthend` varchar(200) CHARACTER SET utf8 COLLATE utf8_general_ci DEFAULT NULL COMMENT 'Y/N: whether the date is a month end, used to pick the final monthly result'
) ENGINE = InnoDB CHARACTER SET = utf8 COLLATE = utf8_general_ci COMMENT = 'daily active user counts' ROW_FORMAT = Dynamic;
2) Insert the following data into MySQL
INSERT INTO `ads_uv_count` VALUES ('2019-03-01 14:10:04', 20, 30, 100, 'N', 'N');
INSERT INTO `ads_uv_count` VALUES ('2019-03-02 14:12:48', 35, 50, 100, 'N', 'N');
INSERT INTO `ads_uv_count` VALUES ('2019-03-03 14:14:07', 25, 640, 3300, 'Y', 'Y');
INSERT INTO `ads_uv_count` VALUES ('2019-03-04 14:14:14', 10, 23, 123, 'N', 'N');
INSERT INTO `ads_uv_count` VALUES ('2019-03-05 14:14:21', 80, 121, 131, 'N', 'N');
INSERT INTO `ads_uv_count` VALUES ('2019-03-06 14:14:38', 30, 53, 453, 'N', 'N');
INSERT INTO `ads_uv_count` VALUES ('2019-03-07 14:33:27', 20, 31, 453, 'N', 'N');
INSERT INTO `ads_uv_count` VALUES ('2019-03-08 14:33:39', 10, 53, 453, 'N', 'N');
INSERT INTO `ads_uv_count` VALUES ('2019-03-09 14:33:47', 10, 34, 453, 'N', 'N');
INSERT INTO `ads_uv_count` VALUES ('2019-03-10 14:33:54', 10, 653, 8453, 'Y', 'Y');
INSERT INTO `ads_uv_count` VALUES ('2019-03-11 14:34:04', 100, 453, 1453, 'N', 'N');
INSERT INTO `ads_uv_count` VALUES ('2019-03-12 14:34:10', 101, 153, 134, 'N', 'N');
INSERT INTO `ads_uv_count` VALUES ('2019-03-13 14:34:16', 100, 286, 313, 'N', 'N');
INSERT INTO `ads_uv_count` VALUES ('2019-03-14 14:34:22', 100, 45, 453, 'N', 'N');
INSERT INTO `ads_uv_count` VALUES ('2019-03-15 14:34:29', 100, 345, 3453, 'N', 'N');
INSERT INTO `ads_uv_count` VALUES ('2019-03-16 14:34:35', 101, 453, 453, 'N', 'N');
INSERT INTO `ads_uv_count` VALUES ('2019-03-17 14:34:41', 100, 678, 9812, 'Y', 'Y');
INSERT INTO `ads_uv_count` VALUES ('2019-03-18 14:34:46', 100, 186, 193, 'N', 'N');
INSERT INTO `ads_uv_count` VALUES ('2019-03-19 14:34:53', 453, 686, 712, 'N', 'N');
INSERT INTO `ads_uv_count` VALUES ('2019-03-20 14:34:57', 452, 786, 823, 'N', 'N');
INSERT INTO `ads_uv_count` VALUES ('2019-03-21 14:35:02', 214, 58, 213, 'N', 'N');
INSERT INTO `ads_uv_count` VALUES ('2019-03-22 14:35:08', 76, 78, 95, 'N', 'N');
INSERT INTO `ads_uv_count` VALUES ('2019-03-23 14:35:13', 76, 658, 745, 'N', 'N');
INSERT INTO `ads_uv_count` VALUES ('2019-03-24 14:35:19', 76, 687, 9300, 'Y', 'Y');
INSERT INTO `ads_uv_count` VALUES ('2019-03-25 14:35:25', 76, 876, 923, 'N', 'N');
INSERT INTO `ads_uv_count` VALUES ('2019-03-26 14:35:30', 76, 456, 511, 'N', 'N');
INSERT INTO `ads_uv_count` VALUES ('2019-03-27 14:35:35', 76, 456, 623, 'N', 'N');
INSERT INTO `ads_uv_count` VALUES ('2019-03-28 14:35:41', 43, 753, 4000, 'N', 'N');
INSERT INTO `ads_uv_count` VALUES ('2019-03-29 14:35:47', 76, 876, 4545, 'N', 'N');
INSERT INTO `ads_uv_count` VALUES ('2019-03-30 14:35:57', 76, 354, 523, 'N', 'N');
INSERT INTO `ads_uv_count` VALUES ('2019-03-31 14:36:02', 43, 634, 6213, 'Y', 'Y');
13.1.2 Retention Rate
1) Create the ads_user_retention_day_rate table in MySQL
DROP TABLE IF EXISTS `ads_user_retention_day_rate`;
CREATE TABLE `ads_user_retention_day_rate` (
    `stat_date` varchar(255) DEFAULT NULL COMMENT 'statistics date',
    `create_date` varchar(255) DEFAULT NULL COMMENT 'date the devices were first seen',
    `retention_day` bigint(200) DEFAULT NULL COMMENT 'days retained as of the statistics date',
    `retention_count` bigint(200) DEFAULT NULL COMMENT 'number of retained devices',
    `new_mid_count` bigint(200) DEFAULT NULL COMMENT 'number of new devices on the create date',
    `retention_ratio` decimal(10, 2) DEFAULT NULL COMMENT 'retention rate'
) ENGINE = InnoDB CHARACTER SET = utf8 COLLATE = utf8_general_ci COMMENT = 'daily device retention' ROW_FORMAT = Dynamic;
2) Insert data into MySQL
INSERT INTO `ads_user_retention_day_rate` VALUES ('2019-03-09','2019-03-08', 1,88, 99, 0.78);
INSERT INTO `ads_user_retention_day_rate` VALUES ('2019-03-10','2019-03-08', 2,77, 88, 0.68);
INSERT INTO `ads_user_retention_day_rate` VALUES ('2019-03-11','2019-03-08', 3,66, 77, 0.58);
INSERT INTO `ads_user_retention_day_rate` VALUES ('2019-03-12','2019-03-08', 4,55, 66, 0.48);
INSERT INTO `ads_user_retention_day_rate` VALUES ('2019-03-13','2019-03-08', 5,44, 55, 0.38);
INSERT INTO `ads_user_retention_day_rate` VALUES ('2019-03-14','2019-03-08', 6,33, 44, 0.28);
INSERT INTO `ads_user_retention_day_rate` VALUES ('2019-03-10','2019-03-09', 1,77, 88, 0.56);
INSERT INTO `ads_user_retention_day_rate` VALUES ('2019-03-11','2019-03-09', 2,66, 77, 0.46);
INSERT INTO `ads_user_retention_day_rate` VALUES ('2019-03-12','2019-03-09', 3,55, 66, 0.36);
INSERT INTO `ads_user_retention_day_rate` VALUES ('2019-03-13','2019-03-09', 4,44, 55, 0.26);
INSERT INTO `ads_user_retention_day_rate` VALUES ('2019-03-14','2019-03-09', 5,33, 44, 0.16);
INSERT INTO `ads_user_retention_day_rate` VALUES ('2019-03-11','2019-03-10', 1,66, 77, 0.55);
INSERT INTO `ads_user_retention_day_rate` VALUES ('2019-03-12','2019-03-10', 2,55, 66, 0.45);
INSERT INTO `ads_user_retention_day_rate` VALUES ('2019-03-13','2019-03-10', 3,44, 55, 0.35);
INSERT INTO `ads_user_retention_day_rate` VALUES ('2019-03-14','2019-03-10', 4,33, 44, 0.25);
13.1.3 Funnel Analysis
1) Create the ads_user_action_convert_day table in MySQL
DROP TABLE IF EXISTS `ads_user_action_convert_day`;
CREATE TABLE `ads_user_action_convert_day` (
    `dt` varchar(200) DEFAULT NULL COMMENT 'statistics date',
    `total_visitor_m_count` bigint(20) DEFAULT NULL COMMENT 'total number of visitors',
    `order_u_count` bigint(20) DEFAULT NULL COMMENT 'number of users who placed an order',
    `visitor2order_convert_ratio` decimal(10, 2) DEFAULT NULL COMMENT 'visit-to-order conversion ratio',
    `payment_u_count` bigint(20) DEFAULT NULL COMMENT 'number of users who paid',
    `order2payment_convert_ratio` decimal(10, 2) DEFAULT NULL COMMENT 'order-to-payment conversion ratio'
) ENGINE = InnoDB CHARACTER SET = utf8 COLLATE = utf8_general_ci COMMENT = 'daily user behavior conversion rates' ROW_FORMAT = Dynamic;
2) Insert data into MySQL
INSERT INTO `ads_user_action_convert_day` VALUES ('2019-04-28 19:36:18', 10000, 3000, 0.25, 2000, 0.15);
13.1.4 GMV
1) Create the ads_gmv_sum_day table in MySQL
DROP TABLE IF EXISTS ads_gmv_sum_day;
CREATE TABLE ads_gmv_sum_day(
    `dt` varchar(200) DEFAULT NULL COMMENT 'statistics date',
    `gmv_count` bigint(20) DEFAULT NULL COMMENT 'number of GMV orders for the day',
    `gmv_amount` decimal(16, 2) DEFAULT NULL COMMENT 'total GMV order amount for the day',
    `gmv_payment` decimal(16, 2) DEFAULT NULL COMMENT 'payment amount for the day'
) ENGINE = InnoDB CHARACTER SET = utf8 COLLATE = utf8_general_ci COMMENT = 'daily GMV summary' ROW_FORMAT = Dynamic;
2) Insert data into MySQL
INSERT INTO `ads_gmv_sum_day` VALUES ('2019-03-01 22:51:37', 1000, 210000.00, 2000.00);
INSERT INTO `ads_gmv_sum_day` VALUES ('2019-05-08 22:52:32', 3434, 12413.00, 1.00);
INSERT INTO `ads_gmv_sum_day` VALUES ('2019-07-13 22:52:51', 1222, 324345.00, 1.00);
INSERT INTO `ads_gmv_sum_day` VALUES ('2019-09-13 22:53:08', 2344, 12312.00, 1.00);
13.1.5 Nationwide Product Sales
1) Create the ads_gmv_sum_province table in MySQL
DROP TABLE IF EXISTS `ads_gmv_sum_province`;
CREATE TABLE `ads_gmv_sum_province` (
`province` varchar(255) CHARACTER SET utf8 COLLATE utf8_general_ci DEFAULT NULL,
`gmv` bigint(255) DEFAULT NULL,
`remark` varchar(255) CHARACTER SET utf8 COLLATE utf8_general_ci DEFAULT NULL
) ENGINE = InnoDB CHARACTER SET = utf8 COLLATE = utf8_general_ci ROW_FORMAT = Dynamic;
2) Insert data into MySQL
INSERT INTO `ads_gmv_sum_province` VALUES ('北京', 2000, '');
INSERT INTO `ads_gmv_sum_province` VALUES ('辽宁', 30000, '沈阳:21.1%,大连:20%,鞍山:35%');
INSERT INTO `ads_gmv_sum_province` VALUES ('浙江', 8002, '杭州:20%,舟山:50%');
13.2 Viewing the Results in a Web Page
1) Run the spring-boot-echarts-master application
2) View the results in the web page at
http://localhost:8080/active
13.3 Sqoop Export Script
1) Write the Sqoop export script
Create the script sqoop_export.sh in the /home/kgg/bin directory
[kgg@hadoop102 bin]$ vim sqoop_export.sh
Fill the script with the following content:
#!/bin/bash
db_name=gmall
export_data() {
/opt/module/sqoop/bin/sqoop export \
--connect "jdbc:mysql://hadoop102:3306/${db_name}?useUnicode=true&characterEncoding=utf-8" \
--username root \
--password 000000 \
--table $1 \
--num-mappers 1 \
--export-dir /warehouse/$db_name/ads/$1 \
--input-fields-terminated-by "\t" \
--update-mode allowinsert \
--update-key $2 \
--input-null-string '\\N' \
--input-null-non-string '\\N'
}
case $1 in
"ads_uv_count")
export_data "ads_uv_count" "dt"
;;
"ads_user_action_convert_day")
export_data "ads_user_action_convert_day" "dt"
;;
"ads_gmv_sum_day")
export_data "ads_gmv_sum_day" "dt"
;;
"all")
export_data "ads_uv_count" "dt"
export_data "ads_user_action_convert_day" "dt"
export_data "ads_gmv_sum_day" "dt"
;;
esac
On exporting with update vs. insert:
--update-mode:
    updateonly: only updates existing rows; new rows are not inserted
    allowinsert: updates existing rows and also inserts new ones
--update-key: when updates are allowed, specifies which columns identify two rows as the same record, so that matching rows are updated instead of inserted again. Separate multiple columns with commas.
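A rough way to picture the two modes, using dt as the update key: with allowinsert, incoming rows overwrite existing rows that share a key and rows with new keys are appended; with updateonly, the new-key rows would simply be dropped. The sketch below mimics the allowinsert merge with made-up rows; it is an illustration of the semantics, not what sqoop executes:

```shell
# Made-up rows keyed by dt (dt <SPACE> value). "existing" stands in for rows
# already in MySQL, "incoming" for the exported data. Later lines overwrite
# earlier ones per key, so incoming wins on collisions and new keys are kept.
# Under updateonly, the 2019-02-12 row would be dropped instead.
existing='2019-02-10 100
2019-02-11 200'
incoming='2019-02-11 250
2019-02-12 300'
printf '%s\n%s\n' "$existing" "$incoming" |
awk '{row[$1] = $0} END {for (k in row) print row[k]}' |
sort
```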
--input-null-string and --input-null-non-string:
These tell Sqoop which marker in the exported data represents NULL, for string columns and non-string columns respectively.
Official documentation: http://sqoop.apache.org/docs/1.4.6/SqoopUserGuide.html
Sqoop will by default import NULL values as string null. Hive is however using string \N to denote NULL values and therefore predicates dealing with NULL (like IS NULL) will not work correctly. You should append parameters --null-string and --null-non-string in case of import job or --input-null-string and --input-null-non-string in case of an export job if you wish to properly preserve NULL values. Because sqoop is using those parameters in generated code, you need to properly escape value \N to \\N:
Hive stores NULL as the literal string "\N" at the storage layer, while MySQL stores NULL natively. To keep the two sides consistent, use the --input-null-string and --input-null-non-string parameters when exporting, and --null-string and --null-non-string when importing.
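This is also why the script quotes the markers as '\\N': the value is pasted into sqoop's generated Java code, where the backslash itself must be escaped. A shell-level sketch of what actually gets passed (not sqoop itself; printf %b stands in for Java's string unescaping):

```shell
# In single quotes, '\\N' is the three characters \ \ N on the command line;
# the unescaping done in sqoop's generated code then collapses \\ to \,
# leaving Hive's two-character NULL marker \N.
marker='\\N'
echo "${#marker}"          # length as passed on the command line
printf '%b\n' "$marker"    # after backslash-escape processing
```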
3) Run the Sqoop export script
[kgg@hadoop102 bin]$ chmod 777 sqoop_export.sh
[kgg@hadoop102 bin]$ sqoop_export.sh all
4) Check the results in MySQL
SELECT * FROM ads_uv_count;
SELECT * FROM ads_user_retention_day_rate;
SELECT * FROM ads_user_action_convert_day;
SELECT * FROM ads_gmv_sum_day;
SELECT * FROM ads_gmv_sum_province;