Data Warehouse: DWD Layer

1) Parse the user behavior data.

2) Remodel the business data using a dimensional model.

Order detail fact table (transactional fact table)

Table creation statement:

DROP TABLE IF EXISTS dwd_order_detail;
CREATE EXTERNAL TABLE dwd_order_detail (
    `id`                    STRING COMMENT 'order detail ID',
    `order_id`              STRING COMMENT 'order ID',
    `user_id`               STRING COMMENT 'user ID',
    `sku_id`                STRING COMMENT 'SKU ID',
    `province_id`           STRING COMMENT 'province ID',
    `activity_id`           STRING COMMENT 'activity ID',
    `activity_rule_id`      STRING COMMENT 'activity rule ID',
    `coupon_id`             STRING COMMENT 'coupon ID',
    `create_time`           STRING COMMENT 'creation time',
    `source_type`           STRING COMMENT 'source type',
    `source_id`             STRING COMMENT 'source ID',
    `sku_num`               BIGINT COMMENT 'item quantity',
    `original_amount`       DECIMAL(16,2) COMMENT 'original amount',
    `split_activity_amount` DECIMAL(16,2) COMMENT 'allocated activity discount',
    `split_coupon_amount`   DECIMAL(16,2) COMMENT 'allocated coupon discount',
    `split_final_amount`    DECIMAL(16,2) COMMENT 'allocated final amount'
) COMMENT 'order detail fact table'
PARTITIONED BY (`dt` STRING)
STORED AS PARQUET
LOCATION '/warehouse/gmall/dwd/dwd_order_detail/'
TBLPROPERTIES ("parquet.compression"="lzo");

Daily load:

insert overwrite table dwd_order_detail partition(dt='2021-09-16')
select
    od.id, od.order_id, oi.user_id, od.sku_id, oi.province_id,
    oda.activity_id, oda.activity_rule_id, odc.coupon_id, od.create_time,
    od.source_type, od.source_id, od.sku_num, od.order_price * od.sku_num,
    od.split_activity_amount, od.split_coupon_amount, od.split_final_amount
from
(
    select * from ods_order_detail where dt='2021-09-16'
) od
left join
(
    select id, user_id, province_id from ods_order_info where dt='2021-09-16'
) oi
on od.order_id = oi.id
left join
(
    select order_detail_id, activity_id, activity_rule_id from ods_order_detail_activity where dt='2021-09-16'
) oda
on od.id = oda.order_detail_id
left join
(
    select order_detail_id, coupon_id from ods_order_detail_coupon where dt='2021-09-16'
) odc
on od.id = odc.order_detail_id;
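The three left joins above enrich each order-detail row with optional activity and coupon attribution; detail rows with no matching activity or coupon are kept with NULLs rather than dropped. A minimal Python sketch of that left-join semantics (hypothetical rows, one match per key assumed, for illustration only):

```python
def left_join(left_rows, right_rows, left_key, right_key):
    """Minimal hash left join: every left row survives; unmatched
    right-side columns come back as None (Hive NULL)."""
    index = {r[right_key]: r for r in right_rows}
    joined = []
    for l in left_rows:
        r = index.get(l[left_key], {})
        joined.append({**l,
                       "activity_id": r.get("activity_id"),
                       "activity_rule_id": r.get("activity_rule_id")})
    return joined

details = [{"id": "d1", "order_id": "o1"}, {"id": "d2", "order_id": "o1"}]
activities = [{"order_detail_id": "d1", "activity_id": "a1", "activity_rule_id": "r1"}]
rows = left_join(details, activities, "id", "order_detail_id")
```

Row d1 picks up its activity attribution; d2, which joined no activity, keeps None in both columns.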

Payment fact table (accumulating snapshot fact table)

Table creation statement:

DROP TABLE IF EXISTS dwd_payment_info;
CREATE EXTERNAL TABLE dwd_payment_info (
    `id`             STRING COMMENT 'record ID',
    `order_id`       STRING COMMENT 'order ID',
    `user_id`        STRING COMMENT 'user ID',
    `province_id`    STRING COMMENT 'region ID',
    `trade_no`       STRING COMMENT 'trade number',
    `out_trade_no`   STRING COMMENT 'external trade number',
    `payment_type`   STRING COMMENT 'payment type',
    `payment_amount` DECIMAL(16,2) COMMENT 'payment amount',
    `payment_status` STRING COMMENT 'payment status',
    `create_time`    STRING COMMENT 'creation time',  -- time the third-party payment API was invoked
    `callback_time`  STRING COMMENT 'completion time' -- time payment completed, i.e. the success callback time
) COMMENT 'payment fact table'
PARTITIONED BY (`dt` STRING)
STORED AS PARQUET
LOCATION '/warehouse/gmall/dwd/dwd_payment_info/'
TBLPROPERTIES ("parquet.compression"="lzo");

Daily load:

insert overwrite table dwd_payment_info partition(dt)
select
    nvl(new.id, old.id),
    nvl(new.order_id, old.order_id),
    nvl(new.user_id, old.user_id),
    nvl(new.province_id, old.province_id),
    nvl(new.trade_no, old.trade_no),
    nvl(new.out_trade_no, old.out_trade_no),
    nvl(new.payment_type, old.payment_type),
    nvl(new.payment_amount, old.payment_amount),
    nvl(new.payment_status, old.payment_status),
    nvl(new.create_time, old.create_time),
    nvl(new.callback_time, old.callback_time),
    nvl(date_format(nvl(new.callback_time, old.callback_time), 'yyyy-MM-dd'), '9999-99-99')
from
(
    select id, order_id, user_id, province_id, trade_no, out_trade_no,
           payment_type, payment_amount, payment_status, create_time, callback_time
    from dwd_payment_info
    where dt = '9999-99-99'
) old
full outer join
(
    select pi.id, pi.out_trade_no, pi.order_id, pi.user_id, oi.province_id,
           pi.payment_type, pi.trade_no, pi.payment_amount, pi.payment_status,
           pi.create_time, pi.callback_time
    from
    (
        select * from ods_payment_info where dt='2021-09-16'
    ) pi
    left join
    (
        select id, province_id from ods_order_info where dt='2021-09-16'
    ) oi
    on pi.order_id = oi.id
) new
on old.id = new.id;
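The load pattern here is the standard accumulating-snapshot merge: full outer join the open ('9999-99-99') partition against today's increment, take `nvl(new.x, old.x)` for each column, and route each row to the callback-date partition once it completes. A Python sketch of that merge decision (hypothetical row dicts, for illustration only):

```python
def merge_accumulating_snapshot(old_rows, new_rows, open_flag="9999-99-99"):
    """Simulate the full outer join + nvl(new, old) merge used for
    accumulating snapshot fact tables. Rows are dicts keyed by 'id'."""
    old_by_id = {r["id"]: r for r in old_rows}
    new_by_id = {r["id"]: r for r in new_rows}
    merged = []
    for rid in old_by_id.keys() | new_by_id.keys():
        old, new = old_by_id.get(rid, {}), new_by_id.get(rid, {})
        # nvl(new.col, old.col): prefer today's value, fall back to the open partition
        row = {k: (new.get(k) if new.get(k) is not None else old.get(k))
               for k in old.keys() | new.keys()}
        # partition key: callback date once the payment completed, else keep it open
        cb = row.get("callback_time")
        row["dt"] = cb[:10] if cb else open_flag
        merged.append(row)
    return merged

old = [{"id": 1, "order_id": "o1", "callback_time": None}]
new = [{"id": 1, "order_id": "o1", "callback_time": "2021-09-16 10:00:00"},
       {"id": 2, "order_id": "o2", "callback_time": None}]
rows = merge_accumulating_snapshot(old, new)
```

Record 1, previously open, completes and moves to the 2021-09-16 partition; record 2, still unpaid, stays in the open partition.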

Order fact table (accumulating snapshot fact table)

Table creation statement:

DROP TABLE IF EXISTS dwd_order_info;
CREATE EXTERNAL TABLE dwd_order_info (
    `id`                     STRING COMMENT 'order ID',
    `order_status`           STRING COMMENT 'order status',
    `user_id`                STRING COMMENT 'user ID',
    `province_id`            STRING COMMENT 'region ID',
    `payment_way`            STRING COMMENT 'payment method',
    `delivery_address`       STRING COMMENT 'delivery address',
    `out_trade_no`           STRING COMMENT 'external trade number',
    `tracking_no`            STRING COMMENT 'tracking number',
    `create_time`            STRING COMMENT 'creation time (unpaid status)',
    `payment_time`           STRING COMMENT 'payment time (paid status)',
    `cancel_time`            STRING COMMENT 'cancellation time (cancelled status)',
    `finish_time`            STRING COMMENT 'completion time (completed status)',
    `refund_time`            STRING COMMENT 'refund time (refunding status)',
    `refund_finish_time`     STRING COMMENT 'refund completion time (refund-completed status)',
    `expire_time`            STRING COMMENT 'expiration time',
    `feight_fee`             DECIMAL(16,2) COMMENT 'shipping fee',
    `feight_fee_reduce`      DECIMAL(16,2) COMMENT 'shipping fee discount',
    `activity_reduce_amount` DECIMAL(16,2) COMMENT 'activity discount',
    `coupon_reduce_amount`   DECIMAL(16,2) COMMENT 'coupon discount',
    `original_amount`        DECIMAL(16,2) COMMENT 'original order amount',
    `final_amount`           DECIMAL(16,2) COMMENT 'final order amount'
) COMMENT 'order fact table'
PARTITIONED BY (`dt` STRING)
STORED AS PARQUET
LOCATION '/warehouse/gmall/dwd/dwd_order_info/'
TBLPROPERTIES ("parquet.compression"="lzo");

Daily load:

insert overwrite table dwd_order_info partition(dt)
select
    nvl(new.id, old.id),
    nvl(new.order_status, old.order_status),
    nvl(new.user_id, old.user_id),
    nvl(new.province_id, old.province_id),
    nvl(new.payment_way, old.payment_way),
    nvl(new.delivery_address, old.delivery_address),
    nvl(new.out_trade_no, old.out_trade_no),
    nvl(new.tracking_no, old.tracking_no),
    nvl(new.create_time, old.create_time),
    nvl(new.payment_time, old.payment_time),
    nvl(new.cancel_time, old.cancel_time),
    nvl(new.finish_time, old.finish_time),
    nvl(new.refund_time, old.refund_time),
    nvl(new.refund_finish_time, old.refund_finish_time),
    nvl(new.expire_time, old.expire_time),
    nvl(new.feight_fee, old.feight_fee),
    nvl(new.feight_fee_reduce, old.feight_fee_reduce),
    nvl(new.activity_reduce_amount, old.activity_reduce_amount),
    nvl(new.coupon_reduce_amount, old.coupon_reduce_amount),
    nvl(new.original_amount, old.original_amount),
    nvl(new.final_amount, old.final_amount),
    case
        when new.cancel_time is not null then date_format(new.cancel_time, 'yyyy-MM-dd')
        when new.finish_time is not null and date_add(date_format(new.finish_time, 'yyyy-MM-dd'), 7) = '2021-09-16' and new.refund_time is null then '2021-09-16'
        when new.refund_finish_time is not null then date_format(new.refund_finish_time, 'yyyy-MM-dd')
        when new.expire_time is not null then date_format(new.expire_time, 'yyyy-MM-dd')
        else '9999-99-99'
    end
from
(
    select
        id, order_status, user_id, province_id, payment_way, delivery_address,
        out_trade_no, tracking_no, create_time, payment_time, cancel_time,
        finish_time, refund_time, refund_finish_time, expire_time, feight_fee,
        feight_fee_reduce, activity_reduce_amount, coupon_reduce_amount,
        original_amount, final_amount
    from dwd_order_info
    where dt='9999-99-99'
) old
full outer join
(
    select
        oi.id, oi.order_status, oi.user_id, oi.province_id, oi.payment_way,
        oi.delivery_address, oi.out_trade_no, oi.tracking_no, oi.create_time,
        times.ts['1002'] payment_time,
        times.ts['1003'] cancel_time,
        times.ts['1004'] finish_time,
        times.ts['1005'] refund_time,
        times.ts['1006'] refund_finish_time,
        oi.expire_time, feight_fee, feight_fee_reduce, activity_reduce_amount,
        coupon_reduce_amount, original_amount, final_amount
    from
    (
        select * from ods_order_info where dt='2021-09-16'
    ) oi
    left join
    (
        select
            order_id,
            str_to_map(concat_ws(',', collect_set(concat(order_status, '=', operate_time))), ',', '=') ts
        from ods_order_status_log
        where dt='2021-09-16'
        group by order_id
    ) times
    on oi.id = times.order_id
) new
on old.id = new.id;
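The `collect_set` + `concat_ws` + `str_to_map` chain pivots the per-order status-log rows into a single `order_status -> operate_time` map per order, which the outer query then indexes by status code ('1002', '1003', ...). A minimal Python sketch of the same pivot (hypothetical sample rows):

```python
def pivot_status_log(rows):
    """Simulate collect_set + concat_ws + str_to_map: turn
    (order_id, order_status, operate_time) rows into one
    {order_status: operate_time} map per order."""
    maps = {}
    for order_id, status, operate_time in rows:
        # concat(order_status, '=', operate_time) collected per order,
        # then str_to_map splits entries on ',' and key/value on '='
        entries = maps.setdefault(order_id, {})
        entries[status] = operate_time
    return maps

log = [("o1", "1001", "2021-09-16 09:00:00"),
       ("o1", "1002", "2021-09-16 09:05:00"),
       ("o2", "1001", "2021-09-16 10:00:00")]
ts = pivot_status_log(log)
```

Looking up `ts["o1"]["1002"]` mirrors `times.ts['1002']` in the SQL; a status an order never reached is simply absent from its map, which surfaces as NULL in the join.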

DWD-layer business data daily load script

Writing and running the script

(1) Create the script ods_to_dwd_db.sh in the /home/muzili/bin directory

[muzili@hadoop102 bin]$ vim ods_to_dwd_db.sh

Fill in the script with the following content:

#!/bin/bash

APP=gmall

# If a date argument is supplied, use it; otherwise default to yesterday
if [ -n "$2" ] ;then
    do_date=$2
else
    do_date=`date -d "-1 day" +%F`
fi

# If, on a given day, every outstanding record of an accumulating snapshot fact
# table completes, the dt=9999-99-99 partition is never overwritten that day,
# which would leave duplicate data behind. This function checks the
# last-modified time of the 9999-99-99 partition; if it was not overwritten
# today, it is cleared manually.
clear_data(){
    current_date=`date +%F`
    current_date_timestamp=`date -d "$current_date" +%s`
    last_modified_date=`hadoop fs -ls /warehouse/gmall/dwd/$1 | grep '9999-99-99' | awk '{print $6}'`
    last_modified_date_timestamp=`date -d "$last_modified_date" +%s`
    if [[ $last_modified_date_timestamp -lt $current_date_timestamp ]]; then
        echo "clear table $1 partition(dt=9999-99-99)"
        hadoop fs -rm -r -f /warehouse/gmall/dwd/$1/dt=9999-99-99/*
    fi
}

dwd_order_info="
set hive.input.format=org.apache.hadoop.hive.ql.io.HiveInputFormat;
set hive.exec.dynamic.partition.mode=nonstrict;
insert overwrite table ${APP}.dwd_order_info partition(dt)
select
    nvl(new.id,old.id),
    nvl(new.order_status,old.order_status),
    nvl(new.user_id,old.user_id),
    nvl(new.province_id,old.province_id),
    nvl(new.payment_way,old.payment_way),
    nvl(new.delivery_address,old.delivery_address),
    nvl(new.out_trade_no,old.out_trade_no),
    nvl(new.tracking_no,old.tracking_no),
    nvl(new.create_time,old.create_time),
    nvl(new.payment_time,old.payment_time),
    nvl(new.cancel_time,old.cancel_time),
    nvl(new.finish_time,old.finish_time),
    nvl(new.refund_time,old.refund_time),
    nvl(new.refund_finish_time,old.refund_finish_time),
    nvl(new.expire_time,old.expire_time),
    nvl(new.feight_fee,old.feight_fee),
    nvl(new.feight_fee_reduce,old.feight_fee_reduce),
    nvl(new.activity_reduce_amount,old.activity_reduce_amount),
    nvl(new.coupon_reduce_amount,old.coupon_reduce_amount),
    nvl(new.original_amount,old.original_amount),
    nvl(new.final_amount,old.final_amount),
    case
        when new.cancel_time is not null then date_format(new.cancel_time,'yyyy-MM-dd')
        when new.finish_time is not null and date_add(date_format(new.finish_time,'yyyy-MM-dd'),7)='$do_date' and new.refund_time is null then '$do_date'
        when new.refund_finish_time is not null then date_format(new.refund_finish_time,'yyyy-MM-dd')
        when new.expire_time is not null then date_format(new.expire_time,'yyyy-MM-dd')
        else '9999-99-99'
    end
from
(
    select
        id, order_status, user_id, province_id, payment_way, delivery_address,
        out_trade_no, tracking_no, create_time, payment_time, cancel_time,
        finish_time, refund_time, refund_finish_time, expire_time, feight_fee,
        feight_fee_reduce, activity_reduce_amount, coupon_reduce_amount,
        original_amount, final_amount
    from ${APP}.dwd_order_info
    where dt='9999-99-99'
) old
full outer join
(
    select
        oi.id, oi.order_status, oi.user_id, oi.province_id, oi.payment_way,
        oi.delivery_address, oi.out_trade_no, oi.tracking_no, oi.create_time,
        times.ts['1002'] payment_time,
        times.ts['1003'] cancel_time,
        times.ts['1004'] finish_time,
        times.ts['1005'] refund_time,
        times.ts['1006'] refund_finish_time,
        oi.expire_time, feight_fee, feight_fee_reduce, activity_reduce_amount,
        coupon_reduce_amount, original_amount, final_amount
    from
    (
        select * from ${APP}.ods_order_info where dt='$do_date'
    ) oi
    left join
    (
        select
            order_id,
            str_to_map(concat_ws(',',collect_set(concat(order_status,'=',operate_time))),',','=') ts
        from ${APP}.ods_order_status_log
        where dt='$do_date'
        group by order_id
    ) times
    on oi.id=times.order_id
) new
on old.id=new.id;"

dwd_order_detail="
set hive.input.format=org.apache.hadoop.hive.ql.io.HiveInputFormat;
insert overwrite table ${APP}.dwd_order_detail partition(dt='$do_date')
select
    od.id, od.order_id, oi.user_id, od.sku_id, oi.province_id,
    oda.activity_id, oda.activity_rule_id, odc.coupon_id, od.create_time,
    od.source_type, od.source_id, od.sku_num, od.order_price*od.sku_num,
    od.split_activity_amount, od.split_coupon_amount, od.split_final_amount
from
(
    select * from ${APP}.ods_order_detail where dt='$do_date'
) od
left join
(
    select id, user_id, province_id from ${APP}.ods_order_info where dt='$do_date'
) oi
on od.order_id=oi.id
left join
(
    select order_detail_id, activity_id, activity_rule_id from ${APP}.ods_order_detail_activity where dt='$do_date'
) oda
on od.id=oda.order_detail_id
left join
(
    select order_detail_id, coupon_id from ${APP}.ods_order_detail_coupon where dt='$do_date'
) odc
on od.id=odc.order_detail_id;"

dwd_payment_info="
set hive.input.format=org.apache.hadoop.hive.ql.io.HiveInputFormat;
set hive.exec.dynamic.partition.mode=nonstrict;
insert overwrite table ${APP}.dwd_payment_info partition(dt)
select
    nvl(new.id,old.id),
    nvl(new.order_id,old.order_id),
    nvl(new.user_id,old.user_id),
    nvl(new.province_id,old.province_id),
    nvl(new.trade_no,old.trade_no),
    nvl(new.out_trade_no,old.out_trade_no),
    nvl(new.payment_type,old.payment_type),
    nvl(new.payment_amount,old.payment_amount),
    nvl(new.payment_status,old.payment_status),
    nvl(new.create_time,old.create_time),
    nvl(new.callback_time,old.callback_time),
    nvl(date_format(nvl(new.callback_time,old.callback_time),'yyyy-MM-dd'),'9999-99-99')
from
(
    select id, order_id, user_id, province_id, trade_no, out_trade_no,
           payment_type, payment_amount, payment_status, create_time, callback_time
    from ${APP}.dwd_payment_info
    where dt = '9999-99-99'
) old
full outer join
(
    select pi.id, pi.out_trade_no, pi.order_id, pi.user_id, oi.province_id,
           pi.payment_type, pi.trade_no, pi.payment_amount, pi.payment_status,
           pi.create_time, pi.callback_time
    from
    (
        select * from ${APP}.ods_payment_info where dt='$do_date'
    ) pi
    left join
    (
        select id, province_id from ${APP}.ods_order_info where dt='$do_date'
    ) oi
    on pi.order_id=oi.id
) new
on old.id=new.id;"

dwd_cart_info="
set hive.input.format=org.apache.hadoop.hive.ql.io.HiveInputFormat;
insert overwrite table ${APP}.dwd_cart_info partition(dt='$do_date')
select
    id, user_id, sku_id, source_type, source_id, cart_price,
    is_ordered, create_time, operate_time, order_time, sku_num
from ${APP}.ods_cart_info
where dt='$do_date';"

dwd_comment_info="
set hive.input.format=org.apache.hadoop.hive.ql.io.HiveInputFormat;
insert overwrite table ${APP}.dwd_comment_info partition(dt='$do_date')
select id, user_id, sku_id, spu_id, order_id, appraise, create_time
from ${APP}.ods_comment_info
where dt='$do_date';"

dwd_favor_info="
set hive.input.format=org.apache.hadoop.hive.ql.io.HiveInputFormat;
insert overwrite table ${APP}.dwd_favor_info partition(dt='$do_date')
select id, user_id, sku_id, spu_id, is_cancel, create_time, cancel_time
from ${APP}.ods_favor_info
where dt='$do_date';"

dwd_coupon_use="
set hive.input.format=org.apache.hadoop.hive.ql.io.HiveInputFormat;
set hive.exec.dynamic.partition.mode=nonstrict;
insert overwrite table ${APP}.dwd_coupon_use partition(dt)
select
    nvl(new.id,old.id),
    nvl(new.coupon_id,old.coupon_id),
    nvl(new.user_id,old.user_id),
    nvl(new.order_id,old.order_id),
    nvl(new.coupon_status,old.coupon_status),
    nvl(new.get_time,old.get_time),
    nvl(new.using_time,old.using_time),
    nvl(new.used_time,old.used_time),
    nvl(new.expire_time,old.expire_time),
    coalesce(date_format(nvl(new.used_time,old.used_time),'yyyy-MM-dd'),
             date_format(nvl(new.expire_time,old.expire_time),'yyyy-MM-dd'),
             '9999-99-99')
from
(
    select id, coupon_id, user_id, order_id, coupon_status,
           get_time, using_time, used_time, expire_time
    from ${APP}.dwd_coupon_use
    where dt='9999-99-99'
) old
full outer join
(
    select id, coupon_id, user_id, order_id, coupon_status,
           get_time, using_time, used_time, expire_time
    from ${APP}.ods_coupon_use
    where dt='$do_date'
) new
on old.id=new.id;"

dwd_order_refund_info="
set hive.input.format=org.apache.hadoop.hive.ql.io.HiveInputFormat;
insert overwrite table ${APP}.dwd_order_refund_info partition(dt='$do_date')
select
    ri.id, ri.user_id, ri.order_id, ri.sku_id, oi.province_id,
    ri.refund_type, ri.refund_num, ri.refund_amount, ri.refund_reason_type, ri.create_time
from
(
    select * from ${APP}.ods_order_refund_info where dt='$do_date'
) ri
left join
(
    select id, province_id from ${APP}.ods_order_info where dt='$do_date'
) oi
on ri.order_id=oi.id;"

dwd_refund_payment="
set hive.input.format=org.apache.hadoop.hive.ql.io.HiveInputFormat;
set hive.exec.dynamic.partition.mode=nonstrict;
insert overwrite table ${APP}.dwd_refund_payment partition(dt)
select
    nvl(new.id,old.id),
    nvl(new.user_id,old.user_id),
    nvl(new.order_id,old.order_id),
    nvl(new.sku_id,old.sku_id),
    nvl(new.province_id,old.province_id),
    nvl(new.trade_no,old.trade_no),
    nvl(new.out_trade_no,old.out_trade_no),
    nvl(new.payment_type,old.payment_type),
    nvl(new.refund_amount,old.refund_amount),
    nvl(new.refund_status,old.refund_status),
    nvl(new.create_time,old.create_time),
    nvl(new.callback_time,old.callback_time),
    nvl(date_format(nvl(new.callback_time,old.callback_time),'yyyy-MM-dd'),'9999-99-99')
from
(
    select id, user_id, order_id, sku_id, province_id, trade_no, out_trade_no,
           payment_type, refund_amount, refund_status, create_time, callback_time
    from ${APP}.dwd_refund_payment
    where dt='9999-99-99'
) old
full outer join
(
    select rp.id, user_id, order_id, sku_id, province_id, trade_no, out_trade_no,
           payment_type, refund_amount, refund_status, create_time, callback_time
    from
    (
        select id, out_trade_no, order_id, sku_id, payment_type, trade_no,
               refund_amount, refund_status, create_time, callback_time
        from ${APP}.ods_refund_payment
        where dt='$do_date'
    ) rp
    left join
    (
        select id, user_id, province_id
        from ${APP}.ods_order_info
        where dt='$do_date'
    ) oi
    on rp.order_id=oi.id
) new
on old.id=new.id;"

case $1 in
    dwd_order_info )
        hive -e "$dwd_order_info"
        clear_data dwd_order_info
    ;;
    dwd_order_detail )
        hive -e "$dwd_order_detail"
    ;;
    dwd_payment_info )
        hive -e "$dwd_payment_info"
        clear_data dwd_payment_info
    ;;
    dwd_cart_info )
        hive -e "$dwd_cart_info"
    ;;
    dwd_comment_info )
        hive -e "$dwd_comment_info"
    ;;
    dwd_favor_info )
        hive -e "$dwd_favor_info"
    ;;
    dwd_coupon_use )
        hive -e "$dwd_coupon_use"
        clear_data dwd_coupon_use
    ;;
    dwd_order_refund_info )
        hive -e "$dwd_order_refund_info"
    ;;
    dwd_refund_payment )
        hive -e "$dwd_refund_payment"
        clear_data dwd_refund_payment
    ;;
    all )
        hive -e "$dwd_order_info$dwd_order_detail$dwd_payment_info$dwd_cart_info$dwd_comment_info$dwd_favor_info$dwd_coupon_use$dwd_order_refund_info$dwd_refund_payment"
        clear_data dwd_order_info
        clear_data dwd_payment_info
        clear_data dwd_coupon_use
        clear_data dwd_refund_payment
    ;;
esac
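The clear_data guard boils down to one comparison: if the 9999-99-99 partition's last-modified date is earlier than today, today's insert overwrite never touched it, so the stale copy must be removed. A Python sketch of just that decision, with the `hadoop fs -ls` output replaced by a hypothetical sample line:

```python
from datetime import date

def should_clear_open_partition(ls_line, today):
    """Decide whether the dt=9999-99-99 partition is stale.
    ls_line mimics one line of `hadoop fs -ls` output; column 6 (index 5)
    holds the modification date, as in the awk '{print $6}' in the script."""
    fields = ls_line.split()
    last_modified = date.fromisoformat(fields[5])
    # Stale if the partition was last written before today,
    # i.e. today's insert overwrite did not rewrite it
    return last_modified < today

line = ("drwxr-xr-x   - muzili supergroup          0 "
        "2021-09-15 01:30 /warehouse/gmall/dwd/dwd_order_info/dt=9999-99-99")
stale = should_clear_open_partition(line, date(2021, 9, 16))
```

With today as 2021-09-16 and a partition last written on 2021-09-15, the check flags the partition for clearing, matching the `-lt` timestamp comparison in the shell function.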

(2) Grant execute permission on the script

[muzili@hadoop102 bin]$ chmod 777 ods_to_dwd_db.sh

(3) Run the script

[muzili@hadoop102 bin]$ ods_to_dwd_db.sh all 2021-09-16

(4) Check whether the data loaded successfully

Appendix:

-- ap_raw_detail
create table if not exists dwd.ap_raw_detail(
    serverreceivetimestamp bigint comment 'timestamp when the data was received',
    gbno        string comment 'national standard code',
    sn          string comment 'device SN',
    deveice_mac string comment 'device MAC',
    `timestamp` bigint,
    mac         string comment 'collected MAC',
    channel     bigint,
    signal      bigint
) comment 'AP data (real MACs)'
partitioned by(
    filedate string comment 'date the file belongs to'
) stored as parquet;

-- query
select serverreceivetimestamp, gbno, sn, deveice_mac,
       mac.timestamp, mac.mac, mac.channel, mac.signal
from ods.ap_raw lateral view explode(macs) tmp as mac
where filedate='20210905' limit 10;

-- load data
insert overwrite table dwd.ap_raw_detail PARTITION (filedate='20210905')
select serverreceivetimestamp, gbno, sn, deveice_mac,
       mac_info.timestamp, mac_info.mac, mac_info.channel, mac_info.signal
from ods.ap_raw lateral view explode(macs) tmp_table as mac_info
where filedate='20210905';
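`lateral view explode(macs)` emits one output row per element of the nested macs array, repeating the parent row's columns alongside each exploded struct. A rough Python equivalent (record shape is made up for illustration):

```python
def explode_macs(records):
    """Mimic `lateral view explode(macs)`: one output row per element of
    the nested macs array, parent columns repeated on every row."""
    out = []
    for rec in records:
        for mac_info in rec["macs"]:
            out.append({
                "sn": rec["sn"],
                "deveice_mac": rec["deveice_mac"],
                # fields pulled out of the exploded struct
                "mac": mac_info["mac"],
                "channel": mac_info["channel"],
            })
    return out

raw = [{"sn": "AP-01", "deveice_mac": "AA:BB", "macs": [
            {"mac": "11:22", "channel": 1},
            {"mac": "33:44", "channel": 6}]}]
rows = explode_macs(raw)
```

One raw record with two collected MACs becomes two flat rows, which is exactly what the insert into dwd.ap_raw_detail materializes.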
