Data Simulation
1. Generating log data
1) Prepare the data. Download: https://pan.baidu.com/s/1aaf2-Fuchi3v-0cJqLw4EQ (extraction code: sbim)
2) Create the applog directory under /opt/module/ and upload application.properties, application.yml, gmall2020-mock-log-2021-01-22.jar, path.json and logback.xml to it

3) Run the jar (generates the data for 2021-06-08: the first-day data)
java -jar gmall2020-mock-log-2021-01-22.jar
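The generator reads the business date from the mock.date property in /opt/module/applog/application.properties (the same property that makeLogData.sh rewrites further below). A minimal sketch for setting it by hand before the first run, assuming the property is already present in the file:
sed -i 's/^mock.date=.*/mock.date=2021-06-08/' /opt/module/applog/application.properties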
4) Check the generated data (copy one of the timestamps to confirm the generation date)
tail -f log/app.2021-06-08.log
date -d @1623060886
5) Distribute the applog directory
xsync /opt/module/applog/
6) Collect the data into HDFS with Flume
vim /opt/module/flume/job/file_to_kafka.conf
#1. Define the agent, source and channel names
a1.sources = r1
a1.channels = c1

#2. Describe the source
a1.sources.r1.type = TAILDIR
#Position file for resuming from the last read offset
a1.sources.r1.positionFile = /opt/module/flume/position.json
#Define the monitored file group
a1.sources.r1.filegroups = f1
#Specify the files monitored by the file group
a1.sources.r1.filegroups.f1 = /opt/module/applog/log/app.*
#Number of events the source collects per batch
a1.sources.r1.batchSize = 100

#3. Describe the interceptor [filters out non-JSON data]
#Interceptor name
a1.sources.r1.interceptors = i1
#Interceptor type
a1.sources.r1.interceptors.i1.type = com.yuange.flume.interceptor.ETLInterceptor$Builder

#4. Describe the channel
a1.channels.c1.type = org.apache.flume.channel.kafka.KafkaChannel
#Kafka cluster address
a1.channels.c1.kafka.bootstrap.servers = hadoop102:9092,hadoop103:9092,hadoop104:9092
#Topic to write to
a1.channels.c1.kafka.topic = applog
#Whether data is written to Kafka in Flume Event format
a1.channels.c1.parseAsFlumeEvent = false

#5. Bind source -> channel
a1.sources.r1.channels = c1
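The Kafka channel above writes to the applog topic. If topic auto-creation is disabled on the cluster, the topic can be created ahead of time; a sketch (the partition and replication counts are illustrative, and older Kafka releases take --zookeeper instead of --bootstrap-server):
kafka-topics.sh --bootstrap-server hadoop102:9092 --create --topic applog --partitions 3 --replication-factor 2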
vim /opt/module/flume/job/kafka-to-hdfs.conf
#1. Define the agent, channel, source and sink names
a1.sources = r1
a1.channels = c1
a1.sinks = k1

##2. Describe the source
a1.sources.r1.type = org.apache.flume.source.kafka.KafkaSource
##Kafka cluster address
a1.sources.r1.kafka.bootstrap.servers = hadoop102:9092,hadoop103:9092
##Topic to read from
a1.sources.r1.kafka.topics = applog
##Consumer group id
a1.sources.r1.kafka.consumer.group.id = g1
##Number of messages the source pulls from Kafka per batch: batchSize <= transaction capacity <= channel capacity
a1.sources.r1.batchSize = 100
##Where the consumer group starts reading the topic the first time it consumes it
a1.sources.r1.kafka.consumer.auto.offset.reset = earliest

##3. Describe the interceptor
a1.sources.r1.interceptors = i1
a1.sources.r1.interceptors.i1.type = com.yuange.flume.interceptor.MyTimestampInterceptor$Builder

##4. Describe the channel
a1.channels.c1.type = file
##Local disk directory where the channel data is stored
a1.channels.c1.dataDirs = /opt/module/flume/datas
##Directory where the channel's event pointers (checkpoint) are stored
a1.channels.c1.checkpointDir = /opt/module/flume/checkpoint
##Checkpoint persistence interval
a1.channels.c1.checkpointInterval = 5000
##Channel capacity
a1.channels.c1.capacity = 1000000

##5. Describe the sink
a1.sinks.k1.type = hdfs
##HDFS directory where the data is stored
a1.sinks.k1.hdfs.path = hdfs://hadoop102:8020/origin_data/gmall/log/topic_log/%Y-%m-%d
##File prefix
a1.sinks.k1.hdfs.filePrefix = log-
##Roll a new file after this many seconds
a1.sinks.k1.hdfs.rollInterval = 30
##Roll a new file once it reaches this size (bytes)
a1.sinks.k1.hdfs.rollSize = 132120576
##Roll a new file after this many events <usually disabled>
a1.sinks.k1.hdfs.rollCount = 0
##Number of events the sink pulls from the channel per batch
a1.sinks.k1.hdfs.batchSize = 100
##Compression codec used when writing to HDFS
a1.sinks.k1.hdfs.codeC = lzop
##File format: SequenceFile - sequence file, DataStream - plain text, CompressedStream - compressed file
a1.sinks.k1.hdfs.fileType = CompressedStream
#a1.sinks.k1.hdfs.fileType = DataStream

##6. Bind source -> channel -> sink
a1.sources.r1.channels = c1
a1.sinks.k1.channel = c1
vim /home/atguigu/bin/flume-first.sh
#! /bin/bash
#1. Check that an argument was passed in
if [ $# -lt 1 ]
then
    echo "An argument is required..."
    exit
fi
#2. Dispatch on the argument
case $1 in
"start")
    for host in hadoop102 hadoop103
    do
        echo "-------------------- $host: starting collection Flume ---------------------------"
        ssh $host "nohup /opt/module/flume/bin/flume-ng agent -n a1 -c /opt/module/flume/conf/ -f /opt/module/flume/job/file_to_kafka.conf >/opt/module/flume/logs.txt 2>&1 &"
    done
;;
"stop")
    for host in hadoop102 hadoop103
    do
        echo "-------------------- $host: stopping collection Flume -----------------------"
        ssh $host "ps -ef |grep file_to_kafka |grep -v grep |awk '{print \$2}'| xargs -n1 kill"
    done
;;
*)
    echo "Invalid argument....."
;;
esac
vim /home/atguigu/bin/flume-second.sh
#! /bin/bash
if [ $# -lt 1 ]
then
    echo "Wrong number of arguments!!!"
    exit
fi
case $1 in
"start")
    for i in hadoop104
    do
        echo " -------- starting consumer Flume on $i -------"
        ssh $i "nohup /opt/module/flume/bin/flume-ng agent --conf /opt/module/flume/conf/ --conf-file /opt/module/flume/job/kafka-to-hdfs.conf --name a1 >/opt/module/flume/logs.txt 2>&1 &"
    done
;;
"stop")
    for i in hadoop104
    do
        echo " -------- stopping consumer Flume on $i -------"
        ssh $i "ps -ef | grep kafka-to-hdfs | grep -v grep |awk '{print \$2}' | xargs -n1 kill"
    done
;;
*)
    echo "Invalid argument format!!!"
    exit
;;
esac
7) Start the Flume collection scripts (distribute them to the other hosts first)
flume-first.sh start
flume-second.sh start
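To confirm that both agents are actually running, the processes can be checked the same way the stop branches of the scripts find them (a sketch):
ssh hadoop102 "ps -ef | grep file_to_kafka | grep -v grep"
ssh hadoop104 "ps -ef | grep kafka-to-hdfs | grep -v grep"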
8) Write a script that generates data with one command
(1) Create lg.sh in the bin directory under the home directory
vim lg.sh
#!/bin/bash
# When invoked, runs the log generator on hadoop102 and hadoop103
for i in hadoop102 hadoop103
do
    ssh $i "cd /opt/module/applog ; java -jar gmall2020-mock-log-2021-01-22.jar > /dev/null 2>&1 &"
done
(2) Create makeLogData.sh in the bin directory under the home directory
vim makeLogData.sh
#!/bin/bash
# Usage: makeLogData.sh <start date> <days to generate, default 10>
if [ -n "$1" ]
then
    startdate=$1
else
    echo "Please pass in a start date!"
    exit
fi
if [ -n "$2" ] ;then
    days=$2
else
    days=10
fi

function createLogConfs() {
    logContent="logging.config=./logback.xml\n
mock.date=$1\n
mock.type=log\n
mock.startup.count=200\n
mock.max.mid=500000\n
mock.max.uid=500\n
mock.max.sku-id=35\n
mock.page.during-time-ms=20000\n
mock.error.rate=3\n
mock.log.sleep=10\n
mock.detail.source-type-rate=40:25:15:20\n
mock.if_get_coupon_rate=75\n
mock.max.coupon-id=3\n
mock.search.keyword=圖書,小米,iphone11,電視,口紅,ps5,蘋果手機,小米盒子\n"
    echo -e $logContent> /opt/module/applog/application.properties
    xsync /opt/module/applog/application.properties
}

for((i=0;i<$days;i++))
do
    do_date=`date -d "$startdate $i day" +'%F'`
    echo "------------------ generating data for date: $do_date --------------------"
    createLogConfs $do_date
    lg.sh
    sleep 5s
done
echo "All done!"
9) Distribute the scripts and make them executable
chmod +x lg.sh makeLogData.sh
xsync lg.sh makeLogData.sh
10) Run the data-generation script (simulates 2021-06-09 through 2021-06-10: daily data)
makeLogData.sh 2021-06-09 2
11) Verify the data (the log file is named after the actual system date, while the timestamps inside reflect mock.date)

tail -f log/app.2021-06-08.log
date -d @1623324862
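The rest of the pipeline can be spot-checked as well; a sketch, assuming the timestamp interceptor partitions events under their mock date on HDFS:
kafka-console-consumer.sh --bootstrap-server hadoop102:9092 --topic applog --from-beginning
hadoop fs -ls /origin_data/gmall/log/topic_log/2021-06-09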



2. Generating business data
1) Connect to MySQL and create the gmall database (choose the utf8 character set and the general collation)

2) After the database is created, run the gmall SQL script against it; refresh the database and the tables are there
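Both steps can also be done from the shell with the mysql client; a sketch using the credentials from the application.properties shown below (the script file name gmall.sql is illustrative):
mysql -h hadoop102 -uroot -proot123 -e "CREATE DATABASE IF NOT EXISTS gmall DEFAULT CHARACTER SET utf8;"
mysql -h hadoop102 -uroot -proot123 gmall < gmall.sql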

3) Create the mock directory under /opt/module/ and upload application.properties and gmall2020-mock-db-2021-01-22.jar to it
mkdir /opt/module/mock

vim application.properties
logging.level.root=info
spring.datasource.driver-class-name=com.mysql.jdbc.Driver
spring.datasource.url=jdbc:mysql://hadoop102:3306/gmall?characterEncoding=utf-8&useSSL=false&serverTimezone=GMT%2B8
spring.datasource.username=root
spring.datasource.password=root123
logging.pattern.console=%m%n
mybatis-plus.global-config.db-config.field-strategy=not_null
#Business date
mock.date=2021-06-08
#For the first data generation, set the two reset options below to 1
#Whether to reset the data
mock.clear=0
#Whether to reset users
mock.clear.user=0
#Number of new users to generate
mock.user.count=100
#Male ratio
mock.user.male-rate=20
#Probability that user data changes
mock.user.update-rate:20
#Ratio of favorites that get cancelled
mock.favor.cancel-rate=10
#Number of favorites
mock.favor.count=100
#Probability that each user adds to the cart
mock.cart.user-rate=50
#Max number of SKUs each user adds to the cart each time
mock.cart.max-sku-count=8
#Max quantity per SKU
mock.cart.max-sku-num=3
#Cart source ratio: user search, product promotion, smart recommendation, sales campaign
mock.cart.source-type-rate=60:20:10:10
#Ratio of users who place orders
mock.order.user-rate=50
#Ratio of cart items that get ordered
mock.order.sku-rate=50
#Whether orders join campaigns
mock.order.join-activity=1
#Whether orders use coupons
mock.order.use-coupon=1
#Number of users who claim coupons
mock.coupon.user-count=100
#Payment ratio
mock.payment.rate=70
#Payment method ratio: Alipay : WeChat : UnionPay
mock.payment.payment-type=30:60:10
#Review ratio: good : medium : bad : automatic
mock.comment.appraise-rate=30:10:10:50
#Refund reason ratio: quality issue, item differs from description, out of stock, wrong size, ordered by mistake, changed mind, other
mock.refund.reason-rate=30:10:20:5:15:5:5
4) Run the jar to generate data into MySQL (for the first run, set mock.clear=1 and mock.clear.user=1 in the configuration file)
java -jar gmall2020-mock-db-2021-01-22.jar
5) Check the gmall database
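A quick spot check from the shell (a sketch; the table names come from the import scripts further below):
mysql -h hadoop102 -uroot -proot123 -e "SELECT COUNT(*) FROM gmall.user_info; SELECT COUNT(*) FROM gmall.order_info;"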

6) Install Sqoop
(1) Download: http://mirrors.hust.edu.cn/apache/sqoop/1.4.6/
(2) Upload the package sqoop-1.4.6.bin__hadoop-2.0.4-alpha.tar.gz to /opt/software on hadoop102
(3) Extract the Sqoop package to the target directory, for example:
tar -zxvf sqoop-1.4.6.bin__hadoop-2.0.4-alpha.tar.gz -C /opt/module/
(4) Go to /opt/module/ and rename sqoop-1.4.6.bin__hadoop-2.0.4-alpha
mv sqoop-1.4.6.bin__hadoop-2.0.4-alpha/ sqoop
(5) Go to /opt/module/sqoop/conf and rename the configuration file
mv sqoop-env-template.sh sqoop-env.sh
(6) Edit the configuration file
vim sqoop-env.sh
#Add the following
export HADOOP_COMMON_HOME=/opt/module/hadoop-3.1.3
export HADOOP_MAPRED_HOME=/opt/module/hadoop-3.1.3
export HIVE_HOME=/opt/module/hive
export ZOOKEEPER_HOME=/opt/module/zookeeper
export ZOOCFGDIR=/opt/module/zookeeper/conf
(7) Copy the JDBC driver into Sqoop's lib directory
cp /opt/software/mysql-connector-java-5.1.27-bin.jar /opt/module/sqoop/lib/
(8) Verify that Sqoop is configured correctly by running any command, for example:
bin/sqoop help
(9) Test that Sqoop can connect to the database
bin/sqoop list-databases --connect jdbc:mysql://hadoop102:3306/ --username root --password root123
7) Write scripts that generate and import data with one command
(1) Create makeDBData.sh in the bin directory under the user's home directory
vim makeDBData.sh
#!/bin/bash
# Usage: makeDBData.sh <start date> <days to generate, default 10>
if [ -n "$1" ]
then
    startdate=$1
else
    echo "Please pass in a start date!"
    exit
fi
if [ -n "$2" ] ;then
    days=$2
else
    days=10
fi

#Generate application.properties dynamically
function createconfs() {
    content="
logging.level.root=info\n
spring.datasource.driver-class-name=com.mysql.jdbc.Driver\n
spring.datasource.url=jdbc:mysql://hadoop102:3306/gmall?characterEncoding=utf-8&useSSL=false&serverTimezone=GMT%2B8\n
spring.datasource.username=root\n
spring.datasource.password=root123\n
logging.pattern.console=%m%n\n
mybatis-plus.global-config.db-config.field-strategy=not_null\n
mock.date=$1\n
mock.clear=0\n
mock.clear.user=0\n
mock.user.count=100\n
mock.user.male-rate=20\n
mock.user.update-rate:20\n
mock.favor.cancel-rate=10\n
mock.favor.count=100\n
mock.cart.user-rate=50\n
mock.cart.max-sku-count=8\n
mock.cart.max-sku-num=3\n
mock.cart.source-type-rate=60:20:10:10\n
mock.order.user-rate=50\n
mock.order.sku-rate=50\n
mock.order.join-activity=1\n
mock.order.use-coupon=1\n
mock.coupon.user-count=100\n
mock.payment.rate=70\n
mock.payment.payment-type=30:60:10\n
mock.comment.appraise-rate=30:10:10:50\n
mock.refund.reason-rate=30:10:20:5:15:5:5\n
"
    echo -e $content> /opt/module/mock/application.properties
}

for((i=0;i<$days;i++))
do
    do_date=`date -d "$startdate $i day" +'%F'`
    echo "------------------ generating data for date: $do_date --------------------"
    createconfs $do_date
    cd /opt/module/mock
    java -jar gmall2020-mock-db-2021-01-22.jar
    mysql_sqoop_hdfs_everyday.sh all $do_date
done
echo "All done!"
(2) Also create mysql_sqoop_hdfs_init.sh in the same bin directory
vim /home/atguigu/bin/mysql_sqoop_hdfs_init.sh
#! /bin/bash

APP=gmall
sqoop=/opt/module/sqoop/bin/sqoop

# Take the date passed in as the second argument; exit if it is missing
if [ -n "$2" ] ;then
    do_date=$2
else
    echo "Please pass in a date argument"
    exit
fi

import_data(){
$sqoop import \
--connect jdbc:mysql://hadoop102:3306/$APP \
--username root \
--password root123 \
--target-dir /origin_data/$APP/db/$1/$do_date \
--delete-target-dir \
--query "$2 where \$CONDITIONS" \
--num-mappers 1 \
--fields-terminated-by '\t' \
--compress \
--compression-codec lzop \
--null-string '\\N' \
--null-non-string '\\N'

hadoop jar /opt/module/hadoop-3.1.3/share/hadoop/common/hadoop-lzo-0.4.20.jar com.hadoop.compression.lzo.DistributedLzoIndexer /origin_data/$APP/db/$1/$do_date
}

import_order_info(){
  import_data order_info "select id, total_amount, order_status, user_id, payment_way, delivery_address, out_trade_no, create_time, operate_time, expire_time, tracking_no, province_id, activity_reduce_amount, coupon_reduce_amount, original_total_amount, feight_fee, feight_fee_reduce from order_info"
}

import_coupon_use(){
  import_data coupon_use "select id, coupon_id, user_id, order_id, coupon_status, get_time, using_time, used_time, expire_time from coupon_use"
}

import_order_status_log(){
  import_data order_status_log "select id, order_id, order_status, operate_time from order_status_log"
}

import_user_info(){
  import_data "user_info" "select id, login_name, nick_name, name, phone_num, email, user_level, birthday, gender, create_time, operate_time from user_info"
}

import_order_detail(){
  import_data order_detail "select id, order_id, sku_id, sku_name, order_price, sku_num, create_time, source_type, source_id, split_total_amount, split_activity_amount, split_coupon_amount from order_detail"
}

import_payment_info(){
  import_data "payment_info" "select id, out_trade_no, order_id, user_id, payment_type, trade_no, total_amount, subject, payment_status, create_time, callback_time from payment_info"
}

import_comment_info(){
  import_data comment_info "select id, user_id, sku_id, spu_id, order_id, appraise, create_time from comment_info"
}

import_order_refund_info(){
  import_data order_refund_info "select id, user_id, order_id, sku_id, refund_type, refund_num, refund_amount, refund_reason_type, refund_status, create_time from order_refund_info"
}

import_sku_info(){
  import_data sku_info "select id, spu_id, price, sku_name, sku_desc, weight, tm_id, category3_id, is_sale, create_time from sku_info"
}

import_base_category1(){
  import_data "base_category1" "select id, name from base_category1"
}

import_base_category2(){
  import_data "base_category2" "select id, name, category1_id from base_category2"
}

import_base_category3(){
  import_data "base_category3" "select id, name, category2_id from base_category3"
}

import_base_province(){
  import_data base_province "select id, name, region_id, area_code, iso_code, iso_3166_2 from base_province"
}

import_base_region(){
  import_data base_region "select id, region_name from base_region"
}

import_base_trademark(){
  import_data base_trademark "select id, tm_name from base_trademark"
}

import_spu_info(){
  import_data spu_info "select id, spu_name, category3_id, tm_id from spu_info"
}

import_favor_info(){
  import_data favor_info "select id, user_id, sku_id, spu_id, is_cancel, create_time, cancel_time from favor_info"
}

import_cart_info(){
  import_data cart_info "select id, user_id, sku_id, cart_price, sku_num, sku_name, create_time, operate_time, is_ordered, order_time, source_type, source_id from cart_info"
}

import_coupon_info(){
  import_data coupon_info "select id, coupon_name, coupon_type, condition_amount, condition_num, activity_id, benefit_amount, benefit_discount, create_time, range_type, limit_num, taken_count, start_time, end_time, operate_time, expire_time from coupon_info"
}

import_activity_info(){
  import_data activity_info "select id, activity_name, activity_type, start_time, end_time, create_time from activity_info"
}

import_activity_rule(){
  import_data activity_rule "select id, activity_id, activity_type, condition_amount, condition_num, benefit_amount, benefit_discount, benefit_level from activity_rule"
}

import_base_dic(){
  import_data base_dic "select dic_code, dic_name, parent_code, create_time, operate_time from base_dic"
}

import_order_detail_activity(){
  import_data order_detail_activity "select id, order_id, order_detail_id, activity_id, activity_rule_id, sku_id, create_time from order_detail_activity"
}

import_order_detail_coupon(){
  import_data order_detail_coupon "select id, order_id, order_detail_id, coupon_id, coupon_use_id, sku_id, create_time from order_detail_coupon"
}

import_refund_payment(){
  import_data refund_payment "select id, out_trade_no, order_id, sku_id, payment_type, trade_no, total_amount, subject, refund_status, create_time, callback_time from refund_payment"
}

import_sku_attr_value(){
  import_data sku_attr_value "select id, attr_id, value_id, sku_id, attr_name, value_name from sku_attr_value"
}

import_sku_sale_attr_value(){
  import_data sku_sale_attr_value "select id, sku_id, spu_id, sale_attr_value_id, sale_attr_id, sale_attr_name, sale_attr_value_name from sku_sale_attr_value"
}

case $1 in
"order_info") import_order_info ;;
"base_category1") import_base_category1 ;;
"base_category2") import_base_category2 ;;
"base_category3") import_base_category3 ;;
"order_detail") import_order_detail ;;
"sku_info") import_sku_info ;;
"user_info") import_user_info ;;
"payment_info") import_payment_info ;;
"base_province") import_base_province ;;
"base_region") import_base_region ;;
"base_trademark") import_base_trademark ;;
"activity_info") import_activity_info ;;
"cart_info") import_cart_info ;;
"comment_info") import_comment_info ;;
"coupon_info") import_coupon_info ;;
"coupon_use") import_coupon_use ;;
"favor_info") import_favor_info ;;
"order_refund_info") import_order_refund_info ;;
"order_status_log") import_order_status_log ;;
"spu_info") import_spu_info ;;
"activity_rule") import_activity_rule ;;
"base_dic") import_base_dic ;;
"order_detail_activity") import_order_detail_activity ;;
"order_detail_coupon") import_order_detail_coupon ;;
"refund_payment") import_refund_payment ;;
"sku_attr_value") import_sku_attr_value ;;
"sku_sale_attr_value") import_sku_sale_attr_value ;;
"all")
  import_base_category1
  import_base_category2
  import_base_category3
  import_order_info
  import_order_detail
  import_sku_info
  import_user_info
  import_payment_info
  import_base_region
  import_base_province
  import_base_trademark
  import_activity_info
  import_cart_info
  import_comment_info
  import_coupon_use
  import_coupon_info
  import_favor_info
  import_order_refund_info
  import_order_status_log
  import_spu_info
  import_activity_rule
  import_base_dic
  import_order_detail_activity
  import_order_detail_coupon
  import_refund_payment
  import_sku_attr_value
  import_sku_sale_attr_value
;;
esac
(3) Also create mysql_sqoop_hdfs_everyday.sh in the same bin directory
vim /home/atguigu/bin/mysql_sqoop_hdfs_everyday.sh
#! /bin/bash

APP=gmall
sqoop=/opt/module/sqoop/bin/sqoop

if [ -n "$2" ] ;then
    do_date=$2
else
    do_date=`date -d '-1 day' +%F`
fi

import_data(){
$sqoop import \
--connect jdbc:mysql://hadoop102:3306/$APP \
--username root \
--password root123 \
--target-dir /origin_data/$APP/db/$1/$do_date \
--delete-target-dir \
--query "$2 and \$CONDITIONS" \
--num-mappers 1 \
--fields-terminated-by '\t' \
--compress \
--compression-codec lzop \
--null-string '\\N' \
--null-non-string '\\N'

hadoop jar /opt/module/hadoop-3.1.3/share/hadoop/common/hadoop-lzo-0.4.20.jar com.hadoop.compression.lzo.DistributedLzoIndexer /origin_data/$APP/db/$1/$do_date
}

import_order_info(){
  import_data order_info "select id, total_amount, order_status, user_id, payment_way, delivery_address, out_trade_no, create_time, operate_time, expire_time, tracking_no, province_id, activity_reduce_amount, coupon_reduce_amount, original_total_amount, feight_fee, feight_fee_reduce from order_info where (date_format(create_time,'%Y-%m-%d')='$do_date' or date_format(operate_time,'%Y-%m-%d')='$do_date')"
}

import_coupon_use(){
  import_data coupon_use "select id, coupon_id, user_id, order_id, coupon_status, get_time, using_time, used_time, expire_time from coupon_use where (date_format(get_time,'%Y-%m-%d')='$do_date' or date_format(using_time,'%Y-%m-%d')='$do_date' or date_format(used_time,'%Y-%m-%d')='$do_date' or date_format(expire_time,'%Y-%m-%d')='$do_date')"
}

import_order_status_log(){
  import_data order_status_log "select id, order_id, order_status, operate_time from order_status_log where date_format(operate_time,'%Y-%m-%d')='$do_date'"
}

import_user_info(){
  import_data "user_info" "select id, login_name, nick_name, name, phone_num, email, user_level, birthday, gender, create_time, operate_time from user_info where (DATE_FORMAT(create_time,'%Y-%m-%d')='$do_date' or DATE_FORMAT(operate_time,'%Y-%m-%d')='$do_date')"
}

import_order_detail(){
  import_data order_detail "select id, order_id, sku_id, sku_name, order_price, sku_num, create_time, source_type, source_id, split_total_amount, split_activity_amount, split_coupon_amount from order_detail where DATE_FORMAT(create_time,'%Y-%m-%d')='$do_date'"
}

import_payment_info(){
  import_data "payment_info" "select id, out_trade_no, order_id, user_id, payment_type, trade_no, total_amount, subject, payment_status, create_time, callback_time from payment_info where (DATE_FORMAT(create_time,'%Y-%m-%d')='$do_date' or DATE_FORMAT(callback_time,'%Y-%m-%d')='$do_date')"
}

import_comment_info(){
  import_data comment_info "select id, user_id, sku_id, spu_id, order_id, appraise, create_time from comment_info where date_format(create_time,'%Y-%m-%d')='$do_date'"
}

import_order_refund_info(){
  import_data order_refund_info "select id, user_id, order_id, sku_id, refund_type, refund_num, refund_amount, refund_reason_type, refund_status, create_time from order_refund_info where date_format(create_time,'%Y-%m-%d')='$do_date'"
}

import_sku_info(){
  import_data sku_info "select id, spu_id, price, sku_name, sku_desc, weight, tm_id, category3_id, is_sale, create_time from sku_info where 1=1"
}

import_base_category1(){
  import_data "base_category1" "select id, name from base_category1 where 1=1"
}

import_base_category2(){
  import_data "base_category2" "select id, name, category1_id from base_category2 where 1=1"
}

import_base_category3(){
  import_data "base_category3" "select id, name, category2_id from base_category3 where 1=1"
}

import_base_province(){
  import_data base_province "select id, name, region_id, area_code, iso_code, iso_3166_2 from base_province where 1=1"
}

import_base_region(){
  import_data base_region "select id, region_name from base_region where 1=1"
}

import_base_trademark(){
  import_data base_trademark "select id, tm_name from base_trademark where 1=1"
}

import_spu_info(){
  import_data spu_info "select id, spu_name, category3_id, tm_id from spu_info where 1=1"
}

import_favor_info(){
  import_data favor_info "select id, user_id, sku_id, spu_id, is_cancel, create_time, cancel_time from favor_info where 1=1"
}

import_cart_info(){
  import_data cart_info "select id, user_id, sku_id, cart_price, sku_num, sku_name, create_time, operate_time, is_ordered, order_time, source_type, source_id from cart_info where 1=1"
}

import_coupon_info(){
  import_data coupon_info "select id, coupon_name, coupon_type, condition_amount, condition_num, activity_id, benefit_amount, benefit_discount, create_time, range_type, limit_num, taken_count, start_time, end_time, operate_time, expire_time from coupon_info where 1=1"
}

import_activity_info(){
  import_data activity_info "select id, activity_name, activity_type, start_time, end_time, create_time from activity_info where 1=1"
}

import_activity_rule(){
  import_data activity_rule "select id, activity_id, activity_type, condition_amount, condition_num, benefit_amount, benefit_discount, benefit_level from activity_rule where 1=1"
}

import_base_dic(){
  import_data base_dic "select dic_code, dic_name, parent_code, create_time, operate_time from base_dic where 1=1"
}

import_order_detail_activity(){
  import_data order_detail_activity "select id, order_id, order_detail_id, activity_id, activity_rule_id, sku_id, create_time from order_detail_activity where date_format(create_time,'%Y-%m-%d')='$do_date'"
}

import_order_detail_coupon(){
  import_data order_detail_coupon "select id, order_id, order_detail_id, coupon_id, coupon_use_id, sku_id, create_time from order_detail_coupon where date_format(create_time,'%Y-%m-%d')='$do_date'"
}

import_refund_payment(){
  import_data refund_payment "select id, out_trade_no, order_id, sku_id, payment_type, trade_no, total_amount, subject, refund_status, create_time, callback_time from refund_payment where (DATE_FORMAT(create_time,'%Y-%m-%d')='$do_date' or DATE_FORMAT(callback_time,'%Y-%m-%d')='$do_date')"
}

import_sku_attr_value(){
  import_data sku_attr_value "select id, attr_id, value_id, sku_id, attr_name, value_name from sku_attr_value where 1=1"
}

import_sku_sale_attr_value(){
  import_data sku_sale_attr_value "select id, sku_id, spu_id, sale_attr_value_id, sale_attr_id, sale_attr_name, sale_attr_value_name from sku_sale_attr_value where 1=1"
}

case $1 in
"order_info") import_order_info ;;
"base_category1") import_base_category1 ;;
"base_category2") import_base_category2 ;;
"base_category3") import_base_category3 ;;
"order_detail") import_order_detail ;;
"sku_info") import_sku_info ;;
"user_info") import_user_info ;;
"payment_info") import_payment_info ;;
"base_province") import_base_province ;;
"activity_info") import_activity_info ;;
"cart_info") import_cart_info ;;
"comment_info") import_comment_info ;;
"coupon_info") import_coupon_info ;;
"coupon_use") import_coupon_use ;;
"favor_info") import_favor_info ;;
"order_refund_info") import_order_refund_info ;;
"order_status_log") import_order_status_log ;;
"spu_info") import_spu_info ;;
"activity_rule") import_activity_rule ;;
"base_dic") import_base_dic ;;
"order_detail_activity") import_order_detail_activity ;;
"order_detail_coupon") import_order_detail_coupon ;;
"refund_payment") import_refund_payment ;;
"sku_attr_value") import_sku_attr_value ;;
"sku_sale_attr_value") import_sku_sale_attr_value ;;
"all")
  import_base_category1
  import_base_category2
  import_base_category3
  import_order_info
  import_order_detail
  import_sku_info
  import_user_info
  import_payment_info
  import_base_trademark
  import_activity_info
  import_cart_info
  import_comment_info
  import_coupon_use
  import_coupon_info
  import_favor_info
  import_order_refund_info
  import_order_status_log
  import_spu_info
  import_activity_rule
  import_base_dic
  import_order_detail_activity
  import_order_detail_coupon
  import_refund_payment
  import_sku_attr_value
  import_sku_sale_attr_value
;;
esac
(4) Make the scripts executable
chmod +x makeDBData.sh mysql_sqoop_hdfs_init.sh mysql_sqoop_hdfs_everyday.sh
8) Run the scripts (simulate historical data)
(1) 2021-06-08 first-day data (uploaded to HDFS)
mysql_sqoop_hdfs_init.sh all 2021-06-08
(2) 2021-06-09 through 2021-06-10: daily sync
makeDBData.sh 2021-06-09 2
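As with the log data, the imported tables can be spot-checked on HDFS; the path layout comes from the --target-dir option in the import scripts above (a sketch):
hadoop fs -ls /origin_data/gmall/db/order_info/2021-06-08
hadoop fs -ls /origin_data/gmall/db/order_info/2021-06-09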

