Configure the Build Environment
1. Ant
Download: https://ant.apache.org/bindownload.cgi
Extract into /usr/local: tar -zxvf apache-ant-1.9.13-bin.tar.gz -C /usr/local
Rename the extracted directory: mv /usr/local/apache-ant-1.9.13 /usr/local/ant
Configure the environment variables:
vim /etc/profile
export ANT_HOME=/usr/local/ant
export PATH=$PATH:$ANT_HOME/bin
source /etc/profile
2. Configure the Java environment variables (omitted here; JDK 1.8 is required)
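A quick sanity check that both tools are on the PATH (assuming the setup above):
ant -version     # should report Apache Ant 1.9.13
java -version    # should report a 1.8 JDK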
3. Configure Node.js
3.1 Install the Node.js build dependencies
yum install -y gcc-c++ make
3.2 Run the NodeSource setup script, which registers the Node.js yum repository
curl --silent --location https://rpm.nodesource.com/setup_5.x | bash -
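The setup script only registers the repository; installing Node.js itself is a separate step (a sketch, assuming yum as above):
yum install -y nodejs
node -v    # verify the installed version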
4. Download azkaban-plugins
Download link: https://github.com/azkaban/azkaban-plugins
Download the archive azkaban-plugins-3.00.zip.
Upload it to the machine where Ant is configured and extract it with: unzip azkaban-plugins-3.00.zip
5. Build azkaban-plugins
5.1 cd into the azkaban-plugins directory
5.2 Run the ant command to compile
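Put together, the build step looks roughly like this (a sketch; the exact directory name after unzipping, and where each plugin's package lands, may differ from what is shown):
cd azkaban-plugins-3.00
ant
# on success, each plugin's build output (the tar.gz packages installed in the
# next section) is typically produced under that plugin's dist/ directory
# — an assumption; check the build log for the actual paths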
---------------------------------------------------------------------------------------------------------------------------------------
Install the Plugins
web-server plugin installation
- hdfsviewer
- javaviewer
- jobsummary
- reportal
- pigvisualizer
Installation is straightforward: extract each plugin into plugins/viewer (hdfsviewer must be renamed to hdfs, and for reportal you use the contents of its viewer directory). The final structure looks like this:
./plugins/viewer
├── hdfs
│   ├── conf
│   ├── extlib
│   ├── lib
│   └── package.version
├── javaviewer
│   ├── conf
│   ├── extlib
│   ├── lib
│   ├── package.version
│   └── web
├── jobsummary
│   ├── conf
│   ├── extlib
│   ├── lib
│   ├── package.version
│   └── web
├── pigvisualizer
│   ├── conf
│   ├── extlib
│   ├── lib
│   ├── package.version
│   └── web
└── reportal
    ├── conf
    ├── lib
    └── web
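As a concrete sketch of those steps (the web-server path and archive names are assumptions; substitute whatever your ant build actually produced):
cd /path/to/azkaban-web-server/plugins/viewer
tar -zxvf /path/to/dist/azkaban-hdfs-viewer-3.0.0.tar.gz
mv azkaban-hdfs-viewer-3.0.0 hdfs    # hdfsviewer must be renamed to hdfs
tar -zxvf /path/to/dist/azkaban-javaviewer-3.0.0.tar.gz
mv azkaban-javaviewer-3.0.0 javaviewer
# repeat for jobsummary and pigvisualizer; for reportal, copy only the
# contents of its viewer directory into plugins/viewer/reportal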
exec-server plugin:
The main task is configuring jobtypes:

# jobtypes/common.properties
# jobtypes/commonprivate.properties
hadoop.home=/hadoop/home/path
hive.home=/hadoop/hive/path
pig.home=/hadoop/pig/path
spark.home=/hadoop/spark/path

# jobtypes/hadoopJava: keep the defaults

# jobtypes/hive/private.properties
jobtype.class=azkaban.jobtype.HadoopHiveJob
hive.aux.jar.path=${hive.home}/aux/lib
jobtype.classpath=${hadoop.home}/conf,${hadoop.home}/lib/*,${hive.home}/lib/*,${hive.home}/conf,${hive.aux.jar.path}

# jobtypes/java: keep the defaults

# Use the pig jobtype matching your Pig version; for anything above 0.12, use the 0.12 jobtype.
# jobtypes/pig-0.12.0/plugin.properties
pig.listener.visualizer=false
jobtype.classpath=${pig.home}/lib/*,${pig.home}/*
# jobtypes/pig-0.12.0/private.properties
jobtype.class=azkaban.jobtype.HadoopPigJob
jobtype.classpath=${hadoop.home}/conf,${hadoop.home}/lib/*,lib/*

# jobtypes/spark is missing azkaban-jobtype-3.0.0.jar; copy it over from another jobtype's lib directory.
# jobtypes/spark/private.properties
jobtype.class=azkaban.jobtype.HadoopSparkJob
hadoop.classpath=${hadoop.home}/lib
jobtype.classpath=${hadoop.classpath}:${spark.home}/conf:${spark.home}/jars/*
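With the jobtypes configured, a flow job references them via the type field. A minimal hive job sketch (the parameter names follow the legacy azkaban-plugins hive jobtype, and the file name and query are hypothetical; verify against your plugin version):
# sample.job
type=hive
azk.hive.action=execute.query
hive.query=SELECT COUNT(*) FROM sample_table;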