Configure the Build Environment
1. Ant
Download: https://ant.apache.org/bindownload.cgi
Extract the archive into /usr/local: tar -zxvf apache-ant-1.9.13-bin.tar.gz -C /usr/local
Rename the extracted directory: mv /usr/local/apache-ant-1.9.13 /usr/local/ant
Configure the environment variables:
vim /etc/profile
export ANT_HOME=/usr/local/ant
export PATH=$PATH:$ANT_HOME/bin
source /etc/profile
2. Configure the Java environment variables (omitted here; JDK 1.8 is required)
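To verify that both tools resolve correctly after sourcing the profile:
ant -version     # should report Apache Ant 1.9.13
java -version    # should report a 1.8.x JDK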
3. Configure Node.js
3.1 Install the Node.js build dependencies:
yum install -y gcc-c++ make
3.2 Download and run the Node.js setup script:
curl --silent --location https://rpm.nodesource.com/setup_5.x | bash -
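The setup script only registers the NodeSource yum repository; installing Node.js itself is a separate step. A likely follow-up (not shown in the original):
yum install -y nodejs
node -v    # verify the installed version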
4. Download azkaban-plugins
URL:
Download the archive: azkaban-plugins-3.00.zip
Upload it to the machine where Ant is configured and run unzip azkaban-plugins-3.00.zip to extract it.
5. Build azkaban-plugins
5.1 cd into the azkaban-plugins directory
5.2 Run the ant command to build
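Putting 5.1 and 5.2 together; the directory name below is an assumption based on the zip file name, so adjust it to whatever unzip actually produced:
cd azkaban-plugins-3.00    # extracted directory (name may differ)
ant                        # builds all plugins from the top-level build.xml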
---------------------------------------------------------------------------------------------------------------------------------------
Install the Plugins
Installing the web-server plugins
- hdfsviewer
- javaviewer
- jobsummary
- reportal
- pigvisualizer
Installation is straightforward: extract each plugin into plugins/viewer (hdfsviewer must be renamed to hdfs, and for reportal use the contents of its viewer directory). The final structure looks like this:
./plugins/viewer
├── hdfs
│   ├── conf
│   ├── extlib
│   ├── lib
│   └── package.version
├── javaviewer
│   ├── conf
│   ├── extlib
│   ├── lib
│   ├── package.version
│   └── web
├── jobsummary
│   ├── conf
│   ├── extlib
│   ├── lib
│   ├── package.version
│   └── web
├── pigvisualizer
│   ├── conf
│   ├── extlib
│   ├── lib
│   ├── package.version
│   └── web
└── reportal
    ├── conf
    ├── lib
    └── web
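For example, installing the HDFS viewer could look like the sketch below; the tarball name and the web server path are assumptions, so substitute the packages your build actually produced:
# illustrative names; use your build's actual artifacts and your web server's path
tar -zxvf azkaban-hdfs-viewer-*.tar.gz -C azkaban-web-server/plugins/viewer/
cd azkaban-web-server/plugins/viewer/
mv azkaban-hdfs-viewer-* hdfs    # must be installed under the name "hdfs"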
exec-server plugins:
The main work is configuring the jobtypes:
# jobtypes/common.properties
# jobtypes/commonprivate.properties
hadoop.home=/hadoop/home/path
hive.home=/hadoop/hive/path
pig.home=/hadoop/pig/path
spark.home=/hadoop/spark/path

# jobtypes/hadoopJava: default values

# jobtypes/hive/private.properties
jobtype.class=azkaban.jobtype.HadoopHiveJob
hive.aux.jar.path=${hive.home}/aux/lib
jobtype.classpath=${hadoop.home}/conf,${hadoop.home}/lib/*,${hive.home}/lib/*,${hive.home}/conf,${hive.aux.jar.path}

# jobtypes/java: default values

# Use the pig jobtype matching your Pig version; for versions above 0.12, use 0.12
# jobtypes/pig-0.12.0/plugin.properties
pig.listener.visualizer=false
jobtype.classpath=${pig.home}/lib/*,${pig.home}/*

# jobtypes/pig-0.12.0/private.properties
jobtype.class=azkaban.jobtype.HadoopPigJob
jobtype.classpath=${hadoop.home}/conf,${hadoop.home}/lib/*,lib/*

# jobtypes/spark is missing azkaban-jobtype-3.0.0.jar; copy it over from another jobtype plugin
# jobtypes/spark/private.properties
jobtype.class=azkaban.jobtype.HadoopSparkJob
hadoop.classpath=${hadoop.home}/lib
jobtype.classpath=${hadoop.classpath}:${spark.home}/conf:${spark.home}/jars/*
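Once the jobtypes are deployed on the exec server, a job references them through its type field. A minimal hypothetical .job file for the hive jobtype (the property name follows the azkaban-plugins hive jobtype; the script path is illustrative, so verify against your plugin version):
# daily_report.job (hypothetical example)
type=hive
hive.script=scripts/daily_report.hql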