Deploying and Manually Executing Kettle Jobs on Linux

The scheduled Kettle jobs are driven by cron. Log in as the hadoop user and list its crontab to see them:

login as: hadoop
hadoop@192.168.0.2's password:
Last login: Sat Mar 2 17:48:03 2013 from 192.168.0.13
[hadoop@hadoop-etl-2-175 ~]$ crontab -l
31 19 * * * /bin/sh /pentaho/Plan/a.sh
15 23 * * * /bin/sh /pentaho/Plan/b.sh
10 7 * * * /bin/sh /pentaho/Plan/c.sh
30 16 * * * /bin/sh /pentaho/Plan/d.sh
0 13 * * * /bin/sh /pentaho/Plan/e.sh
0 13 * * 7 /bin/sh /pentaho/Plan/week1.sh
10 13 * * 1 /bin/sh /pentaho/Plan/week2.sh
0 12 1 * * /bin/sh /pentaho/Plan/month1.sh
[hadoop@hadoop-etl-2-175 ~]$
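
To register an additional job, edit the crontab with crontab -e, or append an entry non-interactively as in the sketch below. The script path /pentaho/Plan/f.sh and the 02:00 schedule are illustrative placeholders, not part of the original deployment:

sh------------------------------------------------------------------------------------------------------------

# Append a new entry to the current user's crontab without opening an editor.
# Path and schedule below are placeholders.
( crontab -l 2>/dev/null; echo "0 2 * * * /bin/sh /pentaho/Plan/f.sh" ) | crontab -

# Verify that the entry was added.
crontab -l

sh------------------------------------------------------------------------------------------------------------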

The shell scripts invoked by these crontab entries have the following format:

sh------------------------------------------------------------------------------------------------------------

#!/bin/sh

# Environment required by Kettle (Java) and its Hadoop integration.
export JAVA_HOME=/usr/local/java
export HADOOP_HOME=/hadoop/hadoop

cd /pentaho/pentaho/data-integration

# Run the job from the PDI repository and append the log output to a file.
./kitchen.sh -rep 192.168.0.13.PDI_Repository -user username -pass password -dir /<directory name> -job <job name> -level=Basic >> /pentaho/Plan/job_loaddw.log

Explanation: -level=Basic >> /pentaho/Plan/<job name>.log sets the logging level to Basic and appends the job's log output to the given file. Other valid Kettle log levels include Error, Nothing, Minimal, Detailed, Debug, and Rowlevel.

sh------------------------------------------------------------------------------------------------------------
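
Because cron runs these scripts unattended, it helps to check kitchen.sh's exit status: it returns 0 on success and a non-zero code when the job fails or cannot be loaded. Below is a minimal error-handling sketch, assuming the same paths and repository settings as above; the job_errors.log marker file is a hypothetical addition:

sh------------------------------------------------------------------------------------------------------------

#!/bin/sh

export JAVA_HOME=/usr/local/java
export HADOOP_HOME=/hadoop/hadoop

cd /pentaho/pentaho/data-integration

# Run the job; capture stdout and stderr in the same log file.
./kitchen.sh -rep 192.168.0.13.PDI_Repository -user username -pass password -dir /<directory name> -job <job name> -level=Basic >> /pentaho/Plan/job_loaddw.log 2>&1

# kitchen.sh exits non-zero on failure; record a marker line a monitor can watch for.
if [ $? -ne 0 ]; then
    echo "`date` job <job name> FAILED" >> /pentaho/Plan/job_errors.log
fi

sh------------------------------------------------------------------------------------------------------------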

The commands to execute a Kettle job manually are as follows:

login as: hadoop
hadoop@192.168.0.2's password:
Last login: Thu Apr 5 11:09:24 2012 from 192.168.0.13
[hadoop@hadoop-etl-2-175 ~]$ cd ..
[hadoop@hadoop-etl-2-175 home]$ cd ..
[hadoop@hadoop-etl-2-175 /]$ ls
esvn lib mnt pentaho srv
bin etc lib64 mysql proc sys
boot etl lost+found net root tmp
dev hadoop media netxtreme2-6.2.23-1.src.rpm sbin usr
eclipse home misc opt selinux var
[hadoop@hadoop-etl-2-175 /]$ cd pentaho
[hadoop@hadoop-etl-2-175 pentaho]$ cd pentaho
[hadoop@hadoop-etl-2-175 pentaho]$ cd data*
[hadoop@hadoop-etl-2-175 data-integration]$
./kitchen.sh -rep <IP address>.PDI_Repository -user username -pass password -dir /<job path> -job <job name>
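
A job can also be run without the repository, directly from a .kjb file on disk, and named parameters can be passed on the command line with -param. A sketch with placeholder paths and parameter names; adjust to your environment:

sh------------------------------------------------------------------------------------------------------------

# Run a job from a file instead of the repository (path is a placeholder).
./kitchen.sh -file=/pentaho/jobs/<job name>.kjb -level=Basic

# Pass a named parameter to a repository job (LOAD_DATE is a placeholder name).
./kitchen.sh -rep <IP address>.PDI_Repository -user username -pass password -dir /<job path> -job <job name> -param:LOAD_DATE=2013-03-02

sh------------------------------------------------------------------------------------------------------------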
