Background: When running heavy parallel data extraction with Kettle 6, an "Unknown error in KarafBlueprintWatcher" error occasionally appears. The full error output is shown in the block below.
ERROR: Bundle pentaho-big-data-api-runtimeTest [76] Error starting mvn:pentaho/pentaho-big-data-api-runtimeTest/6.1.0.1-196 (org.osgi.framework.BundleException: Unable to acquire global lock for resolve.)
org.osgi.framework.BundleException: Unable to acquire global lock for resolve.
at org.apache.felix.framework.Felix.resolveBundleRevision(Felix.java:4006)
at org.apache.felix.framework.Felix.startBundle(Felix.java:2045)
at org.apache.felix.framework.Felix.setActiveStartLevel(Felix.java:1299)
at org.apache.felix.framework.FrameworkStartLevelImpl.run(FrameworkStartLevelImpl.java:304)
at java.lang.Thread.run(Thread.java:748)
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/data/etl/data-integration/launcher/../lib/slf4j-log4j12-1.7.7.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/data/etl/data-integration/plugins/pentaho-big-data-plugin/lib/slf4j-log4j12-1.7.3.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
07:22:52,715 ERROR [KarafLifecycleListener] Error in Blueprint Watcher
org.pentaho.osgi.api.IKarafBlueprintWatcher$BlueprintWatcherException: Unknown error in KarafBlueprintWatcher
at org.pentaho.osgi.impl.KarafBlueprintWatcherImpl.waitForBlueprint(KarafBlueprintWatcherImpl.java:103)
at org.pentaho.di.osgi.KarafLifecycleListener$2.run(KarafLifecycleListener.java:161)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.pentaho.osgi.api.IKarafBlueprintWatcher$BlueprintWatcherException: Timed out waiting for blueprints to load: pentaho-big-data-api-runtimeTest,pentaho-big-data-kettle-plugins-common-named-cluster-bridge,pentaho-big-data-kettle-plugins-hdfs,pentaho-big-data-kettle-plugins-hbase,pentaho-big-data-kettle-plugins-mapreduce,pentaho-big-data-kettle-plugins-pig,pentaho-big-data-kettle-plugins-oozie,pentaho-big-data-kettle-plugins-sqoop,pentaho-big-data-impl-clusterTests,pentaho-big-data-impl-shim-shimTests
at org.pentaho.osgi.impl.KarafBlueprintWatcherImpl.waitForBlueprint(KarafBlueprintWatcherImpl.java:88)
... 2 more
Check the Kettle version:
$./kitchen.sh -version
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=256m; support was removed in 8.0
16:20:38,608 INFO [KarafInstance]
*******************************************************************************
*** Karaf Instance Number: 1 at /data/etl/data-integration/./system/karaf/c ***
*** aches/default/data-1 ***
*** Karaf Port:8802 ***
*** OSGI Service Port:9051 ***
*******************************************************************************
16:20:38,608 INFO [KarafBoot] Checking to see if org.pentaho.clean.karaf.cache is enabled
Jan 14, 2019 4:20:38 PM org.apache.karaf.main.Main$KarafLockCallback lockAquired
INFO: Lock acquired. Setting startlevel to 100
2019/01/14 16:20:39 - Kitchen - Kettle version 6.1.0.1-196, build 1, build date : 2016-04-07 12.08.49
Workaround:
1 Modify the Karaf configuration
# Go to the configuration directory
cd <kettle_home>/system/karaf/etc
# Edit the org.apache.karaf.features.cfg file
vi org.apache.karaf.features.cfg
# Change line 31 from
featuresBoot=config,pentaho-client,pentaho-metaverse,pdi-dataservice,pdi-data-refinery
# to
featuresBoot=config,pentaho-client,pentaho-metaverse,pdi-dataservice
# i.e. drop the last feature, pdi-data-refinery
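If you prefer to script the edit, something like the following should work. This is a minimal sketch, assuming featuresBoot is the only line in the file ending with ,pdi-data-refinery; back up the file first.
# Back up the original file
cp org.apache.karaf.features.cfg org.apache.karaf.features.cfg.bak
# Strip the trailing pdi-data-refinery entry from the featuresBoot line
sed -i 's/,pdi-data-refinery[[:space:]]*$//' org.apache.karaf.features.cfg
# Confirm the change
grep '^featuresBoot=' org.apache.karaf.features.cfg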
For the root cause, see the post on the official forum.
Another link notes that this issue was fixed in version 7.0.
2 If the Kettle transformations need parameters passed in, give the parameters in the parallel transformations different names to avoid conflicts (see the sketch below).
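As an illustration, here is a minimal sketch of launching two transformations in parallel with uniquely named parameters. The .ktr paths and parameter names are hypothetical; only the pan.sh -file and -param: options are standard PDI usage.
# Hypothetical example: each parallel transformation gets its own parameter name
# instead of both sharing a name such as BATCH_DATE
./pan.sh -file=/data/etl/extract_orders.ktr "-param:ORDERS_BATCH_DATE=2019-01-14" &
./pan.sh -file=/data/etl/extract_users.ktr "-param:USERS_BATCH_DATE=2019-01-14" &
wait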
-- EOF --
