spark-submit with a Python program: fixing the "/home/.python-eggs" permission denied error


Problem description: a Python script is submitted with spark-submit in YARN mode. The job runs fine until it reaches the distributed part, i.e. when RDD transformations such as map/mapPartitions execute or an RDD action is triggered, and then fails with the following error:

Traceback (most recent call last):
File "/usr/lib64/python2.7/runpy.py", line 151, in _run_module_as_main
  mod_name, loader, code, fname = _get_module_details(mod_name)
File "/usr/lib64/python2.7/runpy.py", line 101, in _get_module_details
  loader = get_loader(mod_name)
File "/usr/lib64/python2.7/pkgutil.py", line 464, in get_loader
  return find_loader(fullname)
File "/usr/lib64/python2.7/pkgutil.py", line 474, in find_loader
  for importer in iter_importers(fullname):
File "/usr/lib64/python2.7/pkgutil.py", line 430, in iter_importers
  __import__(pkg)
File "/data8/yarn/local-dir/usercache/bo.feng/appcache/application_1448854352032_70810/container_1448854352032_70810_01_000002/pyspark.zip/pyspark/__init__.py", line 41, in <module>
File "/data8/yarn/local-dir/usercache/bo.feng/appcache/application_1448854352032_70810/container_1448854352032_70810_01_000002/pyspark.zip/pyspark/context.py", line 35, in <module>
File "/data8/yarn/local-dir/usercache/bo.feng/appcache/application_1448854352032_70810/container_1448854352032_70810_01_000002/pyspark.zip/pyspark/rdd.py", line 51, in <module>
File "/data8/yarn/local-dir/usercache/bo.feng/appcache/application_1448854352032_70810/container_1448854352032_70810_01_000002/pyspark.zip/pyspark/shuffle.py", line 33, in <module>
File "build/bdist.linux-x86_64/egg/psutil/__init__.py", line 89, in <module>
File "build/bdist.linux-x86_64/egg/psutil/_pslinux.py", line 24, in <module>
File "build/bdist.linux-x86_64/egg/_psutil_linux.py", line 7, in <module>
File "build/bdist.linux-x86_64/egg/_psutil_linux.py", line 4, in __bootstrap__
File "/usr/lib/python2.7/site-packages/pkg_resources.py", line 945, in resource_filename
  self, resource_name
File "/usr/lib/python2.7/site-packages/pkg_resources.py", line 1633, in get_resource_filename
  self._extract_resource(manager, self._eager_to_zip(name))
File "/usr/lib/python2.7/site-packages/pkg_resources.py", line 1661, in _extract_resource
  self.egg_name, self._parts(zip_path)
File "/usr/lib/python2.7/site-packages/pkg_resources.py", line 1025, in get_cache_path
  self.extraction_error()
File "/usr/lib/python2.7/site-packages/pkg_resources.py", line 991,     inextraction_error
  raise err
pkg_resources.ExtractionError: Can't extract file(s) to egg cache
The following error occurred while trying to extract file(s) to the Python egg cache:
  [Errno 13] Permission denied: '/home/.python-eggs'
The Python egg cache directory is currently set to:
  /home/.python-eggs
Perhaps your account does not have write access to this directory?  You can
change the cache directory by setting the PYTHON_EGG_CACHE environment
variable to point to an accessible directory.

The root cause: psutil is installed as a zipped egg, so pkg_resources has to extract its compiled extension at import time, and the egg cache defaults to ~/.python-eggs. Inside the YARN container, HOME apparently resolves to /home, which the job's user cannot write to. All three workarounds below therefore redirect the cache to a writable directory.

Solutions:

1. Add the following at the top of the functions you pass to map/mapPartitions:

import os

os.environ['PYTHON_EGG_CACHE'] = '/tmp/.python-eggs/'
os.environ['PYTHON_EGG_DIR'] = '/tmp/.python-eggs/'
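
For context, here is a minimal sketch of how option 1 looks in practice. The job, function name, and paths are illustrative rather than from the original post, and sc is assumed to be an existing SparkContext:

import os

def process_partition(rows):
    # Redirect the egg cache to a writable directory before any
    # egg-packaged dependency (psutil in the traceback above) is extracted.
    os.environ['PYTHON_EGG_CACHE'] = '/tmp/.python-eggs/'
    os.environ['PYTHON_EGG_DIR'] = '/tmp/.python-eggs/'
    for row in rows:
        yield row.strip()  # replace with your real per-record logic

# result = sc.textFile('hdfs:///some/input').mapPartitions(process_partition).collect()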

2. Set the environment variables on every machine in the cluster (recommended), for example in /etc/profile or in Spark's conf/spark-env.sh (the original post repeated the Python snippet here, but a machine-level setting is a shell export):

export PYTHON_EGG_CACHE=/tmp/.python-eggs/
export PYTHON_EGG_DIR=/tmp/.python-eggs/
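
If you cannot modify every machine, a per-application alternative is Spark's documented spark.executorEnv.* configuration properties, which export an environment variable into every executor process. A sketch, with an illustrative app name:

from pyspark import SparkConf, SparkContext

# Each spark.executorEnv.NAME property becomes an environment
# variable NAME in every executor process.
conf = (SparkConf()
        .setAppName('egg-cache-workaround')
        .set('spark.executorEnv.PYTHON_EGG_CACHE', '/tmp/.python-eggs/')
        .set('spark.executorEnv.PYTHON_EGG_DIR', '/tmp/.python-eggs/'))
sc = SparkContext(conf=conf)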

3. Go to the Spark installation root and cd into python/lib. Unzip pyspark.zip, cd into the extracted pyspark directory, open rdd.py in an editor, locate the line "import os", and insert the following code directly below it (then re-create pyspark.zip from the edited files, or the change will never reach the executors):

os.environ['PYTHON_EGG_CACHE'] = '/tmp/.python-eggs/'
os.environ['PYTHON_EGG_DIR'] = '/tmp/.python-eggs/'

If none of the three approaches above solves the problem, first submit a plain Python executable through Hadoop Streaming to check whether YARN can run Python jobs at all, as sketched below.
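
For that sanity check, a trivial identity mapper is enough; a sketch (the file name and submission paths are illustrative):

#!/usr/bin/env python
# mapper.py - identity mapper for a Hadoop Streaming sanity check.
# If this job succeeds, the YARN nodes can at least launch Python.
import sys

for line in sys.stdin:
    sys.stdout.write(line)

Submit it with the hadoop-streaming jar, passing -mapper mapper.py -file mapper.py plus -input and -output paths; if even this fails, the problem lies with Python on the YARN nodes rather than with Spark.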

Then try whether the Python job can be submitted under Spark's standalone mode, which tells you whether the problem is specific to the YARN integration.

That's all.

If the problem still persists, the only thing left is to write to the Spark developers' mailing list.

