Installing the GPU version of LightGBM



Official documentation:

https://lightgbm.readthedocs.io/en/latest/GPU-Windows.html

For installing the GPU version of LightGBM on Windows, see:

https://www.jianshu.com/p/30555fd2bd50

The following steps were tested successfully on Ubuntu 16.04 with Python 3.6.5.

1. Install the system dependencies

sudo apt-get install --no-install-recommends git cmake build-essential libboost-dev libboost-system-dev libboost-filesystem-dev
2. Install the Python packages

pip install setuptools wheel numpy scipy scikit-learn -U
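
Optionally, confirm that the upgraded packages import cleanly before building LightGBM (just a quick check that prints whatever versions pip installed):

import numpy, scipy, sklearn, setuptools

# Any ImportError here means the previous pip step did not complete.
print('numpy', numpy.__version__)
print('scipy', scipy.__version__)
print('scikit-learn', sklearn.__version__)
print('setuptools', setuptools.__version__)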
3. Install the GPU-enabled LightGBM

sudo pip3.6 install lightgbm --install-option=--gpu --install-option="--opencl-include-dir=/usr/local/cuda/include/" --install-option="--opencl-library=/usr/local/cuda/lib64/libOpenCL.so"
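
Before running the full benchmark below, a quick smoke test on a small synthetic dataset can confirm that the GPU build actually trains (this is just a minimal sketch; the data and parameters are arbitrary):

import numpy as np
import lightgbm as lgb

# Minimal smoke test for the GPU build: random binary-classification data.
X = np.random.rand(1000, 20)
y = np.random.randint(0, 2, size=1000)

params = {'objective': 'binary', 'device': 'gpu', 'max_bin': 63, 'verbose': -1}
bst = lgb.train(params, lgb.Dataset(X, label=y), num_boost_round=5)
print('GPU training OK,', bst.num_trees(), 'trees built')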
4. Test

First, download the benchmark data and convert it to LibSVM format:

git clone https://github.com/guolinke/boosting_tree_benchmarks.git
cd boosting_tree_benchmarks/data
wget "https://archive.ics.uci.edu/ml/machine-learning-databases/00280/HIGGS.csv.gz"
gunzip HIGGS.csv.gz
python higgs2libsvm.py
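
Optionally, check that the conversion produced a loadable file before benchmarking (a small sanity check; the path assumes you run it from the boosting_tree_benchmarks repo root, the same working directory the test script below expects):

import lightgbm as lgb

# Load and construct the converted training file, then report its size.
ds = lgb.Dataset('data/higgs.train').construct()
print('rows:', ds.num_data(), 'features:', ds.num_feature())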
Write the test script:

import lightgbm as lgb
import time


# Parameters for the GPU run; identical to the CPU run below except for the
# device-related settings.
params = {'max_bin': 63,
          'num_leaves': 255,
          'learning_rate': 0.1,
          'tree_learner': 'serial',
          'task': 'train',
          'is_training_metric': 'false',
          'min_data_in_leaf': 1,
          'min_sum_hessian_in_leaf': 100,
          'ndcg_eval_at': [1, 3, 5, 10],
          'sparse_threshold': 1.0,
          'device': 'gpu',
          'gpu_platform_id': 0,
          'gpu_device_id': 0}

# Training data converted by higgs2libsvm.py.
dtrain = lgb.Dataset('data/higgs.train')

t0 = time.time()
gbm = lgb.train(params, train_set=dtrain, num_boost_round=10,
                valid_sets=None, valid_names=None,
                fobj=None, feval=None, init_model=None,
                feature_name='auto', categorical_feature='auto',
                early_stopping_rounds=None, evals_result=None,
                verbose_eval=True,
                keep_training_booster=False, callbacks=None)
t1 = time.time()

print('gpu version elapsed time: {}'.format(t1 - t0))


# Parameters for the CPU run.
params = {'max_bin': 63,
          'num_leaves': 255,
          'learning_rate': 0.1,
          'tree_learner': 'serial',
          'task': 'train',
          'is_training_metric': 'false',
          'min_data_in_leaf': 1,
          'min_sum_hessian_in_leaf': 100,
          'ndcg_eval_at': [1, 3, 5, 10],
          'sparse_threshold': 1.0,
          'device': 'cpu'}

t0 = time.time()
gbm = lgb.train(params, train_set=dtrain, num_boost_round=10,
                valid_sets=None, valid_names=None,
                fobj=None, feval=None, init_model=None,
                feature_name='auto', categorical_feature='auto',
                early_stopping_rounds=None, evals_result=None,
                verbose_eval=True,
                keep_training_booster=False, callbacks=None)
t1 = time.time()

print('cpu version elapsed time: {}'.format(t1 - t0))
The timing results show that the GPU version is indeed faster than the CPU version.
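
Since the two runs differ only in the device settings, the same comparison can also be written more compactly by sharing one base parameter dict and switching the device in a loop (a sketch of an alternative, not the original benchmark script):

import time
import lightgbm as lgb

# Shared base parameters; only the device changes between the two runs.
base_params = {'max_bin': 63, 'num_leaves': 255, 'learning_rate': 0.1,
               'min_data_in_leaf': 1, 'min_sum_hessian_in_leaf': 100}
dtrain = lgb.Dataset('data/higgs.train')

for device in ('gpu', 'cpu'):
    t0 = time.time()
    lgb.train({**base_params, 'device': device}, dtrain, num_boost_round=10)
    print('{} elapsed time: {:.1f}s'.format(device, time.time() - t0))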

 

