python - Train two models concurrently
All I need is to train two regression models (using scikit-learn) on the same data at the same time, using different cores. I've tried to figure it out myself using Process, without success.
    from multiprocessing import Process
    from sklearn.ensemble import GradientBoostingRegressor

    gb1 = GradientBoostingRegressor(n_estimators=10)
    gb2 = GradientBoostingRegressor(n_estimators=100)

    def train_model(model, data, target):
        model.fit(data, target)

    live_data  # pandas DataFrame object
    target     # numpy array object

    p1 = Process(target=train_model, args=(gb1, live_data, target))  # same data
    p2 = Process(target=train_model, args=(gb2, live_data, target))  # same data
    p1.start()
    p2.start()
If I run the code above, I get the following error while trying to start the p1 process.
    Traceback (most recent call last):
      File "<pyshell#28>", line 1, in <module>
        p1.start()
      File "C:\Python27\lib\multiprocessing\process.py", line 130, in start
        self._popen = Popen(self)
      File "C:\Python27\lib\multiprocessing\forking.py", line 274, in __init__
        to_child.close()
    IOError: [Errno 22] Invalid argument
I'm running the script (in IDLE) on Windows. Any suggestions on how I should proceed?
OK... after hours spent trying to get this working, I'll post my solution. First thing: if you're on Windows and you're using the interactive interpreter, you need to encapsulate your code under a 'main' condition, with the exception of function definitions and imports. This is because, on Windows, a new process is spawned rather than forked, so the child re-imports the module; without the guard it would re-execute the top-level code and go into a loop.
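To illustrate just the guard, here is a minimal sketch of the Process-based code from the question with that fix applied (run it as a saved script, not from IDLE; the random arrays are stand-ins for the real data):

    from multiprocessing import Process
    from sklearn.ensemble import GradientBoostingRegressor
    import numpy as np

    def train_model(model, data, target):
        model.fit(data, target)

    if __name__ == '__main__':
        live_data = np.random.rand(100, 4)  # stand-in for the real DataFrame
        target = np.random.rand(100)        # stand-in for the real target array

        gb1 = GradientBoostingRegressor(n_estimators=10)
        gb2 = GradientBoostingRegressor(n_estimators=100)

        p1 = Process(target=train_model, args=(gb1, live_data, target))
        p2 = Process(target=train_model, args=(gb2, live_data, target))
        p1.start()
        p2.start()
        p1.join()
        p2.join()
        # Note: each child fits a pickled copy of its model, so the fitted
        # state never reaches the parent's gb1/gb2. That is why the full
        # solution below uses a Pool and returns the fitted models instead.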
My full solution is below:
    from sklearn.ensemble import GradientBoostingRegressor
    from multiprocessing import Pool
    from itertools import repeat

    def train_model(params):
        # Pool.map_async passes a single argument to the worker, so pack
        # model, data, and target into one tuple and unpack it here
        model, data, target = params
        model.fit(data, target)
        return model

    if __name__ == '__main__':
        gb1 = GradientBoostingRegressor(n_estimators=10)
        gb2 = GradientBoostingRegressor(n_estimators=100)

        live_data  # pandas DataFrame object
        target     # numpy array object

        po = Pool(2)  # number of processes to spawn
        gb1, gb2 = po.map_async(
            train_model,
            zip([gb1, gb2], repeat(live_data), repeat(target))  # zip into one iterable
        ).get()  # start the processes and collect the fitted models
        po.terminate()  # kill the spawned processes
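As a side note (not part of the original answer), on Python 3 the same pattern can be written with concurrent.futures, which manages the pool lifecycle for you and lets you pass several iterables without zipping. A minimal sketch, again with stand-in arrays:

    from concurrent.futures import ProcessPoolExecutor
    from sklearn.ensemble import GradientBoostingRegressor
    import numpy as np

    def train_model(model, data, target):
        model.fit(data, target)
        return model

    if __name__ == '__main__':
        live_data = np.random.rand(100, 4)  # stand-in for the real DataFrame
        target = np.random.rand(100)        # stand-in for the real target array

        gb1 = GradientBoostingRegressor(n_estimators=10)
        gb2 = GradientBoostingRegressor(n_estimators=100)

        # each call train_model(model, live_data, target) runs in a worker process
        with ProcessPoolExecutor(max_workers=2) as executor:
            gb1, gb2 = executor.map(train_model, [gb1, gb2],
                                    [live_data] * 2, [target] * 2)

The with block waits for the workers and shuts the pool down cleanly, which replaces the explicit terminate() call in the Pool version.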