Simple Multithread For Loop In Python
Solution 1:
The easiest way is with multiprocessing.dummy (which uses threads instead of processes) and a Pool:
import multiprocessing.dummy as mp

def do_print(s):
    print(s)

if __name__ == "__main__":
    p = mp.Pool(4)
    p.map(do_print, range(0, 10))  # range(0, 1000) if you want to replicate your example
    p.close()
    p.join()
Maybe you want to try real multiprocessing, too, if you want to better utilize multiple CPUs, but there are several caveats and guidelines to follow then. Possibly other methods of Pool would better suit your needs, depending on what you are actually trying to do.
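As a rough sketch (not part of the original answer), real multiprocessing means swapping multiprocessing.dummy for multiprocessing; the Pool API stays essentially the same, but the __main__ guard becomes mandatory on platforms that spawn fresh interpreters:

# Hedged sketch: process-based variant of the same pattern.
import multiprocessing as mp

def do_print(s):
    print(s)

if __name__ == "__main__":
    # Child processes re-import this module, so the guard is required.
    with mp.Pool(4) as p:
        p.map(do_print, range(0, 10))

Pool also offers imap, imap_unordered, and apply_async, which may fit better when results should arrive lazily or out of order.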
Solution 2:
You'll have to do the splitting manually:
import threading

def ThFun(start, stop):
    for item in range(start, stop):
        print(item)

for n in range(0, 1000, 100):
    stop = n + 100 if n + 100 <= 1000 else 1000
    threading.Thread(target=ThFun, args=(n, stop)).start()
This code uses multithreading, which means that everything will be run within a single Python process (i.e. only one Python interpreter will be launched).
Multiprocessing, discussed in the other answer, means running code in several Python interpreters (several processes, not threads). It can make use of all the CPU cores available, so it is useful when you're focusing on the speed of your code (print a ton of numbers until the terminal hates you!), rather than simply on running tasks in parallel.
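For illustration only (not from the original answer), the same manual splitting could be done with processes instead of threads; multiprocessing.Process has essentially the same interface as threading.Thread:

# Hedged sketch: the same chunked loop, with one worker process per chunk.
import multiprocessing

def ThFun(start, stop):
    for item in range(start, stop):
        print(item)

if __name__ == "__main__":
    workers = []
    for n in range(0, 1000, 100):
        stop = n + 100 if n + 100 <= 1000 else 1000
        p = multiprocessing.Process(target=ThFun, args=(n, stop))
        p.start()
        workers.append(p)
    # Wait for all chunks to finish.
    for p in workers:
        p.join()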
Solution 3:
Since Python 3.2, the concurrent.futures standard library module provides primitives to concurrently map a function across an iterable. Since map and for are closely related, this makes it easy to convert a for loop into a multi-threaded or multi-processed loop:
from concurrent.futures import ThreadPoolExecutor

with ThreadPoolExecutor() as executor:
    executor.map(print, range(0, 1000))
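As a tentative extension of that answer, the process-based version would use ProcessPoolExecutor instead; the mapped function and its arguments must be picklable, and the __main__ guard is needed on platforms that spawn fresh interpreters:

# Hedged sketch: the same map-style loop, spread across worker processes.
from concurrent.futures import ProcessPoolExecutor

def do_print(s):
    print(s)

if __name__ == "__main__":
    with ProcessPoolExecutor() as executor:
        # list() forces iteration so any exception raised in a worker surfaces here.
        list(executor.map(do_print, range(0, 1000)))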