Python Multiprocessing And Serializing Data
I am running a script on a school computer using the multiprocessing module. I am serializing the data frequently. It can be summarized by the code below:
import multiprocessing
Solution 1:
Your program looks pretty good. In this case IOError just means "bad things happened": the entire set of simulated data grew too large for the Python process, so it exited with that mysterious message.
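To picture why that happens: if every result is kept in a Python list and pickled in one go at the end of the run, memory grows with the length of the whole simulation. The question's code is truncated, so the sketch below is only a guess at that structure, not the asker's actual script:

import pickle

def simulation_all_in_memory(j, n=100000):
    # Hypothetical version of the failing approach: every datum stays in a
    # list until the end, so RAM use grows with the whole run before the
    # single dump at the bottom ever happens.
    data = []
    for k in range(n):
        data.append({'result': k})        # entire history kept in memory
    with open('data%d.pkl' % j, 'wb') as dataf:
        pickle.dump(data, dataf)          # one huge pickle at the very end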
A couple of improvements in the following version:
Once some data has been produced, append it to a data file, then zap it from memory. The program should have roughly the same RAM use over time, rather than using up more and more, then crashing.
Conveniently, if a file is a concatenation of pickle objects, we can easily print out each one later for further examination. Example code is shown below.
Have fun!
import multiprocessing as mp
import glob, time, pickle, sys

def simulation(j):
    # Produce one datum at a time and append it to this worker's pickle
    # file, so nothing accumulates in memory.
    for k in range(10):
        datum = {'result': k}
        time.sleep(1)
        with open('data%d.pkl' % j, 'ab') as dataf:
            pickle.dump(datum, dataf)

def show():
    # A file written this way is a concatenation of pickles; keep calling
    # pickle.load() until EOFError to read every object back.
    for datname in glob.glob('data*.pkl'):
        try:
            print('*' * 8, datname)
            with open(datname, 'rb') as datf:
                while True:
                    print(pickle.load(datf))
        except EOFError:
            pass

def do_sim():
    processes = []
    processes.append(mp.Process(target=simulation, args=(1,)))
    processes.append(mp.Process(target=simulation, args=(2,)))
    for process in processes:
        process.start()
    for process in processes:
        process.join()

if __name__ == '__main__':
    if '--show' in sys.argv:
        show()
    else:
        do_sim()
output of "python ./msim.py --show"
******** data2.pkl
{'result': 0}
{'result': 1}
{'result': 2}
{'result': 3}
{'result': 4}
{'result': 5}
{'result': 6}
{'result': 7}
{'result': 8}
{'result': 9}
******** data1.pkl
{'result': 0}
{'result': 1}
{'result': 2}
{'result': 3}
{'result': 4}
{'result': 5}
{'result': 6}
{'result': 7}
{'result': 8}
{'result': 9}
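The show() function above only prints the results. If you later want them back in memory for analysis, a small generator along these lines should work; read_all is a hypothetical helper, not part of the program above:

import glob, pickle

def read_all(pattern='data*.pkl'):
    # Yield every object from every concatenated-pickle data file.
    for datname in glob.glob(pattern):
        with open(datname, 'rb') as datf:
            while True:
                try:
                    yield pickle.load(datf)
                except EOFError:
                    break

results = list(read_all())   # e.g. [{'result': 0}, {'result': 1}, ...]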