Let's say we have a class called Dog:
    class Dog():
        def __init__(self, name, age):
            self.name = name
            self.age = age
            self.fleas = []

Now let's have a list of names,
    names = ["bob", "joe", "dave"]

and create a list of dogs by doing the following:
    dogs = [Dog(name, index) for index, name in enumerate(names)]
    # creates a list "dogs" with 3 Dog objects in it, named bob, joe, and dave,
    # with ages 0, 1, and 2 respectively.

Now I have a dictionary of fleas,
    global_fleas = {"bob": ["flea_43"], "joe": ["flea_20"], "dave": ["flea_3"]}

and there is a separate method flea_mod() that modifies the dictionary so that the flea for each dog is changed to a random flea up to flea_50. There is one flea per dog in the dictionary.
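flea_mod() itself isn't shown above; here is a minimal sketch of what it could look like, assuming the fleas are just string labels flea_1 through flea_50 (an assumption for illustration, not code from the question):

    import random

    def flea_mod():
        # hypothetical implementation: give each dog a new random flea, flea_1 .. flea_50
        for name in global_fleas:
            global_fleas[name] = ["flea_%d" % random.randint(1, 50)]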
I can append the flea in global_fleas to the corresponding dog by doing the following:
    def dirty_dog(dog):
        dog.fleas.append(global_fleas[dog.name])

Here's the kicker: I want to multiprocess this loop. Here's what I have right now:
    import multiprocessing

    while True:
        flea_mod()  # randomizes the flea for each dog in the global_fleas dictionary
        pool = multiprocessing.Pool(processes=len(dogs))
        [pool.apply_async(dirty_dog, (dog,)) for dog in dogs]
        pool.close()
        pool.join()

The problem is that each dog object does not retain its fleas list each time the loop runs. I want the fleas list of each dog to be retained at the end of each loop, so that after 2 loops each dog has 2 fleas, and so on. Ideas? I'm guessing I'll have to pickle something.
The multiprocessing docs advise against shared state when possible, but you can use a managed dictionary to achieve your goal:
http://docs.python.org/2/library/multiprocessing.html#sharing-state-between-processes
(See the server process manager section.)
That way you alter a single managed object from your core logic and the worker processes see the updated version. You'll have to experiment in your own code to find the best sharing setup (if any!).
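The reason the fleas are lost in the original loop is that apply_async pickles each Dog and sends a copy to the worker, so dirty_dog() mutates the copy rather than the object in the parent process. Here is a minimal sketch of the managed-dictionary idea, assuming the Dog class, dogs list, global_fleas, and flea_mod() from the question are already defined at module level; the dog_fleas name is illustrative, not part of any API:

    import multiprocessing

    def dirty_dog(name, global_fleas, dog_fleas):
        # runs in a worker: reassign (don't just append) so the managed dict sees the change
        dog_fleas[name] = dog_fleas[name] + [global_fleas[name]]

    if __name__ == "__main__":
        manager = multiprocessing.Manager()
        # managed dict shared between the parent and the workers, keyed by dog name
        dog_fleas = manager.dict((dog.name, []) for dog in dogs)

        for _ in range(2):          # two passes instead of `while True`, for demonstration
            flea_mod()              # re-randomize global_fleas as before
            pool = multiprocessing.Pool(processes=len(dogs))
            for dog in dogs:
                pool.apply_async(dirty_dog, (dog.name, global_fleas, dog_fleas))
            pool.close()
            pool.join()

        # copy the accumulated fleas back onto the Dog objects in the parent
        for dog in dogs:
            dog.fleas = list(dog_fleas[dog.name])
        # each dog now has 2 fleas, one per pass

In real code you would also keep the AsyncResult objects returned by apply_async and call get() on them, so that any exception raised inside a worker is not silently swallowed.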