- Type: Bug
- Resolution: Won't Fix
- Priority: Major - P3
- Affects Version/s: None
- Component/s: None
- Environment: Affects all environments.
I feel that compatibility between cursors and multiprocessing Queues needs to be implemented. I have a huge file to process and I wanted to make use of multiprocessing.
Even if you can only give me a hack(ish) solution for now, it would work for me. Any help is appreciated.
To replicate this, you can run the Python code below.
import multiprocessing
from multiprocessing import Queue

from pymongo import MongoClient

m = MongoClient('10.14.100.246', 27017)
cursors = [m['test']['test'].find()]

queue = Queue()
for t in cursors:
    # Putting a Cursor on a multiprocessing Queue requires pickling it,
    # which is what fails here.
    queue.put(t)


class Worker(multiprocessing.Process):
    def __init__(self, inqueue):
        super(Worker, self).__init__()
        self.inqueue = inqueue

    def run(self):
        try:
            t = self.inqueue.get(False)
            print('{}'.format(t))
        except multiprocessing.queues.Empty:
            pass


workers = [Worker(queue)]
for w in workers:
    w.start()
for w in workers:
    w.join()
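For reference, here is a rough sketch of the kind of workaround I have in mind: instead of putting Cursor objects on the Queue, put plain picklable (skip, limit) ranges and let each worker create its own MongoClient and cursor after it starts. The host, database, and collection names are the same as in the repro above; the batch size, worker count, and the use of count_documents are my own assumptions (it needs a PyMongo version that has count_documents), so treat this as a sketch rather than a recommended pattern.

import multiprocessing
from multiprocessing import Queue

from pymongo import MongoClient


class Worker(multiprocessing.Process):
    def __init__(self, inqueue):
        super(Worker, self).__init__()
        self.inqueue = inqueue

    def run(self):
        # Create the client inside the child process; clients and cursors
        # are not shared or pickled across processes.
        client = MongoClient('10.14.100.246', 27017)
        collection = client['test']['test']
        while True:
            job = self.inqueue.get()
            if job is None:        # sentinel: no more work
                break
            skip, limit = job      # plain tuples pickle fine
            for doc in collection.find().skip(skip).limit(limit):
                print('{}'.format(doc))


if __name__ == '__main__':
    batch_size = 1000              # assumed batch size, tune as needed
    num_workers = 4                # assumed worker count

    queue = Queue()
    total = MongoClient('10.14.100.246', 27017)['test']['test'].count_documents({})

    # Enqueue picklable (skip, limit) ranges instead of cursors.
    for skip in range(0, total, batch_size):
        queue.put((skip, batch_size))
    for _ in range(num_workers):
        queue.put(None)            # one sentinel per worker

    workers = [Worker(queue) for _ in range(num_workers)]
    for w in workers:
        w.start()
    for w in workers:
        w.join()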