Since this issue is still open, I thought I would comment on my recent experience with multiple processes accessing the same file-based queue, i.e., where one or more processes add to the queue while a daemon server reads from it. While investigating why queue items were sporadically lost, I realized that the Queue.info attribute, which stores a pointer to the next queue item, was not being updated in the server process. In the past I had worked around this by re-initializing the Queue whenever it was empty, without ever understanding why that was needed. I now realize that the server's queue should reload the info file before each get by calling Queue._loadinfo(). I subclassed the file-based queue to do this (see the sketch below), and it seems to work reliably, although I can't guarantee there won't be race conditions causing the odd failure. I mitigate those by locking the files during each put and get call.
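For reference, here is a minimal sketch of that subclass. It assumes persist-queue's file-based Queue and that the private _loadinfo() method returns the freshly read info dict (a private API that may change between releases), and it uses the third-party filelock package for the cross-process locking; the names SharedFileQueue and queue.lock are just placeholders.

```python
import os

import persistqueue
from filelock import FileLock  # third-party: pip install filelock


class SharedFileQueue(persistqueue.Queue):
    """File-based queue shared between a writer and a reader process.

    A sketch of the workaround described above, not part of
    persist-queue itself.
    """

    def __init__(self, path, **kwargs):
        super().__init__(path, **kwargs)
        # One lock file next to the queue data; the name is arbitrary.
        self._file_lock = FileLock(os.path.join(path, 'queue.lock'))

    def put(self, item, block=True, timeout=None):
        with self._file_lock:
            super().put(item, block=block, timeout=timeout)

    def get(self):
        with self._file_lock:
            # Reload the on-disk pointer so items enqueued by other
            # processes become visible to this reader.
            self.info = self._loadinfo()
            # Non-blocking on purpose: blocking here while holding the
            # file lock would deadlock any writer waiting to acquire it.
            return super().get(block=False)
```

The reader then calls get() in a loop and catches persistqueue.Empty between polls; holding the same FileLock in every process is what serializes the on-disk put/get updates.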
I wonder if this could be a setting when initializing the queue, i.e., one that forces a reload of Queue.info before each get. I can see that it might affect performance, but I don't see how file-based queues could be used by multiple processes without it. My system works, but I would prefer not to call private functions, so it would be better if this were handled internally.
Hi,
Thanks for the great work.
I think this is a question rather than an issue, but I couldn't find a better place to post it.
Can one SQLite db file be used by more than one process concurrently via SQLiteQueue?
I don't see any reason why it couldn't be, but I can't find any mention of multiprocessing, and wanted to be sure.
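For what it's worth, the pattern in question would look roughly like the sketch below. This is only an illustration of the usage being asked about, not a confirmation that it is safe across processes: it assumes SQLiteQueue(path, auto_commit=True) and get(block=False) raising persistqueue.Empty, as in recent persist-queue releases, and the path 'shared-queue' is just a placeholder.

```python
import multiprocessing
import time

import persistqueue


def producer(path):
    # Each process opens its own queue handle on the same db file.
    q = persistqueue.SQLiteQueue(path, auto_commit=True)
    for i in range(10):
        q.put('item-%d' % i)


def consumer(path, expected):
    q = persistqueue.SQLiteQueue(path, auto_commit=True)
    received = 0
    while received < expected:
        try:
            item = q.get(block=False)
        except persistqueue.Empty:
            time.sleep(0.1)  # the producer may not have caught up yet
            continue
        print(item)
        received += 1


if __name__ == '__main__':
    path = 'shared-queue'  # placeholder directory for the SQLite db
    p = multiprocessing.Process(target=producer, args=(path,))
    c = multiprocessing.Process(target=consumer, args=(path, 10))
    p.start()
    c.start()
    p.join()
    c.join()
```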