trourance opened this issue (closed 4 years ago):

Hi,
I'm trying to use this extension in PostgreSQL 9.5.7 to import bulk data.
After some time running, I get the following error message:

ERROR: too many dynamic shared memory segments
CONTEXT: SQL statement "SELECT pg_background_launch(v_query)"
Same error in 9.6.6 when running medium/large queries that process several tens of thousands of rows.
Recently, I have made some changes to support bigger data sets. Can you please try and let me know if it works for you?
If you receive any error message, please share a test case I can use to reproduce and fix the issue.
Thanks, Vibhor
Hi,
I'll try asap and let you know, but I'm very busy at the moment.
Any update on this?
Hi, I'm currently running into the same issue. My test case is the following:

DO $do$
BEGIN
  FOR i IN 1..1000 LOOP
    PERFORM pg_background_launch('INSERT INTO progress(operation, user_id, progress) VALUES (1, 1, 1);');
  END LOOP;
END
$do$;
I'm basically only inserting rows into a table, and after some iterations of the loop the error occurs.
It also doesn't always happen the first time I run this code; sometimes it only shows up on the second run.
I can confirm the issue is still there with the latest version, and for me it happens every time with the following test case:
DO $do$
DECLARE
  v_query TEXT;
BEGIN
  FOR i IN 1..1000 LOOP
    v_query := 'INSERT INTO history (nb, pid, cid) VALUES (' || i || ',' || i || ',' || i || ')';
    PERFORM pg_background_launch(v_query);
  END LOOP;
END
$do$;
By the way, I have to set max_worker_processes = 128 for this test case.
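For reference, a minimal postgresql.conf sketch of that setting (the default value and restart requirement are standard PostgreSQL behavior, not specific to this issue):

# postgresql.conf
# Each pg_background_launch() starts a dynamic background worker, and the
# number of concurrent workers is capped by this setting (default 8).
# Changing it requires a server restart.
max_worker_processes = 128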
Is there any update on this? I am experiencing the same issue too. Would be great if this could be fixed.
It works for me:

-- Launch the worker, give it a moment to run, then detach to release
-- the dynamic shared memory segment:
SELECT INTO id_background pg_background_launch('INSERT INTO ....');
PERFORM pg_sleep(0.010);
PERFORM pg_background_detach(id_background);
or
SELECT INTO id_background pg_background_launch('INSERT INTO objects(id) VALUES (' || id_object || ');');
-- Poll until the background insert becomes visible, then detach:
LOOP
  PERFORM id FROM objects WHERE id = id_object LIMIT 1;
  EXIT WHEN FOUND;
END LOOP;
PERFORM pg_background_detach(id_background);
pg_background_result should also work; you need to fetch the result so that the queue is released.
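For example, a sketch of the earlier test case rewritten to consume each result (note that pg_background_result waits for the worker to finish, so the loop becomes effectively sequential, but each worker's shared memory segment is freed before the next launch):

DO $do$
DECLARE
  v_pid INTEGER;
BEGIN
  FOR i IN 1..1000 LOOP
    v_pid := pg_background_launch(format(
      'INSERT INTO history (nb, pid, cid) VALUES (%s, %s, %s)', i, i, i));
    -- Consuming the result releases the worker's dynamic shared memory
    -- segment, so segments no longer pile up across iterations.
    PERFORM * FROM pg_background_result(v_pid) AS (result TEXT);
  END LOOP;
END
$do$;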
Yeah. As @angrocode mentioned, you should call either pg_background_detach or pg_background_result in your procedures. Thank you, @angrocode!
Closing this issue.