vibhorkum / pg_background

GNU General Public License v3.0

ERROR: too many dynamic shared memory segments #4

Closed trourance closed 4 years ago

trourance commented 6 years ago

Hi,

I'm trying to use this extension in PostgreSQL 9.5.7 to import bulk data.

After some time running, I get the following error message:

ERROR: too many dynamic shared memory segments
CONTEXT: SQL statement "SELECT pg_background_launch(v_query)"

danigosa commented 6 years ago

Same error in 9.6.6 when running medium/large queries processing several tens of thousands of rows.

vibhorkum commented 6 years ago

Recently, I have made some changes to support bigger data sets. Can you please try and let me know if it works for you?

If you receive any error message, please share a test case I can use to reproduce and fix the issue.

Thanks, Vibhor

trourance commented 6 years ago

Hi,

I'll try asap and let you know, but I'm very busy at the moment.


vibhorkum commented 6 years ago

Any update on this?

Bombsch commented 6 years ago

Hi I'm currently running into the same issue. My test case is the following:

DO $do$
BEGIN
    FOR i IN 1..1000 LOOP
        PERFORM pg_background_launch('INSERT INTO progress(operation, user_id, progress) VALUES (1, 1, 1);');
    END LOOP;
END
$do$;

I'm basically only inserting rows into a table, and after some runs of the loop the error occurs.

Bombsch commented 6 years ago

Also, it doesn't always happen the first time I run this code; sometimes it happens on the 2nd run as well.

trourance commented 6 years ago

I can confirm the issue is still there with the latest version, and for me it happens every time using the following test case:

DO $do$
DECLARE
    v_query TEXT;
BEGIN
    FOR i IN 1..1000 LOOP
        v_query := 'INSERT INTO history (nb, pid, cid) VALUES (' || i || ',' || i || ',' || i || ')';
        PERFORM pg_background_launch(v_query);
    END LOOP;
END
$do$;

Btw, I have to set max_worker_processes = 128 for the test case.

FLandgraf commented 5 years ago

Is there any update on this? I am experiencing the same issue too. Would be great if this could be fixed.

angrocode commented 4 years ago

It works for me:

SELECT INTO id_background pg_background_launch('INSERT INTO ....');
PERFORM pg_sleep(0.010);
PERFORM pg_background_detach(id_background);

or

SELECT INTO id_background pg_background_launch('INSERT INTO objects(id) VALUES ('||id_object||');');
LOOP
    PERFORM id FROM objects WHERE id = id_object LIMIT 1;
    EXIT WHEN FOUND;
END LOOP;
PERFORM pg_background_detach(id_background);

pg_background_result should also work; you need to fetch the result, which releases the queue.
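For reference, a minimal sketch of the failing loop rewritten to drain each worker's result with pg_background_result (this assumes the pg_background extension is installed and uses the same hypothetical history table as the test case above; reading the result is what lets the worker's dynamic shared memory segment be released, so segments no longer pile up across iterations):

DO $do$
DECLARE
    v_pid INTEGER;
BEGIN
    FOR i IN 1..1000 LOOP
        -- launch the background worker and remember its pid
        v_pid := pg_background_launch(
            format('INSERT INTO history (nb, pid, cid) VALUES (%s, %s, %s)', i, i, i));
        -- block until the worker finishes and consume its command tag,
        -- releasing the shared memory message queue
        PERFORM * FROM pg_background_result(v_pid) AS (result TEXT);
    END LOOP;
END
$do$;

Note this waits for each INSERT before launching the next, so it serializes the work; if you want the inserts to run concurrently, collect the pids first and call pg_background_result (or pg_background_detach) on each afterwards.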

vibhorkum commented 4 years ago

Yeah. As @angrocode mentioned, you should use either the pg_background_detach or pg_background_result function in your procedures. Thank you, @angrocode!

Closing this issue.