
I have been using this query (successfully) to move records from one table to another in a Postgres database.

WITH moved_rows AS 
( 
    DELETE FROM tableB
    RETURNING *
) 
INSERT INTO tableA 
SELECT * FROM moved_rows; 

For relatively small row counts this works fine, but as the number of rows in tableB grows I need to move a few million rows each time. I tried this:

WITH moved_rows AS 
( 
    DELETE FROM tableB
    RETURNING *
) 
INSERT INTO tableA 
SELECT * FROM moved_rows LIMIT 100000; 

This inserts 100000 records into tableA, but deletes ALL records from tableB.

How can I transfer a fixed number of records from one table to another? Obviously I need to ensure that only the rows successfully inserted into tableA actually get deleted from tableB. I have no particular selection criteria for which records get moved; I just want to limit the time and resources used by each transaction.

UPDATE - I've tried both of these variants, and neither works:

WITH moved_rows AS 
( 
    DELETE FROM tableB LIMIT 100000
    RETURNING *
) 
INSERT INTO tableA 
SELECT * FROM moved_rows;

Or

WITH moved_rows AS 
( 
    DELETE FROM tableB 
    RETURNING * LIMIT 100000
) 
INSERT INTO tableA 
SELECT * FROM moved_rows;

Both return a syntax error at or near "LIMIT".

ConanTheGerbil

1 Answer

WITH moved_rows AS (
    DELETE FROM <original_table> a
    USING <other_table> b
    WHERE <condition>
    RETURNING a.* -- or list specific columns
)
INSERT INTO <target_table>
SELECT * FROM moved_rows;
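
As written, this template still deletes every row that matches <condition> - it doesn't cap the batch size by itself, because PostgreSQL's DELETE has no LIMIT clause. A common workaround, sketched here with the table names from the question and assuming any rows will do, is to select a batch of ctids first and join the DELETE against it:

WITH batch AS 
( 
    -- no ORDER BY: any 100000 rows will do
    SELECT ctid FROM tableB LIMIT 100000
), 
moved_rows AS 
( 
    DELETE FROM tableB b
    USING batch
    WHERE b.ctid = batch.ctid
    RETURNING b.*
) 
INSERT INTO tableA 
SELECT * FROM moved_rows;

The whole statement runs as one transaction, so the deletes roll back if the insert fails, and it can simply be re-run until tableB is empty.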
Roman