
I'm trying to create a function which inserts rows from one database table into another; this is the part of the function that goes wrong:

BEGIN
  INSERT INTO suppliers ("id", "name", "url", "logo", "clicks", "order")
    SELECT * FROM dblink('dbname=linkeddatabase',
                         'SELECT id, "name", url, logo, clicks, "order" FROM suppliers')
      AS supp(id int, "name" character varying, "url" character varying,
              "logo" character varying, "clicks" int, "order" int);
EXCEPTION WHEN unique_violation THEN
  NULL;  -- do nothing (a PL/pgSQL handler needs at least one statement)
END;

Problem:

It looks like it stops as soon as it reaches a duplicated row, because once the function has finished, my table contains the same number of rows as before... (I suppose the whole INSERT ... SELECT counts as a single statement, so the first unique_violation aborts the entire insert before the handler swallows the error.)

What I want: if it needs to insert 1000 rows, of which the first 100 are duplicates, it should still insert the remaining 900 rows. Any ideas to fix this within the current function? Or do I just need to create some loop, like the sketch below?
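
For reference, the row-by-row loop I have in mind would look roughly like this: every insert gets its own exception block, so a duplicate only skips that single row. This is only a sketch, reusing the connection string and column list from above, and it would be noticeably slower than a set-based insert:

DECLARE
  r record;
BEGIN
  -- Pull the remote rows, then insert them one at a time.
  FOR r IN
    SELECT * FROM dblink('dbname=linkeddatabase',
                         'SELECT id, "name", url, logo, clicks, "order" FROM suppliers')
      AS supp(id int, "name" character varying, "url" character varying,
              "logo" character varying, "clicks" int, "order" int)
  LOOP
    BEGIN
      INSERT INTO suppliers ("id", "name", "url", "logo", "clicks", "order")
      VALUES (r.id, r."name", r.url, r.logo, r.clicks, r."order");
    EXCEPTION WHEN unique_violation THEN
      NULL;  -- skip this duplicate and continue with the next row
    END;
  END LOOP;
END;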

Erik van de Ven
  • INSERT INTO suppliers ... SELECT ... FROM dblink(...) t WHERE NOT EXISTS (SELECT 1 FROM suppliers s WHERE s.id = t.id), or something like that (written out below the comments). See http://dba.stackexchange.com/questions/30499/optimal-way-to-ignore-duplicate-inserts/30554#30554 – András Váczi Feb 04 '15 at 14:43
  • Got it done using a try/catch inside a Python script now. It takes a bit more time because it has to try inserting every row separately, so I added a delta-import functionality which imports only the new records each day. It also needs to update the rows which do exist, so in the catch I send an update query (a guess at that statement is sketched below) – Erik van de Ven Feb 05 '15 at 09:17
  • And how did you find the solution proposed in my answer on the linked question? I see no disadvantage to it, while it is (supposedly) much faster than processing row by row in Python. – András Váczi Feb 05 '15 at 09:19
  • Well, you are right, if I look at your query it does look much, much faster! So I will try your solution asap :) First I have to finish our deadline today. As soon as I've tested your version, I'll let you all know. Guess the update query is gonna be a bit tricky, but I'll try :) – Erik van de Ven Feb 05 '15 at 09:24
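
Written out in full, the anti-join from the first comment would be something along these lines (a sketch reusing the question's layout; it assumes the unique constraint is on id):

-- Set-based insert that only adds ids missing locally, so no duplicate ever raises.
INSERT INTO suppliers ("id", "name", "url", "logo", "clicks", "order")
SELECT t.id, t."name", t.url, t.logo, t.clicks, t."order"
FROM dblink('dbname=linkeddatabase',
            'SELECT id, "name", url, logo, clicks, "order" FROM suppliers')
  AS t(id int, "name" character varying, "url" character varying,
      "logo" character varying, "clicks" int, "order" int)
WHERE NOT EXISTS (SELECT 1 FROM suppliers s WHERE s.id = t.id);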

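The update query mentioned in the delta-import comment is not shown in the thread; a guess at a single-statement version, assuming id is the key and the remote layout is unchanged, could look like this:

-- Refresh rows that already exist locally from the remote copy.
UPDATE suppliers s
SET "name"  = t."name",
    url     = t.url,
    logo    = t.logo,
    clicks  = t.clicks,
    "order" = t."order"
FROM dblink('dbname=linkeddatabase',
            'SELECT id, "name", url, logo, clicks, "order" FROM suppliers')
  AS t(id int, "name" character varying, "url" character varying,
      "logo" character varying, "clicks" int, "order" int)
WHERE s.id = t.id;

Combined with the NOT EXISTS insert above, this behaves like an upsert (PostgreSQL's native INSERT ... ON CONFLICT only arrived later, in 9.5).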