I am using SqlBulkCopy to insert batches of records to a server. If one record fails because of a unique constraint, I don't want to rollback the whole batch, but I want to do an update of the existing record instead.
I was hoping I could log the failing records' key values and then, once the bulk insert is finished, go back and address them individually.
However, I can't see any way of telling SqlBulkCopy not to roll back the transaction on failure, nor of getting any information about which record failed.
I could check whether each record exists before adding it to the SqlBulkCopy DataTable, but that would add considerable overhead.
I have checked the answers here, which suggest bulk-inserting into a staging table and then using a stored procedure to do the insert/update into the real table. But I think that sproc would need to process each record individually to either insert a new row or update the existing one, which would be very time-consuming.
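For reference, the staging-table reconciliation those answers describe does not have to touch rows one at a time; it can be done in two set-based statements. A sketch, assuming a hypothetical target table dbo.Target(Id, Value) with Id as the unique key, and a temp table that SqlBulkCopy loads first:

```sql
-- Staging table matching the target's shape; point SqlBulkCopy here instead.
CREATE TABLE #Staging (Id INT NOT NULL, Value NVARCHAR(100) NULL);

-- After the bulk copy completes, update rows whose keys already exist...
UPDATE T
SET    T.Value = S.Value
FROM   dbo.Target T
INNER JOIN #Staging S ON S.Id = T.Id;

-- ...then insert only the rows that are genuinely new.
INSERT INTO dbo.Target (Id, Value)
SELECT S.Id, S.Value
FROM   #Staging S
LEFT JOIN dbo.Target T ON T.Id = S.Id
WHERE  T.Id IS NULL;
```

Both statements operate on the whole batch at once, so no per-row cursor or loop is involved.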
Any other tips or has nothing changed in the last 4 years?
I wouldn't recommend using the MERGE statement. It has known issues (see Use Caution with SQL Server's MERGE Statement, the first of many such write-ups), and personally I find it difficult to follow; it offers little, if anything, over a simple insert/update pair (not-so-sweet syntactic sugar).
INSERT INTO A SELECT B.* FROM #B B LEFT OUTER JOIN A ON A.ID = B.ID WHERE A.ID IS NULL
UPDATE A SET BLAH = 'BLAH' FROM A INNER JOIN #B B ON B.ID = A.ID
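If the two statements need to behave atomically, one option is to wrap them in a single transaction and run the update first, so the insert never re-touches rows added in the same batch. A sketch using the same names as above (ID and BLAH stand in for your real key and data columns):

```sql
BEGIN TRANSACTION;

-- Update existing rows first, pulling the new values from the staged batch.
UPDATE A
SET    A.BLAH = B.BLAH
FROM   A
INNER JOIN #B B ON B.ID = A.ID;

-- Then insert only the keys that are still missing.
INSERT INTO A (ID, BLAH)
SELECT B.ID, B.BLAH
FROM   #B B
LEFT OUTER JOIN A ON A.ID = B.ID
WHERE  A.ID IS NULL;

COMMIT TRANSACTION;
```

Note that under the default READ COMMITTED isolation a concurrent writer could still insert a conflicting key between the two statements; if that is a real risk, stricter isolation or a lock hint on A may be needed.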
If you are set on a tool rather than pure SQL, you may want to look into SSIS, which can handle what you are looking for.
Do not use SqlBulkCopy.
Insert into a temporary table (which also avoids SqlBulkCopy's horrific locking behavior), then MERGE into the final table with the proper rules.
Simple, done, this is how I do all my bulk updates.
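A sketch of that MERGE step, assuming a hypothetical staging table #Staging(Id, Value) and final table dbo.Target(Id, Value) keyed on Id:

```sql
-- HOLDLOCK guards against the race conditions MERGE is known for.
MERGE dbo.Target WITH (HOLDLOCK) AS T
USING #Staging AS S
    ON T.Id = S.Id
WHEN MATCHED THEN
    UPDATE SET T.Value = S.Value
WHEN NOT MATCHED BY TARGET THEN
    INSERT (Id, Value) VALUES (S.Id, S.Value);
```

MERGE statements must be terminated with a semicolon, and the WHEN clauses can be extended (e.g. WHEN NOT MATCHED BY SOURCE) if your rules require deletes as well.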