How do we skip errors with SqlBulkCopy?

c# sql sqlbulkcopy sql-server


I am using SqlBulkCopy to upload bulk data into a SQL table. The table contains the column ID as an identity column, so, as you know, it will not allow duplicate records.

Consider that I have 10 rows to insert with SqlBulkCopy, and among those 10 records there are 2 duplicate rows.

My question is: is there any way to skip those duplicate rows while inserting rows using SqlBulkCopy?

5/14/2014 12:05:04 PM

Accepted Answer

SqlBulkCopy has very limited error-handling facilities; by default it doesn't even check constraints.

However, it's fast. Really, really fast.

If you want to work around the duplicate key issue and identify which rows in a batch are duplicates, one option is:

  • Start a transaction.
  • Grab a TABLOCKX on the table, select all current "Hash" values, and put them in a HashSet.
  • Filter out the duplicates and report them.
  • Insert the data.
  • Commit the transaction.

This process works well if you are inserting huge sets and the amount of data already in the table is not too large.
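The steps above can be sketched roughly as follows. This is a hedged sketch, not a complete solution: the table name `dbo.MyTable`, the `Hash` column, and the assumption that a string hash uniquely identifies a row are all placeholders you would adapt to your schema.

```csharp
using System;
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;

static class BulkDedup
{
    // Returns a copy of `rows` with duplicates (by the hypothetical "Hash"
    // column) removed, holding an exclusive table lock inside `tran` so no
    // other writer can slip a duplicate in before our insert commits.
    static DataTable FilterDuplicates(SqlConnection conn, SqlTransaction tran, DataTable rows)
    {
        var existing = new HashSet<string>();
        using (var cmd = new SqlCommand(
            "SELECT Hash FROM dbo.MyTable WITH (TABLOCKX)", conn, tran))
        using (var reader = cmd.ExecuteReader())
            while (reader.Read())
                existing.Add(reader.GetString(0));

        var filtered = rows.Clone();
        foreach (DataRow row in rows.Rows)
        {
            // HashSet.Add returns false if the value was already present,
            // which also catches duplicates within the batch itself.
            if (existing.Add((string)row["Hash"]))
                filtered.ImportRow(row);
            else
                Console.WriteLine($"Skipping duplicate: {row["Hash"]}");
        }
        return filtered;
    }
}
```

You would then run SqlBulkCopy over the filtered table on the same connection and transaction, and commit.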

Can you please expand your question to include the rest of the context of the problem?


Now that I have some more context, here is another way you can go about it:

  • Do the bulk insert into a temp table.
  • Start a serializable transaction.
  • Select all temp rows that are already in the destination table and report on them.
  • Insert the data from the temp table into the real table, performing a left join on the hash and including only the new rows.
  • Commit the transaction.

That process is very light on round trips, and considering your specs it should end up being really fast.
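A minimal sketch of the temp-table approach, again with assumed names (`connectionString`, `sourceTable`, `dbo.MyTable`, the `Hash` column, and the `Col1`/`Col2` columns are all placeholders). Note the temp table must be created and used on the same connection, since `#`-prefixed temp tables are session-scoped:

```csharp
using System;
using System.Data;
using System.Data.SqlClient;

static class StagedBulkInsert
{
    static void Run(string connectionString, DataTable sourceTable)
    {
        using (var conn = new SqlConnection(connectionString))
        {
            conn.Open();

            // 1. Create a staging temp table with the destination's shape,
            //    then bulk copy everything into it, duplicates and all.
            new SqlCommand("SELECT TOP 0 * INTO #Staging FROM dbo.MyTable",
                           conn).ExecuteNonQuery();
            using (var bulk = new SqlBulkCopy(conn))
            {
                bulk.DestinationTableName = "#Staging";
                bulk.WriteToServer(sourceTable);
            }

            // 2-5. In a serializable transaction, insert only the staged
            //      rows whose hash is not already in the real table.
            using (var tran = conn.BeginTransaction(IsolationLevel.Serializable))
            {
                using (var insert = new SqlCommand(@"
                    INSERT INTO dbo.MyTable (Hash, Col1, Col2)
                    SELECT s.Hash, s.Col1, s.Col2
                    FROM #Staging s
                    LEFT JOIN dbo.MyTable t ON t.Hash = s.Hash
                    WHERE t.Hash IS NULL;", conn, tran))
                {
                    int inserted = insert.ExecuteNonQuery();
                    Console.WriteLine(
                        $"{sourceTable.Rows.Count - inserted} duplicate rows skipped.");
                }
                tran.Commit();
            }
        }
    }
}
```

The reporting step (selecting the staged rows that join to existing ones) is omitted for brevity; it is the same join with `WHERE t.Hash IS NOT NULL`.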

6/17/2009 4:51:46 AM

Popular Answer

A slightly different approach than those already suggested: perform the SqlBulkCopy and catch the SqlException thrown:

    Violation of PRIMARY KEY constraint 'PK_MyPK'. Cannot insert duplicate
    key in object 'dbo.MyTable'. The duplicate key value is (17).

You can then remove all items from your source starting from ID 17, the first record that was duplicated. I'm making assumptions here that apply to my circumstances and possibly not yours; i.e. that the duplication is caused by the exact same data from a previously failed SqlBulkCopy that hit SQL/network errors during the upload.
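A sketch of that catch-and-parse pattern. The destination table name, `connectionString`, and `sourceTable` are assumptions, and the regex that pulls the key value out of the message text is fragile by nature (error 2627 is SQL Server's primary-key-violation number, which is more robust to check than the message string):

```csharp
using System;
using System.Data;
using System.Data.SqlClient;
using System.Text.RegularExpressions;

static class RetryAfterDuplicate
{
    static void Run(string connectionString, DataTable sourceTable)
    {
        try
        {
            using (var bulk = new SqlBulkCopy(connectionString))
            {
                bulk.DestinationTableName = "dbo.MyTable";
                bulk.WriteToServer(sourceTable);
            }
        }
        catch (SqlException ex) when (ex.Number == 2627) // PK violation
        {
            // The message ends with: "The duplicate key value is (17)."
            var match = Regex.Match(ex.Message, @"\((\d+)\)");
            if (match.Success)
            {
                int duplicateId = int.Parse(match.Groups[1].Value);
                Console.WriteLine($"First duplicate key: {duplicateId}");
                // Remove rows from duplicateId onward and retry the copy,
                // per the assumption that everything before it already landed.
            }
        }
    }
}
```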


Licensed under: CC-BY-SA with attribution
Not affiliated with Stack Overflow