How do we skip errors with SqlBulkCopy

c# sql sqlbulkcopy sql-server


I'm using SqlBulkCopy to upload large amounts of data into a SQL Server database. The table has an identity column named ID, so, as you know, duplicate records won't be permitted.

Suppose that among the 10 rows I need to insert with SqlBulkCopy, two are duplicates.

Is there a way to skip such duplicate rows while inserting data with SqlBulkCopy?

5/14/2014 12:05:04 PM

Accepted Answer

SqlBulkCopy has very limited error handling; by default, it doesn't even check constraints.

However, it is really fast.
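For example, here is a minimal sketch of turning constraint checking back on explicitly (the connection string, table name, and `sourceTable` are placeholders, not from the question):

```csharp
using System.Data;
using Microsoft.Data.SqlClient;

// With SqlBulkCopyOptions.CheckConstraints, the server enforces check and
// foreign-key constraints during the bulk load instead of skipping them.
using (var bulk = new SqlBulkCopy(connectionString,
                                  SqlBulkCopyOptions.CheckConstraints))
{
    bulk.DestinationTableName = "dbo.MyTable"; // illustrative name
    bulk.WriteToServer(sourceTable);           // throws SqlException on violation
}
```

Note that primary-key and unique-index violations are raised regardless of this option; `CheckConstraints` only affects check and foreign-key constraints.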

If you want to handle the duplicate-key problem and report which rows in a batch are duplicates, one option is:

  • Start a transaction.
  • Take a TABLOCKX on the table, select all the current "hash" values, and throw them into a HashSet.
  • Filter out the duplicates and report on them.
  • Insert the data.
  • Commit the transaction.

This approach will work well if you are inserting large sets and the initial size of the data in the database is not too big.
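A C# sketch of the steps above, under some assumptions not stated in the question: the table has a `Hash` column, `sourceRows` is your in-memory batch, and `ToDataTable` is your own mapping helper.

```csharp
using System.Collections.Generic;
using System.Linq;
using Microsoft.Data.SqlClient;

using (var conn = new SqlConnection(connectionString))
{
    conn.Open();
    using (var tran = conn.BeginTransaction())
    {
        // Lock the table and load every existing hash into a HashSet.
        var existing = new HashSet<string>();
        using (var cmd = new SqlCommand(
            "SELECT Hash FROM dbo.MyTable WITH (TABLOCKX)", conn, tran))
        using (var reader = cmd.ExecuteReader())
        {
            while (reader.Read())
                existing.Add(reader.GetString(0));
        }

        // HashSet.Add returns false for a value already present, so this
        // filters duplicates against the table and within the batch itself.
        var newRows = sourceRows.Where(r => existing.Add(r.Hash)).ToList();
        int duplicateCount = sourceRows.Count - newRows.Count; // report these

        // Bulk-insert only the new rows, inside the same transaction.
        using (var bulk = new SqlBulkCopy(conn, SqlBulkCopyOptions.Default, tran))
        {
            bulk.DestinationTableName = "dbo.MyTable";
            bulk.WriteToServer(ToDataTable(newRows));
        }

        tran.Commit();
    }
}
```

The TABLOCKX matters: it prevents another session from inserting a hash between your SELECT and your bulk insert.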

Can you please expand your question to include the rest of the problem's context?


Given the additional context, here is another approach you could take:

  • Bulk-insert the data into a temp table.
  • Start a serializable transaction.
  • Select all the temp rows that already exist in the destination table and report on them.
  • Insert the data from the temp table into the real table, using a left join on the hash to take only the new rows.
  • Commit the transaction.

There are very few round trips involved, so given your specifications it should be quite fast.
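The reporting and insert steps can be sketched as a single batch (all table, column, and temp-table names are illustrative; this assumes the data was already bulk-copied into `#Staging` on the same connection, since temp tables are per-session):

```csharp
using System;
using Microsoft.Data.SqlClient;

using (var cmd = conn.CreateCommand())
{
    cmd.CommandText = @"
SET TRANSACTION ISOLATION LEVEL SERIALIZABLE;
BEGIN TRAN;

-- Report rows already present in the destination.
SELECT s.Hash
FROM #Staging s
JOIN dbo.MyTable t ON t.Hash = s.Hash;

-- Insert only the rows with no matching hash.
INSERT INTO dbo.MyTable (Hash, Payload)
SELECT s.Hash, s.Payload
FROM #Staging s
LEFT JOIN dbo.MyTable t ON t.Hash = s.Hash
WHERE t.Hash IS NULL;

COMMIT TRAN;";

    using (var reader = cmd.ExecuteReader())
    {
        while (reader.Read())
            Console.WriteLine($"Duplicate hash: {reader.GetString(0)}");
    } // disposing the reader drains the batch, so the INSERT completes
}
```

The serializable isolation level plays the same role as the TABLOCKX in the first approach: no one can sneak a matching row in between the join and the insert.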

6/17/2009 4:51:46 AM

Popular Answer

A slightly different approach than those already suggested: perform the SqlBulkCopy and catch the SqlException that is thrown:

    Violation of PRIMARY KEY constraint 'PK_MyPK'. Cannot insert duplicate 
key in object 'dbo.MyTable'. **The duplicate key value is (17)**.

You can then remove all items from your source starting at ID 17, the first record that was duplicated, and retry. I'm making assumptions here that apply to my circumstances and possibly not yours, e.g. that the duplication is caused by the exact same data from a previously failed SqlBulkCopy due to SQL/network errors.
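A sketch of that catch-and-parse approach (table and variable names are illustrative; note that parsing the message text is fragile, since it depends on the server's language settings):

```csharp
using System.Text.RegularExpressions;
using Microsoft.Data.SqlClient;

try
{
    using (var bulk = new SqlBulkCopy(connectionString))
    {
        bulk.DestinationTableName = "dbo.MyTable";
        bulk.WriteToServer(sourceTable);
    }
}
catch (SqlException ex) when (ex.Number == 2627) // 2627 = PK constraint violation
{
    // Message ends with: "The duplicate key value is (17)."
    var match = Regex.Match(ex.Message, @"duplicate key value is \((.+?)\)");
    if (match.Success)
    {
        int firstDuplicateId = int.Parse(match.Groups[1].Value);
        // Drop rows with ID >= firstDuplicateId from the source and retry.
    }
}
```

Filtering on `ex.Number == 2627` avoids swallowing unrelated SQL errors in the same catch block.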


Licensed under: CC-BY-SA with attribution
Not affiliated with Stack Overflow