I'm inserting a lot of data with SqlBulkCopy. The source data may contain some duplicate rows, and in the destination table I have a unique index.
When the first duplicate row appears, SqlBulkCopy throws an exception and rolls back its internal transaction. I want it to ignore the exception and keep inserting (but ONLY if the exception is caused by a duplicate row).
There are other questions about this issue, but they are all about reporting the duplicate rows; I just don't care about them.
This is one of the trade-offs with SqlBulkCopy: I use it when I know I have clean data. If you need to handle errors gracefully in situations like this where the data isn't clean, then other approaches are "better" (though there is a performance trade-off).
Following on from your other question, the SqlDataAdapter approach would give you this ability, as there is a ContinueUpdateOnError property you can set that allows the process to continue in the event of errors like this - very handy. I use this approach when I need to handle not-so-clean/potentially problematic data from .NET. But as I linked to previously, you will see a perf cost.
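Roughly, the idea looks like this (a sketch only - the table name, columns, and connection string are placeholders, not from your schema). With `ContinueUpdateOnError` set, a row that violates the unique index doesn't abort the update; the adapter records the failure on that row's `RowError` and moves on:

```csharp
using System.Data;
using Microsoft.Data.SqlClient;

// `table` is a DataTable already filled with the rows to insert;
// "dbo.Target", "Id" and "Name" are hypothetical names for illustration.
using var conn = new SqlConnection(connectionString);
conn.Open();

var adapter = new SqlDataAdapter("SELECT Id, Name FROM dbo.Target", conn);
var builder = new SqlCommandBuilder(adapter); // generates the INSERT command

// Keep going when an individual row fails (e.g. a unique-key violation).
// Failed rows are flagged via DataRow.RowError instead of throwing.
adapter.ContinueUpdateOnError = true;

adapter.Update(table);
```

Rows that failed can be inspected afterwards via `table.GetErrors()` if you ever change your mind about ignoring them.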
If you want absolute performance while still meeting your original requirements, then you should dedupe the data before loading it into the DB. Or use an approach like the one TomTom suggested.
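Deduping client-side before the bulk copy can be as simple as grouping on the key covered by your unique index and keeping one row per key. A minimal sketch (the `Row` record and its `Id` key are assumptions for illustration):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Hypothetical row shape; substitute your own type and unique-index key.
public record Row(int Id, string Name);

public static class Dedupe
{
    public static List<Row> ByKey(IEnumerable<Row> rows)
    {
        // Keep the first row seen for each key; later duplicates are dropped.
        return rows
            .GroupBy(r => r.Id)
            .Select(g => g.First())
            .ToList();
    }
}
```

Feed the deduped list to `SqlBulkCopy.WriteToServer` and the unique index is never violated.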
Fix the issue before doing the bulk insert. Bulk insert is specifically defined to not handle stuff like this. For that it puts SQL Server into a bulk insert mode which is faster... but has limitations. All key violations must be handled BEFORE the insert.
What you may want to do is insert into a staging table (duplicates allowed), then MERGE into the main table (using the MERGE command).
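The staging step might look something like this (a sketch - `dbo.Staging`, `dbo.Target`, and the `Id`/`Name` columns are placeholder names). SqlBulkCopy targets the staging table, which has no unique index, and the MERGE then inserts only the keys the main table doesn't already have, collapsing duplicates within the staged data itself:

```sql
-- Step 1: SqlBulkCopy everything into dbo.Staging (duplicates allowed there).

-- Step 2: move distinct, not-yet-present rows into the main table.
MERGE dbo.Target AS t
USING (
    SELECT Id, Name
    FROM (
        SELECT Id, Name,
               -- one arbitrary row per key among the staged duplicates
               ROW_NUMBER() OVER (PARTITION BY Id ORDER BY Id) AS rn
        FROM dbo.Staging
    ) dedup
    WHERE rn = 1
) AS s
ON t.Id = s.Id
WHEN NOT MATCHED BY TARGET THEN
    INSERT (Id, Name) VALUES (s.Id, s.Name);

-- Step 3: clear the staging table for the next batch.
TRUNCATE TABLE dbo.Staging;
```

Since you don't care about the duplicates, a plain `INSERT ... SELECT ... WHERE NOT EXISTS` would work just as well as MERGE here; MERGE is handy if you later also want updates.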