SqlBulkCopy - Useful on smaller datasets (e.g. 1,000 rows)?

.net performance sqlbulkcopy

Question

Before I go to the trouble of implementing SqlBulkCopy and performance-testing it, I was wondering if anyone had any guidance on when it should be used, and whether it is preferable even on relatively small data sets (1,000 or so rows). Is there any overhead to using it in this situation, or should I always use it when inserting a reasonable number of rows?

Accepted Answer

I probably would use it for 1000 rows, but you can do something similar with command-batching.
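
As a point of comparison, here is a minimal sketch of command-batching with SqlDataAdapter and UpdateBatchSize, so the inserts go over the wire in batches rather than one round trip per row. The connection string and the table/column names (dbo.MyTable, Id, Name) are hypothetical placeholders:

    // Command-batching sketch; table and columns are hypothetical.
    using System.Data;
    using Microsoft.Data.SqlClient; // or System.Data.SqlClient on older stacks

    var table = new DataTable();
    table.Columns.Add("Id", typeof(int));
    table.Columns.Add("Name", typeof(string));
    for (var i = 0; i < 1000; i++)
        table.Rows.Add(i, "Name " + i); // rows start in the Added state

    var connectionString = "..."; // your connection string here

    using (var connection = new SqlConnection(connectionString))
    using (var adapter = new SqlDataAdapter())
    {
        adapter.InsertCommand = new SqlCommand(
            "INSERT INTO dbo.MyTable (Id, Name) VALUES (@Id, @Name)", connection);
        adapter.InsertCommand.Parameters.Add("@Id", SqlDbType.Int, 0, "Id");
        adapter.InsertCommand.Parameters.Add("@Name", SqlDbType.NVarChar, 100, "Name");
        adapter.InsertCommand.UpdatedRowSource = UpdateRowSource.None; // required for batching
        adapter.UpdateBatchSize = 100; // rows per round trip; 0 = one single batch

        adapter.Update(table); // sends the inserts in batches
    }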

Note that with 1000 rows, the simplest approach is probably to throw the data you want into a DataTable of the correct shape, which is pretty trivial. For larger sets, writing a bespoke IDataReader (to stream the records, rather than buffer them all in memory first) is my preferred option.
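
For illustration, a minimal sketch of the DataTable route, assuming a hypothetical destination table dbo.MyTable with Id and Name columns:

    // DataTable + SqlBulkCopy sketch; connection string and
    // destination table/columns are placeholders.
    using System.Data;
    using Microsoft.Data.SqlClient; // or System.Data.SqlClient on older stacks

    var table = new DataTable();
    table.Columns.Add("Id", typeof(int));
    table.Columns.Add("Name", typeof(string));
    for (var i = 0; i < 1000; i++)
        table.Rows.Add(i, "Name " + i);

    var connectionString = "..."; // your connection string here

    using (var connection = new SqlConnection(connectionString))
    {
        connection.Open();
        using (var bulkCopy = new SqlBulkCopy(connection))
        {
            bulkCopy.DestinationTableName = "dbo.MyTable";
            bulkCopy.ColumnMappings.Add("Id", "Id");
            bulkCopy.ColumnMappings.Add("Name", "Name");
            bulkCopy.WriteToServer(table);
        }
    }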

The other thing to watch is transactions, durability, etc.; when using SqlBulkCopy I tend to use it to push into a staging table, and then call a sproc (or just a simple TSQL command) to move the data over to the real table. This keeps the real table clean except for a single set-based (and fully logged) INSERT.
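
A sketch of that staging-table pattern, reusing the DataTable and connection string from the sketch above; dbo.MyTableStaging and the inline INSERT ... SELECT are hypothetical stand-ins for the staging table and the sproc:

    // Staging-table sketch: bulk copy into staging, then one
    // set-based (fully logged) INSERT moves the rows across.
    using (var connection = new SqlConnection(connectionString))
    {
        connection.Open();
        using (var transaction = connection.BeginTransaction())
        {
            using (var bulkCopy = new SqlBulkCopy(
                connection, SqlBulkCopyOptions.Default, transaction))
            {
                bulkCopy.DestinationTableName = "dbo.MyTableStaging";
                bulkCopy.WriteToServer(table);
            }

            using (var command = new SqlCommand(
                "INSERT INTO dbo.MyTable (Id, Name) " +
                "SELECT Id, Name FROM dbo.MyTableStaging; " +
                "TRUNCATE TABLE dbo.MyTableStaging;",
                connection, transaction))
            {
                command.ExecuteNonQuery();
            }

            transaction.Commit();
        }
    }

Running both steps in one transaction keeps the bulk load and the final INSERT atomic: if the move fails, the rollback leaves the real table untouched.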

But if you just want to measure the performance, throwing it straight into the real table should be enough to get a ballpark estimate.


Popular Answer

SqlBulkCopy uses SQL Server's bulk-load interface rather than individual INSERT statements. Under the right conditions (for example, the simple or bulk-logged recovery model with a table lock) the load is minimally logged, which makes it considerably faster than inserting each row manually.

Even for 1000 rows there should be a speed gain, especially if the operation runs frequently, such as from a web application.



Licensed under: CC-BY-SA with attribution
Not affiliated with Stack Overflow