Is SqlBulkCopy the right approach for regular bulk inserts?

.net security sqlbulkcopy

Question

I am developing an online tool in .NET 4 to read potentially large (thousands of rows) Excel sheets into a DataTable and then use SqlBulkCopy.WriteToServer() to upload the data to a matching SQL Server 2000 (soon to be 2008) table. It is only for use within my company, so I don't need to worry too much about securing it for external use.
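For reference, the upload step looks roughly like this (connection string, table and column names are simplified placeholders, not the real ones):

    using System.Data;
    using System.Data.SqlClient;

    // Simplified sketch of the upload step; names are placeholders.
    public static void Upload(DataTable excelRows, string connectionString)
    {
        using (var connection = new SqlConnection(connectionString))
        {
            connection.Open();
            using (var bulkCopy = new SqlBulkCopy(connection))
            {
                bulkCopy.DestinationTableName = "dbo.ImportedRows";
                bulkCopy.BatchSize = 1000;                    // commit in chunks rather than one huge batch
                bulkCopy.ColumnMappings.Add("Name", "Name");  // map DataTable columns to table columns
                bulkCopy.ColumnMappings.Add("Amount", "Amount");
                bulkCopy.WriteToServer(excelRows);
            }
        }
    }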

I have got it working like a dream, and it is naturally much quicker than the row-by-row inserts I was performing before. But now I am asking myself if it is the right approach.

Firstly, from a business point of view, all of the other data access is through stored procedures with EXEC permissions granted to the user, whereas this approach requires table-level permissions.

Secondly, most online documentation presents this as a tool for one-off operations rather than the daily use mine would get.

I can't find many references to the downsides of using SqlBulkCopy. Are there any security issues or other problems that I should be worried about? Are there any alternative approaches I could consider? As with everything, my main aim is for it to run as quickly as possible while being secure and the "correct" approach.

Many thanks.

Accepted Answer

I'd say you are fine with this choice. SqlBulkCopy exists for exactly this: the performance benefit over standard, non-batched INSERT statements issued through a regular command. That is why it differs from the "standard" stored procedure approach.

I'd say security isn't much of a concern beyond simply validating the input. All you appear to be doing with it is transferring data directly into a table, not running any custom SQL alongside.
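If you want a safety net there, a quick pre-check over the DataTable before calling WriteToServer is enough; something along these lines (the column name and length limit are just illustrative):

    using System;
    using System.Data;

    public static class UploadValidation
    {
        // Illustrative pre-check: reject rows with a missing or over-length Name
        // before they reach SqlBulkCopy. Column name and limit are made up.
        public static void EnsureValid(DataTable excelRows)
        {
            for (int i = 0; i < excelRows.Rows.Count; i++)
            {
                var name = excelRows.Rows[i]["Name"] as string;
                if (string.IsNullOrEmpty(name) || name.Length > 100)
                {
                    throw new InvalidOperationException(
                        "Row " + i + " has a missing or over-length Name.");
                }
            }
        }
    }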

As for running it more than once, we use it in production code for large data actions in multiple places, multiple times - the performance benefit outweighs the slightly different DAL code you have to use to get it.

The only potential issue is memory pressure. If you create a DataTable from an object graph (something we do), you get a second copy of that data in memory (albeit short-lived). In some places we have changed the code from using a DataTable to a custom class that implements IDataReader over the object graph (I say graph, but usually it's a couple of flat lists). This avoids the memory pressure.
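A minimal sketch of that IDataReader idea, assuming a flat list of a made-up UploadRow type, and assuming the only members SqlBulkCopy needs meaningful implementations for are FieldCount, Read, GetValue and GetOrdinal (the rest just throw):

    using System;
    using System.Collections.Generic;
    using System.Data;

    // Hypothetical flat record type standing in for whatever the object graph holds.
    public class UploadRow
    {
        public int Id { get; set; }
        public string Name { get; set; }
        public decimal Amount { get; set; }
    }

    // Streams UploadRow items to SqlBulkCopy without building a DataTable copy.
    // Only the members SqlBulkCopy is expected to call are implemented; the rest throw.
    public sealed class UploadRowReader : IDataReader
    {
        private static readonly string[] Columns = { "Id", "Name", "Amount" };
        private static readonly Type[] Types = { typeof(int), typeof(string), typeof(decimal) };
        private readonly IEnumerator<UploadRow> _rows;

        public UploadRowReader(IEnumerable<UploadRow> rows) { _rows = rows.GetEnumerator(); }

        public int FieldCount { get { return Columns.Length; } }
        public bool Read() { return _rows.MoveNext(); }
        public int GetOrdinal(string name) { return Array.IndexOf(Columns, name); }
        public string GetName(int i) { return Columns[i]; }
        public Type GetFieldType(int i) { return Types[i]; }

        public object GetValue(int i)
        {
            switch (i)
            {
                case 0: return _rows.Current.Id;
                case 1: return _rows.Current.Name;
                case 2: return _rows.Current.Amount;
                default: throw new IndexOutOfRangeException();
            }
        }

        public bool IsDBNull(int i) { return GetValue(i) == null; }
        public object this[int i] { get { return GetValue(i); } }
        public object this[string name] { get { return GetValue(GetOrdinal(name)); } }

        public void Dispose() { _rows.Dispose(); }
        public void Close() { Dispose(); }
        public bool IsClosed { get { return false; } }
        public int Depth { get { return 0; } }
        public int RecordsAffected { get { return -1; } }
        public bool NextResult() { return false; }
        public DataTable GetSchemaTable() { throw new NotImplementedException(); }

        // Typed accessors are not needed for WriteToServer.
        public bool GetBoolean(int i) { throw new NotImplementedException(); }
        public byte GetByte(int i) { throw new NotImplementedException(); }
        public long GetBytes(int i, long fo, byte[] b, int bo, int len) { throw new NotImplementedException(); }
        public char GetChar(int i) { throw new NotImplementedException(); }
        public long GetChars(int i, long fo, char[] b, int bo, int len) { throw new NotImplementedException(); }
        public IDataReader GetData(int i) { throw new NotImplementedException(); }
        public string GetDataTypeName(int i) { throw new NotImplementedException(); }
        public DateTime GetDateTime(int i) { throw new NotImplementedException(); }
        public decimal GetDecimal(int i) { throw new NotImplementedException(); }
        public double GetDouble(int i) { throw new NotImplementedException(); }
        public float GetFloat(int i) { throw new NotImplementedException(); }
        public Guid GetGuid(int i) { throw new NotImplementedException(); }
        public short GetInt16(int i) { throw new NotImplementedException(); }
        public int GetInt32(int i) { throw new NotImplementedException(); }
        public long GetInt64(int i) { throw new NotImplementedException(); }
        public string GetString(int i) { throw new NotImplementedException(); }
        public int GetValues(object[] values) { throw new NotImplementedException(); }
    }

You then pass it straight in with bulkCopy.WriteToServer(new UploadRowReader(rows)); the rows are streamed as they are read rather than duplicated into a DataTable first.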



Licensed under: CC-BY-SA with attribution
Not affiliated with Stack Overflow