I am familiar with the C# SqlBulkCopy class, where you call the 'WriteToServer' method and pass it a DataTable.
My question is: what underlying mechanism in SQL Server is used to bulk insert that data?
The reason I ask is that the BULK INSERT statement described in the MSDN T-SQL help file requires a data file to import. Does SqlBulkCopy create a data file?
I would like to understand this stuff to work out whether I can use the bulk insert functionality in SQL.
If I write a SQL statement that prepares all the rows to insert into a particular table (thousands of rows), can I bulk insert them into the destination table? Something like this is how I am doing it now:
INSERT INTO sync_filters (table_name, device_id, road_id, contract_id) SELECT * FROM dbo.sync_contract_filters(@device_id)
Here dbo.sync_contract_filters is a table-valued function that generates all the rows to insert. Can this be bulk inserted?
SqlBulkCopy does not create a data file. It streams the rows directly from the .NET DataTable object to the server over the available communication protocol (Named Pipes, TCP/IP, etc.) and inserts the data into the destination table in bulk, using the same technique that BCP uses.
SqlBulkCopy can use a DataTable, an IDataReader, or an array of DataRow objects as its data source. Take a look at the WriteToServer overloads of the class. I've found it to be quite a useful mechanism for getting data into SQL Server. I've used it in conjunction with CSVReader previously; the previous link gives you an idea of how it works with a class that implements IDataReader.
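As a minimal sketch of the DataTable overload described above (the table name, column names, and connection string here are hypothetical, and the destination table is assumed to already exist):

```csharp
using System.Data;
using System.Data.SqlClient;

// Build an in-memory table whose columns match the destination.
var table = new DataTable();
table.Columns.Add("table_name", typeof(string));
table.Columns.Add("device_id", typeof(int));
table.Rows.Add("roads", 42);

using (var connection = new SqlConnection("Server=.;Database=Sync;Integrated Security=true;"))
{
    connection.Open();
    using (var bulk = new SqlBulkCopy(connection))
    {
        bulk.DestinationTableName = "dbo.sync_filters";
        // Map columns by name so an ordinal mismatch can't silently
        // put data in the wrong destination column.
        bulk.ColumnMappings.Add("table_name", "table_name");
        bulk.ColumnMappings.Add("device_id", "device_id");
        // Streams the rows over the connection; no intermediate file is written.
        bulk.WriteToServer(table);
    }
}
```

The IDataReader overload works the same way, except rows are pulled from the reader as the copy proceeds rather than materialized in memory first.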
You might need to play around with the batch sizes to ensure you get the performance you are expecting.
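The knobs for that tuning are properties on SqlBulkCopy itself. A sketch, reusing a hypothetical open connection and an IDataReader named reader over the source rows:

```csharp
using System;
using System.Data.SqlClient;

using (var bulk = new SqlBulkCopy(connection))
{
    bulk.DestinationTableName = "dbo.sync_filters"; // hypothetical table
    bulk.BatchSize = 5000;       // rows sent to the server per batch
    bulk.BulkCopyTimeout = 60;   // seconds before the operation times out
    bulk.NotifyAfter = 5000;     // raise SqlRowsCopied every 5000 rows

    // Progress reporting, useful when loading thousands of rows.
    bulk.SqlRowsCopied += (sender, e) =>
        Console.WriteLine("{0} rows copied", e.RowsCopied);

    bulk.WriteToServer(reader);
}
```

With the default BatchSize of 0 the whole load is sent as a single batch, so it is worth measuring a few values against your own data volumes.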
The Data Loading Performance Guide on MSDN is a useful resource if you are looking to get the best performance out of writing large amounts of data quickly. It focuses more on BCP and BULK INSERT, but it touches on SqlBulkCopy and provides quite a lot of food for thought (maybe a little too much, but it is a useful reference nonetheless).