Inserting 200,000 rows of data with SqlBulkCopy in C#

c# sql sqlbulkcopy

Question

I have a database table with three columns, all strings. I need to insert 200,000 rows of data. I can do it, but as you would expect it takes far too long, 30 minutes or more.

I'm now trying to use SqlBulkCopy to speed things up, but I'm not sure how to do it correctly. Here is an example of a row of data:

"test string","test string,"test string"

Should I write my data to a temporary file so that SqlBulkCopy can upload it from there? For example, separating the values in each row with commas and putting each row on its own line?

Here is what I currently have; any assistance would be much appreciated!

// this method gets a list of the data objects passed in

List<string[]> listOfRows = new List<string[]>(); // holds all the rows to be bulk copied to the table in the database

foreach (DataUnit dataUnit in dataUnitList)
{
    string[] row = new string[3]; // three columns, so the array needs three slots
    row[0] = dataUnit.value1.ToString();
    row[1] = dataUnit.value2.ToString();
    row[2] = dataUnit.value3.ToString();
    listOfRows.Add(row);
}

string filePath = Path.Combine(Environment.GetFolderPath(Environment.SpecialFolder.Desktop), "Test200kFile.txt");

// StreamWriter creates the file itself; calling File.Create first only leaves an open handle behind
using (System.IO.StreamWriter file = new System.IO.StreamWriter(filePath))
{
    foreach (string[] array in listOfRows)
    {
        file.WriteLine(array[0] + "," + array[1] + "," + array[2]); // one row per line
    }
}

using (MyFileDataReader reader = new MyFileDataReader(filePath))
using (SqlBulkCopy bulkCopy = new SqlBulkCopy("my connection string to database"))
{
    bulkCopy.DestinationTableName = "my table name";
    bulkCopy.BatchSize = 20000; // what if it's less than 200,000?

    bulkCopy.WriteToServer(reader);
}

Accepted Answer

The bulk insert T-SQL statement requires the source data to be in a file on the database server, so that is not the route you want to take if you are writing code.

Instead, you should write an adapter for your data source. The adapter needs to implement part of the IDataReader interface; SqlBulkCopy only uses a handful of IDataReader's members, so only those have to be implemented:

Properties

  • FieldCount

Methods

  • GetName
  • GetOrdinal
  • GetValue
  • Read


So unless your data is already in a file, there is no need to write it to one. Just make sure your IDataReader implementation can serve up one record at a time.
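For illustration, here is a rough sketch of such an adapter over the List<string[]> from the question. The class and column handling are my own invention, and it takes at face value the claim above that only FieldCount, GetName, GetOrdinal, GetValue and Read are exercised; the remaining interface members simply throw.

using System;
using System.Collections.Generic;
using System.Data;

// A minimal IDataReader over an in-memory List<string[]>. Only the members
// SqlBulkCopy touches do real work.
public sealed class StringRowsReader : IDataReader
{
    private readonly IEnumerator<string[]> _rows;
    private readonly string[] _columns;
    private string[] _current;

    public StringRowsReader(IEnumerable<string[]> rows, params string[] columnNames)
    {
        _rows = rows.GetEnumerator();
        _columns = columnNames;
    }

    // --- the members SqlBulkCopy actually uses ---
    public int FieldCount { get { return _columns.Length; } }
    public string GetName(int i) { return _columns[i]; }
    public int GetOrdinal(string name) { return Array.IndexOf(_columns, name); }
    public object GetValue(int i) { return _current[i]; }
    public bool Read()
    {
        if (!_rows.MoveNext()) return false;   // advance to the next row, one record at a time
        _current = _rows.Current;
        return true;
    }

    // --- housekeeping ---
    public void Dispose() { _rows.Dispose(); }
    public void Close() { }
    public bool IsClosed { get { return false; } }
    public int Depth { get { return 0; } }
    public int RecordsAffected { get { return -1; } }
    public bool NextResult() { return false; }
    public bool IsDBNull(int i) { return _current[i] == null; }

    // --- not needed for this string-only scenario ---
    public DataTable GetSchemaTable() { throw new NotSupportedException(); }
    public object this[int i] { get { return GetValue(i); } }
    public object this[string name] { get { return GetValue(GetOrdinal(name)); } }
    public string GetString(int i) { return _current[i]; }
    public Type GetFieldType(int i) { return typeof(string); }
    public string GetDataTypeName(int i) { return "nvarchar"; }
    public bool GetBoolean(int i) { throw new NotSupportedException(); }
    public byte GetByte(int i) { throw new NotSupportedException(); }
    public long GetBytes(int i, long fieldOffset, byte[] buffer, int bufferOffset, int length) { throw new NotSupportedException(); }
    public char GetChar(int i) { throw new NotSupportedException(); }
    public long GetChars(int i, long fieldOffset, char[] buffer, int bufferOffset, int length) { throw new NotSupportedException(); }
    public IDataReader GetData(int i) { throw new NotSupportedException(); }
    public DateTime GetDateTime(int i) { throw new NotSupportedException(); }
    public decimal GetDecimal(int i) { throw new NotSupportedException(); }
    public double GetDouble(int i) { throw new NotSupportedException(); }
    public float GetFloat(int i) { throw new NotSupportedException(); }
    public Guid GetGuid(int i) { throw new NotSupportedException(); }
    public short GetInt16(int i) { throw new NotSupportedException(); }
    public int GetInt32(int i) { throw new NotSupportedException(); }
    public long GetInt64(int i) { throw new NotSupportedException(); }
    public int GetValues(object[] values) { throw new NotSupportedException(); }
}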

Writing the data out to a file is a waste of time and resources. I have written import processes before that read XML data from an FTP data source using an XmlReader wrapped in a custom IDataReader. That way I could stream the data from the FTP server, through the app server, to the DB server, without ever writing the complete data set to disk on the app server.
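The same pattern applies to the in-memory list from the question: the rows can be streamed straight into SqlBulkCopy with no intermediate file. In this sketch the connection string, table name, and column names are placeholders; the column order is assumed to match the destination table, otherwise ColumnMappings would need to be set up.

// requires System.Data.SqlClient for SqlBulkCopy
using (var reader = new StringRowsReader(listOfRows, "Column1", "Column2", "Column3"))
using (var bulkCopy = new SqlBulkCopy("my connection string to database"))
{
    bulkCopy.DestinationTableName = "my table name";
    bulkCopy.BatchSize = 20000;
    bulkCopy.WriteToServer(reader);   // pulls rows one at a time via Read()/GetValue()
}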



Licensed under: CC-BY-SA with attribution
Not affiliated with Stack Overflow