I am basically from a database background and new to the .NET stack. Please bear with me if my question sounds silly.
I am using SqlBulkCopy in my code to transfer data from one SQL Server to another. But it is failing frequently due to network issues. To avoid that, I am planning to do two things:
Decrease the batch size (from 5000 to 1000) and the per-batch timeout (from 3 minutes to 1 minute)
Implement retry logic
My question is: what is the best way to implement the retry logic here?
Sample code that I am using:
private void BulkCopyTable(string schemaName, string tableName)
{
    using (var reader = srcConnection.ExecuteReader($"select * from [{SourceDBName}].[{schemaName}].[{tableName}]"))
    {
        const SqlBulkCopyOptions bulkCopyOptions = SqlBulkCopyOptions.TableLock | SqlBulkCopyOptions.FireTriggers |
                                                   SqlBulkCopyOptions.KeepNulls | SqlBulkCopyOptions.KeepIdentity;

        using (var bcp = new SqlBulkCopy(dstConnection.ConnectionString, bulkCopyOptions))
        {
            const int threeMinutes = 60 * 3;
            bcp.BulkCopyTimeout = threeMinutes; // Timeout is for a single batch
            bcp.BatchSize = 5000;
            bcp.DestinationTableName = $"[{DestinationDB}].[{schemaName}].[{tableName}]";
            bcp.EnableStreaming = true;

            foreach (var col in table.Columns.Cast<Column>().Where(c => !c.Computed))
            {
                bcp.ColumnMappings.Add(col.Name, col.Name);
            }

            bcp.WriteToServer(reader);
        }
    }
}
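For the retry part, this is roughly what I had in mind: a simple wrapper around the method above that restarts the whole table copy after a transient failure (the retry count and delay below are just placeholder values). I am not sure what happens to batches that were already committed before the failure.

private void BulkCopyTableWithRetry(string schemaName, string tableName)
{
    const int maxAttempts = 3;
    for (var attempt = 1; ; attempt++)
    {
        try
        {
            BulkCopyTable(schemaName, tableName);
            return;
        }
        catch (SqlException) when (attempt < maxAttempts)
        {
            // Assume it was a transient network error: wait a bit and start the table over.
            Thread.Sleep(TimeSpan.FromSeconds(30));
        }
    }
}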
A simple approach is to:

1. Implement the batches yourself: call WriteToServer once per chunk of rows and retry a chunk that fails. SqlBulkCopy needs to query meta data for each WriteToServer call, so don't make the batches too small. Experiment.
2. Bulk copy into a temp table (not a #temp table but a durable one, so that you can lose the connection and continue).
3. Run an insert...select as the final step to move the rows from the temp table to the real table.

This dance splits the work into retryable batches but acts as if it was one transaction.

If you don't need atomicity you can leave it at step (1).
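A rough sketch of steps (1)-(3), under a few assumptions: dstConnection is an open SqlConnection to the destination server, a durable staging table with the same schema already exists, and the chunk size and retry counts are just starting points to experiment with. Resuming after the source connection itself drops (e.g. restarting from a key watermark) is left out for brevity.

private void BulkCopyTableResumable(string schemaName, string tableName)
{
    var realTable    = $"[{DestinationDB}].[{schemaName}].[{tableName}]";
    var stagingTable = $"[{DestinationDB}].[{schemaName}].[{tableName}_staging]"; // durable, not #temp
    const int chunkSize = 5000;   // experiment: larger chunks mean fewer metadata queries
    const int maxAttempts = 3;

    using (var reader = srcConnection.ExecuteReader(
               $"select * from [{SourceDBName}].[{schemaName}].[{tableName}]"))
    {
        // Empty buffer with the same columns as the source result set.
        var buffer = new DataTable();
        for (var i = 0; i < reader.FieldCount; i++)
            buffer.Columns.Add(reader.GetName(i), reader.GetFieldType(i));

        while (true)
        {
            // Fill one chunk from the source reader.
            buffer.Clear();
            while (buffer.Rows.Count < chunkSize && reader.Read())
            {
                var values = new object[reader.FieldCount];
                reader.GetValues(values);
                buffer.Rows.Add(values);
            }
            if (buffer.Rows.Count == 0)
                break; // source exhausted

            // Retry only this chunk; chunks already written to staging stay put.
            for (var attempt = 1; ; attempt++)
            {
                try
                {
                    using (var bcp = new SqlBulkCopy(dstConnection.ConnectionString, SqlBulkCopyOptions.TableLock))
                    {
                        bcp.DestinationTableName = stagingTable;
                        bcp.BulkCopyTimeout = 60; // a single chunk should finish well within this
                        bcp.WriteToServer(buffer);
                    }
                    break;
                }
                catch (SqlException) when (attempt < maxAttempts)
                {
                    // Transient failure: wait and retry just this chunk.
                    Thread.Sleep(TimeSpan.FromSeconds(10));
                }
            }
        }
    }

    // Final step: one set-based move from staging to the real table.
    // (If the target has an identity column you also need SET IDENTITY_INSERT ON
    //  and an explicit column list here.)
    using (var cmd = dstConnection.CreateCommand())
    {
        cmd.CommandText = $"insert into {realTable} select * from {stagingTable}; " +
                          $"truncate table {stagingTable};";
        cmd.CommandTimeout = 0; // the insert...select can be long-running
        cmd.ExecuteNonQuery();
    }
}

Because the staging table is durable, a failed chunk can be retried without disturbing the rows already landed there, and the single insert...select at the end is what gives you the one-transaction behaviour on the real table.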