Bulk insert is not working properly in Azure SQL Server

azure azure-sql-database c# ef-bulkinsert sqlbulkcopy

Question

I'm not able to insert a bulk amount of data into an Azure SQL Database from a C# Web API.

Consider

I want to insert more than 60K records into SQL. On my local SQL Server there is no problem, but on Azure SQL the connection times out.

My approach (all of these work on the local SQL Server but not on Azure SQL):

1) Tried EF; it inserts records one by one (roughly 10 minutes for 10,000 records, and it mostly times out)

2) Tried a bulk insert extension along with EF

3) Tried SqlBulkCopy

4) Tried increasing the connection timeout in the connection string

5) Tried increasing the command timeout on the DbContext (see the sketch after this list)
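
For reference, the changes in points 4 and 5 usually look something like the sketch below. The server name, credentials, and the 60/300-second values are placeholders, not the actual values used; note also that neither setting applies to SqlBulkCopy, which has its own BulkCopyTimeout property.

    using System.Data.Entity;

    // Minimal sketch of points 4 and 5; "DBModel" matches the context used
    // later in the question, but the connection-string values are placeholders.
    public class DBModel : DbContext
    {
        public DBModel()
            : base("Server=tcp:yourserver.database.windows.net,1433;" +
                   "Database=yourdb;User ID=user;Password=yourpassword;" +
                   "Encrypt=True;Connection Timeout=60;")   // 4) timeout for opening the connection
        {
            // 5) command timeout in seconds; applies to commands issued through
            // this context, but NOT to SqlBulkCopy, which uses BulkCopyTimeout.
            Database.CommandTimeout = 300;
        }
    }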

Exception StackTrace

Execution Timeout Expired.  The timeout period elapsed prior to completion of the operation or the server is not responding.
System.Data.SqlClient.SqlException (0x80131904): Execution Timeout Expired.  The timeout period elapsed prior to completion of the operation or the server is not responding. ---> System.ComponentModel.Win32Exception (0x80004005): The wait operation timed out
   at System.Data.SqlClient.SqlInternalConnection.OnError(SqlException exception, Boolean breakConnection, Action`1 wrapCloseInAction)
   at System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj, Boolean callerHasConnectionLock, Boolean asyncClose)
   at System.Data.SqlClient.TdsParserStateObject.ReadSniError(TdsParserStateObject stateObj, UInt32 error)
   at System.Data.SqlClient.TdsParserStateObject.ReadSniSyncOverAsync()
   at System.Data.SqlClient.TdsParserStateObject.TryReadNetworkPacket()
   at System.Data.SqlClient.TdsParserStateObject.TryPrepareBuffer()
   at System.Data.SqlClient.TdsParserStateObject.TryReadByte(Byte& value)
   at System.Data.SqlClient.TdsParser.TryRun(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj, Boolean& dataReady)
   at System.Data.SqlClient.TdsParser.Run(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj)
   at System.Data.SqlClient.SqlBulkCopy.RunParser(BulkCopySimpleResultSet bulkCopyHandler)
   at System.Data.SqlClient.SqlBulkCopy.CopyBatchesAsyncContinuedOnSuccess(BulkCopySimpleResultSet internalResults, String updateBulkCommandText, CancellationToken cts, TaskCompletionSource`1 source)
   at System.Data.SqlClient.SqlBulkCopy.CopyBatchesAsyncContinued(BulkCopySimpleResultSet internalResults, String updateBulkCommandText, CancellationToken cts, TaskCompletionSource`1 source)
   at System.Data.SqlClient.SqlBulkCopy.CopyBatchesAsync(BulkCopySimpleResultSet internalResults, String updateBulkCommandText, CancellationToken cts, TaskCompletionSource`1 source)
   at System.Data.SqlClient.SqlBulkCopy.WriteToServerInternalRestContinuedAsync(BulkCopySimpleResultSet internalResults, CancellationToken cts, TaskCompletionSource`1 source)
   at System.Data.SqlClient.SqlBulkCopy.WriteToServerInternalRestAsync(CancellationToken cts, TaskCompletionSource`1 source)
   at System.Data.SqlClient.SqlBulkCopy.WriteToServerInternalAsync(CancellationToken ctoken)
   at System.Data.SqlClient.SqlBulkCopy.WriteRowSourceToServerAsync(Int32 columnCount, CancellationToken ctoken)
   at System.Data.SqlClient.SqlBulkCopy.WriteToServer(DataTable table, DataRowState rowState)

Is there any solution for this, or any configuration that needs to be changed on the Azure side?

Update

Code used for the bulk insert:

    using (var dbConnection = new DBModel().Database.Connection as SqlConnection)
    {
        dbConnection?.Open();
        using (var sqlBulkCopy = new SqlBulkCopy(dbConnection))
        {
            try
            {
                // Column mapping: each DataTable column is mapped to the
                // corresponding column in the destination table.
                sqlBulkCopy.EnableStreaming = true;
                sqlBulkCopy.BulkCopyTimeout = 500;
                sqlBulkCopy.DestinationTableName = "LogTable";
                // dt is the DataTable holding the rows to insert
                sqlBulkCopy.WriteToServer(dt);
            }
            catch (Exception ex)
            {
                // exception is currently swallowed here
            }
        }
    }

Popular Answer

I'd suggest you set sqlBulkCopy.BatchSize to a reasonable value instead of inserting everything in one batch. Depending on the data you're inserting, try starting at 10,000 and work your way up or down until you're satisfied with the performance.

Edit, for some extra clarification: when you choose a batch size, keep in mind that SqlBulkCopy has to not only insert the data, but also read and send it. The sending part is probably why it works on your local SQL Server but not on Azure. It also means that, if you're working with a large dataset, you will need either a smaller batch size or a considerably higher BulkCopyTimeout setting, so each batch has a chance to finish before reaching the timeout limit. A sketch follows below.
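
Something along these lines (a minimal sketch, assuming the dt DataTable and a connection string from the question; the concrete numbers are starting points to tune, not fixed recommendations):

    using (var connection = new SqlConnection(connectionString))
    {
        connection.Open();
        using (var bulkCopy = new SqlBulkCopy(connection))
        {
            bulkCopy.DestinationTableName = "LogTable";
            bulkCopy.EnableStreaming = true;
            bulkCopy.BatchSize = 10000;      // rows sent to the server per batch
            bulkCopy.BulkCopyTimeout = 0;    // 0 = wait indefinitely; or raise well above the 30 s default
            bulkCopy.NotifyAfter = 10000;    // optional progress reporting
            bulkCopy.SqlRowsCopied += (s, e) =>
                Console.WriteLine($"{e.RowsCopied} rows copied so far");
            bulkCopy.WriteToServer(dt);
        }
    }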

You can read more on batch sizes in this post: What is the recommended batch size for SqlBulkCopy?

Other option:
I was reading up on this, and it could simply be that your insert hits a critical DTU (Database Transaction Unit, basically a measure of the server's combined resources) usage point.

Performance levels are calibrated and governed to provide the needed resources to run your database workload up to the max limits allowed for your selected service tier/performance level. If your workload is hitting the limits in one of CPU/Data IO/Log IO limits, you will continue to receive the resources at the maximum allowed level, but you are likely to see increased latencies for your queries. These limits will not result in any errors, but just a slowdown in your workload, unless the slowdown becomes so severe that queries start timing out.

Taken from this link: https://azure.microsoft.com/da-dk/blog/azure-sql-database-introduces-new-near-real-time-performance-metrics/
Try starting the copy again while monitoring the DTU usage, and see whether it sits at 100% for long(er) periods. If it does, you might want to scale the database up to a higher pricing tier. A sketch of how to check this from code follows below.
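
If you want to check this from code rather than the Azure portal, querying sys.dm_db_resource_stats returns recent resource usage as percentages of your tier's limits (a minimal sketch; the connectionString is assumed, and the portal's metrics charts show the same information without any code):

    // sys.dm_db_resource_stats reports ~15-second averages for roughly the last hour.
    const string query = @"
        SELECT TOP (20) end_time,
               avg_cpu_percent,
               avg_data_io_percent,
               avg_log_write_percent
        FROM sys.dm_db_resource_stats
        ORDER BY end_time DESC;";

    using (var connection = new SqlConnection(connectionString))
    using (var command = new SqlCommand(query, connection))
    {
        connection.Open();
        using (var reader = command.ExecuteReader())
        {
            while (reader.Read())
            {
                Console.WriteLine(
                    $"{reader["end_time"]}: CPU {reader["avg_cpu_percent"]}%, " +
                    $"Data IO {reader["avg_data_io_percent"]}%, " +
                    $"Log IO {reader["avg_log_write_percent"]}%");
            }
        }
    }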


Licensed under: CC-BY-SA with attribution
Not affiliated with Stack Overflow