How do I optimize data import into SQL Server (100,000 records, 100s of concurrent users)?

Tags: sqlbulkcopy, sql-server


I have an ASP.NET MVC web application written in C# (Visual Studio). I need to import an Excel file of more than 100,000 records into SQL Server 2008 R2.

I have used SqlBulkCopy with DataTable to copy data to a table (Table1).

I then validate and process the data, marking each record as valid or invalid.

Then a stored procedure inserts and updates records in three tables (Table2, Table3 and Table4) from Table1.

Table4 has two triggers, which use a cursor to update several related records in Table4 itself and in another table (Table2).

In testing, 10,000 records take about 1 minute on my local machine in debug mode.

Would this hold up if 10 or 100 users import at the same time?

Is there a better process that improves performance and supports many concurrent users?

My computer configuration:

  • Windows 10 64Bit
  • AMD FX QuardCore 3.6 GHz processor
  • 8GB RAM

  • Excel file contains 50 columns

  • BulkCopy takes 2 - 3 seconds
  • Stored procedure takes 30 seconds to 2 minutes

When I test with 5 concurrent users, 2 or 3 imports complete successfully and the rest fail with this error:

Transaction (Process ID 66) was deadlocked on lock | communication buffer resources with another process and has been chosen as the deadlock victim. Rerun the transaction.

8/26/2017 10:48:50 AM

Popular Answer

  1. Can you disable (or perhaps drop and re-create later) indexes during the import? Indexes speed up SELECTs but slow down INSERTs and UPDATEs on large tables. Also consider MERGE instead of separate INSERT and UPDATE statements.
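A minimal sketch of point 1, assuming a hypothetical nonclustered index `IX_Table2_Col1` and placeholder columns `Id`, `Col1`, and an `IsValid` flag on the staging table:

```sql
-- Disable a nonclustered index before the bulk load (index name is hypothetical)
ALTER INDEX IX_Table2_Col1 ON Table2 DISABLE;

-- Upsert the valid staging rows in one set-based statement instead of
-- running separate INSERT and UPDATE passes (Id/Col1 are placeholder columns)
MERGE Table2 AS target
USING (SELECT Id, Col1 FROM Table1 WHERE IsValid = 1) AS source
    ON target.Id = source.Id
WHEN MATCHED THEN
    UPDATE SET target.Col1 = source.Col1
WHEN NOT MATCHED THEN
    INSERT (Id, Col1) VALUES (source.Id, source.Col1);

-- Rebuild the index once the load is finished
ALTER INDEX IX_Table2_Col1 ON Table2 REBUILD;
```

Note that a disabled index is unusable by all sessions until it is rebuilt, so with concurrent imports this needs to be coordinated rather than done per user.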

  2. Find an alternative to the cursor (such as a CTE, a WHILE loop, or a MERGE statement), or at least use a FAST_FORWARD cursor if you are not already.
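For point 2, a row-by-row cursor update can often be rewritten as a single set-based UPDATE with a join. A sketch against placeholder columns (`Table2Id`, `Amount`, `Total` are assumptions, not the asker's real schema):

```sql
-- Instead of DECLARE CURSOR ... FETCH ... UPDATE one row per iteration,
-- update every related row in one statement
UPDATE t2
SET    t2.Total = t4.Amount
FROM   Table2 AS t2
JOIN   Table4 AS t4
       ON t4.Table2Id = t2.Id;

-- If a cursor is truly unavoidable, declare it as cheaply as possible:
-- DECLARE cur CURSOR LOCAL FAST_FORWARD FOR SELECT ...;
```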

  3. How are you validating the data? Is that step fast enough?

  4. Do you need the triggers during bulk processing? Are these tables in use elsewhere at the same time? Instead of trigger code executing per row, can you run the same logic once, as a set-based batch, inside the bulk import process? SQL Server has an option to DISABLE triggers; check whether that suits your purpose.
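A sketch of point 4, assuming a hypothetical trigger name and stored procedure (both placeholders):

```sql
-- Turn the Table4 trigger off for the duration of the batch
-- (trigger and procedure names are hypothetical)
DISABLE TRIGGER trg_Table4_Update ON Table4;

EXEC dbo.usp_ProcessImport;  -- the existing processing procedure

-- Apply the trigger's per-row logic once here, set-based, if it is
-- still needed, then re-enable the trigger
ENABLE TRIGGER trg_Table4_Update ON Table4;
```

Be aware that DISABLE TRIGGER affects all sessions, not just your own, so with many concurrent imports the batches would have to be serialized around it or the trigger logic moved into the procedure permanently.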

  5. SSIS packages can also handle this kind of import and processing.

  6. You can create Table2_temp, Table3_temp and Table4_temp (perhaps per user, created dynamically and dropped later). Do your processing there separately, then MERGE into the main Table2, Table3 and Table4, provided the triggers on Table4 do not need to act on pre-existing data.
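One way to sketch point 6 is with a session-scoped #temp table instead of a dynamically named per-user table, since #temp tables cannot collide between concurrent sessions (column names `Id` and `Col1` are placeholders):

```sql
-- Create a staging copy with the same structure but no rows;
-- a #temp table is private to this session, so users never collide
SELECT *
INTO   #Table2_temp
FROM   Table2
WHERE  1 = 0;

-- ... bulk load and validate into #Table2_temp here ...

-- Fold the finished batch into the live table in one shot
MERGE Table2 AS target
USING #Table2_temp AS source
    ON target.Id = source.Id
WHEN MATCHED THEN
    UPDATE SET target.Col1 = source.Col1
WHEN NOT MATCHED THEN
    INSERT (Id, Col1) VALUES (source.Id, source.Col1);

DROP TABLE #Table2_temp;
```

Keeping each user's heavy processing in a private staging table and touching the shared tables only in one short MERGE also shrinks the window in which the deadlock from the question can occur.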

8/28/2017 10:11:47 AM

Licensed under: CC-BY-SA with attribution
Not affiliated with Stack Overflow