How do I optimize a data import into SQL Server with 100,000 records and hundreds of concurrent users?

asp.net-mvc sqlbulkcopy sql-server

Question

I'm using C# & ASP.NET MVC in a Visual Studio web application. I need to import more than 100,000 entries from an Excel file into SQL Server 2008 R2.

I've used SqlBulkCopy with an ADO.NET DataTable to load the data into a staging table (Table1).
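
A minimal sketch of that bulk-copy step, assuming the Excel rows have already been read into a DataTable whose columns line up with Table1; the batch size and timeout values are placeholders, not the actual settings:

    using System.Data;
    using System.Data.SqlClient;

    public static class ExcelImporter
    {
        // Loads an already-populated DataTable into the staging table.
        public static void BulkLoad(DataTable rows, string connectionString)
        {
            using (var connection = new SqlConnection(connectionString))
            {
                connection.Open();
                using (var bulk = new SqlBulkCopy(connection))
                {
                    bulk.DestinationTableName = "dbo.Table1";
                    bulk.BatchSize = 5000;      // commit in chunks rather than one huge batch
                    bulk.BulkCopyTimeout = 300; // seconds; the default of 30 is tight for 100,000 rows
                    bulk.WriteToServer(rows);
                }
            }
        }
    }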

Data is then processed, validated, and marked as valid or invalid.

Afterward, a stored procedure inserts records into three tables (Table2, Table3, and Table4) from Table1 and updates them.
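
A sketch of invoking that processing step from C#; the procedure name and parameter are hypothetical, but the raised CommandTimeout matters because the procedure can run for minutes while ADO.NET's default is 30 seconds:

    using System.Data;
    using System.Data.SqlClient;

    public static class ImportProcessor
    {
        // Runs the server-side processing procedure for one import batch.
        public static void Process(string connectionString, int importId)
        {
            using (var connection = new SqlConnection(connectionString))
            using (var command = new SqlCommand("dbo.usp_ProcessImport", connection)) // hypothetical name
            {
                command.CommandType = CommandType.StoredProcedure;
                command.CommandTimeout = 600; // seconds; the default of 30 would time out
                command.Parameters.AddWithValue("@ImportId", importId); // hypothetical parameter
                connection.Open();
                command.ExecuteNonQuery();
            }
        }
    }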

Table4 has two triggers that use a cursor to update multiple records in Table4 and in an additional table (Table2).

In testing, 10,000 records take about one minute on my local system in debug mode.

Will it handle simultaneous imports from 10 or 100 users?

Is there a better approach that performs well and supports many concurrent users?

How my PC is set up

  • 64-bit Windows 10
  • AMD FX quad-core processor, 3.6 GHz
  • 8GB RAM

  • The Excel file has 50 columns.

  • The bulk copy takes 2–3 seconds.
  • The stored procedure takes between 30 seconds and 2 minutes.

When I tried it with five concurrent users, only two or three of the processes completed successfully; the rest failed with:

Transaction (Process ID 66) was deadlocked on lock | communication buffer resources with another process and has been chosen as the deadlock victim. Rerun the transaction.


Popular Answer

  1. Can the indexes be disabled (or even dropped and recreated later) during the import? Indexes speed up SELECT but slow down INSERT and UPDATE on big tables. Also consider a single MERGE instead of separate INSERT and UPDATE statements; there are sketches of both ideas after this list.

  2. If you're not doing so already, look for a set-based substitute for the cursor (such as a CTE, a WHILE loop, or a MERGE statement), or at least declare the cursor as FAST_FORWARD. The first sketch below shows the MERGE option.

  3. How is the data being validated? Is that step fast enough?

  4. Do you need the triggers to fire while processing data in bulk? Is anyone else using these tables in the meantime? The trigger logic currently runs once per row; you could instead run it as a single batch after the bulk import. Triggers can be disabled in SQL Server (second sketch below); check whether that fits your needs.

  5. An SSIS package could also handle the import and the related processing.

  6. You could create temporary versions of Table2, Table3, and Table4 (perhaps one set per user, created dynamically and dropped afterwards). If the triggers on Table4 do not depend on pre-existing data, do the processing on the temporary tables independently and then MERGE the results into the main Table2, Table3, and Table4 (last sketch below).
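
For points 1 and 2, a sketch of replacing a per-row cursor with one set-based MERGE from the staging table into Table2. MERGE is available from SQL Server 2008 onward, so it fits a 2008 R2 instance; the key and column names and the IsValid flag are assumptions about the schema:

    using System.Data.SqlClient;

    public static class SetBasedUpdate
    {
        // One MERGE touches every valid staging row at once, instead of a
        // cursor issuing a separate INSERT or UPDATE per row.
        private const string MergeSql = @"
            MERGE dbo.Table2 AS target
            USING (SELECT KeyCol, Col1, Col2
                   FROM dbo.Table1
                   WHERE IsValid = 1) AS source  -- hypothetical validity flag
                ON target.KeyCol = source.KeyCol
            WHEN MATCHED THEN
                UPDATE SET Col1 = source.Col1, Col2 = source.Col2
            WHEN NOT MATCHED THEN
                INSERT (KeyCol, Col1, Col2)
                VALUES (source.KeyCol, source.Col1, source.Col2);";

        public static void Run(string connectionString)
        {
            using (var connection = new SqlConnection(connectionString))
            using (var command = new SqlCommand(MergeSql, connection))
            {
                command.CommandTimeout = 600; // seconds; set-based work on 100k rows can still take a while
                connection.Open();
                command.ExecuteNonQuery();
            }
        }
    }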
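
For points 1 and 4, a sketch of pausing Table4's triggers and a nonclustered index around the heavy phase. This is only safe if no other users need the triggers to fire while the import runs, and the index name is a placeholder:

    using System;
    using System.Data.SqlClient;

    public static class BulkPhaseToggles
    {
        // Placeholder index name; do not disable the clustered index, or the
        // table becomes inaccessible until it is rebuilt.
        private const string DisableSql = @"
            ALTER TABLE dbo.Table4 DISABLE TRIGGER ALL;
            ALTER INDEX IX_Table4_Example ON dbo.Table4 DISABLE;";

        private const string EnableSql = @"
            ALTER TABLE dbo.Table4 ENABLE TRIGGER ALL;
            ALTER INDEX IX_Table4_Example ON dbo.Table4 REBUILD;";

        public static void RunWithTogglesOff(string connectionString, Action heavyWork)
        {
            using (var connection = new SqlConnection(connectionString))
            {
                connection.Open();
                using (var disable = new SqlCommand(DisableSql, connection))
                    disable.ExecuteNonQuery();
                try
                {
                    heavyWork(); // the bulk copy plus the stored procedure
                }
                finally
                {
                    // Re-enable even if the import fails, so the table is not
                    // left without its triggers and index.
                    using (var enable = new SqlCommand(EnableSql, connection))
                        enable.ExecuteNonQuery();
                }
            }
        }
    }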
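
And for point 6, a sketch of per-user staging: each import does its slow work in a private staging copy of Table2 and touches the shared table only in one final, short MERGE, which shrinks the window in which concurrent imports can deadlock. The naming scheme and columns are hypothetical; using the numeric user id keeps the generated table name safe to interpolate:

    using System.Data.SqlClient;

    public static class PerUserStaging
    {
        public static void Run(string connectionString, int userId)
        {
            // Hypothetical per-user staging table name; the int id avoids
            // SQL injection through the interpolated identifier.
            string staging = $"dbo.Table2_Staging_{userId}";

            string createSql = $@"
                IF OBJECT_ID('{staging}') IS NOT NULL DROP TABLE {staging};
                SELECT * INTO {staging} FROM dbo.Table2 WHERE 1 = 0; -- copy the shape, no rows";

            string mergeSql = $@"
                MERGE dbo.Table2 AS t
                USING {staging} AS s ON t.KeyCol = s.KeyCol  -- hypothetical key/columns
                WHEN MATCHED THEN UPDATE SET t.Col1 = s.Col1
                WHEN NOT MATCHED THEN INSERT (KeyCol, Col1) VALUES (s.KeyCol, s.Col1);
                DROP TABLE {staging};";

            using (var connection = new SqlConnection(connectionString))
            {
                connection.Open();
                using (var create = new SqlCommand(createSql, connection))
                    create.ExecuteNonQuery();

                // ... bulk copy, process, and validate into the staging table here ...

                using (var merge = new SqlCommand(mergeSql, connection))
                {
                    merge.CommandTimeout = 600;
                    merge.ExecuteNonQuery();
                }
            }
        }
    }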



Licensed under: CC-BY-SA with attribution
Not affiliated with Stack Overflow