I am building an ASP.NET MVC web application in C# (Visual Studio). I need to import an Excel file of more than 100,000 records into SQL Server 2008 R2.
My current process is:

1. Use `SqlBulkCopy` with an ADO.NET `DataTable` to copy the data into a staging table.
2. Validate and process the data, marking each record as valid/invalid.
3. Insert and update records in 3 tables. Table1 is updated via a stored procedure. Table4 has 2 triggers, which use a cursor to update several related records in Table4 and another table.
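For context, the bulk-copy step above looks roughly like this (a minimal sketch; the connection string, staging table name `dbo.ImportStaging`, and batch size are illustrative assumptions, not my actual code):

```csharp
// Assumes dataTable was already filled from the Excel file and its
// columns match dbo.ImportStaging (hypothetical staging table).
using (var conn = new SqlConnection(connectionString))
{
    conn.Open();
    using (var bulk = new SqlBulkCopy(conn))
    {
        bulk.DestinationTableName = "dbo.ImportStaging";
        bulk.BatchSize = 5000;          // commit in batches instead of one huge transaction
        bulk.BulkCopyTimeout = 0;       // disable the timeout for large files
        bulk.WriteToServer(dataTable);  // streams all rows to the staging table
    }
}
```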
In testing, 10,000 records take about 1 minute from my local machine in debug mode.

If 10 or 100 users import at the same time, will this hold up? Is there a better process that would improve performance and support many concurrent users?
The Excel file contains 50 columns.
When I test with 5 concurrent users, 2 or 3 processes complete successfully and then I get this error:
Transaction (Process ID 66) was deadlocked on lock | communication buffer resources with another process and has been chosen as the deadlock victim. Rerun the transaction.
Can you disable (or perhaps drop and re-create later) the indexes during the import? Indexes speed up SELECTs but slow down INSERTs and UPDATEs on large tables. Also consider MERGE instead of separate INSERT and UPDATE statements.
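Disabling and rebuilding an index is a one-liner each way. A sketch (the index and table names here are made up for illustration; note you cannot disable the clustered index, since that makes the table itself inaccessible):

```sql
-- Disable a nonclustered index before the bulk load.
ALTER INDEX IX_Table2_SomeColumn ON dbo.Table2 DISABLE;

-- ... run the bulk insert / processing here ...

-- REBUILD re-enables the index and regenerates its data in one step.
ALTER INDEX IX_Table2_SomeColumn ON dbo.Table2 REBUILD;
```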
Find an alternative to the cursor (a set-based UPDATE, a CTE, a WHILE loop, or a MERGE statement), or at least use a FAST_FORWARD cursor if you are not already.
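To illustrate the idea (column and table names are hypothetical): a cursor that touches one row per iteration can usually be rewritten as a single set-based statement, and if a cursor really is required, declaring it LOCAL FAST_FORWARD is the cheapest option:

```sql
-- Set-based replacement: one UPDATE instead of a row-by-row cursor.
UPDATE t4
SET    t4.Total = s.Amount
FROM   dbo.Table4  AS t4
JOIN   dbo.Staging AS s
       ON s.Table4Id = t4.Id;

-- If a cursor is unavoidable, make it read-only and forward-only:
DECLARE cur CURSOR LOCAL FAST_FORWARD FOR
    SELECT Id FROM dbo.Table4;
```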
How are you validating data. Is that step fast enough?
Do you need the triggers during bulk processing? Are these tables used from elsewhere at the same time? If the trigger code executes per row, can you run that logic once, as a batch, inside the bulk import process instead? SQL Server also has a DISABLE TRIGGER option; check whether that suits your purpose.
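Disabling triggers looks like this (a sketch; keep in mind DISABLE TRIGGER changes the table for all sessions, not just your import, so it is only safe if nothing else writes to the table meanwhile):

```sql
-- Turn off all triggers on Table4 (or name a specific trigger instead of ALL).
DISABLE TRIGGER ALL ON dbo.Table4;

-- ... bulk insert / update, then apply the triggers' per-row logic
--     once, as a single set-based statement ...

ENABLE TRIGGER ALL ON dbo.Table4;
```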
SSIS packages can also import data and handle this kind of processing.
You could create table2_temp, table3_temp, table4_temp (perhaps one set per user, created dynamically and dropped later), do your processing there separately, and then MERGE into the main table2, table3, table4, as long as the triggers on table4 do not need to act on pre-existing data.
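The final MERGE from a per-user staging table into the main table would look something like this (column names `Id` and `Name` are placeholders; note MERGE must end with a semicolon):

```sql
MERGE dbo.Table2 AS target
USING dbo.Table2_temp AS src
      ON target.Id = src.Id
WHEN MATCHED THEN
    UPDATE SET target.Name = src.Name
WHEN NOT MATCHED THEN
    INSERT (Id, Name)
    VALUES (src.Id, src.Name);
```

Because each user works in their own _temp table, the heavy validation and processing never contends with other imports; only the final MERGE touches the shared tables, which shortens the window in which deadlocks can occur.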