I have two tables. The following are stripped-down versions of them:

    TableA
        Id    <pk> (incrementing)
        Name  varchar(50)

    TableB
        TableAId  <pk> (non-incrementing)
        Name      varchar(50)
Now these tables have a relationship to each other.
User 1 comes to my site and performs some actions (in this case, adding rows to TableA), so I use SqlBulkCopy to insert all of this data into TableA.
However, I also need to add the data to TableB, but I don't know the newly created IDs from TableA, as SqlBulkCopy won't return them.
So I am thinking of having a stored procedure that finds all the IDs that don't exist in TableB and inserts them:
    INSERT INTO TableB (TableAId, Name)
    SELECT Id, Name
    FROM TableA AS tableA
    WHERE NOT EXISTS (SELECT 1 FROM TableB WHERE TableB.TableAId = tableA.Id)
However, this comes with a problem. A user can delete a row from TableB at any time. If a user deletes a row and then another user (or even the same user) later does something to TableA, my stored procedure will resurrect that deleted row in TableB: it still exists in TableA but not in TableB, so it satisfies the stored procedure's condition.
So is there a better way of dealing with two tables that need to be updated when using bulk insert?
SqlBulkCopy complicates this, so I'd consider using a staging table and an OUTPUT clause.
Example, in a mixture of client pseudocode and SQL:

    create SqlConnection
    create #temptable
    bulk copy into #temptable
    call proc on the same SqlConnection

    proc:
        INSERT TableA (..)
        OUTPUT INSERTED.key, .. INTO TableB
        SELECT .. FROM #temptable

    close connection
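A fuller sketch of that pattern in T-SQL, using the question's table names (the staging and intermediate table names are my own). One caveat worth noting: SQL Server does not allow `OUTPUT ... INTO` to target a table that is on either side of an enforced foreign key, so if TableB really references TableA, the new IDs have to go through a second temp table first.

```sql
-- Staging table; must be created on the same connection that later
-- calls the proc, so the #temp table is still in scope.
CREATE TABLE #Staging (Name varchar(50));

-- (client code bulk copies the user's rows into #Staging here)

-- Intermediate table for the generated identity values, to work around
-- the OUTPUT ... INTO foreign-key restriction described above.
CREATE TABLE #NewIds (TableAId int, Name varchar(50));

-- OUTPUT captures the identity value generated for each inserted row.
INSERT INTO TableA (Name)
OUTPUT INSERTED.Id, INSERTED.Name INTO #NewIds (TableAId, Name)
SELECT Name FROM #Staging;

-- Now both tables stay in step within the same batch/transaction.
INSERT INTO TableB (TableAId, Name)
SELECT TableAId, Name FROM #NewIds;
```

Because TableB is populated from exactly the rows just inserted, rows a user previously deleted from TableB are never resurrected.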
Add another column to TableA and TableB called SessionId and use it to identify row batches.
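A sketch of that alternative (the column, procedure, and parameter names are illustrative, not from the question): the client generates a GUID, stamps it on every row it bulk copies into TableA, then passes the same GUID to a procedure that copies only that batch into TableB.

```sql
-- Hypothetical proc; assumes a SessionId column has been added to TableA.
CREATE PROCEDURE CopyBatchToTableB
    @SessionId uniqueidentifier
AS
BEGIN
    -- Only the rows from this bulk-copy batch are considered, so rows a
    -- user deleted from TableB in an earlier batch are never re-inserted.
    INSERT INTO TableB (TableAId, Name)
    SELECT Id, Name
    FROM TableA
    WHERE SessionId = @SessionId;
END
```

This avoids the temp-table plumbing, at the cost of carrying an extra column on the permanent tables.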