SqlBulkCopy WriteToServer from JSON

bulkinsert c# sqlbulkcopy sql-server sql-server-2008

Question

Hi, I am trying to insert a large chunk of data from JSON into SQL Server. From what I have researched, I can convert the JSON to a DataTable and then use SqlBulkCopy to write the data to the server via WriteToServer(dataTable). Is that the best way to do it?

And how will I be able to extract SCOPE_IDENTITY() for each row inserted this way?
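For reference, the JSON-to-DataTable step described in the question can be sketched with Newtonsoft.Json (the JSON shape, connection string, and table name here are placeholders, not from the original post):

```csharp
// Sketch only: deserialize a JSON array of objects straight into a DataTable,
// then bulk-copy it. Requires the Newtonsoft.Json package.
using Newtonsoft.Json;
using System.Data;
using System.Data.SqlClient;

string json = @"[{""item_id"":1,""zone_id"":2},{""item_id"":3,""zone_id"":4}]";
DataTable dt = JsonConvert.DeserializeObject<DataTable>(json);

using (SqlBulkCopy bulk = new SqlBulkCopy(my_connection_string)) {
    bulk.DestinationTableName = "my_target_table"; // placeholder name
    bulk.WriteToServer(dt);
}
```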

Accepted Answer

Extracting scope identity isn't really possible from a bulk insert command. If you want to accurately get back every single scope_identity() value in an insert statement, you really have to insert each record individually.
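For completeness, the row-at-a-time approach looks like this (table and column names follow the example schema used later in this answer):

```sql
-- Per-row insert: SCOPE_IDENTITY() returns the identity value generated
-- by the last insert in the current scope, so it must run per statement.
INSERT INTO item_drops_rev4 (item_id, monster_class_id, zone_id, xpos, ypos, kill_time)
VALUES (@item_id, @monster_class_id, @zone_id, @xpos, @ypos, @kill_time);

SELECT SCOPE_IDENTITY() AS new_id;
```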

However, you could also consider using a table parameter insert (I wrote an article about it here: http://www.altdevblogaday.com/2012/05/16/sql-server-high-performance-inserts/).

First, create a table insert type:

CREATE TYPE item_drop_bulk_table_rev4 AS TABLE (
    item_id BIGINT,
    monster_class_id INT,
    zone_id INT,
    xpos REAL,
    ypos REAL,
    kill_time datetime
)

Second, create a procedure to insert the data:

CREATE PROCEDURE insert_item_drops_rev4
    @mytable item_drop_bulk_table_rev4 READONLY
AS

BEGIN TRANSACTION

-- Lookup the current ID
DECLARE @id_marker bigint
SELECT @id_marker = ISNULL(MAX(primary_key_column), 0) FROM item_drops_rev4

-- Insert all the data
INSERT INTO item_drops_rev4 
    (item_id, monster_class_id, zone_id, xpos, ypos, kill_time)
SELECT 
    item_id, monster_class_id, zone_id, xpos, ypos, kill_time 
FROM 
    @mytable

-- Return back the identity values
SELECT * FROM item_drops_rev4 WHERE primary_key_column > @id_marker

COMMIT TRANSACTION

Third, write the C# code to insert this data:

DataTable dt = new DataTable();
dt.Columns.Add(new DataColumn("item_id", typeof(Int64)));
dt.Columns.Add(new DataColumn("monster_class_id", typeof(int)));
dt.Columns.Add(new DataColumn("zone_id", typeof(int)));
dt.Columns.Add(new DataColumn("xpos", typeof(float)));
dt.Columns.Add(new DataColumn("ypos", typeof(float)));
dt.Columns.Add(new DataColumn("kill_time", typeof(DateTime)));

for (int i = 0; i < MY_INSERT_SIZE; i++) {
    // item_id, monster_class_id, etc. are placeholders for your own data
    dt.Rows.Add(new object[] { item_id, monster_class_id, zone_id, xpos, ypos, DateTime.Now });
}

// Now we're going to do all the work with one connection!
using (SqlConnection conn = new SqlConnection(my_connection_string)) {
    conn.Open();
    using (SqlCommand cmd = new SqlCommand("insert_item_drops_rev4", conn)) {
        cmd.CommandType = CommandType.StoredProcedure;

        // Adding a "structured" parameter allows you to insert tons of data with low overhead
        SqlParameter param = new SqlParameter("@mytable", SqlDbType.Structured);
        param.Value = dt;
        cmd.Parameters.Add(param);
        using (SqlDataReader dr = cmd.ExecuteReader()) {
            // TODO: Read back in the objects, now attached to their primary keys
        }
    }
}

Now, in theory, you could retrieve only the single "id marker" and compute each row's identity from its offset. However, I'm not convinced that every version of SQL Server out there will always insert records from a table-valued parameter in linear order, so I think it's safer to retrieve the whole data batch.
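If that ordering concern bothers you, the OUTPUT clause (available since SQL Server 2005, so fine on 2008) is another way to get the keys: the INSERT itself returns the generated identity alongside the inserted columns, with no marker query needed. A sketch against the same example schema:

```sql
-- Alternative sketch: the INSERT returns each generated key directly,
-- paired with the row's own columns, so no MAX()-marker query is needed.
INSERT INTO item_drops_rev4
    (item_id, monster_class_id, zone_id, xpos, ypos, kill_time)
OUTPUT
    INSERTED.primary_key_column, INSERTED.item_id, INSERTED.kill_time
SELECT
    item_id, monster_class_id, zone_id, xpos, ypos, kill_time
FROM
    @mytable
```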



Licensed under: CC-BY-SA with attribution
Not affiliated with Stack Overflow
Is this KB legal? Yes, learn why