SqlBulkCopy WriteToServer from JSON

bulkinsert c# sqlbulkcopy sql-server sql-server-2008

Question

Hello, I'm attempting to insert a sizable amount of data from JSON into SQL Server. As far as I know, I can convert the JSON to a DataTable and then write the data to the server with SqlBulkCopy.WriteToServer(dataTable). Is that the best approach to take?

And from there, how will I be able to get the SCOPE_IDENTITY() value for every inserted row?
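For reference, the approach described above might look like the following (a minimal sketch; it assumes Json.NET's `JsonConvert` for the JSON-to-DataTable conversion, and `my_table` and `my_connection_string` are placeholder names):

```csharp
using System.Data;
using System.Data.SqlClient;
using Newtonsoft.Json; // Json.NET (assumption: used for the JSON conversion)

// Deserialize a JSON array of flat objects straight into a DataTable
string json = "[{\"name\":\"a\",\"value\":1},{\"name\":\"b\",\"value\":2}]";
DataTable table = JsonConvert.DeserializeObject<DataTable>(json);

// Bulk-copy the DataTable into SQL Server
using (SqlConnection conn = new SqlConnection(my_connection_string)) {
    conn.Open();
    using (SqlBulkCopy bulk = new SqlBulkCopy(conn)) {
        bulk.DestinationTableName = "my_table"; // hypothetical destination table
        bulk.WriteToServer(table);
    }
}
```

Note that SqlBulkCopy does not return the identity values it generates, which is the crux of the question.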

7/18/2012 5:11:00 PM

Accepted Answer

It's not really feasible to get SCOPE_IDENTITY() values back from a bulk insert operation. If you need the exact SCOPE_IDENTITY() value for every single row, you would have to insert each record separately.
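The row-at-a-time alternative looks something like this (a sketch, assuming a hypothetical `item_drops` table with an identity primary key, and that `rows` holds your in-memory data):

```csharp
using System;
using System.Data.SqlClient;

using (SqlConnection conn = new SqlConnection(my_connection_string)) {
    conn.Open();
    foreach (var row in rows) { // 'rows' is a placeholder for your data
        using (SqlCommand cmd = new SqlCommand(
            "INSERT INTO item_drops (item_id, zone_id) VALUES (@item_id, @zone_id); " +
            "SELECT SCOPE_IDENTITY();", conn)) {
            cmd.Parameters.AddWithValue("@item_id", row.ItemId);
            cmd.Parameters.AddWithValue("@zone_id", row.ZoneId);
            // ExecuteScalar returns the identity generated by this one insert
            long newId = Convert.ToInt64(cmd.ExecuteScalar());
        }
    }
}
```

This costs one round trip per row, which is exactly why it performs poorly for sizable batches.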

But you may also want to consider using a table-valued parameter insert (I wrote an article about it here: http://www.altdevblogaday.com/2012/05/16/sql-server-high-performance-inserts/).

First, create a table type:

CREATE TYPE item_drop_bulk_table_rev4 AS TABLE (
    item_id BIGINT,
    monster_class_id INT,
    zone_id INT,
    xpos REAL,
    ypos REAL,
    kill_time DATETIME
)

Second, create a stored procedure to insert the data:

CREATE PROCEDURE insert_item_drops_rev4
    @mytable item_drop_bulk_table_rev4 READONLY
AS

BEGIN TRANSACTION

-- Lookup the current ID
DECLARE @id_marker bigint
SELECT @id_marker = ISNULL(MAX(primary_key_column), 0) FROM item_drops_rev4 -- ISNULL handles an empty table

-- Insert all the data
INSERT INTO item_drops_rev4 
    (item_id, monster_class_id, zone_id, xpos, ypos, kill_time)
SELECT 
    item_id, monster_class_id, zone_id, xpos, ypos, kill_time 
FROM 
    @mytable

-- Return back the identity values
SELECT * FROM item_drops_rev4 WHERE primary_key_column > @id_marker

COMMIT TRANSACTION
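You can smoke-test the procedure directly from T-SQL before wiring up the C# side (the values here are made up):

```sql
-- Declare a variable of the table type, fill it, and pass it to the procedure
DECLARE @t item_drop_bulk_table_rev4

INSERT INTO @t (item_id, monster_class_id, zone_id, xpos, ypos, kill_time)
VALUES (1, 2, 3, 10.5, 20.5, GETDATE()),
       (4, 5, 6, 11.5, 21.5, GETDATE())

-- Returns the newly inserted rows, including their identity values
EXEC insert_item_drops_rev4 @t
```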

Third, insert the data from C# like this:

DataTable dt = new DataTable();
dt.Columns.Add(new DataColumn("item_id", typeof(Int64)));
dt.Columns.Add(new DataColumn("monster_class_id", typeof(int)));
dt.Columns.Add(new DataColumn("zone_id", typeof(int)));
dt.Columns.Add(new DataColumn("xpos", typeof(float)));
dt.Columns.Add(new DataColumn("ypos", typeof(float)));
dt.Columns.Add(new DataColumn("kill_time", typeof(DateTime)));

for (int i = 0; i < MY_INSERT_SIZE; i++) {
    // item_id, monster_class_id, etc. stand in for your actual values
    dt.Rows.Add(new object[] { item_id, monster_class_id, zone_id, xpos, ypos, DateTime.Now });
}

// Now we're going to do all the work with one connection!
using (SqlConnection conn = new SqlConnection(my_connection_string)) {
    conn.Open();
    using (SqlCommand cmd = new SqlCommand("insert_item_drops_rev4", conn)) {
        cmd.CommandType = CommandType.StoredProcedure;

        // Adding a "structured" parameter allows you to insert tons of data with low overhead
        SqlParameter param = new SqlParameter("@mytable", SqlDbType.Structured);
        param.Value = dt;
        cmd.Parameters.Add(param);
        using (SqlDataReader dr = cmd.ExecuteReader()) {
            // Read back the rows, now attached to their primary keys
            while (dr.Read()) {
                long newId = dr.GetInt64(dr.GetOrdinal("primary_key_column"));
                // ... match each returned row back to its in-memory object here
            }
        }
    }
}

Now, in theory you could return only the single "id marker" and compute the rest, but it's safer to fetch the whole batch back, since I'm not sure that every SQL Server version guarantees the rows of a table-valued parameter are inserted in order.

7/18/2012 5:23:37 PM


Licensed under: CC-BY-SA with attribution
Not affiliated with Stack Overflow