Handling empty fields in a CSV during bulk import to a database

c# csv idatareader sqlbulkcopy


I'm using a CSV file to import geocode information into our database.

I read the CSV file using the Fast CSV Reader library and then insert the rows with SqlBulkCopy.

A sample of the data I'm importing is shown below.


It works fine with valid data, but because the database is configured to reject null values, it fails on the first line above, where a field is empty.

Is it possible to tell SqlBulkCopy to skip bad data? I've tried using the library's built-in properties to make the CSV reader skip problematic lines, but they don't seem to work:

csv.SkipEmptyLines = true;
csv.MissingFieldAction = MissingFieldAction.ParseError;
csv.DefaultParseErrorAction = ParseErrorAction.AdvanceToNextLine;

Another option, I suppose, would be to pre-parse the CSV and strip out all the problematic rows. Or perhaps there is a better .NET CSV library available?
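For what it's worth, the pre-parse option can be sketched with nothing but the standard library. This is a simplified illustration, not production code: it assumes a plain comma-delimited file with no quoted fields, and the sample rows and column count are made up.

```csharp
using System;
using System.Linq;

class CsvPreFilter
{
    // Keep only lines where every expected field is present and non-empty.
    // NOTE: a naive Split(',') does not handle quoted fields containing commas;
    // a real CSV parser should be used for such data.
    public static string[] FilterComplete(string[] lines, int expectedFields)
    {
        return lines
            .Where(line =>
            {
                var fields = line.Split(',');
                return fields.Length == expectedFields
                    && fields.All(f => !string.IsNullOrWhiteSpace(f));
            })
            .ToArray();
    }

    static void Main()
    {
        var lines = new[]
        {
            "1,London,51.5074,-0.1278",
            "2,,51.4545,",               // empty fields -> dropped
            "3,Paris,48.8566,2.3522"
        };

        foreach (var line in FilterComplete(lines, 4))
            Console.WriteLine(line);
    }
}
```

The surviving lines could then be fed to the CSV reader and SqlBulkCopy as before.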

5/22/2018 5:24:06 AM

Accepted Answer

To cope with the null entries, I ultimately parsed the CSV into a DataTable 1000 rows at a time and imported each batch separately.
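That approach can be sketched roughly as follows. This is a minimal illustration under stated assumptions, not the poster's actual code: the table and column names, the batch size constant, and the connection string are made up, and the SqlBulkCopy call is shown commented out so the sketch stays self-contained.

```csharp
using System;
using System.Collections.Generic;
using System.Data;

class BatchedImporter
{
    const int BatchSize = 1000; // import 1000 rows at a time, as in the answer

    // Build DataTables of up to BatchSize rows; empty fields become DBNull.
    // (Rows with empty required fields could instead be skipped entirely
    // if the destination column rejects nulls.)
    public static IEnumerable<DataTable> ToBatches(IEnumerable<string[]> records)
    {
        DataTable table = NewTable();
        foreach (var record in records)
        {
            var row = table.NewRow();
            row["Postcode"]  = string.IsNullOrEmpty(record[0]) ? (object)DBNull.Value : record[0];
            row["Latitude"]  = string.IsNullOrEmpty(record[1]) ? (object)DBNull.Value : record[1];
            row["Longitude"] = string.IsNullOrEmpty(record[2]) ? (object)DBNull.Value : record[2];
            table.Rows.Add(row);

            if (table.Rows.Count == BatchSize)
            {
                yield return table;
                table = NewTable();
            }
        }
        if (table.Rows.Count > 0)
            yield return table;
    }

    static DataTable NewTable()
    {
        var t = new DataTable("Geocodes"); // hypothetical destination table
        t.Columns.Add("Postcode");
        t.Columns.Add("Latitude");
        t.Columns.Add("Longitude");
        return t;
    }

    // Each batch would then be pushed with SqlBulkCopy, e.g.:
    // using (var bulk = new SqlBulkCopy(connectionString))
    // {
    //     bulk.DestinationTableName = "Geocodes";
    //     foreach (var batch in ToBatches(records))
    //         bulk.WriteToServer(batch);
    // }
}
```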

10/19/2012 12:41:37 PM

Popular Answer

We could help you more if you posted your CSV reader code. However, based on the sample code on the site you linked, you could do something like this:

string[] headers = csv.GetFieldHeaders();
while (csv.ReadNextRecord())
        for (int i = 0; i < fieldCount; i++)
            Console.Write(string.Format("{0} = {1};",
                          headers[i], csv[i] ?? "null"));

Note the null-coalescing operator (??) inserted there: missing fields are now printed as "null" instead of breaking the output.
Licensed under: CC-BY-SA with attribution
Not affiliated with Stack Overflow