Handling empty fields in csv during bulk import to database

Tags: c#, csv, idatareader, sqlbulkcopy


I'm importing geocode data to our database from a csv file.

I'm using the A Fast CSV Reader library to read the file, and SqlBulkCopy to load it into the database.

Here's an example of the data I'm importing


It works fine on good data, but on a row with an empty field it throws an exception, because the database is set up to not accept null values.

Is there a way to tell SqlBulkCopy to ignore bad data? I've tried to get the CSV reader to skip bad lines using the library's built-in properties, like so, but they don't appear to work:

csv.SkipEmptyLines = true;
csv.MissingFieldAction = MissingFieldAction.ParseError;
csv.DefaultParseErrorAction = ParseErrorAction.AdvanceToNextLine;

I guess another option would be to pre-parse the CSV and remove all the offending rows. Perhaps there's a better CSV library out there for .NET?
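If pre-parsing is acceptable, it can be sketched as a simple filter pass over the file before the import runs. This is only an illustration under stated assumptions: it assumes plain comma-separated lines with no quoted fields (a real CSV parser should be used otherwise), and the file names are placeholders.

```csharp
using System;
using System.IO;
using System.Linq;

class CsvPreFilter
{
    static void Main()
    {
        // Keep only lines where every field is non-empty.
        // Caveat: String.Split is not a full CSV parser -- it will
        // mishandle quoted fields that contain commas.
        var goodLines = File.ReadLines("geocodes.csv")
            .Where(line => !string.IsNullOrWhiteSpace(line))
            .Where(line => line.Split(',').All(f => f.Trim().Length > 0));

        File.WriteAllLines("geocodes_clean.csv", goodLines);
    }
}
```

The cleaned file can then be fed to the existing reader/SqlBulkCopy pipeline unchanged.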

5/22/2018 5:24:06 AM

Accepted Answer

To deal with the null entries, I ended up parsing the CSV into a DataTable object 1000 rows at a time and importing each batch as I went.
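That batching approach might look something like the following sketch. The table name, column names, and types are hypothetical stand-ins for geocode data (the actual schema isn't shown in the question); the `CsvReader` type is from the linked library, and the connection string is assumed to be supplied by the caller.

```csharp
using System.Data;
using System.Data.SqlClient;
using System.IO;
using LumenWorks.Framework.IO.Csv; // the "A Fast CSV Reader" library

class BatchImporter
{
    const int BatchSize = 1000;

    static void Import(string path, string connectionString)
    {
        // Hypothetical schema -- replace with the real destination columns.
        var table = new DataTable();
        table.Columns.Add("Postcode", typeof(string));
        table.Columns.Add("Latitude", typeof(double));
        table.Columns.Add("Longitude", typeof(double));

        using (var csv = new CsvReader(new StreamReader(path), true))
        using (var bulkCopy = new SqlBulkCopy(connectionString))
        {
            bulkCopy.DestinationTableName = "dbo.Geocodes";

            while (csv.ReadNextRecord())
            {
                // Skip rows with empty fields instead of letting the
                // NOT NULL columns reject the whole bulk copy.
                if (string.IsNullOrEmpty(csv[0]) ||
                    string.IsNullOrEmpty(csv[1]) ||
                    string.IsNullOrEmpty(csv[2]))
                    continue;

                table.Rows.Add(csv[0],
                               double.Parse(csv[1]),
                               double.Parse(csv[2]));

                // Flush every 1000 rows, as in the accepted approach.
                if (table.Rows.Count == BatchSize)
                {
                    bulkCopy.WriteToServer(table);
                    table.Clear();
                }
            }

            if (table.Rows.Count > 0)
                bulkCopy.WriteToServer(table); // final partial batch
        }
    }
}
```

Batching keeps memory use bounded on large files while still getting the throughput of SqlBulkCopy, and filtering in the loop gives a natural place to log or count the skipped rows.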

10/19/2012 12:41:37 PM

Popular Answer

If you could post your CSV reader code, we could help more. But looking at the code on your linked page, you could do something like this:

string[] headers = csv.GetFieldHeaders();
while (csv.ReadNextRecord())
    for (int i = 0; i < fieldCount; i++)
        Console.Write(string.Format("{0} = {1};",
                      headers[i],
                      csv[i] ?? "null"));

See where I have added the null-coalescing operator (??)? It substitutes the literal string "null" in the output whenever a field comes back null, instead of throwing.





Licensed under: CC-BY-SA with attribution
Not affiliated with Stack Overflow