I'm using SqlBulkCopy to import records from a spreadsheet into a SQL Server database.
The bulk copy operation works fine under most conditions. However, it's doing something I can't get my head around.
I have a column (Amount) defined in my column mappings:
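Roughly like this — "Amount" and the table name here are placeholders, not my real schema:

```csharp
// Sketch only - actual source/destination column names are placeholders.
bulkCopy.DestinationTableName = "dbo.WorkTable";
bulkCopy.ColumnMappings.Add("Amount", "Amount");
```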
In the spreadsheet this is a varchar, as it is in my work table (I import to a work table to do some validation before I move the data to my live table).
While I was testing, I went into a record and changed the value from a number to the word FOO. Then I ran my import.
My validation routine correctly identifies a problem with the column - but because the value is null, not because it isn't a number. When I checked the database, I found the field had been imported as null. I expected the value to be FOO.
This strikes me as kind of weird.
If it makes any difference, I'm using OLE DB to access the spreadsheet like a database table and passing the OleDbDataReader in as the source for my SqlBulkCopy object.
Is the OLE DB provider smart enough to realize this isn't a valid value for that column? I thought everything would be string/varchar by default. (We don't modify the column data types in the spreadsheet.)
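For context, the read-then-bulk-copy part looks roughly like this — the file path, sheet name, table name, and connection strings are all placeholders for illustration:

```csharp
using System.Data.OleDb;
using System.Data.SqlClient;

// Sketch only - paths and names below are placeholders.
var excelConnStr =
    @"Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\data\import.xls;" +
    @"Extended Properties=""Excel 8.0;HDR=Yes""";
var sqlConnStr = "...your SQL Server connection string...";

using (var conn = new OleDbConnection(excelConnStr))
{
    conn.Open();
    var cmd = new OleDbCommand("SELECT * FROM [Sheet1$]", conn);
    using (var reader = cmd.ExecuteReader())
    using (var bulkCopy = new SqlBulkCopy(sqlConnStr))
    {
        bulkCopy.DestinationTableName = "dbo.WorkTable";
        bulkCopy.ColumnMappings.Add("Amount", "Amount");
        // The OleDbDataReader goes straight in as the bulk copy source.
        bulkCopy.WriteToServer(reader);
    }
}
```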
I've checked the data reader as suggested - and strangely enough, when I change the Amount field to a string value, the value in the data reader is null. This is the first time I've seen this behavior in an Excel data reader; other string fields work just fine.
I fixed the issue by adding IMEX=1 to my OLE DB connection string. Now the value shows up in the data reader. (I guess I need this to support mixed-type columns.)
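For anyone else hitting this: apparently the Jet driver samples the first few rows of each column (the registry TypeGuessRows setting) to guess a data type, and values that don't match the guessed type come back null. IMEX=1 tells it to read mixed-type columns as text instead. It goes inside the Extended Properties — the path and Excel version below are placeholders:

```csharp
// IMEX=1 must be inside Extended Properties, not at the top level
// of the connection string; everything else is as before.
var excelConnStr =
    @"Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\data\import.xls;" +
    @"Extended Properties=""Excel 8.0;HDR=Yes;IMEX=1""";
```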