Question

I have data in a CSV file similar to this:

Name,Age,Location,Score
"Bob, B",34,Boston,0
"Mike, M",76,Miami,678
"Rachel, R",17,Richmond,"1,234"

While trying to BULK INSERT this data into a SQL Server table, I encountered two problems.

  1. If I use FIELDTERMINATOR=',' then it splits the first (and sometimes the last) column on the embedded commas
  2. The last column is an integer column, but the value is quoted and uses a comma as a thousands separator whenever the number is greater than 1,000

Is there a way to import this data (using an XML format file or otherwise) without manually parsing the CSV file first?

I appreciate any help. Thanks.

Popular Answer

Use MySQL load data:

LOAD DATA LOCAL INFILE 'path-to-/filename.csv' INTO TABLE `sql_tablename`
CHARACTER SET 'utf8'
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
IGNORE 1 LINES;

The OPTIONALLY ENCLOSED BY '"' clause tells MySQL that a quoted field may contain the field terminator, so a value like "Bob, B" is kept together as a single field instead of being split on its comma.

IGNORE 1 LINES skips the header row, so the field names are not imported as data.

The CHARACTER SET 'utf8' line is optional, but useful when names contain diacritics, as in José.
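The statement above solves the quoting problem but not the asker's second issue: the Score column arrives as "1,234" with a thousands separator. In MySQL, LOAD DATA can read a field into a user variable and clean it with a SET clause before storing it. A sketch, assuming the table's columns are named as in the CSV header (Name, Age, Location, Score):

```sql
-- Read the Score field into a user variable, then strip the
-- thousands separator before storing it as an integer.
-- Column names here are assumed to match the CSV header.
LOAD DATA LOCAL INFILE 'path-to-/filename.csv' INTO TABLE `sql_tablename`
CHARACTER SET 'utf8'
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
IGNORE 1 LINES
(Name, Age, Location, @score)
SET Score = REPLACE(@score, ',', '');
```

With this, the quoted value "1,234" is stored as the integer 1234, while plain values like 678 pass through unchanged.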

Licensed under: CC-BY-SA with attribution
Not affiliated with Stack Overflow