Rhino ETL - loading large pipe-delimited files

c# filehelpers rhino-etl sqlbulkcopy sql-server


We've got to load large pipe-delimited files. When loading these into a SQL Server DB using Rhino ETL (which relies on FileHelpers), is it mandatory to provide a record class? We have to load files into several different tables, each with dozens of columns - writing the record classes by hand could take a whole day. I suppose we could write a small tool to generate the record classes from the SQL Server tables.

Another approach would be to write an IDataReader wrapper around a FileStream and then pass it on to a SqlBulkCopy.

SqlBulkCopy does require column mappings as well but it does allow column ordinals - that's easy.
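For what it's worth, the IDataReader approach can be sketched roughly as below. This is an assumption-laden sketch, not production code: it treats the first line as a header row, exposes every value as a string (letting SQL Server do the conversion during the bulk copy), and only implements the members SqlBulkCopy actually calls - everything else throws.

```csharp
using System;
using System.Data;
using System.IO;

// Minimal IDataReader over a pipe-delimited stream, suitable for
// SqlBulkCopy.WriteToServer. Only Read, FieldCount, GetValue, GetName
// and GetOrdinal do real work; the remaining IDataRecord members throw.
public class PipeDelimitedDataReader : IDataReader
{
    private readonly TextReader _reader;
    private readonly string[] _columns;
    private string[] _current;
    private bool _closed;

    public PipeDelimitedDataReader(TextReader reader)
    {
        _reader = reader;
        // assumption: the first line is a header row naming the columns
        _columns = (_reader.ReadLine() ?? string.Empty).Split('|');
    }

    public bool Read()
    {
        string line = _reader.ReadLine();
        if (line == null) return false;
        _current = line.Split('|');
        return true;
    }

    public int FieldCount { get { return _columns.Length; } }
    public object GetValue(int i) { return _current[i]; }
    public string GetName(int i) { return _columns[i]; }
    public int GetOrdinal(string name) { return Array.IndexOf(_columns, name); }
    public bool IsDBNull(int i) { return string.IsNullOrEmpty(_current[i]); }

    public void Close() { _closed = true; _reader.Dispose(); }
    public void Dispose() { Close(); }
    public bool IsClosed { get { return _closed; } }
    public int Depth { get { return 0; } }
    public int RecordsAffected { get { return -1; } }
    public bool NextResult() { return false; }
    public DataTable GetSchemaTable() { throw new NotSupportedException(); }

    // members below are not needed by SqlBulkCopy
    public object this[int i] { get { return GetValue(i); } }
    public object this[string name] { get { return GetValue(GetOrdinal(name)); } }
    public string GetString(int i) { return _current[i]; }
    public Type GetFieldType(int i) { return typeof(string); }
    public string GetDataTypeName(int i) { return "System.String"; }
    public int GetValues(object[] values)
    {
        int n = Math.Min(values.Length, _current.Length);
        Array.Copy(_current, values, n);
        return n;
    }
    public bool GetBoolean(int i) { throw new NotSupportedException(); }
    public byte GetByte(int i) { throw new NotSupportedException(); }
    public long GetBytes(int i, long o, byte[] b, int bo, int l) { throw new NotSupportedException(); }
    public char GetChar(int i) { throw new NotSupportedException(); }
    public long GetChars(int i, long o, char[] b, int bo, int l) { throw new NotSupportedException(); }
    public IDataReader GetData(int i) { throw new NotSupportedException(); }
    public DateTime GetDateTime(int i) { throw new NotSupportedException(); }
    public decimal GetDecimal(int i) { throw new NotSupportedException(); }
    public double GetDouble(int i) { throw new NotSupportedException(); }
    public float GetFloat(int i) { throw new NotSupportedException(); }
    public Guid GetGuid(int i) { throw new NotSupportedException(); }
    public short GetInt16(int i) { throw new NotSupportedException(); }
    public int GetInt32(int i) { throw new NotSupportedException(); }
    public long GetInt64(int i) { throw new NotSupportedException(); }
}
```

With that in place the file streams straight into the table without a record class: set `bulk.DestinationTableName`, add ordinal column mappings if the column order differs, and call `bulk.WriteToServer(new PipeDelimitedDataReader(File.OpenText("customers.txt")))` (file and table names here are placeholders).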

Any ideas/suggestions?


7/26/2012 8:52:05 AM

Popular Answer

I don't know much about Rhino ETL, but FileHelpers has a ClassBuilder which allows you to generate the record class at run time. See the documentation for some examples.

So it would be easy to generate a class with something like the following:

// get the schema for the Customers table
SqlCommand command = new SqlCommand("SELECT TOP 1 * FROM Customers;", connection);
DataTable schemaTable;
using (SqlDataReader reader = command.ExecuteReader(CommandBehavior.SchemaOnly))
{
    schemaTable = reader.GetSchemaTable();
}

// create the FileHelpers record class
// ('|' since the files are pipe-delimited; alternatively there is a
// 'FixedLengthClassBuilder' for fixed-width files)
DelimitedClassBuilder cb = new DelimitedClassBuilder("Customers", "|");
cb.IgnoreFirstLines = 1;
cb.IgnoreEmptyLines = true;

// populate the fields based on the columns
foreach (DataRow row in schemaTable.Rows)
{
    cb.AddField(row.Field<string>("ColumnName"), row.Field<Type>("DataType"));
    cb.LastField.TrimMode = TrimMode.Both;
}

// load the dynamically created class into a FileHelpers engine
FileHelperEngine engine = new FileHelperEngine(cb.CreateRecordClass());

// import your records
DataTable dt = engine.ReadFileAsDT("testCustomers.txt");
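To complete the load, the DataTable that comes back from the engine can be handed straight to SqlBulkCopy with ordinal mappings, as mentioned in the question. A rough sketch (the connection string and destination table name are placeholders):

```csharp
using System.Data;
using System.Data.SqlClient;

// bulk-load the DataTable produced by FileHelperEngine.ReadFileAsDT;
// "connectionString" is assumed to point at the target database
using (SqlBulkCopy bulk = new SqlBulkCopy(connectionString))
{
    bulk.DestinationTableName = "Customers";

    // map by ordinal: source column i -> destination column i
    for (int i = 0; i < dt.Columns.Count; i++)
        bulk.ColumnMappings.Add(i, i);

    bulk.WriteToServer(dt);
}
```

Since the record class was generated from the table's own schema, the column order matches and the ordinal mappings line up without any hand-written configuration.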
7/26/2012 8:51:44 AM

Licensed under: CC-BY-SA with attribution
Not affiliated with Stack Overflow