
SqlBulkCopy Tutorial - Knowledge Base (KB)

4 results in tag: powershell

PowerShell: Implementing an IDataReader wrapper around StreamReader

I am trying to load extremely large CSV files into SQL Server using PowerShell. The code also has to apply on-the-fly regex replacements and allow for various delimiters, EOR, and EOF markers. For maintenance, I would really like all of this logic to exist...
idatareader powershell sqlbulkcopy streamreader
asked by Snowdogging
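
For the question above, a full System.Data.IDataReader wrapper would require a PowerShell 5+ class that implements every member of the interface. As a shorter, hedged alternative that keeps the same streaming behaviour, the sketch below reads the file with a StreamReader and flushes batches to SqlBulkCopy through a small DataTable; the connection string, destination table, column names, and regex step are illustrative assumptions, not taken from the question.

Add-Type -AssemblyName System.Data

# Minimal sketch: stream a large CSV into SQL Server in batches.
# Assumptions (not from the question): table dbo.Staging(Col1, Col2),
# a comma delimiter, and a trusted connection to localhost.
$connectionString = 'Server=localhost;Database=StagingDb;Integrated Security=True'
$csvPath          = 'C:\data\bigfile.csv'
$delimiter        = ','
$batchSize        = 50000

$table = New-Object System.Data.DataTable
[void]$table.Columns.Add('Col1', [string])
[void]$table.Columns.Add('Col2', [string])

$bulkCopy = New-Object System.Data.SqlClient.SqlBulkCopy($connectionString)
$bulkCopy.DestinationTableName = 'dbo.Staging'
$bulkCopy.BulkCopyTimeout      = 0

$reader = New-Object System.IO.StreamReader($csvPath)
try {
    [void]$reader.ReadLine()                    # skip the header row
    while (-not $reader.EndOfStream) {
        $line = $reader.ReadLine()
        $line = $line -replace '\s+$', ''       # example of an on-the-fly regex clean-up
        [void]$table.Rows.Add($line.Split($delimiter))
        if ($table.Rows.Count -ge $batchSize) { # flush a full batch, keep streaming
            $bulkCopy.WriteToServer($table)
            $table.Clear()
        }
    }
    if ($table.Rows.Count -gt 0) { $bulkCopy.WriteToServer($table) }  # final partial batch
}
finally {
    $reader.Close()
    $bulkCopy.Close()
}

WriteToServer(IDataReader) would avoid the intermediate DataTable entirely, but only if the whole interface is implemented; the batched DataTable is often the more maintainable PowerShell compromise.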

Insert XML data into SQL Server via SqlBulkCopy with PowerShell (casting error)

I'm getting a "cast not valid" error when trying to insert a data table containing xml using sqlbulckcopy....I cast the DataColumn that will hold the xml data to type System.Xml.XmlNode is that incorrect? Looking ...here... it just says the .NET Framework...
.net powershell sqlbulkcopy sql-server xml
asked by red888
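
The "cast not valid" error above typically comes from typing the DataColumn as System.Xml.XmlNode, which SqlBulkCopy cannot map to SQL Server's xml type. A minimal sketch of one common workaround, assuming a hypothetical dbo.XmlImport(Id int, Payload xml) table: keep the column as [string] and send the serialized XML text.

Add-Type -AssemblyName System.Data

# Minimal sketch: bulk-copy XML text into an xml column.
# Assumption: destination table dbo.XmlImport(Id int, Payload xml).
$connectionString = 'Server=localhost;Database=StagingDb;Integrated Security=True'

$table = New-Object System.Data.DataTable
[void]$table.Columns.Add('Id', [int])
[void]$table.Columns.Add('Payload', [string])    # [string], NOT [System.Xml.XmlNode]

$doc = [xml]'<order id="1"><item sku="A42" qty="3" /></order>'
[void]$table.Rows.Add(1, $doc.OuterXml)          # send the XML as text; the server parses it

$bulkCopy = New-Object System.Data.SqlClient.SqlBulkCopy($connectionString)
$bulkCopy.DestinationTableName = 'dbo.XmlImport'
$bulkCopy.WriteToServer($table)
$bulkCopy.Close()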

PowerShell: inserting a DataTable that holds datetime columns with SqlBulkCopy

I'm trying to insert a predefined DataTable into a SQL Server database using Data.SqlClient.SqlBulkCopy. There are different data types in the DataTable, but the problem is with DateTime columns. Like this one: $Datatable.Col...
datatable datetime powershell sqlbulkcopy sql-server
asked by Simon Bruun
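
For the DateTime problem above, a minimal sketch assuming a hypothetical dbo.Events table: declare the column as [datetime] and add parsed DateTime values rather than strings, so SqlBulkCopy does not have to guess a format.

Add-Type -AssemblyName System.Data

# Minimal sketch: a DataTable with a typed DateTime column for SqlBulkCopy.
# Assumption: destination table dbo.Events(Name nvarchar(50), OccurredAt datetime2).
$connectionString = 'Server=localhost;Database=StagingDb;Integrated Security=True'

$table = New-Object System.Data.DataTable
[void]$table.Columns.Add('Name', [string])
[void]$table.Columns.Add('OccurredAt', [datetime])   # typed column, not [string]

# Parse incoming text explicitly so the format is unambiguous.
$culture = [System.Globalization.CultureInfo]::InvariantCulture
$stamp   = [datetime]::ParseExact('2017-03-21 14:05:00', 'yyyy-MM-dd HH:mm:ss', $culture)
[void]$table.Rows.Add('backup finished', $stamp)

$bulkCopy = New-Object System.Data.SqlClient.SqlBulkCopy($connectionString)
$bulkCopy.DestinationTableName = 'dbo.Events'
$bulkCopy.WriteToServer($table)
$bulkCopy.Close()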

How to save an Excel file (xlsm) as a CSV file with a semicolon delimiter?

For a project I was trying to do this: Using PowerShell, open an Excel file. (Done) Refresh the table with a data connection to a website. (Done) Save the refreshed Excel file in the same path. (Done) Convert that file into a CSV to be loaded into S...
csv excel powershell sqlbulkcopy sql-server
asked by Giuseppe Lolli
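
A hedged sketch for the conversion step of the question above, with placeholder paths: drive Excel through COM to refresh and save the active sheet as CSV, then re-write the file with a semicolon delimiter via Export-Csv. (Workbook.SaveAs with its Local flag is an alternative on machines whose regional list separator is already ';'.)

# Minimal sketch: refresh an .xlsm workbook, save it as CSV, then re-delimit with ';'.
# Paths are placeholders; Excel must be installed for the COM automation part.
$xlsmPath = 'C:\data\report.xlsm'
$csvPath  = 'C:\data\report.csv'
$outPath  = 'C:\data\report_semicolon.csv'

$excel = New-Object -ComObject Excel.Application
$excel.Visible       = $false
$excel.DisplayAlerts = $false
try {
    $workbook = $excel.Workbooks.Open($xlsmPath)
    $workbook.RefreshAll()          # refresh the web data connection
                                    # (a background refresh may need to finish before saving)
    $workbook.SaveAs($csvPath, 6)   # 6 = xlCSV; saves the active sheet only
    $workbook.Close($false)
}
finally {
    $excel.Quit()
    [void][System.Runtime.InteropServices.Marshal]::ReleaseComObject($excel)
}

# Rewrite the comma-delimited output with a semicolon delimiter.
Import-Csv $csvPath | Export-Csv $outPath -Delimiter ';' -NoTypeInformation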

Page 1 of 1
