utterances-bot opened 2 years ago
Dude, you saved my life.
I ran into this article via Google. Option 2 was a lifesaver. Appreciate you posting this.
Exception calling "WriteToServer" with "1" argument(s): "Cannot access destination table 'Retail transactions'."
At C:\Users\bahrawy\Desktop# Database variables.ps1:43 char:3
    + CategoryInfo          : NotSpecified: (:) [], MethodInvocationException
    + FullyQualifiedErrorId : InvalidOperationException
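A minimal sketch of the usual fix for this error: "Cannot access destination table" from SqlBulkCopy is typically a naming or permissions problem, and a table name containing a space (like 'Retail transactions') must be bracket-quoted when assigned to DestinationTableName. The connection string, schema, and database name below are illustrative assumptions, not taken from the original script.

```powershell
# Sketch, assuming a System.Data.SqlClient.SqlBulkCopy as in the blog post's script.
# The server/database names here are placeholders.
$connectionString = "Data Source=localhost;Integrated Security=true;Initial Catalog=RetailDB;"
$bulkcopy = New-Object Data.SqlClient.SqlBulkCopy($connectionString, [Data.SqlClient.SqlBulkCopyOptions]::TableLock)

# Bracket-quote the table name because it contains a space;
# a bare "Retail transactions" will fail name resolution.
$bulkcopy.DestinationTableName = "[dbo].[Retail transactions]"
```

The connecting login also needs INSERT permission on the table; if the brackets alone don't resolve it, checking permissions is the next step.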
Could you please give a guide on how to use the script?
The first method worked for me too. I had to make some small adjustments to my CSV, but nothing beyond the main variables in the PS script. Also, some batches failed because some fields were larger than my model in the DB. Some fixes to the source file should avoid this in the future! Many thanks.
Hi,
I have been looking for this all over the net; this is a real life saver for me. However, due to embedded quotes (only on some fields), I tried to use the "VisualBasic.FileIO.TextFieldParser to DataTable batches" approach, but it failed. I have over a million records in a comma-delimited CSV file, but only some fields have a text qualifier (") at the beginning and end of the text; the rest do not. The quotes appear only when a field's text contains a comma. The challenge is that the above scripts expect quotes around either all field values or none, while mine are quoted only selectively.
Can you help? Please.
In the "VisualBasic.FileIO.TextFieldParser to DataTable batches" script, where do I insert the " (quote) parser function? Sorry, I am very, very new to programming.
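A minimal sketch that may help with the mixed-quoting question above: TextFieldParser handles optionally quoted fields natively once HasFieldsEnclosedInQuotes is set, so no separate "quote parser function" is needed. The file path and the use of the first row as headers are assumptions for illustration.

```powershell
# Sketch: read a comma-delimited CSV where only some fields are quoted,
# using Microsoft.VisualBasic.FileIO.TextFieldParser. The path is a placeholder.
Add-Type -AssemblyName Microsoft.VisualBasic

$parser = New-Object Microsoft.VisualBasic.FileIO.TextFieldParser("C:\temp\data.csv")
$parser.TextFieldType = [Microsoft.VisualBasic.FileIO.FieldType]::Delimited
$parser.SetDelimiters(",")
# Quotes are honored only where present, so a mix of
# quoted and unquoted fields in one row parses correctly.
$parser.HasFieldsEnclosedInQuotes = $true

$datatable = New-Object System.Data.DataTable
# Assumes the first row is a header row; build the columns from it.
foreach ($col in $parser.ReadFields()) { [void]$datatable.Columns.Add($col) }

while (-not $parser.EndOfData) {
    [void]$datatable.Rows.Add($parser.ReadFields())
}
$parser.Close()
```

From there, the DataTable can be handed to WriteToServer in batches as in the post's scripts.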
High-Performance Techniques for Importing CSV to SQL Server using PowerShell | netnerds.net
If you've ever tried to use PowerShell's Import-CSV with large files, you know that it can exhaust all of your RAM. Previously, I created a script on …
https://blog.netnerds.net/2015/01/powershell-high-performance-techniques-for-importing-csv-to-sql-server/