I am generating a CSV with Export-Csv in PowerShell and then feeding it to a Perl script, but Perl is unable to import the file.
I have compared the CSV file against a working version (exported from the same Perl script rather than from PowerShell) and there are no differences: the columns are exactly the same and both files use a semicolon as the delimiter. If I open the broken file in Excel, however, everything ends up in the first cell of each line (meaning I have to run Text to Columns), while the working file splits into separate cells from the start.
To add to the confusion: when I open the file in Notepad and copy/paste the contents into a new file, the import works!
So, what am I missing? Are there “hidden” properties that I cannot spot with Notepad? Do I have to change the encoding?
To get a better look at your CSV files, try Notepad++. It shows the file encoding in the status bar. Also turn on hidden characters (View > Show Symbol > Show All Characters); this reveals whether the line endings are bare line feeds or carriage return + line feed pairs, whether the whitespace is tabs or spaces, and so on. You can also change the file encoding from the Encoding menu. This should help you identify the differences — Notepad displays none of this information.
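If you'd rather check from the shell, you can also inspect the first few bytes of each file for a byte-order mark in PowerShell. A rough sketch (the path is a placeholder; `-Encoding Byte` is the Windows PowerShell form — on PowerShell 7+ use `-AsByteStream` instead):

```powershell
# Read the first three bytes of the file
$bytes = Get-Content -Path 'C:\test.csv' -Encoding Byte -TotalCount 3

# Compare against the well-known byte-order marks
switch -Regex (($bytes | ForEach-Object { $_.ToString('X2') }) -join ' ') {
    '^EF BB BF' { 'UTF-8 with BOM' }
    '^FF FE'    { 'UTF-16 LE' }
    '^FE FF'    { 'UTF-16 BE' }
    default     { 'No BOM (ASCII/ANSI or UTF-8 without BOM)' }
}
```

If the PowerShell-exported file reports a BOM and the Perl-exported one doesn't, that invisible prefix is a likely reason Perl and Excel treat the two files differently even though Notepad shows identical content.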
Update – Here’s how to convert a text file from Windows to Unix line endings in code:
# Read the file and normalize CRLF (and any lone CR) line endings to LF
$allText = [IO.File]::ReadAllText("C:\test.csv") -replace "`r`n?", "`n"
# Write it back out as plain ASCII (note: any non-ASCII characters become '?')
$encoding = New-Object System.Text.ASCIIEncoding
[IO.File]::WriteAllText("C:\test2.csv", $allText, $encoding)
Or you can use Notepad++ (Edit > EOL Conversion > Unix Format).
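If the root cause does turn out to be the encoding that Export-Csv picked, you can also set it explicitly at export time rather than converting afterwards. A sketch — `Get-Process | Select-Object` just stands in for whatever objects you are actually exporting, and the path is a placeholder; the semicolon delimiter matches what your Perl script expects:

```powershell
# Export with an explicit encoding, no type-information header,
# and the semicolon delimiter the Perl script expects
Get-Process | Select-Object Name, Id |
    Export-Csv -Path 'C:\test.csv' -Delimiter ';' -NoTypeInformation -Encoding ASCII
```

`-NoTypeInformation` also suppresses the `#TYPE ...` comment line that older versions of Export-Csv prepend, which is another common reason a PowerShell-generated CSV fails to parse elsewhere.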