When working with PowerShell scripts, I often need to output data to text files. Over the years, I have learned that using the correct encoding, particularly UTF-8, is crucial for handling special characters and international text accurately.
In this tutorial, I’ll show you multiple methods to write data to files using UTF-8 encoding in PowerShell. I’ve been using these techniques in my automation scripts for years, and they’ve saved me countless hours of troubleshooting character encoding issues.
Whether you’re creating logs, reports, or configuration files, these methods will ensure your text data remains consistent across different systems and applications.
Why UTF-8 Matters in PowerShell
As a PowerShell developer, it is essential to understand the importance of UTF-8 encoding.
UTF-8 has become the standard encoding for the web and most modern applications, ensuring consistent display of text across platforms.
When dealing with data that contains special characters, international characters, or symbols, UTF-8 prevents those annoying “�” or garbled characters from appearing.
Method 1: Using Out-File with UTF8 Encoding
The Out-File cmdlet is one of the simplest ways to write data to a file with UTF-8 encoding in PowerShell:
Get-Process | Out-File -FilePath C:\Reports\processes.txt -Encoding UTF8
This command takes the output of Get-Process (a list of all running processes) and writes it to a file using UTF-8 encoding.
In Windows PowerShell 5.1 and earlier, -Encoding UTF8 always writes a byte order mark (BOM) at the start of the file:
Get-Process | Out-File -FilePath C:\Reports\processes.txt -Encoding UTF8
In PowerShell 6 (PowerShell Core) and later, Microsoft expanded the set of encoding values: utf8 now writes UTF-8 without a BOM, and you can be explicit with utf8BOM or utf8NoBOM:
Get-Process | Out-File -FilePath C:\Reports\processes.txt -Encoding utf8
Adding Content Without Overwriting
If you want to append content to an existing file rather than overwrite it, you can use the -Append parameter:
"New log entry: $(Get-Date)" | Out-File -FilePath C:\logs\app_log.txt -Encoding UTF8 -Append
This technique is perfect for creating log files or collecting data over time without losing earlier entries.
Method 2: Using Set-Content with UTF8 Encoding
Another approach is to use the PowerShell Set-Content cmdlet, which offers a bit more control over the output:
Get-Service | Set-Content -Path C:\Reports\services.txt -Encoding UTF8
The difference is that Set-Content writes each object's string representation (its ToString() value), while Out-File writes the same formatted, table-style text you would see in the console. Set-Content is therefore better suited to writing raw strings.
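To see the difference, compare what the two cmdlets produce for the same object (the file paths here are just illustrative):

```powershell
$svc = Get-Service | Select-Object -First 1
$svc | Out-File -FilePath C:\temp\outfile.txt -Encoding UTF8      # console-style formatted output
$svc | Set-Content -Path C:\temp\setcontent.txt -Encoding UTF8    # the object's ToString() value
```

Opening the two files side by side makes it clear which behavior you want for a given task.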
To append content using this method:
"System scan completed: $(Get-Date)" | Add-Content -Path C:\logs\system_scan.txt -Encoding UTF8
Method 3: Using StreamWriter for Better Performance
When dealing with large amounts of data or when you need maximum performance, using a .NET StreamWriter can be more efficient:
$file = "C:\Reports\large_data.txt"
$encoding = New-Object System.Text.UTF8Encoding $false   # $false = no BOM
$streamWriter = New-Object System.IO.StreamWriter($file, $false, $encoding)
foreach ($i in 1..10000) {
    $streamWriter.WriteLine("Line $i: $(Get-Date)")
}
$streamWriter.Close()
This method is particularly useful when writing to a file in a PowerShell loop, as it avoids repeatedly opening and closing the file.
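Because StreamWriter holds an operating-system file handle, it is good practice to guarantee cleanup even if the loop throws an error. A minimal sketch, reusing the same hypothetical file path as above:

```powershell
$file = "C:\Reports\large_data.txt"
$encoding = New-Object System.Text.UTF8Encoding $false
$streamWriter = New-Object System.IO.StreamWriter($file, $false, $encoding)
try {
    foreach ($i in 1..10000) {
        $streamWriter.WriteLine("Line $i: $(Get-Date)")
    }
}
finally {
    $streamWriter.Dispose()   # releases the file handle even if an error occurred
}
```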
UTF-8 With or Without BOM?
The Byte Order Mark (BOM) is a special byte sequence (EF BB BF for UTF-8) at the beginning of a file that signals its encoding. In the StreamWriter example above, I used $false to create UTF-8 encoding without a BOM.
For compatibility with some applications (particularly older ones), you might want to include the BOM:
$encoding = New-Object System.Text.UTF8Encoding $true # $true = include BOM
Modern applications typically handle UTF-8 without BOM just fine, and it’s becoming the preferred standard.
Method 4: Export-Csv with UTF8 Encoding
When working with structured data, the Export-Csv cmdlet is extremely useful. The following command exports selected process properties to a UTF-8 encoded CSV file (in PowerShell 6+, -NoTypeInformation is the default and can be omitted):
Get-Process | Select-Object Name, ID, CPU | Export-Csv -Path C:\Reports\process_metrics.csv -Encoding UTF8 -NoTypeInformation
This creates a properly UTF-8 encoded CSV file that can be opened in Excel or other applications without encoding issues.
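If you want to confirm the data round-trips cleanly, you can read the file back with Import-Csv (assuming the path from the example above):

```powershell
$rows = Import-Csv -Path C:\Reports\process_metrics.csv -Encoding UTF8
$rows | Select-Object -First 3 | Format-Table
```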
Method 5: Using PowerShell Redirection with UTF-8
PowerShell’s redirection operators (> and >>) are convenient, but by default, they don’t use UTF-8. Here’s how to set UTF-8 as the default encoding for redirection:
# For PowerShell 5.1 (Windows PowerShell)
$PSDefaultParameterValues['Out-File:Encoding'] = 'utf8'
# For PowerShell 6+ (PowerShell Core)
$PSDefaultParameterValues['Out-File:Encoding'] = 'utf8NoBOM'
# Now redirection will use UTF-8
Get-Process > C:\Reports\processes.txt
"Append this line" >> C:\Reports\processes.txt
This technique allows you to use the more concise redirection syntax while still ensuring UTF-8 encoding.
Special Considerations for UTF-8 in PowerShell
PowerShell Version Differences
In PowerShell 6 and later, the default encoding for cmdlets like Out-File, Set-Content, and Export-Csv is UTF-8 without BOM. In Windows PowerShell 5.1 and earlier, the defaults vary by cmdlet: Out-File (and the redirection operators) default to UTF-16 LE ("Unicode"), Set-Content defaults to the system's ANSI code page, and Export-Csv defaults to ASCII.
Always specify -Encoding UTF8 explicitly if you need to ensure compatibility across PowerShell versions.
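For a script that has to run on both editions, one option is to pick the encoding value at runtime. A sketch (note that in 5.1 the UTF8 value always includes a BOM, since utf8NoBOM only exists in PowerShell 6+):

```powershell
# utf8NoBOM is only valid in PowerShell 6+; fall back to UTF8 (with BOM) on 5.1
$enc = if ($PSVersionTable.PSVersion.Major -ge 6) { 'utf8NoBOM' } else { 'UTF8' }
"some text" | Out-File -FilePath C:\temp\example.txt -Encoding $enc
```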
Writing to Files Without Carriage Returns
Sometimes you need to write content to a file without adding a new line character. You can use the -NoNewline parameter with Out-File:
"This will not have a newline at the end" | Out-File -FilePath C:\temp\no_newline.txt -Encoding UTF8 -NoNewline
This is particularly useful when generating file formats that are sensitive to a trailing newline character.
Working with Different File Types
When generating files for web applications:
$htmlContent = "<html><body><h1>Hello World</h1></body></html>"
$htmlContent | Out-File -FilePath C:\websites\index.html -Encoding UTF8
For configuration files:
$configContent = @"
{
    "appName": "MyApplication",
    "version": "1.0.0",
    "settings": {
        "logLevel": "info",
        "maxConnections": 10
    }
}
"@
$configContent | Out-File -FilePath C:\config\settings.json -Encoding UTF8
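Rather than hand-writing JSON in a here-string, you can also build an object and let ConvertTo-Json generate the text. A sketch that produces the same settings file:

```powershell
$config = [ordered]@{
    appName  = 'MyApplication'
    version  = '1.0.0'
    settings = [ordered]@{ logLevel = 'info'; maxConnections = 10 }
}
$config | ConvertTo-Json -Depth 3 | Out-File -FilePath C:\config\settings.json -Encoding UTF8
```

This approach avoids typos in the JSON syntax, since ConvertTo-Json handles quoting and escaping for you.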
Troubleshooting UTF-8 Encoding Issues
If you encounter issues with your UTF-8 encoded files:
- Check if the application reading the file expects a BOM or not
- Verify the PowerShell version you’re using and adjust the encoding parameter as needed
- When handling files created on different operating systems, be aware of line ending differences (Windows uses CR+LF, while Unix/Linux uses LF)
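If line endings turn out to be the problem, you can normalize them before writing. A sketch that converts Windows CR+LF to Unix LF (the paths are hypothetical):

```powershell
$text = (Get-Content -Path C:\temp\windows.txt -Raw) -replace "`r`n", "`n"
[System.IO.File]::WriteAllText('C:\temp\unix.txt', $text, [System.Text.UTF8Encoding]::new($false))
```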
A simple way to check for a BOM is to inspect the file's raw bytes with PowerShell itself (Get-Content strips the BOM while decoding, so testing the decoded string is unreliable):
$bytes = [System.IO.File]::ReadAllBytes("C:\temp\myfile.txt")
if ($bytes.Length -ge 3 -and $bytes[0] -eq 0xEF -and $bytes[1] -eq 0xBB -and $bytes[2] -eq 0xBF) {
    Write-Output "File has a UTF-8 BOM"
} else {
    Write-Output "File has no BOM"
}
I hope this tutorial explains how to write to files using UTF-8 encoding in PowerShell using various methods such as:
- Using Out-File with UTF8 Encoding
- Using Set-Content with UTF8 Encoding
- Using StreamWriter for Better Performance
- Export-Csv with UTF8 Encoding
- Using PowerShell Redirection with UTF-8
Do you have any questions about writing to files with UTF-8 encoding in PowerShell? Let me know in the comments below!

Hey! I’m Bijay Kumar, founder of SPGuides.com and a Microsoft Business Applications MVP (Power Automate, Power Apps). I launched this site in 2020 because I truly enjoy working with SharePoint, Power Platform, and SharePoint Framework (SPFx), and wanted to share that passion through step-by-step tutorials, guides, and training videos. My mission is to help you learn these technologies so you can utilize SharePoint, enhance productivity, and potentially build business solutions along the way.