When working with PowerShell automation, I often need to process data in loops and write the results to files. This is a common requirement when generating reports, logging application activities, or processing batches of data.
Over my years of PowerShell experience, I’ve discovered several effective approaches to writing data to files within loops.
In this tutorial, I’ll share multiple methods to write to files inside PowerShell loops, along with real-world examples and performance considerations.
Note: For all the examples, I have used VS Code to execute the PowerShell scripts; you can use any editor of your choice.
Method 1: Using Out-File Inside a Loop
The simplest way to write to a file in a PowerShell loop is using the Out-File cmdlet. This method is easy but has some performance limitations for large datasets.
Here’s how to use it:
# Sample data - list of US states
$states = @("Alabama", "Alaska", "Arizona", "Arkansas", "California")
# Create or overwrite the file
"US States Report" | Out-File -FilePath "C:\Reports\states.txt"
"Generated on: $(Get-Date)" | Out-File -FilePath "C:\Reports\states.txt" -Append
"---------------------------" | Out-File -FilePath "C:\Reports\states.txt" -Append
# Write each state to the file in a loop
foreach ($state in $states) {
    "State: $state" | Out-File -FilePath "C:\Reports\states.txt" -Append
}
In this example, I’m first creating the file with a header, then appending each state in the loop. The -Append parameter is crucial here as it preserves the existing content.
However, using Out-File repeatedly in a loop opens and closes the file each time, which can impact performance with large datasets.
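A simple way to avoid reopening the file on every iteration is to collect the loop output first and then pipe it to Out-File in a single call. Here is a minimal sketch that reuses the $states array from above:
# Collect the formatted lines first, then write them with one Out-File call
$stateLines = foreach ($state in $states) {
    "State: $state"
}
$stateLines | Out-File -FilePath "C:\Reports\states.txt" -Append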
Method 2: Using Add-Content for Better Performance
The Add-Content cmdlet is specifically designed for appending content and offers better performance than Out-File for loop scenarios.
Here is the complete example script:
# Sample data - US sales figures
$salesData = @(
    [PSCustomObject]@{Region="East"; Product="Laptop"; Sales=42500},
    [PSCustomObject]@{Region="West"; Product="Desktop"; Sales=38750},
    [PSCustomObject]@{Region="South"; Product="Tablet"; Sales=29800},
    [PSCustomObject]@{Region="North"; Product="Phone"; Sales=56200},
    [PSCustomObject]@{Region="Central"; Product="Accessories"; Sales=18300}
)
# Create file with header
Set-Content -Path "C:\Reports\sales_report.csv" -Value "Region,Product,Sales"
# Process each record and append to file
foreach ($record in $salesData) {
    Add-Content -Path "C:\Reports\sales_report.csv" -Value "$($record.Region),$($record.Product),$($record.Sales)"
}
The Add-Content cmdlet is optimized for appending operations and generally performs better than Out-File when writing inside loops.
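Since this example produces a CSV file, you could also let PowerShell handle the formatting for you. Here is a minimal sketch, assuming the same $salesData array, that replaces the header-plus-loop approach with a single Export-Csv call (which generates the header automatically):
# Write the entire collection as CSV in a single call
$salesData | Export-Csv -Path "C:\Reports\sales_report.csv" -NoTypeInformation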
Method 3: Using StreamWriter for Maximum Performance
For processing large datasets or when performance is critical, using the .NET StreamWriter class provides the best performance since it keeps the file open during the entire loop:
# Sample data - Server monitoring data
$serverData = @(
    [PSCustomObject]@{Server="NY-WEB01"; CPU=78; Memory=82; Disk=45},
    [PSCustomObject]@{Server="NY-WEB02"; CPU=65; Memory=75; Disk=38},
    [PSCustomObject]@{Server="CHI-DB01"; CPU=92; Memory=88; Disk=72},
    [PSCustomObject]@{Server="LA-APP01"; CPU=45; Memory=62; Disk=29},
    [PSCustomObject]@{Server="MIA-WEB01"; CPU=81; Memory=77; Disk=53}
)
# Create StreamWriter
$filePath = "C:\Reports\server_status.log"
$streamWriter = New-Object System.IO.StreamWriter($filePath, $false)
# Write header
$streamWriter.WriteLine("Server Status Report - $(Get-Date -Format 'yyyy-MM-dd HH:mm:ss')")
$streamWriter.WriteLine("Server, CPU%, Memory%, Disk%")
# Process data in loop
foreach ($server in $serverData) {
    $streamWriter.WriteLine("$($server.Server), $($server.CPU), $($server.Memory), $($server.Disk)")
}
# Important: Close the StreamWriter to release the file
$streamWriter.Close()
This method is the most efficient for high-volume data processing because the file handle stays open for the whole loop, so you avoid the cost of opening and closing the file on every write. The trade-off is that you must close the writer yourself, or the file remains locked.
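Because of that, it's worth wrapping the loop in try/finally so the handle is released even if a write fails partway through. Here is a minimal sketch using the same $filePath and $serverData from above:
$streamWriter = New-Object System.IO.StreamWriter($filePath, $false)
try {
    foreach ($server in $serverData) {
        $streamWriter.WriteLine("$($server.Server), $($server.CPU), $($server.Memory), $($server.Disk)")
    }
}
finally {
    # Runs even if WriteLine throws, so the file handle is always released
    $streamWriter.Close()
}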
Method 4: Build a String and Write Once
Another efficient approach is to build a string in memory during the loop and write it to the file once after the loop completes.
Here is an example:
# Sample data - Employee records
$employees = @(
    [PSCustomObject]@{Name="John Smith"; Department="Sales"; Location="New York"},
    [PSCustomObject]@{Name="Emily Johnson"; Department="Marketing"; Location="Los Angeles"},
    [PSCustomObject]@{Name="Michael Brown"; Department="IT"; Location="Chicago"},
    [PSCustomObject]@{Name="Sarah Davis"; Department="HR"; Location="Dallas"},
    [PSCustomObject]@{Name="Robert Wilson"; Department="Finance"; Location="Boston"}
)
# Initialize the output string with a header
$output = "Employee Directory`r`n"
$output += "-------------------`r`n"
# Build content in the loop
foreach ($employee in $employees) {
    $output += "Name: $($employee.Name)`r`n"
    $output += "Department: $($employee.Department)`r`n"
    $output += "Office: $($employee.Location)`r`n"
    $output += "-------------------`r`n"
}
# Write the entire content to file at once
$output | Out-File -FilePath "C:\Reports\employee_directory.txt"
This approach minimizes file I/O operations, which can significantly improve performance, especially for larger datasets.
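One caveat: every += creates a new copy of the string, which gets slow as the output grows. For very large loops, the .NET StringBuilder class appends without copying. Here is a minimal sketch using the same $employees array:
# Accumulate output in a StringBuilder instead of concatenating strings
$sb = New-Object System.Text.StringBuilder
[void]$sb.AppendLine("Employee Directory")
[void]$sb.AppendLine("-------------------")
foreach ($employee in $employees) {
    [void]$sb.AppendLine("Name: $($employee.Name)")
    [void]$sb.AppendLine("Department: $($employee.Department)")
    [void]$sb.AppendLine("Office: $($employee.Location)")
    [void]$sb.AppendLine("-------------------")
}
# Write the accumulated text to the file in one operation
$sb.ToString() | Out-File -FilePath "C:\Reports\employee_directory.txt"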
Method 5: Create a Dynamic Log File with Timestamp
For logging scenarios, it’s often useful to create a file with today’s date in the filename:
# Generate a log filename with current date
$logDate = Get-Date -Format "yyyyMMdd"
$logFile = "C:\Logs\application_$logDate.log"
# Sample application events to log
$appEvents = @(
    [PSCustomObject]@{Timestamp=(Get-Date).AddMinutes(-30); Level="INFO"; Message="Application started"},
    [PSCustomObject]@{Timestamp=(Get-Date).AddMinutes(-25); Level="INFO"; Message="Connected to database"},
    [PSCustomObject]@{Timestamp=(Get-Date).AddMinutes(-15); Level="WARNING"; Message="High memory usage detected"},
    [PSCustomObject]@{Timestamp=(Get-Date).AddMinutes(-10); Level="ERROR"; Message="Failed to process transaction #12345"},
    [PSCustomObject]@{Timestamp=(Get-Date).AddMinutes(-5); Level="INFO"; Message="Backup completed successfully"}
)
# Create the file if it doesn't exist
if (-not (Test-Path $logFile)) {
    New-Item -Path $logFile -ItemType File -Force | Out-Null
    "Log created at $(Get-Date)" | Out-File -FilePath $logFile
    "----------------------------------------" | Out-File -FilePath $logFile -Append
}
# Log each event
foreach ($appEvent in $appEvents) {
    $formattedTime = $appEvent.Timestamp.ToString("yyyy-MM-dd HH:mm:ss")
    $logEntry = "[$formattedTime] [$($appEvent.Level)] $($appEvent.Message)"
    Add-Content -Path $logFile -Value $logEntry
}
This approach is ideal for application logs that need to be organized by date. The script checks if the log file exists before creating it, which is a best practice for log management.
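If you log from several places in a script, it can be convenient to wrap this pattern in a small helper function. Here is a sketch; the Write-Log name and its parameters are my own convention, not a built-in cmdlet:
function Write-Log {
    param(
        [Parameter(Mandatory)]
        [string]$Message,
        [ValidateSet("INFO", "WARNING", "ERROR")]
        [string]$Level = "INFO",
        [string]$Path = "C:\Logs\application_$(Get-Date -Format 'yyyyMMdd').log"
    )
    $timestamp = Get-Date -Format "yyyy-MM-dd HH:mm:ss"
    Add-Content -Path $Path -Value "[$timestamp] [$Level] $Message"
}

# Usage inside any loop or script
Write-Log -Message "Application started"
Write-Log -Message "Failed to process transaction #12345" -Level ERROR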
Handle Large Collections with ForEach-Object
When processing very large collections, the ForEach-Object cmdlet with the pipeline can be more memory-efficient than a foreach loop. Here is an example.
# Generate a large dataset (simulating 1000 log entries)
$logEntries = 1..1000 | ForEach-Object {
    [PSCustomObject]@{
        ID = "TRANS-$_"
        Timestamp = (Get-Date).AddMinutes(-$_)
        Status = (Get-Random -InputObject @("SUCCESS", "PENDING", "FAILED"))
    }
}
# Process and write to file using pipeline
$logEntries | ForEach-Object {
    $logLine = "[$($_.Timestamp.ToString('HH:mm:ss'))] $($_.ID) - $($_.Status)"
    $logLine | Out-File -FilePath "C:\Reports\transaction_log.txt" -Append
}
The ForEach-Object cmdlet processes items one at a time as they come through the pipeline, which can be more efficient for memory usage with very large datasets.
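Note that the loop above still calls Out-File -Append for every item, so the file is reopened 1000 times. You can keep the pipeline and move the file write to the end instead; Out-File then opens the file once and writes lines as they stream through. A minimal sketch using the same $logEntries:
# Format each entry in the pipeline, then let a single Out-File handle all the writing
$logEntries |
    ForEach-Object { "[$($_.Timestamp.ToString('HH:mm:ss'))] $($_.ID) - $($_.Status)" } |
    Out-File -FilePath "C:\Reports\transaction_log.txt"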
Best Practices for Writing to Files in Loops
After years of PowerShell scripting, I’ve found these best practices to be invaluable:
- Choose the right method based on data size:
  - For small datasets: Out-File or Add-Content is fine
  - For medium datasets: use Add-Content, or build a string and write once
  - For large datasets: use StreamWriter, or process in batches
- Always use try/catch for file operations:
You can see an example below:
try {
    foreach ($item in $collection) {
        $item | Out-File -FilePath $filePath -Append -ErrorAction Stop
    }
}
catch {
    Write-Error "Failed to write to file: $_"
}
- Test if the directory exists before writing:
Here is the code to check whether a directory exists, and create it, before writing:
$directory = Split-Path -Path $filePath -Parent
if (-not (Test-Path $directory)) {
    New-Item -Path $directory -ItemType Directory -Force | Out-Null
}
- Use UTF-8 encoding without BOM for maximum compatibility:
Here is an example.
$utf8WithoutBom = New-Object System.Text.UTF8Encoding $false
[System.IO.File]::WriteAllLines($filePath, $lines, $utf8WithoutBom)
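If you are on PowerShell 7, you can also get BOM-less UTF-8 directly from Out-File, since utf8NoBOM is an accepted -Encoding value there (and the default). A one-line sketch, assuming the same $lines and $filePath as above:
# PowerShell 7+: writes UTF-8 without a BOM (this is also the default encoding)
$lines | Out-File -FilePath $filePath -Encoding utf8NoBOM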
In this tutorial, I explained several methods for writing to files within PowerShell loops. For simple scripts with small datasets, Out-File or Add-Content will serve you well. For enterprise-grade scripts processing thousands of records, consider using StreamWriter or building strings in memory.
I hope you found this tutorial helpful for your PowerShell tasks. Remember that the right approach depends on your specific use case, so feel free to combine these methods as needed.