Monitoring GPU usage from PowerShell is vendor-dependent and typically relies on an external command-line tool rather than a built-in cmdlet. For NVIDIA GPUs, the standard tool is the NVIDIA System Management Interface (nvidia-smi). Here's an example script that calls nvidia-smi to sample GPU usage and generate a daily report:
# Set the output file path
$OutputPath = "C:\Path\To\Reports\"

# Create a folder for reports if it doesn't exist
if (-not (Test-Path -Path $OutputPath)) {
    New-Item -ItemType Directory -Path $OutputPath | Out-Null
}

# Monitoring duration and sampling interval (in seconds)
$MonitoringDuration = 3600 # 1 hour
$SampleInterval = 5

# Generate a timestamp for the report
$Timestamp = Get-Date -Format "yyyyMMdd_HHmmss"
$ReportFileName = "GPU_Usage_Report_$Timestamp.csv"
$ReportFilePath = Join-Path -Path $OutputPath -ChildPath $ReportFileName

# Path to nvidia-smi (adjust to match your installation)
$NvidiaSmi = "C:\Program Files\NVIDIA Corporation\NVSMI\nvidia-smi.exe"

# Create an array to store the data
$ReportData = @()

# Sample GPU usage until the monitoring duration has elapsed.
# (Note: nvidia-smi's -l flag sets a repeat *interval* and loops
# indefinitely, so the loop is done in PowerShell instead.)
$EndTime = (Get-Date).AddSeconds($MonitoringDuration)
while ((Get-Date) -lt $EndTime) {
    # One line per GPU: "timestamp, utilization"
    $Samples = & $NvidiaSmi --query-gpu=timestamp,utilization.gpu --format=csv,noheader,nounits
    foreach ($Sample in $Samples) {
        $SampleTime, $GPUUsage = ($Sample -split ',').Trim()
        $ReportData += [PSCustomObject]@{
            Timestamp       = $SampleTime
            GPUUsagePercent = $GPUUsage
        }
    }
    Start-Sleep -Seconds $SampleInterval
}

# Export the report data to a CSV file
$ReportData | Export-Csv -Path $ReportFilePath -NoTypeInformation
Write-Host "GPU Usage Report generated: $ReportFilePath"
Explanation:
Customize the $OutputPath variable to the desired directory where you want to save the reports.
Adjust the $MonitoringDuration variable to set the duration for monitoring. In the example, it's set to 1 hour (3600 seconds).
The script uses nvidia-smi to query GPU timestamp and utilization information. Make sure to adjust the path to nvidia-smi based on your installation.
The script creates a CSV file with a timestamp in the filename, containing data on GPU utilization for each sample.
Run this script daily using Task Scheduler to generate daily reports.
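As a sketch of the Task Scheduler step, a daily task can be registered from an elevated PowerShell session using the built-in ScheduledTasks cmdlets. The script path and run time below are placeholders; adjust them to your setup:

# Register a daily task that runs the monitoring script at 09:00.
# The script path is a placeholder -- point it at your saved script.
$Action  = New-ScheduledTaskAction -Execute "powershell.exe" `
    -Argument "-NoProfile -ExecutionPolicy Bypass -File C:\Path\To\Monitor-GPU.ps1"
$Trigger = New-ScheduledTaskTrigger -Daily -At 9am
Register-ScheduledTask -TaskName "Daily GPU Usage Report" -Action $Action -Trigger $Trigger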
Note: This script assumes you have NVIDIA GPUs and nvidia-smi installed on your system. For other GPU vendors, you would need to use their specific tools or APIs to collect GPU usage information. Additionally, ensure that nvidia-smi is in the system's PATH or provide the full path to it in the script.
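As a vendor-neutral alternative on Windows 10 and later, the GPU performance counters exposed by the graphics scheduler can be read with the built-in Get-Counter cmdlet, with no vendor tool required. This is a sketch using the standard "GPU Engine" counter set; counter availability depends on the graphics driver:

# Sample the 3D-engine utilization counters; these are exposed for
# any GPU with a modern WDDM driver, regardless of vendor.
$Samples = Get-Counter -Counter '\GPU Engine(*engtype_3D)\Utilization Percentage'
# Sum across engine instances to approximate overall GPU usage
$TotalUsage = ($Samples.CounterSamples | Measure-Object -Property CookedValue -Sum).Sum
Write-Host ("GPU 3D engine utilization: {0:N1}%" -f $TotalUsage)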