Closed avishnyakov closed 7 years ago
Yep! There are a few ways around that:
Cheers!
I get this same issue. I think that the default log file name should include this logic.
Oh, misread this, you're talking about the Invoke-Parallel log file.
Any preferences on a workaround? Perhaps default to something like Join-Path $ENV:AppData Invoke-Parallel-<a new guid>.log instead of C:\temp\log.log, and let the user specify a more unique name if desired?
Cheers!
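The GUID-based default suggested above can be sketched like this (a minimal illustration, not the actual Invoke-Parallel code; the variable name is arbitrary):

```powershell
# Build a per-run log path under %APPDATA% so concurrent runs never
# collide on the same file. [guid]::NewGuid() is unique per call.
$logFile = Join-Path $env:APPDATA "Invoke-Parallel-$([guid]::NewGuid()).log"
$logFile
```

Each invocation yields a distinct path, so two builds running in parallel write to different files by default.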
Just thought I would leave another suggestion here for you.
Original:
if ($logFile -and $log) {
    ($log | ConvertTo-Csv -Delimiter ";" -NoTypeInformation)[1] | Out-File $LogFile -Append
}
Modified:
if ($logFile -and $log) {
    $isOutFileComplete = $false
    while (-not $isOutFileComplete) {
        try {
            # -ErrorAction Stop makes a locked-file error terminating,
            # so the catch block actually fires and the write is retried
            ($log | ConvertTo-Csv -Delimiter ";" -NoTypeInformation)[1] | Out-File $LogFile -Append -ErrorAction Stop
            $isOutFileComplete = $true
        }
        catch {
            Start-Sleep -Milliseconds 50
        }
    }
}
This should allow all parallel runspaces to write to the same log file without blocking errors. The only downside is that it may be a little slower when multiple processes write to the log at once and have to sleep.
FYI - 50 Milliseconds was an arbitrary number, might be able to get away with 10 milliseconds.
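One caveat with the retry loop as written: if the file stays locked indefinitely, it spins forever. A bounded variant fails loudly instead (a sketch; the attempt cap and file path are illustrative, not from the original comment):

```powershell
# Retry a log write up to a fixed number of times, then give up.
$LogFile = Join-Path ([System.IO.Path]::GetTempPath()) 'Invoke-Parallel-retry-demo.log'
$maxAttempts = 100   # illustrative cap: ~5 seconds at 50 ms per retry

for ($attempt = 1; $attempt -le $maxAttempts; $attempt++) {
    try {
        # -ErrorAction Stop turns a locked-file error into a catchable exception
        'log line' | Out-File -FilePath $LogFile -Append -ErrorAction Stop
        break
    }
    catch {
        if ($attempt -eq $maxAttempts) { throw }   # surface the failure
        Start-Sleep -Milliseconds 50
    }
}
```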
Alrighty, workaround for now is that Invoke-Parallel won't log unless you explicitly tell it to by including a log file path. At some point, mutexes or a bound 'check every few milliseconds' bit might help
Cheers!
Temporary workaround in place, closing this out. If someone wants to tackle adding mutex support when logging is specified, that would be awesome, just open a new issue - cheers!
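The mutex approach mentioned above could look roughly like this (a sketch only, not Invoke-Parallel's actual implementation; the mutex and file names are made up for the example):

```powershell
# Serialize writes to a shared log file with a named System.Threading.Mutex,
# so concurrent runspaces/processes take turns instead of colliding.
$mutex   = New-Object System.Threading.Mutex($false, 'InvokeParallelLogMutex')
$logFile = Join-Path ([System.IO.Path]::GetTempPath()) 'Invoke-Parallel-mutex-demo.log'

$null = $mutex.WaitOne()        # block until we own the mutex
try {
    'log line from one runspace' | Out-File -FilePath $logFile -Append
}
finally {
    [void]$mutex.ReleaseMutex() # always release, even if the write throws
    $mutex.Dispose()
}
```

On Windows, prefixing the name with 'Global\' would make the mutex visible across sessions, which matters when separate build-agent processes share one log.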
Me again :)
Sometimes CI build agents give the following, because we run up to 16-20 items in parallel from different builds. Just FYI.