dotnet / runtime

.NET is a cross-platform runtime for cloud, mobile, desktop, and IoT apps.
https://docs.microsoft.com/dotnet/core/
MIT License

[Apple][TVOS][release/6.0-staging] Unhandled exception. System.IO.IOException: No space left on device #105146

Open · richlander opened 2 months ago

richlander commented 2 months ago

I am seeing this. Either our tests are doing something wrong, our infra isn't cleaning things up, or someone downloaded too many shows.
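A quick diagnostic sketch for sorting out the "tests vs. infra" question: dump free space and the size of the likely big consumers on the runner. The directory paths are assumptions about a typical hosted macOS agent layout, not confirmed locations.

```shell
# Free space on the root volume (the device the agent's _diag log lives on).
df -h /

# Size of plausible space hogs; paths are guesses at the hosted-agent layout.
for d in "$HOME/work" "$HOME/Library/Developer/CoreSimulator"; do
  [ -d "$d" ] && du -sh "$d" 2>/dev/null
done
true
```

Running this as an early pipeline step on a failing leg would show whether the disk is already near full before the tests start (infra not cleaning up) or fills during the run (tests leaking artifacts).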

Leg: Build tvOSSimulator x64 Release AllSubsets_Mono


   at Microsoft.VisualStudio.Services.Agent.Worker.Program.MainAsync(IHostContext context, String[] args) in /Users/runner/work/1/s/src/Agent.Worker/Program.cs:line 52
System.IO.IOException: No space left on device : '/Users/runner/runners/3.241.0/_diag/Worker_20240718-051117-utc.log'
   at System.IO.RandomAccess.WriteAtOffset(SafeFileHandle handle, ReadOnlySpan`1 buffer, Int64 fileOffset)
   at System.IO.Strategies.BufferedFileStreamStrategy.FlushWrite()
   at System.IO.StreamWriter.Flush(Boolean flushStream, Boolean flushEncoder)
   at System.Diagnostics.TextWriterTraceListener.Flush()
   at Microsoft.VisualStudio.Services.Agent.HostTraceListener.TraceEvent(TraceEventCache eventCache, String source, TraceEventType eventType, Int32 id, String message) in /Users/runner/work/1/s/src/Microsoft.VisualStudio.Services.Agent/HostTraceListener.cs:line 81
   at System.Diagnostics.TraceSource.TraceEvent(TraceEventType eventType, Int32 id, String message)
   at Microsoft.VisualStudio.Services.Agent.Tracing.Error(Exception exception) in /Users/runner/work/1/s/src/Microsoft.VisualStudio.Services.Agent/Tracing.cs:line 56
   at Microsoft.VisualStudio.Services.Agent.Worker.Program.MainAsync(IHostContext context, String[] args) in /Users/runner/work/1/s/src/Agent.Worker/Program.cs:line 66
Unhandled exception. System.IO.IOException: No space left on device : '/Users/runner/runners/3.241.0/_diag/Worker_20240718-051117-utc.log'
   at System.IO.RandomAccess.WriteAtOffset(SafeFileHandle handle, ReadOnlySpan`1 buffer, Int64 fileOffset)
   at System.IO.Strategies.BufferedFileStreamStrategy.FlushWrite()
   at System.IO.StreamWriter.Flush(Boolean flushStream, Boolean flushEncoder)
   at System.Diagnostics.TextWriterTraceListener.Flush()
   at System.Diagnostics.TraceSource.Flush()
   at Microsoft.VisualStudio.Services.Agent.TraceManager.Dispose(Boolean disposing) in /Users/runner/work/1/s/src/Microsoft.VisualStudio.Services.Agent/TraceManager.cs:line 66
   at Microsoft.VisualStudio.Services.Agent.TraceManager.Dispose() in /Users/runner/work/1/s/src/Microsoft.VisualStudio.Services.Agent/TraceManager.cs:line 58
   at Microsoft.VisualStudio.Services.Agent.HostContext.Dispose(Boolean disposing) in /Users/runner/work/1/s/src/Microsoft.VisualStudio.Services.Agent/HostContext.cs:line 589
   at Microsoft.VisualStudio.Services.Agent.HostContext.Dispose() in /Users/runner/work/1/s/src/Microsoft.VisualStudio.Services.Agent/HostContext.cs:line 531
   at Microsoft.VisualStudio.Services.Agent.Worker.Program.Main(String[] args) in /Users/runner/work/1/s/src/Agent.Worker/Program.cs:line 24
##[error]We stopped hearing from agent Azure Pipelines 13. Verify the agent machine is running and has a healthy network connection. Anything that terminates an agent process, starves it for CPU, or blocks its network access can cause this error. For more information, see: https://go.microsoft.com/fwlink/?linkid=846610
Agent: Azure Pipelines 13
Started: Wed at 10:11 PM
Duration: 1h 54m 35s
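One possible mitigation, sketched here as a guess rather than a confirmed fix: a pre- or post-job cleanup step that clears simulator caches and prunes old agent diagnostic logs, so a filling disk cannot take down the worker's own `_diag` log writes the way it did above. The cache paths and the one-day retention window are assumptions.

```shell
# Hedged cleanup sketch for a macOS agent; all paths are assumptions.
set -u
for d in "$HOME/Library/Developer/CoreSimulator/Caches" \
         "$HOME/Library/Developer/Xcode/DerivedData"; do
  [ -d "$d" ] && rm -rf "$d"/* 2>/dev/null
done

# Prune old Worker_*.log diagnostics (path pattern taken from the trace
# above; the +1 day retention is an arbitrary choice).
find "$HOME/runners" -name 'Worker_*.log' -mtime +1 -delete 2>/dev/null || true

# Report what's left so the build log records the post-cleanup state.
df -h /
```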
dotnet-policy-service[bot] commented 2 months ago

Tagging subscribers to this area: @dotnet/runtime-infrastructure. See info in area-owners.md if you want to be subscribed.

ivanpovazan commented 2 months ago

/cc: @matouskozak

matouskozak commented 2 months ago

I took a look at the current state and I'm not seeing this failure anywhere in the last 3 rolling builds.

I see a dip in executed tests between July 18-19, which could be related to the Azure outage. @richlander, do you happen to have a link to the build with this error so I can take a look at what happened?

richlander commented 1 month ago

https://github.com/dotnet/runtime/pull/105063

https://dev.azure.com/dnceng-public/public/_build/results?buildId=745877&view=logs&jobId=cd385208-11d3-5d70-fa38-1c21a65d17f1

matouskozak commented 1 month ago

https://github.com/dotnet/runtime/pull/105063

https://dev.azure.com/dnceng-public/public/_build/results?buildId=745877&view=logs&jobId=cd385208-11d3-5d70-fa38-1c21a65d17f1

Ah, sorry, I was looking at the runtime pipeline, not runtime-staging. This looks like a recurring issue on that pipeline. The failures started occurring somewhere in this range of commits: https://github.com/dotnet/runtime/compare/fe40750d...4b5fdbce. I wonder if it could be related to the pipeline changes introduced in https://github.com/dotnet/runtime/commit/ac83237808c117098e5f79ad4ef36ea07fdbb42a.
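The narrowing step above can be expressed with git's two-dot range syntax, which lists exactly the commits that landed between the last known-good and first-bad builds; in a dotnet/runtime clone that would be `git log --oneline fe40750d..4b5fdbce`. The sketch below demonstrates the same mechanic on a throwaway repo so it is self-contained.

```shell
# Demonstrate the commit-window listing on a scratch repo (requires git).
set -e
tmp=$(mktemp -d)
cd "$tmp"
git init -q .
git -c user.email=ci@example.com -c user.name=ci \
  commit -q --allow-empty -m "good build"
git tag good
git -c user.email=ci@example.com -c user.name=ci \
  commit -q --allow-empty -m "pipeline change"
git -c user.email=ci@example.com -c user.name=ci \
  commit -q --allow-empty -m "first bad build"
git tag bad

# Prints only the two commits that landed after `good`, i.e. the suspects.
git log --oneline good..bad
```

Each commit in the window is then a bisection candidate; a suspected commit (like the pipeline change above) is implicated if it falls inside the range.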

richlander commented 1 month ago

Sorry for not including the branch name in the title. Will definitely do that next time.