Diff link broken.
https://github.com/dotnet/runtime/issues/39122#issuecomment-656941297
This should help you track down the diff.
I don't see regressions here locally on Ubuntu 18.04. I see improvements.
```
dotnet run -c Release -f netcoreapp3.1 --runtimes netcoreapp3.1 netcoreapp5.0 --join --filter "System.Memory.Constructors<Byte>*"
```
Method | Job | Runtime | Toolchain | Mean | Error | StdDev | Median | Min | Max | Ratio |
---|---|---|---|---|---|---|---|---|---|---|
SpanFromArray | Job-GBBTHZ | .NET Core 3.1 | netcoreapp3.1 | 3.8188 ns | 0.0291 ns | 0.0258 ns | 3.8170 ns | 3.7838 ns | 3.8563 ns | 1.00 |
SpanFromArray | Job-XSJRDM | .NET Core 5.0 | netcoreapp5.0 | 0.2487 ns | 0.0092 ns | 0.0086 ns | 0.2487 ns | 0.2374 ns | 0.2688 ns | 0.07 |
ReadOnlySpanFromArray | Job-GBBTHZ | .NET Core 3.1 | netcoreapp3.1 | 3.8462 ns | 0.0181 ns | 0.0161 ns | 3.8421 ns | 3.8134 ns | 3.8706 ns | 1.00 |
ReadOnlySpanFromArray | Job-XSJRDM | .NET Core 5.0 | netcoreapp5.0 | 0.2399 ns | 0.0076 ns | 0.0068 ns | 0.2386 ns | 0.2311 ns | 0.2523 ns | 0.06 |
SpanFromArrayStartLength | Job-GBBTHZ | .NET Core 3.1 | netcoreapp3.1 | 4.0660 ns | 0.0164 ns | 0.0153 ns | 4.0635 ns | 4.0239 ns | 4.0848 ns | 1.00 |
SpanFromArrayStartLength | Job-XSJRDM | .NET Core 5.0 | netcoreapp5.0 | 0.2254 ns | 0.0139 ns | 0.0130 ns | 0.2239 ns | 0.2078 ns | 0.2520 ns | 0.06 |
ReadOnlySpanFromArrayStartLength | Job-GBBTHZ | .NET Core 3.1 | netcoreapp3.1 | 4.0591 ns | 0.0303 ns | 0.0283 ns | 4.0590 ns | 4.0063 ns | 4.1000 ns | 1.00 |
ReadOnlySpanFromArrayStartLength | Job-XSJRDM | .NET Core 5.0 | netcoreapp5.0 | 0.2327 ns | 0.0127 ns | 0.0119 ns | 0.2285 ns | 0.2137 ns | 0.2509 ns | 0.06 |
SpanFromMemory | Job-GBBTHZ | .NET Core 3.1 | netcoreapp3.1 | 4.7719 ns | 0.0496 ns | 0.0464 ns | 4.7791 ns | 4.6889 ns | 4.8430 ns | 1.00 |
SpanFromMemory | Job-XSJRDM | .NET Core 5.0 | netcoreapp5.0 | 1.1934 ns | 0.0112 ns | 0.0100 ns | 1.1942 ns | 1.1780 ns | 1.2151 ns | 0.25 |
ReadOnlySpanFromMemory | Job-GBBTHZ | .NET Core 3.1 | netcoreapp3.1 | 4.7748 ns | 0.0395 ns | 0.0329 ns | 4.7745 ns | 4.6984 ns | 4.8358 ns | 1.00 |
ReadOnlySpanFromMemory | Job-XSJRDM | .NET Core 5.0 | netcoreapp5.0 | 1.2188 ns | 0.0159 ns | 0.0133 ns | 1.2185 ns | 1.1832 ns | 1.2417 ns | 0.26 |
SpanImplicitCastFromArray | Job-GBBTHZ | .NET Core 3.1 | netcoreapp3.1 | 3.5946 ns | 0.0160 ns | 0.0142 ns | 3.5965 ns | 3.5658 ns | 3.6183 ns | 1.00 |
SpanImplicitCastFromArray | Job-XSJRDM | .NET Core 5.0 | netcoreapp5.0 | 0.2263 ns | 0.0070 ns | 0.0062 ns | 0.2279 ns | 0.2080 ns | 0.2335 ns | 0.06 |
ReadOnlySpanImplicitCastFromArray | Job-GBBTHZ | .NET Core 3.1 | netcoreapp3.1 | 3.5954 ns | 0.0088 ns | 0.0074 ns | 3.5924 ns | 3.5873 ns | 3.6139 ns | 1.00 |
ReadOnlySpanImplicitCastFromArray | Job-XSJRDM | .NET Core 5.0 | netcoreapp5.0 | 0.2489 ns | 0.0320 ns | 0.0315 ns | 0.2302 ns | 0.2216 ns | 0.3193 ns | 0.07 |
SpanImplicitCastFromArraySegment | Job-GBBTHZ | .NET Core 3.1 | netcoreapp3.1 | 5.2506 ns | 0.0059 ns | 0.0050 ns | 5.2513 ns | 5.2409 ns | 5.2591 ns | 1.00 |
SpanImplicitCastFromArraySegment | Job-XSJRDM | .NET Core 5.0 | netcoreapp5.0 | 1.0575 ns | 0.0214 ns | 0.0200 ns | 1.0560 ns | 1.0275 ns | 1.0904 ns | 0.20 |
ReadOnlySpanImplicitCastFromArraySegment | Job-GBBTHZ | .NET Core 3.1 | netcoreapp3.1 | 5.2690 ns | 0.0290 ns | 0.0271 ns | 5.2705 ns | 5.2280 ns | 5.3278 ns | 1.00 |
ReadOnlySpanImplicitCastFromArraySegment | Job-XSJRDM | .NET Core 5.0 | netcoreapp5.0 | 0.9625 ns | 0.0950 ns | 0.1056 ns | 0.9470 ns | 0.8578 ns | 1.1971 ns | 0.19 |
ReadOnlySpanImplicitCastFromSpan | Job-GBBTHZ | .NET Core 3.1 | netcoreapp3.1 | 3.1182 ns | 0.0116 ns | 0.0103 ns | 3.1182 ns | 3.1004 ns | 3.1399 ns | 1.000 |
ReadOnlySpanImplicitCastFromSpan | Job-XSJRDM | .NET Core 5.0 | netcoreapp5.0 | 0.0000 ns | 0.0000 ns | 0.0000 ns | 0.0000 ns | 0.0000 ns | 0.0000 ns | 0.000 |
MemoryFromArray | Job-GBBTHZ | .NET Core 3.1 | netcoreapp3.1 | 4.0924 ns | 0.0235 ns | 0.0219 ns | 4.1007 ns | 4.0513 ns | 4.1175 ns | 1.00 |
MemoryFromArray | Job-XSJRDM | .NET Core 5.0 | netcoreapp5.0 | 3.9561 ns | 0.0840 ns | 0.0744 ns | 3.9162 ns | 3.8607 ns | 4.1049 ns | 0.97 |
ReadOnlyMemoryFromArray | Job-GBBTHZ | .NET Core 3.1 | netcoreapp3.1 | 4.0800 ns | 0.0222 ns | 0.0207 ns | 4.0813 ns | 4.0273 ns | 4.1142 ns | 1.00 |
ReadOnlyMemoryFromArray | Job-XSJRDM | .NET Core 5.0 | netcoreapp5.0 | 3.9015 ns | 0.0275 ns | 0.0244 ns | 3.9075 ns | 3.8441 ns | 3.9283 ns | 0.96 |
MemoryFromArrayStartLength | Job-GBBTHZ | .NET Core 3.1 | netcoreapp3.1 | 4.0654 ns | 0.0327 ns | 0.0290 ns | 4.0641 ns | 4.0332 ns | 4.1338 ns | 1.00 |
MemoryFromArrayStartLength | Job-XSJRDM | .NET Core 5.0 | netcoreapp5.0 | 3.8936 ns | 0.0224 ns | 0.0198 ns | 3.8838 ns | 3.8758 ns | 3.9361 ns | 0.96 |
ReadOnlyMemoryFromArrayStartLength | Job-GBBTHZ | .NET Core 3.1 | netcoreapp3.1 | 4.0718 ns | 0.0255 ns | 0.0239 ns | 4.0746 ns | 4.0340 ns | 4.1118 ns | 1.00 |
ReadOnlyMemoryFromArrayStartLength | Job-XSJRDM | .NET Core 5.0 | netcoreapp5.0 | 3.8835 ns | 0.0208 ns | 0.0174 ns | 3.8797 ns | 3.8541 ns | 3.9260 ns | 0.95 |
ArrayAsSpan | Job-GBBTHZ | .NET Core 3.1 | netcoreapp3.1 | 3.5620 ns | 0.0247 ns | 0.0219 ns | 3.5602 ns | 3.5317 ns | 3.6053 ns | 1.00 |
ArrayAsSpan | Job-XSJRDM | .NET Core 5.0 | netcoreapp5.0 | 0.2368 ns | 0.0066 ns | 0.0055 ns | 0.2394 ns | 0.2253 ns | 0.2431 ns | 0.07 |
ArrayAsSpanStartLength | Job-GBBTHZ | .NET Core 3.1 | netcoreapp3.1 | 3.5514 ns | 0.0193 ns | 0.0161 ns | 3.5577 ns | 3.5285 ns | 3.5778 ns | 1.00 |
ArrayAsSpanStartLength | Job-XSJRDM | .NET Core 5.0 | netcoreapp5.0 | 0.2794 ns | 0.0326 ns | 0.0349 ns | 0.2700 ns | 0.2410 ns | 0.3666 ns | 0.08 |
ArrayAsMemory | Job-GBBTHZ | .NET Core 3.1 | netcoreapp3.1 | 3.8380 ns | 0.0249 ns | 0.0221 ns | 3.8379 ns | 3.7981 ns | 3.8722 ns | 1.00 |
ArrayAsMemory | Job-XSJRDM | .NET Core 5.0 | netcoreapp5.0 | 3.9017 ns | 0.0179 ns | 0.0167 ns | 3.9019 ns | 3.8722 ns | 3.9334 ns | 1.02 |
ArrayAsMemoryStartLength | Job-GBBTHZ | .NET Core 3.1 | netcoreapp3.1 | 3.8382 ns | 0.0343 ns | 0.0321 ns | 3.8232 ns | 3.8037 ns | 3.8832 ns | 1.00 |
ArrayAsMemoryStartLength | Job-XSJRDM | .NET Core 5.0 | netcoreapp5.0 | 3.8888 ns | 0.0281 ns | 0.0263 ns | 3.8948 ns | 3.8271 ns | 3.9316 ns | 1.01 |
MemoryMarshalCreateSpan | Job-GBBTHZ | .NET Core 3.1 | netcoreapp3.1 | 3.3144 ns | 0.0218 ns | 0.0204 ns | 3.3049 ns | 3.2891 ns | 3.3486 ns | 1.000 |
MemoryMarshalCreateSpan | Job-XSJRDM | .NET Core 5.0 | netcoreapp5.0 | 0.0010 ns | 0.0020 ns | 0.0018 ns | 0.0000 ns | 0.0000 ns | 0.0050 ns | 0.000 |
MemoryMarshalCreateReadOnlySpan | Job-GBBTHZ | .NET Core 3.1 | netcoreapp3.1 | 3.3293 ns | 0.0268 ns | 0.0251 ns | 3.3199 ns | 3.3004 ns | 3.3825 ns | 1.000 |
MemoryMarshalCreateReadOnlySpan | Job-XSJRDM | .NET Core 5.0 | netcoreapp5.0 | 0.0000 ns | 0.0000 ns | 0.0000 ns | 0.0000 ns | 0.0000 ns | 0.0000 ns | 0.000 |
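For anyone trying to reproduce this outside the lab: these benchmarks are thin wrappers around the constructors under test. A minimal sketch of their shape (the real suite lives in dotnet/performance; method names match the table above, while the array size and harness details here are illustrative):

```csharp
using System;
using BenchmarkDotNet.Attributes;

// Illustrative only: the actual System.Memory.Constructors<T> benchmarks
// live in dotnet/performance; each method measures one construction path.
public class Constructors
{
    private byte[] _array;
    private Memory<byte> _memory;

    [GlobalSetup]
    public void Setup()
    {
        _array = new byte[512];
        _memory = new Memory<byte>(_array);
    }

    [Benchmark]
    public Span<byte> SpanFromArray() => new Span<byte>(_array);

    [Benchmark]
    public Span<byte> SpanFromArrayStartLength() => new Span<byte>(_array, 0, _array.Length);

    [Benchmark]
    public Span<byte> SpanFromMemory() => _memory.Span;

    [Benchmark]
    public Span<byte> ArrayAsSpan() => _array.AsSpan();

    [Benchmark]
    public Memory<byte> MemoryFromArray() => new Memory<byte>(_array);
}
```

The sub-nanosecond .NET 5 numbers above are consistent with most of these constructors inlining down to a handful of instructions, which is also what makes them so sensitive to alignment.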
@DrewScoggins, a bunch of the issues you opened show the "regressions" happening at what appear to be two distinct moments in time, one in November and one in May. Is it possible what we're seeing is actually the result of something changing in the execution environment, e.g. a patch, a microcode update, etc.? Would it be possible for you to run the tests in the current environment against .NET Core 3.1 and the latest .NET 5 bits, and see whether things showing up as regressions actually show up as regressions in such a run?
@DrewScoggins I wonder if being able to get the raw codegen dump could also help. Basically, for each method, the raw x86 assembly and the memory address of each instruction. It might be a long shot, and I don't want to randomize the perf work too much, but might it make sense to get it for a small handful of these runs and see if it can help diagnose problems more quickly?
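One readily available option: BenchmarkDotNet's DisassemblyDiagnoser can attach the JIT-generated assembly for each benchmark to the results. A minimal sketch (the attribute exists in BenchmarkDotNet; whether the lab exporters can surface its output is an open question):

```csharp
using System;
using BenchmarkDotNet.Attributes;
using BenchmarkDotNet.Running;

// Attaches a disassembly listing (benchmark method plus callees) to the
// results, so codegen can be compared across runtimes side by side.
[DisassemblyDiagnoser]
public class SpanCtorDisasm
{
    private readonly byte[] _array = new byte[512];

    [Benchmark]
    public Span<byte> SpanFromArray() => new Span<byte>(_array);
}

public static class Program
{
    public static void Main() => BenchmarkRunner.Run<SpanCtorDisasm>();
}
```

Whether per-instruction addresses survive into the exported listing depends on the version, but it at least makes codegen diffs across runtimes cheap to collect.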
I am taking a look at why we are seeing this behavior on Ubuntu. The odd thing is that the numbers you are finding agree with the baseline values, and I see the same thing when I run on a repro machine identical to our lab hardware. Also, looking at the changes that went into the infra around the time the regression popped in May, I see nothing of interest. I am going to look at a more recent run out of the lab to see if we are still seeing these regressions. If so, I will have to do a deeper investigation.
Any update, @DrewScoggins?
Yeah, I dug into this a while back and forgot to close the loop here. I was able to reproduce this when I ran against the exact version of the product, 5.0.100-preview.7.20319.6, that we were measuring in the lab at the point we are comparing. Later runs have shown that regression going away, so it was likely an alignment issue. So I think the right action is to close the bug, but I don't think there is anything wrong with the lab; we just sometimes see these types of behaviors.
OK, thanks for investigating! After 5.0, perhaps it would be worth doing some thinking about how to reduce bimodality in our tests, e.g., by randomizing alignment somehow; I see numerous seemingly bimodal or jittery results in the full list.
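For what it's worth, BenchmarkDotNet does have a memory-randomization mode aimed at exactly this kind of alignment-driven bimodality. A minimal sketch of opting a job into it (assuming a BenchmarkDotNet version that exposes WithMemoryRandomization):

```csharp
using System;
using BenchmarkDotNet.Attributes;
using BenchmarkDotNet.Configs;
using BenchmarkDotNet.Jobs;
using BenchmarkDotNet.Running;

public class AlignmentProbe
{
    private readonly byte[] _array = new byte[512];

    [Benchmark]
    public Span<byte> SpanFromArray() => new Span<byte>(_array);
}

public static class Program
{
    public static void Main() =>
        BenchmarkRunner.Run<AlignmentProbe>(
            DefaultConfig.Instance.AddJob(
                // Re-allocates benchmark state with random padding between
                // iterations, so one lucky alignment can't dominate the stats.
                Job.Default.WithMemoryRandomization()));
}
```

Randomizing allocations between iterations trades a tight distribution for a wider but more honest one, which should make run-over-run lab comparisons less sensitive to where the GC happens to place things.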