Commit 5f6ea2f

CoreFX workflow docs (#314)

* CoreFX workflow docs

1 parent cd22be9 commit 5f6ea2f

File tree

3 files changed: +268 −4 lines changed
README.md

Lines changed: 1 addition & 0 deletions
@@ -10,6 +10,7 @@ Finding these benchmarks in a separate repository might be surprising. Performan

 * [Microbenchmarks Guide](./src/benchmarks/micro/README.md) for information on running our microbenchmarks
 * [Real-World Scenarios Guide](./src/benchmarks/real-world/JitBench/README.md) for information on running our real-world scenario benchmarks
+* [Benchmarking workflow for CoreFX](./docs/benchmarking-workflow-corefx.md) for information on working with CoreFX

 ## Contributing to Repository

docs/benchmarking-workflow-corefx.md

Lines changed: 263 additions & 0 deletions
# Benchmarking workflow for CoreFX

## Table of Contents

- [Introduction](#Introduction)
  - [Code Organization](#Code-Organization)
  - [CoreFX Prerequisites](#CoreFX-Prerequisites)
- [Preventing Regressions](#Preventing-Regressions)
- [Solving Regressions](#Solving-Regressions)
  - [Repro Case](#Repro-Case)
  - [Profiling](#Profiling)
  - [Running against Older Versions](#Running-against-Older-Versions)
- [Local CoreCLR Build](#Local-CoreCLR-Build)
- [Benchmarking new API](#Benchmarking-new-API)
  - [Reference](#Reference)
  - [PR](#PR)

## Introduction

This repository is **independent of the CoreFX build system.** All you need to get the benchmarks running is to download the dotnet cli or use the python script. Please see [Prerequisites](./prerequisites.md) for more.

If you are not familiar with BenchmarkDotNet or this repository, you should read the [Microbenchmarks Guide](../src/benchmarks/micro/README.md) first. It's short and concise, and we strongly encourage you to read it.

### Code Organization

All benchmarks ported from the CoreFX repository live in the corresponding `corefx\$namespace` folders. The directory structure is the following (some folders have been omitted for brevity):

```log
PS C:\Projects\performance\src\benchmarks\micro> tree
├───corefx
│   ├───System
│   ├───System.Collections
│   ├───System.ComponentModel.TypeConverter
│   ├───System.Console
│   ├───System.Diagnostics
│   ├───System.Globalization
│   ├───System.IO.Compression
│   ├───System.IO.FileSystem
│   ├───System.IO.MemoryMappedFiles
│   ├───System.IO.Pipes
│   ├───System.Linq
│   ├───System.Memory
│   ├───System.Net.Http
│   ├───System.Net.Primitives
│   ├───System.Net.Sockets
│   ├───System.Numerics.Vectors
│   ├───System.Runtime
│   ├───System.Runtime.Extensions
│   ├───System.Runtime.Numerics
│   ├───System.Runtime.Serialization.Formatters
│   ├───System.Security.Cryptography
│   ├───System.Security.Cryptography.Primitives
│   ├───System.Text.Encoding
│   ├───System.Text.RegularExpressions
│   ├───System.Threading
│   ├───System.Threading.Channels
│   ├───System.Threading.Tasks
│   ├───System.Threading.Tasks.Extensions
│   ├───System.Threading.ThreadPool
│   ├───System.Threading.Timers
│   └───System.Xml.XmlDocument
```

During the port from xunit-performance to BenchmarkDotNet, the namespaces, type names, and method names were not changed. The exceptions to this rule are the `System.Collections` ([#92](https://github.com/dotnet/performance/pull/92)) and `Span<T>` ([#94](https://github.com/dotnet/performance/pull/94)) benchmarks, which were rewritten to utilize the full capabilities of BenchmarkDotNet.

Please remember that you can filter the benchmarks using a glob pattern applied to `namespace.typeName.methodName` ([read more](./benchmarkdotnet.md#Filtering-the-Benchmarks)):

```cmd
dotnet run -f netcoreapp3.0 --filter System.Memory*
```

Moreover, every CoreFX benchmark belongs to a [CoreFX category](../src/benchmarks/micro/README.md#Categories).
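Categories can be used for filtering in addition to name globs; a minimal sketch, assuming BenchmarkDotNet's `--anyCategories` console switch:

```cmd
dotnet run -f netcoreapp3.0 --anyCategories CoreFX
```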
### CoreFX Prerequisites

In order to run the benchmarks against a local CoreFX build, you need to build the CoreFX repository in **Release**:

```cmd
C:\Projects\corefx> build -Release
```

**The most important build artifact for us is CoreRun**. CoreRun is a simple host that does NOT take any dependency on NuGet. BenchmarkDotNet generates some boilerplate code, builds it using dotnet cli, and tells CoreRun.exe to run the benchmarks from the auto-generated library. CoreRun runs the benchmarks using the libraries placed in its folder. When the benchmarked code has a dependency on `System.ABC.dll` version 4.5 and CoreRun has `System.ABC.dll` version 4.5.1 in its folder, CoreRun is going to load and use `System.ABC.dll` version 4.5.1. **This means that with a single clone of this dotnet/performance repository you can run benchmarks against private builds of CoreCLR/FX from many different locations.**

Every time you want to run the benchmarks against a local build of CoreFX, you need to provide the path to CoreRun:

```cmd
dotnet run -f netcoreapp3.0 --coreRun "C:\Projects\corefx\artifacts\bin\runtime\netcoreapp-Windows_NT-Release-x64\CoreRun.exe" --filter $someFilter
```

**Note:** BenchmarkDotNet expects a path to the `CoreRun.exe` file (`corerun` on Unix), not to the `Core_Root` folder.

Once you rebuild the part of CoreFX you are working on, the appropriate `.dll` gets updated, and the next time you run the benchmarks, CoreRun is going to load the updated library:

```cmd
C:\Projects\corefx\src\System.Text.RegularExpressions\src> dotnet msbuild /p:ConfigurationGroup=Release
```

## Preventing Regressions

Preventing regressions is a fundamental part of our performance culture. The cheapest regression is one that does not get into the product.

**Before introducing any changes that may impact performance**, you should run the benchmarks that test the performance of the feature you are going to work on, and store the results in a **dedicated** folder.

```cmd
C:\Projects\performance\src\benchmarks\micro> dotnet run -f netcoreapp3.0 \
    --artifacts "C:\results\before" \
    --coreRun "C:\Projects\corefx\artifacts\bin\runtime\netcoreapp-Windows_NT-Release-x64\CoreRun.exe" \
    --filter System.IO.Pipes*
```

Please try to **avoid running any resource-heavy processes** that could **spoil** the results while the benchmarks are running.

You can also create a **copy** of the folder with CoreRun and all the libraries, so that you can run the benchmarks against the **unmodified base** in the future.
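Copying the whole runtime folder is enough for such a snapshot; a minimal sketch, assuming Windows and `robocopy` (the destination path is just an example):

```cmd
robocopy "C:\Projects\corefx\artifacts\bin\runtime\netcoreapp-Windows_NT-Release-x64" "C:\results\corerun-base" /E
```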
After you introduce the changes and rebuild the part of CoreFX you are working on **in Release**, you should re-run the benchmarks. Remember to store the results in a different folder.

```cmd
C:\Projects\corefx\src\System.IO.Pipes\src> dotnet msbuild /p:ConfigurationGroup=Release

C:\Projects\performance\src\benchmarks\micro> dotnet run -f netcoreapp3.0 \
    --artifacts "C:\results\after" \
    --coreRun "C:\Projects\corefx\artifacts\bin\runtime\netcoreapp-Windows_NT-Release-x64\CoreRun.exe" \
    --filter System.IO.Pipes*
```

When you have both sets of results, you should use [ResultsComparer](../src/tools/ResultsComparer/README.md) to find out how your changes have affected the performance:

```cmd
C:\Projects\performance\src\tools\ResultsComparer> dotnet run --base "C:\results\before" --diff "C:\results\after" --threshold 2%
```

Sample output:

```log
No Slower results for the provided threshold = 2% and noise filter = 0.3ns.
```

| Faster                                                                           | base/diff | Base Median (ns) | Diff Median (ns) | Modality |
| -------------------------------------------------------------------------------- | ---------:| ----------------:| ----------------:| -------- |
| System.IO.Pipes.Tests.Perf_NamedPipeStream_ServerIn_ClientOut.ReadWrite(size: 10 |      1.16 |        297167.47 |        255575.49 |          |

### Running against the latest .NET Core SDK

To run the benchmarks against the latest .NET Core SDK, you can use the [benchmarks_ci.py](../scripts/benchmarks_ci.py) script. It downloads the latest .NET Core SDK(s) for the provided framework(s) and runs the benchmarks for you. Please see [Prerequisites](./prerequisites.md#python) for more.

```cmd
C:\Projects\performance> py scripts\benchmarks_ci.py -f netcoreapp3.0 \
    --bdn-arguments="--artifacts "C:\results\latest_sdk"" \
    --filter System.IO.Pipes*
```

## Solving Regressions

### Repro Case

Once a regression is spotted, the first thing you need to do is create a benchmark that shows the problem. Typically, every performance bug report comes with a small repro case, which is a perfect candidate for a benchmark (it might require some cleanup).

The next step is to send a PR to this repository with the aforementioned benchmark. Our automation is going to run the benchmark and export the results to our reporting system. When your fix to CoreFX gets merged, our reports will show the difference. It also helps us keep track of old performance bugs and make sure that they never come back.
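Such a repro benchmark is typically just a public method marked with `[Benchmark]`; a minimal sketch (the type name and the regex scenario below are hypothetical, not an actual regression):

```cs
using BenchmarkDotNet.Attributes;
using System.Text.RegularExpressions;

public class Perf_Regex_Repro // hypothetical benchmark type
{
    private Regex _regex;

    [GlobalSetup]
    public void Setup() => _regex = new Regex("[a-z]+[0-9]", RegexOptions.Compiled);

    [Benchmark]
    public bool IsMatch() => _regex.IsMatch("benchmarking123"); // the code path that regressed
}
```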
### Profiling

The real performance investigation starts with profiling. To profile the benchmarked code and produce an ETW trace file ([read more](./benchmarkdotnet.md#Profiling)):

```cmd
dotnet run -f netcoreapp3.0 --profiler ETW --filter $YourFilter
```

The benchmarking tool is going to print the path to the `.etl` trace file. You should open it with PerfView or Windows Performance Analyzer and start the analysis from there. If you are not familiar with PerfView, you should watch the [PerfView Tutorial](https://channel9.msdn.com/Series/PerfView-Tutorial) by @vancem first. It's an investment that pays off very quickly.

```log
// * Diagnostic Output - EtwProfiler *
Exported 1 trace file(s). Example:
C:\Projects\performance\artifacts\20190215-0303-51368\Benchstone\BenchF\Adams\Test.etl
```

If profiling with `--profiler ETW` is not enough, you should use a different profiler. When attaching to a process, please keep in mind that what you run in the console is the Host process, while the actual benchmarking is performed in dedicated processes. If you want to disable this behavior, you should use the [InProcessToolchain](./benchmarkdotnet.md#Running-In-Process).

### Running against Older Versions

BenchmarkDotNet has some extra features that might be useful when doing a performance investigation:

- You can run the benchmarks against [multiple Runtimes](./benchmarkdotnet.md#Multiple-Runtimes). This can be very useful when the regression has been introduced between .NET Core releases, for example between netcoreapp2.2 and netcoreapp3.0.
- You can run the benchmarks using a provided [dotnet cli](./benchmarkdotnet.md#dotnet-cli). You can download a few dotnet SDKs, unzip them, and just run the benchmarks to spot the version that introduced the regression and narrow down your investigation.
- You can run the benchmarks using a few [CoreRuns](./benchmarkdotnet.md#CoreRun). You can build the latest CoreFX in Release, create a copy of the folder with CoreRun, and use git to check out an older commit. Then rebuild CoreFX and run the benchmarks against the old and new builds. This can narrow down your investigation to the commit that introduced the bug.
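The CoreRun-based workflow above can be sketched as follows (the `C:\corerun\...` folder layout and `<olderCommit>` are hypothetical placeholders):

```cmd
C:\Projects\corefx> build -Release
C:\Projects\corefx> robocopy artifacts\bin\runtime\netcoreapp-Windows_NT-Release-x64 C:\corerun\new /E
C:\Projects\corefx> git checkout <olderCommit>
C:\Projects\corefx> build -Release
C:\Projects\corefx> robocopy artifacts\bin\runtime\netcoreapp-Windows_NT-Release-x64 C:\corerun\old /E

C:\Projects\performance\src\benchmarks\micro> dotnet run -f netcoreapp3.0 --filter $YourFilter --coreRun C:\corerun\old\CoreRun.exe C:\corerun\new\CoreRun.exe
```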
### Confirmation

When you identify and fix the regression, you should use [ResultsComparer](../src/tools/ResultsComparer/README.md) to confirm that you have solved the problem. Please remember that if the regression was found in a very common type like `Span<T>` and you are not sure which benchmarks to run, you can run all of them using `--filter *`.

Please take a moment to consider how the regression managed to enter the product. Are we now properly protected?

## Local CoreCLR Build

Sometimes you might need to run the benchmarks using not only a local CoreFX build but also a local CoreCLR build.

Thanks to the simplicity of CoreRun, all you need to do is copy all the runtime files into the folder with CoreRun.

But before you do it, you should note the version of CoreCLR reported by BenchmarkDotNet:

```log
BenchmarkDotNet=v0.11.3.1003-nightly, OS=Windows 10.0.17763.253 (1809/October2018Update/Redstone5)
Intel Xeon CPU E5-1650 v4 3.60GHz, 1 CPU, 12 logical and 6 physical cores
.NET Core SDK=3.0.100-preview-009812
  [Host]     : .NET Core 3.0.0-preview-27122-01 (CoreCLR 4.6.27121.03, CoreFX 4.7.18.57103), 64bit RyuJIT
  Job-SBBRNM : .NET Core ? (CoreCLR 4.6.27416.73, CoreFX 4.7.19.11901), 64bit RyuJIT
```

As you can see, before the update the CoreCLR version for the Job (not the Host process) was `4.6.27416.73`.

The CoreFX build system has built-in support for that, exposed by the [CoreCLROverridePath](https://github.com/dotnet/corefx/blob/0e7236fda21a07302b14030c82f79bb981c723a6/Documentation/project-docs/developer-guide.md#testing-with-private-coreclr-bits) build parameter:

```cmd
C:\Projects\corefx> build -Release /p:CoreCLROverridePath="C:\Projects\coreclr\bin\Product\Windows_NT.x64.Release"
```

After the update, the reported CoreCLR version should change:

```log
  [Host]     : .NET Core 3.0.0-preview-27122-01 (CoreCLR 4.6.27121.03, CoreFX 4.7.18.57103), 64bit RyuJIT
  Job-CYVJFZ : .NET Core ? (CoreCLR 4.6.27517.0, CoreFX 4.7.19.11901), 64bit RyuJIT
```

**It's very important to validate the reported version numbers** to make sure that you are running the benchmarks using the versions you think you are using.

## Benchmarking new API

When developing new CoreFX features, we should be thinking about performance from day one. One part of doing this is writing benchmarks at the same time as we write our first unit tests. Keeping the benchmarks in a separate repository makes it a little bit harder to run benchmarks against a new CoreFX API, but it's still very easy.

### Reference

When you develop a new feature, whether it's a new method, type, or library in CoreFX, all you need to do is build it in Release and reference the produced implementation `.dll` from the [MicroBenchmarks.csproj](../src/benchmarks/micro/MicroBenchmarks.csproj) project file.

The easiest way to do that is to open [MicroBenchmarks.sln](../src/benchmarks/micro/MicroBenchmarks.sln) with Visual Studio, right-click the [MicroBenchmarks](../src/benchmarks/micro/MicroBenchmarks.csproj) project, select "Add" and then "Reference...", and in the new dialog window click "Browse" in the bottom-left corner. From the file picker, choose the new library and click "Add". Please don't forget to save the changes (Ctrl+Shift+S). From this moment on, you should be able to consume the new public types and methods exposed by the referenced library.

Sample changes:

```cs
namespace System
{
    public static class Console
    {
        public static void WriteHelloWorld() => WriteLine("Hello World!");

        // the rest omitted for brevity
    }
}
```

Sample project file change:

```xml
<ItemGroup>
  <Reference Include="System.NewAPI">
    <HintPath>..\..\..\..\corefx\artifacts\bin\runtime\netcoreapp-Windows_NT-Release-x64\System.Console.dll</HintPath>
  </Reference>
</ItemGroup>
```

### PR

Because the benchmarks are not in the CoreFX repository, you must send two PRs.

The first thing you need to do is send a PR with the new API to the CoreFX repository. Once your PR gets merged and a new NuGet package is published to the CoreFX NuGet feed, you should remove the reference to the `.dll` and install/update the package consumed by [MicroBenchmarks](../src/benchmarks/micro/MicroBenchmarks.csproj). After that, you should send a PR to the performance repository.
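The switch from a direct assembly reference to a package reference can be sketched like this (the package name and version are hypothetical):

```xml
<ItemGroup>
  <PackageReference Include="System.NewAPI" Version="4.6.0-preview.19073.11" />
</ItemGroup>
```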

src/benchmarks/micro/README.md

Lines changed: 4 additions & 4 deletions
@@ -10,7 +10,7 @@ We use BenchmarkDotNet as the benchmarking tool, you can read more about it in [

 ## Quick Start

-The first thing that you need to choose is the Target Framework. Available options are: netcoreapp2.0|2.1|2.2|3.0 and net461. You can specify the target framework using `-f|--framework` argument. For the sake of simplicity, all examples below use `netcoreapp3.0` as the target framework.
+The first thing that you need to choose is the Target Framework. Available options are: `netcoreapp2.0|netcoreapp2.1|netcoreapp2.2|netcoreapp3.0|net461`. You can specify the target framework using `-f|--framework` argument. For the sake of simplicity, all examples below use `netcoreapp3.0` as the target framework.

 To run the benchmarks in Interactive Mode, where you will be asked which benchmark(s) to run:

@@ -44,7 +44,7 @@ dotnet run -f netcoreapp2.1 --filter * --runtimes netcoreapp2.1 netcoreapp3.0 co

 ## Private Runtime Builds

-If you contribute to CoreFX/CoreCLR and want to benchmark **local builds of the .NET Core** you need to build CoreCLR/FX in Release (including tests) and then provide the path(s) to CoreRun(s). Provided CoreRun(s) will be used to execute every benchmark in a dedicated process.
+If you contribute to CoreFX/CoreCLR and want to benchmark **local builds of .NET Core** you need to build CoreFX/CoreCLR in Release (including tests) and then provide the path(s) to CoreRun(s). Provided CoreRun(s) will be used to execute every benchmark in a dedicated process:

 ```cmd
 dotnet run -f netcoreapp3.0 --filter $YourFilter --coreRun "C:\Projects\coreclr\bin\tests\Windows_NT.x64.Release\Tests\Core_Root\CoreRun.exe"

@@ -61,7 +61,7 @@ dotnet run -f netcoreapp3.0 \
     "C:\Projects\corefx_fork\bin\runtime\netcoreapp-Windows_NT-Release-x64\CoreRun.exe"
 ```

-If you **prefer to use dotnet cli** for that, you need to pass the path to cli via the `--cli` argument.
+If you **prefer to use dotnet cli** instead of CoreRun, you need to pass the path to cli via the `--cli` argument.

 BenchmarkDotNet allows you to run the benchmarks for private builds of [Full .NET Framework](../../../docs/benchmarkdotnet.md#Private-CLR-Build) and [CoreRT](../../../docs/benchmarkdotnet.md#Private-CoreRT-Build)

@@ -86,7 +86,7 @@ public class SomeType

 ### Enabling given benchmark(s) for selected Operating System(s)

-This is possible with the `AllowedOperatingSystemsAttribute`. You need to provide a mandatory comment and OS(es) which benchmark(s) can run on.
+This is possible with the `AllowedOperatingSystemsAttribute`. You need to provide a mandatory comment and OS(es) that benchmark(s) can run on.

 ```cs
 [AllowedOperatingSystems("Hangs on non-Windows, dotnet/corefx#18290", OS.Windows)]
