ordo-one / package-benchmark

Swift benchmark runner with many performance metrics and great CI support
Apache License 2.0

Sample benchmark from Readme doesn't compile #295

Closed valeriyvan closed 1 week ago

valeriyvan commented 1 month ago
swift package -c release benchmark                        
Building for production...
[12/12] Linking BenchmarkBoilerplateGenerator-tool
Build of product 'BenchmarkBoilerplateGenerator' complete! (10.41s)
Building for production...
[1/1] Write swift-version--58304C5D6DBC2206.txt
Build of product 'BenchmarkTool' complete! (10.68s)
Build complete!
Building BenchmarkTool in release mode...
Building benchmark targets in release mode for benchmark run...
Building My-Benchmark
Building for production...
[0/2] Write swift-version--58304C5D6DBC2206.txt
Build of product 'BenchmarkBoilerplateGenerator' complete! (0.19s)
Building for production...
[0/5] Write swift-version--58304C5D6DBC2206.txt
[1/5] Generating plugin support files
[2/5] Write sources
[4/6] Compiling My_Benchmark benchmark.swift
/******/Benchmarks/My-Benchmark/benchmark.swift:21:1: error: expected ')' in expression list
 6 |     }
 7 | 
 8 |     Benchmark("All metrics, full concurrency, async",
   |              `- note: to match this opening '('
 9 |               configuration: .init(metrics: BenchmarkMetric.all,
10 |                                    maxDuration: .seconds(10)) { benchmark in
   :
19 |         })
20 |     }
21 | }
   | `- error: expected ')' in expression list
22 | 

/******/Benchmarks/My-Benchmark/benchmark.swift:14:21: error: cannot find 'dummyCounter' in scope
12 |             for _ in 0..<80  {
13 |                 taskGroup.addTask {
14 |                     dummyCounter(defaultCounter()*1000)
   |                     `- error: cannot find 'dummyCounter' in scope
15 |                 }
16 |             }

/******/Benchmarks/My-Benchmark/benchmark.swift:14:34: error: cannot find 'defaultCounter' in scope
12 |             for _ in 0..<80  {
13 |                 taskGroup.addTask {
14 |                     dummyCounter(defaultCounter()*1000)
   |                                  `- error: cannot find 'defaultCounter' in scope
15 |                 }
16 |             }

/******/Benchmarks/My-Benchmark/benchmark.swift:21:1: error: missing argument for parameter 'closure' in call
19 |         })
20 |     }
21 | }
   | `- error: missing argument for parameter 'closure' in call
22 | 

/******/.build/checkouts/package-benchmark/Sources/Benchmark/Benchmark.swift:184:12: note: 'init(_:configuration:closure:setup:teardown:)' declared here
182 |     ///   - closure: The actual benchmark closure that will be measured
183 |     @discardableResult
184 |     public init?(_ name: String,
    |            `- note: 'init(_:configuration:closure:setup:teardown:)' declared here
185 |                  configuration: Benchmark.Configuration = Benchmark.defaultConfiguration,
186 |                  closure: @escaping BenchmarkClosure,

/******/Benchmarks/My-Benchmark/benchmark.swift:10:63: warning: backward matching of the unlabeled trailing closure is deprecated; label the argument with 'teardown' to suppress this warning
 8 |     Benchmark("All metrics, full concurrency, async",
 9 |               configuration: .init(metrics: BenchmarkMetric.all,
10 |                                    maxDuration: .seconds(10)) { benchmark in
   |                                                               `- warning: backward matching of the unlabeled trailing closure is deprecated; label the argument with 'teardown' to suppress this warning
11 |         let _ = await withTaskGroup(of: Void.self, returning: Void.self, body: { taskGroup in
12 |             for _ in 0..<80  {

/******/.build/checkouts/package-benchmark/Sources/Benchmark/Benchmark.swift:412:16: note: 'init(metrics:tags:timeUnits:warmupIterations:scalingFactor:maxDuration:maxIterations:skip:thresholds:setup:teardown:)' declared here
410 |         public var teardown: BenchmarkTeardownHook?
411 | 
412 |         public init(metrics: [BenchmarkMetric] = defaultConfiguration.metrics,
    |                `- note: 'init(metrics:tags:timeUnits:warmupIterations:scalingFactor:maxDuration:maxIterations:skip:thresholds:setup:teardown:)' declared here
413 |                     tags: [String: String] = defaultConfiguration.tags,
414 |                     timeUnits: BenchmarkTimeUnits = defaultConfiguration.timeUnits,

/******/Benchmarks/My-Benchmark/benchmark.swift:10:65: error: contextual closure type '() async throws -> Void' expects 0 arguments, but 1 was used in closure body
 8 |     Benchmark("All metrics, full concurrency, async",
 9 |               configuration: .init(metrics: BenchmarkMetric.all,
10 |                                    maxDuration: .seconds(10)) { benchmark in
   |                                                                 `- error: contextual closure type '() async throws -> Void' expects 0 arguments, but 1 was used in closure body
11 |         let _ = await withTaskGroup(of: Void.self, returning: Void.self, body: { taskGroup in
12 |             for _ in 0..<80  {

Benchmark failed to run due to build error.
hassila commented 2 weeks ago

Thanks for the report - @lehtihet do you think you could help out with this one please?

lehtihet commented 2 weeks ago

There are three problems in the sample benchmark code from the readme that cause this build failure:

1 and 2. The functions defaultCounter() and dummyCounter(_ count: Int) are defined in Benchmarks/Benchmarks/Basic/BenchmarkRunner+Basic.swift and are not accessible by simply importing the Benchmark module.

3. There is a missing right parenthesis right after the configuration initialisation in the benchmark: maxDuration: .seconds(10)) { benchmark in should be maxDuration: .seconds(10))) { benchmark in.

As a temporary workaround to get the sample benchmark to compile and run, add the missing parenthesis and redefine the two functions locally:

import Benchmark

let benchmarks = {
    Benchmark("Minimal benchmark") { benchmark in
      // measure something here
    }

    func defaultCounter() -> Int {
        10
    }

    func dummyCounter(_ count: Int) {
        for index in 0 ..< count {
            blackHole(index)
        }
    }

    Benchmark("All metrics, full concurrency, async",
              configuration: .init(metrics: BenchmarkMetric.all,
                                   maxDuration: .seconds(10))) { benchmark in
        let _ = await withTaskGroup(of: Void.self, returning: Void.self, body: { taskGroup in
            for _ in 0..<80  {
                taskGroup.addTask {
                    dummyCounter(defaultCounter()*1000)
                }
            }
            for await _ in taskGroup {
            }
        })
    }     
}

@hassila What do you think the best approach is to make dummyCounter and defaultCounter available without having to redefine them inside the benchmark? I tried marking them public in BenchmarkRunner+Basic.swift, but that gives the error Attribute 'public' can only be used in a non-local scope. Perhaps they should be moved under Sources?
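
One possible approach, only as a sketch (the BenchmarkSupport target name and file paths below are my own assumptions, not anything in the package today): move the two helpers into a small library target under Sources, mark them public there, and have the benchmark target depend on it.

// Sources/BenchmarkSupport/Counters.swift (hypothetical file)
import Benchmark  // for blackHole

public func defaultCounter() -> Int {
    10
}

public func dummyCounter(_ count: Int) {
    for index in 0 ..< count {
        blackHole(index)
    }
}

// Package.swift excerpt (sketch; target and path names are assumptions)
.target(
    name: "BenchmarkSupport",
    dependencies: [.product(name: "Benchmark", package: "package-benchmark")]
),
.executableTarget(
    name: "My-Benchmark",
    dependencies: [
        "BenchmarkSupport",
        .product(name: "Benchmark", package: "package-benchmark"),
    ],
    path: "Benchmarks/My-Benchmark",
    plugins: [.plugin(name: "BenchmarkPlugin", package: "package-benchmark")]
),

The benchmark file would then add import BenchmarkSupport alongside import Benchmark and could call the helpers exactly as the readme sample does, without redefining them.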