
Updated readme.md
JPenuchot committed Apr 5, 2022
1 parent 6066d53 commit bfeda57
Showing 1 changed file (readme.md) with 27 additions and 23 deletions.
```cmake
FetchContent_Declare(
  # ...
)
FetchContent_MakeAvailable(ctbench)
```

or through `find_package(ctbench REQUIRED)` after installing it.
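As a sketch (assuming `ctbench` was installed to a prefix CMake already searches, and using the helper functions named later in this readme), consuming an installed copy might look like:

```cmake
# Sketch: consuming an installed ctbench. Assumes the install prefix is
# visible to CMake (e.g. via CMAKE_PREFIX_PATH).
find_package(ctbench REQUIRED)

# After this point the ctbench helper functions are available,
# e.g. ctbench_add_benchmark(...) and ctbench_add_graph(...).
```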

The [Rule of Cheese](https://github.com/JPenuchot/rule-of-cheese) project can be
used as an example of how to use `ctbench`. It is the project that gave birth to
this tool and is maintained as `ctbench` evolves.
```cmake
ctbench_add_benchmark(function_selection.requires # Benchmark case name
  # ...
)
```
### Declaring a graph target

Once you have several benchmark cases, you can start writing a graph config.
Example configs can be found [here](./grapher/configs/), or by running
`ctbench-grapher-utils --plotter=<plotter> --command=get-default-config`.
Here's an example as a starting point:

```json
{
  "plotter": "compare",
  "draw_average": true,
  "draw_points": false,
  "plot_file_extension": ".svg",
  "value_json_pointer": "/dur",
  "width": 1500,
  "height": 500,
  "legend_title": "Timings",
  "xlabel": "Benchmark size factor",
  "ylabel": "Time (µs)",
  "group_descriptors": [
    {
      "name": "ExecuteCompiler",
      "predicates": [
        {
          "type": "streq",
          "pointer": "/name",
          "string": "Total ExecuteCompiler"
        }
      ]
    },
    {
      "name": "Frontend",
      "predicates": [
        {
          "type": "streq",
          "pointer": "/name",
          "string": "Total Frontend"
        }
      ]
    },
    {
      "name": "Backend",
      "predicates": [
        {
          "type": "streq",
          "pointer": "/name",
          "string": "Total Backend"
        }
      ]
    }
  ]
}
```
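To illustrate what a group descriptor does, here is a minimal Python sketch of how a `streq` predicate could match a profiling event: the value at the predicate's JSON pointer must equal its `string` exactly. This is an illustration of the semantics only, not ctbench's actual implementation; `resolve_pointer` and `matches` are hypothetical helpers.

```python
# Illustrative sketch only: how a "streq" group predicate from the config
# above could match a time-trace event. Not ctbench's actual code.

def resolve_pointer(obj, pointer):
    """Minimal JSON pointer lookup (handles simple /key paths only)."""
    for token in pointer.lstrip("/").split("/"):
        obj = obj[token]
    return obj

def matches(event, predicate):
    # "streq": the value at `pointer` must equal `string` exactly.
    if predicate["type"] == "streq":
        return resolve_pointer(event, predicate["pointer"]) == predicate["string"]
    return False

descriptor = {
    "name": "Frontend",
    "predicates": [
        {"type": "streq", "pointer": "/name", "string": "Total Frontend"}
    ],
}

event = {"name": "Total Frontend", "dur": 1234}
print(all(matches(event, p) for p in descriptor["predicates"]))  # → True
```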

```cmake
ctbench_add_graph(function_selection-feature_comparison-graph # Target name
  # ...
)
```

For each group descriptor, a graph will be generated with one curve
per benchmark case. In this case, you would then get 3 graphs
(`ExecuteCompiler`, `Frontend`, and `Backend`), each with 5 curves (`enable_if`,
`enable_if_t`, `if_constexpr`, `control`, and `requires`).

## Additional

