More info and documentation here.
Detection Metrics is a set of tools to evaluate object detection neural network models over common object detection datasets. The tools can be accessed through the GUI or the command line applications. The picture below shows the general architecture.
The tools provided are:
- Viewer: view the dataset images with the annotations.
- Detector: run a model over a dataset and generate a new annotated dataset.
- Evaluator: compare a detected dataset against the ground truth dataset and compute the comparison metrics.
- Deployer: run a model over different inputs like a video or webcam and generate a new annotated dataset.
- Converter: convert a dataset into another dataset format.
- Command line application (CLI): access the Detection Metrics toolset through the command line (see the sketch after this list).
- Detection Metrics as ROS Node: use Detection Metrics as a ROS Node.
- Labelling: add or modify labels in the datasets in runtime when running Deployer.
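As a rough sketch of how these tools are typically driven (the binary name, flag and file name below are assumptions for illustration, not something stated on this page), both the GUI and the command line front end are pointed at a configuration file that says where datasets, network weights and class-name files live:

```sh
# Illustrative sketch only: the binary name and the -c flag are assumptions;
# check the linked documentation for the exact invocation.
# Launch the toolset with a configuration file, then pick Viewer, Detector,
# Evaluator, Deployer or Converter from the top toolbar (or the equivalent
# CLI option).
./DatasetEvaluationApp -c appConfig.yml
```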
The idea is to offer a generic infrastructure to evaluate object detection models against a dataset and compute the common statistics (briefly defined after the list):
- mAP (mean Average Precision)
- mAR (mean Average Recall)
- Mean inference time
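For reference, these are the standard definitions (the exact IoU thresholds and recall sampling depend on the dataset convention, e.g. Pascal VOC vs. COCO): mAP averages the per-class average precision, where each class's AP is the area under its precision-recall curve, and mAR averages the per-class average recall.

$$
\mathrm{mAP} = \frac{1}{|C|}\sum_{c \in C}\mathrm{AP}_c,
\qquad
\mathrm{AP}_c = \int_0^1 p_c(r)\,dr,
\qquad
\mathrm{mAR} = \frac{1}{|C|}\sum_{c \in C}\mathrm{AR}_c
$$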
Support | Detail |
---|---|
Supported OS | Multiplatform using Docker |
Supported datasets | |
Supported frameworks | |
Supported inputs in Deployer | |
To quickly get started with Detection Metrics, we provide a Docker image.
- Download the Docker image and run it:
docker run -dit --name detection-metrics -v [local_directory]:/root/volume/ -e DISPLAY=host.docker.internal:0 jderobot/detection-metrics:noetic
This will start the GUI. Provide a configuration file (appConfig.yml can be used) and you are ready to go. Check out the website for more information.
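As a minimal sketch of what such a configuration file might look like (the key names and paths below are assumptions for illustration, not the authoritative appConfig.yml schema), it mostly maps locations inside the mounted volume for datasets, weights, network configs and class-name files:

```sh
# Illustrative sketch only: the keys and paths are assumptions; check the
# documentation for the exact configuration schema.
cat > /root/volume/appConfig.yml <<'EOF'
datasetPath:     /root/volume/datasets
evaluationsPath: /root/volume/evaluations
weightsPath:     /root/volume/weights
netCfgPath:      /root/volume/cfg
namesPath:       /root/volume/names
EOF
```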
Check the installation guide here. This is also the recommended installation for contributors.
Check out the beginner's tutorial.
The top toolbar shows the different tools available.
Two image views are displayed, one with the ground truth and the other with the detected annotations. In the console output, log info is shown.