
Implement fluctuation complexity (#409)
* Fluctuation complexity

New information measure

* Docs
kahaaga authored Jun 7, 2024
1 parent a170f33 commit 3a6decd
Showing 8 changed files with 87 additions and 1 deletion.
4 changes: 4 additions & 0 deletions CHANGELOG.md
@@ -2,6 +2,10 @@

Changelog is kept with respect to version 0.11 of Entropies.jl. From version v2.0 onwards, this package has been renamed to ComplexityMeasures.jl.

## 3.6

- New information measure: `FluctuationComplexity`.

## 3.5

- New multiscale API.
2 changes: 1 addition & 1 deletion Project.toml
@@ -2,7 +2,7 @@ name = "ComplexityMeasures"
uuid = "ab4b797d-85ee-42ba-b621-05d793b346a2"
authors = "Kristian Agasøster Haaga <[email protected]>, George Datseries <[email protected]>"
repo = "https://github.com/juliadynamics/ComplexityMeasures.jl.git"
version = "3.5.0"
version = "3.6.0"

[deps]
Combinatorics = "861a8166-3701-5b0c-9a16-15d98fcdc6aa"
11 changes: 11 additions & 0 deletions docs/refs.bib
@@ -872,3 +872,14 @@ @article{LeonenkoProzantoSavani2008
doi = {10.1214/07-AOS539},
URL = {https://doi.org/10.1214/07-AOS539}
}

@article{Bates1993,
title={Measuring complexity using information fluctuation},
author={Bates, John E and Shepard, Harvey K},
journal={Physics Letters A},
volume={172},
number={6},
pages={416--425},
year={1993},
publisher={Elsevier}
}
1 change: 1 addition & 0 deletions docs/src/information_measures.md
@@ -36,6 +36,7 @@ ShannonExtropy
RenyiExtropy
TsallisExtropy
ElectronicEntropy
FluctuationComplexity
```

## Discrete information estimators
58 changes: 58 additions & 0 deletions src/information_measure_definitions/fluctuation_complexity.jl
@@ -0,0 +1,58 @@
export FluctuationComplexity

"""
    FluctuationComplexity <: InformationMeasure
    FluctuationComplexity(; definition = Shannon(; base = 2), base = 2)

The "fluctuation complexity" quantifies the standard deviation of the information content of the states
``\\omega_i`` around some summary statistic ([`InformationMeasure`](@ref)) of a PMF. Specifically, given some
outcome space ``\\Omega`` with outcomes ``\\omega_i \\in \\Omega``
and a probability mass function ``p(\\Omega) = \\{ p(\\omega_i) \\}_{i=1}^N``, it is defined as
```math
\\sigma_I(p) := \\sqrt{\\sum_{i=1}^N p_i(I_i - H_*)^2}
```
where ``I_i = -\\log_{base}(p_i)`` is the information content of the i-th outcome. The type of information measure
``*`` is controlled by `definition`.
The `base` controls the base of the logarithm that goes into the information content terms. Make sure that
you pick a `base` that is consistent with the base chosen for the `definition` (relevant for e.g. [`Shannon`](@ref)).

## Properties

If `definition` is the [`Shannon`](@ref) entropy, then we recover
the [Shannon-type information fluctuation complexity](https://en.wikipedia.org/wiki/Information_fluctuation_complexity)
from [Bates1993](@cite). In that case, the fluctuation complexity is zero both for PMFs with a single
non-zero element and for the uniform distribution.

If `definition` is not the Shannon entropy, then the properties of the measure vary, and do not necessarily
match those described in [Bates1993](@cite).

!!! note "Potential for new research"
    As far as we know, using other information measures besides Shannon entropy for the
    fluctuation complexity hasn't been explored in the literature yet. Our implementation, however, allows for it.
    Please inform us if you try some new combinations!
"""
struct FluctuationComplexity{M <: InformationMeasure, I <: Integer} <: InformationMeasure
    definition::M
    base::I

    function FluctuationComplexity(; definition::D = Shannon(base = 2), base::I = 2) where {D, I}
        # Compare against the instance, not the type parameter `D` (a type is
        # never `isa FluctuationComplexity`, so the original check could never fire).
        if definition isa FluctuationComplexity
            throw(ArgumentError("Cannot use `FluctuationComplexity` as the summary statistic for `FluctuationComplexity`. Please select some other information measure, like `Shannon`."))
        end
        return new{D, I}(definition, base)
    end
end

# Fluctuation complexity is zero when p_i = 1/N or when p = (1, 0, 0, ...).
function information(e::FluctuationComplexity, probs::Probabilities)
    def = e.definition
    h = information(def, probs)
    non0_probs = Iterators.filter(!iszero, vec(probs))
    logf = log_with_base(e.base)
    return sqrt(sum(pᵢ * (-logf(pᵢ) - h) ^ 2 for pᵢ in non0_probs))
end

# The maximum is not generally known.
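The formula implemented above can be checked numerically outside Julia. Below is a minimal Python sketch of the Shannon-type fluctuation complexity (base 2), independent of the package; `fluctuation_complexity` is a hypothetical helper name introduced here for illustration, not part of ComplexityMeasures.jl:

```python
from math import log2, sqrt

def fluctuation_complexity(probs):
    # Standard deviation of the information content I_i = -log2(p_i)
    # around the Shannon entropy H, as in sigma_I(p) above.
    nonzero = [p for p in probs if p > 0]
    h = -sum(p * log2(p) for p in nonzero)  # Shannon entropy, base 2
    return sqrt(sum(p * (-log2(p) - h) ** 2 for p in nonzero))

# The Wikipedia example used in the test file; rounds to 0.56.
p = [2/17, 2/17, 1/34, 5/34, 2/17, 2/17, 2/17, 4/17]
print(round(fluctuation_complexity(p), 2))

# Zero (up to floating-point error) for uniform and single-element PMFs.
print(fluctuation_complexity([0.2] * 5))
print(fluctuation_complexity([1.0, 0.0]))
```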
@@ -12,3 +12,5 @@ include("renyi_extropy.jl")

# Measures that are not strictly entropies nor extropies (but perhaps a combination)
include("electronic.jl")

include("fluctuation_complexity.jl")
9 changes: 9 additions & 0 deletions test/infomeasures/infomeasure_types/fluctuation_complexity.jl
@@ -0,0 +1,9 @@
# Examples from https://en.wikipedia.org/wiki/Information_fluctuation_complexity
# for the Shannon fluctuation complexity.
p = Probabilities([2//17, 2//17, 1//34, 5//34, 2//17, 2//17, 2//17, 4//17])
def = Shannon(base = 2)
c = FluctuationComplexity(definition = def, base = 2)
@test round(information(c, p), digits = 2) ≈ 0.56
# Zero both for uniform and single-element PMFs.
@test information(c, Probabilities([0.2, 0.2, 0.2, 0.2, 0.2])) == 0.0
@test information(c, Probabilities([1.0, 0.0])) == 0.0
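The two zero cases tested above follow directly from the definition. For the uniform distribution ``p_i = 1/N``, every information content equals the entropy, ``I_i = \log_b N = H``, so every term in the sum vanishes:

```math
\sigma_I = \sqrt{\sum_{i=1}^N \frac{1}{N}\left(\log_b N - \log_b N\right)^2} = 0.
```

For a single-element PMF ``p = (1, 0, \ldots)``, the only contributing outcome has ``I_1 = -\log_b 1 = 0`` and ``H = 0``, so ``\sigma_I = 0`` as well.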
1 change: 1 addition & 0 deletions test/infomeasures/infomeasures.jl
@@ -15,6 +15,7 @@ include("infomeasure_types/tsallis_extropy.jl")
include("infomeasure_types/renyi_extropy.jl")

include("infomeasure_types/electronic_entropy.jl")
include("infomeasure_types/fluctuation_complexity.jl")


# Estimators
