
Commit

build based on 7bfb8b9
Documenter.jl committed Nov 20, 2024
1 parent 87bde3e commit b621656
Showing 26 changed files with 206 additions and 133 deletions.
2 changes: 1 addition & 1 deletion dev/.documenter-siteinfo.json
Original file line number Diff line number Diff line change
@@ -1 +1 @@
{"documenter":{"julia_version":"1.11.1","generation_timestamp":"2024-11-16T21:22:40","documenter_version":"1.8.0"}}
{"documenter":{"julia_version":"1.11.1","generation_timestamp":"2024-11-20T21:31:56","documenter_version":"1.8.0"}}
48 changes: 24 additions & 24 deletions dev/api/counts_and_probabilities_api/index.html

Large diffs are not rendered by default.

Binary file added dev/api/cross_map_api/88f785a0.png
Binary file removed dev/api/cross_map_api/cfd7051a.png
Binary file not shown.
18 changes: 9 additions & 9 deletions dev/api/cross_map_api/index.html

Large diffs are not rendered by default.

12 changes: 6 additions & 6 deletions dev/api/discretization_counts_probs_api/index.html

Large diffs are not rendered by default.

40 changes: 20 additions & 20 deletions dev/api/information_multivariate_api/index.html

Large diffs are not rendered by default.

4 changes: 2 additions & 2 deletions dev/api/information_single_variable_api/index.html
@@ -24,7 +24,7 @@
h_sh = information(Kraskov(Shannon()), x)
h_vc = information(Vasicek(Shannon()), x)</code></pre><p>A normal distribution has a base-e Shannon differential entropy of <code>0.5*log(2π) + 0.5</code> nats.</p><pre><code class="language-julia hljs">est = Kraskov(k = 5, base = ℯ) # Base `ℯ` for nats.
h = information(est, randn(2_000_000))
abs(h - 0.5*log(2π) - 0.5) # ≈ 0.0001</code></pre></div><a class="docs-sourcelink" target="_blank" href="https://github.com/JuliaDynamics/ComplexityMeasures.jl/blob/v3.7.3/src/core/information_functions.jl#L227-L267">source</a></section><section><div><pre><code class="language-julia hljs">information(est::MultivariateInformationMeasureEstimator, x...)</code></pre><p>Estimate some <a href="../../associations/#Associations.MultivariateInformationMeasure"><code>MultivariateInformationMeasure</code></a> on input data <code>x...</code>, using the given <a href="../information_multivariate_api/#Associations.MultivariateInformationMeasureEstimator"><code>MultivariateInformationMeasureEstimator</code></a>.</p><p>This is just a convenience wrapper around <a href="../../associations/#Associations.association"><code>association</code></a><code>(est, x...)</code>.</p></div><a class="docs-sourcelink" target="_blank" href="https://github.com/JuliaDynamics/Associations.jl/blob/1054b8f4dec9d530e7d9ccfcf3772d63c7309433/src/methods/information/core.jl#L158-L165">source</a></section></article><h2 id="Single-variable-information-measures"><a class="docs-heading-anchor" href="#Single-variable-information-measures">Single-variable information measures</a><a id="Single-variable-information-measures-1"></a><a class="docs-heading-anchor-permalink" href="#Single-variable-information-measures" title="Permalink"></a></h2><article class="docstring"><header><a class="docstring-article-toggle-button fa-solid fa-chevron-down" href="javascript:;" title="Collapse docstring"></a><a class="docstring-binding" id="ComplexityMeasures.Shannon" href="#ComplexityMeasures.Shannon"><code>ComplexityMeasures.Shannon</code></a><span class="docstring-category">Type</span><span class="is-flex-grow-1 docstring-article-toggle-button" title="Collapse docstring"></span></header><section><div><pre><code class="language-julia hljs">Shannon &lt;: InformationMeasure
abs(h - 0.5*log(2π) - 0.5) # ≈ 0.0001</code></pre></div><a class="docs-sourcelink" target="_blank" href="https://github.com/JuliaDynamics/ComplexityMeasures.jl/blob/v3.7.3/src/core/information_functions.jl#L227-L267">source</a></section><section><div><pre><code class="language-julia hljs">information(est::MultivariateInformationMeasureEstimator, x...)</code></pre><p>Estimate some <a href="../../associations/#Associations.MultivariateInformationMeasure"><code>MultivariateInformationMeasure</code></a> on input data <code>x...</code>, using the given <a href="../information_multivariate_api/#Associations.MultivariateInformationMeasureEstimator"><code>MultivariateInformationMeasureEstimator</code></a>.</p><p>This is just a convenience wrapper around <a href="../../associations/#Associations.association"><code>association</code></a><code>(est, x...)</code>.</p></div><a class="docs-sourcelink" target="_blank" href="https://github.com/JuliaDynamics/Associations.jl/blob/7bfb8b927d41cecb4fe467837308cf64d54b3174/src/methods/information/core.jl#L158-L165">source</a></section></article><h2 id="Single-variable-information-measures"><a class="docs-heading-anchor" href="#Single-variable-information-measures">Single-variable information measures</a><a id="Single-variable-information-measures-1"></a><a class="docs-heading-anchor-permalink" href="#Single-variable-information-measures" title="Permalink"></a></h2><article class="docstring"><header><a class="docstring-article-toggle-button fa-solid fa-chevron-down" href="javascript:;" title="Collapse docstring"></a><a class="docstring-binding" id="ComplexityMeasures.Shannon" href="#ComplexityMeasures.Shannon"><code>ComplexityMeasures.Shannon</code></a><span class="docstring-category">Type</span><span class="is-flex-grow-1 docstring-article-toggle-button" title="Collapse docstring"></span></header><section><div><pre><code class="language-julia hljs">Shannon &lt;: InformationMeasure
Shannon(; base = 2)</code></pre><p>The Shannon (<a href="../../references/#Shannon1948">Shannon, 1948</a>) entropy, used with <a href="#ComplexityMeasures.information"><code>information</code></a> to compute:</p><p class="math-container">\[H(p) = - \sum_i p[i] \log(p[i])\]</p><p>with the <span>$\log$</span> at the given <code>base</code>.</p><p>The maximum value of the Shannon entropy is <span>$\log_{base}(L)$</span>, which is the entropy of the uniform distribution with <span>$L$</span> the <a href="https://juliadynamics.github.io/DynamicalSystemsDocs.jl/complexitymeasures/stable/probabilities/#ComplexityMeasures.total_outcomes"><code>total_outcomes</code></a>.</p></div><a class="docs-sourcelink" target="_blank" href="https://github.com/JuliaDynamics/ComplexityMeasures.jl/blob/v3.7.3/src/information_measure_definitions/shannon.jl#L3-L16">source</a></section></article><article class="docstring"><header><a class="docstring-article-toggle-button fa-solid fa-chevron-down" href="javascript:;" title="Collapse docstring"></a><a class="docstring-binding" id="ComplexityMeasures.Renyi" href="#ComplexityMeasures.Renyi"><code>ComplexityMeasures.Renyi</code></a><span class="docstring-category">Type</span><span class="is-flex-grow-1 docstring-article-toggle-button" title="Collapse docstring"></span></header><section><div><pre><code class="language-julia hljs">Renyi &lt;: InformationMeasure
Renyi(q, base = 2)
Renyi(; q = 1.0, base = 2)</code></pre><p>The Rényi generalized order-<code>q</code> entropy (<a href="../../references/#Rényi1961">Rényi, 1961</a>), used with <a href="#ComplexityMeasures.information"><code>information</code></a> to compute an entropy with units given by <code>base</code> (typically <code>2</code> or <code>MathConstants.e</code>).</p><p><strong>Description</strong></p><p>Let <span>$p$</span> be an array of probabilities (summing to 1). Then the Rényi generalized entropy is</p><p class="math-container">\[H_q(p) = \frac{1}{1-q} \log \left(\sum_i p[i]^q\right)\]</p><p>and generalizes other known entropies, like e.g. the information entropy (<span>$q = 1$</span>, see <a href="../../references/#Shannon1948">Shannon (1948)</a>), the maximum entropy (<span>$q=0$</span>, also known as Hartley entropy), or the correlation entropy (<span>$q = 2$</span>, also known as collision entropy).</p><p>The maximum value of the Rényi entropy is <span>$\log_{base}(L)$</span>, which is the entropy of the uniform distribution with <span>$L$</span> the <a href="https://juliadynamics.github.io/DynamicalSystemsDocs.jl/complexitymeasures/stable/probabilities/#ComplexityMeasures.total_outcomes"><code>total_outcomes</code></a>.</p></div><a class="docs-sourcelink" target="_blank" href="https://github.com/JuliaDynamics/ComplexityMeasures.jl/blob/v3.7.3/src/information_measure_definitions/renyi.jl#L3-L28">source</a></section></article><article class="docstring"><header><a class="docstring-article-toggle-button fa-solid fa-chevron-down" href="javascript:;" title="Collapse docstring"></a><a class="docstring-binding" id="ComplexityMeasures.Tsallis" href="#ComplexityMeasures.Tsallis"><code>ComplexityMeasures.Tsallis</code></a><span class="docstring-category">Type</span><span class="is-flex-grow-1 docstring-article-toggle-button" title="Collapse docstring"></span></header><section><div><pre><code class="language-julia hljs">Tsallis &lt;: InformationMeasure
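The hunk above carries the docstrings for the Shannon and Rényi measures, including the claim that the uniform distribution over `L` outcomes attains the maximum value `log_base(L)` for both. As a quick sanity check, here is a stdlib-only Python sketch of the two definitions (independent of the Julia API shown in the diff; the function names are mine):

```python
import math

def shannon(p, base=2):
    # H(p) = -sum_i p[i] * log(p[i]), skipping zero-probability outcomes
    return -sum(pi * math.log(pi, base) for pi in p if pi > 0)

def renyi(p, q, base=2):
    # H_q(p) = 1/(1-q) * log(sum_i p[i]^q); the q -> 1 limit is Shannon
    if q == 1:
        return shannon(p, base)
    return math.log(sum(pi ** q for pi in p), base) / (1 - q)

L = 8
uniform = [1 / L] * L
print(shannon(uniform))      # approx. 3.0 = log2(8), the stated maximum
print(renyi(uniform, q=2))   # approx. 3.0 again: uniform maximizes every order q
```

A degenerate distribution gives the opposite extreme: `shannon([1.0])` is exactly `0.0`.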
@@ -63,4 +63,4 @@
\dfrac{1}{n} \sum_{i = 1}^n \log
\left[ \dfrac{ \sum_{j=i-m}^{i+m}(\bar{X}_{(j)} -
\tilde{X}_{(i)})(j - i)}{n \sum_{j=i-m}^{i+m} (\bar{X}_{(j)} - \tilde{X}_{(i)})^2}
\right],\]</p><p>where</p><p class="math-container">\[\tilde{X}_{(i)} = \dfrac{1}{2m + 1} \sum_{j = i - m}^{i + m} X_{(j)}.\]</p><p>See also: <a href="#ComplexityMeasures.information"><code>information</code></a>, <a href="#ComplexityMeasures.AlizadehArghami"><code>AlizadehArghami</code></a>, <a href="#ComplexityMeasures.Ebrahimi"><code>Ebrahimi</code></a>, <a href="#ComplexityMeasures.Vasicek"><code>Vasicek</code></a>, <a href="#ComplexityMeasures.DifferentialInfoEstimator"><code>DifferentialInfoEstimator</code></a>.</p></div><a class="docs-sourcelink" target="_blank" href="https://github.com/JuliaDynamics/ComplexityMeasures.jl/blob/v3.7.3/src/differential_info_estimators/order_statistics/Correa.jl#L3-L55">source</a></section></article></article><nav class="docs-footer"><a class="docs-footer-prevpage" href="../counts_and_probabilities_api/">« Multivariate counts and probabilities API</a><a class="docs-footer-nextpage" href="../information_multivariate_api/">Multivariate information API »</a><div class="flexbox-break"></div><p class="footer-message">Powered by <a href="https://github.com/JuliaDocs/Documenter.jl">Documenter.jl</a> and the <a href="https://julialang.org/">Julia Programming Language</a>.</p></nav></div><div class="modal" id="documenter-settings"><div class="modal-background"></div><div class="modal-card"><header class="modal-card-head"><p class="modal-card-title">Settings</p><button class="delete"></button></header><section class="modal-card-body"><p><label class="label">Theme</label><div class="select"><select id="documenter-themepicker"><option value="auto">Automatic (OS)</option><option value="documenter-light">documenter-light</option><option value="documenter-dark">documenter-dark</option><option value="catppuccin-latte">catppuccin-latte</option><option value="catppuccin-frappe">catppuccin-frappe</option><option value="catppuccin-macchiato">catppuccin-macchiato</option><option 
value="catppuccin-mocha">catppuccin-mocha</option></select></div></p><hr/><p>This document was generated with <a href="https://github.com/JuliaDocs/Documenter.jl">Documenter.jl</a> version 1.8.0 on <span class="colophon-date" title="Saturday 16 November 2024 21:22">Saturday 16 November 2024</span>. Using Julia version 1.11.1.</p></section><footer class="modal-card-foot"></footer></div></div></div></body><div data-docstringscollapsed="true"></div></html>
\right],\]</p><p>where</p><p class="math-container">\[\tilde{X}_{(i)} = \dfrac{1}{2m + 1} \sum_{j = i - m}^{i + m} X_{(j)}.\]</p><p>See also: <a href="#ComplexityMeasures.information"><code>information</code></a>, <a href="#ComplexityMeasures.AlizadehArghami"><code>AlizadehArghami</code></a>, <a href="#ComplexityMeasures.Ebrahimi"><code>Ebrahimi</code></a>, <a href="#ComplexityMeasures.Vasicek"><code>Vasicek</code></a>, <a href="#ComplexityMeasures.DifferentialInfoEstimator"><code>DifferentialInfoEstimator</code></a>.</p></div><a class="docs-sourcelink" target="_blank" href="https://github.com/JuliaDynamics/ComplexityMeasures.jl/blob/v3.7.3/src/differential_info_estimators/order_statistics/Correa.jl#L3-L55">source</a></section></article></article><nav class="docs-footer"><a class="docs-footer-prevpage" href="../counts_and_probabilities_api/">« Multivariate counts and probabilities API</a><a class="docs-footer-nextpage" href="../information_multivariate_api/">Multivariate information API »</a><div class="flexbox-break"></div><p class="footer-message">Powered by <a href="https://github.com/JuliaDocs/Documenter.jl">Documenter.jl</a> and the <a href="https://julialang.org/">Julia Programming Language</a>.</p></nav></div><div class="modal" id="documenter-settings"><div class="modal-background"></div><div class="modal-card"><header class="modal-card-head"><p class="modal-card-title">Settings</p><button class="delete"></button></header><section class="modal-card-body"><p><label class="label">Theme</label><div class="select"><select id="documenter-themepicker"><option value="auto">Automatic (OS)</option><option value="documenter-light">documenter-light</option><option value="documenter-dark">documenter-dark</option><option value="catppuccin-latte">catppuccin-latte</option><option value="catppuccin-frappe">catppuccin-frappe</option><option value="catppuccin-macchiato">catppuccin-macchiato</option><option 
value="catppuccin-mocha">catppuccin-mocha</option></select></div></p><hr/><p>This document was generated with <a href="https://github.com/JuliaDocs/Documenter.jl">Documenter.jl</a> version 1.8.0 on <span class="colophon-date" title="Wednesday 20 November 2024 21:31">Wednesday 20 November 2024</span>. Using Julia version 1.11.1.</p></section><footer class="modal-card-foot"></footer></div></div></div></body><div data-docstringscollapsed="true"></div></html>
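The final hunk changes the colophon date beneath the Correa docstring, which defines a spacing-type differential entropy estimator: a local least-squares density estimate on the order statistics, averaged in log. The following stdlib-only Python sketch follows that formula; the function name and the clipping of window indices to `[0, n-1]` at the boundaries are my assumptions, not part of the diffed docs:

```python
import math
import random

def correa_entropy(x, m=5):
    # Spacing-based differential entropy estimate (in nats), following the
    # Correa-style formula in the docstring above: for each order statistic
    # X_(i), regress the window X_(i-m..i+m) against its indices to get a
    # local slope; b_i = slope / n estimates the density f(X_(i)), and
    # H = -(1/n) * sum_i log(b_i).
    xs = sorted(x)
    n = len(xs)
    total = 0.0
    for i in range(n):
        lo, hi = max(0, i - m), min(n - 1, i + m)  # clip window at the edges
        window = xs[lo:hi + 1]
        xbar = sum(window) / len(window)           # local mean of the window
        num = sum((xs[j] - xbar) * (j - i) for j in range(lo, hi + 1))
        den = n * sum((xs[j] - xbar) ** 2 for j in range(lo, hi + 1))
        total -= math.log(num / den)
    return total / n

random.seed(1)
x = [random.random() for _ in range(2000)]  # Uniform(0,1): true entropy is 0 nats
print(correa_entropy(x))                    # should land near 0
```

Because the data are sorted internally, the estimate is invariant to the order of the input sample.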