74 results for “topic:shannon”
AI productivity studio with smart chat, autonomous agents, and 300+ assistants. Unified access to frontier LLMs
A curated list of awesome baseband research resources
A Python library for the computation of various concentration, inequality and diversity indices
An open-source toolkit for entropic data analysis
A repository to collect papers and programs of historical interest to AI. Mostly gathered while reading Pamela McCorduck's Machines Who Think
A collection of notebooks for Natural Language Processing
Entropy is a measure of the uncertainty in a random variable. This application calculates the entropy of text; the current example calculates the entropy of the sequence "TTTAAGCC". In the context of information theory, "entropy" refers to the Shannon entropy.
This application calculates the entropy of a string. The core of the implementation is a function called "entropy" which takes a text sequence as a parameter and returns its Shannon entropy (see the sketch after this list). Entropy is a measure of the uncertainty in a random variable.
Uses DataVisualization.Charting and progress bars for a visually pleasing way to view how the entropy changes across a file.
Gamma distribution differential entropy.
Function integrator for iOS with variable accuracy
A computer player for the game Hangman based on Shannon entropy and information theory
Degenerate distribution entropy.
Student's t distribution entropy.
🔑 Repository for the course EEL7416 - Introduction to Coding
Inverse gamma distribution differential entropy.
Beta distribution differential entropy.
Binomial distribution entropy.
Fréchet distribution differential entropy.
Lévy distribution entropy.
Normal distribution differential entropy (closed form shown in the sketch after this list).
Laplace distribution differential entropy.
Bernoulli distribution entropy.
F distribution differential entropy.
Logistic distribution entropy.
Cauchy distribution differential entropy.
Computes Shannon's diversity index.
Exponential distribution differential entropy.
Course project for Information Theory
Erlang distribution differential entropy.
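Several entries above (the text-entropy calculators and the Hangman player) compute the same quantity: the Shannon entropy H(X) = -Σ p(x) log2 p(x) of the symbol frequencies in a string. Their sources are not shown here, so the following is only a minimal Python sketch of that calculation, assuming plain frequency estimation; the function name `entropy` echoes the one mentioned in the second description, but the body is my own.

```python
from collections import Counter
from math import log2

def entropy(text: str) -> float:
    """Shannon entropy of a string, in bits per symbol."""
    counts = Counter(text)          # symbol -> occurrence count
    n = len(text)
    # H = -sum_x p(x) * log2 p(x), with p(x) estimated as count/n
    return -sum((c / n) * log2(c / n) for c in counts.values())

# The "TTTAAGCC" example from the descriptions above:
# frequencies T=3/8, A=2/8, C=2/8, G=1/8 give about 1.906 bits.
print(entropy("TTTAAGCC"))  # ~1.9056
```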
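The one-line entries above form a family of repositories, each implementing the (differential) entropy of a single distribution in closed form. As one representative case, here is a hedged sketch of the normal-distribution formula h = ½ ln(2πeσ²), cross-checked against SciPy's built-in `entropy` method; the function name is mine, and the other distributions follow the same pattern with their own closed forms.

```python
from math import log, pi, e
from scipy.stats import norm

def normal_differential_entropy(sigma: float) -> float:
    """Closed-form differential entropy of N(mu, sigma^2), in nats.

    h = 0.5 * ln(2 * pi * e * sigma^2); note it does not depend on the mean.
    """
    return 0.5 * log(2 * pi * e * sigma ** 2)

sigma = 2.0
print(normal_differential_entropy(sigma))   # ~2.1121
print(norm(loc=0.0, scale=sigma).entropy()) # same value via SciPy
```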