State-dependent measures that converge to the mutual information
In information theory, specific-information is the generic name given to the family of state-dependent measures that in expectation converge to the mutual information. There are currently three known varieties of specific information, usually denoted $I_V$, $I_S$, and $I_{ssi}$. The specific-information between a random variable $X$ and a state $Y = y$ is written as $I(X; Y = y)$.
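To make the phrase "in expectation converge to the mutual information" concrete, the minimal sketch below evaluates two commonly used state-dependent quantities, the Kullback-Leibler divergence between the posterior $p(x \mid Y=y)$ and the prior $p(x)$, and the entropy reduction $H(X) - H(X \mid Y=y)$, on a small made-up joint distribution, then averages each over $p(y)$ to recover the same mutual information $I(X;Y)$. The joint distribution is invented for illustration, and which of these forms corresponds to the article's symbols $I_V$, $I_S$, or $I_{ssi}$ is not specified here; the function names are therefore descriptive rather than canonical.

```python
import numpy as np

# Hypothetical joint distribution p(x, y) over a small discrete alphabet;
# values are illustrative only. Rows index x, columns index y.
p_xy = np.array([[0.20, 0.05],
                 [0.10, 0.25],
                 [0.05, 0.35]])

p_x = p_xy.sum(axis=1)  # marginal p(x)
p_y = p_xy.sum(axis=0)  # marginal p(y)

def kl_posterior_prior(y):
    """KL divergence between p(x | Y=y) and p(x): one state-dependent measure."""
    p_x_given_y = p_xy[:, y] / p_y[y]
    return np.sum(p_x_given_y * np.log2(p_x_given_y / p_x))

def entropy_reduction(y):
    """H(X) - H(X | Y=y): another state-dependent measure."""
    p_x_given_y = p_xy[:, y] / p_y[y]
    h_x = -np.sum(p_x * np.log2(p_x))
    h_x_given_y = -np.sum(p_x_given_y * np.log2(p_x_given_y))
    return h_x - h_x_given_y

# Each measure assigns a (generally different) value to an individual state Y=y,
# but averaging either over p(y) yields the same mutual information I(X;Y).
mi_from_kl = sum(p_y[y] * kl_posterior_prior(y) for y in range(len(p_y)))
mi_from_h = sum(p_y[y] * entropy_reduction(y) for y in range(len(p_y)))
print(f"I(X;Y) via KL form:               {mi_from_kl:.4f} bits")
print(f"I(X;Y) via entropy-reduction form: {mi_from_h:.4f} bits")
```

Running the sketch prints the same value from both decompositions, which is the defining property of the family: the per-state values differ, but their expectation over $Y$ is the mutual information.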