oXitip Help
__________

oXitip stands for Online X-Information Theoretic Inequality Prover. It is an online reincarnation of Xitip (http://xitip.epfl.ch), fully implemented in an object-oriented C/C++ class structure. Like ITIP and Xitip, oXitip is a machine-based proof checker, built on the theoretical framework developed by Raymond W. Yeung [4], to establish the validity of Shannon-type information inequalities and expressions. A valid information inequality (or expression) is a linear inequality (or expression) involving measures such as (single, conditional, or joint) entropy and mutual information. Standard information-theoretic notation is adopted: (Shannon) entropy is denoted by H and mutual information by I. The program takes an information expression (or inequality) along with optional information-theoretic constraints. A constraint can be any valid information expression or a dependency relationship among the random variables involved, such as independence (e.g., X and Y being independent), a deterministic mapping (e.g., Y=f(X)), or a Markov chain.

 

The program accepts input either in a multiline mode or as an equivalent single-line command. In the multiline window, the information expression (equality or inequality) to be checked is keyed in first (topmost row). Constraints, if any, are listed one by one on the following lines. For example, to check whether I(X;Y|Z)<=I(X;Y) is true when the random variables X, Y, Z form a simple Markov chain, the correct syntax is as follows:

I(X;Y|Z)<= I(X;Y)
X/Y/Z

The equivalent single-line command is generated automatically by the program, with the expression and each constraint set apart as quoted strings. Users who prefer a single-line entry may use that form instead; computationally the two are equivalent.
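For the Markov chain example above, the single-line form would look like the following (the quoting convention shown here mirrors Xitip's command line and is meant only as an illustration):

"I(X;Y|Z) <= I(X;Y)" "X/Y/Z"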

 

A random variable can be represented by a single character from the English alphabet, or by a string made up of letters, numbers, and underscores (_). Some examples below:

1) I(X;Y)+3 H(A)-1.234 I(X;A|Z) >=0
2) -H(Snow_Fall)+I(Snow_Fall;Winter|Temperature) >=0
3) I(x;Y)=H(x)-H(x|Y) 

Information measures can be scaled by real constants. However, additive constants other than 0 are not allowed. For instance, H(X)+3 >=0 is not a valid information inequality, whereas

1.245 H(X|Y)- 12.234 I(A;B|H) >=0 is a valid information inequality.

The equality or inequality sign can appear anywhere in the expression, but only one equality or inequality is allowed per expression. For example, the following forms are one and the same; you can input the expression in any of them.

1) H(X)-H(X|Y) >=0
2) H(X)>= H(X|Y)
3) H(X|Y) <= H(X)

Information Expression/Inequality Syntax Restrictions:
———————————————————————————

1) Case matters: x and X are treated as two different random variables.

2) An information inequality can contain "=", ">=", or "<=". That is, an equal sign must always appear; expressions with plain ">" or "<" are not allowed.

3) There is no restriction on the space between random variables or within expressions: e.g., I(X   ; Y |    Z) will be internally parsed as I(X;Y|Z).

 

Information Constraints:
————————————

Constraints are entered in the second (large) text box. Any number of constraints can be provided, each on its own line. The following types of constraints are supported.

1) Independence of random variables: X.Y means the random variables X and Y are independent. Similarly, A.B.C.D means the four random variables are mutually independent.

2) Markov chains. U/V/W/X refers to the random variables U->V->W->X forming a simple Markov chain.
3) Deterministic functional. If the random variable Y is a deterministic function of random variable X, i.e., Y=f(X), the notation Y:X may be used.
4) A valid information expression with equality (inequalities as constraints are not supported). A combined example is shown below.
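As a combined illustration (the variable names here are hypothetical), the following multiline input asks whether H(Y) <= H(X) whenever Y is a deterministic function of X; the second line is the functional-dependency constraint in the Y:X notation above:

H(Y) <= H(X)
Y:X

Since Y=f(X) forces H(Y|X)=0, this inequality is a valid Shannon-type consequence of the constraint.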
 
Difference from Xitip
——————————————
The core of oXitip is taken from Xitip. Parts of the computation have been rewritten in C++ for modularity and for speed/memory optimization. The linear programming engine has been migrated from QSopt to GLPK, thereby enabling a fully open-source distribution. The program runs on a cloud server. The computational complexity grows exponentially with the number of random variables, since an expression over n random variables leads to a linear program over the 2^n - 1 joint entropies. For up to 10 random variables, the result comes out in a few seconds. For expressions with more than 10 random variables, there is an option (not mandatory) to provide an Email ID; once the program successfully completes the task, an automatic Email is sent with the job and its result. The source code can be accessed from the Xitip webpage, http://xitip.epfl.ch.
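For readers curious about the underlying reduction, the sketch below sets up this linear program for the two-variable inequality I(X;Y) >= 0 using the GLPK C API. It is a minimal, self-contained illustration of the technique under our own naming and problem setup, not an excerpt of oXitip's actual code: the target inequality is Shannon-type if and only if minimizing it over the cone cut out by the elemental inequalities is bounded (with optimum 0).

/* A minimal sketch (not oXitip's internals) of the LP check for the
 * Shannon-type inequality I(X;Y) >= 0 over two random variables.
 * Coordinates: h1=H(X), h2=H(Y), h3=H(X,Y).  The target f.h is
 * minimized subject to the elemental inequalities A.h >= 0; a
 * bounded optimum of 0 proves the inequality. */
#include <stdio.h>
#include <glpk.h>

int main(void) {
    glp_prob *lp = glp_create_prob();
    glp_set_obj_dir(lp, GLP_MIN);

    /* Three free columns for the joint-entropy coordinates. */
    glp_add_cols(lp, 3);
    for (int j = 1; j <= 3; j++)
        glp_set_col_bnds(lp, j, GLP_FR, 0.0, 0.0);

    /* Objective: I(X;Y) = H(X) + H(Y) - H(X,Y). */
    glp_set_obj_coef(lp, 1,  1.0);
    glp_set_obj_coef(lp, 2,  1.0);
    glp_set_obj_coef(lp, 3, -1.0);

    /* Elemental inequalities for n = 2, each of the form a.h >= 0:
     *   row 1: H(X|Y) >= 0  ->  -h2 + h3      >= 0
     *   row 2: H(Y|X) >= 0  ->  -h1 + h3      >= 0
     *   row 3: I(X;Y) >= 0  ->   h1 + h2 - h3 >= 0 */
    glp_add_rows(lp, 3);
    for (int i = 1; i <= 3; i++)
        glp_set_row_bnds(lp, i, GLP_LO, 0.0, 0.0);

    int ia[8], ja[8];
    double ar[8];
    int k = 0;
    ia[++k] = 1; ja[k] = 2; ar[k] = -1.0;
    ia[++k] = 1; ja[k] = 3; ar[k] =  1.0;
    ia[++k] = 2; ja[k] = 1; ar[k] = -1.0;
    ia[++k] = 2; ja[k] = 3; ar[k] =  1.0;
    ia[++k] = 3; ja[k] = 1; ar[k] =  1.0;
    ia[++k] = 3; ja[k] = 2; ar[k] =  1.0;
    ia[++k] = 3; ja[k] = 3; ar[k] = -1.0;
    glp_load_matrix(lp, k, ia, ja, ar);

    glp_smcp parm;
    glp_init_smcp(&parm);
    parm.msg_lev = GLP_MSG_OFF;

    /* Bounded optimum 0 => Shannon-type; unbounded => not provable
     * from the elemental inequalities alone. */
    if (glp_simplex(lp, &parm) == 0 &&
        glp_get_status(lp) == GLP_OPT &&
        glp_get_obj_val(lp) > -1e-9)
        printf("Shannon-type: inequality holds.\n");
    else
        printf("Not provable from elemental inequalities.\n");

    glp_delete_prob(lp);
    return 0;
}

Compiling against GLPK (e.g., cc lp_sketch.c -lglpk, with a file name of your choosing) and running it prints that the inequality holds; a target outside the Shannon cone would instead drive the LP unbounded.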
 
Warranty/Licenses
————————————
oXitip is a free tool, developed with the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
GNU General Public License for more details.
 
Citation:
————
N. Rethnakar, S. Diggavi, T. Gläßle, E. Perron, R. Pulikkoonattu, R. W. Yeung, Y. Yan, oXitip: Online X-Information Theoretic Inequality Prover, http://www.oxitip.com, 2020.
 
References:
—————
  1. R. Pulikkoonattu, E. Perron, S. Diggavi, Xitip: Information-Theoretic Inequality Prover, http://xitip.epfl.ch
  2. R. W. Yeung, Y. Yan, ITIP: Information-Theoretic Inequality Prover, http://user-www.ie.cuhk.edu.hk/~ITIP/
  3. R. W. Yeung, “A new outlook on Shannon’s information measures,” IEEE Trans. Inform. Theory, vol. 37, pp. 466-474, May 1991.
  4. R. W. Yeung, “A framework for linear information inequalities,” IEEE Trans. Inform. Theory, vol. 43, pp. 1924-1934, Nov 1997.
  5. Z. Zhang and R. W. Yeung, “A non-Shannon-type conditional inequality of information quantities,” IEEE Trans. Inform. Theory, vol. 43, pp. 1982-1985, Nov 1997.
  6. Z. Zhang and R. W. Yeung, “On the characterization of entropy function via information inequalities,” IEEE Trans. Inform. Theory, vol. 44, pp. 1440-1452, Jul 1998.
  7. R. W. Yeung, “On entropy, information inequalities, and Groups.”
  8. R. W. Yeung, Information Theory and Network Coding, Springer, 2008.
  9. R. W. Yeung, “Facets of entropy,” Communications in Information and Systems, vol. 15, no. 1, pp. 87-117, 2015.