Suppose M is a set of objects m, each having attributes X and Y. If X and Y can take only one value for a given m (i.e. X and Y are random variables with P(X=x_i|M=m_i) and P(Y=y_i|M=m_i)), it's possible to calculate the mutual information of X and Y. But what if X can have multiple outcomes at once? E.g. for m_3, X={x1,x2}; in general, the outcome of X is a subset of all possible outcomes. Can mutual information, or some other measure of dependence, be computed in such a case?
Is it possible to split X into binary random variables X_1, X_2, etc., where X_1=1 iff X contains x1 and X_1=0 otherwise, then compute I(X_i,Y_j) for all combinations of i and j and sum up the results in order to get I(X,Y)?
Thanks.
Example:
m_1: X={a,b}, Y={x,y}; m_2: X={c}, Y={z,x}
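Here is a minimal sketch of the splitting idea in Python, assuming a toy dataset built from the example above (with a hypothetical third object added so the counts are less degenerate). One caveat: because the indicators X_1, X_2, ... can be dependent on one another, the sum of the pairwise I(X_i,Y_j) terms is not in general equal to a joint mutual information, though each pairwise term is still a valid dependence measure on its own.

```python
from collections import Counter
from math import log2

# Toy data: each object m is a pair (set-valued X, set-valued Y).
data = [
    ({"a", "b"}, {"x", "y"}),   # m_1 from the example
    ({"c"},      {"z", "x"}),   # m_2 from the example
    ({"a", "c"}, {"y"}),        # m_3, made up for illustration
]

def indicator(samples, item):
    """Binarize: X_i = 1 iff the outcome set contains `item`, else 0."""
    return [int(item in s) for s in samples]

def mutual_info(u, v):
    """Empirical mutual information (in bits) of two discrete sequences."""
    n = len(u)
    p_u, p_v, p_uv = Counter(u), Counter(v), Counter(zip(u, v))
    return sum(
        (c / n) * log2(c * n / (p_u[a] * p_v[b]))
        for (a, b), c in p_uv.items()
    )

xs = [x for x, _ in data]
ys = [y for _, y in data]

# Pairwise I(X_i, Y_j) over all indicator combinations.
for xi in sorted(set().union(*xs)):
    for yj in sorted(set().union(*ys)):
        mi = mutual_info(indicator(xs, xi), indicator(ys, yj))
        print(f"I(X_{xi}, Y_{yj}) = {mi:.3f}")
```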
If I'm not wrong, the premise you set is that the outcome of X is a subset of the possible values {x1, ..., xn} (and likewise for Y), and then you want to define

I(X,Y) = sum over all i,j of I(X_i, Y_j)

where X_i and Y_j are the binary indicator variables you describe.
Well, this increases the computational complexity of the problem significantly (with n elementary outcomes there are up to 2^n distinct subsets), but you can still measure dependence in the same way: instead of correlating two single-valued outcomes X and Y, you correlate two subset-valued outcomes, treating each subset as one value.
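A minimal, self-contained sketch of that subset-as-value approach, using the same hypothetical toy data as above: each outcome is wrapped in a frozenset so it is hashable and can be counted as a single categorical value. Keep in mind that reliable estimation over an alphabet of up to 2^n subsets needs correspondingly many samples.

```python
from collections import Counter
from math import log2

# Same toy data as above; each subset is now treated as one atomic outcome.
data = [
    ({"a", "b"}, {"x", "y"}),   # m_1
    ({"c"},      {"z", "x"}),   # m_2
    ({"a", "c"}, {"y"}),        # m_3, made up for illustration
]
xs = [frozenset(x) for x, _ in data]   # frozenset is hashable
ys = [frozenset(y) for _, y in data]

# Plain empirical mutual information over the subset-valued outcomes.
n = len(data)
p_x, p_y, p_xy = Counter(xs), Counter(ys), Counter(zip(xs, ys))
mi = sum((c / n) * log2(c * n / (p_x[a] * p_y[b])) for (a, b), c in p_xy.items())
print(f"I(X, Y) = {mi:.3f} bits")
```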