In information theory, one major goal is to find useful functions that summarize the amount of information contained in the interaction of several random variables. Specifically, one can ask how the classical Shannon entropy, mutual information, and the higher interaction information functions relate to each other. This is formally answered by Hu's theorem, which is widely known in the form of information diagrams: it relates disjoint unions of shapes in a Venn diagram to summation rules of information functions, thereby establishing a bridge from set theory to information theory. While a proof of this theorem is known, it has to date not been analyzed in detail to determine the generality in which it holds. In this work, we view random variables together ...