Combining the power of Shannon Entropy and Kolmogorov-Chaitin complexity by way of Solomonoff-Levin Probability


Objectively characterising an object is a major challenge in science. Few measures are capable of doing so; in particular, measures that are algorithmic rather than merely statistical in nature stand a better chance, because algorithmic methods are, in principle, better equipped to avoid spurious correlations and to avoid confounding regularity with causality. However, algorithmic measures are ultimately uncomputable, and scientists have therefore long avoided them. Nevertheless, approximations are possible, and when combined with less powerful but more scalable measures they may offer the best of both worlds: a powerful yet scalable hybrid measure. In this work we explore the capabilities and limitations of such a measure, one that combines two major indices in complexity theory, Shannon entropy and Kolmogorov-Chaitin complexity, putting classical and algorithmic information theory to work together. We demonstrate that the measure is well behaved, yields stable results even when potential issues arise, and that its numerical errors are either bounded or converge in the worst case to the statistical measures already in use in the field; this convergence also allows for error corrections. In summary, we present a hybrid measure able to divide and conquer a problem, and study it both theoretically and experimentally (a sketch of the underlying quantities and of the divide-and-conquer scheme follows below).

See papers: J6, J17, J20, J22. For applications of these tools to several different areas, see papers: J16, J13, J10, J8, J4, J2, J18, J19, J21, J26, J27, J30, J25 and P10. See also papers: J13, J16, J22, J21, P31, P32, P24 and, more recently, J44.

Collaborators: Fernando Soler-Toscano, Narsis A. Kiani, Jean-Paul Delahaye, Nicolas Gauvrit, Jesper Tegnér.

Funding: National Science and Technology Council of Mexico (Conacyt), John Templeton Foundation, Swedish Research Council.
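
The quantities named in the title can be made concrete as follows. The Solomonoff-Levin algorithmic probability of a string s is the probability that a prefix-free universal Turing machine U produces s when run on random programs, and Levin's Coding Theorem relates it to Kolmogorov-Chaitin complexity. The block-decomposition formula, written here as a sketch consistent with this line of work, is the divide-and-conquer step: local algorithmic estimates (CTM) for short blocks are combined with an entropy-like logarithmic term for repeated blocks.

```latex
\[
  m(s) \;=\; \sum_{p \,:\, U(p)\,=\,s} 2^{-|p|},
  \qquad
  K(s) \;=\; -\log_2 m(s) \,+\, O(1)
\]
\[
  \mathrm{BDM}(x) \;=\; \sum_{(x_i,\, n_i)} \bigl(\mathrm{CTM}(x_i) \,+\, \log_2 n_i\bigr)
\]
```

Here the x_i are the distinct blocks into which x is decomposed and the n_i are their multiplicities, so each block's algorithmic cost is paid once and its repetitions are charged only statistically.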
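
As an illustration only, here is a minimal Python sketch of the hybrid scheme. It assumes a hypothetical precomputed lookup table `ctm` of Coding Theorem Method estimates for short blocks; the numbers are made up for the example, whereas real tables are computed from the output frequency distributions of large sets of small Turing machines.

```python
# A minimal sketch of the hybrid "divide and conquer" idea, assuming a
# hypothetical lookup table `ctm` of Coding Theorem Method estimates for
# short blocks (the numbers below are illustrative, not real CTM values).
from collections import Counter
from math import log2

ctm = {"00": 2.5, "01": 3.1, "10": 3.1, "11": 2.5}

def bdm(s: str, block: int = 2) -> float:
    """Estimate the algorithmic complexity of s by cutting it into blocks
    short enough for the CTM table, then charging each distinct block its
    CTM value plus log2 of the number of times it occurs."""
    blocks = [s[i:i + block] for i in range(0, len(s) - block + 1, block)]
    counts = Counter(blocks)
    # Repeated blocks cost only log2(multiplicity) extra: the statistical,
    # entropy-like part of the measure takes over where the algorithmic
    # part (the CTM table) has already paid for the block once.
    return sum(ctm[b] + log2(n) for b, n in counts.items())

print(bdm("00110011"))  # -> 7.0 with the illustrative table above
```

The split of costs is what makes the hybrid both powerful and scalable: the per-block CTM values keep the measure sensitive to algorithmic structure, while repetitions are handled by the cheap, Shannon-style logarithmic term, which is also where the worst-case convergence to entropy mentioned above comes from.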