Baldrson (Slashdot reader #78,598) writes: AI professor Marcus Hutter has gone big with his challenge, first announced on Slashdot in 2006, to the artificial intelligence [and data compression] community. A 500,000€ purse now backs The Hutter Prize for Lossless Compression of Human Knowledge… Hutter’s prize incrementally rewards the distillation of Wikipedia’s storehouse of human knowledge to its essence.

That essence is a 1-billion-character excerpt of Wikipedia called “enwik9” — approximately the amount that a human can read in a lifetime. And 14 years ago, Baldrson wrote a Slashdot article explaining how this long-running contest has its roots in a theory which could dramatically advance the capabilities of AI:

The basic theory, for which Hutter provides a proof, is that after any set of observations the optimal move by an AI is to find the smallest program that predicts those observations and then assume its environment is controlled by that program. Think of it as Ockham’s Razor on steroids.
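To make the idea concrete, here is a toy sketch in Python. It is purely illustrative, not Hutter's AIXI or full algorithmic information theory: given a sequence of observations, it keeps only the hypotheses that reproduce them and picks the one with the shortest description, then uses that hypothesis to predict the next symbol. The hypothesis space and its bit costs are invented for the example.

    # Toy sketch of "shortest program that predicts the observations".
    # The hypotheses and description lengths below are made up for
    # illustration; real AIT ranges over all possible programs.

    OBSERVATIONS = [0, 1, 0, 1, 0, 1]

    # (name, description length in bits, generator of the first n symbols)
    HYPOTHESES = [
        ("all zeros",      8,                     lambda n: [0] * n),
        ("alternating 01", 12,                    lambda n: [i % 2 for i in range(n)]),
        ("verbatim copy",  8 * len(OBSERVATIONS), lambda n: (OBSERVATIONS + [0])[:n]),
    ]

    def shortest_consistent(obs):
        # Keep hypotheses that reproduce the observations exactly,
        # then apply Ockham's Razor: the smallest description wins.
        fits = [h for h in HYPOTHESES if h[2](len(obs)) == obs]
        return min(fits, key=lambda h: h[1])

    name, bits, generate = shortest_consistent(OBSERVATIONS)
    next_symbol = generate(len(OBSERVATIONS) + 1)[-1]
    print(f"{name} ({bits} bits) predicts next symbol: {next_symbol}")

Here the 12-bit "alternating 01" hypothesis beats the 48-bit verbatim copy of the data, which is the same trade-off a compressor makes: a model that explains the data compactly is preferred over storing the data itself.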

Writing today, Baldrson argues this could become a much more sophisticated Turing Test.

Formally it is called Algorithmic Information Theory, or AIT. AIT is, according to Hutter’s “AIXI” theory, essential to Universal Intelligence. Hutter’s judging criterion is superior to Turing tests in 3 ways: 1) it is objective, 2) it rewards incremental improvements, and 3) it is founded on a mathematical theory of natural science. Detailed rules for the contest and answers to frequently asked questions are available.
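For a sense of how the “objective” and “incremental” parts work in practice, the sketch below assumes, as a simplification of the published rules (which set the exact formula and a minimum-improvement threshold), that the payout is the prize fund scaled by the relative size reduction versus the previous record. The record size in the example is hypothetical.

    PRIZE_FUND_EUR = 500_000  # the purse mentioned above

    def award(previous_record_bytes: int, new_size_bytes: int) -> float:
        """Assumed rule: payout proportional to relative improvement over the record."""
        improvement = (previous_record_bytes - new_size_bytes) / previous_record_bytes
        return max(0.0, PRIZE_FUND_EUR * improvement)

    # Hypothetical numbers: trimming 2% off a 115 MB record would pay 2% of the fund.
    print(award(115_000_000, 112_700_000))  # 10000.0

Because the score is simply the size of the losslessly compressed enwik9, any two submissions can be ranked mechanically, and even a small improvement over the standing record earns a proportional share of the purse.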


Source: Slashdot