Entropy, Information Theory and Pattern Recognition!


Even the title may seem a little dizzying, but that's okay. I will try to explain in detail.
Last night I spent a noteworthy amount of time reading the classic book Pattern Classification, and it was an interesting surprise.
Although it looks like a mathematical textbook, it is a really deep book that covers the whole area of pattern recognition.
It describes how you can make a decision based on the information you get from an experiment.
The main problem here is noise and overlapping data.
How can we avoid noise? There is a pretty interesting result which says that we cannot avoid it entirely: when the class distributions overlap, even the optimal (Bayes) decision rule has a non-zero error rate. We can reduce the noise, but not vanquish it!
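
To make that point a bit more concrete, here is a small toy sketch of my own (not from the book): two classes whose feature values follow overlapping Gaussians with parameters I picked arbitrarily. Even the optimal rule, which always picks the more likely class, still misclassifies roughly 23% of the samples.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

def gaussian_pdf(x, mean, std):
    # Density of a normal distribution, written out with plain numpy.
    return np.exp(-0.5 * ((x - mean) / std) ** 2) / (std * np.sqrt(2 * np.pi))

# Equal priors; class 0 ~ N(0, 1), class 1 ~ N(1.5, 1)  (made-up parameters).
labels = rng.integers(0, 2, n)
x = np.where(labels == 0, rng.normal(0.0, 1.0, n), rng.normal(1.5, 1.0, n))

# Optimal rule for equal priors: pick the class with the larger likelihood.
predictions = (gaussian_pdf(x, 1.5, 1.0) > gaussian_pdf(x, 0.0, 1.0)).astype(int)

error = np.mean(predictions != labels)
print(f"Error rate of the optimal rule: {error:.3f}")  # roughly 0.23, never 0
```

The only way to push that number down is to make the classes overlap less (for example, with better features), not to search for a cleverer decision rule.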
Let's state the problem once again. We have raw data, and based on that we want to make a decision. Pattern recognition alone does not seem to be enough for that.
This happens because of another interesting factor: the entropy of the physical system that produces the data.
I will not try to describe exactly what entropy is and how it works, but I found a really good textbook focusing entirely on Information Theory and Entropy.
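
Since I won't go into the theory here, just a tiny taste: Shannon entropy measures how uncertain a data source is, via H(X) = -Σ p(x) log2 p(x). A quick sketch of my own (standard formula, made-up probabilities):

```python
import numpy as np

def shannon_entropy(probabilities):
    """Shannon entropy in bits: H(X) = -sum(p * log2(p)), skipping zero entries."""
    p = np.asarray(probabilities, dtype=float)
    p = p[p > 0]                       # convention: 0 * log(0) counts as 0
    return float(-np.sum(p * np.log2(p)))

print(shannon_entropy([0.5, 0.5]))   # 1.0 bit   -> a fair coin, maximum uncertainty
print(shannon_entropy([0.9, 0.1]))   # ~0.47 bits -> a biased coin, more predictable
print(shannon_entropy([1.0]))        # 0.0 bits  -> no uncertainty at all
```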
Feel free to share your ideas on the subject!
Yours,
Mike.
