Classically, entropy is a measure of disorder in a system. From a statistical perspective, it is more useful to think of it as a measure of the system's unpredictability. In this episode we discuss how information reduces entropy when deciding whether Yoshi the parrot will like a new chew toy. A few other everyday examples help us examine why entropy is a useful metric for constructing a decision tree.
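The idea from the episode can be sketched in code. The example below is a hypothetical illustration (the toy outcomes and the "color" split are made up, not from the episode): Shannon entropy measures how unpredictable Yoshi's reaction is, and a feature that splits the outcomes cleanly reduces that entropy, which is exactly what a decision tree looks for.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy, in bits, of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

# Hypothetical history of Yoshi's reactions to chew toys.
# A 50/50 split is maximally unpredictable: 1 bit of entropy.
outcomes = ["likes", "dislikes", "likes", "dislikes"]
before = entropy(outcomes)  # 1.0 bit

# Suppose a hypothetical feature (toy color) separates the outcomes
# perfectly. The weighted entropy after the split drops to zero, so
# knowing the color yields 1 full bit of information gain.
red = ["likes", "likes"]
blue = ["dislikes", "dislikes"]
after = (len(red) * entropy(red) + len(blue) * entropy(blue)) / len(outcomes)
info_gain = before - after
print(info_gain)  # 1.0
```

A decision tree built greedily on information gain would pick this split first, since no other feature could reduce the unpredictability by more.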