In brief, what is information theory?

Information theory is a scientific field that studies the encoding, compression, and transmission of digital information.

The field of information theory was founded by Claude Shannon with his seminal 1948 paper “A Mathematical Theory of Communication”. Since then, many ideas from information theory have permeated other technical fields, including computer science, probability, and machine learning.
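As a small taste of what is to come, one central quantity from Shannon's paper is the entropy of a discrete distribution, which measures how many bits of information an outcome carries on average. The snippet below is a minimal illustrative sketch (the function name and inputs are our own choices, not from this course's material):

```python
import math

def entropy(probs):
    """Shannon entropy (in bits) of a discrete probability distribution.

    Terms with zero probability are skipped, following the convention
    0 * log2(0) = 0.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally unpredictable: 1 bit per flip.
print(entropy([0.5, 0.5]))  # 1.0

# A biased coin is more predictable, so it carries less information.
print(entropy([0.9, 0.1]))  # roughly 0.47
```

Intuitively, the more predictable a source is, the fewer bits are needed on average to describe its outcomes; this intuition will be made precise in the next lesson.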

In the next lesson, 8.2 Fundamentals of Information Theory, we will learn a few key concepts from information theory that provide a new understanding of machine learning. Some of these make intuitive ideas about probability more precise, while others might seem more surprising. Either way, please bear with us for a bit. The concepts and perspectives that you learn here will be useful beyond this course, and will reappear in advanced machine-learning courses, research papers, and practical methods that you are likely to encounter in the future.




© 2022 — DD1420 Authors