TELEVISION

Science of Information: From Language to Black Holes

Series: Great Courses
4.7 (38)
Episodes: 24
Rating: TVPG
Year: 2016
Language: English

About

The science of information is the most influential, yet perhaps least appreciated field in science today. Never before have we been able to acquire, record, communicate, and use information in so many different forms. This revolution goes far beyond the limitless content that fills our lives, because information also underlies our understanding of ourselves, the natural world, and the universe.

Episodes

1 to 6 of 24

1. The Transformability of Information

30m

What is information? Explore the surprising answer of American mathematician Claude Shannon, who concluded that information is the ability to distinguish reliably among possible alternatives. Consider why this idea was so revolutionary, and see how it led to the concept of the bit: the basic unit of information.
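Shannon's idea of distinguishing among alternatives can be made concrete in a few lines of Python. This sketch (an illustration, not part of the course materials) computes how many bits are needed to single out one of n equally likely alternatives:

```python
import math

def bits_needed(n_alternatives: int) -> int:
    """Smallest number of yes/no answers that can reliably
    distinguish one of n equally likely alternatives."""
    return math.ceil(math.log2(n_alternatives))

print(bits_needed(2))   # one bit distinguishes two alternatives
print(bits_needed(26))  # five bits suffice for the 26 letters of the alphabet
```

Each additional bit doubles the number of alternatives that can be told apart, which is why the logarithm appears.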

2. Computation and Logic Gates

30m

Accompany the young Claude Shannon to the Massachusetts Institute of Technology, where in 1937 he submitted a master's thesis proving that Boolean algebra could be used to simplify the unwieldy analog computing devices of the day. Drawing on Shannon's ideas, learn how to design a simple electronic circuit that performs basic mathematical calculations.
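The kind of circuit the episode describes can be sketched with Boolean operations in Python. This hypothetical example is a half adder, the classic building block that adds two one-bit numbers using an XOR gate and an AND gate:

```python
def half_adder(a: int, b: int):
    """Add two one-bit numbers using Boolean gates."""
    s = a ^ b       # XOR gate produces the sum bit
    carry = a & b   # AND gate produces the carry bit
    return s, carry

# Truth table for all four input combinations
for a in (0, 1):
    for b in (0, 1):
        print(a, b, half_adder(a, b))
```

Chaining half adders (with an OR gate to combine carries) yields circuits that add numbers of any width, which is the sense in which Boolean algebra simplifies computing machinery.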

3. Measuring Information

30m

How is information measured and how is it encoded most efficiently? Get acquainted with a subtle but powerful quantity that is vital to the science of information: entropy. Measuring information in terms of entropy sheds light on everything from password security to efficient binary codes to how to design a good guessing game.

4. Entropy and the Average Surprise

30m

Intuition says we measure information by looking at the length of a message. But Shannon's information theory starts with something more fundamental: how surprising is the message? Through illuminating examples, discover that entropy provides a measure of the average surprise.
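The "average surprise" idea translates directly into Shannon's entropy formula. A minimal sketch (for illustration; the symbols and probabilities here are assumptions, not examples from the course):

```python
import math

def entropy(probs):
    """Shannon entropy in bits: the average surprise, where the
    surprise of an outcome with probability p is log2(1/p)."""
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))  # fair coin: 1 bit per toss
print(entropy([0.9, 0.1]))  # biased coin: less surprising on average, under 1 bit
print(entropy([1.0]))       # a certain outcome carries no surprise: 0 bits
```

Rare outcomes are highly surprising but rarely occur; entropy weights each outcome's surprise by how often it happens.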

5. Data Compression and Prefix-Free Codes

30m

Probe the link between entropy and coding. In the process, encounter Shannon's first fundamental theorem, which specifies how far information can be squeezed in a binary code, serving as the basis for data compression. See how this works with a text such as Conan Doyle's The Return of Sherlock Holmes.
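A prefix-free code of the kind the episode describes can be built with Huffman's algorithm, one standard way of approaching Shannon's source-coding limit. This sketch (an illustration; the sample text is an assumption) assigns shorter codewords to more frequent symbols:

```python
import heapq
from collections import Counter

def huffman_code(text: str) -> dict:
    """Build a prefix-free binary code from symbol frequencies."""
    # Each heap entry: [total frequency, tiebreaker, {symbol: codeword}]
    heap = [[freq, i, {sym: ""}]
            for i, (sym, freq) in enumerate(Counter(text).items())]
    heapq.heapify(heap)
    i = len(heap)
    while len(heap) > 1:
        lo = heapq.heappop(heap)  # two least frequent subtrees...
        hi = heapq.heappop(heap)
        # ...are merged: prepend 0 to one branch, 1 to the other
        merged = {s: "0" + c for s, c in lo[2].items()}
        merged.update({s: "1" + c for s, c in hi[2].items()})
        heapq.heappush(heap, [lo[0] + hi[0], i, merged])
        i += 1
    return heap[0][2]

code = huffman_code("abracadabra")
print(code)  # the frequent symbol 'a' receives the shortest codeword
```

Because no codeword is a prefix of another, the encoded bit stream can be decoded unambiguously without separators between codewords.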

6. Encoding Images and Sounds

30m

Learn how some data can be compressed beyond the lossless limit set by the entropy of the source, by discarding information that is barely perceptible. Typically used for images, music, and video, these lossy techniques drastically reduce the size of a file without significant loss of quality. See how this works in the MP3, JPEG, and MPEG formats.

Extended Details

  • Closed Captions: English