Benjamin Schumacher

25 Videos
The Transformability of Information
Episode 1 of The Science of Information
What is information? Explore the surprising answer of American mathematician Claude Shannon, who concluded that information is the ability to distinguish reliably among possible alternatives. Consider why this idea was so revolutionary, and see how it led to the concept…
Computation and Logic Gates
Episode 2 of The Science of Information
Accompany the young Claude Shannon to the Massachusetts Institute of Technology, where in 1937 he submitted a master's thesis proving that Boolean algebra could be used to simplify the unwieldy analog computing devices of the day. Drawing on Shannon's ideas,…
Measuring Information
Episode 3 of The Science of Information
How is information measured and how is it encoded most efficiently? Get acquainted with a subtle but powerful quantity that is vital to the science of information: entropy. Measuring information in terms of entropy sheds light on everything from password…
Entropy and the Average Surprise
Episode 4 of The Science of Information
Intuition says we measure information by looking at the length of a message. But Shannon's information theory starts with something more fundamental: how surprising is the message? Through illuminating examples, discover that entropy provides a measure of the average surprise.
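The "average surprise" idea from this episode can be sketched in a few lines of Python (an illustration, not material from the course): entropy is the surprise of each outcome, -log2(p), weighted by how often it occurs.

```python
import math

def entropy(probs):
    """Shannon entropy in bits: the expected surprise -log2(p),
    averaged over all possible outcomes."""
    return sum(-p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally surprising on average: 1 bit per flip.
print(entropy([0.5, 0.5]))   # 1.0
# A heavily biased coin is rarely surprising: well under 1 bit.
print(entropy([0.9, 0.1]))   # about 0.469
```

The biased coin usually lands the way you expect, so on average each flip tells you less than one bit.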
Cryptanalysis and Unraveling the Enigma
Episode 11 of The Science of Information
Unravel the analysis that broke the super-secure Enigma code system used by the Germans during World War II. Led by British mathematician Alan Turing, the code breakers had to repeat their feat every day throughout the war. Also examine Claude…
Error-Correcting Codes
Episode 8 of The Science of Information
Dig into different techniques for error correction. Start with a game called word golf, which demonstrates the perils of mistaking one letter for another and how to guard against it. Then graduate to approaches used for correcting errors in computer…
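The simplest error-correcting scheme of the kind this episode surveys is the triple-repetition code, sketched here in Python (illustrative; the function names are my own):

```python
def encode(bits, n=3):
    """Triple-repetition code: transmit each bit n times."""
    return [b for bit in bits for b in [bit] * n]

def decode(received, n=3):
    """Majority vote over each block of n repeats; this corrects
    any single flipped bit per block."""
    return [int(sum(received[i:i + n]) > n // 2)
            for i in range(0, len(received), n)]

msg = [1, 0, 1, 1]
sent = encode(msg)
sent[4] ^= 1                 # noise flips one channel bit
assert decode(sent) == msg   # the message still comes through
```

The price of this protection is a threefold increase in message length; the more sophisticated codes used in computers achieve the same guarantee far more efficiently.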
What Genetic Information Can Do
Episode 13 of The Science of Information
Learn how DNA and RNA serve as the digital medium for genetic information. Also see how shared features of different life forms allow us to trace our origins back to an organism known as LUCA--the last universal common ancestor--which lived…
Encoding Images and Sounds
Episode 6 of The Science of Information
Learn how some data can be compressed beyond the minimum amount of information required by the entropy of the source. Typically used for images, music, and video, these techniques drastically reduce the size of a file without significant loss of…
Signals and Bandwidth
Episode 9 of The Science of Information
Twelve billion miles from Earth, the Voyager spacecraft is sending back data with just a 20-watt transmitter. Make sense of this amazing feat by delving into the details of the Nyquist-Shannon sampling theorem, signal-to-noise ratio, and bandwidth--concepts that apply to…
Data Compression and Prefix-Free Codes
Episode 5 of The Science of Information
Probe the link between entropy and coding. In the process, encounter Shannon's first fundamental theorem, which specifies how far information can be squeezed in a binary code, serving as the basis for data compression. See how this works with a…
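A standard way to build the prefix-free codes this episode describes is Huffman's algorithm, sketched below in Python (an illustration under my own naming, not the course's code): repeatedly merge the two least probable subtrees, prepending a 0 or 1 at each merge.

```python
import heapq

def huffman_code(freqs):
    """Build a prefix-free binary code from symbol probabilities by
    repeatedly merging the two least probable subtrees."""
    heap = [(w, i, {s: ""}) for i, (s, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    tie = len(heap)  # tie-breaker so dicts are never compared
    while len(heap) > 1:
        w1, _, c1 = heapq.heappop(heap)
        w2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + code for s, code in c1.items()}
        merged.update({s: "1" + code for s, code in c2.items()})
        heapq.heappush(heap, (w1 + w2, tie, merged))
        tie += 1
    return heap[0][2]

code = huffman_code({"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125})
# No codeword is a prefix of another, so a bit stream decodes unambiguously.
```

For this source the average codeword length works out to 1.75 bits per symbol, exactly the entropy of the distribution, which is the limit set by Shannon's first fundamental theorem.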
Qubits and Quantum Information
Episode 21 of The Science of Information
Enter the quantum realm to see how this revolutionary branch of physics is transforming the science of information. Begin with the double-slit experiment, which pinpoints the bizarre behavior that makes quantum information so different. Work your way toward a concept…
Unbreakable Codes and Public Keys
Episode 12 of The Science of Information
The one-time pad may be unbreakable in principle, but consider the common mistakes that make this code system vulnerable. Focus on the Venona project that deciphered Soviet intelligence messages encrypted with one-time pads. Close with the mathematics behind public key…
It from Bit: Physics from Information
Episode 23 of The Science of Information
Physicist John A. Wheeler's phrase "It from bit" makes a profound point about the connection between reality and information. Follow this idea into a black hole to investigate the status of information in a place of unlimited density. Also explore…
Turing Machines and Algorithmic Information
Episode 19 of The Science of Information
Contrast Shannon's code- and communication-based approach to information with a new, algorithmic way of thinking about the problem in terms of descriptions and computations. See how this idea relates to Alan Turing's theoretical universal computing machine, which underlies the operation…
Life’s Origins and DNA Computing
Episode 14 of The Science of Information
DNA, RNA, and the protein molecules they assemble are so interdependent that it's hard to picture how life got started in the first place. Survey a selection of intriguing theories, including the view that genetic information in living cells results…
The Meaning of Information
Episode 24 of The Science of Information
Survey the phenomenon of information from pre-history to the projected far future, focusing on the special problem of anti-cryptography--designing an understandable message for future humans or alien civilizations. Close by revisiting Shannon's original definition of information and ask, "What does…
Quantum Cryptography via Entanglement
Episode 22 of The Science of Information
Learn how a feature of the quantum world called entanglement is the key to an unbreakable code. Review the counterintuitive rules of entanglement. Then play a game based on The Newlywed Game that illustrates the monogamy of entanglement. This is…
Neural Codes in the Brain
Episode 15 of The Science of Information
Study the workings of our innermost information system: the brain. Take both top-down and bottom-up approaches, focusing on the world of perception, experience, and external behavior on the one hand versus the intricacies of neuron activity on the other. Then…
Horse Races and Stock Markets
Episode 18 of The Science of Information
One of Claude Shannon's colleagues at Bell Labs was the brilliant scientist and brash Texan John Kelly. Explore Kelly's insight that information is the advantage we have in betting on possible alternatives. Apply his celebrated log-optimal strategy to horse racing…
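Kelly's log-optimal strategy has a compact closed form, sketched here in Python (an illustration, not the course's material): for a bet paying b-to-1 that you win with probability p, wager the fraction of your bankroll that maximizes the expected logarithm of wealth.

```python
def kelly_fraction(p, b):
    """Kelly's log-optimal bet: the fraction of bankroll to wager on a
    b-to-1 payoff won with probability p.  Returns 0 when there is
    no edge (betting would lose money in the long run)."""
    q = 1 - p                       # probability of losing
    return max(0.0, (b * p - q) / b)

# Even-money bet (b = 1) that you win 60% of the time:
print(kelly_fraction(0.6, 1))   # 0.2 -> bet 20% of the bankroll
```

Betting more than the Kelly fraction grows wealth more slowly despite the larger stakes, and betting a fair game (p = 0.5, b = 1) yields a fraction of zero: no informational advantage, no bet.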
Noise and Channel Capacity
Episode 7 of The Science of Information
One of the key issues in information theory is noise: the message received may not convey everything about the message sent. Discover Shannon's second fundamental theorem, which proves that error correction is possible and can be built into a message…
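For the simplest noisy channel, the binary symmetric channel that flips each bit with probability p, the capacity promised by Shannon's second theorem has a closed form, C = 1 - H(p). A quick Python sketch (illustrative; not from the course):

```python
import math

def h2(p):
    """Binary entropy in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel flipping each bit
    with probability p: C = 1 - H(p) bits per channel use."""
    return 1 - h2(p)

print(bsc_capacity(0.0))   # 1.0 -> a noiseless channel carries a full bit
print(bsc_capacity(0.5))   # 0.0 -> pure noise carries nothing
```

Between these extremes, error correction can push the effective data rate arbitrarily close to C, but no coding scheme can exceed it.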