The Science of Information: From Language to Black Holes

Course No. 1301
Professor Benjamin Schumacher, Ph.D.
Kenyon College
4.7 out of 5
56 Reviews
94% of reviewers would recommend this product
Video Streaming Included Free

What Will You Learn?

  • Explore what information is, how it is measured, and how it led to the concept of the bit - the basic unit of information.
  • Learn how to design a simple electronic circuit that performs basic mathematical calculations.
  • Investigate the history of cryptography starting with the simple cipher used by Julius Caesar.
  • Learn how a feature of the quantum world called entanglement is the key to an unbreakable code.
  • Unravel the super-secure Enigma code system used by the Germans during World War II.
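The circuit-design bullet above points to Shannon's insight that Boolean algebra describes switching circuits. As a minimal sketch (the Python rendering and function name are illustrative, not from the course), a half adder built from two logic gates adds two one-bit numbers:

```python
# A half adder built from Boolean logic gates: XOR produces the sum bit,
# AND produces the carry bit. This is the kind of simple arithmetic
# circuit the course shows how to design.
def half_adder(a, b):
    return a ^ b, a & b   # (sum, carry)

for a in (0, 1):
    for b in (0, 1):
        s, c = half_adder(a, b)
        print(f"{a} + {b} = carry {c}, sum {s}")
```

Chaining such adders (a full adder also handles a carry-in bit) yields circuits that add numbers of any width.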

Course Overview

The science of information is the most influential, yet perhaps least appreciated field in science today. Never before in history have we been able to acquire, record, communicate, and use information in so many different forms. Never before have we had access to such vast quantities of data of every kind. This revolution goes far beyond the limitless content that fills our lives, because information also underlies our understanding of ourselves, the natural world, and the universe. It is the key that unites fields as different as linguistics, cryptography, neuroscience, genetics, economics, and quantum mechanics. And the fact that information bears no necessary connection to meaning makes it a profound puzzle that people with a passion for philosophy have pondered for centuries.

Little wonder that an entirely new science has arisen that is devoted to deepening our understanding of information and our ability to use it. Called information theory, this field has been responsible for path-breaking insights such as the following:

  • What is information? In 1948, mathematician Claude Shannon boldly captured the essence of information with a definition that doesn’t invoke abstract concepts such as meaning or knowledge. In Shannon’s revolutionary view, information is simply the ability to distinguish reliably among possible alternatives.

  • The bit: Atomic theory has the atom. Information theory has the bit: the basic unit of information. Proposed by Shannon’s colleague at Bell Labs, John Tukey, bit stands for “binary digit”—0 or 1 in binary notation, which can be implemented with a simple on/off switch. Everything from books to black holes can be measured in bits.

  • Redundancy: Redundancy in information may seem like mere inefficiency, but it is a crucial feature of information of all types, including languages and DNA, since it provides built-in error correction for mistakes and noise. Redundancy is also the key to breaking secret codes.

Building on these and other fundamental principles, information theory spawned the digital revolution of today, just as the discoveries of Galileo and Newton laid the foundation for the scientific revolution four centuries ago. Technologies for computing, telecommunication, and encryption are now common, and it’s easy to forget that these powerful technologies and techniques had their own Galileos and Newtons.

The Science of Information: From Language to Black Holes covers the exciting concepts, history, and applications of information theory in 24 challenging and eye-opening half-hour lectures taught by Professor Benjamin Schumacher of Kenyon College. A prominent physicist and award-winning educator at one of the nation’s top liberal arts colleges, Professor Schumacher is also a pioneer in the field of quantum information, which is the latest exciting development in this dynamic scientific field.

Professor Schumacher introduces the essential mathematical ideas that govern the subject—concepts that can be understood by anyone with a background in high school math. But it is not necessary to follow the equations to appreciate the remarkable story that Dr. Schumacher tells.

A New View of Reality

Clearly, information has been around a long time. In human terms, language, writing, art, music, and mathematics are perfect examples; so are Morse code, Mendelian genetics, and radio signals—all originating before 1900. But a series of conceptual breakthroughs in the 20th century united what seemed like unrelated phenomena and led to a dramatic new way of looking at reality. The Science of Information takes you on this stimulating intellectual journey, in which some of the key figures include:

  • Claude Shannon: Shannon plays a key role throughout the course as the dominant figure in the early decades of information theory, making major contributions in computer science, cryptography, genetics, and other areas. His crucial 1948 paper was the “shot heard ’round the world” for the information revolution.

  • Alan Turing: The genius behind the decryption of the Nazi Enigma code during World War II, Turing invented the principle of the modern digital computer, and he demonstrated an inherent limitation of all computers by proving that the notorious “halting problem” is fundamentally unsolvable.

  • John A. Wheeler: One of the greatest physicists of the 20th century, Wheeler had a passion for the most fundamental questions of science, which led him to conceive the famous slogan, “It from bit,” meaning that all of physical reality emerges from information. He was also Professor Schumacher’s mentor.

In addition, you study the contributions of other pioneers, such as John Kelly, who used information theory to devise an influential strategy for betting and investing; David Huffman, who blazed the trail in data compression, now used in formats such as JPEG and MP3; and Gregory Chaitin, who pursued computer algorithms for information theory, hypothesizing a celebrated yet uncomputable number called Omega. You also explore the pivotal contributions of pre-20th-century thinkers including Charles Babbage, Ada Lovelace, Samuel F. B. Morse, and Joseph Fourier.

The Laws of Information at Work

With lucid explanations and imaginative graphics, Professor Schumacher shows you the world through an extraordinary set of lenses. “If we wear our information-colored glasses,” he says, “we will see the laws of information at work all around us, in a hundred different ways.” The course illustrates this with examples such as:

  • Money: Today most money exists as electronic account data. But even in ancient times, money was a record-keeping device—in other words, information. Precious metal coins had a cryptographic function: to make it hard to counterfeit messages of economic agreement and obligation.

  • Privacy: The search for guaranteed privacy has only one refuge—the quantum realm. Professor Schumacher explains how the only perfectly secure communications take place between pairs of entangled quantum particles called qubits (a term he coined). Such systems are now in use.

  • Games: The parlor game 20 Questions obviously involves the exchange of information. But why is the number of questions 20? Why not 10 or 30? The answer has to do with the connection between entropy and information—in this case, the total number of possible solutions to the game.
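The 20 Questions arithmetic is worth spelling out. As a back-of-the-envelope sketch (illustrative, not taken from the lectures): each well-chosen yes/no question can at best halve the remaining possibilities, so q questions distinguish at most 2**q alternatives:

```python
import math

# Each yes/no answer carries at most 1 bit, so q questions can
# distinguish at most 2**q equally likely possibilities.
questions = 20
alternatives = 2 ** questions
print(alternatives)  # 1048576 - about a million things

# Conversely, picking one item out of a million needs about 20 questions.
print(math.ceil(math.log2(1_000_000)))  # 20
```

Twenty questions suffices because roughly a million candidates covers most things a player might think of; ten questions would cap the game at about a thousand possibilities, while thirty would be overkill.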

Dr. Schumacher also shows you how information theory can provide answers to profound scientific questions. What is the information content of the genome? The human brain? A black hole? The universe? Time and again, the concepts and laws of information reveal breathtaking insights into the workings of nature, even as they lay the foundation of astounding new technologies.

One final example: 12 billion miles from Earth, a spacecraft built with 1970s technology is racing through interstellar space, never to return. From that distance, the sun is a very bright star and Earth is a pale blue dot. Voyager 1’s radio transmitter is about as strong as a cell phone tower on Earth, which typically can’t reach phones more than a few miles away. Yet we continue, to this day, to receive data from Voyager. How is that possible? The Science of Information explains this amazing feat, along with so much more.

24 lectures | Average 31 minutes each
  • 1
    The Transformability of Information
    What is information? Explore the surprising answer of American mathematician Claude Shannon, who concluded that information is the ability to distinguish reliably among possible alternatives. Consider why this idea was so revolutionary, and see how it led to the concept of the bit - the basic unit of information.
  • 2
    Computation and Logic Gates
    Accompany the young Claude Shannon to the Massachusetts Institute of Technology, where in 1937 he submitted a master's thesis proving that Boolean algebra could be used to simplify the unwieldy analog computing devices of the day. Drawing on Shannon's ideas, learn how to design a simple electronic circuit that performs basic mathematical calculations.
  • 3
    Measuring Information
    How is information measured and how is it encoded most efficiently? Get acquainted with a subtle but powerful quantity that is vital to the science of information: entropy. Measuring information in terms of entropy sheds light on everything from password security to efficient binary codes to how to design a good guessing game.
  • 4
    Entropy and the Average Surprise
    Intuition says we measure information by looking at the length of a message. But Shannon's information theory starts with something more fundamental: how surprising is the message? Through illuminating examples, discover that entropy provides a measure of the average surprise.
  • 5
    Data Compression and Prefix-Free Codes
    Probe the link between entropy and coding. In the process, encounter Shannon's first fundamental theorem, which specifies how far information can be squeezed in a binary code, serving as the basis for data compression. See how this works with a text such as Conan Doyle's The Return of Sherlock Holmes.
  • 6
    Encoding Images and Sounds
    Learn how some data can be compressed beyond the minimum amount of information required by the entropy of the source. Typically used for images, music, and video, these techniques drastically reduce the size of a file without significant loss of quality. See how this works in the MP3, JPEG, and MPEG formats.
  • 7
    Noise and Channel Capacity
    One of the key issues in information theory is noise: the message received may not convey everything about the message sent. Discover Shannon's second fundamental theorem, which proves that error correction is possible and can be built into a message with only a modest slowdown in transmission rate.
  • 8
    Error-Correcting Codes
    Dig into different techniques for error correction. Start with a game called word golf, which demonstrates the perils of mistaking one letter for another and how to guard against it. Then graduate to approaches used for correcting errors in computer operating systems, CDs, and data transmissions from the Voyager spacecraft.
  • 9
    Signals and Bandwidth
    Twelve billion miles from Earth, the Voyager spacecraft is sending back data with just a 20-watt transmitter. Make sense of this amazing feat by delving into the details of the Nyquist-Shannon sampling theorem, signal-to-noise ratio, and bandwidth - concepts that apply to many types of communication.
  • 10
    Cryptography and Key Entropy
    The science of information is also the science of secrets. Investigate the history of cryptography starting with the simple cipher used by Julius Caesar. See how entropy is a useful measure of the security of an encryption key, and follow the deciphering strategies that cracked early codes.
  • 11
    Cryptanalysis and Unraveling the Enigma
    Unravel the analysis that broke the super-secure Enigma code system used by the Germans during World War II. Led by British mathematician Alan Turing, the code breakers had to repeat their feat every day throughout the war. Also examine Claude Shannon's revolutionary views on the nature of secrecy.
  • 12
    Unbreakable Codes and Public Keys
    The one-time pad may be in principle unbreakable, but consider the common mistakes that make this code system vulnerable. Focus on the Venona project that deciphered Soviet intelligence messages encrypted with one-time pads. Close with the mathematics behind public key cryptography, which makes modern transactions secure - for now.
  • 13
    What Genetic Information Can Do
    Learn how DNA and RNA serve as the digital medium for genetic information. Also see how shared features of different life forms allow us to trace our origins back to an organism known as LUCA - the last universal common ancestor - which lived 3.5 to 4 billion years ago.
  • 14
    Life's Origins and DNA Computing
    DNA, RNA, and the protein molecules they assemble are so interdependent that it's hard to picture how life got started in the first place. Survey a selection of intriguing theories, including the view that genetic information in living cells results from eons of natural computation.
  • 15
    Neural Codes in the Brain
    Study the workings of our innermost information system: the brain. Take both top-down and bottom-up approaches, focusing on the world of perception, experience, and external behavior on the one hand versus the intricacies of neuron activity on the other. Then estimate the total information capacity of the brain.
  • 16
    Entropy and Microstate Information
    Return to the concept of entropy, tracing its origin to thermodynamics, the branch of science dealing with heat. Discover that here the laws of nature and information meet. Understand the influential second law of thermodynamics, and conduct a famous thought experiment called Maxwell's demon.
  • 17
    Erasure Cost and Reversible Computing
    Maxwell's demon has startling implications for the push toward ever-faster computers. Probe the connection between the second law of thermodynamics and the erasure of information, which turns out to be a practical barrier to computer processing speed. Learn how computer scientists deal with the demon.
  • 18
    Horse Races and Stock Markets
    One of Claude Shannon's colleagues at Bell Labs was the brilliant scientist and brash Texan John Kelly. Explore Kelly's insight that information is the advantage we have in betting on possible alternatives. Apply his celebrated log-optimal strategy to horse racing and stock trading.
  • 19
    Turing Machines and Algorithmic Information
    Contrast Shannon's code- and communication-based approach to information with a new, algorithmic way of thinking about the problem in terms of descriptions and computations. See how this idea relates to Alan Turing's theoretical universal computing machine, which underlies the operation of all digital computers.
  • 20
    Uncomputable Functions and Incompleteness
    Algorithmic information is plagued by a strange impossibility that shakes the very foundations of logic and mathematics. Investigate this drama in four acts, starting with a famous conundrum called the Berry Paradox and including Turing's surprising proof that no single computer program can determine whether other programs will ever halt.
  • 21
    Qubits and Quantum Information
    Enter the quantum realm to see how this revolutionary branch of physics is transforming the science of information. Begin with the double-slit experiment, which pinpoints the bizarre behavior that makes quantum information so different. Work your way toward a concept that seems positively magical: the quantum computer.
  • 22
    Quantum Cryptography via Entanglement
    Learn how a feature of the quantum world called entanglement is the key to an unbreakable code. Review the counterintuitive rules of entanglement. Then play a game based on The Newlywed Game that illustrates the monogamy of entanglement. This is the principle underlying quantum cryptography.
  • 23
    It from Bit: Physics from Information
    Physicist John A. Wheeler's phrase "It from bit" makes a profound point about the connection between reality and information. Follow this idea into a black hole to investigate the status of information in a place of unlimited density. Also explore the information content of the entire universe!
  • 24
    The Meaning of Information
    Survey the phenomenon of information from pre-history to the projected far future, focusing on the special problem of anti-cryptography - designing an understandable message for future humans or alien civilizations. Close by revisiting Shannon's original definition of information and ask: What does the theory of information leave out?
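Lecture 10's starting point, the Caesar cipher, is simple enough to sketch in full. The following is an illustrative rendering (not course material): each letter shifts a fixed number of places through the alphabet, with the 3-letter shift traditionally attributed to Julius Caesar. Since there are only 25 possible nonzero shifts, the key carries merely log2(25) ≈ 4.6 bits of entropy, which is why the cipher falls to simple trial or frequency analysis:

```python
# Caesar cipher: shift each letter a fixed number of places through
# the alphabet, leaving non-letters untouched. Decrypt with the
# negative shift.
def caesar(text, shift):
    out = []
    for ch in text:
        if ch.isalpha():
            base = ord('A') if ch.isupper() else ord('a')
            out.append(chr(base + (ord(ch) - base + shift) % 26))
        else:
            out.append(ch)
    return ''.join(out)

secret = caesar("ATTACK AT DAWN", 3)
print(secret)              # DWWDFN DW GDZQ
print(caesar(secret, -3))  # ATTACK AT DAWN
```

Trying all 25 shifts by brute force takes moments, illustrating the lecture's point that key entropy, not cleverness of mechanism, measures a cipher's security.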


What's Included

What Does Each Format Include?

Instant Video Includes:
  • Download 24 video lectures to your computer or mobile app
  • Downloadable PDF of the course guidebook
  • FREE video streaming of the course from our website and mobile apps
DVD Includes:
  • 24 lectures on 4 DVDs
  • 354-page printed course guidebook
  • Downloadable PDF of the course guidebook
  • FREE video streaming of the course from our website and mobile apps
  • Closed captioning available

What Does The Course Guidebook Include?

Course Guidebook Details:
  • 354-page printed course guidebook
  • Key Equations in the Science of Information
  • Suggested Reading
  • Questions to Consider

Enjoy This Course On-the-Go with Our Mobile Apps!*

  • App Store: iPhone + iPad
  • Google Play: Android devices
  • Kindle Fire: Kindle Fire tablet + Fire Phone
*Courses can be streamed from anywhere you have an internet connection. Standard carrier data rates may apply in areas without Wi-Fi, pursuant to your carrier contract.

Your professor

Benjamin Schumacher

About Your Professor

Benjamin Schumacher, Ph.D.
Kenyon College
Dr. Benjamin Schumacher is Professor of Physics at Kenyon College, where he has taught for 20 years. He received his Ph.D. in Theoretical Physics from The University of Texas at Austin in 1990. Professor Schumacher is the author of numerous scientific papers and two books, including Physics in Spacetime: An Introduction to Special Relativity. As one of the founders of quantum information theory, he introduced the term qubit,...
Learn More About This Professor
Also By This Professor


The Science of Information: From Language to Black Holes is rated 4.7 out of 5 by 56.
Rated 5 out of 5: Don't be intimidated by the math! I listened to this course on audio during my daily walks, and I think that helped me absorb what I could without getting distressed that the math was often way beyond my understanding. (Because I was not looking at incomprehensible equations.) Reliably, the professor would also state the point he was trying to make in non-math terms, and that part I usually understood. Even without grasping the math, I think I took in most of his main points. Among the points in this course that I found interesting and surprisingly relevant to some of my interests:
  • language as an information system
  • generalizations about the frequency of words in a language
  • challenges and solutions for data compression
  • error-correcting strategies
  • parallels between errors induced by manuscript copying in ancient times and DNA transmission
  • types of codes, including simple locks and keys
  • breakable vs. unbreakable codes and how the Enigma machine code was broken during WWII
  • the challenge of communicating with future humans and extra-terrestrials
The professor modulates his voice enough that I could listen to him without feeling tired, even when I didn't quite follow what he was saying. From time to time he introduced down-home explanations that really did make things clearer - for example, comparing a certain challenge of cryptography to the old TV show "The Newlywed Game."
Date published: 2020-09-14
Rated 5 out of 5: Claude Shannon's Work! Currently about 1/3 of the way, but this is outstanding. The professor based the course on the work of Claude Shannon, and it's the best way to learn the science of information!
Date published: 2020-05-11
Rated 5 out of 5: Great information. I'm just getting started with this one. We are watching another course right now. The 'preview' looks fantastic. It's a bit of a 'challenge' since I learned some of this from a different point of view so to speak when I was in school many years ago. I can't wait to get back into it.
Date published: 2020-02-09
Rated 5 out of 5: This man can teach. Very impressed by the elegance and clarity of these lectures, and his personable delivery. Obviously he took great care and attention to distill and to work on phrasings and transitions that enabled the viewer to pull things together. I had not appreciated the maths involved and don't fully understand them, but to learn that they are there and drive the science of information is itself a great learning.
Date published: 2019-12-29
Rated 5 out of 5: Good recap and update - very useful to me. FYI, I took thermodynamics, communications theory and statistics in college many years ago. I needed a refresh and an update on how these fields have changed with new advances and fresher perspective. This course gave me knowledge to help understand how the terms information and quantum computing are applied today.
Date published: 2019-09-02
Rated 4 out of 5: Great for Math Geeks. I recommended this course but let me state a major caveat. This is a great course for math geeks. For people like me who have little or no math ability it was mostly a waste of time and money. Viewing this course was a lot like looking at the major equations of physics. I understood that they are brilliant, even artistic works of genius, but they meant absolutely nothing until roughly translated into English. The math eluded me even more than the Roadrunner eluded the Coyote and like the Coyote I found myself often standing in empty air. Prof. Schumacher is no doubt brilliant and to those that speak his language I have no doubt he will impart much knowledge. There are some mathematics that can be translated into English; Information Theory does not appear to me to be one of them. So for those that speak math, this is a great course and so I recommended it.
Date published: 2019-07-29
Rated 5 out of 5: Foundational. Another wonderful course from Professor Schumacher. My only wish is that each lecture could be expanded into a course of its own, and that I had a year (or 10, or 20) to pursue all of the questions it raises!
Date published: 2018-12-20
Rated 5 out of 5: I bought this course because I enjoyed Schumacher's course on quantum theory so much; this doesn't disappoint. He is a dynamic and interesting lecturer who is himself working on the cutting edge of his topic. I highly recommend it.
Date published: 2018-12-08
