If you asked a random person on the street to name a famous mathematician of the twentieth century, only a couple of answers would come up with high probability. The highest-probability answer would be Albert Einstein. The next highest would be "the guy Russell Crowe played in A Beautiful Mind". I'm going to guess that a distant third would be "the guy from Numb3rs". Personally, my first two answers would be John von Neumann and Claude Shannon.
Clearly, von Neumann's contributions to mathematics were the farthest reaching. He did work in quantum mechanics, functional analysis, economics and game theory, and computer science, and he participated in the Manhattan Project. However, I believe that Shannon's work had the larger impact on the lives of people after the Cold War.
Claude Shannon was born in 1916 in the northern part of the Lower Peninsula of Michigan. He studied electrical engineering and mathematics at the University of Michigan. At MIT he wrote his master's thesis, "A Symbolic Analysis of Relay and Switching Circuits", in which he showed that Boolean algebra could be used to design and simplify relay circuits. Since you are reading this on a computer, you are using relays and switching circuits.
During World War II, Shannon worked at Bell Labs on cryptography. There he wrote a classified memo in 1945 which would be declassified as the 1949 paper "Communication Theory of Secrecy Systems", which gave one of the first mathematical descriptions of cryptography. Cryptography is used in secure communications, which allows for relatively safe commerce on the internet.
In 1948, Shannon published his most famous paper, "A Mathematical Theory of Communication", in which he laid the foundations of information theory. The information content of a message source is measured by its entropy: the average number of bits needed to encode a symbol. We use data compression, like .zip or .mp3 files, to reduce the number of bits required to encode a computer file toward that minimum.
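Entropy is easy to compute for a message once you know how often each symbol occurs. Here is a minimal sketch in Python (the function name is my own) that measures the entropy of a message's empirical symbol distribution:

```python
import math
from collections import Counter

def entropy(message):
    """Shannon entropy, in bits per symbol, of the message's
    empirical symbol distribution: -sum(p * log2(p))."""
    counts = Counter(message)
    total = len(message)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Four equally likely symbols need 2 bits per symbol on average.
print(entropy("abcd"))  # 2.0

# A source that always emits the same symbol carries no information.
print(entropy("aaaa"))  # 0.0
```

A message with a skewed distribution, like ordinary English text, has entropy well below log2 of its alphabet size, which is exactly the slack that compressors exploit.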
Also in "A Mathematical Theory of Communication", Shannon studied information moving through a channel, which transmits a message from a source to a receiver. The capacity of a channel is measured in bits per second. We worry about the channel capacity of our internet connections when we are downloading large files or streaming video.
In 1949, Shannon published "Communication in the presence of noise", in which he proved a sampling theorem. The theorem states that a band-limited analog signal can be perfectly reconstructed from digital samples, provided it is sampled at more than twice its highest frequency. This process allows a CD to store music digitally, and then play the music back as an analog (sound) signal.
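The reconstruction half of the theorem can be sketched with the Whittaker-Shannon interpolation formula, which rebuilds the signal as a sum of shifted sinc functions. A rough, stdlib-only demonstration (function names and test signal are my own; the interpolation sum is truncated, so the result is approximate):

```python
import math

def sinc(x):
    """Normalized sinc function: sin(pi x) / (pi x), with sinc(0) = 1."""
    return 1.0 if x == 0 else math.sin(math.pi * x) / (math.pi * x)

def reconstruct(samples, T, t):
    """Whittaker-Shannon interpolation: estimate the signal at time t
    from samples taken at times 0, T, 2T, ... (truncated sum)."""
    return sum(s * sinc((t - n * T) / T) for n, s in enumerate(samples))

# A 3 Hz sine sampled at 20 Hz, comfortably above the 6 Hz Nyquist rate.
f, fs = 3.0, 20.0
T = 1.0 / fs
samples = [math.sin(2 * math.pi * f * n * T) for n in range(400)]

# Evaluate the reconstruction at an off-grid time near the middle of the record.
t = 10.1234
approx = reconstruct(samples, T, t)
exact = math.sin(2 * math.pi * f * t)
print(abs(approx - exact))  # small truncation error; shrinks with more samples
```

Sample below twice the highest frequency, though, and distinct signals become indistinguishable (aliasing), which is why the Nyquist rate matters.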
Claude Shannon was an all-around interesting person. He enjoyed juggling, unicycle riding, and chess. He was one of the first people to consider using a computer to play chess. He also built several devices. One of the more famous ones is the "Ultimate Machine", which you can watch in action below.
Hopefully, I've piqued your interest in the work of Claude Shannon. Many topics in information theory are easily accessible to students. The book by John Pierce, linked below, is a readable introduction to information theory. If you can read a book, thank a teacher. If you can hear an audio book, thank Claude Shannon.