In 1948, Claude Shannon, then a young engineer at Bell Telephone Laboratories in Murray Hill, N.J., published a landmark paper titled “A Mathematical Theory of Communication.” In that paper, Shannon gave the once-fuzzy concept of “information” a precise meaning for communications engineers and proposed a way to quantify it: in his theory, the fundamental unit of information is the bit.
Shannon showed that every communications channel has a maximum rate for reliable data transmission, which he called the channel capacity, measured in bits per second. He demonstrated that by using certain coding schemes, you could transmit data up to the channel’s full capacity, virtually free of errors, an astonishing result that surprised the engineers of the time.
“I can’t think of anybody who could ever have guessed that such a theory existed,” says Robert Fano, an emeritus professor of computer science at the Massachusetts Institute of Technology, in Cambridge, and a pioneer in the information theory field. “It’s just an intellectual jump; it’s very profound.”
The channel capacity became an essential benchmark for communications engineers, a measure of what a system can and cannot do, expressed in many cases by the famous formula C = W log2(1 + P/N). In the formula, C is the capacity in bits per second, W is the bandwidth in hertz, P is the transmitter power in watts, and N is the noise power, also in watts.
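To make the formula concrete, here is a minimal Python sketch (the function name is my own) that evaluates the Shannon capacity for a given bandwidth and signal-to-noise ratio:

```python
import math

def channel_capacity(bandwidth_hz: float, signal_w: float, noise_w: float) -> float:
    """Shannon capacity C = W * log2(1 + P/N), in bits per second."""
    return bandwidth_hz * math.log2(1.0 + signal_w / noise_w)

# Example: a 3-kHz channel (roughly a telephone line) with a
# signal-to-noise power ratio of 1000 (30 dB) can carry at most
# about 29.9 kilobits per second, no matter how clever the coding.
print(channel_capacity(3000.0, 1000.0, 1.0))
```

Note that capacity grows only linearly with bandwidth but logarithmically with signal power, which is why raising transmitter power yields diminishing returns.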
From space probes to cellphones and CD players, Shannon’s ideas are invisibly embedded in the digital technologies that make our lives more interesting and comfortable.
A thinker, juggling enthusiast, and exceptional chess player, Shannon was also famous for riding the halls of Bell Labs on a unicycle. He died on 24 February 2001, at age 84, after a long battle with Alzheimer’s disease. –E.G.
송홍엽: This is an excerpt from the March 2004 issue of IEEE Spectrum... -[04/12-22:48]-
박재석: Bell Labs is in New Jersey, so it should be Murray Hill, M.J -> N.J -[10/13-22:07]-