Communication Signal Design Lab.


Prof. Hong-Yeop Song's Miscellaneous Writings

Dr. Shannon

2003.12.07 03:09

Hong-Yeop Song | Views: 26842 | Recommended: 135

He didn’t create the compact disc, the fax machine or the digital wireless phone. But in 1948, Claude Shannon paved the way for all of them with the basic theory underlying digital communications and storage. Shannon called it “information theory.” And while he did most of his pioneering work in the 1940s and 1950s, the real-world impact of his research is even more widespread today than it was fifty years ago.

“The whole idea of digitizing things, that you can save them, store them, download them, upload them, a lot of that comes out of Shannon,” says David Neuhoff of the University of Michigan, Shannon’s alma mater. “He wasn’t the first person to think of digital communications, but he really showed how it could work. Just that whole idea that digitization is useful not just because it sounds good, but that you can save it. You can store it on a CD, or a DVD player, and it’s the same today and tomorrow.”

Adds Robert McEliece of the California Institute of Technology (Caltech): “As historians look back at the 20th and 21st centuries, they’ll say that Shannon created a challenge for engineers in 1948, and within 50 or 60 years they solved this challenge and went on to other aspects of digital technology.”

While information theory was his landmark work, Claude Shannon also contributed to the early development of integrated circuits, computers, cryptography, artificial intelligence and even genomics—making him one of the most influential minds of the 20th century. Yet when he died on February 24, 2001, Shannon was virtually unknown to the public at large.

Elwyn Berlekamp, a UC Berkeley professor who co-wrote several papers with Shannon, says Shannon “was somewhat shy. Not withdrawn. But he was not one to volunteer to be in the limelight.”

“I think the lay public does not fully appreciate how much of an impact information theory has made on many things we take for granted, from storage like CD players, or communication of data, like your modem that dials up,” says Ramesh Rao, UCSD division director of the California Institute for Telecommunications and Information Technology [Cal-(IT)2]. “Just about anything that has to do with communication, storage and compression follows in some way directly from the contribution that Shannon made.”

“His discovery was very much like Einstein’s in the sense that Einstein was thinking about some things that nobody else was, and came up with not just the question but the answer, like E = mc², and that sort of initiated all the research in atomic energy,” says UMich’s Neuhoff. “Well Shannon discovered these formulas about information transmission, how many bits per second you should be able to transfer over various media. Other people weren’t asking the question and he came out with this answer that was so beautiful that it inspired the people who ended up designing your cell phone and the communication links that make up the Internet.”

Shannon, a lifelong lover of the Dixieland music that was popular in his youth, was born in 1916 and grew up in the small town of Gaylord, Michigan. His father was a probate judge, his mother the principal of the local high school.

A distant relative of Thomas Edison and the grandson of an inventor, Shannon studied mathematics and electrical engineering at the University of Michigan in Ann Arbor, after a youth spent tinkering with radio sets and model planes, and even building a telegraph system to a friend's house half a mile away. Graduating in 1936, he went on to graduate school at an institution he would be associated with for most of his life: the Massachusetts Institute of Technology (MIT).

“His master’s thesis was on Boolean algebra, which is kind of an abstract field, but it’s really the basis for the design of computer circuits,” notes Andrew Viterbi, co-founder of San Diego-based Qualcomm Inc.

Shannon wrote his pivotal thesis in 1938 at the age of 22, applying the two-valued binary algebra and symbolic logic originally conceived by the 19th-century English mathematician George Boole to the on and off positions of switching circuits, envisioning them as the basis of a ‘logic machine’.

Explains Caltech’s McEliece: “He created the field of digital logic. That sounds simple to say now, but then, what did digital have to do with logic? Logic is reasoning, it’s a subject in philosophy. He showed that AND, OR and NOT, the connectives from Boolean algebra, could be used to build electronic circuits, which led in no small part to the invention of the computer and the fast calculations that people like to do. And that was sort of an afterthought. People say it was the most influential master’s thesis in history, which is certainly true, but it understates the point, because if he had done nothing else he’d still be famous for inventing digital logic.”
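To make McEliece’s point concrete, here is a minimal sketch in Python (the function names and the half-adder example are illustrative assumptions, not anything from Shannon’s thesis) of how AND, OR and NOT compose into a useful circuit. The sum output of a one-bit half adder is an XOR, built here from those three connectives alone:

    # Boolean connectives as circuit primitives, in the spirit of
    # Shannon's 1938 thesis (an illustrative sketch, not his notation).
    def AND(a, b): return a & b
    def OR(a, b):  return a | b
    def NOT(a):    return 1 - a

    def half_adder(a, b):
        # The sum bit is XOR, expressed with AND, OR and NOT only.
        s = OR(AND(a, NOT(b)), AND(NOT(a), b))
        c = AND(a, b)  # the carry bit
        return s, c

    for a in (0, 1):
        for b in (0, 1):
            print(a, '+', b, '->', half_adder(a, b))  # (sum, carry)

Chain half adders into full adders and you have binary arithmetic, which is essentially the path from Boole’s algebra to a computing circuit.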

Shannon landed at MIT after answering an ad for someone to work on one of the earliest analog computers: MIT engineering dean Vannevar Bush’s differential analyzer, a mechanical calculating machine. It was Bush who suggested that for his doctoral thesis, Shannon apply his mathematical thinking to genetics. Not published until the 1990s, that thesis on the use of algebra in organizing genetic knowledge is now considered to have been 30 to 40 years ahead of its time in foretelling the potential benefits of sophisticated computation in the life sciences, such as the Human Genome Project.

Shannon left MIT in 1940 to teach for a year at Princeton. He would later tell friends that during one of his lectures, Albert Einstein walked in, whispered to someone at the back of the class, and promptly left. Eagerly quizzing the student later on what was said, Shannon got a kick out of the answer: Einstein had been asking where tea was being served.

In 1941, Shannon stayed in New Jersey but moved into full-time research, joining Bell Telephone Laboratories. Shannon’s arrival at Bell’s research arm coincided with the Blitz over Britain, and his early work included anti-aircraft fire-control devices that could calculate how to aim defensive fire at enemy planes and missiles, including the German V-1 flying bombs and V-2 rockets that began raining down on England in 1944.

Bell also put Shannon, by then in his mid-twenties, to work as a cryptographer. Some of the groundbreaking encryption work he did was built into the complex scrambling machine used by Franklin Roosevelt and Winston Churchill to safeguard their wartime trans-Atlantic conferences. Later in the 1940s, Shannon would spell out the fundamentals of modern cryptography in a paper called Communication Theory of Secrecy Systems. “I worked in cryptography for about 15 years, especially the last five years, and it always amazes me to go back and read some of his papers, how perceptive some of his comments actually were,” says David Blake, a professor at the University of Toronto. Blake recalls a conversation with Horst Feistel, the IBM researcher who in the 1970s developed the most widely used data encryption standard in the world. “He left me in no doubt whatsoever that he had read Shannon’s paper and was very strongly influenced by the principles of confusion and diffusion that it contained. That encryption system has served as the worldwide standard for the past 25 years.”

After the war, Shannon focused increasingly on how to apply binary values to telephone switching circuits. “Shannon was in a culture in the late 1940s when people did actually worry about building a better telephone network,” explains Robert Lucky, a long-time Bell Labs scientist and now with Telcordia Technologies. “I came along a little later than that and people don’t believe how it was almost a religion with us that we were out to make the world’s best telephone network and to make it better.”

“Shannon worked for the telephone company, so one of the first things that they figured out as a result of this, is that you probably shouldn’t just make a telephone line longer and longer and longer and put amplifiers in the middle of the line,” says Jack Wolf, a Shannon follower and professor of electrical and computer engineering at UCSD’s Jacobs School of Engineering. “Pretty soon the signal starts to get smaller and you have to amplify it. But every time you amplify the signal, you amplify all the imperfections that got into the signal, which we call noise. So you started out with a signal that didn’t have any noise, a little got added, and now you amplified that, and pretty soon all you had was noise.”

That led Shannon to a fundamental re-thinking of the way telephone conversations, and any other types of messages, or ‘information’ as he called it, were transmitted. While others were thinking up better ways to send analog telegraph or telephone signals, Shannon conceived of digital communications. Notes Wolf: “In Shannon’s case, what happened was, and it was a tremendous surprise, there was sort of a common commodity associated with all kinds of information, whether it be radio or television or satellites, etc., and this common commodity was, first of all, information, and this information could be represented in binary form, or a sequence of bits.”

The term bit, short for binary digit, first appeared in print in Shannon’s groundbreaking 1948 paper, A Mathematical Theory of Communication. But in it, he gave credit for the term to a fellow Bell Labs researcher: John Tukey.
“Very early in Shannon’s famous paper, he attributes the coinage of the word bit to John Tukey,” recalls Solomon Golomb of the University of Southern California. “But the Tukey bit and the Shannon bit are not the same thing. The popular notion of a bit is simply a unit of storage, with a 0 or 1, and Tukey coined this as short for binary digit. The Shannon bit is a unit of information, and a storage bit contains at most one bit of information, but quite often a lot less.”
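Golomb’s distinction can be made concrete with Shannon’s entropy formula: a stored bit that equals 1 with probability p carries H(p) = -p·log2(p) - (1-p)·log2(1-p) Shannon bits of information, which reaches a full bit only at p = 1/2. A brief sketch in Python (the probabilities below are illustrative assumptions):

    import math

    def binary_entropy(p):
        # Information per stored bit, in Shannon bits, when the bit
        # equals 1 with probability p.
        if p in (0.0, 1.0):
            return 0.0
        return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

    print(binary_entropy(0.5))  # 1.0    -> a fair bit carries a full Shannon bit
    print(binary_entropy(0.1))  # ~0.469 -> a biased bit carries under half a bit
    print(binary_entropy(1.0))  # 0.0    -> a predictable bit carries nothing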

Telcordia’s Bob Lucky agrees: “Shannon’s world wasn’t made of bits. Back in the Forties, the world was not an information world. Information technology was yet to be really conceived. The information age wasn’t really there, so the idea of digitally representing things and working with bits wasn’t understood or there.”

In research circles, the 1948 paper published in the Bell System Technical Journal made Shannon a star overnight. It revolutionized the way engineers and scientists thought about communications. It galvanized researchers, and spawned a new school of thought called ‘information theory’.

By 1949, when Shannon married Mary Elizabeth Moore (Betty for short), a numerical analyst at Bell Labs, he was an academic celebrity. More than 40 years later, Scientific American and other publications would label Shannon’s paper the “Magna Carta of the Information Age.”

“It’s said that it is one of the few times in history where somebody founded a field, asked all the right questions, and proved and answered most of them, all at once,” says Toby Berger, an information theorist at Cornell University.

Shannon saw the binary digit as the fundamental element in communication. Information could be boiled down to sequences of 1s and 0s, encoded and then decoded at the other end. Messages could then be transmitted over long distances with virtually no loss in quality.

While Shannon’s work was theoretical, it didn’t take long for his employer to figure out how to use the theory. “What Bell came up with was a technology called a regenerative repeater,” recalls UCSD’s Wolf. “What this said was, if instead of working with analog signals such as my voice we had bits, and we didn’t let the bits get too small, you could regenerate the bit perfectly. So instead of an amplifier, they put in this regenerative repeater, and they started out with a bit, and a mile later they still had the same bit.”
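Here is a toy numerical sketch of the contrast Wolf describes, under assumed parameters (a unit pulse, 20 repeater spacings, Gaussian noise of standard deviation 0.1 added at each spacing); the numbers are illustrative only:

    import random
    random.seed(1)

    HOPS, SIGMA = 20, 0.1

    # Analog chain: each amplifier restores the level but passes the
    # accumulated noise along, so the waveform drifts hop by hop.
    analog = 1.0
    for _ in range(HOPS):
        analog += random.gauss(0, SIGMA)
    print('analog after 20 hops:', round(analog, 3))  # no longer 1.0

    # Digital chain: each regenerative repeater re-decides 0 or 1,
    # wiping the noise out before it can accumulate.
    digital = 1.0
    for _ in range(HOPS):
        noisy = digital + random.gauss(0, SIGMA)
        digital = 1.0 if noisy > 0.5 else 0.0
    print('digital after 20 hops:', digital)          # still 1.0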

While laying the groundwork for information theory, Shannon also contributed one of the fundamental insights guiding researchers ever since.

“One of the things that Shannon did that he’s widely known for, is this notion of a fundamental limit,” explains Cal-(IT)2’s Rao, who is also a professor of electrical and computer engineering at the Jacobs School. “No matter what you do, as long as you’re working with a certain amount of bandwidth, and as long as you’re dealing with signals that are only so strong in the face of noise that you confront, no matter what you do, you cannot transmit more than a certain rate.”

Today, that barrier is widely known as the Shannon Limit or Shannon Capacity. “Any communications channel, like a telephone wire, a television channel, deep space communications, even things not invented at that time, had an ultimate limit for transferring data reliably: this fast, this many megabytes per second, gigabytes per second, and no faster,” says Caltech’s McEliece. “Shannon established a prediction, like the speed of light: we couldn’t go faster than that, but he didn’t tell you how to build the rocket ships to do it. And so we’ve been trying since that time. The engineers, the mathematicians, the computer scientists who accepted this challenge that Shannon laid out have been trying to find practical ways to get all the way up to channel capacity, to get more efficient wireless communications, fax and modem communications.”
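For the band-limited noisy channel Shannon analyzed, the limit is the famous formula C = B·log2(1 + S/N): bandwidth times the log of one plus the signal-to-noise ratio. A quick calculation, using assumed textbook figures for a voice-grade telephone line (about 3 kHz of bandwidth, 30 dB of signal-to-noise ratio):

    import math

    def shannon_capacity(bandwidth_hz, snr_linear):
        # C = B * log2(1 + S/N), in bits per second.
        return bandwidth_hz * math.log2(1 + snr_linear)

    snr = 10 ** (30 / 10)                  # 30 dB is a ratio of 1000
    print(shannon_capacity(3000, snr))     # about 29,900 bits per second

The exact ceiling moves with the assumed bandwidth and SNR, but this is why dial-up modems over analog lines stalled in the low tens of kilobits per second, close to the Shannon Limit.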

That challenge took decades. Dave Forney, who studied under Shannon and now teaches at MIT, went on to design the first computer modems for Motorola, and invent codes and algorithms that are now global standards for data transmission. Says Forney: “We are basically there, within a small epsilon of Shannon’s channel capacity, but this has really only happened in the last decade, and in some sense only the last five years in any kind of practical sense, so this is new news. So it really took fifty years for both the technology and the algorithms and codes to get to the point where we could crack these problems.”

Shannon also pointed followers in the right direction in their efforts to boost the efficiency of digital transmission and storage. His concept of adding extra, so-called redundant bits to a message, so that it could be reconstructed at the other end despite some corruption, led to the widespread use of error detection and correction codes in data transmission, as well as codes to protect the integrity of data in CDs and other computer storage devices.
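The simplest instance of the idea is a three-fold repetition code, sketched below. It is only an illustration (the codes protecting real CDs are the far more efficient Reed-Solomon codes), but it shows how redundant bits let the receiver vote an error away:

    def encode(bits):
        # Redundancy: send every bit three times.
        return [b for bit in bits for b in (bit, bit, bit)]

    def decode(received):
        # A majority vote over each group of three corrects any single
        # flipped bit within that group.
        return [1 if sum(received[i:i + 3]) >= 2 else 0
                for i in range(0, len(received), 3)]

    sent = encode([1, 0, 1])   # [1, 1, 1, 0, 0, 0, 1, 1, 1]
    sent[1] = 0                # corrupt one bit in transit
    sent[5] = 1                # and another, in a different group
    print(decode(sent))        # [1, 0, 1] -- the message survives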

Andrew Viterbi, the co-founder of Qualcomm and designer of coding algorithms now widely used in wireless phone networks and other digital equipment, notes that it took years for technology to catch up with the digital future Shannon foresaw: “Where we used to have one transistor per chip, now we have somewhere between 10 million and 100 million, and all of that was necessary to make the Shannon theory practical. It’s quite interesting to note that Shannon’s paper came out in 1948, and late in 1947 the transistor was invented in the same Bell Laboratory environment in Murray Hill, New Jersey, and the two went hand in hand.”

As Viterbi recalls, the first testing ground for Shannon’s theory was the Jet Propulsion Laboratory at the dawn of the space age: “I, for one, was fortunate enough to land at the right place at the right time, that is, JPL, just before Sputnik was launched, and was on the ground floor of the space race and was able to influence the coding systems that flew in the early and even some of the later spacecraft, always applying the lessons of Shannon ever more efficiently.”

USC’s Golomb also worked at JPL in the 1950s and 1960s, and was instrumental in making digital the new paradigm for deep-space communications. “When I started talking in the late fifties about digital communications, this was considered a contradiction in terms by the traditional communications theorists. They were so wedded to the notion that communication involves continuous wave forms and continuous modulation,” recalls Golomb. “So the whole idea that communication was moving in the direction of going digital was a new idea that was very heavily influenced by Shannon, and the systems we started designing for deep space communications very much were influenced by the whole new concept that Shannon contributed to communications—that you could compare how well you were doing with an underlying model that told you what was the limit, the capacity of your channel in a theoretical sense.”

“All the advanced signal processing that enables us to send high-speed data was done as an outgrowth of Claude Shannon’s work on information theory,” insists Telcordia’s Lucky.

The same year Shannon married Betty, 1949, he branched out from information theory and built a rudimentary computer to play chess. A year later he wrote a paper on programming a computer to play chess, including several strategies for chess algorithms still in use today. He was also an early pioneer in the field of artificial intelligence, the study of how to teach a machine to learn. At Bell Labs, he built a mechanical mouse that could learn its way through a maze. It was named Theseus, after the hero of Greek legend who escaped the Labyrinth after slaying the Minotaur.

Although he continued as a consultant there for another decade, Shannon left Bell Labs in 1956 to teach at MIT. While there, he engaged in an eclectic set of interests and inventions, many of them, in his own words, ‘useless.’

A 1963 edition of Vogue described some of Shannon’s early contraptions: a chair lift that took his kids 600 feet down from the house to a nearby lake, for instance, and a hidden panel in his library that sprang open but didn’t lead anywhere. “The fact is he loved engineering things, the gadgets,” says Paul Siegel, an information theorist and director of UCSD’s Center for Magnetic Recording Research. “The mechanical mouse, the chair lift from his house down to the lake, they were an integral part of his psyche.”

“He built more than 30 unicycles by hand in his garage,” adds UC Berkeley’s Berlekamp. “One of the questions that intrigued him was how small a unicycle could a human ride, and he built several that no one could ever ride.”

Shannon was an inveterate tinkerer and inventor. One room in his house was crammed with dozens of his devices: a computer that could calculate in Roman numerals; a machine that could solve the Rubik’s Cube puzzle; a gasoline-powered pogo stick; and several mechanical juggling machines.

Shannon himself took up juggling with a vengeance, a skill he later demonstrated in a Canadian Broadcasting Corporation documentary. He also wrote a widely praised academic paper on the dynamics of keeping multiple objects in the air at once. “Generally, I think if a guy comes up with all those things, like a mouse that does a maze etcetera, it usually looks like showing off, that he’s Mister Idea Man,” says Thomas Cover, a professor at Stanford University. “But Shannon was so quiet and unassuming and humble that I think he was doing these things despite himself, rather than to show off.”

Stanford’s Cover won the Shannon Award in 1990, in part for his work extending information theory into investment analysis. Shannon never published on the subject, but delivered two influential lectures at MIT on stock investing, pre-dating the widespread use of portfolio theory on Wall Street. He and Betty Shannon also made a killing in the stock market after investing in technology start-ups owned by friends, companies such as Teledyne and Hewlett-Packard.

Like playing the stock market, games and game theory also intrigued him. Notes Cover: “One of Shannon’s connections that’s little known is Ed Thorp, the guy who became famous for writing the book Beat the Dealer, about how to play blackjack optimally and count the cards. One summer Thorp talked to Shannon and asked whether Shannon would submit Thorp’s work on blackjack to the National Academy of Sciences. Shannon got interested, and before long Thorp went out to Shannon’s house and worked with him on a roulette prediction scheme.”

In 1973, on the 25th anniversary of Shannon’s landmark paper, the Information Theory Society (within what is now the IEEE, the Institute of Electrical and Electronics Engineers) instituted an annual Shannon Lecture, which evolved into the Shannon Award.

Elwyn Berlekamp of UC Berkeley, a later recipient of the award, invited Shannon to deliver the first lecture. “I have never seen such stage fright,” recalls Berlekamp. “Of course, once he got up on stage and got in front of his audience, there was no problem. It never would have occurred to me that anyone, even in front of friends, could be so scared.”

“He was nervous about that, quite nervous actually,” adds Cornell’s Berger. “He had been out of information theory for the better part of five or ten years at that point, and felt that there was little that he had any right to talk to this group of people about.”

“He just felt that people were going to expect so much of him in this talk, and he was afraid that he didn’t have anything significant to say. Needless to say, he gave a fantastic talk, but in my mind it really showed me what a modest man he was,” concludes UCSD’s Jack Wolf.

A few years later, Bob Lucky, while he was at Bell Labs, emceed a dinner commemorating an IEEE anniversary in Boston. He invited Shannon.

“Even though he didn’t have to say a word, he looked nervous and out of place. This black-tie dinner, with David Packard on one side, was just not his thing,” remembers Lucky. “His world was the world of mathematics that came out of the great work, and when he was placed in this world of famous people, it wasn’t his place.”

Even after retiring from MIT in 1978, Shannon could never completely escape his fame within the engineering community. Caltech’s Bob McEliece says “it was as if Newton had shown up” when Shannon attended a 1985 meeting in Brighton, England: “People lined up to get his autograph. Physicists don’t line up to get other physicists’ autographs, but it was so far beyond what the rest of us were capable of doing, we went for the photo opportunity.”

Despite his fundamental achievements, Shannon never won a Nobel Prize. “He would have won it years ago, but his work is in mathematics and engineering and there is no Nobel in those disciplines. But in the mid 1980s, the Japanese government created the Kyoto Prize, which was supposed to be a mirror of the Nobel prize,” says McEliece. “In fact, it’s even more money, and Shannon won the first Kyoto Prize.”

That was 1985, and by then, it was starting to be clear to friends that Shannon was grappling with a terrible condition. In October 1986, Dave Neuhoff co-chaired an event at the University of Michigan. “Shannon was very quiet,” recalls Neuhoff. “I had the feeling at that time that he was already suffering from the affliction of Alzheimer’s. Betty did most of the talking.”

Shannon gave his last major interview to Omni magazine in 1987. He told the magazine he was still working on gadgets and ideas. “Usefulness is not my main goal,” he said. “I like to solve new problems all the time. I keep asking myself, how would you do this? Is it possible to make a machine to do that?”

By 1998, when the former Bell Labs buildings in Florham Park, New Jersey, were re-named the AT&T Shannon Laboratories, Shannon was in a nursing home, too sick to attend. When his hometown of Gaylord, Michigan, put up a statue in his honor in October 2000, Shannon’s wife Betty stood in for him.
A few months later, in February 2001, Shannon lost his long battle with Alzheimer’s.

Eight months later, to celebrate Shannon’s life and highlight advances in information theory, UCSD professor Jack Wolf, winner of the 2001 Shannon Award, organized a Shannon Symposium. The conference at the Center for Magnetic Recording Research on the UCSD campus featured presentations by 14 world-renowned thinkers and practitioners of information theory.

With Qualcomm CEO Irwin Jacobs and others in attendance, the group dedicated a casting of a statue of Shannon originally commissioned for his hometown—underscoring UCSD’s drive to build a center of excellence in information theory among faculty and students. Notes Wolf: “We have within the last ten years had the best and the brightest young Shannon information theorists in the U.S. At least in my opinion, this is the center of gravity of information theory in the United States at a university.”

At the symposium, former colleagues, students, friends and admirers discussed Shannon’s legacy.

Caltech’s McEliece: “There’s always this historical question: if this person hadn’t existed would the world be different? And people say no, someone else would have thought the same thing. But I don’t believe that. I believe individual people make a difference. And Shannon was one of the most supreme intellects of this century.”

UMich’s Neuhoff: “I suspect we’re ten, twenty years ahead of where we would have been if Shannon hadn’t been there to make the discoveries. There would have been a lot of small discoveries, but he presented us with a big clear picture all at once.”

Qualcomm’s Viterbi: “What he did for communications and information theory was startling and momentous, and if he hadn’t come along, it would probably have taken us another thirty or forty years to come up with probably only a subset of his results.”

“The world would have gone ahead; we would probably still have had the Internet today; we would have had high speed modems and that kind of thing and error correcting codes, but they might have been delayed, and there is probably no one in the world you can say that about—that except for that person, the world might not be quite like it is today,” says Telcordia’s Lucky, concluding: “I truly believe Shannon was almost unique in that sense, that we wouldn’t be as far along as we are today if he hadn’t done what he did.”




165.132.59.119 Hong-Yeop Song: Many of the people mentioned here got together in LA around May of last year for a party... It was a conference held for the 70th birthday of my thesis advisor, Dr. Golomb, and you can find the photos taken there on my homepage...  -[12/06-18:11]-

165.132.59.119 Hong-Yeop Song: http://coding.yonsei.ac.kr/photo/2002GF70_LosAngeles/index.html   -[12/06-18:13]-

165.132.116.28 이효빈: Wow~ Dr. Reed, Dr. Viterbi and more.. people I had only known by name... all appearing at once... wow...  -[12/06-22:14]-

220.85.167.201 백종민: Hmm.. it feels like watching heroes -_-; The whole cast has really turned out(?)^^;  -[12/06-23:20]-

165.132.59.119 Hong-Yeop Song: Dr. Reed is a remarkable man I once published a paper with, and Dr. Viterbi was the CTO when I was at Qualcomm, though he has since retired.... Also a remarkable man....  -[12/07-00:47]-

211.190.41.125 강기헌: I read through the article. I learned for the first time that Dr. Viterbi was one of Qualcomm's founders. I had heard that Qualcomm started out as a small company, but apparently it was not quite like that. With people like that on board, it seems Qualcomm could hardly have failed..  -[12/07-23:05]-

165.132.59.119 Hong-Yeop Song: Irwin Jacobs and Andrew Viterbi were the two who first started it... at the beginning there were probably five or six people... in the form of a small venture company... The two of them had already run ventures of various kinds, but it was at Qualcomm that they really blossomed...  -[12/08-00:15]-

165.132.59.119 Hong-Yeop Song: The two earned their doctorates at MIT and USC respectively, and oddly enough, those two schools account for the largest numbers of senior engineers...^^  -[12/08-00:17]-

165.132.59.119 Hong-Yeop Song: If you can, rather than just reading the article, it is more fun to go online and watch the broadcast.... It even includes footage of Shannon giving a presentation when he was young...  -[12/08-00:18]-

165.132.59.119 Hong-Yeop Song: http://coding.yonsei.ac.kr/researchinterest.html   -[12/08-00:19]-

220.85.167.201 백종민: Hmm... In the newspapers Irwin Jacobs often appeared as Qualcomm's chairman... but Dr. Viterbi rarely did..^^; Maybe because he had retired.. Anyway, even a small company is on a different level with people like that ㅡㅡ;  -[12/08-02:14]-
