A Brief Introduction to Shannon
Claude Elwood Shannon: Mathematician and Founder of Information Theory
Claude Elwood Shannon (1916-2001) was born on April 30, 1916 in Petoskey, Michigan. He graduated from the University of Michigan in 1936 with bachelor's degrees in mathematics and electrical engineering, and in 1940 received a doctorate in mathematics and a master's degree in electrical engineering from the Massachusetts Institute of Technology (MIT).
In 1941 he joined the mathematics department of Bell Labs, where he worked until 1972.
He became a visiting professor at MIT in 1956, a permanent member of its faculty in 1958, and professor emeritus in 1978.
Dr. Shannon died on February 24, 2001, at the age of 84.
It was during his time at the Institute for Advanced Study at Princeton that Shannon began to think about information theory and efficient communication systems.
After eight years of effort, Shannon published his profoundly influential paper "A Mathematical Theory of Communication" in the Bell System Technical Journal, in two installments in July and October 1948.
In 1949 Shannon published another celebrated paper, "Communication in the Presence of Noise".
In these two papers Shannon settled many questions that had long been open: he framed the fundamental problem of communication, gave a model of a communication system, proposed a mathematical expression for the quantity of information, and resolved a series of basic technical problems concerning channel capacity, the statistical properties of information sources, source coding, and channel coding. Together the two papers became the founding theoretical works of information theory. At the time, he was barely into his thirties.
Shannon's achievement caused a worldwide sensation and ignited enormous enthusiasm for information theory; the subject pushed into one discipline after another, and research snowballed. It proved its power not only in other areas of electronics, such as computing and automatic control, but also spread into physics, chemistry, biology, psychology, medicine, economics, anthropology, phonetics, statistics, management science, and beyond. It has long since outgrown the scope that Shannon himself studied and anticipated, developing from Shannon's so-called "narrow information theory" into a "broad information theory". Shannon rose to fame at a stroke and became the founder of this new discipline.
Since the 1980s, whenever people have discussed the future, attention has converged on the information field. According to a view now popular internationally, the future will be a highly information-based society: the information industry will grow into the leading industry, and most people will be engaged in the production, processing, and circulation of information. Only then will the full significance of Shannon's work be properly appreciated.
Information theory, a body of ideas once circulated only among specialists, will reach a far wider public, and the name Shannon has flown beyond the studies and laboratories of experts to become familiar to many more people.
Shannon is honored as the "father of information theory".
The paper "A Mathematical Theory of Communication", which Shannon published in the Bell System Technical Journal in 1948, is generally taken as the starting point of modern research in information theory. The paper built in part on earlier work by Harry Nyquist and Ralph Hartley.
In that paper, Shannon gave the definition of entropy:

H(X) = -\sum_i p_i \log_2 p_i

where p_i is the probability of the i-th possible message.
This definition can be used to compute the channel bandwidth required to transmit the original message after binary encoding. Entropy measures the amount of information a message carries, after discounting the part determined by the message's inherent structure, for example the redundancy of linguistic structure and statistical regularities such as the frequencies of letters and words.
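To make the definition concrete, here is a minimal sketch in Python (the function name and sample strings are illustrative, not from the original text) that estimates entropy from empirical character frequencies:

```python
import math
from collections import Counter

def entropy(text: str) -> float:
    """Empirical entropy, in bits per character, of the given text."""
    counts = Counter(text)
    total = len(text)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# A uniform source over four symbols carries log2(4) = 2 bits per symbol...
print(entropy("abcd" * 100))   # 2.0
# ...while repetitive, redundant text carries less per symbol.
print(entropy("aaaaaaab"))     # ~0.54
```

The second print illustrates the point about redundancy above: the more predictable the text, the fewer bits per symbol it actually conveys.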
The entropy of information theory is closely linked to the entropy of physics. Boltzmann and Gibbs did a great deal of work on entropy in statistical physics, and it was precisely that work that inspired the entropy of information theory.
Mutual information is another useful measure of information; it quantifies the dependence between two random variables. The mutual information of two variables X and Y is defined as

I(X;Y) = H(X) + H(Y) - H(X,Y)

where H(X,Y) is the joint entropy, defined as

H(X,Y) = -\sum_{x,y} p(x,y) \log_2 p(x,y)
Mutual information is closely related to the log-likelihood ratio test for multinomial distributions and to Pearson's χ² test.
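As a minimal sketch (the toy joint distribution and variable names are assumptions for illustration, not from the original), mutual information can be computed directly from a joint probability table using the definitions above:

```python
import math

def entropy(probs):
    """Entropy in bits of a list of probabilities (zero entries are skipped)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Toy joint distribution p(x, y) over binary X and Y (rows: x, columns: y).
joint = [[0.4, 0.1],
         [0.1, 0.4]]

p_x = [sum(row) for row in joint]                  # marginal distribution of X
p_y = [sum(col) for col in zip(*joint)]            # marginal distribution of Y
h_xy = entropy([p for row in joint for p in row])  # joint entropy H(X,Y)

# I(X;Y) = H(X) + H(Y) - H(X,Y)
mi = entropy(p_x) + entropy(p_y) - h_xy
print(f"I(X;Y) = {mi:.4f} bits")   # ~0.278 bits: X and Y are dependent
```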
Claude Elwood Shannon (1916-2001)
Claude Elwood Shannon was born in Petoskey, Michigan on April 30, 1916. He spent a productive 15 years at Bell Labs, working with such famous men as John Pierce, known for satellite communication; Harry Nyquist, with numerous contributions to signal theory; Hendrik Bode, who worked on feedback; and George Stibitz, who in 1938 built an early relay computer.
Look at a compact disc under a microscope and you will see music represented as a sequence of pits, or in mathematical terms, as a sequence of 0's and 1's, commonly referred to as bits. The foundation of our Information Age is this transformation of speech, audio, images and video into digital content, and the man who started the digital revolution was Claude Shannon, who died February 24, at the age of 84, after a long struggle with Alzheimer's disease.
Shannon arrived at the revolutionary idea of digital representation by sampling the information source at an appropriate rate, and converting the samples to a bit stream. He characterized the source by a single number, the entropy, adapting a term from statistical mechanics, to quantify the information content of the source. For English language text, Shannon viewed entropy as a statistical parameter that measured how much information is produced on the average by each letter. He also created coding theory, by introducing redundancy into the digital representation to protect against corruption. If today you take a compact disc in one hand, take a pair of scissors in the other hand, and score the disc along a radius from the center to the edge, then you will find that the disc still plays as if new.
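To make the redundancy idea concrete, here is a minimal sketch of one of the simplest error-correcting schemes, a 3-fold repetition code (an illustrative example only; it is not the far more powerful coding actually used on compact discs):

```python
def encode(bits):
    """Add redundancy by repeating each bit three times: 1 -> 111, 0 -> 000."""
    return [b for bit in bits for b in (bit, bit, bit)]

def decode(coded):
    """Majority vote over each block of three; corrects one flip per block."""
    return [int(sum(coded[i:i + 3]) >= 2) for i in range(0, len(coded), 3)]

message = [1, 0, 1, 1]
sent = encode(message)
sent[4] ^= 1                      # corrupt one bit of the second block
assert decode(sent) == message    # the redundancy absorbs the damage
print("recovered:", decode(sent))
```

The same principle, redundancy deliberately added to the digital representation, is what lets the scored disc in the example above still play as if new.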
Before Shannon, it was commonly believed that the only way of achieving arbitrarily small probability of error in a communication channel was to reduce the transmission rate to zero. All this changed in 1948 with the publication of A Mathematical Theory of Communication, where Shannon characterized a channel by a single parameter, the channel capacity, and showed that it was possible to transmit information at any rate below capacity with an arbitrarily small probability of error. His method of proof was to show the existence of a single good code by averaging over all possible codes. His paper established fundamental limits on the efficiency of communication over noisy channels, and presented the challenge of finding families of codes that achieve capacity. The method of random coding does not produce an explicit example of a good code, and in fact it took fifty years for coding theorists to discover codes that come close to these fundamental limits on telephone line channels.
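As a hedged numerical sketch, assuming the standard textbook binary symmetric channel (a model not treated explicitly in this passage), the capacity works out to C = 1 - H(p) for crossover probability p:

```python
import math

def binary_entropy(p: float) -> float:
    """H(p) = -p*log2(p) - (1-p)*log2(1-p): entropy of a biased coin flip."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p: float) -> float:
    """Capacity in bits per channel use of a binary symmetric channel."""
    return 1.0 - binary_entropy(p)

for p in (0.0, 0.01, 0.1, 0.5):
    print(f"p = {p:.2f}: C = {bsc_capacity(p):.4f} bits/use")
```

Shannon's theorem guarantees reliable communication at any rate below this C; note that at p = 0.5 the capacity is zero, since the output is then independent of the input.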
The importance of Shannon's work was recognized immediately. According to a 1953 issue of Fortune Magazine: "It may be no exaggeration to say that man's progress in peace, and security in war, depend more on fruitful applications of information theory than on physical demonstrations, either in bombs or in power plants, that Einstein's famous equation works". In fact his work has become more important over time with the advent of deep space communication, wireless phones, high speed data networks, the Internet, and products like compact disc players, hard drives, and high speed modems that make essential use of coding and data compression to improve speed and reliability.
Shannon grew up in Gaylord, Michigan, and began his education at the University of Michigan, where he majored in both Mathematics and Electrical Engineering. As a graduate student at MIT, his familiarity with both the mathematics of Boolean algebra and the practice of circuit design produced what H.H. Goldstine called "one of the most important master's theses ever written ... a landmark in that it changed circuit design from an art to a science". This thesis, A Symbolic Analysis of Relay and Switching Circuits, written in 1937, provided mathematical techniques for building a network of switches and relays to realize a specific logical function, such as a combination lock. It won the Alfred Noble Prize of the combined engineering societies of the USA and is fundamental to the design of digital computers and integrated circuits.
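A minimal sketch of the thesis's central idea (the lock example here is hypothetical, not Shannon's own circuit): switches in series realize AND and switches in parallel realize OR, so any Boolean function corresponds to a switching network:

```python
# Shannon's key observation: relay contacts in series conduct only when
# both are closed (Boolean AND); contacts in parallel conduct when either
# is closed (Boolean OR). Boolean algebra thus describes relay networks.
def series(a: bool, b: bool) -> bool:
    return a and b   # current flows only if both switches are closed

def parallel(a: bool, b: bool) -> bool:
    return a or b    # current flows if either switch is closed

# A toy "combination lock" circuit: it conducts (opens the lock) only for
# the switch pattern (closed, open, closed), i.e. (k1 AND NOT k2) AND k3.
def lock_opens(k1: bool, k2: bool, k3: bool) -> bool:
    return series(series(k1, not k2), k3)

print(lock_opens(True, False, True))   # True: the correct combination
print(lock_opens(True, True, True))    # False: a wrong combination
```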
Shannon's interest in circuit design was not purely theoretical, for he also liked to build, and his sense of play is evident in many of his creations. In the 1950s, when computers were given names like ENIAC (Electronic Numerical Integrator and Calculator), Shannon built a computer called THROBAC I (THrifty ROman-numeral BAckward-looking Computer), which was able to add, subtract, multiply and even divide numbers up to 85 working only with Roman numerals. His study in Winchester, Mass. was filled with such devices, including a maze-solving mechanical mouse and a miraculous juggling machine. Traversing the ceiling was a rotating chain, like those at dry cleaners, from which were suspended the gowns from a score of honorary doctorates. They made a splendid sight flying around the room.
Shannon's 1940 doctoral dissertation, on the mathematical theory of genetics, is not as well known as his master's thesis, and in fact was not published until 1993, by which time most of the results had been obtained independently by others. After graduating from MIT, Shannon spent a year at the Institute for Advanced Study, and this is the period when he began to develop the theoretical framework that led to his 1948 paper on communication in the presence of noise. He joined Bell Labs in 1941, and remained there for 15 years, after which he returned to MIT. During World War II his work on encryption led to the system used by Roosevelt and Churchill for transoceanic conferences, and inspired his pioneering work on the mathematical theory of cryptography.
It was at Bell Labs that Shannon produced the series of papers that transformed the world, and that transformation continues today. In 1948, Shannon was connecting information theory and physics by developing his new perspective on entropy and its relation to the laws of thermodynamics. That connection is evolving today, as others explore the implications of quantum computing, by enlarging information theory to treat the transmission and processing of quantum states.
Shannon must rank near the top of the list of the major figures of Twentieth Century science, though his name is relatively unknown to the general public. His influence on everyday life, which is already tremendous, can only increase with the passage of time.