Nobel Prizes and computing   updated February 8, 2016  carrigan@fnal.gov (subject line must be sensible)


Nobel Prize site
Nobel Physics Prizes
Nobel Chemistry Prizes
Nobel Physiology or Medicine Prizes

There is no Nobel Prize for mathematics. As a result there is no convenient slot in which to recognize developments in computing. In any case, many mathematicians look down on computing even though some important mathematicians have made seminal contributions to the field. One example of a famous mathematician turned computer expert is John von Neumann. Along with Alan Turing, von Neumann was responsible for many of the important ideas on computer structure. In mathematical economics he developed the concepts of game theory. While a great innovation, game theory may have been too thin for a Nobel Prize. In any case, von Neumann died more than ten years before the economics prize was created. Another mathematician who has been honored through his connection with economics is John Nash. Nash's battle with schizophrenia is rivetingly described in Sylvia Nasar's A Beautiful Mind and in the Academy Award-winning movie based on Nasar's book.

Herbert Simon, the 1978 Nobel laureate in economics, was deeply interested in computing. He came to Carnegie Mellon (then Carnegie Tech) in 1949, a year after Nash received his undergraduate degree there. Simon is the poster child for high school students who don't know what they want to do: since he didn't know what to study, he did everything. In this he was in the same mold as von Neumann. Simon, with Allen Newell, built a fine computer center that we used for our physics work. Unfortunately the center was pushing an off-brand compiler called GATE, which we had to use at the cyclotron while working in Fortran at Brookhaven at the same time.

Great science often develops at the boundary of our knowledge or at a fault line between totally different areas. Cosmology and particle physics illustrate the first case; computing illustrates the second. Modern computing is very much driven by technologies such as the transistor, invented by Shockley, Bardeen, and Brattain, and the integrated circuit, developed by Jack Kilby. Without these Nobel Prize winners there would be no modern computers. (Parenthetically, John Bardeen's son Bill Bardeen is an eminent theoretical physicist at Fermilab known for his contributions to the study of quantum anomalies and axions.)

For a decade I was responsible for technology transfer at Fermilab. During that time the three biggest developments with some link to Fermilab technology may have been industrial-scale superconductivity, TV stereo, and the World Wide Web. Our technology database contains many entries on superconductivity. The TV stereo patent was obtained outside of work by two employees; about the most we did there was leave them alone and grant one of them a leave of absence. The third was the World Wide Web. It was invented at our sister laboratory CERN, but we quickly became deeply committed, although there are no Web entries in the Fermilab technology database. Tim Berners-Lee and the World Wide Web have changed the world. I feel that networking is an emergent science somewhat like economics. It may be soft, but there are important things to be learned. Sir Tim Berners-Lee will not receive a Nobel Prize, but the world now knows he is there (more than a million hits on Google circa 2016).