Les anglonautes


 

Vocapedia > Science

 

Physics, Chemistry, Biology, Genetics

 

 

 

 

Pioneering geneticist and biologist James Watson

with a molecular model of DNA.

 

Location: Cambridge, MA, US

 

Date taken: 1957

 

Photograph: Andreas Feininger

 

Life Images

 

 

 

 

 

 

 

 

 

 

 

 

 

 

science        UK

 

https://www.theguardian.com/
science

 

 

https://www.theguardian.com/science/audio/2024/apr/10/
remembering-physicist-peter-higgs-
podcast - Guardian podcast

 

https://www.theguardian.com/science/brain-flapping/2012/dec/30/
science-future-2013

 

 

 

 

 

 

 

science        USA

 

https://www.nytimes.com/2020/04/30/
opinion/the-argument-coronavirus-science-trump.html

 

https://www.nytimes.com/2020/04/28/
opinion/the-sound-of-gravity-einstein.html

 

 

 

 

http://www.npr.org/sections/13.7/2017/05/02/
526595893/the-joy-of-science

 

http://www.npr.org/sections/13.7/2017/01/22/
510384513/fact-check-science-and-the-trump-administration

 

 

 

 

http://www.npr.org/sections/13.7/2016/11/18/
501985855/can-science-save-the-world

 

http://www.npr.org/sections/13.7/2016/08/03/
488491775/the-madness-of-humanity-part-4-science-vs-religion

 

 

 

 

 

 

 

giant of science        UK

 

https://www.theguardian.com/science/audio/2024/apr/10/
remembering-physicist-peter-higgs-
podcast - Guardian podcast

 

 

 

 

 

 

 

disregard for science        USA

 

https://www.nytimes.com/2020/04/28/
climate/trump-coronavirus-climate-science.html

 

 

 

 

 

 

 

dabble in junk science        USA

 

https://www.nytimes.com/2020/04/30/
opinion/the-argument-coronavirus-science-trump.html

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

reach a fusion power milestone /

reach a breakthrough in nuclear fusion /

 achieve a tremendous scientific breakthrough        USA

 

https://www.npr.org/2022/12/13/
1142208055/nuclear-fusion-breakthrough-climate-change

 

 

 

 

 

 

 

breakthrough        UK

 

https://www.theguardian.com/science/2020/dec/20/
the-virus-free-scientific-breakthroughs-of-2020-chosen-by-scientists

 

 

 

 

 

 

 

breakthrough        USA

 

https://www.npr.org/2022/12/13/
1142208055/nuclear-fusion-breakthrough-climate-change

 

 

 

 

 

 

 

scientific breakthroughs        UK

 

https://www.theguardian.com/news/audio/2021/jan/13/
the-biggest-vaccination-programme-in-uk-history-but-is-it-fast-enough-podcast

 

 

 

 

 

 

 

achievement        USA

 

https://www.npr.org/2022/12/13/
1142208055/nuclear-fusion-breakthrough-climate-change

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

subatomic particle > neutrino        USA

 

https://www.nytimes.com/2020/12/16/
science/jack-steinberger-dead.html

 

 

 

 

 

 

 

Science review of 2011:

the year's 10 biggest stories        UK

 

Neutrino particles appeared to prove

Einstein wrong by travelling faster than light,

while the discovery of an Earth-like planet

raised hopes of finding life on another world

 

https://www.theguardian.com/science/2011/dec/18/
science-discoveries-review-2011 

 

 

 

 

 

 

 

the ten most significant objects

in the history of science,

engineering, technology and medicine        2009

 

https://www.theguardian.com/science/gallery/2009/jun/08/
computing-engineering?picture=348563646 

 

 

 

 

 

 

 

Science fiction:

Images from other worlds – in pictures        UK

 

A new exhibition at the British Library

presents the rich history of SF down the ages,

from Lucian of Samosata in the 2nd century

to the Russian novel that inspired 1984.

Take a look

 

https://www.theguardian.com/books/gallery/2011/may/12/
science-fiction-in-pictures 

 

 

 

 

 

 

 

scientist        UK / USA

 

https://www.nytimes.com/2017/04/22/
science/march-for-science.html

 

http://www.nytimes.com/2014/05/28/us/
politics/talents-on-display-at-white-house-science-fair.html

 

https://www.theguardian.com/film/video/2013/jul/01/
stephen-hawking-watch-the-trailer 

 

https://www.theguardian.com/science/2006/sep/26/
schools.highereducation 

 

 

 

 

 

 

 

boffin        UK

 

https://www.theguardian.com/science/brain-flapping/2013/jul/26/
boffins-backlash-scientists-dont-like-being-called-that 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

chemistry        USA

 

https://www.npr.org/2023/10/04/
1203554566/scientists-win-chemistry-nobel-prize-for-quantum-dots-nanoparticles

 

 

 

 

 

 

 

chemist        USA

 

http://www.nytimes.com/2012/04/25/
science/george-cowan-nuclear-scientist-dies-at-92.html

 

 

 

 

 

 

 

Nobel Prize in chemistry        USA

 

https://www.npr.org/2023/10/04/
1203554566/scientists-win-chemistry-nobel-prize-for-quantum-dots-nanoparticles

 

 

 

 

 

 

 

biochemistry and molecular biology        UK

 

https://www.theguardian.com/science/
biochemistrymolecularbiology

 

 

 

 

 

 

 

evolutionary biology        USA

 

https://www.nytimes.com/2021/12/27/
science/eo-wilson-dead.html

 

 

 

 

 

 

 

biologist        USA

 

http://www.nytimes.com/2010/01/21/us/
21nirenberg.html

 

 

 

 

 

 

 

microbiologist        USA

 

https://www.nytimes.com/2017/02/03/
science/h-boyd-woodruff-dead-antibiotics-researcher.html

 

 

 

 

 

 

 

medical research        UK

 

https://www.theguardian.com/science/
medical-research 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

physics        UK / USA

 

https://www.theguardian.com/science/
physics

 

 

https://www.npr.org/2019/10/08/
768156346/3-scientists-win-nobel-prize-in-physics-
for-work-on-the-evolution-of-the-univers

 

http://www.nytimes.com/2012/05/22/
science/american-scientists-fear-losing-edge-in-physics.html

 

https://www.theguardian.com/science/2006/oct/03/
nobelprizes.highereducation 

 

 

 

 

 

 

 

win the Nobel prize in Physics        USA

 

https://www.npr.org/2019/10/08/
768156346/3-scientists-win-nobel-prize-in-physics-
for-work-on-the-evolution-of-the-univers

 

http://www.nytimes.com/2012/10/10/
science/french-and-us-scientists-win-nobel-physics-prize.html

 

 

 

 

 

 

 

Nobel winner in Physics        USA

 

http://www.nytimes.com/2013/03/05/
science/donald-glaser-nobel-winner-in-physics-dies-at-86.html

 

 

 

 

 

 

 

particles > muons        USA

 

the muon (...)

is akin to an electron but far heavier,

and is an integral element of the cosmos.

 

https://www.nytimes.com/2021/04/07/
science/particle-physics-muon-fermilab-brookhaven.html

 

 

 

 

 

 

 

The beauty of the Higgs boson

 

The discovery of the Higgs boson

is the jewel in the crown of particle physics        2012

 

https://www.theguardian.com/science/
higgs-boson

 

 

https://www.theguardian.com/science/2014/apr/13/
particle-fever-film-higgs-boson-director-mark-levinson

 

 

 

 

http://www.guardian.co.uk/science/shortcuts/2013/apr/23/should-change-name-higgs-boson

 

http://www.guardian.co.uk/science/2013/jan/01/higgs-boson-large-hadron-collider

 

 

 

 

http://www.guardian.co.uk/science/2012/dec/25/higgs-boson-discovery-extraordinarily-tense

 

http://www.guardian.co.uk/science/2012/aug/05/jeff-forshaw-higgs-boson-discovery

 

http://www.guardian.co.uk/science/blog/2012/jul/04/higgs-boson-universe-peter-higgs

 

http://www.guardian.co.uk/commentisfree/2012/jul/04/higgs-boson-discovery-giant-leap

 

 

 

 

http://www.guardian.co.uk/science/2011/dec/13/higgs-boson-lhc-explained

 

http://www.guardian.co.uk/science/2011/dec/11/higgs-boson-cern-jeff-forshaw

 

 

 

 

 

 

 

particle physics    UK

 

https://www.theguardian.com/science/
particlephysics

 

 

 

 

 

 

 

Higgs particle / Higgs boson /  "God particle"

 

https://www.nytimes.com/topic/subject/
higgs-boson

https://en.wikipedia.org/wiki/
Higgs_boson

 

 

http://www.nytimes.com/2014/05/04/us/
gerald-guralnik-77-a-god-particle-pioneer-dies.html

 

http://www.guardian.co.uk/science/2011/apr/28/
higgs-boson-rumour-cern-lhc

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

nuclear fusion        UK / USA

 

https://www.theguardian.com/environment/2022/dec/13/
what-is-nuclear-fusion-what-have-scientists-achieved-ignition

 

https://www.theguardian.com/environment/2022/dec/13/
us-scientists-confirm-major-breakthrough-in-nuclear-fusion

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

cold fusion        USA

 

http://www.nytimes.com/2012/08/12/
science/martin-fleischmann-cold-fusion-seeker-dies-at-85.html

 

 

 

 

 

 

 

physicist        UK / USA

 

https://www.nytimes.com/2018/03/01/
obituaries/richard-e-taylor-nobel-winner-who-plumbed-matter-dies-at-88.html

 

 

 

 

http://www.npr.org/sections/thetwo-way/2015/11/04/
454594496/physicists-probe-antimatter-for-clues-of-how-it-all-began

 

 

 

 

http://www.theguardian.com/science/2014/may/18/
matter-light-photons-electrons-positrons

 

http://www.nytimes.com/2014/03/18/
science/space/detection-of-waves-in-space-buttresses-landmark-theory-of-big-bang.html

 

 

 

 

http://www.nytimes.com/2013/10/09/
opinion/no-physicist-is-an-island.html

 

http://www.nytimes.com/2013/10/02/us/
harold-m-agnew-physicist-present-at-birth-of-the-nuclear-age-dies-at-92.html

 

 

 

 

http://www.nytimes.com/2008/12/09/
science/09kantrowitz.html

 

 

 

 

http://www.nytimes.com/2008/04/14/
science/14wheeler.html

 

 

 

 

 

 

 

The 10 best physicists

From subatomic to cosmic,

the pick of the pioneers        12 May 2013

 

Galileo (1564-1642)

 

Isaac Newton (1643-1727)

 

Albert Einstein (1879-1955)

 

http://www.guardian.co.uk/culture/gallery/2013/may/12/
the-10-best-physicists

 

http://www.theguardian.com/science/2014/apr/05/
einstein-equation-emc2-special-relativity-alok-jha

 

 

 

 

 

 

 

cosmologists

— physicists who work on the properties of the universe        USA

 

http://www.npr.org/sections/13.7/2017/05/24/
529675773/what-does-an-expanding-universe-really-mean

 

 

 

 

 

 

 

antimatter        USA

 

http://www.npr.org/sections/thetwo-way/2015/11/04/
454594496/physicists-probe-antimatter-for-clues-of-how-it-all-began

 

 

 

 

 

 

 

 

 

 

 

 

 

 

US scientists get glimpse of antihelium        UK        April 2011

 

Heaviest particles of antimatter seen in a lab

survive for about 10 billionths of a second

before crashing into collider's detector

http://www.guardian.co.uk/science/2011/apr/24/
antihelium-antimatter-brookhaven
 

 

 

 

 

engineer        USA

http://www.nytimes.com/2008/12/09/science/09kantrowitz.html

 

 

 

 

research        USA

http://www.nytimes.com/2008/12/09/science/09kantrowitz.html

 

 

 

 

laser        USA

http://www.nytimes.com/2013/07/28/
science/james-gordon-dies-at-85-work-paved-way-for-laser.html

 

 

 

 

laser propulsion        USA

http://www.nytimes.com/2008/12/09/
science/09kantrowitz.html

 

 

 

 

superconductor

 

 

 

 

space studies        USA

http://www.nytimes.com/2014/10/03/us/
gerry-neugebauer-pioneer-in-space-studies-dies-at-82.html

 

 

 

 

astrophysics

 

 

 

 

 astrophysicist        USA

http://www.nytimes.com/2014/10/03/us/
gerry-neugebauer-pioneer-in-space-studies-dies-at-82.html

 

 

 

 

robotics

http://www.cogniron.org/

 

 

 

 

inventor        UK

http://www.theguardian.com/science/2006/oct/11/
food.foodanddrink 

 

 

 

 

creationist

 

 

 

 

archaeologist

 

 

 

 

genetics        UK

https://www.theguardian.com/science/
genetics

 

 

 

 

pioneering DNA scientist James Watson        UK

http://www.theguardian.com/education/2007/oct/19/
highereducation.uk 

 

 

 

 

electronics

 

 

 

 

transistor        USA        1948

http://www.nytimes.com/2009/09/01/science/01trans.html

 

http://www.nytimes.com/2009/09/01/science/01first.html

 

 

 

 

silicon nanowires        USA

http://www.nytimes.com/2009/09/01/science/01trans.html

 

 

 

 

 

 

 

 

 

 

 

 

 

 

evolution > Stephen Jay Gould    1941-2002

https://www.npr.org/templates/story/
story.php?storyId=1143690 - May 21, 2002

 

 

 

 

evolution >

British Naturalist Charles Robert Darwin >

The Origin of Species

https://www.theguardian.com/science/2008/jun/22/
darwinbicentenary.evolution

 

 

 

 

Guardian 2008 Science Quiz:

Breakthroughs and Bust-ups

https://www.theguardian.com/science/quiz/2008/dec/29/2008-
science-breakthroughs-quiz

 

 

 

 

DNA double helix

https://www.theguardian.com/science/video/2010/nov/30/
dna-double-helix-watson-crick

 

 

 

 

Francis H. C. Crick

co-discoverer of the structure of DNA

http://www.nytimes.com/2010/09/30/science/30crick.html

 

 

 

 

James D. Watson

co-discoverer of the structure of DNA

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

Time Covers - The '50s

TIME cover 04-29-1957

Illustration of Simon Ramo and Dean Wooldridge.

 

Date taken: April 29, 1957

 

Photograph: Boris Artzybasheff

 

Life Images

http://images.google.com/hosted/life/7f610e1e7fca97c0.html

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

Time Covers - The '50s

TIME cover 11-18-1957: illustration of scientist Edward Teller.

 

Date taken: November 18, 1957

 

Photograph: Boris Artzybasheff

 

Life Images

http://images.google.com/hosted/life/bdde1ba76756e7bc.html
 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

Corpus of news articles

 

Science > Physics, Chemistry,

 

Biology, Genetics

 

 

 

Science Is the Key to Growth

 

October 28, 2012

The New York Times

By NEAL F. LANE

 

Houston

MITT ROMNEY said in all three presidential debates that we need to expand the economy. But he left out a critical ingredient: investments in science and technology.

Scientific knowledge and new technologies are the building blocks for long-term economic growth — “the key to a 21st-century economy,” as President Obama said in the final debate.

So it is astonishing that Mr. Romney talks about economic growth while planning deep cuts in investment in science, technology and education. They are among the discretionary items for which spending could be cut 22 percent or more under the Republican budget plan, according to the Center on Budget and Policy Priorities.

According to the American Association for the Advancement of Science, the plan, which Mr. Romney has endorsed, could cut overall nondefense science, engineering, biomedical and technology research by a quarter over the next decade, and energy research by two-thirds.

Mr. Romney seems to have lost sight of the critical role of research investments not only in developing new medicines and cleaner energy sources but also in creating higher-skilled jobs.

The private sector can’t do it alone. We rely on companies to translate scientific discoveries into products. But federal investment in research and development, especially basic research, is critical to their success. Just look at Google, which was started by two graduate students working on a project supported by the National Science Foundation and today employs 54,000 people.

Richard K. Templeton, chief executive of Texas Instruments, put it this way in 2009: “Research conducted at universities and national labs underpins the new innovations that drive economic growth.”

President Bill Clinton, for whom I served as science adviser from 1998 to 2001, understood that. In those years, we balanced the federal budget and achieved strong growth, creating about two million jobs a year. A main reason was the longstanding bipartisan consensus on investing in science. With support from Congress, Mr. Clinton put research funding on a growth path, including a doubling over five years (completed under President George W. Bush) of the budget for the National Institutes of Health.

In 2010, the federal government invested about $26.6 billion in N.I.H. research; those investments led to $69 billion in economic activity and supported 485,000 jobs across the country, according to United for Medical Research, a nonpartisan group.

Moreover, the $3.8 billion taxpayers invested in the Human Genome Project between 1988 and 2003 helped create and drive $796 billion in economic activity by industries that now depend on the advances achieved in genetics, according to the Battelle Memorial Institute, a nonprofit group that supports research for the industry.

So science investments not only created jobs in new industries of the time, like the Internet and nanotechnology, but also generated the rising tax revenues that made budget surpluses possible.

American science has not been faring so well in recent budgets. President Obama has repeatedly requested steady increases for scientific research, aimed at putting the budgets of three key science agencies — the National Science Foundation, the Department of Energy’s Office of Science, and the National Institute of Standards and Technology — on a path to double, by 2016, the combined $10 billion they received in 2006. But a polarized Congress has not delivered at that rate, and the goal could be nullified if next year sees the beginning of draconian cuts.

Meanwhile, the frontiers of science continue to expand. President Obama is proposing that the United States boost its overall national research and development investments — including private enterprise and academia as well as government — to 3 percent of gross domestic product — a number that would still lag behind Israel, Sweden, Japan and South Korea, in that order.

In an increasingly complex world, that should be only a start. If our country is to remain strong and prosperous and a land of rewarding jobs, we need to understand this basic investment principle in America’s future: no science, no growth.

 

Neal F. Lane,

a professor of physics and astronomy at Rice University,

was director of the National Science Foundation

and the chief science and technology adviser

to President Bill Clinton.

Science Is the Key to Growth,
NYT,
28.10.2012,
https://www.nytimes.com/2012/10/29/
opinion/want-to-boost-the-economy-invest-in-science.html

 

 

 

 

American Physics Dreams Deferred

 

May 21, 2012

The New York Times

By DENNIS OVERBYE

 

When three American astronomers won the Nobel Prize in Physics last year, for discovering that the expansion of the universe was speeding up in defiance of cosmic gravity — as if change fell out of your pockets onto the ceiling — it reaffirmed dark energy, the glibly named culprit behind this behavior, as the great cosmic surprise and mystery of our time.

And it underscored the case, long urged by American astronomers, for a NASA mission to measure dark energy — to determine, for example, whether the cosmos would expand forever or whether, perhaps, there might be something wrong with our understanding of gravity.

In 2019, a spacecraft known as Euclid will begin such a mission to study dark energy. But it is being launched by the European Space Agency, not NASA, with American astronomers serving only as very junior partners, contributing $20 million and some infrared sensors.

For some scientists, this represents an ingenious solution, allowing American astronomers access to the kind of data they will not be able to obtain on their own until NASA can mount its own, more ambitious mission in 2024.

But for others, it is a setback. It means that for at least the next decade, Americans will be relegated to a minor role in following up on their own discovery.

“While it’s great to support other missions,” said Adam Riess of Johns Hopkins and the Space Telescope Science Institute, who shared that Nobel last year, “it would be disappointing to see the U.S. lose or outsource its own leading role in one of the hottest areas of research.”

For Dr. Riess and his colleagues, this turn of events is another example of a worrying trend in which American scientists, facing budget deficits and political gridlock, have had to pull back from or delay promising projects while teams based in Europe hunt down the long-sought Higgs boson or rocket scientists in China plan a Moon landing in 2025.

Michael Turner, a cosmologist at the University of Chicago, called dark energy “an example of how the U.S. seems to misplay its science hand these days.”

“We predicted and discovered dark energy,” he said. “We have the biggest dark-energy community and the best ground game; we have been designing a space mission since 1998; and now the Europeans will fly it with our minor participation. Something is wrong with this picture.”

Saul Perlmutter, of the University of California, Berkeley, another of the dark energy Nobel winners, said, “The danger, of course, is that we will watch the science (and scientists — and good students) move on to other countries and continents, where projects are being begun and completed.”

With them, scientists say, could go the cultural excitement and innovative spark that invigorates the economy. The World Wide Web, for example, was invented at CERN, the European Organization for Nuclear Research (home to the world’s most powerful particle accelerator, the Large Hadron Collider), to help particle physicists communicate.

By contrast, the United States’ flagship lab for high-energy physics, the Fermi National Accelerator Laboratory, known as Fermilab, had to close down its accelerator, the Tevatron, last fall, and learned from the Energy Department in March that the agency could not afford to follow through for now on a $1.3 billion underground experiment to study the spooky shape-shifting properties of particles known as neutrinos in an effort to investigate why the universe is made of matter rather than antimatter.

At the same time, the department also canceled money for studies for the world’s next big physics machine, the International Linear Collider, which would be the successor to CERN’s giant collider. American scientists are resigned to the likelihood that it will not be built in the United States.

American physicists are now rethinking how to carry out the neutrino experiment, which was to have been the centerpiece of a plan to convert the old 8,000-foot-deep Homestake gold mine in Lead, S.D., into a national laboratory for underground science.

Similar facilities in Italy, Canada and Japan have become centers for dark matter searches, neutrino experiments and other delicate work that requires shelter from cosmic rays. But in 2010, the National Science Foundation walked away from the $875 million project, citing unease about safety and “stewardship” of the old mine, which is half full of water. Meanwhile, with support from a philanthropist, T. Denny Sanford, the State of South Dakota has reopened parts of the mine for a pair of physics experiments.

Of course, there is no achievement of modern American science — from the Manhattan Project to the Hubble Space Telescope to the decoding of the human genome — that does not owe a debt to hard and even heroic bargaining in the formerly smoke-filled rooms of Congress and the White House. Complaints and grim prognostications about the federal research budget are part of the background music of science. The situation is always fluid.

Given all that, other scientists say that basic research is doing as well as can be expected given severe budgetary restraints. Among other things, NASA’s Webb telescope, the successor to the Hubble, is on target for launching in 2018, at a cost of $8 billion.

The science office in the Energy Department actually got an increase in the budget released by President Obama in February, but the money went to more applied research, into areas like energy and scientific computing.

Only last month, Congress added money to the neutrino experiment, although not enough to get the whole project back on track, said Katie Yurkewicz, Fermilab’s spokeswoman, and the Association of American Universities issued a statement applauding appropriators “for their bipartisan actions thus far to sustain the nation’s investment in scientific research.”

Debra Elmegreen, a professor of astronomy at Vassar and president of the American Astronomical Society, has spent a lot of time in Washington lately. “Congress seems supportive of science,” she said. “I’m encouraged that people recognize the need for science and technology to continue.”

She added: “U.S. leadership is at risk in practically every area. We’ll get by for now, but we can’t be complacent.”

Indeed, all bets could be off if the automated budget cuts, called “sequestration,” decreed by the failure of the deficit reduction negotiations last summer, go into effect in January. A certain amount of gloom has reached the most prestigious levels of American science. Writing in a recent issue of The New York Review of Books, Steven Weinberg, a Nobel laureate at the University of Texas in Austin, decried the fact that science was increasingly having to compete with other worthy causes, like health care and education, for money. The solution, he said, was to raise taxes.

Frank Wilczek, a Nobel laureate at the Massachusetts Institute of Technology, described himself as “less gloomy” than Dr. Weinberg, but nonetheless concerned that, with the end of the cold war and bad economic times, the traditional political bases of government support for basic research had dwindled, leaving only “curiosity and desire to participate, even indirectly, in doing something great.” He added, “We should be doing that anyway, of course.”

“This is all a great pity, because tremendous scientific opportunities are available,” Dr. Wilczek said.

The United States gave up its leadership in high-energy particle physics 20 years ago, when Congress canceled the Superconducting Super Collider, a particle accelerator that was under construction in Texas. That move cleared the way for CERN’s eventual supremacy.

Could the same thing happen in space? Those fears were aroused two years ago when NASA announced that the James Webb Space Telescope project needed $1.6 billion more and several years to complete. It was canceled by the House Appropriations Committee last summer but later restored. The price of that rescue, however, was to delay NASA’s dark energy mission, and to withdraw from a couple of upcoming joint Mars missions.

The travails of the dark energy researchers provide a window into the labyrinthine process by which big projects live or die. Whether you find the story hopeful or depressing depends on who you are.

Dark energy is, according to Dr. Wilczek, “the most mysterious fact in all of physical science, the fact with the greatest potential to rock the foundations.”

The discovery that the expansion of the cosmos was speeding up came from observing exploding stars known as Type Ia supernovas, luminous and uniform enough to serve as distance markers. Realizing that only a telescope in space could find and measure supernovas distant enough to shed light, so to speak, on the genesis of this behavior, Dr. Perlmutter early on urged the construction of a special space probe.

Two years ago, after a decade of wrangling among astronomers and NASA and the Energy Department, a blue-ribbon panel from the National Academy of Sciences charged with determining astronomical priorities endorsed a version of this idea as the highest-priority space science mission for the coming decade. The billion-dollar mission, called Wfirst, for Wide-Field Infrared Survey Telescope, would search for exoplanets as well as measure the effects of dark energy on the history and evolution of the universe. The academy’s deliberations were ambushed, however, by the subsequent announcement of the Webb telescope’s problems, which have pushed the Webb launch all the way to 2018.

Lia LaPiana, a program executive at NASA, said that, from 2013 to 2018, there is “zero money” in the president’s budget for the mission.

As a down payment on an eventual mission, NASA suggested spending about $200 million for a 20 percent stake in Euclid, the European Space Agency’s mission. The academy rejected that idea, saying that Euclid, which would not measure supernovas at all, did not meet the academy’s requirements and could undermine the possibility of an eventual Wfirst project.

Recently, however, a specially convened committee of the academy has given the nod to a $20 million investment in Euclid in the form of equipment, saying it would give American astronomers a seat on the Euclid science team and access to its data.

What changed in the last year? David Spergel, an astrophysicist at Princeton, who was chairman of the committee, said, “For 20 percent, we were offered a modest participation. For 2 percent, we were offered a modest participation.”

Paul Schechter of M.I.T., co-chairman of a team planning Wfirst, said he had endorsed the deal on the grounds that it would not impinge on the Wfirst project. When the president’s budget was released, however, there was $9 million for Euclid and no money for beginning the Wfirst project. The budget is only the first step in this dance, however.

In March, Dr. Schechter told a House appropriations subcommittee that the budget did not represent the “strong U.S. commitment” to Wfirst that the academy had recommended. He asked for $8 million to get things going.

The House panel agreed and responded in its report by directing NASA to explain how its plans were consistent with the academy’s recommendations. In the meantime, the corresponding Senate appropriations subcommittee has put $10 million for the NASA dark energy mission into its proposed budget report, citing last year’s physics Nobelists by name.

There are more dance moves to make in this year’s budget and the rest of the decade before the Wfirst mission can become a reality, but Dr. Schechter said he felt encouraged. “I suspect Wfirst wasn’t very high on anyone’s list of priorities,” he said, referring to turmoil at NASA last year.

He added, “I think that is changing.”

    American Physics Dreams Deferred, NYT, 21.5.2012,
    http://www.nytimes.com/2012/05/22/science/
    american-scientists-fear-losing-edge-in-physics.html

 

 

 

 

 

George Cowan,

Nuclear Scientist,

Dies at 92

 

April 24, 2012
The New York Times
By DOUGLAS MARTIN

 

George Cowan, a chemist who helped build the first atomic bomb, detect the first Soviet nuclear explosion and test the first hydrogen bomb, died on Friday at his home in Los Alamos, N.M. He was 92.

The Santa Fe Institute, a scientific research center that Dr. Cowan headed and helped found, announced the death.

For his many contributions, Dr. Cowan was awarded the federal Energy Department’s highest honor, the Enrico Fermi Award, and the highest honor given by the Los Alamos National Laboratory, the Los Alamos Medal. The citation on his Los Alamos award called him “the driving force in the early radiochemical evaluations of nuclear weapons.”

Dr. Cowan began thinking about the possibility of a bomb in 1938, when he brought a clipping about nuclear fission to his physics professor and asked him to talk about the possibility of a weapon based on splitting the atom. His professor at Worcester Polytechnic Institute in Massachusetts made a convincing argument that it would not happen, but when Dr. Cowan graduated three years later, the professor referred him to Eugene Wigner, a physicist at Princeton.

Dr. Wigner was conducting experiments on the atom’s structure with Princeton’s atom smasher, and the experimenters needed uranium. Dr. Cowan was sent to a laboratory in Massachusetts to retrieve a kilogram of uranium. He carried it back to Princeton in a convertible driven by a colleague, the precious cargo between his legs and covered in dry ice.

That experience led him to the Manhattan Project, the federal government’s secret effort to develop the atomic bomb.

Dr. Cowan was at the first controlled nuclear reaction on Dec. 2, 1942, at the University of Chicago. He was at Oak Ridge, Tenn., to help measure plutonium production; at Columbia University to study the energy of neutrons; and at Los Alamos to help track plutonium inventories. He went to Bikini Atoll in the Pacific for nuclear detonation tests.

It was unusual for a scientist to be sent to so many sites. Dr. Cowan said that his expertise made him valuable as a troubleshooter, and that his being unmarried was also helpful.

In 1946 he married a fellow chemist from the Manhattan Project, Helen Dunham. They were married 65 years and had no children. She died last year.

After World War II, Dr. Cowan earned a doctorate from Carnegie Mellon University. He returned to Los Alamos in 1949. Weeks after his arrival, an American surveillance plane detected high levels of radiation emanating from the Soviet Union. Dr. Cowan was named to the team that analyzed the data. The group pinpointed the detonation of the Russian bomb to within an hour on Aug. 29, 1949.

“It’s time to write our summary,” Hans Bethe, the group leader, said, according to Dr. Cowan. “It can be a long document about what we don’t know or a short one about what we know.”

“We wrote a short one,” Dr. Cowan said.

President Harry S. Truman ordered the development of a more powerful weapon, the hydrogen bomb, to counteract the Soviet device. Dr. Cowan became part of the group that developed it. He was on the command ship, the Estes, when it was successfully detonated on Nov. 1, 1952.

George Arthur Cowan was born on Feb. 15, 1920, in Worcester and attended local schools before moving on to Worcester Poly. He was 21 when he joined Dr. Wigner, a future Nobel Prize winner, at Princeton.

Dr. Cowan argued that nuclear weapons development contributed to scientific progress, pointing to the creation of two new elements and 15 new isotopes in the first hydrogen bomb explosion. In 1965, he directed an experiment in which an underground nuclear explosion created fermium 257, the heaviest known isotope that can be created by neutron bombardment of lighter elements.

In a 1979 article in Scientific American, he reported the startling discovery that atomic reactions were going on almost two billion years ago. He said a natural fission reactor was formed in present-day Gabon when geological changes caused water to flow into pockets of uranium. This “reactor,” over its life span of several hundred thousand years, consumed some six tons of uranium 235.

Dr. Cowan assembled scientists in 1984 to start the Santa Fe Institute, which studies different sorts of complex systems. He founded the Los Alamos National Bank in the mid-1960s and was chairman for three decades. He was part of the group that in 1953 started the Santa Fe Opera, of which he was treasurer.

Dr. Cowan also served on the White House Science Council during the Reagan administration, where he reunited with Edward Teller, a leader in developing the hydrogen bomb. Dr. Teller lobbied for President Ronald Reagan’s missile defense system, popularly called “Star Wars,” while Dr. Cowan opposed it because he did not think it would work.

The two had been part of a regular poker game at Los Alamos. Dr. Cowan said he particularly liked to play with Dr. Teller “because he had a tendency to draw to inside straights” — generally a losing hand.

    George Cowan, Nuclear Scientist, Dies at 92, NYT, 24.4.2012,
    http://www.nytimes.com/2012/04/25/science/
    george-cowan-nuclear-scientist-dies-at-92.html

 

 

 

 

 

Norman Krim,

Who Championed the Transistor,

Dies at 98

 

December 20, 2011
The New York Times
By DENNIS HEVESI

 

Norman Krim, an electronics visionary who played a pivotal role in the industry’s transition from the bulky electron vacuum tube, which once lined the innards of radios and televisions, to the tiny, far more powerful transistor, died on Dec. 14 in a retirement home in Newton, Mass. He was 98.

The cause was congestive heart failure, his son Robert said.

Mr. Krim, who made several breakthroughs in a long career with the Raytheon Company and who had an early hand in the growth of the RadioShack chain, did not invent the transistor. (Three scientists did, in 1947, at Bell Laboratories.)

But he saw the device’s potential and persuaded his company to begin manufacturing it on a mass scale, particularly for use in miniaturized hearing aids that he had designed. Like the old tube, a transistor amplifies electrical signals; unlike the tube, it does so in a tiny piece of solid semiconductor material.

As Time magazine wrote in 1953: “This little device, a single speck of germanium, is smaller than a paper clip and works perfectly at one-tenth the power needed by the smallest vacuum tube. Today, much of Raytheon’s transistor output goes to America’s hearing aid industry.” (Germanium, a relatively rare metal, was the predecessor to silicon in transistors.)

That was just the start. “Now there are over 50 million transistors on a single computer chip, and billions of transistors are manufactured every day,” Jack Ward, curator of the online Transistor Museum, said in an interview. “Norm was the first to recognize the potential and led Raytheon to be the first major transistor manufacturer.”

Thousands of hearing-disabled people benefited from Mr. Krim’s initial use of the transistor in compact hearing aids. But not every transistor Raytheon made was suitable for them, he found.

“When transistors were first being manufactured by Raytheon on a commercial scale, there was a batch called CK722s that were too noisy for use in hearing aids,” said Harry Goldstein, an editor at IEEE Spectrum, the magazine of the Institute of Electrical and Electronics Engineers.

So Mr. Krim contacted editors at magazines like Popular Science and Radio Electronics and began marketing the CK722s to hobbyists.

“The result was that a whole generation of aspiring engineers — kids, really, working in their garages and basements — got to make all kinds of electronic projects,” Mr. Goldstein said, among them transistor radios, guitar amplifiers, code oscillators, Geiger counters and metal detectors. “A lot of them went on to become engineers.”

Mr. Ward called Mr. Krim “the father of the CK722.”

Before the transistor, Mr. Krim had already made significant contributions to the industry. In 1938 he led a Raytheon team that developed miniaturized vacuum tubes for use in battery-powered radios. He also realized that the small tubes could replace cumbersome packs that hearing-aid users had to strap onto themselves in those days.

“Zenith, Beltone, Sonotone are some of the American companies that used his improved, more affordable hearing-aid technology,” said Chet Michalak, who is writing a biography of Mr. Krim. “His devices were about the size of today’s hand-held phones.” They were also a precursor to the transistor hearing aid his team later developed.

Norman Bernard Krim was born in Manhattan on June 3, 1913, one of four children of Abraham and Ida Krim. His father owned several luncheonettes. By the age of 12, he was tinkering with the refrigerator motor in his home.

After graduating from George Washington High School at 16, he was accepted at the Massachusetts Institute of Technology, where in his junior year he built an “electrical brain” that, according to newspaper articles at the time, seemed to be able to make childlike choices, deciding whether it preferred beets or spinach, for example.

“He considered it a carnival act,” Mr. Michalak said.

Raytheon hired Mr. Krim after his graduation in 1934, at 50 cents an hour. By the time he left the company in 1961, he was vice president of the semiconductor divisions.

Mr. Krim’s wife of 52 years, the former Beatrice Barron, died in 1994. Beside his son Robert, he is survived by another son, Arthur, and four grandchildren. Another son, Donald, a leading film distributor, died in May.

After leaving Raytheon, Mr. Krim bought two electronics stores in Boston called RadioShack. By the time he sold the business to the Tandy Corporation two years later, it had seven stores; today the chain has about 7,300.

Mr. Krim was a marketing consultant to Raytheon and several other companies until 1997.

    Norman Krim, Who Championed the Transistor, Dies at 98, NYT, 20.12.2011,
    http://www.nytimes.com/2011/12/21/business/
    norman-krim-who-championed-the-transistor-dies-at-98.html

 

 

 

 

 

Microbe Finds Arsenic Tasty;

Redefines Life

 

December 2, 2010
The New York Times
By DENNIS OVERBYE

 

Scientists said Thursday that they had trained a bacterium to eat and grow on a diet of arsenic, in place of phosphorus — one of six elements considered essential for life — opening up the possibility that organisms could exist elsewhere in the universe or even here on Earth using biochemical powers we have not yet dared to dream about.

The bacterium, scraped from the bottom of Mono Lake in California and grown for months in a lab mixture containing arsenic, gradually swapped out atoms of phosphorus in its little body for atoms of arsenic.

Scientists said the results, if confirmed, would expand the notion of what life could be and where it could be. “There is basic mystery, when you look at life,” said Dimitar Sasselov, an astronomer at the Harvard-Smithsonian Center for Astrophysics and director of an institute on the origins of life there, who was not involved in the work. “Nature only uses a restrictive set of molecules and chemical reactions out of many thousands available. This is our first glimmer that maybe there are other options.”

Felisa Wolfe-Simon, a NASA astrobiology fellow at the United States Geological Survey in Menlo Park, Calif., who led the experiment, said, “This is a microbe that has solved the problem of how to live in a different way.”

This story is not about Mono Lake or arsenic, she said, but about “cracking open the door and finding that what we think are fixed constants of life are not.”

Dr. Wolfe-Simon and her colleagues publish their findings Friday in Science.

Caleb Scharf, an astrobiologist at Columbia University who was not part of the research, said he was amazed. “It’s like if you or I morphed into fully functioning cyborgs after being thrown into a room of electronic scrap with nothing to eat,” he said.

Gerald Joyce, a chemist and molecular biologist at the Scripps Research Institute in La Jolla, Calif., said the work “shows in principle that you could have a different form of life,” but noted that even these bacteria are affixed to the same tree of life as the rest of us, like the extremophiles that exist in ocean vents.

“It’s a really nice story about adaptability of our life form,” he said. “It gives food for thought about what might be possible in another world.”

The results could have a major impact on space missions to Mars and elsewhere looking for life. The experiments on such missions are designed to ferret out the handful of chemical elements and reactions that have been known to characterize life on Earth. The Viking landers that failed to find life on Mars in 1976, Dr. Wolfe-Simon pointed out, were designed before the discovery of tube worms and other weird life in undersea vents and the dry valleys of Antarctica revolutionized ideas about the evolution of life on Earth.

Dr. Sasselov said, “I would like to know, when designing experiments and instruments to look for life, whether I should be looking for same stuff as here on Earth, or whether there are other options.

“Are we going to look for same molecules we love and know here, or broaden our search?”

Phosphorus is one of six chemical elements that have long been thought to be essential for all Life As We Know It. The others are carbon, oxygen, nitrogen, hydrogen and sulfur.

While nature has been able to engineer substitutes for some of the other elements that exist in trace amounts for specialized purposes — like iron to carry oxygen — until now there has been no substitute for the basic six elements. Now, scientists say, these results will stimulate a lot of work on what other chemical replacements might be possible. The most fabled, much loved by science fiction authors but not ever established, is the substitution of silicon for carbon.

Phosphate chains form the backbone of DNA, and phosphorus is also central to a molecule known as adenosine triphosphate, the principal means by which biological creatures store energy. “It’s like a little battery that carries chemical energy within cells,” said Dr. Scharf. So important are these “batteries,” Dr. Scharf said, that the temperature at which they break down, about 160 Celsius (320 Fahrenheit), is considered the high-temperature limit for life.

Arsenic sits right beneath phosphorus in the periodic table of the elements and shares many of its chemical properties. Indeed, that chemical closeness is what makes it toxic, Dr. Wolfe-Simon said, allowing it to slip easily into a cell’s machinery where it then gums things up, like bad oil in a car engine.

At a conference at Arizona State about alien life in 2006, however, Dr. Wolfe-Simon suggested that an organism that could cope with arsenic might actually have incorporated arsenic instead of phosphorus into its lifestyle. In a subsequent paper in The International Journal of Astrobiology, she and Ariel Anbar and Paul Davies, both of Arizona State University, predicted the existence of arsenic-loving life forms.

“Then Felisa found them!” said Dr. Davies, who has long championed the idea of searching for “weird life” on Earth as well as in space and is a co-author on the new paper.

Reasoning that such organisms were more likely to be found in environments already rich in arsenic, Dr. Wolfe-Simon and her colleagues scooped up a test tube full of mud from Mono Lake, which is salty, alkaline and already heavy in arsenic, and gradually fed the microbes in it more and more of the element.

Despite her prediction that such arsenic-eating organisms existed, Dr. Wolfe-Simon said that she held her breath every day that she went to the lab, expecting to hear that the microbes had died, but they did not. “As a biochemist, this stuff doesn’t make sense,” she recalled thinking.

A bacterium known as strain GFAJ-1, of the Halomonadaceae family of Gammaproteobacteria, proved to grow the best of the microbes from the lake, although not without changes from its normal development. The cells grown in the arsenic came out about 60 percent larger than cells grown with phosphorus, but with large, empty internal spaces.

By labeling the arsenic with radioactivity, the researchers were able to conclude that arsenic atoms had taken up position in the microbe’s DNA as well as in other molecules within it. Dr. Joyce, however, said that the experimenters had yet to provide a “smoking gun” that there was arsenic in the backbone of working DNA.

Despite this taste for arsenic, the authors also reported, the GFAJ-1 strain grew considerably better when provided with phosphorus, so in some ways they still prefer a phosphorus diet. Dr. Joyce, from his reading of the paper, concurred, pointing out that there was still some phosphorus in the bacterium even after all its force-feeding with arsenic. He described it as “clinging to every last phosphate molecule, and really living on the edge.”

Dr. Joyce added, “I was feeling sorry for the bugs.”

    Microbe Finds Arsenic Tasty; Redefines Life, NYT, 2.12.2010,
    http://www.nytimes.com/2010/12/03/science/03arsenic.html

 

 

 

 

 

George C. Williams, 83,

Theorist on Evolution,

Dies

 

September 13, 2010
The New York Times
By NICHOLAS WADE

 

George C. Williams, an evolutionary biologist who helped shape modern theories of natural selection, died Wednesday at his home in South Setauket on Long Island, near Stony Brook University, where he taught for 30 years. He was 83.

The cause was Parkinson’s disease, said his wife, Doris Williams.

Dr. Williams played a leading role in establishing the now-prevailing, though not unanimous, view among evolutionary biologists that natural selection works at the level of the gene and the individual and not for the benefit of the group or species.

He is “widely regarded by peers in his field as one of the most influential and incisive evolutionary theorists of the 20th century,” said Douglas Futuyma, a colleague and the author of a leading textbook on evolution.

Dr. Williams laid out his ideas in 1966 in his book “Adaptation and Natural Selection.” In it, he seized on and clarified an issue at the heart of evolutionary theory: whether natural selection works by favoring the survival of elements as small as a single gene or its components, or by favoring those as large as a whole species.

He did not rule out the possibility that selection could work at many levels. But he concluded that in practice this almost never happens, and that selection should be understood as acting at the level of the individual gene.

In explaining an organism’s genetic adaptation to its environment, he wrote, “one should assume the adequacy of the simplest form of natural selection” — that of variation in the genes — “unless the evidence clearly shows that this theory does not suffice.”

The importance of Dr. Williams’s book was immediately recognized by evolutionary biologists, and his ideas reached a wider audience when they were described by Richard Dawkins in his book “The Selfish Gene” (1976).

Those ideas have continued to draw attention because group selection still has influential advocates. In highly social organisms like ants and people, behaviors like altruism, morality and even religion can be more directly explained if selection is assumed to favor the survival of groups.

Dr. Williams had a remarkably open turn of mind, which allowed him always to consider alternatives to his own ideas. David Sloan Wilson, a leading advocate of group selection, recalled in an interview that as a graduate student he once strode into Dr. Williams’s office saying he would change the professor’s mind about group selection. “His response was to offer me a postdoctoral position on the spot,” Dr. Wilson said.

Dr. Wilson did not take the position but remained close to Dr. Williams, though the two continued to differ. One matter of dispute was whether a human being and the microbes in the gut and the skin could together be considered a superorganism created by group selection. Dr. Williams did not believe in superorganisms. (Nonetheless, when Dr. Wilson came to visit him one day, Dr. Williams had taped to his door a hand-lettered sign saying, “Superorganisms welcome here.”)

Dr. Williams’s interests extended to questions that evolution seemed not to answer well: Why should a woman forfeit her chance of having more babies by entering menopause? Why do people grow old and die when nature should find it far easier to maintain a body than to build one?

An important article he wrote in 1957 on the nature of senescence led to a collaboration with Dr. Randolph Nesse, a psychiatrist at the University of Michigan. Together they developed the concept of Darwinian medicine, described in the 1995 book “Why We Get Sick.” There the authors offered Darwinian explanations for questions like why appetite decreases during a fever or why children loathe dark green vegetables.

Dr. Williams pursued his ideas even to results that he found disturbing. “He concluded that anything shaped by natural selection was inevitably evil because selfish organisms outproduced those that weren’t selfish,” Dr. Nesse said.

Dr. Williams acknowledged that people had moral instincts that overcome evil. But he had no patience with biologists who argue that these instincts could have been brought into being by natural selection.

“I account for morality as an accidental capability produced, in its boundless stupidity, by a biological process that is normally opposed to the expression of such a capability,” Dr. Williams wrote starkly in 1988.

In the field of evolutionary theory, “George was probably the most influential author in the 1960s,” said William Provine, a historian of evolution at Cornell University. But by choosing important subjects, Dr. Williams remained relevant. His ideas were approachable because he wrote in clear, simple prose and largely without the use of mathematics, an almost obligatory tool for most evolutionary biologists today.

Dr. Williams joined the State University of New York at Stony Brook (now Stony Brook University) in 1960 and worked there until his retirement in 1990.

In addition to his wife, who is also a biologist, he is survived by a son, Jacques; three daughters, Sibyl Costell, Phoebe Anderson and Judith Pitsiokos; and nine grandchildren.

Though a major expositor of evolutionary theory, Dr. Williams was always aware that his explanations were a work in progress and that they might in principle be superseded by better ones. Evolutionary theory, as stated by its great 20th-century masters Ronald Fisher, J. B. S. Haldane and Sewall Wright, “may not, in any absolute sense, represent the truth,” Dr. Williams wrote at the conclusion of his book on adaptation, “but I am convinced that it is the light and the way.”

    George C. Williams, 83, Theorist on Evolution, Dies, NYT, 13.9.2010,
    http://www.nytimes.com/2010/09/14/science/14williams.html

 

 

 

 

 

Marshall Nirenberg, Biologist

Who Untangled Genetic Code,

Dies at 82

 

January 21, 2010
The New York Times
By NICHOLAS WADE

 

Marshall W. Nirenberg, a biologist who deciphered the genetic code of life, earning a Nobel Prize for his achievement, died Friday at his home in Manhattan. He was 82.

The cause was cancer, said his stepdaughter Susan Weissman.

In solving the genetic code, Dr. Nirenberg established the rules by which the genetic information in DNA is translated into proteins, the working parts of living cells. The code lies at the basis of life, and understanding it was a turning point in the history of biology.

Dr. Nirenberg identified the particular codons — a codon is a sequence of three chemical units of DNA — that specify each of the 20 amino acid units of which protein molecules are constructed.

The achievement, in a critical experiment in 1961, was the more remarkable because Dr. Nirenberg was only 34 at the time and unknown to the celebrated circle of biologists, led by Francis Crick, who had built the framework of molecular biology.

Dr. Crick and his colleague Sydney Brenner had established, largely on theoretical grounds, that the code must be in triplets of the four kinds of chemical units of which DNA is composed. But they had not developed the experiments to work out which triplet corresponded to which amino acid.

Dr. Nirenberg amazed biologists when he and his colleague, the German scientist Johann Heinrich Matthaei, announced their identification of the first codon. He pulled another surprise when he beat out better-known scientists in the ensuing race to identify the other 63 codons in the genetic code. He received the Nobel Prize in Physiology or Medicine shortly afterward, in 1968. (Two other scientists shared the prize with him.)

Marshall Warren Nirenberg was born in Brooklyn on April 10, 1927, to Harry and Minerva Nirenberg and grew up in Florida. After earning a Ph.D. at the University of Michigan, he started work at the National Institutes of Health in Bethesda, Md., where he spent the rest of his career.

The project he chose was the synthesis of proteins, then being studied in mixtures of mashed-up cells known as cell-free systems. Dr. Nirenberg took the research a stage further by focusing on the genetic information that might be driving protein synthesis. He was joined by Dr. Matthaei, an excellent experimentalist, and the two decided to add lengths of RNA, a close chemical cousin of DNA, to the cell-free systems.

Success came when they added to their cell-free system an RNA molecule composed only of uracil, one of the four chemical units in RNA. The protein that emerged consisted only of phenylalanine, one of the 20 kinds of amino acids in proteins. Because the genetic code was known to consist of triplets, the experiment showed that UUU is the codon for phenylalanine, U being the symbol for uracil.
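To make the mechanism concrete for learners, here is a minimal illustrative Python sketch (not part of the original article) of how a codon table works: an RNA string is read three letters at a time and each triplet is looked up in a table. Only a handful of the 64 codons are listed, with UUU mapped to phenylalanine as in the Nirenberg and Matthaei experiment.

# Illustrative sketch only: a tiny subset of the 64-codon genetic code.
# UUU -> phenylalanine is the codon identified by Nirenberg and Matthaei in 1961.
CODON_TABLE = {
    "UUU": "phenylalanine",
    "AAA": "lysine",
    "GGG": "glycine",
    "CCC": "proline",
    "AUG": "methionine",   # also the usual start codon
    "UAA": "STOP",
    "UAG": "STOP",
    "UGA": "STOP",
}

def translate(rna: str) -> list[str]:
    """Read an RNA string three letters at a time and look up each codon."""
    amino_acids = []
    for i in range(0, len(rna) - len(rna) % 3, 3):
        codon = rna[i:i + 3]
        residue = CODON_TABLE.get(codon, "unknown")
        if residue == "STOP":
            break
        amino_acids.append(residue)
    return amino_acids

# A poly-U message, like the synthetic RNA in the 1961 experiment,
# yields a chain made only of phenylalanine.
print(translate("UUUUUUUUU"))  # ['phenylalanine', 'phenylalanine', 'phenylalanine']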

Dr. Nirenberg and Dr. Matthaei were such outsiders that they had not heard of messenger RNA, the molecule that transfers DNA’s instructions to the cell’s protein-making machinery. While biologists in the club were producing the first evidence for the existence of messenger RNA, Dr. Nirenberg and Dr. Matthaei had independently synthesized one.

By rights, their experiment should not have worked at all because natural messenger RNAs carry at their front end a special codon that says to the ribosomes, “Start here,” a fact not known at the time. But the recipe for protein synthesis used by Dr. Nirenberg and Dr. Matthaei happened to contain twice the natural amount of magnesium, an anomaly that was later found to override the need for a start codon.

Dr. Nirenberg presented their findings at the next big conference of molecular biologists, held in 1961 in Moscow. His talk was given to an almost empty room, Horace Judson writes in “The Eighth Day of Creation,” his history of molecular biology. But one of the few participants recognized its significance and told Dr. Crick, who arranged for Dr. Nirenberg to give his talk again, this time in a large hall attended by an audience of hundreds.

Then followed the race to identify all the other codons, a prize that Dr. Nirenberg’s talk had placed in full view of a hall of better-financed rivals like Severo Ochoa of New York University.

“It was a David-and-Goliath situation in which a young investigator without resources came into competition with a distinguished Nobel laureate like Ochoa,” said Philip Leder of Harvard, who joined Dr. Nirenberg’s laboratory after Dr. Matthaei had left.

Credit for the genetic code is often assigned to Dr. Crick and Dr. Brenner, who resolved its general nature through theorizing and with a clever experiment. But it was Dr. Nirenberg and Dr. Matthaei who cracked the code itself.

Mr. Judson, in his history, notes that efforts to test these ideas “achieved little until Matthaei arrived.”

Dr. Leder recalled Dr. Nirenberg as “enthusiastic and magnetic.”

“He had an idea every two or three minutes,” Dr. Leder said.

The solving of the genetic code was such a substantial advance that several researchers decided that the major problems in molecular biology had been solved and that it was time to move on to greater challenges. Dr. Nirenberg switched to neurobiology, but did not make discoveries of equal distinction there. His work on the genetic code was sufficient achievement for any scientific career.

He is survived by his wife, Myrna Weissman, a professor at the Columbia University College of Physicians and Surgeons; his sister, Joan N. Geiger, of Dallas; and four stepchildren, Susan, Judith, Sharon and Jonathan Weissman. His first wife, Perola Zaltzman Nirenberg, died in 2001.

    Marshall Nirenberg, Biologist Who Untangled Genetic Code, Dies at 82,
    NYT, 22.1.2010,
    http://www.nytimes.com/2010/01/21/us/21nirenberg.html

 

 

 

 

 

Albert Crewe,

First to Show a Single Atom,

Is Dead at 82

 

November 21, 2009
The New York Times
By JOHN MARKOFF

 

Albert V. Crewe, the University of Chicago physicist who developed the high-resolution electron microscope that captured the first image of an individual atom, died on Wednesday at his home in the Lake Michigan community of Dune Acres, Ind. He was 82.

The cause was complications of Parkinson’s disease, his daughter Jennifer said.

Dr. Crewe’s research opened a new window into the Lilliputian world of the fundamental building blocks of nature, giving scientists and engineers in fields as varied as computing and biology a powerful new tool to understand the architecture of everything from living tissue to metal alloys.

It was during the 1960s that Dr. Crewe became interested in electron microscopy, which he wanted to apply to biology research then under way at Argonne National Laboratory, outside Chicago, where he was the director.

After attending a conference in England and forgetting to buy a book at the airport for the flight home, he pulled out a pad of paper on the plane and sketched two ways to improve existing microscopes. Back in Chicago, he determined that one of the approaches had already been pursued but that the other was fundamentally new.

He immersed himself in the design of advanced microscopes with the goal of reaching resolutions as fine as a single angstrom, about a third the diameter of a carbon atom. By 1967, his research had become so compelling that he left Argonne and his role as a manager there to return to the University of Chicago, where a decade earlier he had worked on the Chicago Cyclotron, a superconducting accelerator. This time he signed on as a full professor.

Dr. Crewe designed an enhanced electron source to bombard the object being scanned and later improved lens and detection technologies as well. This culminated in the first successful version of a system known as a scanning transmission electron microscope, or STEM.

While earlier microscopes had achieved resolutions of several angstroms, Dr. Crewe used his new electron source and newly available ultra-high-vacuum technologies to attain a tenfold improvement in image contrast, making it possible to see details not previously visible.

Finally, in June 1970 in the journal Science, he published a research report titled “Visibility of Single Atoms,” in which he cautiously documented photographic evidence that showed images of uranium and thorium atoms.

“It appears that the bright spots which we have observed are probably due to single atoms,” he wrote, noting that the geometric placement of the bright spots corresponded to what would be expected theoretically.

“It was extremely dramatic in 1970 when he saw single atoms,” said Oscar Kapp, a University of Chicago biochemist, who worked with Dr. Crewe on the project as a graduate student and continued to collaborate with him throughout his career.

Five years after that success, Dr. Crewe obtained the first motion pictures of atoms, an achievement that offered new ways of understanding atomic interactions. The film was even shown on a Chicago television station by the movie critic Gene Siskel, although he noted that he would not pay a dollar to watch it, according to one of the graduate students who worked on the experiment.

Dr. Crewe, the university’s dean of physical sciences from 1971 to 1981, continued his research on electron microscopes through the 1980s. He held 19 patents for his inventions in the field and eventually published 275 research papers in it. His work was essential to the development of a generation of commercial electron microscopes, on which he consulted with the Hitachi Corporation. Today electron microscopes are widely used both in the semiconductor industry and in scientific laboratories around the world.

Albert Victor Crewe was born in Bradford, England, in what is now West Yorkshire, and grew up in a working-class family. He had average grades as a young child, but at 15 he passed a nationwide entrance exam and became the first in his family to attend high school. He won a scholarship to attend the University of Liverpool, where he also earned a Ph.D. in physics.

During the summer of 1946 in Cornwall, where he and other college students had gathered to help with a postwar wheat harvest, he met Doreen Blunsdon. They married in 1949.

After doing pioneering work on particle accelerators in England in the early 1950s, Dr. Crewe was invited to the University of Chicago in 1955 as a visiting research associate. A year later, having helped complete the Chicago Cyclotron, he was hired by the university as an assistant professor.

In 1958 he joined Argonne National Laboratory, where he led a team of researchers developing another advanced accelerator. He rose rapidly at Argonne, soon becoming director of the laboratory’s particle accelerator division and eventually managing 100 engineers.

He was appointed the laboratory’s director in 1961, when he was only 34, an assistant professor without tenure and not yet a United States citizen.

In addition to his wife, Doreen, he is survived by four children, Jennifer Crewe of New York, Sarah Crewe of Ruidoso, N.M., Elizabeth Crewe of LaGrange, Ill., and David Crewe of Sunnyvale, Calif.; and 10 grandchildren.

In a public lecture shortly after becoming the director of Argonne, Dr. Crewe bemoaned the growing gulf between scientists and laymen.

“There are too many people behaving like the proverbial ostrich and hoping that science will go away if they bury their heads in the sand,” he said, “and this in spite of the fact that the last few decades have indicated strongly that science will not go away.”

    Albert Crewe, First to Show a Single Atom, Is Dead at 82, NYT, 21.11.2009,
    http://www.nytimes.com/2009/11/21/science/21crewe.html

 

 

 

 

 

Nobel Awarded

for Advances in Harnessing Light

 

October 7, 2009
The New York Times
By KENNETH CHANG

 

The mastery of light through technology was the theme of this year’s Nobel Prize in Physics as the Royal Swedish Academy of Sciences honored breakthroughs in fiber optics and digital photography.

Half of the $1.4 million prize went to Charles K. Kao for insights in the mid-1960s about how to get light to travel long distances through glass strands, leading to a revolution in fiber optic cables. The other half of the prize was shared by two researchers at Bell Labs, Willard S. Boyle and George E. Smith, for inventing the semiconductor sensor known as a charge-coupled device, or CCD for short. CCDs now fill digital cameras by the millions.

In recent years, the physics prize has varied between perplexing, esoteric advances at the edges of physics and more comprehensible technology developments. Last year, the academy honored “broken symmetry,” a key but esoteric concept in the description of elementary particles. This year’s prize was more akin to the awards in 2007, which honored a discovery that led to smaller, higher-capacity hard disks in laptops and MP3 devices, and 2000, which honored developments in integrated circuits.

In announcing the winners Tuesday morning, Gunnar Oquist, the academy’s secretary general, said the scientific work honored by this year’s prize “has built the foundation to our modern information society.”

Dr. Boyle, reached by telephone to address a news conference held by the Nobel committee in Stockholm, sounded stunned. “I have not had my morning cup of coffee yet, so I am feeling a little bit not quite with it all,” he said.

The awards ceremony will be held in Stockholm on Dec. 10.

Fiber optic cables and lasers capable of sending pulses of light down them already existed when Dr. Kao started working on fiber optics. But at that time, the light pulses could travel only about 20 meters through the glass fibers before 99 percent of the light had dissipated. His goal was to extend the 20 meters to a kilometer. At the time, many researchers thought tiny imperfections, like holes or cracks in the fibers, were scattering the light.
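
A rough worked example of what those numbers imply (my own arithmetic, using only the 20-meter, 99-percent figure cited above): losing 99 percent of the light over 20 meters is an attenuation of about 1,000 decibels per kilometer, whereas modern fiber loses roughly 0.2 decibels per kilometer.

    import math

    # Attenuation implied by the early fibers described above:
    # 99 percent of the light lost over 20 meters.
    transmitted_fraction = 0.01   # 1 percent of the light survives
    length_km = 0.020             # 20 meters expressed in kilometers

    loss_db = -10 * math.log10(transmitted_fraction)   # 20 dB over 20 m
    print(loss_db / length_km)                         # about 1000 dB/km

    # For comparison (a standard figure, not from the article): modern fiber
    # attenuates at roughly 0.2 dB/km, so about 95 percent survives 1 km.
    print(10 ** (-0.2 / 10))                           # about 0.955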

In January 1966, Dr. Kao, then working at the Standard Telecommunication Laboratories in England, presented his findings. It was not the manufacturing of the fiber that was at fault, but rather that the ingredient for the fiber — the glass — was not pure enough. A purer glass made of fused quartz would be more transparent, allowing the light to pass more easily. In 1970, researchers at Corning Glass Works were able to produce a kilometer-long ultrapure optical fiber.

According to the academy in its prize announcement, the optical cables in use today, if unraveled, would equal a fiber more than a billion kilometers long.

In September 1969, Dr. Boyle and Dr. Smith, working at Bell Labs in Murray Hill, N.J., sketched out an idea on a blackboard in Dr. Boyle’s office. Their idea, originally intended for electronic memory, takes advantage of the photoelectric effect, which was explained by Albert Einstein and won him the Nobel in 1921. When light hits a piece of silicon, it knocks out electrons. The brighter the light, the more electrons are knocked out.

In a CCD, the knocked-out electrons are gathered in small wells, where they are counted — each well corresponds to one pixel of an image. The data from the array of wells can then be reconstructed as an image. A 10-megapixel camera’s sensor contains 10 million such pixels.
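
As a toy illustration of that pixel-by-pixel counting (a simplified sketch with made-up numbers, not the Bell Labs readout scheme), an image is just the array of electron counts rescaled to brightness values:

    # Toy CCD readout: each well's electron count becomes one pixel's brightness.
    electron_counts = [
        [120,  400,  90],
        [ 60, 1500, 300],
        [ 10,  250,  80],
    ]

    FULL_WELL = 1500  # assumed saturation level for this toy sensor

    # Rescale counts to 8-bit brightness (0-255): pixels are brighter where
    # more light knocked out more electrons (the photoelectric effect).
    image = [[round(255 * min(count, FULL_WELL) / FULL_WELL) for count in row]
             for row in electron_counts]

    for row in image:
        print(row)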

“We are the ones, I guess, that started this profusion of little small cameras working all over the world,” Dr. Boyle said.

Besides consumer cameras, CCDs also made possible the cosmic panoramas from the Hubble Space Telescope and the Martian postcards taken by NASA’s rovers.

All three of the winning scientists hold American citizenship. Dr. Kao, 75, is also a British citizen, and Dr. Boyle, 85, is also a Canadian citizen. Dr. Smith is 79.

    Nobel Awarded for Advances in Harnessing Light, NYT, 7.10.2009,
    http://www.nytimes.com/2009/10/07/science/07nobel.html

 

 

 

 

 

Robert Spinrad,

a Pioneer in Computing,

Dies at 77

 

September 7, 2009
The New York Times
By JOHN MARKOFF

 

Robert J. Spinrad, a computer designer who carried out pioneering work in scientific automation at Brookhaven National Laboratory and who later was director of Xerox’s Palo Alto Research Center while the personal computing technology invented there in the 1970s was commercialized, died on Wednesday in Palo Alto, Calif. He was 77.

The cause was Lou Gehrig’s disease, his wife, Verna, said.

Trained in electrical engineering before computer science was a widely taught discipline, Dr. Spinrad built his own computer from discarded telephone switching equipment while he was a student at Columbia.

He said that while he was proud of his creation, at the time most people had no interest in the machines. “I may as well have been talking about the study of Kwakiutl Indians, for all my friends knew,” he told a reporter for The New York Times in 1983.

At Brookhaven he would design a room-size, tube-based computer he named Merlin, as part of an early generation of computer systems used to automate scientific experimentation. He referred to the machine, which was built before transistors were widely used in computers, as “the last of the dinosaurs.”

After arriving at Brookhaven, Dr. Spinrad spent a summer at Los Alamos National Laboratory, where he learned about scientific computer design by studying an early machine known as Maniac, designed by Nicholas Metropolis, a physicist. Dr. Spinrad’s group at Brookhaven developed techniques for using computers to run experiments and to analyze and display data as well as to control experiments interactively in response to earlier measurements.

Later, while serving as the head of the Computer Systems Group at Brookhaven, Dr. Spinrad wrote a cover article on laboratory automation for the Oct. 6, 1967, issue of Science magazine.

“He was really the father of modern laboratory automation,” said Joel Birnbaum, a physicist who designed computers at both I.B.M. and Hewlett-Packard. “He had a lot of great ideas about how you connected computers to instruments. He realized that it wasn’t enough to just build a loop between the computer and the apparatus, but that the most important piece of the apparatus was the scientist.”

After leaving Brookhaven, Dr. Spinrad joined Scientific Data Systems in Los Angeles as a computer designer and manager. When the company was bought by the Xerox Corporation in an effort to compete with I.B.M., he participated in Xerox’s decision to put a research laboratory next to the campus of Stanford.

Xerox’s Palo Alto Research Center pioneered the technology that led directly to the modern personal computer and office data networks.

Taking over as director of the laboratory in 1978, Dr. Spinrad oversaw a period when the laboratory’s technology was commercialized, including the first modern personal computer, the Ethernet local area network and the laser printer.

However, as a copier company, Xerox was never a comfortable fit for the emerging computing world, and many of the laboratory researchers left Xerox, often taking their innovations with them.

At the center, Dr. Spinrad became adept at bridging the cultural gulf between Xerox’s button-down East Coast corporate culture and the lab’s unruly and innovative West Coast researchers.

Robert Spinrad was born in Manhattan on March 20, 1932. He received an undergraduate degree in electrical engineering from Columbia and a Ph.D. from the Massachusetts Institute of Technology.

In addition to his wife, Verna, he is survived by two children, Paul, of San Francisco, and Susan Spinrad Esterly, of Palo Alto, and three grandchildren.

Flying between Norwalk, Conn., and Palo Alto frequently, Dr. Spinrad once recalled how he felt like Superman in reverse because he would invariably step into the airplane’s lavatory to change into a suit for his visit to the company headquarters.

    Robert Spinrad, a Pioneer in Computing, Dies at 77, NYT, 7.9.2009,        
    http://www.nytimes.com/2009/09/07/technology/07spinrad.html

 

 

 

 

 

Op-Ed Contributor

Science Is in the Details

 

July 27, 2009
The New York Times
By SAM HARRIS

 

PRESIDENT OBAMA has nominated Francis Collins to be the next director of the National Institutes of Health. It would seem a brilliant choice. Dr. Collins’s credentials are impeccable: he is a physical chemist, a medical geneticist and the former head of the Human Genome Project. He is also, by his own account, living proof that there is no conflict between science and religion. In 2006, he published “The Language of God,” in which he claimed to demonstrate “a consistent and profoundly satisfying harmony” between 21st-century science and evangelical Christianity.

Dr. Collins is regularly praised by secular scientists for what he is not: he is not a “young earth creationist,” nor is he a proponent of “intelligent design.” Given the state of the evidence for evolution, these are both very good things for a scientist not to be.

But as director of the institutes, Dr. Collins will have more responsibility for biomedical and health-related research than any person on earth, controlling an annual budget of more than $30 billion. He will also be one of the foremost representatives of science in the United States. For this reason, it is important that we understand Dr. Collins and his faith as they relate to scientific inquiry.

What follows are a series of slides, presented in order, from a lecture on science and belief that Dr. Collins gave at the University of California, Berkeley, in 2008:

Slide 1: “Almighty God, who is not limited in space or time, created a universe 13.7 billion years ago with its parameters precisely tuned to allow the development of complexity over long periods of time.”

Slide 2: “God’s plan included the mechanism of evolution to create the marvelous diversity of living things on our planet. Most especially, that creative plan included human beings.”

Slide 3: “After evolution had prepared a sufficiently advanced ‘house’ (the human brain), God gifted humanity with the knowledge of good and evil (the moral law), with free will, and with an immortal soul.”

Slide 4: “We humans used our free will to break the moral law, leading to our estrangement from God. For Christians, Jesus is the solution to that estrangement.”

Slide 5: “If the moral law is just a side effect of evolution, then there is no such thing as good or evil. It’s all an illusion. We’ve been hoodwinked. Are any of us, especially the strong atheists, really prepared to live our lives within that worldview?”

Why should Dr. Collins’s beliefs be of concern?

There is an epidemic of scientific ignorance in the United States. This isn’t surprising, as very few scientific truths are self-evident, and many are counterintuitive. It is by no means obvious that empty space has structure or that we share a common ancestor with both the housefly and the banana. It can be difficult to think like a scientist. But few things make thinking like a scientist more difficult than religion.

Dr. Collins has written that science makes belief in God “intensely plausible” — the Big Bang, the fine-tuning of nature’s constants, the emergence of complex life, the effectiveness of mathematics, all suggest the existence of a “loving, logical and consistent” God.

But when challenged with alternative accounts of these phenomena — or with evidence that suggests that God might be unloving, illogical, inconsistent or, indeed, absent — Dr. Collins will say that God stands outside of Nature, and thus science cannot address the question of his existence at all.

Similarly, Dr. Collins insists that our moral intuitions attest to God’s existence, to his perfectly moral character and to his desire to have fellowship with every member of our species. But when our moral intuitions recoil at the casual destruction of innocents by, say, a tidal wave or earthquake, Dr. Collins assures us that our time-bound notions of good and evil can’t be trusted and that God’s will is a mystery.

Most scientists who study the human mind are convinced that minds are the products of brains, and brains are the products of evolution. Dr. Collins takes a different approach: he insists that at some moment in the development of our species God inserted crucial components — including an immortal soul, free will, the moral law, spiritual hunger, genuine altruism, etc.

As someone who believes that our understanding of human nature can be derived from neuroscience, psychology, cognitive science and behavioral economics, among others, I am troubled by Dr. Collins’s line of thinking. I also believe it would seriously undercut fields like neuroscience and our growing understanding of the human mind. If we must look to religion to explain our moral sense, what should we make of the deficits of moral reasoning associated with conditions like frontal lobe syndrome and psychopathy? Are these disorders best addressed by theology?

Dr. Collins has written that “science offers no answers to the most pressing questions of human existence” and that “the claims of atheistic materialism must be steadfastly resisted.”

One can only hope that these convictions will not affect his judgment at the institutes of health. After all, understanding human well-being at the level of the brain might very well offer some “answers to the most pressing questions of human existence” — questions like, Why do we suffer? Or, indeed, is it possible to love one’s neighbor as oneself? And wouldn’t any effort to explain human nature without reference to a soul, and to explain morality without reference to God, necessarily constitute “atheistic materialism”?

Francis Collins is an accomplished scientist and a man who is sincere in his beliefs. And that is precisely what makes me so uncomfortable about his nomination. Must we really entrust the future of biomedical research in the United States to a man who sincerely believes that a scientific understanding of human nature is impossible?

 

Sam Harris is the author of “The End of Faith”

and co-founder of the Reason Project,

which promotes scientific knowledge and secular values.

    Science Is in the Details, NYT, 27.7.2009,
    http://www.nytimes.com/2009/07/27/opinion/27harris.html

 

 

 

 

 

Arthur R. Kantrowitz,

Whose Wide-Ranging Research

Had Many Applications,

Is Dead at 95

 

December 9, 2008
The New York Times
By DENNIS OVERBYE

 

Arthur R. Kantrowitz, a physicist and engineer whose research on the behavior of superhot gases and fluid dynamics led to nose cones for rockets, heart-assist pumps and the idea of nuclear fusion in magnetic bottles, among many other things, died in Manhattan on Nov. 29. He was 95.

His death was announced by his family.

In a career that was often far ahead of its time, Dr. Kantrowitz ranged from aviation and space to medicine and public policy, where he championed the formation of a “science court” for resolving controversies. He founded and directed the Avco Everett Research Laboratory in Massachusetts, taught at Cornell and Dartmouth, and served on the Advisory Group on Anticipated Advances in Science and Technology in the Ford administration and on the board of the television program “Nova.”

It was at a Thanksgiving Day party in 1954 that Dr. Kantrowitz, then a Cornell professor, made his connection with the space program. At the party was Victor Emanuel, the chairman of the Avco Corporation, an aerospace company, who mentioned the problems missile engineers were having developing ballistic missiles that could survive re-entry into the atmosphere at 18,000 miles an hour, when friction can create temperatures of thousands of degrees. Such conditions could not be duplicated in wind tunnels, and it would probably require years of expensive flight tests to solve the problem, Mr. Emanuel complained.

Dr. Kantrowitz replied that he could do it in six months in a laboratory. Avco promptly offered to build him one, in the Boston suburbs. Using a so-called shock tube that would release a pulse of gas through thinned air, creating shock waves and temperatures of up to a million degrees or more, Dr. Kantrowitz and his colleagues simulated re-entry conditions and quickly determined that the best approach would be to coat the missile’s nose cone with a skin made of a material that would slowly burn away.

In April 1958, a charred nose cone was retrieved from the Atlantic Ocean after an intercontinental flight on a Thor-Able rocket and 12,000-degree re-entry with just the amount of ablation, or burn-away, that Dr. Kantrowitz had calculated.

“The recovery of that nose cone,” Dr. Kantrowitz later said, “gave the first really solid authentication of the shock tube work that had been done several years before.”

The nose cone went into the Smithsonian, Avco set up a factory to produce nose cones for missiles and Dr. Kantrowitz became one of the first technological heroes of the space program.

Arthur Robert Kantrowitz was born in 1913 in the Bronx, the eldest of four children, three brothers and a sister. His father, Bernard, was a doctor, and his mother, Rose, designed costumes for the Ziegfeld Follies. At age 11, he was kicked out of the private Ethical Culture School in Manhattan for showing no promise.

He graduated from DeWitt Clinton High School in the Bronx and entered Columbia, determined to study science. “I knew I wanted to be a physicist even before I knew the word,” he recalled in the 1961 book “Men of Space, Volume 3.”

As a boy, he collaborated with his younger brother Adrian, later America’s first heart transplant surgeon, to fashion an electrocardiogram device out of spare radio parts. Adrian Kantrowitz died on Nov. 14.

After receiving bachelor’s and master’s degrees in physics from Columbia in 1936, he went to work for the National Advisory Committee for Aeronautics, or NACA, the precursor to NASA, at Langley Field in Virginia. It was there, in 1938, that he and Eastman N. Jacobs, his boss, did an experiment that might have changed the world, had they succeeded.

The idea was to harness the energy source that powers the sun, the thermonuclear fusion of hydrogen into helium, by heating hydrogen with radio waves while squeezing the gas with a magnetic field. At the time, nobody had ever tried to produce a fusion reaction; the Manhattan Project and other attempts to create nuclear fission were still in their infancy.

Knowing that their superiors would disapprove of anything as outlandish as atomic energy, they labeled their machine the Diffusion Inhibitor, and worked on it only at night. The experiment failed, and before the experimenters could figure out why, their director found out about the project and canceled it. Physicists unaware of the Langley experiment later reinvented the idea of thermonuclear fusion in a magnetic bottle, and they are still trying to make it work.

“It was a heartbreaking experience,” Dr. Kantrowitz recalled. “I had just built a whole future around this; I wanted to make it a career.”

He turned his attention to completing a paper about gas dynamics under the tutelage of Edward Teller, for which Columbia awarded him a Ph.D. in 1947. By then he was already teaching at Cornell.

In 1943, he married Rosalind Joseph, a biochemist. That marriage ended in divorce, and she died in 2005. Dr. Kantrowitz is survived by his second wife, Lee Stuart of Hanover, N.H.; three daughters, Barbara Kantrowitz of London, Lore Kantrowitz of Lexington, Mass., and Andrea Kantrowitz of Pelham, N.Y.; and six grandchildren.

In 1956, Dr. Kantrowitz left Cornell to work full time at the Avco Everett laboratory, where he used the shock tube to explore the properties of the hot electrified gases known as plasmas. In 1959, he and his team confirmed a conjecture by the British cosmologist Thomas Gold that disturbances on the sun could send shock waves through the solar system at millions of miles an hour, creating blasts of charged particles and magnetic storms.

The team also explored ways to generate electricity from jets of hot gas and developed high-powered lasers that, Dr. Kantrowitz suggested, could be used one day to propel spacecraft away from the Earth.

He also revisited his childhood collaborations with his brother Adrian. In the 1960s, he and a group of other scientists and surgeons developed the intra-aorta balloon pump. Inserted into a femoral artery, the device expands and contracts to help the heart move blood. It has been used on three million people, including Dr. Kantrowitz himself after he suffered a heart attack on Nov. 28.

Dr. Kantrowitz retired from Avco in 1978 and joined the Dartmouth faculty. In his later years he spoke out about the need for a “science court,” in which the scientific side of controversies like the use of pesticides and nuclear reactors could be adjudicated by experts. In 1975, he was chairman of a presidential task force looking into the idea.

Dr. Kantrowitz never lost his faith in science and in humanity’s ability to solve its problems.

Writing in this newspaper in 1971, he said: “I submit that a space program directed toward exhibiting that there are no visible limits to man’s future in the universe could be a most important help in reviving faith in the hope of progress. I can imagine nothing more relevant to our current problems.”

    Arthur R. Kantrowitz,
    Whose Wide-Ranging Research Had Many Applications, Is Dead at 95,
    NYT, 9.12.2008,
    http://www.nytimes.com/2008/12/09/science/09kantrowitz.html

 

 

 

 

 

1 American, 2 Japanese

Share Nobel Physics Prize

 

October 8, 2008
The New York Times
By DENNIS OVERBYE

 

An American and two Japanese physicists on Tuesday won the Nobel Prize in Physics for their work exploring the hidden symmetries between elementary particles that are the deepest constituents of nature.

Yoichiro Nambu, of the University of Chicago’s Enrico Fermi Institute, will receive half of the 10 million kroner prize (about $1.3 million) awarded by the Royal Swedish Academy of Sciences.

Makoto Kobayashi, of the High Energy Accelerator Research Organization (KEK) Tsukuba, Japan, and Toshihide Maskawa, of the Yukawa Institute for Theoretical Physics (YITP), Kyoto University, will each receive a quarter of the prize.

Ever since Galileo, physicists have been guided in their quest for the ultimate laws of nature by the search for symmetries, or properties of nature that appear the same under different circumstances.

However, in the 1960s, Dr. Nambu, who was born in Tokyo in 1921, suggested that some symmetries in the laws of nature might be hidden or “broken” in actual practice.

A pencil standing on its end, for example, is symmetrical but unstable and will wind up on the table pointing in only one direction or the other. The principle is now embedded in all of modern particle physics.
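
A standard textbook illustration of the idea (my own addition, not a formula from the laureates’ papers) is a potential that is perfectly symmetric about zero yet has no stable state there:

    V(\phi) = -\mu^2 \phi^2 + \lambda \phi^4 , \qquad \phi_{\text{min}} = \pm \frac{\mu}{\sqrt{2\lambda}}

The potential is unchanged if φ is replaced by -φ, but the system must settle into one of the two minima, just as the standing pencil must fall in some particular direction; the symmetry of the law is hidden in the outcome.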

“You have to look for symmetries even when you can’t see them,” explained Michael Turner of the University of Chicago, who described his colleague as “the most humble man of all time.”

In 1972, Dr. Kobayashi and Dr. Maskawa, extending earlier work by the Italian physicist Nicola Cabibbo, showed that if there were three generations of the elementary particles called quarks, the constituents of protons and neutrons, this principle of symmetry breaking would explain a puzzling asymmetry known as CP violation. CP violation had been discovered in 1964 by the American physicists James Cronin and Val Fitch, a discovery that also won a Nobel Prize.

C and P stand respectively for charge and parity, or “handedness.” Until then, physicists had naively assumed that if you exchanged positive for negative and left-handed for right-handed in the equations of elementary particles, you would get the same answer.

The fact that nature operates otherwise, physicists hope, is a step on the way to explaining why the universe is made of matter and not antimatter, one of the questions that the Large Hadron Collider, the new particle accelerator now preparing for operation, is designed to explore.

    1 American, 2 Japanese Share Nobel Physics Prize, NYT, 8.10.2008,
    http://www.nytimes.com/2008/10/08/science/08nobel.html

 

 

 

 

 

John A. Wheeler, Physicist

Who Coined the Term ‘Black Hole,’

Is Dead at 96

 

April 14, 2008
The New York Times
By DENNIS OVERBYE

 

John A. Wheeler, a visionary physicist and teacher who helped invent the theory of nuclear fission, gave black holes their name and argued about the nature of reality with Albert Einstein and Niels Bohr, died Sunday morning at his home in Hightstown, N.J. He was 96.

The cause was pneumonia, said his daughter Alison Wheeler Lahnston.

Dr. Wheeler was a young, impressionable professor in 1939 when Bohr, the Danish physicist and his mentor, arrived in the United States aboard a ship from Denmark and confided to him that German scientists had succeeded in splitting uranium atoms. Within a few weeks, he and Bohr had sketched out a theory of how nuclear fission worked. Bohr had intended to spend the time arguing with Einstein about quantum theory, but “he spent more time talking to me than to Einstein,” Dr. Wheeler later recalled.

As a professor at Princeton and then at the University of Texas in Austin, Dr. Wheeler set the agenda for generations of theoretical physicists, using metaphor as effectively as calculus to capture the imaginations of his students and colleagues and to pose questions that would send them, minds blazing, to the barricades to confront nature.

Max Tegmark, a cosmologist at the Massachusetts Institute of Technology, said of Dr. Wheeler, “For me, he was the last Titan, the only physics superhero still standing.”

Under his leadership, Princeton became the leading American center of research into Einsteinian gravity, known as the general theory of relativity — a field that had been moribund because of its remoteness from laboratory experiment.

“He rejuvenated general relativity; he made it an experimental subject and took it away from the mathematicians,” said Freeman Dyson, a theorist at the Institute for Advanced Study across town in Princeton.

Among Dr. Wheeler’s students was Richard Feynman of the California Institute of Technology, who parlayed a crazy-sounding suggestion by Dr. Wheeler into work that led to a Nobel Prize. Another was Hugh Everett, whose Ph.D. thesis under Dr. Wheeler on quantum mechanics envisioned parallel alternate universes endlessly branching and splitting apart — a notion that Dr. Wheeler called “Many Worlds” and which has become a favorite of many cosmologists as well as science fiction writers.

Recalling his student days, Dr. Feynman once said, “Some people think Wheeler’s gotten crazy in his later years, but he’s always been crazy.”

John Archibald Wheeler — he was Johnny Wheeler to friends and fellow scientists — was born on July 9, 1911, in Jacksonville, Fla. The oldest child in a family of librarians, he earned his Ph.D. in physics from Johns Hopkins University at 21. A year later, after becoming engaged to an old acquaintance, Janette Hegner, after only three dates, he sailed to Copenhagen to work with Bohr, the godfather of the quantum revolution, which had shaken modern science with paradoxical statements about the nature of reality.

“You can talk about people like Buddha, Jesus, Moses, Confucius, but the thing that convinced me that such people existed were the conversations with Bohr,” Dr. Wheeler said.

Their relationship was renewed when Bohr arrived in 1939 with the ominous news of nuclear fission. In the model he and Dr. Wheeler developed to explain it, the atomic nucleus, containing protons and neutrons, is like a drop of liquid. When a neutron emitted from another disintegrating nucleus hits it, this “liquid drop” starts vibrating and elongates into a peanut shape that eventually snaps in two.

Two years later, Dr. Wheeler was swept up in the Manhattan Project to build an atomic bomb. To his lasting regret, the bomb was not ready in time to change the course of the war in Europe and possibly save his brother Joe, who died in combat in Italy in 1944.

Dr. Wheeler continued to do government work after the war, interrupting his research to help develop the hydrogen bomb, promote the building of fallout shelters and support the Vietnam War and missile defense, even as his views ran counter to those of his more liberal colleagues.

Dr. Wheeler was once officially reprimanded by President Dwight D. Eisenhower for losing a classified document on a train, but he also received the Atomic Energy Commission’s Enrico Fermi Award from President Lyndon B. Johnson in 1968.

When Dr. Wheeler received permission in 1952 to teach a course on Einsteinian gravity, it was not considered an acceptable field to study. But in promoting general relativity, he helped transform the subject in the 1960s, at a time when Dennis Sciama, at Cambridge University in England, and Yakov Borisovich Zeldovich, at Moscow State University, founded groups that spawned a new generation of gravitational theorists and cosmologists.

One particular aspect of Einstein’s theory got Dr. Wheeler’s attention. In 1939, J. Robert Oppenheimer, who would later head the Manhattan Project, and a student, Hartland Snyder, suggested that Einstein’s equations had made an apocalyptic prediction. A dead star of sufficient mass could collapse into a heap so dense that light could not even escape from it. The star would collapse forever while spacetime wrapped around it like a dark cloak. At the center, space would be infinitely curved and matter infinitely dense, an apparent absurdity known as a singularity.

Dr. Wheeler at first resisted this conclusion, leading to a confrontation with Dr. Oppenheimer at a conference in Belgium in 1958, in which Dr. Wheeler said that the collapse theory “does not give an acceptable answer” to the fate of matter in such a star. “He was trying to fight against the idea that the laws of physics could lead to a singularity,” Dr. Charles Misner, a professor at the University of Maryland and a former student, said. In short, how could physics lead to a violation of itself — to no physics?

Dr. Wheeler and others were finally brought around when David Finkelstein, now an emeritus professor at Georgia Tech, developed mathematical techniques that could treat both the inside and the outside of the collapsing star.

At a conference in New York in 1967, Dr. Wheeler, seizing on a suggestion shouted from the audience, hit on the name “black hole” to dramatize this dire possibility for a star and for physics.

The black hole “teaches us that space can be crumpled like a piece of paper into an infinitesimal dot, that time can be extinguished like a blown-out flame, and that the laws of physics that we regard as ‘sacred,’ as immutable, are anything but,” he wrote in his 1999 autobiography, “Geons, Black Holes & Quantum Foam: A Life in Physics.” (Its co-author is Kenneth Ford, a former student and a retired director of the American Institute of Physics.)

In 1973, Dr. Wheeler and two former students, Dr. Misner and Kip Thorne, of the California Institute of Technology, published “Gravitation,” a 1,279-page book whose witty style and accessibility — it is chockablock with sidebars and personality sketches of physicists — belies its heft and weighty subject. It has never been out of print.

In the summers, Dr. Wheeler would retire with his extended family to a compound on High Island, Me., to indulge his taste for fireworks by shooting beer cans out of an old cannon.

He and Janette were married in 1935. She died in October 2007 at 99. Dr. Wheeler is survived by their three children, Ms. Lahnston and Letitia Wheeler Ufford, both of Princeton; James English Wheeler of Ardmore, Pa.; 8 grandchildren, 16 great-grandchildren, 6 step-grandchildren and 11 step-great-grandchildren.

In 1976, faced with mandatory retirement at Princeton, Dr. Wheeler moved to the University of Texas.

At the same time, he returned to the questions that had animated Einstein and Bohr, about the nature of reality as revealed by the strange laws of quantum mechanics. The cornerstone of that revolution was the uncertainty principle, propounded by Werner Heisenberg in 1927, which seemed to put fundamental limits on what could be known about nature, declaring, for example, that it was impossible, even in theory, to know both the velocity and the position of a subatomic particle. Knowing one destroyed the ability to measure the other. As a result, until observed, subatomic particles and events existed in a sort of cloud of possibility that Dr. Wheeler sometimes referred to as “a smoky dragon.”
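
In symbols, the relation bounds the product of the two uncertainties (a standard statement of the principle, not a formula quoted from Dr. Wheeler):

    \Delta x \, \Delta p \ \ge \ \frac{\hbar}{2}

The more sharply the position is pinned down (small Δx), the larger the spread Δp in momentum, and hence in velocity, must become.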

This kind of thinking frustrated Einstein, who once asked Dr. Wheeler if the Moon was still there when nobody looked at it.

But Dr. Wheeler wondered if this quantum uncertainty somehow applied to the universe and its whole history, whether it was the key to understanding why anything exists at all.

“We are no longer satisfied with insights only into particles, or fields of force, or geometry, or even space and time,” Dr. Wheeler wrote in 1981. “Today we demand of physics some understanding of existence itself.”

At a 90th birthday celebration in 2003, Dr. Dyson said that Dr. Wheeler was part prosaic calculator, a “master craftsman,” who decoded nuclear fission, and part poet. “The poetic Wheeler is a prophet,” he said, “standing like Moses on the top of Mount Pisgah, looking out over the promised land that his people will one day inherit.”

Wojciech Zurek, a quantum theorist at Los Alamos National Laboratory, said that Dr. Wheeler’s most durable influence might be the students he had “brought up.” He wrote in an e-mail message, “I know I was transformed as a scientist by him — not just by listening to him in the classroom, or by his physics idea: I think even more important was his confidence in me.”

Dr. Wheeler described his own view of his role to an interviewer 25 years ago.

“If there’s one thing in physics I feel more responsible for than any other, it’s this perception of how everything fits together,” he said. “I like to think of myself as having a sense of judgment. I’m willing to go anywhere, talk to anybody, ask any question that will make headway.

“I confess to being an optimist about things, especially about someday being able to understand how things are put together. So many young people are forced to specialize in one line or another that a young person can’t afford to try and cover this waterfront — only an old fogy who can afford to make a fool of himself.

“If I don’t, who will?”

    John A. Wheeler,
    Physicist Who Coined the Term ‘Black Hole,’ Is Dead at 96,
    NYT, 14.4.2008,
    http://www.nytimes.com/2008/04/14/science/14wheeler.html

 

 

 

 

 

James Watson

Retires After Racial Remarks

 

October 25, 2007
The New York Times
By CORNELIA DEAN

 

James D. Watson, the eminent biologist who ignited an uproar last week with remarks about the intelligence of people of African descent, retired today as chancellor of the Cold Spring Harbor Laboratory on Long Island and from its board.

In a statement, he noted that, at 79, he is “overdue” to surrender leadership positions at the lab, which he joined as director in 1968 and served as president until 2003. But he said the circumstances of his resignation “are not those which I could ever have anticipated or desired.”

Dr. Watson, who shared the 1962 Nobel Prize for describing the double-helix structure of DNA, and later headed the American government’s part in the international Human Genome Project, was quoted in The Times of London last week as suggesting that, overall, people of African descent are not as intelligent as people of European descent. In the ensuing uproar, he issued a statement apologizing “unreservedly” for the comments, adding “there is no scientific basis for such a belief.”

But Dr. Watson, who has a reputation for making sometimes incendiary off-the-cuff remarks, did not say he had been misquoted.

Within days, the Cold Spring board had relieved him of the administrative responsibilities of the chancellor’s job. In that position, a spokesman for the laboratory said, he was most involved with educational efforts and fund-raising.

In the years after he left Harvard to direct the laboratory, Dr. Watson transformed it from a small facility into a world-class institution prominent in research on cancer, plant biology, neuroscience and computational biology, the board said in announcing his retirement. Bruce Stillman, who succeeded him as president, said today that he had created an “unparalleled” research environment at the laboratory.

In his statement, Dr. Watson said the work of the Human Genome Project, an international effort which deciphered the chemical contents of human genes, had opened the door to work on many diseases, particularly illnesses such as schizophrenia and bipolar disorder, ailments he said have afflicted members of his family.

He also referred to his Scots and Irish forebears, saying their lives were guided by faith in reason and social justice, “especially the need for those on top to help care for the less fortunate.”

    James Watson Retires After Racial Remarks, NYT, 25.10.2007,
    http://www.nytimes.com/2007/10/25/science/25cnd-watson.html

 

 

 

 

 

Text

Statement by James D. Watson

 

October 25, 2007
The New York Times
 

Here is the text of the statement issued by Dr. James D. Watson announcing his retirement,
which was transmitted to The New York Times in an e-mail message.
 


This morning I have conveyed to the Trustees of the Cold Spring Harbor Laboratory my desire to retire immediately from my position as its Chancellor, as well as from my position on its Board, on which I have served for the past 43 years. Closer now to 80 than 79, the passing on of my remaining vestiges of leadership is more than overdue. The circumstances in which this transfer is occurring, however, are not those which I could ever have anticipated or desired.

That the Cold Spring Harbor Laboratory is now one of the world’s premier sites for biological research and education has long warmed my heart. So I am grateful that its Board now will allow me to remain along my beloved Bungtown Road. Forty-nine years ago, as a newly appointed young Assistant Professor at Harvard, I gave my first course on this pernicious collection of diseases of uncontrolled cell growth and division. Cancer, then an intellectual black box, now, in part because of research at the Laboratory, is almost full lit. Though important facts remain undiscovered, there is no reason why they should not soon be found. Final victory is within our grasp. Strong in spirit and intensely focused, I wish to be among those at the victory line.

The ever quickening advances of science made possible by the success of the Human Genome Project will also soon let us see the essences of mental disease. Only after we understand them at the genetic level can we rationally seek out appropriate therapies for such illnesses as schizophrenia and bipolar disease. For the children of my sister and me, this moment can not come a moment too soon. Hell does not come close to describing the impact of psychotic disorders on human life.

This week’s events focus me ever more intensely on the moral values passed on to me by my father, whose Watson surname marks his long ago Scots-Irish Appalachian heritage; and by my mother, whose father, Lauchlin Mitchell, came from Glasgow and whose mother, Lizzie Gleason, had parents from Tipperary. To my great advantage, their lives were guided by a faith in reason; an honest application of its messages; and for social justice, especially the need for those on top to help care for the less fortunate. As an educator, I have always striven to see that the fruits of the American Dream are available to all.

I have been much blessed.

James D. Watson

One Bungtown Road

Cold Spring Harbor, New York

October 2007

    Statement by James D. Watson, NYT, 25.10.2007,
    http://www.nytimes.com/2007/10/25/science/26wattext.html

 

 

 

 

 

Physicist, Activist Panofsky Dies at 88

 

September 29, 2007
By THE ASSOCIATED PRESS
Filed at 12:12 a.m. ET
The New York Times

 

SAN FRANCISCO (AP) -- Wolfgang K.H. Panofsky, a physicist who was a consultant on the Manhattan Project and devoted much of his life to promoting nuclear arms control, has died. He was 88.

Panofsky, affectionately known as "Pief," had a heart attack Monday in Los Altos, Stanford University officials said.

"He was a towering figure, admired greatly not only as a great scientist and a leader, but as a man deeply committed to principle, and to the issue of arms control," said Sidney Drell, a professor who works at the Stanford Linear Accelerator Center, a lab Panofsky was instrumental in developing.

He won the National Medal of Science in 1969 and the U.S. Department of Energy's Enrico Fermi Award in 1979, and he played a key role in shaping the government's science and nuclear policies.

"He was moved by a deep ethical concern," Drell said. "He had worked on the atom bomb project, and understood what would be the effect of a nuclear war. He was profoundly committed to enhancing prospects of peace."

Panofsky was born in Berlin on April 24, 1919, the son of prominent art historian Erwin Panofsky. He showed scientific promise at an early age. After his father, fearing for his Jewish family's well-being in mid-1930's Germany, accepted a teaching post in the United States, Panofsky attended Princeton University. He enrolled at 15 and after graduation pursued a doctorate at the California Institute of Technology.

He received his doctorate in 1942 and married Adele Irene DuMond, the daughter of the physicist in whose lab he had worked as a graduate student. At first classified as an "enemy alien" under California's wartime enemy exclusion law, he soon became a naturalized citizen and worked from 1943 to 1945 as a consultant on the Manhattan Project, which produced the atomic bomb.

He showed his leadership skills in his first professorship at the University of California, Berkeley. Particle physics was a nascent field, and Panofsky was exploring some of the earliest particle accelerators.

In 1951, he resigned in protest when the university's Board of Regents required faculty members to sign an oath of loyalty. He joined Stanford, where he took a key role in building a larger, higher-energy accelerator.

His work ultimately led to congressional authorization of the 2-mile-long electron linear accelerator at the heart of the Stanford Linear Accelerator Center, in 1961. Three Nobel prizes emerged from research promoted by Panofsky over the more than two decades he was lab director.

He was a tireless adviser to presidents Eisenhower through Carter and important figures in their administrations on issues of international security. Notable achievements included helping to secure the Treaty Banning Nuclear Weapon Tests in the Atmosphere, in Outer Space and Under Water, in 1963, and the signing of the Treaty on the Limitation of Anti-Ballistic Missile Systems, in 1972.

He also helped found the Center for International Security and Arms Control in his later years at Stanford.

Panofsky also pushed for scientific openness and exchange with researchers in the former Soviet Union and in China.

    Physicist, Activist Panofsky Dies at 88, NYT, 29.9.2007,
    http://www.nytimes.com/aponline/us/AP-Obit-Panofsky.html

 

 

 

 

 

John W. Gofman, 88,

Scientist and Advocate

for Nuclear Safety,

Dies

 

August 26, 2007
The New York Times
By JEREMY PEARCE

 

Dr. John W. Gofman, a nuclear chemist and doctor who in the 1960s heightened public concerns about exposure to low-level radiation and became a leading voice against commercial nuclear power, died on Aug. 15 at his home in San Francisco. He was 88.

The cause was heart failure, his family said.

In 1964, while he was director of the biomedical research division at Lawrence Livermore National Laboratory in California, Dr. Gofman helped start a national inquiry into the safety of atomic power. At a symposium for nuclear scientists and engineers, he raised questions about a lack of data on low-level radiation and also proposed a wide-ranging study of exposure in medicine and the workplace, from fallout and other sources.

With a colleague at Livermore, Dr. Arthur R. Tamplin, Dr. Gofman then looked at health studies of the survivors of Hiroshima and Nagasaki, as well as other epidemiological studies, and conducted his own research on radiation’s influences on human chromosomes. In 1969, the two scientists suggested that federal safety guidelines for low-level exposures be reduced by 90 percent.

The findings were contested by the Atomic Energy Commission, and the furor made Dr. Gofman a reluctant figurehead of the antinuclear movement. In 1970, he testified in favor of a legislative bill to ban commercial nuclear reactors in New York City and told the City Council that a reactor in urban environs would be “equal in the opposite direction to all the medical advances put together in the last 25 years.”

Both he and Dr. Tamplin left Livermore in the 1970s, and Dr. Gofman went on to become an expert witness in radiation-exposure lawsuits and help found an advocacy group, the Committee for Nuclear Responsibility, based in San Francisco. In an unsuccessful project, he and others called for a five-year federal moratorium on new nuclear power stations, citing problems in the safe storage of radioactive waste. Yet, for all his efforts as a nuclear gadfly, he did not oppose the building of nuclear missiles.

“Because we live in a dangerous world,” he said in 1993, “I think the only thing you have is the deterrence value” of such weaponry.

Dr. Gofman’s appearance in the nuclear debate surprised some colleagues, since a thrust of his earlier research had been in cardiology. In the late 1940s and ’50s, he and his collaborators investigated the body’s lipoproteins, which contain both proteins and fats, and their circulation within the bloodstream. The researchers described low-density and high-density lipoproteins and their roles in metabolic disorders and coronary disease.

In his earliest work, while still a graduate student at the University of California, Berkeley, Dr. Gofman studied nuclear isotopes and helped to describe several discoveries, including protactinium-232, uranium-232, protactinium-233 and uranium-233. He also helped to work out the fissionability of uranium-233.

John William Gofman was born in Cleveland. He graduated from Oberlin College, and received a doctorate in nuclear and physical chemistry from Berkeley in 1943. Dr. Gofman went on to earn a medical degree from the University of California, San Francisco, in 1946.

He joined Berkeley in 1947 and retired as professor emeritus of molecular and cell biology in 1973.

With Egan O’Connor, he wrote a book, “X-Rays: Health Effects of Common Exams” (1986). He also wrote “Radiation-Induced Cancer from Low-Dose Exposure: An Independent Analysis” (1990).

Dr. Gofman’s wife, Dr. Helen Fahl Gofman, a pediatrician, died in 2004.

He is survived by a son, Dr. John D. Gofman, an ophthalmologist, of Bellevue, Wash.

    John W. Gofman, 88, Scientist and Advocate for Nuclear Safety, Dies,
    NYT, 26.8.2007,
    http://www.nytimes.com/2007/08/26/us/26gofman.html

 

 

 

 

 

MIT Team Powers

Light Bulb Without Wires

 

June 8, 2007
By THE ASSOCIATED PRESS
Filed at 12:16 a.m. ET
The New York Times

 

CAMBRIDGE, Mass. (AP) -- In a perfect world, there'd be no wires. They clutter the view, get tangled behind desks and limit how far networks can reach. That's why the telegraph gave way to the radio. Cell phones unstrung telecommunications. Wi-Fi liberated computer data.

Now even the last knotty wire that seemed destined to remain -- the power cord -- could be on its way out.

Massachusetts Institute of Technology researchers announced Thursday they had made a 60-watt light bulb glow by sending it energy wirelessly, potentially previewing a future in which cell phones and other gadgets get juice without having to be plugged in.

The breakthrough, disclosed in Science Express, an online publication of the journal Science, is being called "WiTricity" by the scientists.

The concept of sending power wirelessly isn't new, but its wide-scale use has been dismissed as inefficient because electromagnetic energy generated by the charging device would radiate in all directions.

Last fall, though, MIT physics professor Marin Soljacic (pronounced soul-ya-CHEECH) explained how to do the power transfer with specially tuned waves. The key is to get the charging device and a gadget to resonate at the same frequency -- allowing them to efficiently exchange energy.

It's similar to how an opera star can break a wine glass that happens to resonate at the same frequency as her voice. In fact, the concept is so basic in physics that inventor Nikola Tesla sought a century ago to build a huge tower on Long Island that would wirelessly beam power along with communications.
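
The tuning condition the article describes is the familiar resonance of a coil-and-capacitor circuit; a minimal sketch with assumed component values (not the MIT coils' actual parameters):

    import math

    # Resonant frequency of an LC circuit: f = 1 / (2 * pi * sqrt(L * C)).
    def resonant_frequency_hz(inductance_henries, capacitance_farads):
        return 1.0 / (2.0 * math.pi * math.sqrt(inductance_henries * capacitance_farads))

    L = 25e-6    # 25 microhenries (assumed value)
    C = 10e-12   # 10 picofarads (assumed value)

    f = resonant_frequency_hz(L, C)
    print(round(f / 1e6, 1), "MHz")   # about 10 MHz; efficient transfer needs
                                      # both coils tuned to the same frequency.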

The new step described in Science was that the MIT team put the concept into action. The scientists lit a 60-watt bulb that was 7 feet away from the power-generating appliance.

"It was quite exciting," Soljacic said. The process is "very reproducible," he added. "We can just go to the lab and do it whenever we want."

The development raises the prospect that we might eliminate some of the clutter of cables in our ever-more electronic world. Is that necessarily a good thing? Soljacic acknowledged "that it's far from obvious how crucial people will find this."

But at least one benefit could be that if devices can get their power through the air, they might not need batteries and their attendant toxic chemicals.

Before that can happen, the technology has a ways to go.

The MIT system is about 40 percent to 45 percent efficient -- meaning that most of the energy from the charging device doesn't make it to the light bulb. Soljacic believes it needs to become twice as efficient to be on par with the old-fashioned way portable gadgets get their batteries charged.
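
Reading those figures at face value (an editorial gloss, not a claim from the Science paper): doubling the reported efficiency gives $2 \times 0.40 = 0.80$ to $2 \times 0.45 = 0.90$, that is, roughly 80 to 90 percent of the energy arriving at the device, which is what being "on par" with a plugged-in charger would require by the team's own estimate.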

Also, the copper coils that relay the power are almost 2 feet wide for now -- too big to be feasible for, say, laptops. And the 7-foot range of this wireless handoff could be increased -- presumably so that one charging device could automatically power all the gadgets in a room.

Soljacic believes all those improvements are within reach. The next step is to fire up more than just light bulbs, perhaps a Roomba robotic vacuum or a laptop.

The MIT team stresses that the "magnetic coupling" process involved in WiTricity is safe for humans and other living things. And in the initial experiments on the light bulb, nothing bad happened to the cell phones, electronic equipment and credit cards in the room -- though more research on that is needed.

The harmlessness apparently extends both ways: The researchers noted that putting people and other things between the coils -- even when they block the line of sight -- generally has no effect on the power transfer.

------

On the Net:

Soljacic's Web page: http://tinyurl.com/ytz2t3

    MIT Team Powers Light Bulb Without Wires, NYT, 8.6.2007,
    http://www.nytimes.com/aponline/technology/AP-Wireless-Power.html

 

 

 

 

 

Donald M. Ginsberg, 73,

Expert in the Working

of Superconductors, Is Dead

 

May 19, 2007
The New York Times
By JEREMY PEARCE

 

Donald M. Ginsberg, a physicist who became a leading expert on the production and functioning of superconductors, died on May 7 at his home in Urbana, Ill. He was 73.

The cause was melanoma, his family said.

In their various forms, superconductors are used in computers, electrical generators and medical equipment, among other applications. Dr. Ginsberg was widely recognized for his work on superconducting crystals, and in particular for improving the production of a useful metallic crystalline compound called YBCO.

Beginning in the 1980s, Dr. Ginsberg grew YBCO crystals in his laboratory at the University of Illinois and studied their ability to conduct electricity without resistance at temperatures that are high only by superconducting standards: the material superconducts when chilled below roughly 90 kelvins, a temperature reachable with liquid nitrogen. Although other scientists had developed the YBCO compound, Dr. Ginsberg established a process that allowed the growth of crystals of exceptional purity. He then distributed the resulting crystals to fellow researchers at Illinois and at institutions worldwide.

Michael Tinkham, emeritus professor of physics and applied physics at Harvard, said the results of such efforts had made Dr. Ginsberg the world’s leading expert on making high-temperature crystalline superconductors.

Dr. Ginsberg wrote about the subject in “Physical Properties of High Temperature Superconductors,” an influential five-volume series of reference books that he edited in the 1980s and ’90s for an audience of physicists, chemists and materials scientists.

In other work, Dr. Ginsberg studied superconductors made from films of lead, tin and mercury. He also wrote and published often humorous poetry on subjects related to physics and laboratory research.

Donald Maurice Ginsberg was born in Chicago. He earned his doctorate in physics from the University of California, Berkeley, in 1960.

He joined the faculty at Illinois in 1959, remained there for the rest of his career and was named an emeritus professor of physics in 1996.

Dr. Ginsberg is survived by his wife of 50 years, the former Joli Lasker; a daughter, Dana, of Columbus, Ohio; a son, Mark, of Urbana; and a brother, David, of Ann Arbor, Mich.

Donald M. Ginsberg, 73, Expert in the Working of Superconductors, Is Dead,
    NYT, 19.5.2007,
    http://www.nytimes.com/2007/05/19/science/19ginsberg.html

 

 

 

 

 

Scientists Unearth

Ft. Duquesne Remnants

 

May 16, 2007
By THE ASSOCIATED PRESS
Filed at 9:48 p.m. ET
The New York Times

 

PITTSBURGH (AP) -- About two weeks ago, archaeologist Tom Kutys thought he'd found a stone wall when he came across mortared capstones in a trench at the state park that once was the site of French and British forts. Instead, archaeologists at Point State Park believe they very well might have uncovered long-buried remnants of Fort Duquesne, Pittsburgh's original fort.

"If we are correct about this, we are looking at the earliest example of European masonry in Pittsburgh," said Brooke Blades, an archaeologist with A.D. Marble and Co., which is working on the $35 million renovation of the park in downtown Pittsburgh.

After excavating around Kutys' discovery, workers found what they believe to be a drainage system that once served the fort in the mid-1700s, he said.

"It argues that there may be extensive other evidence of Fort Duquesne," Blades said. "People always knew where Fort Duquesne was, but the question was how much of it was left? ... It is tangible evidence. It's where the permanent occupation of Pittsburgh began."

The discovery, however, won't slow down the renovation of the park. In fact, the find will be buried as work continues to upgrade the 36-acre park to include a new lawn area, irrigation and electrical systems, landscaping, vendor hookups, benches and wireless Internet in time for Pittsburgh's 250th anniversary celebration next year.

No artifacts associated with Fort Duquesne have been found, but Blades said the location of the drain only 45 feet from where the fort stood -- coupled with the fact that the brick dates back at least to the early 19th century -- indicates that the drainage system likely was part of the fort established by the French in 1754.

"I can't think of it getting much better," said Kutys, 25, of Philadelphia.

As French and British forces fought to seize control of North America, the French built Fort Duquesne where the Allegheny and Monongahela rivers meet to form the Ohio River. The French destroyed the fort as British forces advanced in 1758 during the French and Indian War. The British then built Fort Pitt on the ruins of Fort Duquesne between 1759 and 1761.

Blades said excavation in another section of the park will begin next week and he hopes to find evidence of the fort's stockade.

The renovation has angered preservationists, who said history was being buried.

Michael Nixon, a lawyer and volunteer with the Fort Pitt Preservation Society, did not immediately return a phone call Monday seeking comment on the discovery of the drainage system.

Supporters of the renovation, however, said they plan to start an archaeological program at the park sometime in the future.

"We have to figure out who's going to do it, how it's managed and how it's funded," said Laura Fisher of the Allegheny Conference on Community Development. "It has to be real research. The vision is to have an actual program of archaeology. Things like this find help build the case to do it."

Scientists Unearth Ft. Duquesne Remnants, NYT, 16.5.2007,
    http://www.nytimes.com/aponline/us/AP-Point-State-Park.html

 

 

 

 

 

James Hillier, 91, Dies;

Co-Developed Electron Microscope

 

January 22, 2007
The New York Times
By JEREMY PEARCE

 

James Hillier, a physicist and inventor who helped develop an early and commercially successful electron microscope for RCA and then found ways to apply it for medical research, died last Monday in Princeton, N.J. He was 91.

The cause was a stroke, his family said.

In 1938, while he was still a graduate student at the University of Toronto, Dr. Hillier and a fellow researcher, Albert Prebus, adapted the work of German scientists and others to develop a prototype that would later become a widely used electron microscope.

The device sent a stream of electrons through magnetic coils to produce an image 7,000 times the size of the object being studied, a magnification three times more powerful than contemporary optical microscopes.

Dr. Hillier took the prototype to the Radio Corporation of America in Camden, N.J., in 1940, and set out to produce a compact microscope that would be cheaper and more effective for biological research.

He quickly saw “the real name of the game was to find out how to put significant things” like blood cells and bacteria into the microscope without “burning them to a crisp” in the potent electron beam. Dr. Hillier and others developed methods of preparing samples using protective colloid film, which allowed them to view bacteria. RCA’s microscopes were used in 1949 at the Sloan-Kettering Institute to view cancer cells taken from animal tumors.

By that time, the microscope’s magnification had increased to 200,000 times, a landmark on the route to today’s magnifying power of two million.

In subsequent refinements, Dr. Hillier helped correct the problem of astigmatism in the lenses and worked on a scanning electron microscope that produced images of even higher resolution. From 1940 to the 1960s, when RCA ended its production, the company sold about 2,000 electron microscopes.

The technology could easily have gone to General Electric, which had recruited Dr. Hillier after his early research, but he said he preferred the emphasis on practical innovation and application he found at RCA.

James Hillier was born in Brantford, Ontario. He received his doctorate in physics from the University of Toronto in 1941.

In 1958, Dr. Hillier became director of RCA’s research laboratories in Princeton and later oversaw the company’s projects involving transistors, lasers and liquid crystal displays. He was named an executive vice president for research and engineering and a senior scientist before retiring in 1977.

Dr. Hillier was an officer of the Order of Canada and became an American citizen in 1945. He received an Albert Lasker Award for Basic Medical Research in 1960.

Dr. Hillier’s wife, the former Florence Bell, died in 1992. The couple resided in Princeton.

He is survived by a son, J. Robert Hillier, an architect, of New Hope, Pa.; two sisters, Mary Hillier of Brantford and Thelma Henshaw of Naples, Fla.; four grandchildren; and three great-grandchildren.

James Hillier, 91, Dies; Co-Developed Electron Microscope, NYT, 22.1.2007,
    http://www.nytimes.com/2007/01/22/science/22hillier.html

 

 

 

 

 

Column five

2nd law of robotics:

give them faces

 

Wednesday August 30, 2006

Guardian

James Randerson

 

In a nondescript house somewhere near Hatfield, one that could pass for any student digs, groups of men and women have been rehearsing for the future. In a year-long series of experiments, scientists and engineers are studying how people behave around the building's sole permanent resident, a 1.2 metre-tall, silver-headed robot with sinister-looking gripping claws.

Their goal is to improve the way robots interact with people: everything from what the machines should look like to how they should behave.

And the early evidence from inside the Robot House is that our utopian vision of a future of splendid idleness may be clouded by a distinct unease in the company of our robot servants.

"It is not enough that the robot is in your house and doing different things," says Kerstin Deutenhahn, an expert in human-robot interaction at the University of Hertfordshire. "That same robot should also be able to perform this behaviour in a way that is acceptable and comfortable to people."

The idea is to look ahead to the day when silicon-brained home-helps have relieved us of the burden of household chores and work out which robot behaviours people like and which distress us. What should the robot look like? How should it move? How should it attract our attention?

The researchers have resisted the temptation to give the robot a name because they do not want the volunteers in their experiments to feel too familiar with it. "Once you name them then people will put gender associations on them, which is a big problem," says researcher Kheng Lee Koay.

It moves on three wheels and can stop itself bumping into walls using devices that emit a rapid-fire stream of sonar pulses. By analysing the echoes from its surroundings, rather like a bat surveying its environment, it can work out whether it is heading for a collision with a nearby object. But its sonar pulses cannot tell the machine that people get really squeamish when it creeps up behind them.
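
A minimal sketch of that sonar ranging idea, in Python; the numbers and names here are illustrative assumptions, not the Hertfordshire robot's actual control code:

SPEED_OF_SOUND = 343.0  # metres per second in air at room temperature

def echo_distance(round_trip_seconds: float) -> float:
    # The pulse travels out to the obstacle and back, so halve the round trip.
    return SPEED_OF_SOUND * round_trip_seconds / 2.0

def collision_ahead(round_trip_seconds: float, stopping_distance_m: float = 0.5) -> bool:
    # Flag a risk if the nearest echo lies inside the assumed stopping distance.
    return echo_distance(round_trip_seconds) < stopping_distance_m

# An echo returning after 2 milliseconds puts the object about 0.34 m away:
print(collision_ahead(0.002))  # True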

A typical experiment involves sitting a volunteer down, so that the robot is slightly taller, and sending the machine on a pre-programmed approach route. The volunteers then indicate when they feel the robot has come uncomfortably close.

"People strongly dislike it when the robot moves behind them for example," says Prof Deutenhahn. "Most volunteers also felt uncomfortable when the robot came at them directly from in front, possibly because it seems aggressive. A more subservient, oblique approach seems the best option."

The volunteers also preferred the robot to look a little human, with a face containing mouth and eyes that light up to give rudimentary expressions. A purely mechanical exterior was apparently harder to relate to.

But robotics experts note that the human guinea pigs don't want their machine-servants to be too much like them. Ben Krose, a professor at the University of Amsterdam, says: "The more human-like the robot becomes, the more it is accepted, but after a certain point it gets scary."

The current crop of robots - most of which are only capable of carrying out menial tasks such as cleaning carpets or mowing lawns - are too simple for anyone to be concerned with their behaviour. But engineers are fast developing more complicated and flexible machines, and working out how these should be programmed to interact with people is becoming an important research question. Sophisticated robots will never be successful if people do not like their behaviour.

A conference on human-robot interaction at the University of Hertfordshire next week may offer more cause for human anxiety as one Japanese expert will advocate a fundamental shift from Isaac Asimov's first law of robotics, which states that a robot should be programmed never to harm a human, either deliberately or by its inaction.

Shuji Hashimoto will propose what he calls a "new relationship between machine and human", where robots should be allowed to go through a kind of adolescence, and be given the ability to think and make decisions for themselves and even to harm humans if necessary. "The philosophy of Asimov is too human-centred," says Professor Hashimoto.

2nd law of robotics: give them faces,
G,
30.8.2006,
https://www.theguardian.com/science/2006/aug/30/
frontpagenews.uknews 

 

 

 

 

 

July 31, 1844

 

The man who applied maths

to chemistry

 

From the Guardian archive

 

Wednesday July 31, 1844

Guardian

 

[Dr John Dalton was considered so well known

that this death notice did not bother

to give his Christian name.]

 

Our venerable and venerated townsman - one of the greatest philosophers of his age and the father of the present race of chemical investigators and discoverers - is no more.

Science has lost one of its most devoted sons; England one of her greatest savants, and humanity one of its brightest examples of the wisdom of the philosopher united to the purity of the child; for truly he was wise as the serpent, harmless as the dove.

His long and useful life closed unexpectedly, but apparently without suffering, on Saturday morning last.

Shortly before his death, Dr. Dalton attended a meeting of the council of the Manchester Literary and Philosophical Society, and received an engrossed copy, on vellum, of a resolution of that society, recording:

"Their admiration of the zeal and perseverance with which he has deduced the mean pressure and temperature of the atmosphere, and the quantity of rain for each month, and for the whole year, with the prevailing direction and force of the wind at different seasons in this neighbourhood, from a series of more than 200,000 observations, from the end of the year 1793 to the beginning of 1844, being a period of half a century."

He appears to have had an early tendency to mathematical pursuits. It is related that, when about ten years old, his curiosity was excited by a dispute among some mowers, as to whether sixty square yards and sixty yards square were identical: at first he concluded that they were, but reflection showed him they were not.
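
[Editorial note, not part of the 1844 text: the arithmetic behind the mowers' dispute is that sixty square yards is $60\,\mathrm{yd}^2$, whereas sixty yards square is $(60\,\mathrm{yd})^2 = 3{,}600\,\mathrm{yd}^2$, sixty times as large.]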

[In 1804] Dalton stated that he thought he could render greater service to science by remaining at home and pursuing his own peculiar studies.

The greatest of Dr. Dalton's discoveries - the discovery of the atomic theory - first presented itself to the philosopher's mind in 1803.

In the last course of lectures ever delivered by Sir Humphry Davy to the Royal Institution, in 1813 or 1814, he stated that the greatest step in science was the application of mathematics to chemistry, for which the world was indebted to Dalton.

Even in death, his brow was radiant with the characteristic expression of a benevolent spirit. The aspect and features were those of the aged Christian philosopher, who having finished his work had laid down to rest, calm in the tranquil faith which ever distinguished him, and without one struggle of departing mortality, or one fear of the darkening gloom of the "valley of the shadow of death".

From the Guardian archive,
July 31, 1844,
The man who applied maths to chemistry,
G,
Republished 31.7.2006,  
https://www.theguardian.com/news/1844/jul/31/
mainsection.fromthearchive 

 

 

 

 

 

 

 

 

 

Explore more on these topics

Anglonautes > Vocapedia

 

science, numbers, figures, data, statistics

 

science > computing

 

 

 

 

 

Related > Anglonautes > Science

 

science, technology, medicine, discoveries, inventions

timeline in pictures

 

 

physics > timeline > 2016 > Gravitational waves

 

 

physics > timeline > 2013 > Higgs Boson

 

 

biology / genetics > timeline > 1953 >

DNA double helix

 

 

 

 

 

scientists > timeline in pictures

 

 

Stephen Hawking    UK    1942-2018

 

 

Albert Einstein    Germany, USA    1879-1955

 

 

Alexander Fleming    UK    1881-1955

 

 

Alan Mathison Turing    UK    1912-1954

 

 

Thomas Alva Edison    USA    1847-1931

 

 

 

 

 

Related

 

UK > The Guardian > Science

https://www.theguardian.com/science

https://www.theguardian.com/science/series/science

 

 

 

UK > The British Medical Journal    BMJ

https://www.bmj.com/

 

 

 

UK > The Journal of Infection

https://www.journalofinfection.com/

 

 

 

UK > The Lancet

https://www.thelancet.com/

 

 

 

UK > Nature

https://www.nature.com/

 

 

 

USA > The New England Journal of Medicine

https://www.nejm.org/

 

 

 

USA > The Proceedings of the National Academy of Sciences (PNAS),

the official journal of the National Academy of Sciences (NAS)

https://www.pnas.org/

 

 

 

home Up