Issue Date: November 29, 2004
Bethany Halford's "Insights" documented some of the most recent examples of fuzzy boundaries between chemistry and other disciplines and how pathbreaking research has been recognized by the Nobel Foundation (C&EN, Oct. 25, page 56). But with all due respect to Håkan Wennerström, chairman of the Nobel Committee for Chemistry, fuzzy boundaries are nothing new, either in chemistry or in the Nobel Prize.
In the first half of the 20th century, several Nobel Prizes in Chemistry were awarded to scientists who identified themselves as physicists. These awards, mainly in the area of radioactivity and isotopes, were given in chemistry because they dealt with the nature of elements. Such recipients began with Ernest Rutherford (who characterized sciences other than physics as "stamp collecting") in 1908; Francis Aston and Irène and Frédéric Joliot-Curie were other physicists who received the chemistry Nobel.
Fuzzy boundaries are not a new phenomenon. Indeed, the disciplines of chemistry, physics, and biology only emerged from natural philosophy and natural history in the century before the Nobel Prizes began. In the late-18th and early-19th centuries, several important contributors to analytical chemistry had medical degrees, and investigators of gas laws worked in an area firmly between the modern fields of chemistry and physics.
I read Hovione chief executive officer Guy Villax's guest editorial with interest and sympathy for his suffering under the yoke of the European Union (C&EN, Oct. 25, page 5). This editorial should be a wake-up call for all U.S. citizens, especially members of ACS.
As many readers probably know, the EU was created and sold to Europeans as being about trade, not as an overarching regional government. The architects of the EU knew that the people of Europe would never surrender their nations to some unaccountable regional government in one step. Thus, they approached forming this regional government piecemeal by forging "trade" agreements, which were really more about an overall governance scheme and which gradually brought the EU as we now know it into existence.
The same trick is being played on the U.S. and, indeed, all of the Americas. The North American Free Trade Agreement (NAFTA) not only made our borders more porous and sent a lot of jobs out of the U.S., but it also established oversight tribunals that claim powers superseding those of our government in certain areas. In short, it is about managed trade and the submergence of the U.S. into a regional government. Now, we are being sold CAFTA (Central American Free Trade Agreement) and FTAA (Free Trade Area of the Americas). These would make the problems begotten by NAFTA look minor. If they go through, expect Villax's lament about the EU chemical industry to apply fully to the U.S. as well.
And not only is the gradualistic surrender of control being planned and implemented for the U.S. by agents of our own government, but it is on an aggressive timetable. President George W. Bush has committed the U.S. to implementing FTAA next year.
There is still time to voice opposition to senators and representatives in Congress. CAFTA and FTAA will likely come up during Congress' lame-duck session. Remember, this is not just about free trade; it is about managed trade and the creation of an all-Americas (Canada to Tierra del Fuego) union, with all the bureaucratic nightmare that entails, as foreshadowed by the EU.
I take issue with your choice of Daniel Sperling, Joan Ogden, and Anthony Eggert to review my book "The Hype about Hydrogen" (C&EN, Oct. 11, page 48). While I have the greatest respect for them professionally and appreciate their calling my book "a useful overview of hydrogen and energy challenges," it is quite well known that they have a running debate with me on hydrogen. Indeed, Sperling debated me twice on the subject in 2004. Sperling and Ogden also had an essay paired with mine in the spring 2004 issue of Issues in Science & Technology. Normally, magazines and newspapers try to achieve a greater semblance of objectivity in their choice of reviewers.
Their review has a number of mischaracterizations. They claim that "Romm's estimates of hydrogen costs are also biased. He consistently relies on sources that tend to the high side of the cost range in the literature, and often cites only the highest cost case in referenced studies." The book explicitly states that I was not going to use estimates from either hydrogen advocates or opponents but instead rely on the most objective independent analyses, such as the 2002 analysis for the National Renewable Energy Laboratory by Dale Simbeck and Elaine Chang (who don't present a range). There are many analyses that have higher costs than I cite.
Sperling and the others claim that "Romm states that gasoline hybrids are approximately as energy efficient as hydrogen fuel-cell vehicles" and even claim that studies by Massachusetts Institute of Technology and others disagree with me. I do talk about the rough equivalence of overall (well-to-wheels) efficiency and overall emissions of hybrids compared to fuel-cell vehicles, but this analytical conclusion actually comes from the 2003 MIT study by Malcolm Weiss and colleagues. That study concluded that in 2020, gasoline hybrids would have roughly the same overall efficiency as a fuel-cell vehicle and roughly the same overall greenhouse gas emissions if the hydrogen came from natural gas, the source of 95% of U.S. hydrogen today.
Interestingly, an article by Nurettin Demirdoven and John Deutch in the Aug. 13 issue of Science concludes that "fuel-cell vehicles using hydrogen from fossil fuels offer no significant energy efficiency advantage over hybrid vehicles operating in an urban drive cycle." This is a key point, since hydrogen vehicles are likely to be used mainly in an urban setting for a long time given the limited range and limited fueling options they are likely to have for the foreseeable future.
The reviewers also claim that "[Romm] identifies many near- and mid-term opportunities for reducing carbon emissions, but stops short of suggesting how we might proceed. What policies might encourage near-term opportunities while also inducing industry to invest in promising long-term technologies?" Apparently, they skipped my conclusion, which describes at length several such policies, including a cap on carbon dioxide emissions, a renewable portfolio standard, a major national effort to encourage energy efficiency and cogeneration, a sharp increase in R&D for clean energy, and a carbon dioxide standard for cars and light trucks.
I do not believe it was appropriate for you to have them review my book. I would urge your readers to review my book for themselves.
What could be a cleaner source of energy than H2 that burns to make H2O? Unfortunately, hydrogen cannot be mined; it has to be manufactured. Therefore, it is not a primary source of usable energy.
A good term for hydrogen as an energy source would be "secondary energy source," grouping it with electrical batteries and other media designed for energy storage. Producing hydrogen consumes energy. The "energy crisis," if there is one, stems from two key problems: the limited amount of available oil and the greenhouse gas emissions from burning both oil products and coal.
Ogden, Sperling, and Eggert, in their book review of "The Hype about Hydrogen," do not sufficiently delineate this central problem of hydrogen as a fuel: that hydrogen is not a primary source of energy. Language is important, especially in our politically charged time, and it is best for technical professionals to use descriptive terms in order to help educate the public. Hydrogen is not an "alternative fuel," and any economy based on it must produce it by using fossil, nuclear, or renewable energy. The book review in question missed an opportunity to direct attention where it belongs: reduction of the thirst for oil and reduction of CO2 emission by our energy-intensive society.
Susan R. Morrissey wrote an excellent article describing the concern of some members of the academic community about the prospects that the National Institutes of Health "road map" will have an adverse effect on the funding of R01 grants (C&EN, Nov. 8, page 40). I am sympathetic with those concerns and therefore offered a quote, which appeared in her article.
However, there is another issue regarding the road map that is worthy of note and that, in the long term, may be even more serious than the loss of countless R01 grants. This issue has to do with appropriate modalities by which the government favors or disfavors scientific programs. Over the years, NIH has relied on peer review as the dominant mechanism in the fashioning of scientific tastes as well as in the allocation of resources. For all its limitations, peer review has worked remarkably well. We should always be mindful of the unhappy outcome of science in the Soviet Union. Clearly, much of that failure arose from excessive intervention of government administrators in concert with committees staffed by "reliable" critics.
My sense is that the underlying assumptions of the road map did not levitate upward from the working scientific community. Rather, they arose as a set of preferences "from the top" in conjunction with academic consultants who already shared its basic premises. In short, the priorities of the road map were not broadly vetted to include potentially tough and harsh critics. Moreover, retrospective evaluation of the road map, not to speak of accountability, will be difficult since its goals and milestones are identified in such a hazy and intangible manner.
I hope these fears will prove to have been misplaced and that, in time, the accomplishments of the road map will accrue to the benefit of science and the society it serves. Certainly, its goals are laudable, and doubtless its architects are driven by the best of intentions. Nonetheless, we who are so fortunate as to practice our science in a free and open society should be particularly vigilant in avoiding the top-down model in the fashioning of scientific priorities.
New York City
The brief article "Radioactive Waste To Be Left In Tanks" didn't give readers a factual or objective understanding of the subject (C&EN, Oct. 18, page 12).
The alkaline sludge in the high-level waste tanks is being pumped out and converted into highly stable borosilicate glass contained in 10- by 2-foot stainless steel canisters for disposal in the nation's geological waste repository. Then the sides of the tanks are rinsed with a jet and the heel on the floor is jetted to a collection point and pumped out. The residue (less than 1%) is then incorporated and entrained in a specially formulated (and tested) concrete grout. Outside the tank there is secondary, and in most cases tertiary, monitored containment. Testing and modeling show that the long-term probability of contaminating the aquifer is incredibly low. Even if it occurred, it could be remediated.
The approach has been reviewed and approved by the Defense Nuclear Facilities Safety Board, the South Carolina Department of Health & Environmental Control, and the Nuclear Regulatory Commission.
The approach preferred by the antinuclear community would greatly increase worker exposure to radiation and greatly increase the probability of contaminating the aquifer. It would also take two to three decades longer and cost tens of billions of dollars more--a really bad idea.
J. M. (Mal) McKibben
The letter by edible oil specialist Thomas H. Applewhite attacks "misguided advocacy" as responsible for governmental regulations requiring future nutritional labeling to include the amount of trans-fatty acid esters in food products (C&EN, Oct. 11, page 6). Yet that letter itself is an example of misleading advocacy. To counter what the letter calls "questionable clinical studies" and "a rash of epidemiological studies," that writer cites some published papers--each of which was published in 1994 or 1995. The studies on which the recent regulatory action was based were completed later than that, and the scientific validity of the later work is obviously not considered in the earlier papers.
On Jan. 4, 1993, the Washington Post published a letter I wrote, in which, as a nonspecialist in that field, I gave a tutorial on trans fats and their potential for coronary damage. I also cautioned about misleading interpretations of epidemiological and clinical studies and about evaluation of the effects of trans fats on health by those having a financial interest in industrial hydrogenation or hardening of food oils. I concluded with a plea for further sound research and evaluation and for food labels to include quantitative information on trans fat content. My closing sentence was, "Those consumers who do not want such information can ignore it; those who want to limit their trans fat intake need it."
On Nov. 20, 1997, the Post published an Associated Press report of a 14-year study of 80,000 female nurses. The results of that research, published in the prestigious New England Journal of Medicine, show, in the words of the researcher, Walter C. Willett of the Harvard School of Public Health, "the worst type of fat appears to be trans fat." On July 9, 2003, the Post reported that the Food & Drug Administration would announce that day a requirement for nutritional labels to include trans fat content (effective 2006).
There are still those who believe that ignorance is bliss. The rest of us would do well to study the outstanding work of Mary G. Enig in this field.
Silver Spring, Md.
- Chemical & Engineering News