Electric Cars, Their Grid, and Climate Scientology

A report in The Jerusalem Post suggests that President-elect Obama is quite favorably inclined toward the electric car program being developed in Israel as a joint venture with Renault-Nissan:


The incoming Obama administration is “closely monitoring” the innovative electric car project being developed by Israel’s Better Place company, “and may be adopting it,” Idan Ofer, chairman of Better Place, has told The Jerusalem Post.


However, to power the grid required for such a system in the US, now and into the foreseeable future, we would have to rely upon energy derived from both coal and nuclear plants. That Obama and his designated energy secretary, Mr. Chu, are not favorably inclined toward coal may be an understatement, because, sadly, both are adherents of Climate Scientology. Moreover, Obama has been lukewarm at best about nuclear power and the construction of new nuclear plants. So how on earth will the US power the grid required for a massive re-tooling of our cars, making electric vehicles the predominant mode of car transportation?


I suggest President-elect Obama start by educating himself about the sham science—the “Scientology” of CO2 as a climate-changing “pollutant”—which is the keystone of the anthropogenic climate change nonsense. He might wish to peruse data such as those presented below (and a mountain of other data which contradict the flimsy CO2-based anthropogenic warming hypothesis). Is there a chance Mr. Obama and his staff could be convinced by such data?

At any rate, hope springs eternal. Here are some data, compiled by climatology professor Tim Ball, for Mr. Obama and Mr. Chu to ponder:


Most people don’t know that thousands of direct measurements of atmospheric CO2 were made beginning in 1812. Scientists took the readings with calibrated instruments and precise methods, as the work of Ernst-Georg Beck has thoroughly documented. Guy Stewart Callendar was an early visitor to these records. He rejected most of them, including 69% of the 19th-century records, and selected only those that established the pre-industrial level as 280 ppm. Here is a plot of the records with Callendar’s selections circled.


It is clear that only low readings were chosen. Also notice how the slope and trend are changed compared to the entire record.

As Jaworowski notes,

“The notion of low pre-industrial CO2 atmospheric level, based on such poor knowledge, became a widely accepted Holy Grail of climate warming models. The modelers ignored the evidence from direct measurements of CO2 in atmospheric air indicating that in 19th century its average concentration was 335 ppmv.”

Beck recently confirmed Jaworowski’s research. A September 2008 article in Energy and Environment examined the readings in great detail and validated the 19th-century findings. In a devastating conclusion, Beck writes,

Modern greenhouse hypothesis is based on the work of G.S. Callendar and C.D. Keeling, following S. Arrhenius, as latterly popularized by the IPCC. Review of available literature raise the question if these authors have systematically discarded a large number of valid technical papers and older atmospheric CO2 determinations because they did not fit their hypothesis? Obviously they use only a few carefully selected values from the older literature, invariably choosing results that are consistent with the hypothesis of an induced rise of CO2 in air caused by the burning of fossil fuel.

So the pre-industrial level is at least 50 ppm higher than the level put into the computer models that produce all future climate predictions. The models also incorrectly assume uniform atmospheric global distribution and virtually no variability of CO2 from year to year.

Beck found, “Since 1812, the CO2 concentration in northern hemispheric air has fluctuated exhibiting three high level maxima around 1825, 1857 and 1942 the latter showing more than 400 ppm.” Here is a plot from Beck comparing 19th-century readings with ice core and Mauna Loa data.


Compare the variability of the atmospheric measurements with the smooth line of the ice core record. This was achieved by eliminating extreme readings and then applying a long-term smoothing average. When smoothing is done on the scale of the ice core record, a great deal of information is lost. Eliminating high readings before smoothing makes the loss even greater. Also note that, as in all known records, the temperature changes before the CO2, in this record by approximately five years.
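The effect described above, trimming away extreme readings and then averaging over a long window, can be sketched in a few lines of Python. The numbers below are entirely synthetic (a hypothetical baseline with large short-term excursions, not real CO2 data); the point is only to show how the procedure collapses the range of the raw series.

```python
import random

random.seed(0)

# Synthetic hourly CO2-like series (hypothetical values, for illustration only):
# a flat baseline near 320 ppm with large short-term excursions.
readings = [320 + random.gauss(0, 15) for _ in range(24 * 30)]

def trim_and_smooth(series, low, high, window):
    """Drop readings outside [low, high], then apply a moving average."""
    kept = [x for x in series if low <= x <= high]
    return [sum(kept[i:i + window]) / window
            for i in range(len(kept) - window + 1)]

raw_span = max(readings) - min(readings)
smoothed = trim_and_smooth(readings, low=300, high=340, window=72)
smoothed_span = max(smoothed) - min(smoothed)

print(f"raw span:      {raw_span:.1f} ppm")
print(f"smoothed span: {smoothed_span:.1f} ppm")  # far smaller: variability is gone
```

The thresholds and window length here are arbitrary; whatever values are chosen, the combination of clipping and long-window averaging leaves a far flatter curve than the underlying series.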

Elimination of data is also done with the Mauna Loa and other atmospheric readings, which can vary by up to 600 ppm in the course of a day. Beck explains how Charles Keeling established the Mauna Loa record by using the lowest readings of the afternoon. He ignored natural sources, a practice that continues. Beck presumes Keeling decided to avoid these low-level natural sources by establishing the station 4,000 meters (m) up the volcano. As Beck notes, “Mauna Loa does not represent the typical atmospheric CO2 on different global locations but is typical only for this volcano at a maritime location in about 4000 m altitude at that latitude.” (Beck, 2008, “50 Years of Continuous Measurement of CO2 on Mauna Loa,” Energy and Environment, Vol. 19, No. 7.)
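The selection effect described here can likewise be sketched with toy numbers. Assume, purely for illustration, a diurnal cycle in which readings run higher at night and lower in the well-mixed afternoon air; then taking only the lowest afternoon reading of each day yields a mean well below the mean of all readings. The diurnal shape and magnitudes below are hypothetical, not actual Mauna Loa values.

```python
import random

random.seed(1)

# Hypothetical diurnal pattern: elevated CO2 overnight, depressed in the
# afternoon, plus instrument noise. Illustrative values only.
def one_day():
    return [330 + (20 if hour < 12 else -20) + random.gauss(0, 3)
            for hour in range(24)]

days = [one_day() for _ in range(30)]

# Mean of every reading versus mean of each day's lowest afternoon reading.
full_mean = sum(sum(d) for d in days) / (30 * 24)
afternoon_min_mean = sum(min(d[12:18]) for d in days) / 30

print(f"mean of all readings:     {full_mean:.1f} ppm")
print(f"mean of afternoon minima: {afternoon_min_mean:.1f} ppm")
```

Picking the daily minimum of any noisy series biases the result downward relative to the full mean; how large the bias is depends entirely on the assumed diurnal swing and noise, which here are invented.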

Keeling’s son continues to operate the Mauna Loa facility and, as Beck notes, “owns the global monopoly of calibration of all CO2 measurements.” Since the younger Keeling is a co-author of the IPCC reports, they accept the version that Mauna Loa is representative of global readings and that it reflects an increase since pre-industrial levels.


Andrew G. Bostom is the author of The Legacy of Jihad (Prometheus, 2005) and The Legacy of Islamic Antisemitism (Prometheus, November 2008). You can contact Dr. Bostom at @andrewbostom.org.
