Schachter Energy Report Articles

Reader response to my book Sunlight on Climate Change: A Heretic’s Guide to Global Climate Hysteria has resulted in an invitation to be a guest contributor to the Schachter Energy Report, a monthly subscription investment newsletter. This publication is a leader in evaluating global events and market fundamentals, and in using common sense to understand the big energy picture. My guest contributions will focus on government policies and programs designed to fight climate change. After they are released in the Schachter Energy Report, I will post them here at the beginning of each month. I hope you enjoy them.

Dare to know!

(Immanuel Kant on the Age of Enlightenment, 1784)

 
February 01, 2021: Why Everyone has the Federal Clean Fuel Standard Debate Wrong
 
March 01, 2021: Why the Carbon Tax Question Before the Supreme Court is Incomplete
 

April 01, 2021: Even Global Net Zero Emissions in 2050 Will Not Change Climate Change

May 01, 2021: If the Supreme Court Declaration of an Existential Threat from Climate Change Is Correct, We Would Already Be Dead

June 01, 2021: Fairness Demands a Carbon Tax on Greenhouse Gas Emissions from Hydroelectricity

July 01, 2021: Elon Musk Called Hydrogen Fuel Cells “Mind-Boggling Stupid”– I Bet He Hasn’t Read the Hydrogen Strategy for Canada.

August 01, 2021: The Next Big Short is Carbon Dioxide Caused Global Warming

September 01, 2021: Five Climate Change Solutions

September 01, 2021

Five Climate Change Solutions

If the 2021 federal election campaign is anything like the ones in 2015 and 2019, climate change will be an important issue. The mainstream media and some political parties have convinced voters that reducing human emissions of carbon dioxide will stop climate change. All voters want reassurance that appropriate action is being taken to preserve ecosystems and minimize human-caused climate change.

The major political parties are merely virtue signalling about reducing carbon dioxide emissions. They fail to devise a realistic plan and instead set distant goals. Their posturing is largely irrelevant as classical science, recent global warming and the climatic history of the Earth nullify the prospect that CO2 will cause catastrophic global warming. As an alternative to the moot goals of politicians, I invite you to consider five things we should be doing to safeguard our ecosystems, reduce human emissions of other air pollutants and greenhouse gases, and strengthen the Canadian economy. The fact that some of the five programs will reduce carbon dioxide emissions is unintended. At the very least it’s better than voting for the most convincing virtue signalling.

1) Clean Up the Air by Removing Particulates and Poisons

Coal and diesel produce ultra-small particulates that hang in the air (aerosols) and cause a multitude of problems in our lungs. The Bill and Melinda Gates Foundation funded a study that determined over four million deaths per year (7.6% of all global deaths) were attributable to this particulate matter, and 59% of those deaths were in south and east Asia. Burning coal can additionally release sulphur dioxide (leading to sulphurous or London smog), mercury (26% of global mercury emissions), arsenic, chromium and lead. Diesel also emits nitrogen oxides (leading to photochemical or Los Angeles smog, which aggravates asthma) and unburnt carbon particles.

Canada is already retiring coal-fuelled electricity generation, and we also have the technology to eliminate the black carbon plumes from buses and heavy hauling trucks. Those vehicles can be switched over to compressed natural gas with largely off-the-shelf equipment. Canada has a lot of natural gas cheaply available. Large cities can clean up the air above them with battery or hybrid electric vehicles. The electricity for those vehicles can be generated using fuel with minimal particulates and other harmful emissions such as natural gas, nuclear power or hydroelectricity. The coal industry would be challenged to remove the particulates and poisons released in the combustion of coal to be considered as a continuing fuel source for export or domestic use. These are all Canadian energy sources that would not export jobs.

Full disclosure: Clearing the atmosphere of deadly human-emitted aerosols will cause warming, as those particles are currently reflecting sunlight back to space.

2) Preserve Habitats by Reducing Life Cycle Physical Footprints of Energy Generation

An easy way to protect ecosystems is to not dig them up. Many green energy generation technologies triple-dip on ecosystems with massive footprints. The first dip is the mining phase to produce the materials needed. The second dip is the actual infrastructure needed for the green energy project. The third dip is the non-recyclable toxic waste from short lifespan green energy projects that ends up in landfills, is incinerated into the atmosphere, or is dumped into the oceans. Let’s set some rough benchmarks.

We’ll start with the mining phase. The International Energy Agency (IEA) recently estimated that to meet President Biden’s goal to halve U.S. greenhouse gas emissions by 2035, the world will need to increase rare earth mining from today’s levels by seven times to 42 times, depending on the rare earth mineral. These new mines will be mostly third world (un)regulated mines and China will be dominant. Mark Mills, a Senior Fellow at the Manhattan Institute, has calculated that it takes at least 10 times the conventional materials (such as concrete, steel and glass) to build wind, solar and hydropower machines as it takes to build a natural gas electrical generator of the same output. All these conventional materials and rare earth minerals start in mines that run on hydrocarbon power, and IEA data show that in some cases the mine emissions may offset the emission reductions achieved by the end products. Mark Mills’ investigations found that a battery capable of holding the energy equivalent of one barrel of oil takes the energy equivalent of 100 barrels of oil to produce. Switching to green energy technologies will cause a global mining boom that is anti-green.

To address the infrastructure phase, the International Renewable Energy Agency (IRENA) has produced an estimate of the physical footprint of the project infrastructure for various power sources in the U.S. The comparisons below use nuclear power, which has the smallest footprint, as the base. To obtain the same amount of electrical power consistently, these are the infrastructure areas needed for the following sources (the adjustments in brackets are mine):

  • Natural gas = 10 times greater than nuclear
  • Coal = 82 times greater
  • Hydro = 169 times greater
  • Wind = 382 times greater (includes area between towers and 34% time online)
  • Solar = 536 times greater (includes 28% time online)

Switching to green energy technologies takes up a lot more infrastructure area that disturbs natural habitats.
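The bracketed capacity-factor adjustments can be checked with a one-line calculation: a source that is online only part of the time needs proportionally more installed area to deliver the same steady power. A minimal sketch (the raw footprint multiples of roughly 130 and 150 are back-calculated here for illustration, not quoted figures):

```python
def effective_footprint(raw_multiple, capacity_factor):
    """Scale a nameplate footprint multiple by intermittency: dividing
    by the fraction of time online gives the area needed to deliver
    the same power consistently."""
    return raw_multiple / capacity_factor

# Back-calculated raw multiples consistent with the list above:
wind = effective_footprint(130, 0.34)   # ~382 times nuclear
solar = effective_footprint(150, 0.28)  # ~536 times nuclear
print(round(wind), round(solar))
```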

Regarding the disposal phase of green energy technologies, IRENA estimates that by 2050 solar panel garbage will be double the amount of all of today’s global plastic waste. On top of this are the composite plastic blades on wind turbines, which cannot be recycled either (and our Federal Government has declared even plastic drinking straws toxic). Natural gas electricity generators last twice as long as solar or wind installations and are recyclable.

Solar, wind and batteries have massive mining, operating and disposal footprints on ecosystems. Hydro has big mining and operating footprints and emits methane, a greenhouse gas. Canada has abundant and cheap natural gas which has a very low footprint in each of the three phases. Nuclear power creates the least amount of overall habitat disturbance, and Canada was once a leader in this technology.

3) Buy Things Built to Last

Bill Gates provides some excellent data in his book How to Avoid a Climate Disaster. While in Canada we seem to be fixated on the greenhouse gas emissions caused by transportation and heating/cooling, they are the two lowest of Gates’ categories of global emissions (16% and 7% respectively). The biggest source of global greenhouse gas emissions by far is manufacturing (31%). Gates reminds us to look at the full life cycle of emissions in making things before we decide what is the best choice.

When I was a kid my dad used to buy a car every three years (always a Canadian-made Chevrolet). Dad would drive it until it was close to 100,000 miles (160,000 km) and then sell it. In the 1960s and 1970s that was the life expectancy of a family car, and now the same brand of car reaches twice that mileage. Cars, however, are an exception in today’s world. For example, in our cabin we have a circa 1950 Norge refrigerator that still works, but in our house we have to replace the refrigerator about every seven years. Our new energy efficient appliances have probably used up to 10 times as much steel, copper, manufacturing and delivery energy as the trusty Norge that dims the lights when the compressor kicks in. Furthermore, there is a lot of plastic in the newer machines that won’t get recycled. The 70-year-old Norge has had a smaller life cycle environmental impact, yet it is rated as less energy efficient.

Remember the spiral light bulb? It had low energy use but contained mercury that was destined for the landfill—another lesson in full life cycle considerations.

It is great that governments inform us how much energy an appliance uses in daily operations, but they should go much further than that. They should also rate products on their life cycle environmental impact from manufacturing to disposal/recycling, and on expected service life.

4) Improve Extreme Weather Preparedness

In Canada we have put all our eggs in the global basket hoping that if all nations meet their emission reduction targets, we will avoid increases in extreme weather events. When Canadians experience extreme weather events such as the polar vortex and the heat dome, our government responds by doubling down on greenhouse gas emission reduction pledges with no realistic plan to meet them.

The polar vortex and heat dome are the result of waviness in the jet stream, which are naturally occurring Rossby waves. These can be escalated into extreme weather by the naturally occurring El Niño and La Niña events, plus other things we don’t yet understand. Since Canada can be greatly affected by extreme weather, we should devote some efforts to understanding all the factors that result in a polar vortex and heat dome to help predict the resulting floods, droughts, wildfires and ice storms.

We could also investigate:

  • Following the advice of forestry experts and First Nations to carry out more frequent prescribed burns
  • Allowing the natural growth of non-commercial deciduous trees to serve as fire breaks, sometimes referred to as “asbestos trees”
  • Suppressing human ignition of forest fires through agents such as trains, all-terrain vehicles, and overhead electrical wires
  • Updating construction standards for power lines, or burying the power lines
  • Discouraging irrigation of crops used for biofuels to conserve water (Biofuels don’t physically reduce CO2 emissions.)
  • Protecting urban developments on flood plains
  • Enhanced severe weather warning alerts and responses

Understanding and addressing ways to forecast and mitigate severe weather impacts in Canada is better than naively waiting for the entire world to become carbon neutral on the off-chance that it works.

5) Be Pragmatic About Canada and the Global Future of Oil

We are the fourth largest oil producer with the third largest oil reserves in the world, and we are in a rush to shut it down to reduce carbon dioxide emissions. Our competitors must view us as very useful idiots. Consider the following steps we are taking to reduce CO2 emissions:

  • We shut down new natural gas power generation and install more expensive green energy at much higher electricity prices, which shifts manufacturing advantages to other countries that burn coal.
  • We delay or cancel tidewater oil export facilities, which adds to the glut of oil in the U.S. so they can buy our landlocked crude at a steeper discount, refine it into gasoline and sell it at a North American premium to Vancouver. In that city they have shut down all except one refinery for environmental reasons, but have built the biggest coal export facility in North America to supply the coal needed for increased overseas manufacturing. Much of that coal comes from the U.S.
  • We restrict domestic pipelines, which disables our east coast refineries from accessing the domestic crude they want. This forces the importation of oil by marine transport from some environmentally and/or ethically challenged regimes. The U.S. concurrently almost triples their oil production and sells some to us.
  • We implement a carbon tax in a country described by my colleague David Yager as “uninhabitable without hydrocarbons” while our largest trading partners have no carbon tax, and we have to compete with them for export markets.
  • We hobble our refineries with clean fuel standards that don’t reduce carbon dioxide emissions, but allow methane emissions from hydropower a free ride, and subsidize new and unneeded hydropower that, as Bill Gates points out, typically has a life cycle greenhouse gas emission total greater than coal.

Our attempts to shut down the Canadian oil and gas industry have resulted in nothing more than shifting emissions elsewhere to our economic disadvantage. Those who desire the shutting down of the global oil and gas industry should recall the sage advice of Sheikh Yamani, the venerable Saudi oil minister who transformed the geopolitics of oil in the 1970s. He predicted, “The Stone Age did not end for lack of stone, and the Oil Age will end long before the world runs out of oil.” A corollary might be that the Stone Age did not end because of a tax on stone either, but rather because of a technological advance that replaced stone. The world does not yet have an affordable and reliable substitute for hydrocarbons, and we should be realistic about that. In the meantime, the Canadian oil and gas industry can help clean up the global atmosphere, grow our economy and strengthen our confederation.

Please vote for that.

 

August 01, 2021

The Next Big Short is Carbon Dioxide Caused Global Warming

The 2015 Academy Award-winning movie The Big Short, directed by Adam McKay, is an entertaining explanation of the 2008 global financial crisis. It used clever analogies and layman’s terms to take the viewer from a legitimate, well-founded and simple financial concept through a 30-year incubation to the ultimate corrupt, greed-driven malignant cancer that took down the international banking order and partially collapsed the global economy. In the US alone, eight million jobs were lost, six million homes were defaulted on, and $5 trillion in savings and stock market value disappeared. According to Michael Lewis, author of the namesake book, this was allowed to happen because the big banks, the public and the government at best never bothered to check the facts, or at worst willingly ignored the truth.

The American banking industry was transformed by the realization that a bond consisting of thousands of high-quality housing mortgages was worth more than the sum of the individual mortgages. The limit to how many bonds could be issued was set by how many AAA-rated prime mortgages were in existence. When these ran out the banks began issuing bonds using sub-prime BBB and BB-rated mortgages, claiming the risk was not increasing as American housing prices had never gone down (forgetting that they did in the Great Depression). The bond rating agencies agreed that the risk of mortgage defaults was still very low. The banks ignored the first warning signal that take-home pay was stagnant while housing prices soared, and missed the second warning signal of steadily increasing mortgage defaults.

There are many parallels between The Big Short and what I am suggesting is The Next Big Short, the eventual collapse of the international political movement that carbon dioxide (CO2) is causing catastrophic global warming. The Big Short was fuelled by greed for money, and The Next Big Short is fuelled by greed for political power. Both willfully lost sight of foundational principles, both ignored important warning signs, and both compounded the losses by blind faith in their own exaggerated claims.

The Next Big Short began in 1896 when a chemist named Svante Arrhenius recognized that carbon dioxide was a greenhouse gas, and that it would cause global warming if its concentration in the atmosphere was increased by the burning of fossil fuels. Carbon dioxide causes global warming by absorbing a specific narrow range of wavelengths of infrared radiation (heat) and recycling some of that back to the surface of the Earth. This is the greenhouse gas effect. Arrhenius predicted that each time the carbon dioxide concentration in the atmosphere doubled, the global temperature would increase by between 3°C and 5°C. The limitation of global warming caused by CO2 is not how much CO2 is in the atmosphere, but how much infrared radiation of the specific narrow range of wavelengths is left for the additions of CO2 to absorb. As the surplus of this infrared radiation diminishes, it takes more and more CO2 to capture what remains. Exponentially more CO2.

The foundational principle of the greenhouse gas effect, as established by Arrhenius and universally accepted to this day, is that the amount of CO2 in the atmosphere has to be doubled each time to get the same temperature increase.
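This logarithmic relationship can be made concrete with a short sketch. The 280 ppm pre-industrial baseline and the 1.5°C-per-doubling sensitivity are illustrative assumptions for the demonstration (Arrhenius himself estimated 3°C to 5°C per doubling); the point is that each equal step of warming requires twice the CO2 of the step before:

```python
import math

def warming_from_co2(c, c0=280.0, sensitivity=1.5):
    """Warming (deg C) for a CO2 concentration c (ppm) relative to a
    baseline c0, assuming a fixed temperature gain per doubling --
    the logarithmic rule described above. Both parameter values are
    illustrative assumptions, not figures from the article."""
    return sensitivity * math.log2(c / c0)

# Each doubling adds the same warming but needs twice as much CO2:
for c in (280, 560, 1120, 2240):
    print(c, "ppm ->", round(warming_from_co2(c), 2), "deg C")
```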

Late in the 20th century environmental activists claimed that carbon dioxide from the burning of fossil fuels would cause catastrophic global warming. They aligned themselves with political parties that bought into the idea that humanity had caused the warmest global temperatures in one thousand years. They conveniently forgot that this hottest-in-a-millennium was based on the work of one research team, whose findings were thoroughly and repeatedly discredited.

The first warning signal that the global warming activists and politicians ignored was irrefutable evidence that the Earth’s climate also changes naturally.

Compared to the late 1990s it was about 2°C warmer during medieval times and about 1°C cooler during the subsequent Little Ice Age, which included the coldest years in the last 10,000 years (circa 1875). Prior to that, the Roman Warming period (2,500 to 1,500 years ago) was at least 2°C warmer, and the Holocene Maximum temperature peak (about 6,000 years ago) was probably 2.5°C warmer.

Arrhenius’ identification of CO2 as a greenhouse gas was correct; the burning of fossil fuels was increasing atmospheric CO2 concentrations, and the average global temperature was getting warmer. The unknown was how much of the warming was natural climate change and how much was caused by the burning of fossil fuels. Politicians and environmental activists conveniently forgot to mention that it takes exponentially more CO2 to keep the greenhouse gas effect going, and that the recent ending of the Little Ice Age was caused by natural warming. Even the Intergovernmental Panel on Climate Change (IPCC) concluded that at least 0.3°C of the total recent global warming of 0.85°C was human-caused. Stated another way, the IPCC acknowledged that up to 65% of the recent global warming could be natural. That statement did not sound apocalyptic enough to sway voters, so the environmental activists and political groups raised the stakes with statements connecting severe weather events to climate change:

  • Record occurrence of hurricanes, cyclones and tornadoes
  • Increased flooding and drought
  • Unprecedented fires in the Amazon, California and Australia
  • Hottest day ever recorded
  • Rapidly rising sea levels
  • Acidification of the oceans

The press assumed the role of bond rating agencies in The Big Short, and unwaveringly expressed their opinion that the global warming science was settled, often implying that human activities were the sole cause. They perpetuated the severe weather stories without conducting due diligence and mocked as deniers the many high-profile scientists and inside whistleblowers who came forward to challenge all the above false statements. The media were only stating their opinion, not verifying a fact, and it was self-serving to express a popular opinion everyone wanted to hear and was willing to pay for.

The press broke no laws, unless you count the expectation that they should do their job ethically and professionally.

The 2015 Paris Agreement was signed to reduce global carbon dioxide emissions by about 50% by 2030. When it became apparent that goal could not be met, many of the signatories switched to a Net Zero Carbon Emission pledge by 2050. Annual carbon dioxide emissions from human sources continued to rise, and even during the COVID-19 pandemic the carbon dioxide concentration in the atmosphere continued to increase.

 The second warning signal that the environmental activists and politicians missed was that global temperatures were not increasing as predicted by the IPCC.

It is not only possible that global warming has almost stopped; recent global warming also does not necessarily mean higher daily maximum temperatures. The data provided by the IPCC indicate the doubling of the atmospheric CO2 concentration is on track to coincide with only 1.5°C of global warming, not the computer-modelled forecasts of 3°C to 5°C. The same data show the well-documented pause in global warming from 1998 to 2012 and minimal warming since then. Satellite data confirm that the average global temperature in April 1983 was the same as in April 2021, with many small cyclic changes in between. This may be a confirmation that there is very little of the specific narrow range of infrared wavelengths left for additions of CO2 to absorb, and that the weather is heavily influenced by cyclic events such as El Niño.

Dr. Steven Koonin, who was the Undersecretary for Science in the Obama administration, explains in his new book Unsettled: What Climate Science Tells Us, What It Doesn’t, And Why It Matters that global warming does not necessarily mean record-high temperatures. In the continental US the average maximum daily temperature has been about the same since 1900. What has changed is that the average minimum daily temperatures have been warmer. For the last century the US has not experienced more extreme heat, it has experienced less extreme cold, and this includes the last 40 years. The average daily temperature is higher because of warmer nights, not hotter days. The US climate has become more moderate, not more extreme, and Dr. Koonin unravelled the misleading statistical analysis that was designed to hide this.

The year 2020 was not a good one for global warming alarmists. It began with the prominent, dedicated 20-year global warming activist Michael Shellenberger issuing the apology “On behalf of environmentalists everywhere, I would like to formally apologise for the climate scare we created over the past 30 years. Climate change is happening. It’s just not the end of the world. It’s not even our most serious environmental problem.” Shellenberger detailed his epiphany in his book Apocalypse Never. Another prominent environmentalist, Bjorn Lomborg, famous as The Skeptical Environmentalist, published his new book False Alarm: How Climate Change Panic Costs Us Trillions, Hurts the Poor, and Fails to Fix the Planet. But the crushing blow came in 2021 from the above-mentioned Dr. Koonin, former top climate advisor to President Obama, who stated that human contributions to global warming are “small or subtle.” His point should be obvious when one considers that less than 4% of the carbon dioxide added to the atmosphere each year is of human origin, and the rest is from natural sources.

The Big Short meltdown was triggered when, with mathematical certainty, borrowers finally defaulted as higher interest payments kicked in. The Next Big Short will be triggered when, with mathematical certainty, the predictions of carbon dioxide-caused catastrophic global warming fail to materialize.

Many prominent former IPCC-supporting scientists are now acknowledging that human activities are not going to cause catastrophic global warming, and that natural causes are slowly being better understood. If the concentration of carbon dioxide does double from Arrhenius’s time, it may be a little bit warmer than it was in 1896 but still well within the range of the Earth’s natural climate variability. But we will still have environmental problems such as chemical pollution, loss of wildlife habitat, plastics in the oceans and overfishing because the global warming activist groups have been chasing the wrong molecule for far too long.

In The Big Short the biggest loser turned out to be the taxpayer because of what has come to be known as a moral hazard. The big banks were reckless and took extreme risks in exchange for extreme profits because they knew if they failed the government would be forced to bail them out. In The Next Big Short the global warming activists are taking advantage of a similar moral hazard because when expensive green energy projects fail, or when carbon taxes induce energy poverty, it will be future generations of taxpayers who will bail them out. But the biggest loser will be the planet, because our focus on carbon dioxide has kept us from dealing with actual resolvable environmental problems.

On November 01, 1952, the USA detonated its first hydrogen bomb, named “Mike,” on the Pacific island of Elugelab, which was vaporized. On December 16, 2020, Canada released its first hydrogen bomb, named “Hydrogen Strategy for Canada,” in Ottawa, which went unnoticed.

July 01, 2021

The Climate Change Heretic: Elon Musk Called Hydrogen Fuel Cells “Mind-Boggling Stupid”– I Bet He Hasn’t Read the Hydrogen Strategy for Canada.

Elon Musk has repeatedly (he claims about 8,000 times) dismissed the potential of hydrogen as a fuel, but his most famous quote is directed specifically at hydrogen fuel cells, which he describes as “mind-boggling stupid.” My apologies to the marvellously accomplished Mr. Musk, but he ain’t seen nothin’ yet.

The Federal Government has recently released a strategy with a year 2050 objective for Canada to meet 30% of our end-use energy needs with hydrogen. This Hydrogen Strategy for Canada (the Strategy) is part of our government’s objective of transforming the Canadian economy to a net zero emitter of carbon dioxide by the year 2050. The attractiveness of hydrogen as a fuel in this context is that it emits no carbon dioxide when burned directly or used to generate electricity in a fuel cell. The Strategy has three key components for reducing carbon dioxide emissions: using hydrogen fuel cells for transportation; burning hydrogen gas for heating; and using hydrogen from non-emitting sources in existing industrial applications.

The pro-hydrogen fuel camp likes to lead in with the fact that hydrogen is the most abundant element in the universe. The problem is that hydrogen exists in this abundance as atoms tied up in other molecules (as in hydrocarbons or water). It is the hydrogen molecule (two hydrogen atoms bonded together) that is used as fuel and exists on Earth in only very small amounts. That means we have to manufacture it. About 95% of hydrogen manufactured in the world is made by mixing natural gas (or any other source of methane) with high-temperature high-pressure steam in a process called Steam Methane Reforming. This reaction gives off carbon dioxide, and the heating of water with natural gas to make steam also gives off carbon dioxide. The total carbon dioxide generated and emitted to the atmosphere in this hydrogen manufacturing process is about the same as from the combustion of an amount of gasoline containing an equivalent amount of energy. The hydrogen produced by the Steam Methane Reforming method is called grey hydrogen because the co-produced carbon dioxide is vented to the atmosphere. When the carbon dioxide is captured and stored (no venting to the atmosphere) the product is called blue hydrogen.
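To see why grey hydrogen’s emissions come out comparable to gasoline’s, here is a back-of-envelope sketch. The heating values are approximate textbook numbers and the gasoline figure of roughly 70 g CO2 per MJ is an assumption added for comparison, not a figure from the text:

```python
# Overall steam methane reforming stoichiometry: CH4 + 2 H2O -> CO2 + 4 H2
h2_hhv = 286.0      # kJ per mol of H2, higher heating value (approx.)
co2_mass = 44.0     # g of CO2 per mole; one mole is made per 4 mol H2

# CO2 intensity of the reforming reaction alone, per MJ of hydrogen energy
stoich_intensity = co2_mass / (4 * h2_hhv) * 1000  # g CO2 per MJ
print(round(stoich_intensity), "g CO2/MJ from the reaction alone")

# Raising the high-temperature steam with natural gas adds substantially
# more CO2 on top, pushing the total toward gasoline's ~70 g CO2/MJ.
```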

There is also green hydrogen, which is made by a different manufacturing process altogether called electrolysis. Electricity is used to split water molecules into hydrogen and oxygen, but if the electricity is sourced from a fossil-fuelled power generation plant, the goal of reducing carbon dioxide emissions is largely lost. For green hydrogen to be a viable means of eliminating carbon dioxide emissions, it must use electricity from a non-carbon dioxide emitting source such as nuclear power.

Both grey and blue hydrogen prices are dependent on the price of natural gas, while green hydrogen is dependent on the price of electricity. An excellent depiction of pricing was posted by David Layzell and Jessica Lof on the Canadian Energy Systems Analysis Research website. They estimated that at today’s price of natural gas in Alberta, about $3/Gigajoule (GJ), the production cost of an energy equivalent amount of grey hydrogen would be about $6 and blue hydrogen would be about $10. These costs do not include transportation to the retail site, which is expensive. Electricity currently sells in Alberta for about $80/Megawatt-hour, equivalent to roughly $22/GJ, and it is estimated that the 1 GJ energy equivalent in green hydrogen would cost about $33 to produce, again with transportation extra. As a rule of thumb, compared to an energy equivalent amount of natural gas, the production-only cost of grey hydrogen (which results in about the same amount of carbon dioxide emissions) is twice as high, blue hydrogen (carbon dioxide emissions are captured and stored) about three times as high, and green hydrogen (no carbon dioxide is produced) about 10 times as high.
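The rule of thumb falls straight out of the quoted production costs. A sketch using the $/GJ figures above (transportation excluded, as in the text; the exact multiples land near the rounded 2x/3x/10x):

```python
# Production-only costs from the figures quoted above, $ per GJ of energy
natural_gas = 3.0
grey_h2, blue_h2, green_h2 = 6.0, 10.0, 33.0

for name, cost in [("grey", grey_h2), ("blue", blue_h2), ("green", green_h2)]:
    # Cost multiple relative to an energy-equivalent amount of natural gas
    print(f"{name} hydrogen: {cost / natural_gas:.1f}x natural gas")
```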

Let’s start our examination of the Strategy with the hydrogen fuel cell for automobile transportation, which represents about one third of the carbon dioxide reductions expected. The hydrogen fuel cell accomplishes the reverse of green hydrogen electrolysis; it combines hydrogen and oxygen (from the air) to produce electricity and water. The electricity is used to constantly charge a small battery which drives the vehicle’s electric motors. The fuel cell electric vehicle is only truly free of carbon dioxide emissions when either blue or green hydrogen is used, as the production of grey hydrogen emits about the same carbon dioxide as the fuel cell avoids. The Strategy wants to have five million fuel cell electric vehicles on the road using blue or green hydrogen by 2050. The Strategy also claims fuel cell vehicles are twice as energy efficient as gasoline vehicles, a claim supported by the California Hydrogen Business Council. California has the world’s first (possibly only) retail network of 44 automotive hydrogen fueling stations. In 2019, the retail hydrogen fuel cost was four times that of gasoline on an energy equivalent basis. Since hydrogen fuel cell vehicles are twice as efficient, they were only twice as expensive to operate as gasoline vehicles on a mileage driven basis.
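The California cost comparison reduces to a single division, using the 4x fuel price and 2x efficiency figures quoted above:

```python
price_ratio = 4.0       # retail hydrogen vs gasoline, per unit of energy
efficiency_ratio = 2.0  # fuel cell vehicle vs gasoline vehicle

# Cost per mile scales with price and inversely with efficiency
cost_per_mile_ratio = price_ratio / efficiency_ratio
print(cost_per_mile_ratio, "times the gasoline cost per mile")
```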

Up to this point, I don’t think Elon Musk really cares much. The Tesla, his battery electric vehicle, differs primarily in that the electricity is not generated on the vehicle but is stored in a very large battery. Neither the fuel cell electric vehicle nor the battery electric vehicle emits any pollutant exhaust, which results in cleaner air above our cities. The battery electric vehicle only contributes to a reduction in carbon dioxide emissions when the electricity used is from a non-carbon dioxide emitting power generation source.

It is after this point Mr. Musk is willing to say “8,000 times” that hydrogen fuel cells are mind-boggling stupid. According to the US Department of Energy, when gasoline is burned in an internal combustion engine vehicle only 16% to 25% of the energy in the gasoline is used to rotate the vehicle wheels. Most of the energy loss is within the internal combustion engine itself. In a hybrid gasoline-battery vehicle, that range jumps to 24% to 38% of the energy being used at the wheels, primarily because of the vehicle’s regenerative braking technology, which generates electricity. In a Tesla an amazing 86% to 90% of the energy stored in the battery goes to rotating the wheels, again taking into account the regenerative braking.

In a hydrogen fuel cell only about one half of the energy contained in the hydrogen is converted into electricity to be used at the wheels. That makes the battery electric vehicle (Tesla) almost twice as energy efficient as a fuel cell electric vehicle. And it gets better for Elon. Electricity transmission losses from the power plant via wire to the charging station are about 5%. Hydrogen is most efficiently transported by pipeline and is delivered into the vehicle’s storage tank at approximately 10,000 pounds per square inch. The pipeline and compression energy consumption required to do this is equivalent to about a 25% energy loss. This makes the Tesla about three times more energy efficient. When green hydrogen is used in a fuel cell, the electricity consumed in the electrolysis reaction to make the green hydrogen imposes another 30% energy loss, and the amount of electricity used to turn the wheels drops to about 25% of the total input electricity. That makes the Tesla four times as energy efficient as a fuel cell vehicle using green hydrogen, and gives the green hydrogen fuel cell vehicle about the same overall energy efficiency as a gasoline vehicle driven the same distance.
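The loss chain described above can be reproduced with simple multiplication. A sketch using the rounded loss figures quoted in the text; the 88% battery-to-wheels number is a midpoint assumption from the 86–90% range:

```python
# Battery electric path: ~5% grid transmission loss, then ~86-90% battery-to-wheels
# efficiency (88% is taken here as a midpoint assumption).
battery_path = 0.95 * 0.88           # fraction of plant electricity reaching the wheels

# Green hydrogen fuel cell path: ~30% electrolysis loss, ~25% pipeline transport and
# compression loss, then a ~50% efficient fuel cell.
green_h2_path = 0.70 * 0.75 * 0.50   # fraction of input electricity reaching the wheels

print(f"battery electric: {battery_path:.0%} at the wheels")      # 84%
print(f"green H2 fuel cell: {green_h2_path:.0%} at the wheels")   # 26%
print(f"battery advantage: {battery_path / green_h2_path:.1f}x")  # 3.2x
```

Chaining the factors lands between the “three times” and “four times” comparisons in the text, depending on how each stage is rounded.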

That is the part that boggles Mr. Musk’s mind. Who would convert to a new system that wastes between two thirds and three quarters of the input energy, just like the existing system? Turned around, that equation means that to drive the same distance the fuel cell electric vehicle requires between three and four times the input energy of a battery electric vehicle. Then multiply that energy wastage by five million Canadian vehicles if the Strategy’s target is met. It only makes sense if the real strategy is to find a new market for existing or planned excess hydroelectric capacity to make green hydrogen. One should also factor in that while hydroelectricity may not emit carbon dioxide, hydroelectric reservoirs emit a much more potent greenhouse gas in unmeasured but significant quantities: biogenic methane (which I examined in last month’s article).

But are hydrogen fuel cells stupid, Mr. Musk? No, because the fuel cell has three major advantages that may come into play for heavier vehicles used for hauling. Fuel cell vehicles can be refuelled with hydrogen much faster than a battery can be recharged. They can travel a further distance on a single tank of hydrogen than a battery vehicle can on a single charge. And most importantly, they can be scaled up in size with less overall vehicle weight gain than a battery can. These three factors make the hydrogen fuel cell much more favorable in heavy long-haul transport applications. In a Canadian context the fuel cell may also prove more workable in severe cold weather.

To get to the stupid level assigned by Musk we have to turn our attention to the use of hydrogen as a replacement or supplement for natural gas, which is the second part of the Strategy. It envisions that slightly under one third of the total carbon dioxide emission reductions from using hydrogen as a fuel will come from blending hydrogen into natural gas and displacing 50% of the natural gas we now use for heating in industry and buildings. The Strategy does recognize that the modest adjustments feasible for existing natural gas pipelines and end-use burners restrict the blend to 20% hydrogen by volume. The Strategy fails to mention that above 20% hydrogen by volume there are significant leakage concerns with plastic pipes in confined spaces (as in your basement) and a wider range of ignition conditions (a leak into your basement is more likely to explode). The Strategy also fails to mention that under the right conditions, hydrogen combustion can produce nitrogen oxides, which are hazardous to human health and react with sunlight to cause smog.

What also has not been recognized is that at the same temperature and pressure, an energy equivalent amount of hydrogen gas takes up three times the space of natural gas. In order to replace 50% of the natural gas in a pipeline with the energy equivalent in hydrogen, the volume of hydrogen added would be three times the volume of the natural gas removed. The result: an original one cubic foot of natural gas is replaced by an energy equivalent blend that takes up two cubic feet. Not only are the existing pipelines too small to carry twice the volume, but if you exceed a 20% hydrogen content in that volume things might start blowing up, literally.
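The volume arithmetic above can be made explicit. A sketch, taking the 3:1 volume ratio as the article’s rule of thumb:

```python
# At the same temperature and pressure, hydrogen needs about 3x the volume of
# natural gas to deliver the same energy (the article's rule of thumb).
H2_VOLUME_RATIO = 3.0

# Start with 1 cubic foot of natural gas and displace half of its energy with hydrogen.
gas_kept = 0.5                            # cubic feet of natural gas remaining
h2_added = 0.5 * H2_VOLUME_RATIO          # 1.5 cubic feet of hydrogen added

blend_volume = gas_kept + h2_added        # total blend volume in cubic feet
h2_fraction = h2_added / blend_volume     # hydrogen share of the blend by volume

print(blend_volume)  # 2.0 -- double the original pipeline volume
print(h2_fraction)   # 0.75 -- far above the ~20% blending limit discussed above
```

Note that the resulting blend is not only twice the volume but 75% hydrogen, well past the 20% safety threshold.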

Blending 50/50 is a serious safety risk. The only way to displace 50% of the natural gas used in Canada today for industrial and building heat with hydrogen is to build a second transcontinental pipeline system dedicated to hydrogen and convert the end-use burners to hydrogen specific appliances. The hydrogen blending strategy amounts to this:

  • Convert cheap natural gas to expensive blue hydrogen with carbon capture and storage; this conversion triples the cost of the energy. (I elected to rule out very expensive green hydrogen as direct electric heating would then make more sense.)
  • Then duplicate the existing transcontinental natural gas pipeline system with one for hydrogen-only. (This would be no small matter in a country where the Federal Government’s signature legislation, Bill C-69, is dubbed the No More Pipelines Bill. Could we even find enough pipeline protestors?)
  • Switch 50% of the end users to pure hydrogen (and be prepared to discuss increased health and safety risks without mentioning the name Hindenburg).

Even the Federal Government must realize this is quite impossible. It may be a proposal ostensibly to promote hydrogen, but it could result in a conversion from cheap, abundant, carbon dioxide-emitting natural gas directly to expensive, overbuilt, methane-emitting hydroelectricity.

That leaves us with the final component of the Strategy, which would contribute slightly more than one third of all carbon dioxide emission reductions by 2050. Large industries such as oil refining, fertilizer production and steel making already use significant quantities of pure hydrogen in their industrial processes. Currently most of this is grey hydrogen supplied by the steam methane reforming process without carbon capture and storage. The Strategy states that grey hydrogen should be upgraded to blue hydrogen. This is a case of the Federal Government double-dipping on carbon capture and storage initiatives already under review in most provinces and under provincial jurisdiction.

The Hydrogen Strategy for Canada is flawed because it selects the least efficient transportation option, proposes an unviable heating option, and claims credit for an industrial option that has already been initiated. Bad economics are not a barrier, or even a speed bump, and safety considerations are unmentioned. I think Elon Musk’s “mind-boggling stupid” comment sums it up well.

The Perfect Camouflage for Human Emitted Methane - Photo Credit: Lysle Barmby

June 01, 2021

The Climate Change Heretic: Fairness Demands a Carbon Tax on Greenhouse Gas Emissions from Hydroelectricity

In an odd role reversal, Canada appears to be a greenhouse gas emissions denier. The reservoirs behind hydroelectric dams are often serene man-made lakes full of plant and animal life. They enable photo opportunities and bragging rights in the “fight against climate change” by providing carbon dioxide emissions-free renewable electricity in massive quantities. But they are still major sources of ongoing greenhouse gas emissions, and over the life of a project hydroelectricity may be one of Canada’s worst greenhouse gas emitters.

In our 2021 National Inventory Report (NIR) on greenhouse gas emissions, there is no indication that Canada will ever acknowledge that man-made hydroelectric reservoirs emit significant amounts of methane. If we do not acknowledge these emissions, they will continue to exist, but we will be unable to tax them. Sixty percent of our electrical power is generated from these large greenhouse gas sources, and we are the third largest hydroelectricity producer in the world. Ignoring the existence of methane emissions from hydroelectric reservoirs is unfair preferential treatment that violates the first pillar of our Federal Government’s Pan-Canadian Framework on Clean Growth and Climate Change: pricing carbon pollution.

Hydroelectric reservoirs, the bodies of water created by hydroelectric dams, collect biogenic materials from a large catchment area and concentrate them into the reservoir, where they join similar terrestrial material that was in-situ pre-flooding, and new aquatic organisms that were introduced post-flooding. The still waters also promote the growth of phytoplankton (microscopic algae). When all these organisms die and degrade, they produce the greenhouse gases carbon dioxide and methane, with methane by far the more significant emission. This methane generation mechanism is well-established globally in the scientific literature; the main area of uncertainty is the amount.

[Note to Reader: The IPCC National Greenhouse Gas Emissions inventory reports various greenhouse gases and converts them to the equivalent amount of carbon dioxide. The IPCC stipulates that methane has a conversion value of 25, meaning that one tonne of methane released in the atmosphere is equivalent to releasing 25 tonnes of carbon dioxide. This has come under some criticism as published estimates of the potency of methane as a greenhouse gas range from 20 times to 84 times that of carbon dioxide. This large uncertainty is dependent on how long methane is estimated to stay in the atmosphere. Environment and Climate Change Canada uses a conversion factor of 70 on its website. For consistency, all conversions by the author expressed in this article use the IPCC Guidelines sanctioned value of 25 and the author agrees that this value may be low. The Brazilian study mentioned later in this article also quoted the IPCC Guidelines.]

There are a couple of benchmarks that indicate methane emissions are significant and should not be ignored. The first is from a 2011 study of 10 Brazilian dams. The study concluded that the hydroelectric reservoirs give off up to 18.2% of the greenhouse gas equivalent emissions that the most efficient natural gas-fired generators give off to generate the same amount of electricity. The NIR would have readers believe that percentage is zero. Canada’s total hydroelectric generation is about 347 terawatt-hours per year. If the Brazilian benchmark applies, that equates to about 9 million tonnes of CO2 equivalent per year, or 25% of Canada’s oil and gas industry fugitive (leaked) methane emissions. It would also increase our nation’s total greenhouse gas emissions by 1.2%. The applicability of the Brazilian benchmark to Canada is uncertain, as shallow, tropical hydro reservoirs are expected to give off more methane than deeper, boreal ones.

The second benchmark is from the recent book by Bill Gates, How to Avoid a Climate Disaster. Gates points out that hydroelectric dams are responsible for large volumes of carbon dioxide emitted in the production of the cement to build the dam. Cement production accounts for 3% of global greenhouse gas emissions, or twice Canada’s total emissions. He references a life cycle assessment of all the greenhouse gases a dam is responsible for and concludes “… a dam can actually be a worse emitter than coal for 50 to 100 years before it makes up for all the methane it is responsible for ….” Mr. Gates did not disclose his data, but his life cycle assessment indicates hydroelectric dams could be responsible for more than 10 times the greenhouse gas emissions of the Brazilian study when both the carbon dioxide from the cement and the methane from the reservoir are included and spread over 50 to 100 years. This would be twice the greenhouse gas emissions of a natural gas-fired station generating the same amount of electricity.

Even if both benchmarks are viewed as not representative of Canadian hydroelectric reservoirs, they carry sufficient scientific validity to contradict the Government of Canada’s often-repeated official position that hydroelectricity is “clean, green and non-emitting.” That is not true.

The United Nations Educational, Scientific and Cultural Organization (UNESCO) addressed the question of how much methane is produced in flooded lands in a research project from 2006 to 2012. That work was then transferred to the Intergovernmental Panel on Climate Change (IPCC) to formulate the new 2019 Refinement to the 2006 IPCC Guidelines on National Greenhouse Gas Emissions. This document provides equations, parameters and categories to estimate methane production from flooded land, in particular hydroelectric reservoirs. It is literally a fill-in-the-blanks calculation. However, the IPCC downplays the significance of this work on its website:

“The 2019 Refinement was not to revise the 2006 IPCC Guidelines, but update, supplement and/or elaborate the 2006 IPCC Guidelines where gaps or out-of-date science have been identified. It will not replace the 2006 IPCC Guidelines. It should be used in conjunction with the 2006 IPCC Guidelines.”

After 13 years of study within the United Nations to prove and quantify that hydroelectric reservoirs emit methane, the IPCC failed to require that the new science be used. “Should be” can easily be interpreted as “optional.” This makes emission counting like a cafeteria; select what you prefer and avoid what you don’t prefer. Two of the most influential member countries of the United Nations would benefit from this: China, by far both the largest greenhouse gas emitter and largest hydroelectricity producer; and the United States, the second largest greenhouse gas emitter and fourth largest hydroelectricity producer.

Unsurprisingly, it appears that the Canadian government has elected not to implement the 2019 Refinement in the 2021 NIR. Only the 2006 IPCC Guidelines are referenced, and the report’s section dedicated to flooded land contains the statement “The assessment includes CO2 emissions only.” Canadian hydroelectric greenhouse gas emissions remain uncounted and hiding in plain sight. All the while our government extols their nil greenhouse gas emissions status and promotes their expansion. That is not fair.

The current federal price of carbon pollution is $40 per tonne of carbon dioxide equivalent. Based on the Brazilian benchmark, the Canadian hydroelectricity industry would be hit with a $360 million bill in 2021 and $1.5 billion in 2030 when the price reaches $170 per tonne. If we applied the Environment and Climate Change Canada methane-to-carbon dioxide conversion rate of 70, the $170 per tonne tax would bring in $4.2 billion per year.
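The levy figures above follow directly from the roughly 9 million tonne estimate derived earlier from the Brazilian benchmark. A sketch; like the article, it assumes the whole estimate scales with the methane conversion factor:

```python
# Annual reservoir emissions estimate from the Brazilian benchmark, in tonnes of
# CO2 equivalent at the IPCC methane conversion factor of 25 (the article's figure).
emissions_gwp25 = 9_000_000

bill_2021 = emissions_gwp25 * 40    # $40/tonne carbon price in 2021
bill_2030 = emissions_gwp25 * 170   # $170/tonne carbon price in 2030

# Rescaling with the factor of 70 used by Environment and Climate Change Canada
# (this assumes the whole estimate scales with the methane conversion factor).
emissions_gwp70 = emissions_gwp25 * 70 / 25
bill_2030_gwp70 = emissions_gwp70 * 170

print(bill_2021)        # 360000000    -> $360 million
print(bill_2030)        # 1530000000   -> ~$1.5 billion
print(bill_2030_gwp70)  # 4284000000.0 -> ~$4.3 billion
```

The last figure comes out to $4.28 billion, which the article rounds to $4.2 billion per year.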

Instead of implementing the optional 2019 Refinement to the 2006 IPCC Guidelines on National Greenhouse Gas Emissions and accurately counting methane emissions, the Federal Government has recently announced the $750 million Emissions Reduction Fund (ERF), which is largely made up of loans to reduce methane emissions from oil and gas production operations. Government literature indicates that oil and gas operations make up 44% of human-caused methane emissions in Canada, and they have set a target of reducing methane emissions from this sector by 40–45% from the 2012 levels by 2025. About 5% of our total greenhouse gas emissions in 2019 were fugitive releases from oil and gas operations (with hydroelectric methane emissions unaccounted for).

Reducing methane venting and leaks in the oil and gas industry is the right thing to do and the Federal Government’s help should be welcomed. However, the oil and gas industry should not be demonized as the only energy provider emitting methane. Once a hydroelectric dam is filled and operational, it is most likely impossible to do anything to mitigate the emissions of greenhouse gases. But being allowed to pollute for free is inconsistent with Federal Government climate change philosophy. The oil and gas production industry faces penalties for its inability to eliminate greenhouse gas emissions, and the same should apply to the hydroelectric industry. The recent Supreme Court decision enabling these penalties did not give a waiver to hydroelectricity, and the existential threat to humanity they declared was from all sources of human-caused greenhouse gases.

In Canada we are closing down coal-fueled electrical power generation plants because they are deemed high greenhouse gas emitters and we are building hydroelectric generating facilities because they are deemed clean and green. It appears from Mr. Gates’ research that all we have done is front-loaded the same amount of life cycle emissions to the construction phase and after 50 to 100 years we may see net gains. In Canada hydropower is on a green pedestal that is unwarranted. The Canadian government is virtue signalling by setting targets for carbon reduction while concealing the fact that there are other sources of greenhouse gas emissions. They are signalling that the sacrifice should fall only on coal, oil and gas. This is like saying the budget should be balanced by only taxing the rich while exempting others who share responsibility for the deficit.

Hydroelectric dams should face the same environmental scrutiny as coal and natural gas-fired generators. This should include life cycle greenhouse gas emissions of construction (carbon dioxide from cement) and operations (biogenic methane). This is clearly established science endorsed by the IPCC. Hydroelectricity should bear the same pricing on carbon pollution as hydrocarbon-fired electricity generation. It is in the national interest to be truthful and fair about it.

Ivstitia (Justice) stands guard outside the Supreme Court of Canada

May 01, 2021

The Climate Change Heretic: If the Supreme Court Declaration of an Existential Threat from Climate Change Is Correct, We Would Already Be Dead

The Supreme Court of Canada was tasked with a final determination of whether or not the Federal Government’s national carbon levy was constitutional. What was expected was a yes or no answer, but the just-released Supreme Court ruling went much further: they ruled that human carbon dioxide emissions are an existential threat to humanity.

The Supreme Court does not explain their rationale for elevating human-caused climate change to an existential threat; therefore, we are left to speculate. As a heretic I would like to know the basis of their ruling. What sources guided the judges in their determination? There are three potential sources they could have consulted, but whether they did is unclear.

1) Did They Rely on the Intergovernmental Panel on Climate Change?

The United Nations Intergovernmental Panel on Climate Change (IPCC) released its last full report in 2014. Significant findings included its confidence that human activity caused at least 50% of the increase in global temperature from 1950 to 2012, along with several scenarios forecasting global warming in the 21st century.

From the beginning of the 20th century to the above report’s data cut-off date of 2012, it is generally accepted that the average global temperature increased by about 0.8°C. Media headlines imply that humans are the sole cause of this global warming. That is not the IPCC view. The preferred IPCC database shows approximately 0.64°C of global warming from 1950 to 2012, and the IPCC concludes at least half of that (about 0.3°C) was caused by humans, while remaining silent on the cause of the balance (about 0.5°C) of the total warming. It must be caused by non-human factors.
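The attribution arithmetic above can be laid out explicitly. A sketch using the rounded figures in the text:

```python
# Rounded figures from the text (generally accepted / IPCC-preferred values).
total_warming_since_1900 = 0.80   # deg C, start of the 20th century to 2012
warming_1950_to_2012 = 0.64       # deg C, preferred IPCC database
human_share = 0.50                # IPCC: "at least 50%" of the post-1950 warming

human_caused = warming_1950_to_2012 * human_share       # warming attributed to humans
unattributed = total_warming_since_1900 - human_caused  # warming left unexplained

print(round(human_caused, 2))  # 0.32 -> the "about 0.3 deg C" in the text
print(round(unattributed, 2))  # 0.48 -> the "about 0.5 deg C" balance
```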

My first speculation in this article is that this human-caused 0.3°C global increase in temperature from 1950 to 2012—as claimed by the IPCC—was insufficient cause for the Supreme Court to elevate human carbon dioxide emissions to an existential threat.

Perhaps it was the forecasts of global warming that influenced the ruling.

When we look at the IPCC forecasts of global warming caused by greenhouse gas emissions, the worst-case scenario is almost 5°C of global warming in the 21st century. IPCC forecasts are not scientific fact; they do not meet the rules of the Scientific Method (it would be impossible to do so because we cannot mathematically model the Earth’s climate to those standards). And as in all computer simulations, some decisions that are made may unavoidably reflect the opinions and prejudices of the programmer (called confirmation bias). Let’s look at the performance of the 108 climate change computer simulations the IPCC utilized up to 2014, as analyzed by P.C. Knappenberger and P.J. Michaels at CATO.org:

  • For the 30-year period from 1985 to 2014 the average predicted global temperature increase was 0.75°C; the actual measured global warming was 0.5°C.
  • For the 20-year period from 1995 to 2014 the average predicted global temperature increase was 0.54°C; the actual measured global warming was 0.2°C.
  • For the 10-year period from 2005 to 2014 the average predicted global temperature increase was 0.2°C; the actual measured global warming was 0.0°C.

A critical point to emphasize is the IPCC is not a scientific research organization. It is an international political organization that selects the public domain climate forecasts it prefers to use in its reports. My second speculation is that the IPCC-selected climate change forecasts by themselves lack the scientific certainty and the performance credibility to legally amplify climate change to an existential threat.

2) Did They Consult Classical Science?

I question whether the Supreme Court relied on classical science, that is to say, learnings established by the Scientific Method. Had they gone that route they would have discovered many discrepancies in what is implied in media scientific reporting.

The Earth is not a greenhouse. The roof of a greenhouse prevents the escape of evaporated water and convection currents (think of a thunderhead cloud) and can trap up to 70% of the Sun’s energy (think of a mirror). The terms “greenhouse gas” and “greenhouse gas effect” can be misleading because the warming of the Earth is nothing like the warming of a greenhouse. Greenhouse gases do not form an impermeable shell around the Earth. They allow convection and evaporation heat to escape, and only specific wavelengths of infrared radiation emitted by the Earth’s surface are recycled back to the Earth. Greenhouse gases absorb and re-emit this limited supply of infrared radiation in all directions (think of a lightbulb). This places a practical limit on global warming by the greenhouse gas effect.

This is the same theory used by the IPCC, and it is supported by the fact that over the last 540 million years the atmosphere has held up to 12 times as much carbon dioxide as it does today without runaway global warming.

Carbon dioxide is plant fertilizer, not a pollutant. Actual greenhouses get 40% higher plant growth when the carbon dioxide concentration is increased from the ambient 410 parts per million to as much as 1500 – 2000 parts per million. Your office building air is probably at 1000 parts per million carbon dioxide. That level does no harm to you and your office plants benefit.

Carbon dioxide cannot turn the oceans to acid. That’s because the oceans contain dissolved salts which prevent this from happening through a chemical process called buffering. Since ocean life evolved under higher atmospheric carbon dioxide concentrations, it is largely unaffected by today’s increases.

We don’t know if or how much current climate change is affecting the ocean level because, prior to satellites, all our measurement reference points were based on land. England is sinking while Scotland is rising. Another example is Venice, a stone city built on ancient wooden piles driven into mud; is the sea level rising or is Venice sinking? The current 25-year satellite-based global sea level rise is about 13 inches per century, which is less sea level rise per year than the height of this 12-point font I’m using.

I’ll speculate for a third time: The Supreme Court did not consult classical science before declaring human-caused climate change an existential threat. If they had, they would have encountered reasonable doubt that ocean levels are rising catastrophically and that carbon dioxide is a pollutant. Furthermore, they would have encountered proof that the oceans cannot be acidified and that runaway global warming cannot be caused by atmospheric CO2.

[Note to reader: Media statements of overwhelming scientific consensus on the cause and severity of climate change repeatedly fail upon close examination but are rarely retracted. One example: On May 16, 2013, President Barack Obama tweeted that “Ninety-seven percent of scientists agree: #climate change is real, man-made, and dangerous.” This was one day after a study of 11,944 scientific papers released the headline: “Among abstracts expressing an opinion on AGW, 97.1% endorsed the consensus position that humans are causing global warming.” A careful examination of those papers (not just the abstracts) by Dr. David Legates found that none of the papers supported what Obama said. Only 41 of the 11,944 papers agreed with the IPCC view that “Mankind was responsible for at least 50% of the global temperature increase since 1950,” and two-thirds of the papers expressed no opinion. Ninety-seven percent consensus occurs in dictatorships, not in healthy scientific communities.]

3)  Did They Review the Climatic History Since the Last Ice Age?

About 14,500 years ago, the Last Glacial Period (Earth’s last ice age) began to end with 12°C of global warming occurring over a few hundred years. This was followed by 3,000 years of wildly fluctuating temperatures, which dragged the globe back into ice-age temperatures. That was an existential climate change event, but we survived. Then 11,500 years ago the Earth warmed by 10°C over about half a century. From then until today, the Earth has been appreciably warmer than now for at least 90% of the time, with many temperature spikes of up to 5°C. That data is derived from Greenland ice cores. The very first IPCC report, in 1990, fully concurred that these conditions were global:

  • “There is growing evidence that worldwide temperatures were higher than at present during the mid-Holocene (especially 5 000 – 6 000 BP) at least in summer, though carbon dioxide levels appear to have been quite similar to those of the pre-industrial era.”
  • “The Early and Middle Holocene was characterized by a relatively warm climate with summer temperatures in high northern latitudes about 3 – 4°C above modern values.”

With the advancement of many civilizations, we can reconstruct climate change with more accuracy. During the Roman Warming Period of 500 BCE to 535 AD, people were wearing togas and growing grapes in Scotland, and growing olives in the Rhine region of Germany. Vikings used an ice-free pass in Norway that was later ice covered, and closer to home there was a thick forest growing 130 km north of the current treeline in Quebec. These are all confirmations the Earth was warmer than at present.

Then came the Dark Ages (535 to 900 AD), or more accurately the Cold and Dark Ages, because ice was reported on the Black Sea and the Nile, and glaciers expanded in North America. A global temperature drop of about 1°C may have caused crop failures and famines in Europe and weakened the population, leaving it susceptible to the bubonic plague.

The next climate change event was the Medieval Warm Period from 900 to 1300 AD, when agriculture expanded northward in Europe. Vikings established farms in Greenland and visited a place they called Vinland (now called Canada) because of the grapes growing there. Grapes grew in England as well. Over 100 academic studies concluded it was a global warming event greater than what we experienced in the 20th century.

The Little Ice Age occurred from 1300 to 1850 AD, which was probably just a few degrees cooler than the 20th century. The forests in northern Quebec that started growing in the Roman Warming Period burned down and it was too cold to regrow. And woefully, a 1799 painting depicts Englishmen playing ice hockey in London on the frozen Thames (it is not our game after all). This is what the 1990 IPCC report (no longer accessible on their website) had to say about the Little Ice Age:

“The Little Ice Age came to an end only in the nineteenth century. Thus, some of the global warming since 1850 could be a recovery from the Little Ice Age rather than a direct result of human activities. So it is important to recognize that natural variations of climate are appreciable and will modulate any future changes induced by man.”

It is not the Supreme Court’s place to judge the science but rather to rule on the narrower issue of whether a federal carbon tax is constitutional. Since it has extended its reach into unfamiliar terrain, the Court might have to explain this: if this current 21st century climate change cycle is an existential threat, how did the human race survive all the more significant climate change events during the last 14,500 years?

What Happens as Climate Science Evolves?

Those who say “the science is settled” do not understand how science evolves and just want to terminate the discussion. A good example is the 31-year history of the IPCC itself and its description of climate change in just the last millennium:

  • In the 1990 IPCC First Assessment Report there was clear agreement with conventional academia that there was a Medieval Warm Period from 900 to 1300 AD, followed by the Little Ice Age until 1850, and then it warmed up to present-day temperatures that are still cooler than the Medieval Warm Period.
  • In 2001 the famous “hockey stick” graph was adopted and printed six times in the Third Assessment Report. On this graph the Medieval Warm Period and the Little Ice Age are missing; the graph shows only that the Earth started to warm up in 1960. The result: 1998 is declared the hottest year of the millennium.
  • Canadian Ross McKitrick testified in 2005 to the UK House of Lords that “The flawed computer program can even pull out hockey stick shapes from lists of trendless random numbers.” This was supported in 2006 by the US House Energy and Commerce Committee, and the US National Research Council confirmed that the hockey stick shapes were the result of misused statistical methods. The hockey stick graph appeared once in the 2007 Fourth Assessment Report but failed to make it into the 2014 Fifth Assessment Report. No explanation for the disappearance was offered.

Given that record of the IPCC’s findings on recent climatic history, it follows that its climate forecasts may also change. The hidden reasoning of the Supreme Court cannot rectify the biased non-science embedded in the findings of the Intergovernmental Panel on Climate Change; it cannot overrule the laws of classical physics and chemistry; and it cannot rewrite the long history of humans surviving more severe—and naturally occurring—climate change.

We are left to wonder how the Supreme Court came to find carbon dioxide guilty of being an existential threat, but it will undoubtedly be portrayed by some as legal fact and obligation to meet our voluntary 2015 Paris Agreement greenhouse gas emission reduction targets. I suggest that amongst all the greenhouse gases Carbon Dioxide Deserves a Fair Trial. Incidentally, that’s the title of the first chapter of my book.

The Court’s mandate was to settle the constitutional dispute between Ottawa and the provinces over the authority to price carbon. But if a court bases its decision on hidden science rather than on its reading of the constitution on the use of non-renewable resources, can we consider the constitutional issue settled?

Artwork by JANE EVERETT

April 01, 2021

The Climate Change Heretic: Even Global Net Zero Emissions in 2050 Will Not Change Climate Change

‘Ron Barmby clearly shows how our society’s simplified narrative about “1 unit of CO2 = 1 unit of temperature increase in perpetuity” is folly – we may be asymptotically approaching a limit on CO2’s impact on earth’s temperature.’ – Goodreads review of Sunlight on Climate Change.

According to my good friend the eminent Dr. Carl Hodge, recently retired professor of political science at the University of British Columbia, a politician in power who faces a major issue the voting public demands be resolved has only two effective courses of action. The first is to immediately solve the problem, if possible; the politician could then use that track record of success in the next election. The second is to kick the issue sufficiently far down the road that either the public believes the problem is being resolved in the long term, or the issue resolves itself. The politician’s future memoirs could then claim statesmanlike foresight in prudently delaying immediate action.

Canadians are fearful of global warming and have been propagandized that the only cause of global warming is greenhouse gases, and that chief amongst these is carbon dioxide. Canadians want human-induced global warming to stop and believe reducing or eliminating human carbon dioxide emissions is the only solution. To meet this voter demand our Federal Government has decided to hedge its bets and take both immediate and long-term courses of action.

The first, immediate course is the implementation of higher taxes and new regulatory standards to promote reductions in existing carbon dioxide emissions from hydrocarbon fuels, combined with policies to prevent access to new domestic and foreign markets that would enable growth of the Alberta oil sands industry. This immediate course of action exacerbates our current problems of economic recession and national division. Canada’s dismal track record on carbon dioxide emissions reductions suggests these measures will not work, but the Federal Government wants to be seen as taking action. The practice has come to be known as virtue signalling; it is essentially image management, and it is not necessarily linked to results.

The second, long-term course of action taken is to phase in some of the higher taxes progressively up to 2030, delay the first progress report of emissions reductions to 2028 (Bill C-12), and set the target date for full issue resolution at 2050. This second, long-term course of action of kicking the problem down the road is politically the best way forward. Unbeknownst to our government, by 2050 it will most likely be immaterial from a global warming perspective whether we emit no CO2 or a lot of CO2. This is dictated by the level of CO2 in the atmosphere combined with the physics of the greenhouse gas effect.

What is not commonly known is that the same constant amount of carbon dioxide additions to the atmosphere will cause less global warming when there is more carbon dioxide already in the atmosphere. We will expand on this concept shortly, but rest assured that this is not disputed by the Intergovernmental Panel on Climate Change (IPCC); in fact, their agreement with this statement is on their website. The proof is that in the 540 million years since multicellular life began on Earth, there has been up to 12 times more carbon dioxide content in the atmosphere than today with no runaway global warming. This is established science that is not disputed.

The inevitable conclusion is that at some point the concentration of carbon dioxide present in the atmosphere will be so high that the addition of more carbon dioxide will have no material global warming effect at all. Likewise, even the worldwide cessation of additions of carbon dioxide, human or natural, will have no material effect on global warming. Other known climate change factors will then override the diminished greenhouse gas effect of carbon dioxide.

Let’s start with the accepted greenhouse gas theory and the first climate change model ever. It was developed in 1896 by Svante Arrhenius, who became a Nobel Laureate in Chemistry (1903). He recognized that carbon dioxide was a greenhouse gas and calculated that the doubling of the concentration of carbon dioxide in the atmosphere would result in a global temperature increase between 3°C and 5°C. His statement that carbon dioxide is a greenhouse gas is part of accepted classical science. His statement that doubling the atmospheric concentration will cause an increase in global temperature is not disputed, but the global temperature increase of between 3°C and 5°C was his best estimate, and was made 125 years ago. I will use his midpoint, 4°C, for discussion purposes, although global warming alarmists suggest that the actual temperature increase is probably 5°C or higher (some claim as high as 8.5°C). It is also accepted that the majority of the increase of carbon dioxide in the atmosphere since Arrhenius’ time was emitted by human activity. I am not a denier; I am a heretic; I base my conclusions on proof, not prejudice. As a heretic I reject the reigning climate change orthodoxy that stifles sober debate.

In Arrhenius’ time the atmospheric carbon dioxide concentration was generally accepted to be 280 parts per million (ppm), a very small component of the atmosphere (0.028%). Arrhenius proposed that if this were doubled to 560 parts per million, the global temperature would increase by 4°C (midpoint) above 1896 levels. Global warmers agree on this, but then become silent about what happens next. They would like you to extrapolate that finding yourself so you will conclude that every time the atmospheric concentration of carbon dioxide climbs by 280 ppm, the global temperature will rise by another 4°C. That would be a linear progression, and that is incorrect.

The Arrhenius theory, which is accepted both in classical science and by the IPCC, is that the existing atmospheric carbon dioxide concentration has to be doubled for each 4°C temperature increase. The progression looks like this:

  • Doubling the first time, from 280 ppm to 560 ppm, results in 4°C of warming.
  • Doubling a second time, from 560 ppm to 1120 ppm, results in an additional 4°C of warming, for a total of 8°C.
  • Doubling a third time, from 1120 ppm to 2240 ppm, results in an additional 4°C of warming, for a total of 12°C.

The above is called a logarithmic progression; the total amount of carbon dioxide in the atmosphere needs to be doubled again each time to get the same greenhouse gas effect. That is the undisputed relationship between carbon dioxide and global warming. The amount of temperature increase for each doubling of the concentration of carbon dioxide is termed the CO2 Sensitivity, which in the example above is 4°C (the midpoint of Arrhenius’ 1896 estimate).

(Note: If carbon dioxide caused global warming in a linear progression, or 4°C for each 280-ppm increase, then the global temperature would be 28°C higher at 2240 ppm CO2. This has never happened in the 540-million-year history of multicellular life on Earth with up to 5000 ppm CO2.)
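The per-doubling relationship described above amounts to warming = S × log2(C/C0). As a minimal arithmetic sketch of that progression as stated (using the illustrative 4°C sensitivity; this is only arithmetic, not a climate model):

```python
import math

def warming(c_ppm, c0_ppm=280.0, sensitivity=4.0):
    """Warming (degrees C) above the 1896 baseline implied by the
    per-doubling (logarithmic) relationship described in the text."""
    return sensitivity * math.log2(c_ppm / c0_ppm)

# Each doubling adds the same fixed increment of warming:
print(warming(560))   # first doubling  -> 4.0
print(warming(1120))  # second doubling -> 8.0
print(warming(2240))  # third doubling  -> 12.0

# A linear progression (4 degrees per 280 ppm) would instead give
# 4 * (2240 - 280) / 280 = 28 degrees at 2240 ppm, as the note observes.
print(4 * (2240 - 280) / 280)  # -> 28.0
```

The contrast between the last two printed values is the whole point of the logarithmic argument: equal increments of CO2 buy progressively less warming.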

You can readily see that at some point the addition of carbon dioxide will contribute a global warming effect that is too small to matter, and likewise the non-addition of CO2 ppm will only prevent a global warming effect that is too small to matter. Our next piece of the puzzle is to determine whether, by the Federal Government’s ‘kick it down the road date’ of 2050, the amount of carbon dioxide in the atmosphere will be high enough that any additions or lack of additions will make any difference to the greenhouse gas effect.

In 2020 the Earth’s atmosphere contained about 410 ppm carbon dioxide, and it is rising by about 2.3 ppm every year. Since Arrhenius wrote his theory in 1896, the carbon dioxide concentration in the atmosphere has increased by 130 ppm, nearly halfway to the first doubling. While it is a very inexact science, the generally accepted global temperature increase from 1896 to 2012 is about 0.8°C. (2012 is the last year of data included in the most recent IPCC Fifth Assessment Report on Climate Change.) That temperature increase results in a CO2 Sensitivity of about 1.5°C for every doubling of CO2. This is not a theoretical calculation; it is an observation using the same databases as the IPCC. Instead of looking at the IPCC forecasts for 2050, let’s look at what has happened from 1896 to 2012 (116 years) and extrapolate that forward from 2013 to 2050 (38 years) using the Arrhenius theory:

  • With business-as-usual carbon dioxide emissions that cause an annual increase in atmospheric concentrations of 2.3 ppm (0.00023% of the atmosphere), the 1896 level will be doubled to 560 ppm in 2080. (I have accelerated this by five years for rounding and as coal use in Asia is still growing.) With that doubling we will have experienced a 1.5°C global temperature increase over about 180 years. This is one-half of the lowest estimate by Arrhenius and about one sixth of today’s highest estimates reported by the media.
  • The atmospheric concentration of CO2 in the year 2050 should be about 480 ppm. As this is about 85% of the way to the first doubling since pre-industrial 1896, it suggests that the global temperature increase from 2012 to 2050 must be slightly less than the 0.7°C increase expected by 2080.
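The back-of-envelope numbers above can be reproduced in a few lines. This sketch mirrors the article's own approximations (it pairs today's ~410 ppm with the 0.8°C of warming through 2012), so the figures are illustrative only:

```python
import math

BASE_PPM = 280.0      # 1896 concentration, as quoted in the article
NOW_PPM = 410.0       # ~2020 concentration, as quoted
RISE_PER_YEAR = 2.3   # ppm per year, as quoted
WARMING_SO_FAR = 0.8  # degrees C from 1896 to 2012, as quoted

# CO2 Sensitivity implied by the quoted figures:
# solve warming = S * log2(C / C0) for S.
sensitivity = WARMING_SO_FAR / math.log2(NOW_PPM / BASE_PPM)
print(round(sensitivity, 2))  # ~1.45, i.e. "about 1.5 degrees per doubling"

# Extrapolate the concentration to 2050 at the current rate of rise:
ppm_2050 = NOW_PPM + RISE_PER_YEAR * (2050 - 2020)
print(round(ppm_2050))  # ~479, i.e. "about 480 ppm"

# Additional warming implied between now and 2050 under this relationship:
extra = sensitivity * math.log2(ppm_2050 / NOW_PPM)
print(round(extra, 2))  # ~0.33 degrees C
```

The last figure sits comfortably below the 0.7°C remaining to the first doubling, which is the comparison the second bullet above is making.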

The above estimates support the Federal Government’s politically motivated 2050 target date for resolution, as we are currently experiencing a carbon dioxide global warming effect that, over an average human lifespan, is too small to matter. Furthermore, the rate of temperature increase from the greenhouse gas effect will only decrease as more carbon dioxide is added to the atmosphere, because the logarithmic relationship between CO2 and global warming rapidly diminishes its greenhouse gas effect. By 2050 the issue may be reduced to: how does humanity adapt to minor climate change?

There is already strong evidence that the concentration of carbon dioxide in the atmosphere is reaching the point where its ability to increase global warming by the greenhouse gas effect is minimal. Earlier I quoted the 0.8°C increase in global temperature from 1896 to 2012. The IPCC database from January 1, 1998 to December 31, 2012 shows that during this 15-year period there was no global temperature increase, but interestingly, there was a 30 ppm increase in CO2. This 15-year pause in global warming has been recognized by many qualified climate agencies and scientists, and it was addressed by the IPCC as merely a lower trend in the rate of warming. In a recent press release, the USA National Oceanic and Atmospheric Administration (NOAA) announced that 2020 was the second hottest year globally on record after 2016. As I have written previously, what that actually means is there was no global warming from the start of 2016 to the end of 2020.

These two recent extended periods of zero global temperature increase support the possibility that the inevitable diminishment of the carbon dioxide greenhouse gas effect is imminent, or in fact has already arrived. They support the Federal Government’s second political strategy of kicking the issue down the road and undermine the strategy of immediate carbon dioxide emission reductions at all costs.

There is accepted classical science, combined with current observed physical evidence and 540 million years of Earth’s history, that proves carbon dioxide has a limit to its effect on global warming. We have long known carbon dioxide concentrations in the atmosphere have to be doubled each time in order to get the same fixed amount of global warming; it is a logarithmic relationship. In the last 23 years we had a 15-year continuous pause in global warming based on the IPCC data, then a data gap of three years followed by a five-year pause in global warming based on the NOAA data. And the long history of life on Earth proves that even 5000 ppm carbon dioxide cannot cause runaway global warming.

The peak of global warming caused by CO2 may already be behind us, and future effects are destined to be too small to matter. Global net zero carbon dioxide emissions in 2050 may make no change to climate change. This is worth investigating before committing Canada to the first course of action—the immediate reduction of carbon dioxide emissions—that may further weaken our economy and our confederation. It also suggests that the Federal Government’s second course of action, kicking the issue down the road to 2050, makes more science-based sense.

Here is a parting thought: the IPCC has never stated that the 0.8°C temperature increase from 1896 to 2012 was 100% caused by humans. Their claim is that human activity is responsible for at least 50% of the global temperature increase since 1950. With that in mind, I think Dr. Hodge would advise that giving this issue a huge punt also makes more political sense. Not that governments, in particular the Canadian Federal Government, need any coaching on the art of punting.

Photo by LYSLE BARMBY

March 01, 2021

The Climate Change Heretic: Why the Carbon Tax Question Before the Supreme Court is Incomplete

Over the last few years Canadians have become accustomed to having the courts weigh in on major energy issues as they relate to the impact on the environment or indigenous rights. Perhaps the biggest of all energy issues is the Federal Government’s national carbon tax which is currently under deliberation by the Supreme Court of Canada. The question being asked is whether or not the Federal Government has the right to impose a national carbon tax policy predicated on the idea that reducing greenhouse gas emissions is an environmental matter of national concern. That doesn’t go far enough to resolve the issue.

The foundation of the Federal Government’s case is that in 2014 the United Nations Intergovernmental Panel on Climate Change (IPCC) forecast catastrophic global warming caused by human greenhouse gas emissions over the next few decades. This led to Canada’s signature on the 2015 Paris Agreement to reduce national greenhouse gas emissions, and we have so far failed to meet our stated targets. In 2019 the Federal Government declared a national climate emergency based on Canada’s rate of warming and the need to meet our Paris Agreement commitments.

But what if there is no catastrophic global warming and therefore no national climate emergency?

Let’s examine that question from a heretic’s point of view, which is to start at the beginning and question the basic assumptions. The last comprehensive IPCC report, the IPCC Fifth Assessment Report on Climate Change, was published in 2014, just before the 2015 Paris Agreement was signed. The technical work for this report was completed a year earlier and published as Climate Change 2013: The Physical Science Basis. The data cut-off for this technical work was 2012. Buried in this second report, which I doubt was read by any of the politicians who signed the 2015 Paris Agreement, was the statement that “The rate of warming over the past 15 years (1998-2012; 0.05 [-0.05 to +0.15] °C per decade) is lower than the trend since 1951 (1951-2012; 0.12 [0.08 to 0.14] °C per decade).” In plain language, the quoted statement means the rate of global warming from 1998 to 2012 of 0.05°C per decade was only about 42% of the trend from 1951 to 2012 of 0.12°C per decade.

This reported substantial reduction in global warming occurred during a time purported to have the highest human greenhouse gas emissions in history. It also occurred just before the signing of the most important climate change agreement in history. So, it’s perplexing that this statement is buried on page 37 of a subreport published in 2013, which was incorporated into the main report of 2014, which was used to justify signing the 2015 Paris Agreement.

Instead of comparing two overlapping time periods with different start points but the same end point, it would have been more descriptive and understandable to compare two separate time periods—one from 1951 to 1998 and the second from 1998 to 2012. Also, as temperature trends can be statistically manipulated, including the raw data would have been helpful.

I sourced the public domain database preferred by the IPCC, named HadCRUT4. According to HadCRUT4 the global average temperature in 1998 was approximately 0.6°C higher than in 1951, and the 2012 global average temperature was approximately 0.1°C cooler than in 1998 (I have rounded the numbers off). With the time periods separated into two distinct intervals, it is apparent that from 1998 to 2012 (the data cut-off date for the 2014 IPCC report), global warming had ceased.

That significantly contradicts the 2014 statement that the rate of global warming in the last 15 years was only 42% of the rate of the last 62 years. If the annual data is smoothed to decade averages, the trend-corrected 2012 average global temperature is about 0.2°C higher than the trend-corrected 1998 temperature. This may be the IPCC’s justification for their claim of continued but reduced global warming. It also hides the raw data showing that 2012 temperatures were slightly cooler than those of 1998, and that there was no abrupt downward spike. There was, in fact, a 15-year temperature plateau from January 1, 1998 to December 31, 2012.

I’m certainly not the first to be aware of this 15-year period of actual lack of global warming. Several eminent scientists have described this as the Global Warming Hiatus, but they were dismissed by the media and politicians as simply climate change deniers. In 2015 whistleblowers at the USA National Oceanic and Atmospheric Administration (NOAA) said their agency doctored global temperature data to hide the 1998 to 2012 temperature plateau from world leaders ahead of the 2015 Paris Agreement. They were likewise dismissed as deniers, but were harder to ignore due to their prominent roles. Then 15 co-authors of Climate Change 2013: The Physical Science Basis published an independent paper acknowledging the 1998 to 2012 global warming pause. Now we had whistleblowers within the IPCC community itself to clarify the misleading 2014 IPCC report. All these scientists have risked their reputations and positions to object to the manipulation of data in documents critical to the 2015 Paris Agreement. Sadly, the media largely ignored them, to the greater public peril.

The preferred IPCC database indicates that the global average annual temperature from 1998 to 2012 had no warming trend, and this was manipulated and buried in a 2013 subreport to the 2014 IPCC report used to promote the 2015 Paris Agreement. This burying of data enabled the 2019 Canadian declaration of a climate emergency and the justification of the imposition of a national carbon tax. Our Supreme Court should not be deliberating only whether the federal government has the jurisdiction to legislate a national climate change policy because of a national climate emergency; the question should also include whether they can do so if there is no climate emergency.

Current media coverage suggests that global warming has continued with a vengeance, but we won’t know what the IPCC’s official view is until they release their Sixth Assessment Report on Climate Change in 2022. We do know from a recent press release from the NOAA that they have concluded 2020 was the second hottest year globally on record after 2016. Let’s turn that headline around and state what it actually means: there was no global warming from the start of 2016 to the end of 2020. In the last 23 years we had a 15-year continuous pause in global warming based on the IPCC data, then a data gap of three years followed by a five-year pause in global warming based on the NOAA data. Between the two agencies, their own data suggest that for 87% of the last 23 years global warming has paused.

There is another major flaw in the NOAA statement that 2020 was the second hottest year on record: their records only go back to 1880. I would never presume that the Supreme Court of Canada is unaware of this, but advanced civilizations were around long before 1880 and their records tell us how warm the Earth was. In a later article for the Schachter Energy Report I will provide proof that for at least 90% of the time in the last 10,000 (ten thousand!) years the global temperature was warmer than today’s temperature.

Photo by LYSLE BARMBY

February 01, 2021

The Climate Change Heretic: Why Everyone has the Federal Clean Fuel Standard Debate Wrong

The proposed federal Clean Fuel Standard (CFS) is about to get a lot of press as it is scheduled to be in force in 2022. It is legislation that is aimed at reducing the national greenhouse gas emissions of hydrocarbon-based fuels by 30 million tonnes of carbon dioxide per year by 2030. This is estimated to be the equivalent of removing 7 million cars from the road—roughly 30% of all Canadian passenger cars, light trucks, minivans and SUVs. The CFS does not specify how this is to be achieved; it instead imposes penalties to fuel suppliers if the carbon dioxide reduction targets are not met. The government expects the transportation sector to effect about two thirds of this greenhouse gas reduction by reducing the carbon dioxide intensity of liquid fuels by 10% below the 2015 levels. They suggest the low-hanging fruit is the increased use of biofuels—specifically, more ethanol blended into gasoline and more biodiesel blended into diesel.

Objections to the Clean Fuel Standard are many. Some economists estimate the CFS is in fact a stealth increase in the carbon tax by another $300/tonne. Some political policy analysts point out that many of the objectives of the CFS double-dip on programs already in place. For example, British Columbia already has a similar target of a 15% reduction of transportation fuel carbon dioxide intensity by 2030. Provincial politicians weigh in with the response that the CFS infringes on provincial jurisdiction. Even your mechanic has a negative opinion; increased ethanol can cause engine stalling, accelerated breakdown of aluminum and rubber components, and clogging of fuel lines. (If your gasoline-fueled lawn mower does not start next spring and the carburetor is full of a green-tinged gel, that’s the result of 10% ethanol blended gasoline left stagnant in the fuel system.)

These are all well-thought-out positions, but if your objective is to reduce the amount of tailpipe emissions of carbon dioxide into the atmosphere, then proponents and critics of the CFS are missing a fundamental question: Will the CFS incentives result in the physical reduction of carbon dioxide emissions from liquid transportation fuels? If the incentives result in the expected increased use of ethanol or biodiesel, the answer is no. The level of CO2 emissions physically released will still be the same, maybe higher. Oddly, in the opaque accounting world of climate change, the CFS will still contribute significantly to the stated goal of a 30 million tonne carbon dioxide reduction as calculated by the Paris Agreement targets.

At first it seems like a paradox to not physically reduce CO2 emissions but still meet your CO2 emissions reductions targets. Unfortunately, it is quite possible, and we have been deluding ourselves for quite some time now. A similar situation in Canada is that until recently, we thought we were successfully recycling our plastics when we put them in the big blue recycle bin, only to discover 91% was going to landfills and incinerators in Asia. We are likewise told that ethanol-blended gasoline reduces CO2 emissions at the tailpipe of our vehicles, but it doesn’t.

We should remind ourselves why we started putting ethanol into gasoline in the first place. Let’s go back to the 1970s era of muscle cars, smoking inside airplanes, and using DDT to get rid of nuisance insects. Refiners used to add tetraethyl lead to gasoline as an octane booster to stop engine knock, and car manufacturers liked it because the lead deposits on the valve seats allowed for higher engine compression ratios. When leaded gasoline was banned due to elevated cancer risks, it was discovered that most substitutes for tetraethyl lead were even more carcinogenic, except ethanol. Three percent ethanol in regular gasoline stopped engine knock, and manufacturers dropped their engine compression ratios (bye-bye muscle cars). A second step in adding ethanol to gasoline came in 2007, when the US Energy Independence and Security Act mandated an increase to 10% ethanol in gasoline to reduce foreign oil dependency. Ten percent ethanol blended gasoline was not introduced to reduce CO2 emissions; it was a measure to both replace tetraethyl lead and increase the American domestic automotive fuel supply.

The concept that ethanol reduces CO2 emissions is from an internationally accepted idea of carbon dioxide offsets, even though it is widely known that tailpipe CO2 emissions are unchanged. The offset theory is that physical CO2 emissions from the burning of ethanol are not counted as increased human emissions because they are offset by the atmospheric CO2 consumption during photosynthesis by the plants grown by humans as feedstock for ethanol. That implies that the ethanol source plant—in North America that is corn—has increased the amount of photosynthesis that would have otherwise occurred. Therefore, the physical CO2 released by the ethanol in the blended gasoline is simply not counted in the total emissions because it is offset by increased photosynthesis. If this theory were correct, then carbon dioxide emissions in 10% ethanol blended gasoline would drop by 10%. By applying this offset theory the Clean Fuel Standard will entice refiners to increase the ethanol content by as much as 15% in regular gasoline (the limit your current automobile can likely handle). This increased ethanol content in gasoline, by the offset theory, might achieve one half of the CFS CO2 emissions reduction goal.

The premise that ethanol offsets CO2 emissions is false. Either the corn would have been grown anyway for human or cattle consumption, or some other plant (crop or forest) would have grown. The same applies for biodiesel, except the plants involved are soybeans in North America and palms (for palm oil) in the tropics. There is no idle farmland on a planet with seven billion mouths to feed. Diverting farmland from food to ethanol or biodiesel does not increase photosynthesis; it increases the demand for farm yields, as now they can be sold as either food or fuel.

Let’s take ethanol out of the political arena and put it in a lab. Ethanol releases only about two thirds of the energy on combustion that gasoline does, so the more ethanol that is blended into gasoline, the more volume of the blend is required to drive the same distance. Reports available from the US Energy Information Administration show that the same amount of CO2 is physically released from 10% ethanol blended gasoline as regular gasoline for the same distance driven. But now there is another CO2 stream to track: in the process of converting corn to ethanol, more CO2 is produced. The assumption to date has been that this food-grade CO2 has been conserved by using it in carbonated beverages, commercial greenhouses, or other industrial applications. I don’t think anyone really knows if that is true or an aspiration (like plastic recycling), but at some point the market for CO2 will become (or already is) saturated and the corn-to-ethanol production facilities may vent their CO2. In that case the atmospheric CO2 emissions (tailpipe plus ethanol manufacturing facility) of corn-derived 10% ethanol blended gasoline increases by 3% over regular gasoline for the same distance driven in the same vehicle. If the ethanol content of gasoline is increased to 15% in this scenario, the tailpipe CO2 combined with ethanol manufacturing vented CO2 increases by about 5% over normal unblended gasoline. That is, if your vehicle still runs well with the higher ethanol content.
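The extra-volume arithmetic in the paragraph above can be sketched from its stated premise that ethanol delivers about two thirds of gasoline's combustion energy per unit volume. The percentages are illustrative consequences of that premise, not measured fuel-economy data:

```python
# Premise from the article: ethanol releases about two thirds of the
# combustion energy of gasoline per unit volume.
GASOLINE_ENERGY = 1.0       # energy per litre of gasoline (normalized)
ETHANOL_ENERGY = 2.0 / 3.0  # energy per litre of ethanol, per the premise

def blend_energy(ethanol_fraction):
    """Energy per litre of an ethanol/gasoline blend, relative to gasoline."""
    return ((1 - ethanol_fraction) * GASOLINE_ENERGY
            + ethanol_fraction * ETHANOL_ENERGY)

# Extra volume of blend needed to deliver the same energy (drive the same
# distance) as pure gasoline:
for frac in (0.10, 0.15):
    extra_volume = 1 / blend_energy(frac) - 1
    print(f"E{round(frac * 100)}: {extra_volume:.1%} more fuel by volume")
```

This only quantifies the extra fuel volume implied by the energy premise; the CO2 accounting for the tailpipe and the ethanol plant is the separate question discussed above.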

That’s not all the downside to producing biofuels. Consider the freshwater diversions and aquifer pumping to irrigate corn crops in North America to grow the corn to convert to ethanol and to grow the soybeans to convert to biodiesel. Palms for biodiesel grow best in a range of 10 degrees latitude from the equator, and many farmers there have now been incentivized to burn down jungles to grow palm for biodiesel.

There are still more negative consequences for corn-derived ethanol. Corn makes up 20% of all the calories consumed by humans. One result of the USA 2007 mandate for 10% ethanol in gasoline was that by 2012, 40% of all corn grown in the USA was converted to ethanol. This is more corn than is consumed by humans on the entire continent of Africa, and it caused world corn prices to increase by 30%. A significant price increase in the world’s staple crop is quite an economic blow for families in developing countries who spend up to half their household income on food.

There are many problems with the proposed Clean Fuel Standard. It is another wedge issue in an already fractured Canadian confederation since it targets the fuel supplier. It may be a significant economic drain on families and the national economy. It might be claiming credit for other provincial programs, and more ethanol will be hard on your vehicle. What isn’t being discussed is that, in terms of reducing the physical carbon dioxide emissions from the road transport sector, ethanol blended gasoline and its twin biodiesel are bigger shams than the plastics recycling bins. They do not physically achieve the claimed carbon dioxide emissions reduction; they do not achieve any carbon dioxide emissions reduction at all. At least the blue bins resulted in 9% of the plastic being recycled.

Increased percentages of ethanol blended gasoline and biodiesel will incentivize more irrigated farmlands and jungle plantations to produce biofuels, and will be taking food off plates in developing countries and putting it into your gas tank. The real paradox is that this all helps Canada meet its Paris Agreement goals simply because of the bureaucratic assumption that more photosynthesis has taken place than otherwise would have. That logic leads to the outrageous conclusion that we should go back to cutting down trees for steam-driven power plants and home heating, because it creates new photosynthesis. That has already happened in the US and led Michael Moore to produce his recent documentary Planet of the Humans.

The Clean Fuel Standard needs to be rethought not only because of potential unintended consequences, but also because its intended consequences are merely a cheap accounting trick designed to mislead the public into believing they are physically reducing carbon dioxide emissions. After all the costs and burdens of the CFS are borne by the consumer, the environment and global food supply, the reality is that the carbon dioxide equivalent of zero cars will be taken off the Canadian roads.