In yesterday’s article, we spoke about the existence of nano-thermite in the 9/11 dust. There are many unanswered questions regarding the events on that beautiful September day with its clear blue sky 21 years ago at the World Trade Center.
Questions regarding the fact that the fires below the rubble at ground zero burned for so long that a November 19, 2001 piece by the New York Times stated that some firefighting experts called the fire the “longest commercial building fire in United States history.” How firemen and first responders gave testimony that they witnessed rivers of molten steel, “like lava,” flowing down the channels under the debris pile at ground zero even though the government says such molten steel did not exist. Or the fact that the Pentagon has never produced a video or a single photo of the plane that crashed into it.
No steel-framed building on planet earth has ever had a total collapse before or since 9/11
But of these unexplained anomalies, the one event that stands out the most that the government has refused to explain is the perfectly symmetric collapse of World Trade Center Building 7.
World Trade Center Building 7 was a modern 47-story high-rise steel-framed building, taller than any building in 33 states.
It had 40,000 tons of structural steel, 24 steel-beam core columns, and 57 perimeter columns, and met all of the building codes when constructed in the mid-1980s.
Building 7 was not soaked in jet fuel and did not have its fireproofing insulation blown off, since it was not hit by a plane. Yet it was the third steel-framed building in world history to experience a total collapse attributed to office fires; the first and second, WTC buildings one and two, had collapsed earlier that same day. No steel-framed building on planet earth has ever had a total collapse before or since 9/11.
Building 7’s freefall acceleration is the critical question that needs to be answered. To understand freefall acceleration, consider dropping a brick from a height of 100 feet toward the pavement below. As long as the brick has an unimpeded, straight-down descent, it falls with freefall acceleration: nothing gets in its way to slow it down as it falls all 100 feet and hits the ground.
If the brick hits an object after falling the first 75 feet, then that brick fell at freefall acceleration for the first 75 feet before it met resistance and slowed down.
Building 7 fell at freefall acceleration for nearly 105 feet or approximately eight floors
It took a quick 6.5 seconds for all 47 floors of building 7 to plummet into a neat pile. Using video footage and physics software, David Chandler, who holds a B.S. in physics and an M.S. in mathematics, demonstrated to NIST (the National Institute of Standards and Technology) that building 7 fell at freefall acceleration for nearly 105 feet, or approximately eight floors.
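As a sanity check on the figures above, the time to fall 105 feet from rest can be computed from the standard kinematic relation d = ½gt². The 105 ft and 6.5 s figures are the ones cited in the article; the script below is only an illustrative calculation:

```python
import math

G = 9.81             # gravitational acceleration, m/s^2
FT_TO_M = 0.3048     # feet-to-meters conversion

d_ft = 105.0         # freefall distance cited for Building 7, feet
d_m = d_ft * FT_TO_M # ~32.0 m

# From d = 0.5 * g * t^2, the time to fall distance d from rest:
t = math.sqrt(2 * d_m / G)
print(f"Freefall time over {d_ft:.0f} ft: {t:.2f} s")  # ~2.55 s
```

So the freefall phase alone accounts for roughly 2.5 of the 6.5 seconds of total collapse time reported above.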
“What we are seeing is a building coming straight down through itself with zero resistance,” said Chandler, adding, “Buildings don’t have zero resistance.” Chandler also stated that floors have to be instantly removed for a building to fall at freefall acceleration, and the only way to remove floors and obtain zero resistance is by controlled demolition.
Even though the U.S. government’s position changed to agree with Chandler that there was freefall acceleration in building 7, the government still contends that no controlled demolition was used.
According to NIST, simple office fires caused one column (column 79) on the 12th floor to fail, which caused the entire building to fall for 105 feet at freefall acceleration symmetrically.
Roland Angle, P.E. civil and structural engineer, said, “NIST is telling us that the building below it (building 7) ceased to exist for the first few seconds of the collapse of the building.” Angle added that things in physics don’t cease to exist.
Steel structural high-rise buildings “simply do not collapse due to fire”
Since the conclusion made by NIST did not make scientific sense, many scientists wanted to examine the input data that NIST used to explain building 7’s freefall collapse. However, Catherine Fletcher of NIST denied the Freedom of Information Act request for the collapse calculations used by NIST, stating that releasing such information might “jeopardize public safety.”
“Impossible,” Kamal Obeid, C.E., S.E., stated regarding NIST’s explanation for the collapse of WTC building 7. A civil and structural engineer with more than 30 years of practice, Obeid said that the one-column theory offered by NIST, whereby the collapse of column 79 on the 12th floor caused the entire building to fall symmetrically, is “embarrassing.” “This is the crime of the century,” the veteran engineer said, urging other structural engineers to look at the evidence. According to Obeid, the collapse of building 7 was “a classic case of controlled demolition.”
Scott Grainger, a fire protection engineer with over 40 years of experience and licensed in 13 states, said that steel structural high-rise buildings “simply do not collapse due to fire.” He added that other steel buildings have had much worse fires burning for much longer times and yet did not collapse. According to Grainger, the fires in building 7 were scattered, and many earlier fires were already out. Yet, with the building having asymmetrical damage, it collapsed uniformly and symmetrically. “That just does not make any sense.”
So the question of the century is, how could building 7 have experienced freefall acceleration without using controlled demolition? How did eight floors of this modern, well-constructed building quickly vanish into thin air?
Rick Hayes lives in the epicenter of liberal land where reality and truth will never encounter a welcome mat.
An award-winning writer and photographer with over twenty years of professional experience in both fields, Hayes started his journalism adventure after a successful, eye-opening career as a banker on Wall Street. Although he spent his early work life surrounded by custom-made shirts, expensive ties, and the shiniest of shoes, Hayes was an accomplished singer, cutting a few records with a local band and appearing on one of the first cable shows.
Working for a weekly New York paper, in one of the most politically corrupt areas in the State, he began investing his time trying to understand the nature of corruption.
Highlights from the paper linked below for your attention:
“Despite its importance, the characterisation of uncertainty on estimates of the global total FFCO2 [fossil fuel CO2] emission made from the CDIAC database is still cumbersome. The lack of independent measurements at the spatial and temporal scales of interest complicates the characterisation. The mix of dependent and independent data used in the CDIAC calculations further complicates the determination.”
Bud: Ten years after the data in this report (link below), the increase in CO2 from 2019 to 2020 was measured to be 2.58 ppm, an increase due to all CO2 sources minus all CO2 sinks, natural and human, i.e., only 0.6% increase in net global CO2 concentration for the year. (NOAA-Scripps Mauna Loa) Obviously, net CO2 from fossil fuels cannot exceed 2.58 ppm and 0.6% of total net CO2 for this year.
The uncertainty in the CO2 emissions from fossil fuels is 11% to 25% because different qualities of fossil fuels produce different amounts of CO2 (Table 1 of 2 below). And the uncertainty in the quantities of the various types of fossil fuels produced by several large fossil-fuel-producing nations (Table 2 of 2 below) is 9.4% to 20.2%. These uncertainties must be propagated to the final uncertainty for estimated CO2 from estimated fossil fuels, which is used in annual studies of carbon budgets and carbon cycle modeling. And these uncertain numbers are used by CDIAC, Friedlingstein et al., the UN IPCC, the EPA, etc., eventually to argue that CO2 emissions from fossil fuels are driving dangerous CO2 growth and require global crisis remediation expenditure estimated by McKinsey & Co. at $9 trillion per year.
The uncertainty in the estimated CO2 emissions from estimated fossil fuel production is more than 10 times larger than the uncertainty NOAA states for the Mauna Loa measurement of net CO2 concentration (~0.2 ppm, that is, 0.2 ppm/415.41 ppm ≈ 0.05% uncertainty*). It is not surprising that multiple scientists (Jamal Munshi, Demetris Koutsoyiannis, Peter Stallinga, Murry Salby, and others) find no or negative correlation between human emissions and the net CO2 trend. Attributing the slowly increasing slope in measured net atmospheric CO2 concentration to fossil fuels is not supported by the NOAA Mauna Loa data.
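The two percentages quoted above can be reproduced directly from the cited numbers (415.41 ppm concentration, 0.2 ppm measurement uncertainty, 2.58 ppm annual increase); the script below is only arithmetic on those figures:

```python
concentration_ppm = 415.41  # Mauna Loa CO2 concentration cited above
meas_unc_ppm = 0.2          # NOAA-stated measurement uncertainty
annual_rise_ppm = 2.58      # 2019-to-2020 increase cited above

meas_unc_pct = 100 * meas_unc_ppm / concentration_ppm
rise_pct = 100 * annual_rise_ppm / concentration_ppm
print(f"measurement uncertainty: {meas_unc_pct:.2f}% of concentration")  # ~0.05%
print(f"annual increase: {rise_pct:.2f}% of concentration")              # ~0.62%
```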
“However, the end members of this range are not calculated on the same basis and each case measures different aspects of the [fossil fuels CO2 ] FFCO2 data cube (Fig. 1). For example, the 1-D case assesses uncertainty primarily from a fuel-based methodology perspective (Table 1) [ABOVE] . As the contribution of different fuels to total fuel consumption changes annually, so does the annual global uncertainty change (Fig. 3). The 2-D case assesses uncertainty primarily from a national data quality perspective (Table 2). As the contribution from different countries changes annually, so does the annual global uncertainty change. Global uncertainty has been increasing recently (Fig. 4) because more emissions are coming from countries with less certain data collection and management practices (Fig. 5). The 3-D case assesses uncertainty primarily from a data revision perspective (Fig. 7). As data are revised, missing data are reported and methodology refined, global uncertainty for a given emission year settles to typically less than 2% growth after initial data publication.”
“This manuscript takes three different but complimentary assessments of the uncertainty in CDIAC estimates of FFCO2 emissions. None of these assessments give a systematic appraisal of the full uncertainty, but collectively they provide useful insights. Greatly simplifying the assessments contained herein and trying to address the community’s need for a single, global FFCO2 uncertainty value, 8.4% (2 σ) is offered as a reasonable combination of the data in Figs. 3, 4 and 7. Given the current data, this greatly simplified uncertainty value is dynamic and may change in the future as the global mix of fuels being consumed changes and as the distribution of those fuels to different countries changes. The lack of independent measurements may also hide systematic errors not incorporated into the three uncertainty cases analysed. If this uncertainty analysis did not capture all relevant terms, the uncertainty may actually be larger than that reported here.”
Bud: How are uncertainties propagated in Robert J. Andres, Thomas A. Boden & David Higdon (2014)?
Bud: How does NOAA Mauna Loa define uncertainty in its measured CO2 readings? How is it calculated, and what are the amounts and units?
Grok 3 beta referencing 14 web pages:
NOAA’s Mauna Loa Observatory, part of the Global Monitoring Laboratory (GML), defines the uncertainty in CO2 measurements as a reflection of the variability and precision inherent in the data collection process. Specifically, uncertainty arises from natural atmospheric variability (e.g., daily fluctuations due to weather systems) and the precision of the measurement instruments and calibration methods. The goal is to ensure that reported CO2 values represent “background” conditions—well-mixed atmospheric air free from local influences like vegetation, human activity, or volcanic emissions.
Definition of Uncertainty
Uncertainty in the Mauna Loa CO2 measurements is typically expressed as the statistical variability of the data used to compute monthly means. It quantifies how much the daily or hourly measurements deviate from a smooth trend, accounting for factors like short-term atmospheric changes and instrument performance. NOAA emphasizes that their measurements are calibrated frequently against reference standards to maintain high accuracy, and uncertainty reflects both this calibration process and the natural variability observed in the data.
How Uncertainty is Calculated
The uncertainty for monthly mean CO2 values at Mauna Loa is calculated based on the observed variability of daily means within a month. Here’s the general process:
Daily Means: Hourly CO2 measurements are taken continuously using a non-dispersive infrared (NDIR) analyzer or, since 2019, a Cavity Ring-Down Spectroscopy (CRDS) system. Only hours meeting “background” criteria (low variability and representative of mid-altitude Pacific air) are used to compute daily averages.
Monthly Means: Each monthly mean is the average of these daily means. The variability of these daily values around the monthly mean contributes to the uncertainty.
Statistical Method: NOAA uses the standard deviation of daily means from a fitted smooth curve (representing the seasonal cycle and long-term trend) as a basis. To refine this, they adjust for autocorrelation—since consecutive daily means are not fully independent (e.g., CO2 levels on one day may influence the next due to persistent weather patterns). This is detailed in research by Thoning et al. (1989).
Calibration Adjustments: The measurements are calibrated against World Meteorological Organization (WMO) reference gases with known CO2 mole fractions (e.g., 350–430 ppm at Mauna Loa). Comparisons between independent measurements (e.g., NOAA vs. Scripps Institution of Oceanography) further validate precision, typically within 0.2 ppm.
For the monthly mean uncertainty, NOAA provides vertical bars on graphs (e.g., the Keeling Curve) showing this value, derived from the variability of daily means and adjusted for autocorrelation.
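The procedure described above — the standard deviation of daily means around a fitted curve, reduced for autocorrelation — can be sketched as follows. The lag-1 effective-sample-size formula used here is a common textbook approximation, not necessarily the exact treatment in Thoning et al. (1989), and the input values are illustrative:

```python
import math

def monthly_mean_uncertainty(daily_residuals, r1):
    """Standard error of a monthly mean CO2 value from the residuals of
    daily means around a fitted smooth curve, with the sample size reduced
    to account for lag-1 autocorrelation r1 between consecutive days."""
    n = len(daily_residuals)
    mean = sum(daily_residuals) / n
    s = math.sqrt(sum((x - mean) ** 2 for x in daily_residuals) / (n - 1))
    n_eff = n * (1 - r1) / (1 + r1)  # effective number of independent days
    return s / math.sqrt(n_eff)

# Illustrative inputs: 30 daily residuals with ~0.4 ppm scatter and
# moderate day-to-day persistence (r1 = 0.5).
residuals = [0.4 if i % 2 == 0 else -0.4 for i in range(30)]
print(f"monthly mean uncertainty ~ {monthly_mean_uncertainty(residuals, 0.5):.2f} ppm")
```

With these inputs the result lands near 0.13 ppm, inside the 0.1–0.5 ppm range quoted below for monthly means.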
Amount and Units
Amount: The uncertainty in monthly mean CO2 at Mauna Loa is generally better than 0.2 parts per million (ppm) for the accuracy of the measurements compared to reference standards. However, the uncertainty of the monthly mean, based on atmospheric variability, is typically slightly higher. For example, NOAA reports the standard deviation of day-to-day variability across the entire record as approximately 0.38 ppm. The uncertainty of monthly means, factoring in autocorrelation, is often in the range of 0.1 to 0.5 ppm, depending on the month and year.
Units: The uncertainty is expressed in parts per million (ppm), the same unit as the CO2 mole fraction, which is defined as micromol CO2 per mol of dry air (µmol/mol).
Practical Example
On NOAA GML’s “Trends in CO2” page (updated as of March 13, 2025), the monthly mean CO2 data includes uncertainty bars on the deseasonalized trend. These bars represent the variability of daily means within each month, adjusted as described. The long-term accuracy of the measurements, validated by calibration and intercomparisons, remains within 0.2 ppm, but the monthly uncertainty reflects the natural fluctuations captured in the data.
In summary, NOAA defines uncertainty as a combination of measurement precision and atmospheric variability, calculates it using statistical analysis of daily means with calibration checks, and reports it in ppm, typically around 0.1–0.5 ppm for monthly means, with an instrumental accuracy better than 0.2 ppm.
Bud: look at the following paper and analyze whether the uncertainties stated have been propagated properly to final uncertainty value. Robert J. Andres, Thomas A. Boden & David Higdon (2014) A new evaluation of the uncertainty associated with CDIAC estimates of fossil fuel carbon dioxide emission, Tellus B: Chemical and Physical Meteorology, 66:1, DOI: 10.3402/tellusb.v66.23616 https://www.tandfonline.com/doi/full/10.3402/tellusb.v66.23616
Grok 3 beta:
6 web pages
Let’s analyze the paper by Andres et al. (2014), titled “A new evaluation of the uncertainty associated with CDIAC estimates of fossil fuel carbon dioxide emission,” published in Tellus B: Chemical and Physical Meteorology (DOI: 10.3402/tellusb.v66.23616). The goal is to assess whether the uncertainties stated in the paper have been propagated properly to the final uncertainty value.
Overview of the Paper
The paper evaluates the uncertainty in the Carbon Dioxide Information Analysis Center (CDIAC) estimates of CO2 emissions from fossil fuel combustion and cement production. It builds on previous CDIAC estimates, which provided a global annual uncertainty of ±5% (expressed as one standard deviation, 1σ), and aims to refine this by incorporating a Bayesian statistical approach using Monte Carlo simulations. The authors estimate uncertainties for individual countries, fuel types, and years, then aggregate these to a global total.
Uncertainty Sources and Methodology
The paper identifies three primary sources of uncertainty in CDIAC’s fossil fuel CO2 emission estimates:
Carbon Content (C): The carbon mass per unit of fuel, varying by fuel type (e.g., coal, oil, natural gas).
Fraction Oxidized (O): The proportion of carbon oxidized to CO2 during combustion.
Fuel Mass Consumed (M): The amount of fuel burned, derived from energy statistics.
The total CO2 emission for a given country, fuel type, and year is calculated as:
CO2 = C × O × M
The uncertainty in this product must account for the uncertainties in C, O, and M, and how they are combined.
The authors use a Monte Carlo approach:
Assign probability distributions to each parameter (C, O, M) based on literature, expert judgment, and data variability.
Perform 10,000 simulations to propagate these uncertainties through the emission equation.
Aggregate results to estimate uncertainties at country, fuel-type, and global levels.
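A minimal version of this Monte Carlo propagation can be written in a few lines. The normal distributions, parameter values, and percentile convention below are illustrative stand-ins, not the paper's actual inputs:

```python
import random

def mc_emission_uncertainty(c, sig_c, o, sig_o, m, sig_m, n=10_000):
    """Propagate independent normal uncertainties through CO2 = C * O * M
    by direct sampling; return the median and the ~1-sigma half-width
    (half the 16th-to-84th percentile interval)."""
    samples = sorted(
        random.gauss(c, sig_c) * random.gauss(o, sig_o) * random.gauss(m, sig_m)
        for _ in range(n)
    )
    median = samples[n // 2]
    lo, hi = samples[int(0.16 * n)], samples[int(0.84 * n)]
    return median, (hi - lo) / 2

# Illustrative U.S.-like case: M at +/-5%, C at +/-2%, O at +/-1% (relative, 1-sigma)
random.seed(42)
med, sig = mc_emission_uncertainty(1.0, 0.02, 1.0, 0.01, 1.0, 0.05)
print(f"relative uncertainty ~ {100 * sig / med:.1f}%")
```

The result lands near the √(5² + 2² + 1²) ≈ 5.5% quadrature value, as expected when small independent relative uncertainties combine in a product.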
Stated Uncertainties
Individual Parameters:
C: Uncertainty varies by fuel type (e.g., ±7% for coal, ±2% for oil, ±1.5% for natural gas, 1σ).
O: Typically ±0.5% to ±2%, depending on fuel and combustion conditions.
M: Uncertainty ranges from ±5% (developed countries with good statistics) to ±50% (countries with poor data), based on Intergovernmental Panel on Climate Change (IPCC) guidelines.
Country-Level: Uncertainty varies widely (e.g., ±5% for the U.S., up to ±53% for North Korea in 2010).
Global Annual: The final uncertainty is reported as approximately ±8.4% (1σ) for recent years (e.g., 2010), compared to the previous CDIAC estimate of ±5%.
Propagation of Uncertainty
To determine if uncertainties are propagated properly, we need to check:
Independence Assumption: Are C, O, and M treated as independent variables?
Combination Method: Is the Monte Carlo simulation correctly aggregating uncertainties?
Consistency: Does the final ±8.4% reflect the input uncertainties and their propagation?
1. Independence Assumption
The paper assumes C, O, and M are independent for each fuel type and country. This is reasonable for M (fuel consumption data from energy statistics) versus C and O (fuel properties and combustion efficiency), as these are derived from distinct sources. However:
C and O might have some correlation (e.g., coal with higher carbon content might oxidize differently). The paper doesn’t explore this, but such correlations are typically minor and not well-constrained by data, so assuming independence is a standard simplification.
For independent variables, the relative uncertainty in a product is approximated by the quadrature sum of relative uncertainties:
(σ_CO2/CO2)² ≈ (σ_C/C)² + (σ_O/O)² + (σ_M/M)²
The Monte Carlo method avoids this approximation by directly sampling distributions, but the assumption of independence still holds.
2. Monte Carlo Simulation
The Monte Carlo approach samples from distributions for C, O, and M (e.g., normal or lognormal, depending on data) and computes CO2 emissions 10,000 times per estimate. This method inherently propagates uncertainties without assuming a specific functional form (e.g., linear or Gaussian), capturing potential asymmetries or non-linear effects. The paper reports:
Median emissions (50th percentile) as the central estimate.
Uncertainty as the 68% confidence interval (16th to 84th percentiles), approximating 1σ.
For global totals, uncertainties from individual countries and fuel types are aggregated. Since emissions are summed, absolute uncertainties add in quadrature if uncorrelated:
σ_global = √(σ₁² + σ₂² + … + σₙ²)
3. Consistency with Input Uncertainties
Let’s test if ±8.4% makes sense:
Dominant Contributor: Fuel mass (M) has the largest uncertainty (e.g., ±5% to ±50%), dwarfing C (±1.5% to ±7%) and O (±0.5% to ±2%) for most cases.
Example: For the U.S. (a large emitter), if M = ±5%, C = ±2%, O = ±1%, the combined relative uncertainty is √(5² + 2² + 1²) ≈ ±5.5%. (The paper reports ±5% for the U.S., consistent with M dominating.)
Global Aggregation: With 185 countries, some with high uncertainty (e.g., ±53% for North Korea), the global uncertainty should exceed that of well-constrained countries like the U.S. However, large emitters (U.S., China) dominate the total, and their lower uncertainties (≈±5%) weigh heavily. The ±8.4% suggests a balance between high-uncertainty small emitters and low-uncertainty large emitters.
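The balance described above — large emitters with tight data dominating small emitters with very uncertain data — follows directly from quadrature aggregation. The emission totals and uncertainty values below are illustrative, not CDIAC's actual country figures:

```python
import math

def global_relative_uncertainty(emissions, rel_unc):
    """Aggregate independent country uncertainties: absolute sigmas add in
    quadrature, then the total is expressed relative to summed emissions."""
    total = sum(emissions)
    sigma = math.sqrt(sum((e * u) ** 2 for e, u in zip(emissions, rel_unc)))
    return sigma / total

# Illustrative numbers: two large emitters with well-constrained data and
# one small emitter with very poor data (the +/-53% North Korea-like case).
emissions = [5000.0, 10000.0, 50.0]   # Mt CO2
rel_unc   = [0.05,   0.10,    0.53]   # relative 1-sigma uncertainties
g = global_relative_uncertainty(emissions, rel_unc)
print(f"global relative uncertainty ~ {100 * g:.1f}%")
```

Dropping the small ±53% emitter shifts the result only in the second decimal place, which is why the global figure sits near the large emitters' uncertainties rather than near the extremes.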
The increase from ±5% (old CDIAC) to ±8.4% reflects:
More countries with poor data (e.g., developing nations post-1990).
Explicit inclusion of C and O uncertainties, previously approximated.
Critical Analysis
The propagation appears generally sound:
Strengths: The Monte Carlo method is robust, avoiding oversimplifications like assuming all uncertainties are Gaussian. Reporting the 68% interval aligns with 1σ conventions.
Potential Issues:
Correlation Neglect: If C and O are correlated across fuel types, the uncertainty might be slightly underestimated. However, this effect is likely small given M’s dominance.
Distribution Choice: The paper doesn’t detail distributions for every case (e.g., normal vs. lognormal). If M in high-uncertainty countries is skewed (common for sparse data), the 68% interval might not perfectly match 1σ, though the impact on the global total is muted by large emitters.
Temporal Correlation: Year-to-year uncertainties might be correlated (e.g., systematic errors in a country’s reporting), but the paper treats each year independently. This doesn’t affect the annual ±8.4% but could influence trends.
Conclusion
The uncertainties in Andres et al. (2014) are propagated properly to the final ±8.4% value. The Monte Carlo approach correctly combines uncertainties in (C), (O), and (M), and the global aggregation reflects the dominance of well-constrained large emitters tempered by noisier small contributors. Minor caveats (e.g., potential correlations, distribution assumptions) exist but don’t significantly undermine the result. The jump from ±5% to ±8.4% is plausible given the expanded scope and rigor, making the final uncertainty a reasonable representation of the input data and methodology.
“Our results highlight, for the first time, distinct patterns of population- and subpopulation-level vulnerability for a wide variety of bird species found dead at renewable energy facilities. Of the 23 priority bird species killed at renewable facilities, 11 (48%) were either highly or moderately vulnerable, experiencing a greater than or equal to 20% decline in the population growth rates with the addition of up to either 1000 or 5000 fatalities, respectively.” …
“Despite being the focus of massive conservation efforts [74–76], bird populations across North America have declined by nearly 3 billion individuals in less than 50 years [27], and similar bird declines are occurring across the world (e.g. [77]). Although we focused on direct mortality, renewable energy also may cause indirect and sub-lethal effects, for example, through displacement of birds and disruption of habitat. Furthermore, wind and solar are part of a suite of anthropogenic stressors that are relevant to avian populations. For example, climate change, habitat loss and degradation, pesticides, killing by domestic cats, and collision with transmission lines, vehicles and buildings [25,78–81] all can directly or indirectly affect bird populations.” …
“Critically, not only local but also non-local, and often very distant, subpopulations often were vulnerable to additional fatalities at California renewable energy facilities.”
Former Greenpeace Founder Patrick Moore Says Climate Change Based on False Narratives
Prominent scientist backs up claim that there is ‘no climate emergency’ By Lee Yun-Jeong
Patrick Moore, one of the founders of Greenpeace said in an email obtained by The Epoch Times that his reasons for leaving Greenpeace were very clear:
“Greenpeace was ‘hijacked’ by the political left when they realized there was money and power in the environmental movement. [Left-leaning] political activists in North America and Europe changed Greenpeace from a science-based organization to a political fundraising organization,” Moore said.
Moore left Greenpeace in 1986, 15 years after he co-founded the organization.
“The ‘environmental’ movement has become more of a political movement than an environmental movement,” he said. “They are primarily focused on creating narratives, stories, that are designed to instill fear and guilt into the public so the public will send them money.”
He said they mainly operate behind closed doors with other political operatives at the U.N., World Economic Forum, and so on, all of which are primarily political in nature.
The Intergovernmental Panel on Climate Change [IPCC] is “not a science organization,” he said. “It is a political organization composed of the World Meteorological Organization and the United Nations Environment Program.
“The IPCC hires scientists to provide them with ‘information’ that supports the ‘climate emergency’ narrative.
“Their campaigns against fossil fuels, nuclear energy, CO2, plastic, etc., are misguided and designed to make people think the world will come to an end unless we cripple our civilization and destroy our economy. They are now a negative influence on the future of both the environment and human civilization.”
“Today the left has adopted many policies that would be very destructive to civilization as they are not technically achievable. Only look at the looming energy crisis in Europe and the UK, which Putin is taking advantage of. But it is of their own making in refusing to develop their own natural gas resources, opposing nuclear energy, and adopting an impossible position on fossil fuels in general,” Moore wrote.
The Left ‘Hijacked’ Greenpeace
He said “green” for the environment and “peace” for the people were the organization’s founding principles, but peace was largely forgotten, and green had become the sole agenda.
“Many [so-called] ‘environmental’ leaders were now saying that ‘humans are the enemies of the Earth, the enemies of Nature.’ I could not accept that humans are the only evil species. This is too much like ‘original sin,’ that humans are born with evil, but all the other species are good, even cockroaches, mosquitos, and diseases,” Moore argued.
He said the new dominant philosophy is that the world would be better if fewer people existed.
“But the people who said this were not volunteering to be the first to go away. They behave as if they are superior to others. This kind of ‘pride’ and ‘conceit’ is the worst of the Cardinal Sins,” Moore said.
Environmental Activist
As a prominent scholar, ecologist, and long-time leader in the international environmental field, Patrick Moore is widely regarded as one of the world’s most qualified experts on the environment. He is also a founder of Greenpeace, the world’s largest environmental activist organization.
Moore received his Ph.D. in Ecology from the University of British Columbia in 1974 and an Honorary Doctorate of Science from North Carolina State University in 2005.
He co-founded Greenpeace in 1971 and served as president of Greenpeace Canada for nine years. From 1979 to 1986, Moore served as the Director of Greenpeace International, a driving force shaping the group’s policies and directions. During his 15-year tenure, Greenpeace became the world’s largest environmental activist organization.
In 1991, Moore founded Greenspirit, a consultancy focusing on environmental policies, energy, climate change, biodiversity, genetically modified food, forests, fisheries, food, and resources.
Between 2006 and 2012, Moore served as co-chairman of the Clean and Safe Energy Coalition, a U.S.-based environmental advocacy group.
In 2014, he was appointed Chairman of Ecology, Energy, and Prosperity of Frontier Centre for Public Policy, a non-partisan Canadian public policy think tank.
In 2019 and 2020, Moore served as the Chair of CO2 Coalition, a U.S.-based nonprofit environmental advocacy group dedicated to disputing false claims on CO2 as relates to climate change.
False Narrative on Chlorine
“At the time I decided to leave Greenpeace, I was one of 6 Directors of Greenpeace International. I was the only one with formal science education, BSc Honors in Science and Forestry, and Ph.D. in Ecology. My fellow directors decided that Greenpeace should begin a campaign to ‘Ban Chlorine Worldwide.’”
Moore said it is true that elemental chlorine gas is highly toxic and was used as a weapon in World War I. However, chlorine is one of the 94 [naturally occurring] elements on the Periodic Table and has many roles in biology and human health. For example, table salt (NaCl, or sodium chloride) is an essential nutrient for all animals and many plants. It is impossible to “ban” NaCl.
He pointed out that adding chlorine to drinking water, swimming pools, and spas was one of the most significant advances in public health history in preventing the spread of water-borne communicable diseases such as cholera. And about 85 percent of pharmaceutical drugs are made with chlorine-related chemistry, and about 25 percent of all our medicines contain chlorine. All halogens, including chlorine, bromine, and iodine, are powerful antibiotics; without them, medicine would not be the same.
“Greenpeace named chlorine ‘The Devil’s Element’ and calls PVC, polyvinyl chloride, or simply vinyl, ‘the Poison Plastic.’ All of this is fake [and] to scare the public. In addition, this misguided policy reinforces the attitude that humans are not a worthy species and that the world would be better off without them. I could not convince my fellow Greenpeace directors to abandon this misguided policy. This was the turning point for me,” Moore said.
False Narrative on Polar Bears
When asked how Greenpeace utilizes its massive donations, Moore said it was used to pay for “a very large staff” (likely over 2,000), extensive advertisements, and fundraising programs. And virtually all of the organization’s ads for fundraising are based on false narratives, which he had thoroughly disproven in his books, one example being the polar bears.
“The International Treaty on Polar Bears, signed by all polar countries in 1973 to ban unrestricted hunting of polar bears, is never mentioned in the media, by Greenpeace, or by politicians who say the polar bear is going extinct due to melting ice in the Arctic. In fact, the polar bear population has increased from between 6,000 and 8,000 in 1973 to between 30,000 and 50,000 today. This is not disputed,” Moore said.
“But now they say the polar bear will go extinct in 2100 as if they have a magic crystal ball that can predict the future. In fact, this past winter in the Arctic saw an expansion of ice from previous years, and Antarctica was colder during the last winter than in the past 50 years.”
Moore said that he does not pretend to know everything and predict the future with confidence like many in the “climate emergency” business claim they can do.
The Goal of the ‘Environmental Apocalypse’ Theory
“I believe the human population has always been vulnerable to people who predict doom with false stories,” Moore said.
“The Aztecs threw virgins into volcanos, and the Europeans and Americans burned women as witches for 200 years claiming this would ‘save the world’ from evil people. This has been [referred to as] ‘herd mentality,’ ‘groupthink,’ and ‘cult behavior.’ Humans are social animals with a hierarchy, and it is easiest to gain a high position by using fear and control.”
Moore said the environmental apocalypse theory is mostly about “political power and control,” adding that he is dedicated to showing people that the situation is not as negative as they are told.
“Today, in the richest countries, our descendants are making decisions that our grandchildren will have to pay for,” he said. “Predictions that the world is coming to an end have been made for thousands of years. Not once has this come true. Why should we believe it now?”
“People are naturally afraid of the future because it is unknown and full of risks and difficult decisions. I believe there is also an element of ‘self-loathing’ in this apocalypse movement.”
Moore said the young generation today is taught that humans are not worthy and are destroying the earth. This indoctrination, he said, has made them feel guilty and ashamed of themselves, which is the wrong way to go about life.
The Demonization of Carbon Dioxide
“Very few people believe the world is not warming. The record is clear that the world has been warming since about the year 1700, 150 years before we were using fossil fuels. 1700 was the peak of the Little Ice Age, which was very cold and caused crop failures and starvation. Before that, around 1000 A.D., was the Medieval Warm Period, when Vikings farmed Greenland. [And] before that, around 500 A.D., were the Dark Ages, and before that the Roman Warm Period, when it was warmer than today and the sea level was 1–2 meters higher than today,” Moore said.
“Even until about 1950, the amount of fossil fuel used and CO2 emitted were very small compared to today. We do not know the cause of these periodic fluctuations in temperature, but it was certainly not CO2.”
Moore clarified that the “minority opinion” is not about the history of the Earth’s temperature; rather, it is the relationship between temperature and CO2 that is at the center of the dispute.
“In this regard, I agree that many believe CO2 is the main cause of warming. CO2 is invisible, so no one can actually see what it is doing. And this ‘majority’ are mainly scientists paid by politicians and bureaucrats, media making headlines, or activists making money. [The rest are] the public who believe this story even though they can’t actually see what CO2 is doing,” Moore said.
Moore provided a graph of temperature continuously measured over 350 years (from 1659 to 2009) in central England. “If carbon dioxide were the main cause of warming, the temperature should rise along with the carbon dioxide curve, but it doesn’t,” he explained.
Google: “1659–2009 Temperature and Carbon Dioxide Emissions in Central England”
Moore described the demonization of CO2 as “completely ridiculous.” He added that CO2 is the basis of all life on Earth and its concentration in the atmosphere today, even with the increase, is lower than it has been for a large majority of life’s existence.
Rising CO2 Correlates With Increased Vegetation: Study
A study in 2013 found that increased levels of carbon dioxide (CO2) have helped boost green foliage across the world’s arid regions over the past 30 years.
The Australian government agency CSIRO conducted the research in collaboration with the Australian National University (ANU). The data was based on satellite observations from 1982 to 2010 across parts of the arid areas of Australia, North America, the Middle East, and Africa.
It found an 11 percent increase in foliage cover in the studied area due to what’s called “CO2 fertilization.”
The study said a fertilization effect occurs when elevated CO2 levels enable a leaf during photosynthesis—the process by which green plants convert sunlight into sugar—to extract more carbon from the air or lose less water to the air or both.
“If elevated CO2 causes the water use of individual leaves to drop, plants in arid environments will respond by increasing their total numbers of leaves. These changes in leaf cover can be detected by satellite, particularly in deserts and savannas where the cover is less complete than in wet locations,” according to Randall Donohue, the CSIRO research scientist.
Breaking the Global Warming Narrative
“Climate alarmists prefer to discuss climate knowledge only since 1850. They refer to the time before this as the pre-industrial age. This ‘pre-industrial age’ spans the more than 3 billion years that life has been on Earth. Many climate changes [occurred during that period], including Ice Ages, Hothouse Ages, major extinctions due to asteroid impacts, and other unknown causes,” Moore said.
“Today, the Earth is in the Pleistocene Ice Age, which began 2.6 million years ago. … So, the most recent major glaciation, which peaked 20,000 years ago, was not the end of the Ice Age. We are still in the Pleistocene Ice Age no matter how the climate alarmists wish to deny this.”
He said the great irony of the present panic about the climate is that the Earth is colder today than it was for 250 million years before the Pleistocene Ice Age set in. And CO2 is lower now than in more than 95 percent of Earth’s history.
“But you would never know this if you listen to all the people who benefit from the lie that the Earth will soon be too hot for life and that CO2 will become higher than in Earth’s history,” Moore said.
‘More CO2 Is Beneficial to the Environment and Humans’
According to Moore, nearly all commercial greenhouse farmers worldwide buy CO2 to inject into their greenhouses to realize up to 60 percent higher crop yields.
“I was impressed when flying over South Korea [and seeing] how many greenhouses there are in the valleys. Like British Columbia, Korea has a lot of mountains and not so much flat fertile farmland.
“I am sure the greenhouse farmers are putting more CO2 in their greenhouses, up to double and triple what it is in the atmosphere today. This is because nearly all plants growing outside in the natural atmosphere are starved for CO2, and it is what limits them from growing faster,” Moore added.
“Please refer to the chapter titled ‘Climate of Fear and Guilt’ in my book, [Fake Invisible Catastrophes and Threats of Doom], if you wish to gain a full understanding of these facts,” he said.
Moore said that most environmental activists, politicians, and so-called experts know that we cannot stop the increasing use of fossil fuels or reduce CO2 emissions on their proposed schedule.
“In 2015, while attending the COP (Conference of the Parties) in Paris, I offered a public bet for $100,000 in a broadcast media release which went out on more than 200 media feeds, that by 2025 global CO2 emissions would be higher than in 2015. I did not receive one taker, not even from the ‘greens,’” Moore said.
“I know that more CO2 is entirely beneficial to both the environment and human civilization. I am proud to be a director of the CO2 Coalition.”
The Irony of ‘Carbon Neutrality’
Moore said “carbon neutrality” is a political term, not a scientific one.
“It is simply wrong to call CO2 ‘carbon.’ Carbon is an element that is what diamonds, graphite, and carbon black (soot) are composed of. [And] CO2 is a molecule that contains carbon and oxygen and is an invisible gas that is the primary food for all life. [Likewise], it is incorrect to refer to NaCl (table salt) as ‘chlorine,’ even though NaCl contains chlorine,” Moore said.
He said that when elements (atoms) combine with each other to form compounds (molecules), the compounds always have very different properties than the elements they are made from.
“‘Net-Zero’ is also a political term made up by activists who are not scientists. For example, the top leaders of this crusade are people like Al Gore, Leonardo DiCaprio, and Greta Thunberg, none of whom are scientists.”
According to Moore, Russia, China, and India account for 40 percent of the human population, and they do not agree with this anti-fossil-fuel agenda.
“If we add Brazil, Indonesia, and most African countries, it is a majority of the population who are not climate fanatics,” Moore added.
“Another great irony is that many countries with the coldest climates, such as Canada, Sweden, Germany, and the UK, are the most concerned about warming. For example, the average annual temperature in Canada is -5.35 degrees Celsius.”
Moore also said that visible fumes from engines are not CO2 but other substances, since CO2 is invisible and odorless. Dust is likewise not CO2; it is soot, which can be controlled with present technology. And coal plants built today are much cleaner than those built 20 years ago.
‘Wind and Solar Power Are Parasites on the Economy’
“Solar and wind power are both very expensive and very unreliable. It is almost like a mental illness that so many people have been brainwashed to think entire countries can be supported with these technologies,” Moore said.
“I believe wind and solar energy are parasites on the larger economy. In other words, they make the country poorer than if other more reliable and less costly technologies were used.”
Moore said that wind and solar providers rely heavily on government subsidies, tax write-offs, and mandates, where citizens are forced to buy wind and solar power even if it is more expensive, on the pretext that it is “environmentally friendly.”
“Millions of people pay more for wind and solar energy while a few people make millions of dollars, marks, pounds, etc. It is a bit like a Ponzi scheme in the stock markets,” Moore added.
“They require vast areas of land, are not available most of the time, and require reliable energy such as nuclear, hydroelectric, [coal, and natural] gas to be available when wind and solar are unavailable.”
According to Moore, the construction of wind and solar farms uses vast amounts of fossil fuels for mining, transportation, and construction. And in many locations, they do not produce as much energy over their lifetimes as is required to build and maintain them.
“Why not use reliable energy [such as nuclear, hydroelectricity, natural gas, etc.] as the primary source?” Moore questioned, adding if that were the case, “then wind and solar would be unnecessary.”
‘Plastic Is Not a Toxic Substance’
“Plastic is not a toxic substance. That is why we package and wrap our food in it, to prevent it from becoming contaminated. Plastic does not magically become toxic when it enters the ocean,” Moore said.
“Of course, they say on one hand that plastic will never break down, and then, on the other hand, they say it will quickly decay into ‘microplastics,’ which, of course, are conveniently invisible so no one can observe or verify this for themselves. How clever!”
According to Moore, our digestive system can tell the difference between food and plastic or tiny particles of sand. Our bodies do not absorb sand into the bloodstream, no matter how microscopic the sand is.
He said plastic floating in the ocean is like a tiny floating reef, the same as driftwood. It provides a surface for marine species to lay their eggs on, attach themselves to, and eat things that are attached to it.
“Pollution is usually toxic or causes harm to life. Plastic is simply ‘litter’ beside the road. It is not hurting anything. One exception is discarded fishing nets, not because they are plastic but because they are shaped to catch fish. The environmental community should work with the fishing industry to stop throwing damaged nets in the sea and bring them back to the dock, where they can be recycled, used in a waste-to-energy plant, or discarded safely,” Moore added.
This interview is a compilation of an email exchange between Moore and South Korean Professor Seok-soon Park, professor of environmental engineering at Ewha Womans University in Seoul, South Korea, from November–December 2021. It was provided to The Epoch Times by Park with the permission of Moore on July 7, 2022.
Both Park and Moore are among the 1,100 scientists and professionals who signed the World Climate Declaration (WCD) stating that there is no climate emergency.
Professor Seok-soon Park was the first Korean to receive master’s and doctorate degrees in environmental science from Rutgers University in the United States.
He returned to South Korea in 1988 to work at the Korea Science Foundation.
He served as the head of the National Institute of Environmental Sciences, the president of the Korea Society for Environmental Education, the presidential science and technology advisor, and the presidential green growth committee.
He has published more than 20 books and contributed 150 papers to domestic and foreign journals and 180 environmental columns to Korean newspapers and specialized journals.
Park has refused to participate in environmental movements that exaggerate the risks of global warming and oppose development unconditionally.
He founded the Korea Free Environment Federation, which aims to protect environmental rights, reveal the truth about climate change, and help advance the country’s economy.
Dear Gov. Newsom and all virtue-signaling EV tyrants…
Excerpt from abstract:
“Future scenarios of the Spanish electricity grid confirm that a massive introduction of renewable energies would lead to a 19.26% and 27.41% decrease of CO2-eq emissions in 2030 and 2050, respectively. Nevertheless, current and future energy scenario predictions show that electric vehicles will produce an increase in fine particulate matter formation (26%), human carcinogenic (20%) and non-carcinogenic toxicity (61%), terrestrial ecotoxicity (31%), freshwater ecotoxicity (39%), and marine ecotoxicity (41%) relative to petrol vehicles. Transfer of environmental burdens from the use phase to the raw materials extraction and manufacturing phases entails a delocalisation of the impacts, which constitutes a new challenge at environmental, social, and legal levels. This study is the first comparative analysis of the environmental impacts of passenger vehicles in Spain from a cradle-to-grave perspective, considering different impact categories and time scopes.”
Bud’s comment: The only harmful thing about water vapor and CO2 (the major greenhouse gases) is the people promoting the global warming/climate change fraud.
Comparative life cycle assessment of conventional, electric and hybrid passenger vehicles in Spain
“Those who know me are aware that I don’t believe anything until I run the numbers myself. I kept hearing that President Trump’s tax plan rewarded the rich and punished the middle class and the poor … turns out that’s a Democrat lie. Here’s the truth.” – Willis Eschenbach @WEschenbach. He writes at the blog Watts Up With That. https://wattsupwiththat.com/
Pierre Poilievre, MP, Conservative candidate for Prime Minister of Canada, announced, “I will ban all my ministers from any involvement in the World Economic Forum (WEF),” should he become Prime Minister of Canada.
Former Prime Minister Stephen Harper of the Conservative Party has endorsed Pierre Poilievre. Mr. Poilievre served as Minister of State for Democratic Reform and Minister of Employment and Social Development under Harper.
HELLO to America’s GOP, floundering without a platform, Pierre Poilievre has a good idea!
Humans inhale air of ~78% N2 gas, ~21% O2 gas, a variable percentage of water vapor, ~0.93% argon gas, and many trace gases including ~0.04% CO2. They then exhale ~78% N2, ~16% O2, a variable percentage of water vapor, ~0.93% argon, and ~4% CO2. In other words, humans exhale CO2 that is about 100 times more concentrated than the air they inhaled a moment before. Where does that CO2 go?
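The 100-times figure is simple arithmetic; a minimal sketch in Python, using only the approximate percentages quoted above:

```python
# Ratio of CO2 concentration in exhaled breath vs. inhaled air,
# using the approximate percentages quoted in the text.
inhaled_co2_pct = 0.04   # ~0.04% CO2 in ambient air
exhaled_co2_pct = 4.0    # ~4% CO2 in exhaled breath

ratio = exhaled_co2_pct / inhaled_co2_pct
print(f"Exhaled CO2 is ~{ratio:.0f}x more concentrated than inhaled air")
```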
The measured (not computer-modeled, not estimated) net increase of CO2 from all sources, combining all human CO2 sources and sinks and all natural CO2 sources and sinks, for 2020 was only 0.000258% of the atmosphere, per publicly available NOAA–Scripps Oceanographic data (1). (2020 data are used here because 2021 data had not been finalized.) This means that for 2020 the net amount of CO2 added to the atmosphere by humans (including fossil fuel use, cement production, land use, breathing, etc.) could not have exceeded 0.000258% of the atmosphere and was probably much less. Dividing the 2.58 ppm net global average increase for 2020 by the 414.24 ppm net global average CO2 concentration gives 0.6%. The maximum possible average net human CO2 addition to the atmosphere for 2020 could not have exceeded 0.6% of total CO2.
It is the maximum possible net CO2 emitted by humans because that amount includes CO2 emissions from all sources and CO2 absorption by all sinks, both natural and human. CO2 is always, day and night, everywhere (24 hours x 365 days per year), being absorbed by the environment, mostly the ocean surface and plants, and simultaneously always being emitted. (CO2 gas is continuously colliding with the ocean surface, which covers ~71% of Earth’s surface, and CO2 is highly soluble in water and seawater; all decaying biological material emits CO2.) “Net” CO2 means CO2 absorption subtracted from CO2 emission. Subtracting the CO2 concentration of 2019 from the CO2 concentration of 2020 results in only 2.58 ppm, or only 0.000258% of the atmosphere on average. Thus, for 2020, net human CO2 addition to the atmosphere cannot exceed, and must be less than, 0.000258% of the atmosphere and less than 0.6% of total CO2.
According to NOAA’s measurements, for year 1970, net CO2 in air from all sources and sinks was 325.68 ppm. That was a net increase from 1969 of 1.06 ppm due to all sources and sinks, natural and human. For 2020, net CO2 in air from all sources and sinks was 414.24 ppm, an increase of 2.58 ppm from 2019, due to all CO2 sources and sinks, natural and human. That is 0.000258% increase in CO2 for 2020. Subtracting: 414.24 minus 325.68 equals 88.56 ppm, which is the measured net CO2 increase in 50 years, for a 50-year average of 1.77 ppm per year increase, due to all sources and sinks, natural and human. That is a 0.000177% per year average increase. Remember your exhaled breath is ~4% CO2 with each exhaled breath, higher if you are exercising. Net CO2 due to all sources and sinks has grown only 27% in 50 years, 88.56 ppm, which might sound like a lot, except remember it is starting from a very low CO2 concentration; for 1970 CO2 was 325.68 ppm.
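The arithmetic in the paragraphs above can be reproduced directly. A minimal sketch, using only the ppm values quoted in the text:

```python
# Reproduce the CO2 arithmetic quoted above (all ppm values from the text).
co2_1970 = 325.68            # ppm, 1970 global average, per the text
co2_2020 = 414.24            # ppm, 2020 global average, per the text
annual_increase_2020 = 2.58  # ppm, 2020 minus 2019, per the text

# 2.58 ppm expressed as a percentage of the whole atmosphere.
pct_of_atmosphere = annual_increase_2020 / 1_000_000 * 100   # 0.000258%

# 2.58 ppm expressed as a percentage of total atmospheric CO2.
pct_of_total_co2 = annual_increase_2020 / co2_2020 * 100     # ~0.62%, i.e. "0.6%"

# The 50-year change, 1970 to 2020.
rise_50yr = co2_2020 - co2_1970          # 88.56 ppm
avg_per_year = rise_50yr / 50            # ~1.77 ppm per year
pct_growth = rise_50yr / co2_1970 * 100  # ~27% growth over 50 years

print(pct_of_atmosphere, pct_of_total_co2, rise_50yr, avg_per_year, pct_growth)
```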
In the graph below, the red line is approximately 1% representing the sum of all rare gases in air. Beneath that is the purple line which graphs the net global average CO2 concentration from 1970 to 2020 computed and reported by NOAA, 325.68 ppm in 1970 to 414.24 ppm in 2020; these data are not detrended. Beneath that is the dashed green line which graphs the annual CO2 increase, the net of all CO2 sources and CO2 sinks, from 1.6 ppm per year for 1970 to 2.58 ppm per year for 2020, which is the maximum possible human emissions for each year.
Billions of dollars were just foolishly and ignorantly passed into U.S. law by Democrats. Add to that the $9 TRILLION per year estimated by McKinsey & Co., global bankers, Foreign Affairs Magazine, the State Department, the EPA, academics living on government grants, the UN, WHO, EU, WEF, politicians in over 150 countries, computer gamers masquerading as climate scientists, complicit mainstream media, Greta, Gore, etc., as the amount needed to control human-produced CO2. All of it is a fraud, a total and tragic waste of resources. And this fraud has been ongoing for over 50 years.
It is ludicrous to believe that CO2 is the temperature control knob for the Earth, as claimed by the United Nations. But on top of that, there are the impossible economics and engineering being proposed, built, and installed today to transition from a productive fossil-fuel-powered economy to an impossible dystopian fantasy future that will fail, as described below. This travesty continues because people fall for the mantra of fear propaganda, continue to elect politicians who support it, and allow teachers to teach it.
Only concerned voters and parents can stop this fraud.
The article at the following link is by James Freeman: