Advanced science, Bangladesh, Economic, Environmental, International, Political, Technical

Welcome to the age of climate change

Our planet is under tremendous stress now. During the last week of January, major cities in the US Midwest and Northeast were colder than some regions in Antarctica. The temperature in Minneapolis dipped as low as negative 32 degrees Celsius, with the wind chill reaching negative 47. Grand Forks in North Dakota recorded the lowest wind chill, at negative 54 degrees. As many as 21 cold-related deaths have been reported so far.

Temperatures during the first week of February rose on average by a whopping 40-50 degrees. However, the reprieve is going to be short-lived as the frigid temperatures are expected to return later this month.

Although the scientifically challenged US president wants global warming to “come back fast”, someone should whisper into his ear that extreme cold spells in the Northern Hemisphere are caused, at least in part, by global warming. Under normal circumstances, a mass of cold air sits above the poles in an area called the polar vortex. Emerging research suggests that a warming Arctic distorts the vortex over the North Pole, so that instead of staying where it belongs in winter, close to the Arctic Circle, the air moves south into the continental United States. Hence the brutal cold spells. With the rapid warming of the Arctic, these disruptions of the polar vortex could become more frequent and severe, bringing longer and more intense cold snaps and storms.

While we are trying to stay warm, down under, Australians are getting baked by record-breaking heat. Over two days in November, temperatures exceeding 40 degrees in Australia’s north wiped out almost one-third of the nation’s fruit bats, also known as spectacled flying foxes. Scores of brumbies—Australian wild horses—in the Northern Territory have fallen victim to the January heatwave, which soared to a high of 47 degrees. They died from starvation and dehydration. More than a million fish have perished in a river in New South Wales as the water temperature surpassed their tolerance limit.

Last summer, many nuclear power plants in Europe halted operation because overheated river water could no longer cool down the reactors. And like many Asian megalopolises, Bangkok is choking on air pollution. Water cannons are used to alleviate the smog that has shrouded the city for weeks.

A series of droughts with little recovery time in the intervals has pushed millions to the edge of survival in the Horn of Africa. Bangladesh is staring at an unprecedented migration problem as hundreds of thousands face a stark choice between inundated coastal areas and urban slums.

California saw its most ruinous wildfires ever in 2018, claiming more than 100 lives and burning down nearly 1.6 million acres. There have even been freak blazes in Lapland and elsewhere in the Arctic Circle. There is ample data to suggest that climate change is the biggest driver of out-of-control wildfires. In colder regions, an unusually warm climate leads to earlier snowmelt and, consequently, an earlier spring. An early spring causes soils to be drier for a longer period of time. Drier conditions and higher temperatures not only increase the likelihood of a wildfire occurring, but also affect its severity and duration.

Typhoon Mangkhut with maximum sustained winds of 120 miles per hour roared across the Philippines and China in September 2018, triggering landslides, extensive flooding and killing some 100 people. The ferocity of the typhoon matched that of Hurricane Florence on the other side of the globe that pummelled the Mid-Atlantic Coast of the United States just four days earlier. The wind speed was 130 miles per hour and the hurricane claimed 36 lives.

Cutting-edge research by climate scientists indicates that the intensity of hurricanes and typhoons is closely connected to global warming. Higher sea levels, due to the melting of glaciers and Greenland’s ice sheets and the expansion of warming water, give coastal storm surges a higher starting point. Additionally, because hurricanes and tropical storms gain energy from warm water, their destructive power intensifies. Moreover, as the Earth has warmed, the probability of a storm with high precipitation levels is much higher than it was at the end of the twentieth century.

Besides raising the sea level, climate change is also modifying oceans in different ways. According to a study published in Nature Communications in January 2019, as climate change gradually heats oceans around the globe, it is also making the ocean waves stronger and more deadly.

Climate change is ravaging the natural laboratory in the Galápagos Islands, one of the most pristine and isolated places in the world, where Charles Darwin saw a blueprint for the origin and natural selection of every species, including humans. Today, because of the more frequent El Niño events that have come with warming of the seas, the inhabitants of the islands are trying to cope with the whims of natural selection.

Welcome to the age of climate change! These are just a few examples of multiple weather-related extremes occurring all over the world. They beg the question: Can human beings survive the climate crisis? The answer depends on what we do in the next 10-20 years. It will determine whether our planet will remain hospitable to human life or slide down an irreversible path towards becoming uninhabitable.

At the World Economic Forum in Davos last month, the UN Secretary General Antonio Guterres said, “If what we agreed in Paris would be materialised, the temperature would rise more than three degrees.” He is finally seeing eye-to-eye with the mainstream scientists and essentially declared the 2015 Paris Accord a dead deal.

If global temperature indeed increases by more than three degrees, summer heat would become unbearable. In particular, temperatures and humidity levels in cities that are already scorching hot would rise to levels that the human body simply cannot tolerate, researchers warn. More importantly, it would trigger a positive greenhouse-effect feedback loop that would eventually push our planet, according to Guterres, “dramatically into a runaway climate change….” Once the runaway greenhouse effect starts, Paris-like accords, conferences of parties, rulebooks for adaptation to climate change, or going cold turkey on fossil fuels won’t be able to reverse the situation.

The runaway greenhouse effect is not a “Chinese hoax.” Several billion years ago, Venus was cooler than it is now and had an abundance of water in oceans overlain by an oxygen-rich atmosphere. The current hellish condition on Venus, where the surface temperature is a blistering 460 degrees Celsius, was caused by a runaway greenhouse effect.

Thus, without a significant adjustment to how we conduct our lives, the possibility of Venus syndrome is quite high. In this scenario, our planet would still keep on spinning, but as the fourth dead ball of rock devoid of life.

Quamrul Haider is a Professor of Physics at Fordham University, New York.

Advanced science, Astrophysics, Life as it is, Technical

Quantum Conundrum

The quantum concept, which came into existence in the year 1900, was both revolutionary in outlook and spectacular in outcome. The concept was put forward by Max Planck in 1900 when he tried to explain black body radiation, and was subsequently taken up in 1905 by Albert Einstein (then still unknown to the world), who gave a rational explanation to a hitherto intractable scientific problem.

Classical physics (also known as Newtonian physics) ruled the day until about 1900, when all day-to-day physical problems could be explained by that discipline. But it gradually ran out of steam as new, technically challenging phenomena came to light with the invention of new instruments and the availability of reliable measurements.

Intractable physical processes like black body radiation, the interaction of light with particles, the puzzling behaviour of light itself and many more could not be explained by traditional classical mechanics. So a new method, a new mode of thinking, a new science had to be invented to explain all these inexplicable things.

Although Max Planck was the first to venture outside the conventional concept of light being a wave in order to explain ‘black body radiation’ in 1900, it was Albert Einstein who gave the scientific explanation by proposing in 1905 the ‘quantisation’ of light – a phenomenon where light was assumed to consist of discrete packets of energy – which he called quanta of light, or photons. This quantum of light was advanced in order to explain the hitherto inexplicable photoelectric process, where light was allowed to fall on the surface of a metal and electrons were detected to have been emitted. No matter how long or how intense low-frequency light was, electrons would not be emitted. Only when light of higher frequencies was used were electrons emitted. Einstein showed that photons (quanta of energy in bundles) of higher frequencies have higher energies, and only those higher-energy photons could eject electrons. (It is as if, no matter how long or how heavy the rain, the roof would not be dented; only when hailstones of sufficiently big size fall on the roof does it cave in.) For this quantisation theory, Einstein was awarded the Nobel Prize in 1921.
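Einstein’s relation can be put into a few lines of arithmetic. The sketch below, in Python, uses an assumed work function of about 2.28 eV (roughly that of sodium), chosen purely for illustration; it shows why low-frequency light ejects no electrons, however intense, while a single higher-frequency photon does the job.

```python
# Illustrative constants; the work function of 2.28 eV (roughly sodium's)
# is an assumed example value, not tied to any particular experiment.
H = 6.626e-34        # Planck's constant, J*s
EV = 1.602e-19       # joules per electronvolt

def max_kinetic_energy_ev(frequency_hz, work_function_ev):
    """Einstein's photoelectric relation: K_max = h*f - phi.
    Returns the maximum kinetic energy of an emitted electron in eV,
    or None if the photon energy is below the work function
    (no electrons emitted, however intense the light)."""
    photon_energy_ev = H * frequency_hz / EV
    k_max = photon_energy_ev - work_function_ev
    return k_max if k_max > 0 else None

phi = 2.28          # assumed work function, eV
red = 4.3e14        # red light: photon energy ~1.78 eV, below threshold
uv = 1.0e15         # ultraviolet: photon energy ~4.14 eV, above threshold
print(max_kinetic_energy_ev(red, phi))   # None: no emission
print(max_kinetic_energy_ev(uv, phi))    # positive kinetic energy, eV
```

Doubling the brightness of the red light doubles the number of photons but changes nothing here, which is precisely the observation classical physics could not explain.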

Thus, light came to be viewed as both wave and particle, depending on experimental circumstances, and hence the term ‘wave-particle duality’ entered the common vocabulary. If electromagnetic light can be viewed both as wave and particle, can particles (like electrons) behave like waves? Indeed they can. If electrons are allowed to go through two slits, they interfere and produce alternate bright and dark fringes on a screen, exactly as light waves do. The microscopic world does not distinguish between waves and particles; they are blurred into indistinguishable entities. That is the picture of nature that quantum mechanics has produced.

Although Einstein was the pioneer of the quantisation of light, he was not at ease with the way this new concept was taken up by the ‘new lions’ under the stewardship of physicists like Niels Bohr, Wolfgang Pauli, Werner Heisenberg, Erwin Schrodinger, Max Born and many more in the early part of the last century. They collectively produced the full-blown quantum mechanics, which Einstein had difficulty in accepting.

In quantum theory, particles like electrons revolving round the nucleus of an atom do not exist as particles. They are like strata of waves smeared around the nucleus. However, they manifest themselves as particles when some energy is imparted to the atom or taken away from it, causing those electrons to move up or down in energy levels. In other words, electrons show up only when there is an interaction or transition. Without such transitions, electrons simply do not show up. The electrons (with negative charge) are there around the nucleus, but there is no way of telling where they are – only the probability of their presence (the wave function) can be described! No wonder Einstein was not happy with such a description, which he called incomplete.

Heisenberg produced what came to be known as the ‘Heisenberg uncertainty principle’: for an elementary particle like an electron, position and momentum cannot both be measured with absolute accuracy at the same time. The act of measuring the position of an electron disturbs the complementary quantity, its momentum, and so a certain amount of uncertainty in momentum creeps in – that is the uncertainty principle. A similar uncertainty exists when measuring the time and energy of the particle at the same time.
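The principle can be quantified: the product of the uncertainties in position and momentum can never be smaller than half the reduced Planck constant. A minimal illustration in Python, taking as an assumed example an electron confined to atomic dimensions (about one angstrom):

```python
HBAR = 1.055e-34   # reduced Planck constant, J*s
M_E = 9.109e-31    # electron mass, kg

def min_momentum_uncertainty(delta_x_m):
    """Heisenberg's relation: delta_x * delta_p >= hbar / 2.
    Returns the smallest momentum uncertainty (kg*m/s) compatible
    with a given position uncertainty."""
    return HBAR / (2.0 * delta_x_m)

# Illustrative case: electron confined to ~1 angstrom (1e-10 m).
dp = min_momentum_uncertainty(1e-10)
dv = dp / M_E   # corresponding velocity uncertainty
print(f"delta_p >= {dp:.2e} kg*m/s, delta_v >= {dv:.2e} m/s")
```

The resulting velocity uncertainty, hundreds of kilometres per second, is why the electron in an atom cannot be pictured as a little ball on a definite orbit.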

Niels Bohr, the high priest of quantum mechanics, produced from his Institute for Theoretical Physics in Copenhagen what came to be known as the ‘Copenhagen Interpretation’ of quantum mechanics. This interpretation advanced the idea that elementary particles like electrons do not exist in stable or stationary conditions; they exist only in transitions and interactions.

The ‘Copenhagen Interpretation’ further emphasised that a quantum particle can only be said to exist when it is observed; if it is not observed, it does not exist. This was a revolutionary concept. Einstein could not reconcile himself to that idea. He retorted, “When the Moon is there in the sky, it is real; whether one observes it or not”. Thus the great intellectual battle on the nature of reality ensued between Einstein and Bohr. Einstein firmly believed that quantum mechanics as it existed in his lifetime was inconsistent and incomplete (although he withdrew the ‘inconsistent’ branding, as quantum mechanics kept explaining modern technical processes with consistency). To prove the ‘incompleteness’, he produced various ‘thought experiments’ at various times to challenge Bohr’s ‘Copenhagen Interpretation’. Bohr countered those challenges with technical explanations, but Einstein was never fully convinced.

Einstein did not like the abstract nature of quantum mechanics. He always demanded that theory must correspond to reality; if not, it becomes a ‘voodoo’ science.

For his criticism, he was not very popular with the advocates of the ‘Copenhagen Interpretation’. They even lamented: how is it possible that Einstein – who was the pioneer of quantum theory, and who revolutionised the concept of gravitation by showing that space is warped by gravity and that the gravitational field is indeed the space – is now reluctant to accept the ideas of quantum mechanics?

Quantum mechanics has solved many intractable problems and made many predictions that subsequently proved true. But at the same time, it is incomprehensible, extremely abstract and devoid of ‘elements of reality’. Anybody hoping to see theory mirroring reality would be totally disappointed. Even Richard Feynman, the American Nobel laureate who contributed significantly to the development of quantum physics, once remarked, “I think I can safely say that nobody understands quantum mechanics”! Nonetheless, quantum mechanics is the most advanced scientific discipline of today.

– Dr A Rahman is an author and a columnist.

Advanced science, Environmental, International, Technical

Potential Carbon Capture Techniques

Carbon capture and storage

The Intergovernmental Panel on Climate Change (IPCC) concedes that limiting the rise in global temperature to below two degrees Celsius before the end of this century is impossible without reducing emissions of carbon dioxide to zero by 2050. However, the majority of scientists agree that zero emission alone will not solve the problem of global warming. That is because we have already done too much damage to the climate to avoid warming just by halting the burning of fossil fuels. Besides, the current concentration of carbon dioxide in the atmosphere would keep on trapping heat for hundreds of years.

So, what’s the way out? Despite the bleak outlook, we can still limit global warming to under two degrees by going carbon negative together with zero emission. Carbon negative means removing more carbon dioxide from the atmosphere than adding to it.

The technique that is currently used to remove carbon dioxide and potentially other greenhouse gases from the atmosphere independent of its source is known as Direct Air Capture (DAC). Within the context of DAC, carbon dioxide is sucked out of the ambient air with a giant network of fans. Once carbon dioxide is trapped, it is liquefied and transported through pipelines and stored underground, often in natural reservoirs like depleted oil wells that can hold the gas for millions of years. There is also growing interest in storing the liquid carbon dioxide in saline aquifers due to their enormous storage capacity.

The companies at the forefront of DAC technology are Carbon Engineering in Vancouver, Climeworks in Zurich and Global Thermostat in New York. The Mercator Research Institute on Global Commons and Climate Change in Berlin claims that Climeworks’ DAC plant is the first of its kind to operate on an industrial scale.

Zero or near-zero emission of carbon dioxide could be achieved by using the Carbon Capture and Storage (CCS) technology. The process is similar to DAC technology except that CCS traps carbon dioxide from the exhaust stream of power plants, thereby preventing it from entering the atmosphere.

There are a handful of coal-fired power plants around the world that are using the CCS technology. The largest such plant, Petra Nova in Texas, captures around 5,000 tonnes of carbon dioxide per day from its exhaust. That is about 90 percent of all the carbon dioxide the plant produces.

Another zero-emission technique is known as Bio Energy with Carbon Capture and Sequestration (BECCS). It involves growing crops, burning them to generate electricity, capturing the carbon dioxide emitted during combustion and storing it deep down into the Earth’s crust. Eventually, over the course of millennia, it is converted into carbonate rocks.

Clearly, BECCS obviates the need to extract fossil fuels, thus closing the carbon loop and enabling carbon neutrality by replacing fossil fuel with crops. There are about two dozen BECCS pilot projects operated by multi-national companies like Shell, Chevron and Archer Daniels Midland (ADM). Since 2011, ADM has been sequestering about a million tonnes of carbon dioxide per year.

At Sandia National Laboratories in Albuquerque, New Mexico, scientists are working on applying concentrated sunlight to the captured carbon dioxide to initiate reactions that yield carbon monoxide, hydrogen and oxygen. Because carbon monoxide and hydrogen are the basic chemical building blocks of synthetic fuels, they call this process “sunshine to petrol”. Indeed, researchers have demonstrated that 75 percent of the carbon dioxide captured from the air can be converted into methanol. This shows that the main culprit of global warming can be recycled into useful products. Moreover, production of these carbon-recycled products would be carbon neutral or carbon negative.

Billions of tonnes of carbon dioxide could also be captured by rocks via a natural chemical reaction and permanently stored in an environmentally benign form, according to researchers at Columbia University in New York and the US Geological Survey. They found that when a rock known as peridotite comes in contact with carbon dioxide, it converts the gas into harmless minerals such as calcite. This process is known as “carbon sequestration by mineral carbonation”. They have also worked out a way to “grow enough of the [rock] to permanently store two billion or more tonnes of carbon dioxide annually.”
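The chemistry behind mineral carbonation can be sketched with simple stoichiometry. The estimate below assumes the reactive mineral is forsterite olivine (Mg2SiO4), a major component of peridotite, reacting as Mg2SiO4 + 2 CO2 → 2 MgCO3 + SiO2; real peridotite yields a mix of carbonates (magnesite, calcite and others), so the 2:1 mole ratio is only a rough guide.

```python
# Back-of-envelope stoichiometry for mineral carbonation, assuming
# forsterite olivine (Mg2SiO4) as the reactive mineral:
#   Mg2SiO4 + 2 CO2 -> 2 MgCO3 + SiO2
M_OLIVINE = 140.69   # molar mass of Mg2SiO4, g/mol
M_CO2 = 44.01        # molar mass of CO2, g/mol

def co2_bound_per_tonne_olivine():
    """Tonnes of CO2 mineralised per tonne of olivine (2:1 mole ratio)."""
    return 2 * M_CO2 / M_OLIVINE

print(f"~{co2_bound_per_tonne_olivine():.2f} t CO2 per t olivine")
```

By this rough measure, a tonne of olivine can lock away a bit over half a tonne of carbon dioxide, which is why the abundance of peridotite matters so much.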

Peridotite is exposed at the surface in many places on Earth. It is abundant on all the continents, except perhaps Antarctica. In Oman, this naturally occurring rock is sequestering about 100,000 tonnes of carbon dioxide each year. That is enough to soak up carbon dioxide emissions from burning more than 35 million litres of gasoline.

A power plant in Iceland that uses hot water from geothermal steam, which contains carbon dioxide, removes the gas from the steam and injects it into a volcanic rock called basalt. The rock reacts with carbon dioxide to form carbonate minerals in less than two years. Ongoing research suggests that this technique could be used to convert huge amounts of carbon dioxide into “rocks” and stow them underground.

Recently, De Beers—the world’s largest diamond producer—announced that it would start a pilot project in South Africa designed to create the world’s first carbon-neutral mine. Essentially, De Beers would inject carbon dioxide into kimberlite, an ore containing diamonds, where the two will combine to form a solid compound. The project is due to start sometime next year.

Although the idea of carbon dioxide absorption by rocks is still in the embryonic stage, the silver bullet to keep our planet’s climate under control might be the rocks right under our feet. Until the technology to utilise these rocks is fully developed, DAC, CCS and BECCS will need to be a significant part of any realistic plan to assuage the effects of climate change while simultaneously mitigating the cause. Otherwise, we may soon be entering a new geologic era, which could be termed the “Anthropocene Era”, one where the climate is very different from the one our ancestors knew.

The author, Quamrul Haider, is a Professor of Physics at Fordham University, New York.

Advanced science, Astrophysics, Environmental, Technical

How global warming is impacting Earth’s spin

Anthropogenic greenhouse gas emissions might be affecting more than just the climate. For the first time, scientists at NASA presented evidence that the orientation of the Earth’s spin axis is changing because of global warming.

The Earth spins from west to east about an axis once every 24 hours, creating the continuous cycle of day and night. The north-south spin axis runs through the North and South Poles and is tilted by 23.5 degrees from the vertical. The axial tilt causes almost all the seasonal changes.

But the tilt is far from constant. It varies between 21.6 and 24.5 degrees in a 41,000-year cycle. This variation, together with small fluctuations in the Sun’s and Moon’s gravitational pull, the Earth’s oblate shape and elliptical orbit, its irregular surface, the non-uniform distribution of mass and the movement of tectonic plates, causes the spin axis, and hence the Poles, to wobble either east or west along its general direction of drift.

Until 2005, Earth’s spin axis had been drifting steadily southwest, around ten centimetres each year, towards Hudson Bay in Canada. However, in 2005, the axis took an abrupt turn and started to drift east towards England at an annual rate of about 17 centimetres, according to data obtained by NASA’s Gravity Recovery and Climate Experiment satellites. It is still heading east.

After analysing the satellite data, scientists at NASA’s Jet Propulsion Laboratory in California attribute the sudden change in direction of the axis mainly to the melting of Greenland’s ice sheets due to global warming. The reason: melting ice sheets and the resulting rise of the sea level are changing the distribution of mass on Earth, thereby causing the drift of the spin axis to change direction and become more oblique. The axis is particularly sensitive to changes in mass distribution occurring north and south of 45 degrees latitude. This phenomenon is similar to the shift in the axis of rotation of a spinning top if we put more mass on one side of it or the other.

Since 2002, the ice sheets of Greenland have been melting at an annual rate of roughly 270 billion tonnes. Additionally, some climate models indicate that a two-to-three degrees Celsius rise in temperature would result in a complete melting of Greenland’s ice sheets. If that happens, it could release the equivalent of as much as 1,400 billion tonnes of carbon dioxide, enhancing global warming even further. It would also raise the sea level by about 7.5 meters. By then, the wobbling of the Poles would also be completely out of whack.
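The sea-level figure can be checked with back-of-the-envelope arithmetic. The sketch below assumes round numbers for Greenland’s ice volume and the global ocean area; it is an order-of-magnitude estimate, not a measured value.

```python
# Rough check of the sea-level rise from a complete Greenland melt.
# Assumed round numbers: ~2.9 million cubic km of ice, global ocean
# area ~361 million square km; ocean area treated as constant.
ICE_VOLUME_KM3 = 2.9e6
ICE_DENSITY = 917.0       # kg/m^3 (glacial ice)
WATER_DENSITY = 1000.0    # kg/m^3
OCEAN_AREA_M2 = 3.61e14

ice_mass_kg = ICE_VOLUME_KM3 * 1e9 * ICE_DENSITY  # km^3 -> m^3, then kg
rise_m = ice_mass_kg / (WATER_DENSITY * OCEAN_AREA_M2)
print(f"Sea-level rise if all Greenland ice melted: ~{rise_m:.1f} m")
```

The estimate lands close to the 7.5-metre figure quoted above, which is reassuring for such a simple mass-conservation argument.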

The ice in the Arctic Ocean has also decreased dramatically since the 1960s. For every tonne of carbon dioxide released into the atmosphere over the last 50 years, about three square meters of Arctic sea ice has been lost. This reflects a disquieting long-term trend of around ten percent loss of ice per decade. Furthermore, Antarctica is losing more ice than is being replaced by snowfall. The influx of water from the melting ice of the Arctic Ocean and Antarctica, together with the melting of glaciers and the subsequent redistribution of water across the Earth, is also causing our planet to pitch over.

What does this mean for us? Although something as small as humans has shaken up something as massive as the Earth, the planet won’t turn upside down as long as the Moon, which acts as a stabiliser of the Earth’s spinning motion, stays in the sky as our nearest neighbour. However, if the shift of the spin axis maintains its present rate and direction, then by the end of this century the axis would shift by nearly 14 meters. Such a large shift would have devastating consequences for the climate and our planet.

The orientation of the Earth’s spin axis determines the seasonal distribution of solar radiation at higher latitudes. If the axial tilt is smaller, the Sun does not travel as far north in the sky during summer, producing cooler summers. A larger tilt, as could occur in the future, would mean summer days much hotter than the present ones. In addition, the shift would impact the accuracy of GPS and other satellite-dependent devices.

Since global warming is causing the Earth’s mass to be redistributed towards the Poles, it would cause the planet to spin faster, just as an ice skater spins faster when she pulls her arms towards her body. Consequently, the length of a day would become shorter.
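The ice-skater analogy rests on conservation of angular momentum: with no external torque, I1·ω1 = I2·ω2, so the day length scales directly with the Earth’s moment of inertia. The fractional change used in the sketch below (one part in a billion) is purely illustrative, not a measured value.

```python
# Conservation of angular momentum: I1*w1 = I2*w2, hence
# day_length_new / day_length_old = I_new / I_old.
# The fractional change of -1e-9 below is an assumed, illustrative number.
DAY_S = 86400.0  # current day length, seconds

def new_day_length(fractional_change_in_I):
    """Day length after the moment of inertia changes by a given fraction
    (negative fraction = mass moved toward the axis = shorter day)."""
    return DAY_S * (1.0 + fractional_change_in_I)

shorter_day = new_day_length(-1e-9)
print(f"Day shortens by {(DAY_S - shorter_day) * 1e6:.1f} microseconds")
```

Even a one-part-in-a-billion shrinkage of the moment of inertia clips tens of microseconds off the day, which accumulates noticeably over decades of timekeeping.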

Our biological clock, which regulates sleeping, waking, eating and other cyclic activities, is based on a 24-hour day. Faced with a shorter day, these circadian rhythms would be hopelessly out of sync with the natural world. Moreover, a rapidly spinning Earth would be unstable to the extent that the Poles would wobble faster. This would create enormous stress on the Earth’s geology, leading to large-scale natural disasters that would most likely be catastrophic for life on Earth.

We may not witness the effects of a rapidly spinning Earth by the end of this century or the next. Nevertheless, the effects will be perceivable a few centuries from now if the global temperature keeps on rising and the ice sheets keep on melting in tandem.

The shift in the Earth’s spin axis due to climate change highlights how real and profound an impact humans are having on the planet. The dire consequences of a shift in the axial tilt towards a larger obliquity, as noted above, are not a wake-up call, but an alarm bell. There is still time for our leaders to listen to the scientists and formulate a long-term approach to tackling the problem of climate change, instead of a short-term Band-Aid approach, as outlined in the 2015 Paris Agreement, which will see us through only to the end of this century. Therefore, our foremost goal before the death knell sounds should be to reverse global warming, or at the least to stop further warming, instead of merely limiting it to 1.5 degrees over the next 75 years or so.

The author, Quamrul Haider, is a Professor of Physics at Fordham University, New York.


Advanced science, Bangladesh, Economic, Environmental, International, Technical

Harnessing the Solar Energy absorbed by ocean waters


The world’s oceans constitute a vast natural reservoir for receiving and storing solar energy. They take in solar energy in proportion to their surface area, nearly three times that of land. As the sun warms the oceans, it creates a significant temperature difference between the surface water and the deeper water to which sunlight doesn’t penetrate. Any time there’s a temperature difference, there’s the potential to run a heat engine, a device that converts thermal energy into mechanical energy.

Most of the electricity we use comes from heat engines of one kind or another. The working principle of such an engine is very simple. It operates between two reservoirs of thermal energy, one hot and one cold. Energy is extracted from the hot reservoir to heat a working fluid which boils to form high-pressure vapour that drives a turbine coupled to an electricity-producing generator. Contact with the cold reservoir re-condenses the working fluid which is pumped back into the evaporator to complete the cycle.

The idea of building an engine to harness energy from the oceans, mainly to generate electricity, by exploiting the thermal gradient between waters on the surface and deeper layers of an ocean is known as OTEC—an acronym for Ocean Thermal Energy Conversion. With OTEC, the hot reservoir is an ocean’s warmer surface water, with temperatures that can exceed 25 degrees Celsius, and the cold reservoir is the cooler water, around five to six degrees, at depths of up to one kilometre. The working fluid is usually ammonia, which vaporises and condenses at the available temperatures. This is analogous to choosing water as the working fluid matched to the temperature differential between a fossil-fuel-fired boiler and a condenser cooled by air or water.

The maximum efficiency of a heat engine operating between reservoirs at 25 and 5 degrees Celsius is 6.7 percent. This means the efficiency of an actual OTEC engine will be much less, perhaps 2-3 percent. But low efficiency isn’t the liability it would be in a fossil-fuelled or nuclear power plant. After all, the fuel for OTEC is unlimited and free, as long as the sun heats the oceans.

The greater the temperature difference, the more efficient an OTEC power plant would be. For example, a surface temperature of 30 degrees would raise the ceiling on efficiency to 8.25 percent. That’s why the technology is viable primarily in tropical regions, where the year-round temperature differential between the ocean’s deep cold and warm surface waters is greater than 20 degrees. The waters of the Bay of Bengal along the shores of Bangladesh, a country that enjoys year-round warm, and at times very hot, weather, have excellent thermal gradients for producing electricity using OTEC technology.
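These efficiency ceilings follow from the Carnot formula, which depends only on the absolute temperatures of the two reservoirs. A quick check in Python reproduces both figures quoted above:

```python
def carnot_efficiency(t_hot_c, t_cold_c):
    """Maximum (Carnot) efficiency of a heat engine between two
    reservoirs, with temperatures given in degrees Celsius.
    eta = 1 - T_cold / T_hot, temperatures in kelvin."""
    t_hot_k = t_hot_c + 273.15
    t_cold_k = t_cold_c + 273.15
    return 1.0 - t_cold_k / t_hot_k

print(f"{carnot_efficiency(25, 5) * 100:.1f}%")   # surface 25 C, depth 5 C
print(f"{carnot_efficiency(30, 5) * 100:.2f}%")   # warmer surface, 30 C
```

Because the formula uses absolute temperatures, a 20-degree difference near 300 K can never yield more than a few percent, no matter how clever the engineering; real OTEC plants, with pumping losses, do worse still.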

The world’s biggest operational OTEC facility, with a power generation capacity of 100 kW, was built by Makai Ocean Engineering in Hawaii. Tokyo Electric Power Company and Toshiba built a 100 kW plant on the island of Nauru, although as much as 70 percent of the electricity generated was used to operate the plant itself.

The US aerospace company Lockheed Martin is building an OTEC electricity generating plant off the coast of Hainan Island in China. Once operational, the plant will be able to generate up to 10 MW of power, enough to sustain the energy requirements of a small city. India is building a 200 kW plant, expected to be operational before 2020, in Kavaratti, capital of the Lakshadweep archipelago, to power a desalination plant. Other OTEC systems are either in the planning or development stage in Iran, Kuwait, Saudi Arabia, Thailand and several countries along the Indian Ocean, mostly to supply electricity.

Like any alternative form of energy, OTEC has its advantages and disadvantages, but the advantages outweigh the disadvantages. Among the advantages, the one that stands out is its ability to provide a base-load supply of energy for an electrical power generation system without interruption, 24/7/365. It also has the potential to produce energy outputs several times greater than other ocean energy options, such as waves and tides. More importantly, OTEC is an extremely clean and sustainable technology, because it doesn’t have to burn climate-changing fossil fuels to create a temperature difference between the reservoirs: a natural temperature gradient already exists in the oceans. The gradient is very steady in time, persisting over day and night and from season to season. Furthermore, desalination as a by-product of OTEC can produce large amounts of fresh water from seawater, which will benefit many island nations and desert countries.

However, the recirculation of large volumes of water by OTEC power plants could have negative impacts on the aquatic environment. In particular, the introduction of nutrient-rich deep waters into the nutrient-poor surface waters would stimulate plankton blooms that could adversely affect the local ecological balance. Additional ecological problems include the destruction of marine habitats and aquatic nursery areas, the redistribution of oceanic constituents, loss of plankton and a decrease in fish populations.

Since OTEC facilities must be located close to shore due to cabling constraints, they could have a significant effect on near-shore circulation patterns of ocean water. As a result, open-ocean organisms living close to the shore would be especially affected, because they are known to have very narrow tolerance limits to changes in the properties of their environment.

The biggest drawback of OTEC is its low efficiency. This implies that to produce even modest amounts of electricity, OTEC plants have to be constructed on a relatively large scale, which makes them expensive investments. But that is a price we should be prepared to pay to curb global warming. Industry analysts, however, believe that in the long run low operation and maintenance costs would offset the high cost of building OTEC facilities.

The current effort, as agreed in the 2015 Paris Accord, to keep our planet livable is like taking one giant step backward before trying to move one step forward. If technologies for OTEC and other eco-friendly renewable sources of energy are fully developed and globally commercialised, it would indeed be one giant step forward in mitigating global warming. They would also equip communities worldwide with the self-empowerment tools required to build an independent and sustainable future.


The author, Quamrul Haider, is a Professor of Physics at Fordham University, New York.