Sunday, December 03, 2017
Comments due by Dec. 10, 2017
Greenhouse gases must be scrubbed from the air

SWEDEN’S parliament passed a law in June which obliges the country to have “no net emissions” of greenhouse gases into the atmosphere by 2045. The clue is in the wording. This does not mean that three decades from now Swedes must emit no planet-heating substances; even if all their electricity came from renewables and they only drove Teslas, they would presumably still want to fly in aeroplanes, or use cement and fertiliser, the making of which releases plenty of carbon dioxide. Indeed, the law only requires gross emissions to drop by 85% compared with 1990.
But it demands that remaining carbon sources are offset with new carbon sinks. In other words greenhouse gases will need to be extracted from the air. Sweden’s pledge is among the world’s most ambitious. But if the global temperature is to have a good chance of not rising more than 2ºC above its pre-industrial level, as stipulated in the Paris climate agreement of 2015, worldwide emissions must similarly hit “net zero” no later than 2090. After that, emissions must go “net negative”, with more carbon removed from the stock than is emitted.
This is because what matters to the climate is the total amount of carbon dioxide in the atmosphere. To keep the temperature below a certain level means keeping within a certain “carbon budget”—allowing only so much to accumulate, and no more. Once you have spent that budget, you have to balance all new emissions with removals. If you overspend it, the fact that the world takes time to warm up means you have a brief opportunity to put things right by taking out more than you are putting in (see chart 1). Being able to remove carbon dioxide from the atmosphere is, therefore, a crucial element in meeting climate targets. Of the 116 models the Intergovernmental Panel on Climate Change (IPCC) looks at to chart the economically optimal paths to the Paris goal, 101 assume “negative emissions”. No scenarios are at all likely to keep warming under 1.5ºC without greenhouse-gas removal. “It is built into the assumptions of the Paris agreement,” says Gideon Henderson of Oxford University.
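The budget bookkeeping described above can be sketched in a few lines of Python. The figures used here (a hypothetical budget, overspend and removal rate) are illustrative assumptions for demonstration, not IPCC numbers:

```python
# Illustrative carbon-budget bookkeeping: cumulative CO2 determines
# warming, so overspending the budget must be repaid later with
# net-negative emissions. All figures are hypothetical.

def years_of_negative_emissions(budget_gt, emitted_gt, removal_gt_per_yr):
    """Years of net removal needed to bring cumulative emissions
    back within a carbon budget after an overspend."""
    overspend = emitted_gt - budget_gt
    if overspend <= 0:
        return 0.0  # still within budget; no net removal required
    return overspend / removal_gt_per_yr

# Suppose a 1,000 Gt budget is overspent by 200 Gt and the world can
# sustain 5 Gt of net removal per year:
print(years_of_negative_emissions(1000, 1200, 5))  # 40.0
```

The point the sketch makes is the article's: once the budget is spent, every extra tonne emitted commits the world to a tonne of future removal.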
Climate scientists like Mr Henderson have been discussing negative-emissions technologies (NETs) with economists and policy wonks since the 1990s. Their debate has turned livelier since the Paris agreement, the phrasing of which strongly suggests that countries will need to invent new sinks as well as cutting emissions. But so far politicians have largely ignored the issue, preferring to focus on curbing current flows of greenhouse gases into the atmosphere. NETs were conspicuous by their absence from the agenda of the annual UN climate jamboree which ended in Bonn on November 17th.

In the short term this makes sense. The marginal cost of reducing emissions is currently far lower than the marginal cost of taking carbon dioxide straight from the atmosphere. But climate is not a short-term game. And in the long term, ignoring the need for negative emissions is complacent at best. The eventual undertaking, after all, will be gargantuan. The median IPCC model assumes sucking up a total of 810bn tonnes of carbon dioxide by 2100, equivalent to roughly 20 years of global emissions at the current rate. To have any hope of doing so, preparations for large-scale extraction ought to begin in the 2020s.

Modellers favour NETs that use plants because they are a tried and true technology. Reforesting logged areas or “afforesting” previously treeless ones presents no great technical challenges. More controversially, they also tend to invoke “bioenergy with carbon capture and storage” (BECCS). In BECCS, power stations fuelled by crops that can be burned to make energy have their carbon-dioxide emissions injected into deep geological strata, rather than released into the atmosphere. The technology for doing the CCS part of BECCS has been around for a while; some scenarios for future energy generation rely heavily on it. But so far there are only 17 CCS programmes big enough to dispose of around 1m tonnes of carbon dioxide a year.
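The equivalence quoted above, 810bn tonnes of cumulative removal being roughly 20 years of emissions, can be checked directly. The 40bn-tonne annual-emissions figure is an assumption back-derived from the article's own "roughly 20 years", not a number the article states:

```python
# Cumulative removal assumed by the median IPCC model, versus an
# assumed annual global emissions rate of ~40bn tonnes of CO2.
total_removal_gt = 810
annual_emissions_gt = 40  # assumption, consistent with "roughly 20 years"
years_equivalent = total_removal_gt / annual_emissions_gt
print(years_equivalent)  # 20.25, i.e. "roughly 20 years"
```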
Promoting CCS is an uphill struggle, mainly because it doubles the cost of energy from the dirty power plants whose flues it scrubs. Other forms of low-emission electricity are much cheaper. Affixed to bioenergy generation, though, CCS does something that other forms of generation cannot. The carbon which the plants that serve as fuel originally took from the atmosphere above is sent into the rocks below, making the plant a negative emitter.

The problem with afforestation and BECCS is that the plants involved need a huge amount of land. Estimates of the area required range from 3.2m square kilometres (roughly the size of India) to as much as 9.7m square kilometres (roughly the size of Canada). That is the equivalent of between 23% and 68% of the world’s arable land. It may be that future agricultural yields can be increased so dramatically that, even in a world with at least 2bn more mouths to feed, the area of its farms could be halved, and that the farmers involved might be happy with this turn of events. But it seems highly unlikely—and blithely assuming it can be done is plainly reckless.

Negative thinking

Less land-intensive alternatives exist—at least on paper. Some are low-tech, like stimulating the soil to store more carbon by limiting or halting deep-ploughing. Others are less so, such as contraptions to seize carbon dioxide directly from the air, methods that accelerate the natural weathering processes by which minerals in the Earth’s crust bind atmospheric carbon over aeons, or schemes that introduce alkaline compounds into the sea to make it absorb more carbon dioxide. According to Jennifer Wilcox of the Colorado School of Mines and her colleagues, the technology with the second-highest theoretical potential, after BECCS, is direct air capture (see chart 2). This uses CCS-like technology on the open air, rather than on exhaust gases.
The problem is that the concentration of carbon dioxide in the air, while very high by historical standards, is very low by chemical-engineering ones: just 0.04%, as opposed to the 10% or more offered by power-plant chimneys and industrial processes such as cement-making. The technologies that exist today, under development by companies such as Global Thermostat in America, Carbon Engineering in Canada and Climeworks of Switzerland, remain pricey. In 2011 a review by the American Physical Society to which Ms Wilcox contributed put extraction costs above $600 per tonne, compared with an average estimate of $60-250 for BECCS.

Enhanced weathering is at an even earlier stage of development, and its costs are still harder to assess. Estimates range from $25 per tonne of carbon dioxide to $600. On average, 2-4 tonnes of silicate minerals (olivine, sometimes used in Finnish saunas because it withstands repeated heating and cooling, is a favourite) are needed for every tonne removed. To extract 5bn tonnes of carbon dioxide a year may require up to 20bn tonnes of minerals that must be ground into fine dust. Grinding is energy-intensive. Distributing the powder evenly, on land or sea, would be a logistical challenge, to put it mildly.

Ideas abound on a small scale, in labs or in researchers’ heads, but the bigger mechanical schemes in existence today capture a paltry 40m tonnes of carbon dioxide a year. Most involve CCS and have prevented more carbon dioxide escaping into the atmosphere from fossil-burning power plants, rather than removing it. Removing 8bn-10bn tonnes by 2050, as the more sanguine scenarios envisage, let alone the 35bn-40bn tonnes in more pessimistic ones, will be a vast undertaking. Progress will be needed on many fronts. All the more reason to test lots of technologies. For the time being even researchers with a horse in the race are unwilling to bet on a winner.
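The enhanced-weathering tonnages quoted above follow from simple multiplication; the 2-4 tonnes-per-tonne ratio and the 5bn-tonne removal target are the article's figures:

```python
# Mineral mass needed for enhanced weathering: 2-4 tonnes of ground
# silicate (e.g. olivine) must be spread per tonne of CO2 removed.
co2_removed_gt_per_year = 5
minerals_low_gt = 2 * co2_removed_gt_per_year   # optimistic ratio
minerals_high_gt = 4 * co2_removed_gt_per_year  # pessimistic ratio
print(minerals_low_gt, minerals_high_gt)  # 10 20 (bn tonnes of rock a year)
```

Even the optimistic end, 10bn tonnes of rock a year, is several times the mass of all the coal mined annually, which is why the grinding and distribution logistics loom so large.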
Pete Smith of Aberdeen University speaks for many NETs experts when he says that “none is a silver bullet, and none has a fatal flaw.” Negative emissions will also not come cheap. WITCH, constructed by Massimo Tavoni of Politecnico di Milano, is a model which analyses climate scenarios. Unlike most simulations, it also estimates how much research-and-development funding is necessary to achieve roll-out at the sort of scale these models forecast. For all low-carbon technologies, it puts the figure at $65bn a year until 2050, four times the sum that renewables, batteries and the like attract today. Mr Tavoni says a chunk of that would obviously need to go to NETs, which currently get next to nothing. Even the less speculative technologies need investment right away. Trees take decades to reach their carbon-sucking potential, so large-scale planting needs to start soon, notes Tim Searchinger of Princeton University.

Direct air capture in particular looks expensive. Boosters note that a few years ago so did renewables. Before technological progress brought prices down, many countries subsidised renewable-energy sources to the tune of $500 per tonne of carbon dioxide avoided, and often spent huge sums on it. Christoph Gebald, co-founder of Climeworks, says that “the first data point on our technological learning curve” is $600, at the lower end of previous estimates. But, like the price of solar panels, he expects his costs to drop in the coming years, perhaps to as low as $100 per tonne.

However, the falling price of solar panels was a result of surging production volumes, which NETs will struggle to replicate. As Oliver Geden of the German Institute of International and Security Affairs observes, “You cannot tell the green-growth story with negative emissions.” A market exists for rooftop solar panels and electric vehicles; one for removing an invisible gas from the air to avert disaster decades from now does not.
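Mr Gebald's hoped-for fall from $600 to $100 per tonne follows the logic of a technological learning curve, under which each doubling of cumulative deployment cuts unit cost by a fixed fraction. A minimal sketch, assuming a 20% learning rate (the rate is an illustrative assumption, not Climeworks' figure):

```python
import math

def unit_cost(first_unit_cost, cumulative_units, learning_rate=0.20):
    """Wright's-law cost curve: cost falls by `learning_rate` with
    every doubling of cumulative production."""
    b = -math.log2(1 - learning_rate)  # cost elasticity per doubling
    return first_unit_cost * cumulative_units ** -b

# Starting from the $600 "first data point", roughly eight doublings
# of deployment would be needed to approach $100 per tonne:
print(round(unit_cost(600, 2 ** 8)))  # 101
```

The catch the article identifies is precisely that those doublings require a market large enough to drive them, which negative emissions currently lacks.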
Much of the gas captured by Climeworks and other pure NETs firms (as opposed to fossil-fuel CCS) is sold to makers of fizzy drinks or to greenhouses to help plants grow. It is hard to imagine that market growing far beyond today’s total of 10m tonnes. And in neither case is the gas stored indefinitely: it is either burped out by consumers of carbonated drinks or otherwise exuded by eaters of greenhouse-grown produce. There may be other markets, though. It is very hard to imagine aircraft operating without liquid fuels. One way to provide them would be to create them chemically using carbon dioxide taken from the atmosphere. It is conceivable that this might be cheaper than alternatives, such as biofuels, especially if the full environmental impact of the biofuels is accounted for. The demand for direct air capture spurred by such a market might drive its costs low enough to make it a more plausible NET.

From thin air

One way to create a market for NETs would be for governments to put a price on carbon. Where they have done so, the technologies have been adopted. Take Norway, which in 1991 told oil firms drilling in the North Sea to capture carbon dioxide from their operations or pay up. This cost is now around $50 per tonne emitted; in one field, called Sleipner, the firms have found ways to pump it back underground for less than that. A broader carbon price, either a tax or tradable emissions permits, would promote negative emissions elsewhere, too.

Then there is the issue of who should foot the bill. Many high-impact negative-emissions schemes make most sense in low-emitting countries, says Ms Wilcox. Brazil could in theory reforest the cerrado (though that would face resistance because of the region’s role in growing soyabeans and beef). Countries of sub-Saharan Africa could do the same in their own tropical savannahs. Spreading olivine in the Amazon and Congo river basins could soak up 2bn tonnes of carbon dioxide.
Developing countries would be understandably loth to bankroll any of this to tackle cumulative emissions, most of which come from the rich world. The latter would doubtless recoil at footing the bill, preferring to concentrate on curbing current emissions in the mistaken belief that once these reach zero, the job is done.

Whether NETs deserve to be lumped in with more outlandish “geoengineering” proposals, such as cooling the Earth with sunlight-reflecting sulphur particles in the stratosphere, is much debated. What they have in common is that they offer ways to deal with the effects of emissions that have already taken place. Proponents of small-scale, low-impact NETs, such as changes to soil management on farms, though, bridle at being considered alongside what they see as high-tech hubris of the most disturbing kind. NETs certainly inspire fewer fears of catastrophic, planetary-scale side-effects than “solar radiation management”. But they do stoke some when it comes to the consequences of tinkering with the ocean’s alkalinity or injecting large amounts of gas underground. And the direct effects of large-scale BECCS or afforestation projects would be huge. If they don’t take up arable land, they need to take up pasture or wilderness. Either option would be a big deal in terms of both human amenity and biodiversity.

Another concern is the impact on politicians and the danger of moral hazard. NETs allow politicians to go easy on emission cuts now in the hope that a quick fix will appear in the future. This could prove costly if the technology works, and costlier still if it does not. One study found that following a 2°C mitigation path which takes for granted NETs that fail to materialise would leave the world closer to 3°C warmer. Mr Geden is not alone in fearing that models that increasingly rely on NETs are “a cover for political inaction”.

Academics are paying more attention.
This year’s edition of “Emissions Gap”, an influential annual report from the UN Environment Programme, devotes a chapter to carbon-dioxide removal. Mr Henderson is leading a study of the subject for Britain’s Royal Society; America’s National Academy of Sciences has commissioned one, too. Both are due next spring. The IPCC will look at the technology in its special report on the 1.5ºC target, due next autumn.

There is some money, too. Carbon Engineering has attracted backers such as Bill Gates, and now has a pilot plant in Canada. Climeworks has actually sold some carbon-offset credits, to a private investor and a big corporation, on the basis of the carbon dioxide it has squirrelled away at a demonstration plant it recently launched in Iceland. Earlier this year Britain’s government became the first to set aside some cash specifically for NETs research. In October America’s Department of Energy announced a series of grants for “novel and enabling” carbon-capture technologies, some of which could help in the development of schemes for direct air capture. Richard Branson, a British tycoon, has offered $25m to whoever first comes up with a “commercially viable design” that would remove 1bn tonnes of greenhouse gases a year for ten years.

All this is welcome, but not enough. The sums involved are trifling: £8.6m ($11.3m) in Britain and $26m from the Department of Energy. The offset sold by Climeworks was for just 100 tonnes. Mr Branson’s prize has gone unclaimed for a decade. A carbon price, which is a good idea for other reasons too, would beef up interest in NETs. But one high enough to encourage pricey moonshots may prove too onerous for the rest of the economy. Any price would promote more established low-carbon technologies first and NETs only much later, thinks Glen Peters of the Centre for International Climate Research in Oslo. Encouraging CCS for fossil fuels as a stepping stone to NETs appeals to some. The fossil-fuel industry says it is committed to the technology.
Total, a French oil giant, has promised to spend a tenth of its $600m research budget on CCS and related technologies. A group of oil majors says it will spend up to $500m on similar projects between now and 2027. But the field’s slow progress to date hardly encourages optimism. Governments’ commitment to CCS has historically proved fickle. Last year Britain abruptly scrapped a £1bn public grant for an industrial-scale CCS plant which would have helped fine-tune the technology.

For this to change, politicians must expand the focus of the 23-year-old UN Framework Convention on Climate Change from cutting emissions of greenhouse gases to controlling their airborne concentrations, suggests Janos Pasztor, a former climate adviser to the UN secretary-general. In other words, they must think about stocks of carbon dioxide, not just flows. This is all the more true because emissions continue to elude control. After three years of more or less stable emissions, a zippier world economy looks on track to belch 2% more carbon dioxide this year. That amounts once again to borrowing more of the planet’s remaining carbon budget against future removal. It doesn’t take a numerate modeller like Mr Tavoni to grasp that, in his words, “If you create a debt, you must repay it.”
Saturday, November 25, 2017
Comments due by Dec. 1, 2017
Global Footprint Network data shows that humanity uses the equivalent of 1.7 planet Earths to provide the renewable resources we use and absorb our waste.1 If all 7+ billion of us were to enjoy a European standard of living - which is about half the consumption of the average American - the Earth could sustainably support only about 2 billion people.
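The kind of arithmetic behind such an estimate can be sketched as follows. The biocapacity and per-person footprint figures below are round, illustrative assumptions in the spirit of Global Footprint Network accounting, not its published data:

```python
# Sustainable population = total biocapacity / per-person footprint.
total_biocapacity_gha = 12e9   # assumed productive capacity, global hectares
european_footprint_gha = 4.8   # assumed per-person European footprint
american_footprint_gha = 2 * european_footprint_gha  # article: ~double Europe

sustainable_pop_bn = total_biocapacity_gha / european_footprint_gha / 1e9
print(round(sustainable_pop_bn, 1))  # 2.5, the same order as the ~2bn cited
```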
It is crucial to understand that the longer we continue consuming more resources than the Earth can sustainably provide, the less able the Earth will be to meet our resource needs in the future, and the fewer people the planet will be able to support long-term.
Evidence of unsustainable resource use is all around us. Global aquifers are being pumped 3.5 times faster than rainfall can naturally recharge them.2 Eventually they will run dry and hundreds of millions will suffer. Topsoil is being lost 10-40 times faster than it is formed.3 Feeding all 7+ billion of us will become increasingly difficult. Oceans are being overfished, and a primary protein source for over 2 billion people is in jeopardy.4 Worldwide, we have lost over half the vertebrate species in the air, water, and land since 1970.5 How many more species can we lose and how many more ecosystems can we destroy before humanity’s own existence is threatened?
It is important to note that the depletion of non-renewable resources such as fossil fuels, metals, and minerals that make a European standard of living possible is not included in Global Footprint Network data. That means all the tons of oil, coal, iron ore, copper, and hundreds of other minerals and metals that make modern life possible are left out of the accounting. Taking these non-renewable resources into account suggests 2 billion people living at a European standard of living may be the upper limit of a sustainable global population.
Climate change will only add to the strain on the planet’s ability to support all 7+ billion of us. Climate scientists are warning us to expect lower crop yields of major grains such as wheat, rice, and maize.6 Rising sea levels could create hundreds of millions of climate refugees. And climate disruption is likely to create increasing levels of resource conflict and civil unrest.
Adaptation to climate disruption will be much easier with a much smaller global population. We can achieve a smaller global population tomorrow by beginning a dramatic reduction in births today.
All of us want a viable, sustainable global home. If we allow overpopulation and overconsumption to continue, the evidence is mounting that billions will suffer and that we will leave future generations a much harder, bleaker life.
Reducing birth rates now can save us from the likely increase in death rates that awaits us if we do nothing. Solving overpopulation is essential in building a sustainable future.
Saturday, November 18, 2017
Comments due by Nov. 24, 2017
A transition away from fossil fuels is necessary, but it will not be painless.

A WIDELY read cover story on the impact of global warming in New York magazine starts ominously: “It is, I promise, worse than you think.” It goes on to predict temperatures in New York hotter than present-day Bahrain, unprecedented droughts wherever today’s food is produced, the release of diseases like bubonic plague hitherto trapped under Siberian ice, and permanent economic collapse. In the face of such apocalyptic predictions, can the world take solace from those who argue that it can move, relatively quickly and painlessly, to 100% renewable energy?

At first glance, the answer to that question looks depressingly obvious. Despite falling costs, wind and solar still produce only 5.5% of the world’s electricity. Hydropower is a much more significant source of renewable energy, but its costs are rising, and investment is falling. Looking more broadly at energy demand, including that for domestic heating, transport and industry, the share of wind and solar is a minuscule 1.6% (see chart). It seems impossible to eliminate fossil fuels from the energy mix in the foreseeable future.

But all energy transitions, such as that from coal to hydrocarbons in the 20th century, take many decades. It is the rate of change that guides where investments flow. That makes greens more optimistic. During the past decade, solar photovoltaics (PV) and wind energy have been on a roll as sources of electricity. Although investment dipped slightly last year, the International Energy Agency, a global forecaster, said on July 11th that for the first time the amount of renewable capacity commissioned in 2016 almost matched that for other sources of power generation, such as coal and natural gas.
In some countries the two technologies—particularly solar PV in sunny places—are now cheaper than coal and gas. It is no longer uncommon for countries like Denmark and Scotland to have periods when the equivalent of all their power comes from wind. Ambitions are rising. The Senate in California, a state that is close to hitting its goal of generating one-third of its power from renewables by 2020, has proposed raising the target to 60% by 2030; Germany’s goal is to become 80% renewable by 2050.

But whether it is possible to produce all of a country’s electricity with just wind, water and solar is a subject of bitter debate. In 2015 Mark Jacobson of Stanford University and others argued that electricity, transport, heating/cooling, and industry in America could be fully powered in 2050-55 by wind, water and solar, without the variability of the weather affecting users. Forswearing the use of natural gas, biofuels, nuclear power and stationary batteries, they said weather modelling, hydrogen storage and flexible demand could ensure stable supply at relatively low cost.

But in June this year Christopher Clack, founder of Vibrant Clean Energy, a firm, issued a stinging critique with fellow researchers in the Proceedings of the National Academy of Sciences, the journal in which Mr Jacobson et al had published their findings. They argued that a narrow focus on wind, water and solar would make tackling climate change more difficult and expensive than it needed to be, not least because it ignored existing zero-carbon technologies such as nuclear power and bioenergy. They claimed the models wrongly assumed that hydroelectricity output could continue for hours on end at many times the capacity available today, and pointed to the implausibility of replacing the current aviation system with yet-to-be-developed hydrogen-powered planes.
In their view, decarbonising 80% of the electricity grid is possible at reasonable cost, provided America improves its high-voltage transmission grid. Beyond that is anyone’s guess.

Others take a wider view. Amory Lovins of the Colorado-based Rocky Mountain Institute, a think-tank, shrugs off the 100% renewables dispute as a sideshow. He takes comfort from the fact that it is increasingly common for renewables sustainably to produce half a location’s electricity supply. He believes that the share can be scaled up with ease, possibly to 80%. But in order to cut emissions drastically, he puts most emphasis on a tripling of energy efficiency, by designing better buildings and factories and using lighter materials, as well as by keeping some natural gas in the mix. He also sees clean-energy batteries in electric vehicles displacing oil demand, as petroleum displaced whale oil in the 19th century.

Some sceptics raise concerns about the economic ramifications if renewables’ penetration rises substantially. In an article this month, Michael Kelly of Cambridge University focused on the energy return on investment (EROI) of solar PV and wind turbines, meaning the ratio between the amount of energy they produce and the amount of energy invested to make them. He claimed that their EROI was substantially lower than those of fossil fuels; using renewables to generate half of the world’s electricity would leave less energy free to power other types of economic activity. Critics note that his analysis is based on studies of PV returns in Spain from more than half a decade ago. Since then solar and wind costs (a proxy for EROI) have plunged, raising their returns. What is more, other studies suggest returns from fossil-fuel-derived energy have fallen, and will decline further as they incur increased costs associated with pollution and climate change. Even so, a high share of renewables may be less efficient at powering economic growth than fossil fuels were in their 20th-century heyday.
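The EROI ratio that Mr Kelly's argument turns on is simple to state. The plant figures below are hypothetical, purely to illustrate the definition:

```python
# EROI: energy delivered over a plant's lifetime divided by the energy
# invested to build and run it. Values below are made up for illustration.
def eroi(lifetime_output_mwh, energy_invested_mwh):
    return lifetime_output_mwh / energy_invested_mwh

# A hypothetical solar farm returning ten units of energy per unit invested:
print(eroi(900_000, 90_000))  # 10.0
```

The dispute in the text is over the inputs, not the formula: how much embodied energy to count, and how fast falling manufacturing costs raise the ratio.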
But if the climate doomsayers are to be proved wrong, a clean-energy system must be part of the solution.
Saturday, October 28, 2017
Comments due by Nov. 3, 2017
‘Geostorm’ movie shows dangers of hacking the climate – we need to talk about real-world geoengineering now

Fiddling with our global climate

The technology in the movie “Geostorm” is laughably fantastical. But the idea of technologies that might be used to “geoengineer” the climate is not. Geoengineering, also called climate engineering, is a set of emerging technologies that could potentially offset some of the consequences of climate change. Some scientists are taking it seriously, considering geoengineering among the range of approaches for managing the risks of climate change – although always as a complement to, and not a substitute for, reducing emissions and adapting to the effects of climate change.

These innovations are often lumped into two categories. Carbon dioxide removal (or negative emissions) technologies set out to actively remove greenhouse gases from the atmosphere. In contrast, solar radiation management (or solar geoengineering) aims to reduce how much sunlight reaches the Earth.

Because it takes time for the climate to respond to changes, even if we stopped emitting greenhouse gases today, some level of climate change – and its associated risks – is unavoidable. Advocates of solar geoengineering argue that, if done well, these technologies might help limit some effects, including sea level rise and changes in weather patterns, and do so quickly. But as might be expected, the idea of intentionally tinkering with the Earth’s atmosphere to curb the impacts of climate change is controversial. Even conducting research into climate engineering raises some hackles.
‘Geostorm’ is far-fetched, but scientists are taking seriously the idea of engineering Earth’s climate.

Global stakes are high

Geoengineering could reshape our world in fundamental ways. Because of the global impacts that will inevitably accompany attempts to engineer the planet, this isn’t a technology where some people can selectively opt in or out: geoengineering has the potential to affect everyone. Moreover, it raises profound questions about humans’ relationship to nonhuman nature. The conversations that matter are ultimately less about the technology itself and more about what we collectively stand to gain or lose politically, culturally and socially.

Much of the debate around how advisable geoengineering research is has focused on solar geoengineering, not carbon dioxide removal. One of the worries here is that figuring out aspects of solar geoengineering could lead us down a slippery slope to actually doing it. Just doing research could make deploying solar geoengineering more likely, even if it proves to be a really bad idea. And it comes with the risk that the techniques might be bad for some while good for others, potentially exacerbating existing inequalities, or creating new ones. For example, early studies using computer models indicated that injecting particles into the stratosphere to cool parts of Earth might disrupt the Asian and African summer monsoons, threatening the food supply for billions of people.

Even if deployment wouldn’t necessarily result in regional inequalities, the prospect of solar geoengineering raises questions about who has the power to shape our climate futures, and who and what gets left out. Other concerns focus on possible unintended consequences of large-scale open-air experimentation – especially when our whole planet becomes the lab. There’s a fear that the consequences would be irreversible, and that the line between research and deployment is inherently fuzzy.
Shading the Earth from the sun’s rays shouldn’t be a solitary pursuit.

And then there’s the distraction problem, often known as the “moral hazard.” Even researching geoengineering as one potential response to climate change may distract from the necessary and difficult work of reducing greenhouse gas levels and adapting to a changing climate – not to mention the challenges of encouraging more sustainable lifestyles and practices. To be fair, many scientists in the small geoengineering community take these concerns very seriously. This was evident in the robust conversations around the ethics and politics of geoengineering at a recent meeting in Berlin. But there’s still no consensus on whether and how to engage in responsible geoengineering research.

A geostorm in a teacup?

So how close are we to the dystopian future of “Geostorm”? The truth is that geoengineering is still little more than a twinkle in the eyes of a small group of scientists. In the words of Jack Stilgoe, author of the book “Experiment Earth: Responsible innovation in geoengineering”: “We shouldn’t be scared of geoengineering, at least not yet. It is neither as exciting nor as terrifying as we have been led to believe, for the simple reason that it doesn’t exist.”

Compared to other emerging technologies, solar geoengineering has no industrial demand and no strong economic driver as yet, and simply doesn’t appeal to national interests in global competitiveness. Because of this, it’s an idea that’s struggled to translate from the pages of academic papers and newsprint into reality. Even government agencies appear wary of funding outdoor research into solar geoengineering – possibly because it’s an ethically fraught area, but also because it’s an academically interesting idea with no clear economic or political return for those who invest in it.

Climate outcomes are not good for humanity in the Hollywood version of geoengineering.
Yet some supporters make a strong case for knowing more about the potential benefits, risks and efficacy of these ideas. So scientists are beginning to turn to private funding. Harvard University, for instance, recently launched the Solar Geoengineering Research Program, funded by Bill Gates, the Hewlett Foundation and others. As part of this program, researchers David Keith and Frank Keutsch are already planning small-scale experiments to inject fine sunlight-reflecting particles into the stratosphere above Tucson, Arizona. It’s a very small experiment, and wouldn’t be the first, but it aims to generate new information about whether and how such particles might one day be used to control the amount of sunlight reaching the Earth. And importantly, it suggests that, where governments fear to tread, wealthy individuals and philanthropy may end up pushing the boundaries of geoengineering research – with or without the rest of society’s consent.

The case for public dialogue

The upshot is there’s a growing need for public debate around whether and how to move forward. Ultimately, no amount of scientific evidence is likely to single-handedly resolve wider debates about the benefits and risks – we’ve learned this much from the persistent debates about genetically modified organisms, nuclear power and other high-impact technologies. Leaving these discussions to experts is not only counter to democratic principles but likely to be self-defeating, as more research in complex domains can often make controversies worse.

The bad news here is that research on public views about geoengineering (admittedly limited to Europe and the U.S.) suggests that most people are unfamiliar with the idea. The good news, though, is that social science research and practical experience have shown that people have the capacity to learn and deliberate on complex technologies, if given the opportunity.
Hollywood’s version of the technology is one thing, but it’s time to talk about what a real future could be.

As researchers in the responsible development and use of emerging technologies, we suggest less speculation about the ethics of imagined geoengineered futures, which can sometimes close down, rather than open up, decision-making about these technologies. Instead, we need more rigor in how we think about near-term choices around researching these ideas in ways that respond to social norms and contexts. This includes thinking hard about whether and how to govern privately funded research in this domain. And uncomfortable as it may feel, it means that scientists and political leaders need to remain open to the possibility that societies will not want to develop these ideas at all.

All of this is a far cry from the Hollywood hysteria of “Geostorm.” Yet decisions about geoengineering research are already being made in real life. We probably won’t have satellite-based weather control any time soon. But if scientists intend to research technologies to deliberately intervene in our climate system, we need to start talking seriously about whether and how to collectively, and responsibly, move forward. (The Conversation)
Saturday, October 21, 2017
Comments due by Oct 28, 2017
(A$: Australian dollar)
Deloitte Access Economics has valued the Great Barrier Reef at A$56 billion, with an economic contribution of A$6.4 billion per year. Yet this figure grossly underestimates the value of the reef, as it mainly focuses on tourism and the reef’s role as an Australian icon. When you include aspects of the reef that the report excludes, such as the ecosystem services provided by coral reefs, you find that the reef is priceless.

Putting a price on the Great Barrier Reef buys into the notion that a cost-benefit analysis is the right way to make decisions on policies and projects that may affect the reef. For example, the environmental cost of the extension to the Abbot Point coal terminal can be compared to any economic benefits. But as the reef is both priceless and irreplaceable, this is the wrong approach. Instead, the precautionary principle should be used to make decisions regarding the reef. Policies and projects that may damage the reef cannot go ahead.

How do you value the Great Barrier Reef?

The Deloitte report uses what’s known as a “contingent valuation” approach. This is a survey-based methodology, and is commonly used to measure the value of non-market environmental assets such as endangered species and national parks – as well as to calculate the impact of events such as oil spills. In valuing the reef, surveys were used to elicit people’s willingness to pay for it, such as through a tax or levy. This was found to be A$67.60 per person per year.

The report also uses the travel-cost method, which estimates willingness to pay for the Great Barrier Reef based on the time and money that people spend to visit it. Again, this is commonly used in environmental economics to value national parks and the recreational value of local lakes. Of course, all methods of valuing environmental assets have limitations. For example, it is difficult to make sure that respondents are stating realistic amounts in their willingness to pay.
Respondents may act strategically if they think they really will be slugged with a Great Barrier Reef levy. They may conflate this environmental issue with all environmental issues. But more importantly, the methodology in the report leaves out the most important non-market value that the reef provides: ecosystem services. For example, coral reefs provide storm protection and erosion protection, and they are the nurseries for 25% of all marine animals, which themselves have commercial and existence value.

The Deloitte report even cites (but does not reference) a 2014 study that values the ecosystem services provided by coral reefs at US$352,249 per hectare per year. The Great Barrier Reef Marine Park covers 35 million hectares with 2,900 individual reefs of varying sizes. This means the ecosystem services it provides are worth trillions of dollars per year. That is, it is essentially priceless.

The problem with putting a value on the Reef

Valuing the environment at all is contentious in economics. Valuation is performed so that all impacts from, say, a new development can be expressed in a common metric – in this case dollars. This allows a cost-benefit analysis to be performed. But putting a price on the Great Barrier Reef hides the fact that it is irreplaceable, and as such its value is not commensurate with the values of other assets. For instance, using Deloitte’s figure, The Australian newspaper compared the reef to the value of 12 Sydney Opera Houses. But while they are both icons, the Opera House can be rebuilt. The Great Barrier Reef cannot. Any loss is irreversible.

When environmental assets are irreplaceable and their loss irreversible, a more appropriate decision-making framework is the Precautionary Principle. The Precautionary Principle suggests that when there is uncertainty regarding the impacts of a new development on an environmental asset, decision makers should be cautious and minimise the maximum loss.
For example, if it is even remotely possible that the extension to the Abbot Point coal terminal could lead to massive destruction of the reef, then precaution suggests that it shouldn’t go ahead. Assigning a value to the reef might still be appropriate under the Precautionary Principle, to estimate the maximum loss. But it would require the pricing of all values and especially ecosystem services. While the Precautionary Principle has been much maligned due to its perceived bias against development, it is a key element of the definition of Ecologically Sustainable Development in Australia’s Environment Protection and Biodiversity Conservation Act 1999. For a priceless asset like the Great Barrier Reef, it is perhaps better to leave it as “priceless” and to act accordingly. After all, if the Precautionary Principle is ever going to be used when assessing Ecologically Sustainable Development, in contrast with cost-benefit analysis and valuations, it is surely for our main environmental icon. Ultimately, the protection and prioritisation of the Great Barrier Reef is a political issue that requires political will, and not one that can be solved by pricing and economics.
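The "worth trillions of dollars per year" claim above is simple arithmetic from the figures in the article. A minimal sketch, with the caveat that applying the coral-reef per-hectare value to the whole marine park is a simplification, since only part of the 35-million-hectare park is actual reef:

```python
# Back-of-envelope version of the valuation in the text. Applying the
# coral-reef figure to the whole marine park is a deliberate
# simplification: only part of the park consists of the 2,900 reefs.

VALUE_PER_HA_PER_YEAR = 352_249      # US$, 2014 study cited by the report
MARINE_PARK_HECTARES = 35_000_000    # Great Barrier Reef Marine Park area

total = VALUE_PER_HA_PER_YEAR * MARINE_PARK_HECTARES
print(f"Upper-bound ecosystem-service value: ${total / 1e12:.1f} trillion per year")
```

Even read as a loose upper bound, the result sits more than two orders of magnitude above Deloitte's A$56 billion headline figure, which is precisely the article's point.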
Saturday, October 14, 2017
Comments due by Oct 21, 2017
LOUD conversation in a train carriage that makes concentration impossible for fellow-passengers. A farmer spraying weedkiller that destroys his neighbour’s crop. Motorists whose idling cars spew fumes into the air, polluting the atmosphere for everyone. Such behaviour might be considered thoughtless, anti-social or even immoral. For economists these spillovers are a problem to be solved. Markets are supposed to organise activity in a way that leaves everyone better off. But the interests of those directly involved, and of wider society, do not always coincide. Left to their own devices, boors may ignore travellers’ desire for peace and quiet; farmers the impact of weedkiller on the crops of others; motorists the effect of their emissions. In all of these cases, the active parties are doing well, but bystanders are not. Market prices—of rail tickets, weedkiller or petrol—do not take these wider costs, or “externalities”, into account. The examples so far are the negative sort of externality. Others are positive. Melodious music could improve everyone’s commute, for example; a new road may benefit communities by more than a private investor would take into account. Still others are more properly known as “internalities”. These are the overlooked costs people inflict on their future selves, such as when they smoke, or scoff so many sugary snacks that their health suffers. The first to lay out the idea of externalities was Alfred Marshall, a British economist. But it was one of his students at Cambridge University who became famous for his work on the problem. Born in 1877 on the Isle of Wight, Arthur Pigou cut a scruffy figure on campus. He was uncomfortable with strangers, but intellectually brilliant. Marshall championed him and with the older man’s support, Pigou succeeded him to become head of the economics faculty when he was just 30 years old. 
In 1920 Pigou published “The Economics of Welfare”, a dense book that outlined his vision of economics as a toolkit for improving the lives of the poor. Externalities, where “self-interest will not…tend to make the national dividend a maximum”, were central to his theme. Although Pigou sprinkled his analysis with examples that would have appealed to posh students, such as his concern for those whose land might be overrun by rabbits from a neighbouring field, others reflected graver problems. He claimed that chimney smoke in London meant that there was only 12% as much sunlight as was astronomically possible. Such pollution imposed huge “uncharged” costs on communities, in the form of dirty clothes and vegetables, and the need for expensive artificial light. If markets worked properly, people would invest more in smoke-prevention devices, he thought. Pigou was open to different ways of tackling externalities. Some things should be regulated—he scoffed at the idea that the invisible hand could guide property speculators towards creating a well-planned town. Other activities ought simply to be banned. No amount of “deceptive activity”—adulterating food, for example— could generate economic benefits, he reckoned. But he saw the most obvious forms of intervention as “bounties and taxes”. These measures would use prices to restore market perfection and avoid strangling people with red tape. Seeing that producers and sellers of “intoxicants” did not have to pay for the prisons and policemen associated with the rowdiness they caused, for example, he recommended a tax on booze. Pricier kegs should deter some drinkers; the others will pay towards the social costs they inflict. This type of intervention is now known as a Pigouvian tax. The idea is not just ubiquitous in economics courses; it is also a favourite of policymakers. The world is littered with apparently externality-busting taxes. The French government imposes a noise tax on aircraft at its nine busiest airports. 
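Pigou's "bounties and taxes" logic can be sketched numerically. In the toy linear market below, every parameter value is an invented assumption rather than anything from the article; the point is only the mechanism, that setting the tax equal to the marginal external cost makes the market quantity coincide with the socially optimal one:

```python
# A toy linear market with a per-unit external cost. All parameter
# values are illustrative assumptions.

def equilibrium_quantity(a, b, c, d, tax=0.0):
    """Quantity where inverse demand a - b*q meets private supply cost
    c + d*q plus any per-unit tax."""
    return (a - c - tax) / (b + d)

a, b = 100.0, 1.0   # demand: willingness to pay falls from 100
c, d = 10.0, 1.0    # private marginal cost rises from 10
e = 20.0            # external cost per unit (the "uncharged" harm)

q_market = equilibrium_quantity(a, b, c, d)          # externality ignored
q_social = equilibrium_quantity(a, b, c, d, tax=e)   # demand meets *social* cost

print(q_market)  # 45.0 -- the untaxed market over-produces
print(q_social)  # 35.0 -- a Pigouvian tax equal to e restores the optimum
```

The drinkers in Pigou's example are the same picture: pricier kegs move the market from something like the first quantity towards the second, and the tax revenue covers the prisons and policemen.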
Levies on drivers to counterbalance the externalities of congestion and pollution are common in the Western world. Taxes to fix internalities, like those on tobacco, are pervasive, too. Britain will join other governments in imposing a levy on unhealthy sugary drinks starting next year. Pigouvian taxes are also a big part of the policy debate over global warming. Finland and Denmark have had a carbon tax since the early 1990s; British Columbia, a Canadian province, since 2008; and Chile and Mexico since 2014. By using prices as signals, a tax should encourage people and companies to lower their carbon emissions more efficiently than a regulator could by diktat. If everyone faces the same tax, those who find it easiest to lower their emissions ought to lower them the most.

Such measures do change behaviour. A tax on plastic bags in Ireland, for example, cut their use by over 90% (with some unfortunate side-effects of its own, as thefts of baskets and trolleys rose). Three years after a charge was introduced on driving in central London, congestion inside the zone had fallen by a quarter. British Columbia’s carbon tax reduced fuel consumption and greenhouse-gas emissions by an estimated 5-15%. And experience with tobacco taxes suggests that they discourage smoking, as long as they are high and smuggled substitutes are hard to find.

Champions of Pigouvian taxes say that they generate a “double dividend”. As well as creating social benefits by pricing in harm, they raise revenues that can be used to lower taxes elsewhere. The Finnish carbon tax was part of a move away from taxes on labour, for example; if taxes must discourage something, better that it be pollution than work. In Denmark the tax partly funds pension contributions.

Pigou flies

Even as policymakers have embraced Pigou’s idea, however, its flaws, both theoretical and practical, have been scrutinised. Economists have picked holes in the theory.
One major objection is the incompleteness of the framework, since it holds everything else in the economy fixed. The impact of a Pigouvian tax will depend on the level of competition in the market it is affecting, for example. If a monopoly is already using its power to reduce supply of its products, a new tax may not do any extra good. And if a dominant drinks firm absorbs the cost of an alcohol tax rather than passes it on, then it may not influence the rowdy. (A similar criticism applies to the idea of the double dividend: taxes on labour could cause people to work less than they otherwise might, but if an environmental tax raises the cost of things people spend their income on it might also have the effect of deterring work.) Another assault on Pigou’s idea came from Ronald Coase, an economist at the University of Chicago (whose theory of the firm was the subject of the first brief in this series). Coase considered externalities as a problem of ill-defined property rights. If it were feasible to assign such rights properly, people could be left to bargain their way to a good solution without the need for a heavy-handed tax. Coase used the example of a confectioner, disturbing a quiet doctor working next door with his noisy machinery. Solving the conflict with a tax would make less sense than the two neighbours bargaining their way to a solution. The law could assign the right to be noisy to the sweet-maker, and if worthwhile, the doctor could pay him to be quiet. In most cases, the sheer hassle of haggling would render this unrealistic, a problem that Coase was the first to admit. But his deeper point stands. Before charging in with a corrective tax, first think about which institutions and laws currently in place could fix things. Coase pointed out that laws against nuisance could help fix the problem of rabbits ravaging the land; quiet carriages today assign passengers to places according to their noise preferences. 
Others reject Pigou’s approach on moral grounds. Michael Sandel, a political philosopher at Harvard University, has worried that relying on prices and markets to fix the world’s problems can end up legitimising bad behaviour. When in 1998 one school in Haifa tried to encourage parents to pick their children up on time by fining them, tardy pickups increased. It turned out that parental guilt was a more effective deterrent than cash; making payments seems to have assuaged the guilt. Besides these more theoretical qualms about Pigouvian taxes, policymakers encounter all manner of practical ones. Pigou himself admitted that his prescriptions were vague; in “The Economics of Welfare”, though he believed taxes on damaging industries could benefit society, he did not say which ones. Nor did he spell out in much detail how to set the level of the tax. Prices in the real world are no help; their failure to incorporate social costs is the problem that needs to be solved. Getting people to reveal the precise cost to them of something like clogged roads is asking a lot. In areas like these, policymakers have had to settle on a mixture of pragmatism and public acceptability. London’s initial £5 ($8) fee for driving into its city centre was suspiciously round for a sum meant to reflect the social cost of a trip. Inevitably, a desire to raise revenue also plays a role. It would be nice to believe that politicians set Pigouvian taxes merely in order to price in an externality, but the evidence, and common sense, suggests otherwise. Research may have guided the initial level of a British landfill tax, at £7 a tonne in 1996. But other considerations may have boosted it to £40 a tonne in 2009, and thence to £80 a tonne in 2014. Things become even harder when it comes to divining the social cost of carbon emissions. Economists have diligently poked gigantic models of the global economy to calculate the relationship between temperature and GDP. 
But such exercises inevitably rely on heroic assumptions. And putting a dollar number on environmental Armageddon is an ethical question, as well as a technical one, relying as it does on such judgments as how to value unborn generations. The span of estimates of the economic loss to humanity from carbon emissions is unhelpfully wide as a result, ranging from around $30 to $400 a tonne.

It’s the politics, stupid

The question of where Pigouvian taxes fall is also tricky. A common gripe is that they are regressive, punishing poorer people, who, for example, smoke more and are less able to cope with rises in heating costs. An economist might shrug: the whole point is to raise the price for whoever is generating the externality. A politician cannot afford to be so hard-hearted. When Australia introduced a version of a carbon tax in 2012, more than half of the money ended up being given back to pensioners and poorer households to help with energy costs. The tax still sharpened incentives; the handouts softened the pain.

A tax is also hard to direct very precisely at the worst offenders. Binge-drinking accounts for 77% of the costs of excessive alcohol use, as measured by lost workplace productivity and extra health-care costs, for example, but less than a fifth of Americans report drinking to excess in any one month. Economists might like to charge someone’s 12th pint of beer at a higher rate than their first, but implementing that would be a nightmare.

Globalisation piles on complications. A domestic carbon tax could encourage people to switch towards imports, or hurt the competitiveness of companies’ exports, possibly even encouraging them to relocate. One solution would be to apply a tax on the carbon content of imports and refund the tax to companies on their exports, as the European Union is doing for cement. But this would be fiendishly complicated to implement across the economy.
A global harmonised tax on carbon is the stuff of economists’ dreams, and set to remain so. So, Pigou handed economists a problem and a solution, elegant in theory but tricky in practice. Politics and policymaking are both harder than the blackboard scribblings of theoreticians. He was sure, however, that the effort was worthwhile. Economics, he said, was an instrument “for the bettering of human life”. (Economist)
Saturday, April 15, 2017
Sunday, March 26, 2017
Comments due by April 7, 2017
IT HAS BEEN a bad couple of years for those hoping for the death of driving. In America, where cars are an important part of the national psyche, a decade ago people had suddenly started to drive less, which had not happened since the oil shocks of the 1970s. Academics started to talk excitedly about “peak driving”: boomers, car-shy millennials, ride-sharing apps such as Uber and even the distraction of Facebook. Yet the causes may have been more prosaic: a combination of higher petrol prices and lower incomes in the wake of the 2008-09 financial crisis. Since the drop in oil prices in 2014, and a recovery in employment, the number of vehicle-miles travelled has rebounded, and sales of trucks and SUVs, which are less fuel-efficient than cars, have hit record highs.

This sensitivity to prices and incomes is important for global oil demand. More than half the world’s oil is used for transport, and of that, 46% goes into passenger cars. But the response to lower prices has been partially offset by dramatic improvements in fuel efficiency in America and elsewhere, thanks to standards like America’s Corporate Average Fuel Economy (CAFE), the EU’s rules on CO2 emissions and those in place in China since 2012. The IEA says that such measures cut oil consumption in 2015 by a whopping 2.3m b/d. This is particularly impressive because interest in fuel efficiency usually wanes when prices are low. If best practice were applied to all the world’s vehicles, the savings would be 4.3m b/d, roughly equivalent to the crude output of Canada. This helps explain why some forecasters think demand for petrol may peak within the next 10-15 years even if the world’s vehicle fleet keeps growing. Occo Roelofsen of McKinsey, a consultancy, goes further.
He reckons that thanks to the decline in the use of oil in light vehicles, total consumption of liquid fuels will begin to fall within a decade, and that in the next few decades driving will be shaken up by electric vehicles (EVs), self-driving cars and car-sharing. America’s Department of Energy (DoE) officials underline the importance of such a shift, given the need for “deep decarbonisation” enshrined in the Paris climate agreement. “We can’t decarbonise by mid-century if we don’t electrify the transportation sector,” says a senior official in Washington, DC. It is still unclear what effect Donald Trump’s election will have on this transition.

In a recent paper entitled “Will We Ever Stop Using Fossil Fuels?”, Thomas Covert and Michael Greenstone of the University of Chicago, and Christopher Knittel of the Massachusetts Institute of Technology, argue that several technological advances are needed to displace oil in the car industry. Even with oil at $100 a barrel, the price of batteries to power EVs would need to fall by a factor of three, and they would need to charge much faster. Moreover, the electricity used to power the cars would need to become far less carbon-intensive; for now, emissions from EVs powered by America’s electricity grid are higher than those from highly efficient petrol engines, say the authors.

My kingdom for a cheap battery

They calculate that at a battery’s current price of around $325 per kilowatt hour (kWh), oil prices would need to be above $350 a barrel for EVs to be cost-competitive in 2020. Even if they were to fall to the DoE’s target of $125 per kWh, they would still need an oil price of $115 a barrel to break even. But if battery prices fell that much, oil would probably become much cheaper, too, making petrol engines more attractive. Even with a carbon tax, the break-even oil price falls only to $90 a barrel.
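The break-even logic in that calculation can be illustrated, though not reproduced, since the study's model and data are not in the article. In the sketch below, every parameter (pack size, lifetime mileage, electricity price, fuel economy, the refining markup) is an assumption chosen purely for illustration, which is why the resulting prices differ from the authors' figures:

```python
# Crude cost-per-mile comparison between a petrol car and an EV.
# All parameter values are illustrative assumptions, not the study's data.

BARREL_GALLONS = 42  # gallons of crude per barrel

def petrol_cost_per_mile(oil_price, mpg=35.0, markup=1.3):
    """Fuel cost per mile; the markup crudely stands in for refining
    and distribution between crude and pump prices."""
    return oil_price / BARREL_GALLONS * markup / mpg

def ev_cost_per_mile(battery_per_kwh, pack_kwh=60.0, lifetime_miles=150_000.0,
                     elec_price=0.12, miles_per_kwh=3.3):
    """Battery cost amortised over the car's life, plus electricity."""
    return battery_per_kwh * pack_kwh / lifetime_miles + elec_price / miles_per_kwh

def breakeven_oil_price(battery_per_kwh, mpg=35.0, markup=1.3):
    """Oil price at which the two running costs per mile are equal."""
    return ev_cost_per_mile(battery_per_kwh) * mpg * BARREL_GALLONS / markup

for price in (325, 125, 100):
    print(f"battery at ${price}/kWh -> break-even oil around "
          f"${breakeven_oil_price(price):.0f} a barrel")
```

Whatever the exact numbers, the shape of the result matches the article: cheaper batteries pull the break-even oil price down steeply, and only near $100 per kWh do electric cars start to compete at oil prices markets have actually seen.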
Those estimates may be too conservative, but the high cost of batteries and their short range help explain why EVs still make up only 0.1% of the global car fleet (though getting to 1m of them last year was a milestone). They are still mostly too expensive for all but wealthy clean-energy pioneers. Many experts dismiss the idea that EVs will soon be able seriously to disrupt oil demand. Yet they may be missing something. Battery costs have fallen by 80% since 2008, and though the rate of improvement may be slowing, EV sales last year rose by 70%, to 550,000. They actually fell in America, probably because of low petrol prices, but tripled in China, which became the world’s biggest EV market. Next year Tesla aims to bring out its more affordable Model 3. It hopes that the cost of the batteries mass-produced at its new Gigafactory in Nevada will come down to below $100 per kWh by 2020 (see chart), and that they will offer a range of 215 miles (350km) on a single charge.

Countries that have offered strong incentives to switch to EVs have seen rapid growth in their use. Norway, for instance, offers lower taxes, free use of toll roads and access to bus lanes. Almost a quarter of the new cars sold there are now electric (ample hydroelectricity makes the grid unusually clean, too). This bodes well for future growth, especially if governments strengthen their commitment to electrification in the wake of the Paris accord. The Electric Vehicles Initiative (EVI), an umbrella group of 16 EV-using nations, has pledged to get to 20m by 2020. The IEA says that to stand a chance of hitting the 2ºC global-warming target, there would need to be 700m EVs on the road by 2040. That seems hugely ambitious. It would put annual growth in EV sales on a par with Ford’s Model T—at a time when the car industry is also in a potentially epoch-making transition to self-driving vehicles. But imagine that the EVI’s forecast were achievable.
By 2020 new EV sales would be running at around 7m a year, displacing the growth in sales of new petrol engines, says Kingsmill Bond of Trusted Sources, a research firm. Investors, focusing not just on total demand for oil but on the change in demand, might see that as something of a tipping point. As Mr Bond puts it: “Investors should not rely on the phlegmatic approach of historians who tell them not to worry about change.”
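The scale of the fleet targets in the article is easy to check with compound-growth arithmetic, assuming a base of roughly 1m EVs in 2015 (the milestone the article mentions; the exact base year is an assumption):

```python
# Compound annual growth rates implied by the article's fleet targets,
# assuming a starting fleet of roughly 1m EVs in 2015.

def required_cagr(start, end, years):
    """Compound annual growth rate needed to go from start to end."""
    return (end / start) ** (1 / years) - 1

print(f"1m -> 20m by 2020 (EVI pledge): {required_cagr(1, 20, 5):.0%} a year")
print(f"1m -> 700m by 2040 (IEA path):  {required_cagr(1, 700, 25):.0%} a year")
```

Sustaining growth of roughly 30% a year for a quarter of a century is what "on a par with Ford's Model T" amounts to.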
Sunday, March 19, 2017
Comments due by March 31, 2017
The post for this week is slightly different than usual. Actually there is nothing to read; it is a 21-minute video that is 10 years old but is still one of the best efforts to explain, in plain language, The Story of Stuff. Give it a look. Enjoy.
Click on the above link and watch the 21-minute video. (If the link is dead then copy and paste)