Saturday, October 28, 2017

Real World Geoengineering

                                                      Comments due by Nov. 3, 2017

‘Geostorm’ movie shows dangers of hacking the climate – we need to talk about real-world geoengineering now

Fiddling with our global climate

The technology in the movie “Geostorm” is laughably fantastical. But the idea of technologies that might be used to “geoengineer” the climate is not. Geoengineering, also called climate engineering, is a set of emerging technologies that could potentially offset some of the consequences of climate change. Some scientists are taking it seriously, considering geoengineering among the range of approaches for managing the risks of climate change – although always as a complement to, and not a substitute for, reducing emissions and adapting to the effects of climate change.

These innovations are often lumped into two categories. Carbon dioxide removal (or negative emissions) technologies set out to actively remove greenhouse gases from the atmosphere. In contrast, solar radiation management (or solar geoengineering) aims to reduce how much sunlight reaches the Earth.

Because it takes time for the climate to respond to changes, even if we stopped emitting greenhouse gases today, some level of climate change – and its associated risks – is unavoidable. Advocates of solar geoengineering argue that, if done well, these technologies might help limit some effects, including sea level rise and changes in weather patterns, and do so quickly. But as might be expected, the idea of intentionally tinkering with the Earth’s atmosphere to curb the impacts of climate change is controversial. Even conducting research into climate engineering raises some hackles.

‘Geostorm’ is far-fetched, but scientists are taking seriously the idea of engineering Earth’s climate.

Global stakes are high

Geoengineering could reshape our world in fundamental ways.
Because of the global impacts that will inevitably accompany attempts to engineer the planet, this isn’t a technology where some people can selectively opt in or out: Geoengineering has the potential to affect everyone. Moreover, it raises profound questions about humans’ relationship to nonhuman nature. The conversations that matter are ultimately less about the technology itself and more about what we collectively stand to gain or lose politically, culturally and socially.

Much of the debate around how advisable geoengineering research is has focused on solar geoengineering, not carbon dioxide removal. One of the worries here is that figuring out aspects of solar geoengineering could lead us down a slippery slope to actually doing it. Just doing research could make deploying solar geoengineering more likely, even if it proves to be a really bad idea. And it comes with the risk that the techniques might be bad for some while good for others, potentially exacerbating existing inequalities, or creating new ones. For example, early studies using computer models indicated that injecting particles into the stratosphere to cool parts of Earth might disrupt the Asian and African summer monsoons, threatening the food supply for billions of people.

Even if deployment wouldn’t necessarily result in regional inequalities, the prospect of solar geoengineering raises questions about who has the power to shape our climate futures, and who and what gets left out. Other concerns focus on possible unintended consequences of large-scale open-air experimentation – especially when our whole planet becomes the lab. There’s a fear that the consequences would be irreversible, and that the line between research and deployment is inherently fuzzy.

Shading the Earth from the sun’s rays shouldn’t be a solitary pursuit.
And then there’s the distraction problem, often known as the “moral hazard.” Even researching geoengineering as one potential response to climate change may distract from the necessary and difficult work of reducing greenhouse gas levels and adapting to a changing climate – not to mention the challenges of encouraging more sustainable lifestyles and practices. To be fair, many scientists in the small geoengineering community take these concerns very seriously. This was evident in the robust conversations around the ethics and politics of geoengineering at a recent meeting in Berlin. But there’s still no consensus on whether and how to engage in responsible geoengineering research.

A geostorm in a teacup?

So how close are we to the dystopian future of “Geostorm”? The truth is that geoengineering is still little more than a twinkle in the eyes of a small group of scientists. In the words of Jack Stilgoe, author of the book “Experiment Earth: Responsible innovation in geoengineering”: “We shouldn’t be scared of geoengineering, at least not yet. It is neither as exciting nor as terrifying as we have been led to believe, for the simple reason that it doesn’t exist.”

Compared to other emerging technologies, solar geoengineering has no industrial demand and no strong economic driver as yet, and simply doesn’t appeal to national interests in global competitiveness. Because of this, it’s an idea that’s struggled to translate from the pages of academic papers and newsprint into reality. Even government agencies appear wary of funding outdoor research into solar geoengineering – possibly because it’s an ethically fraught area, but also because it’s an academically interesting idea with no clear economic or political return for those who invest in it.

Climate outcomes are not good for humanity in the Hollywood version of geoengineering.

Yet some supporters make a strong case for knowing more about the potential benefits, risks and efficacy of these ideas.
So scientists are beginning to turn to private funding. Harvard University, for instance, recently launched the Solar Geoengineering Research Program, funded by Bill Gates, the Hewlett Foundation and others. As part of this program, researchers David Keith and Frank Keutsch are already planning small-scale experiments to inject fine sunlight-reflecting particles into the stratosphere above Tucson, Arizona. It’s a very small experiment, and wouldn’t be the first, but it aims to generate new information about whether and how such particles might one day be used to control the amount of sunlight reaching the Earth. And importantly, it suggests that, where governments fear to tread, wealthy individuals and philanthropy may end up pushing the boundaries of geoengineering research – with or without the rest of society’s consent.

The case for public dialogue

The upshot is that there’s a growing need for public debate around whether and how to move forward. Ultimately, no amount of scientific evidence is likely to single-handedly resolve wider debates about the benefits and risks – we’ve learned this much from the persistent debates about genetically modified organisms, nuclear power and other high-impact technologies. Leaving these discussions to experts is not only counter to democratic principles but likely to be self-defeating, as more research in complex domains can often make controversies worse.

The bad news here is that research on public views about geoengineering (admittedly limited to Europe and the U.S.) suggests that most people are unfamiliar with the idea. The good news, though, is that social science research and practical experience have shown that people have the capacity to learn and deliberate on complex technologies, if given the opportunity. Hollywood’s version of the technology is one thing, but it’s time to talk about what a real future could be.
As researchers in the responsible development and use of emerging technologies, we suggest less speculation about the ethics of imagined geoengineered futures, which can sometimes close down, rather than open up, decision-making about these technologies. Instead, we need more rigor in how we think about near-term choices around researching these ideas in ways that respond to social norms and contexts. This includes thinking hard about whether and how to govern privately funded research in this domain. And uncomfortable as it may feel, it means that scientists and political leaders need to remain open to the possibility that societies will not want to develop these ideas at all.

All of this is a far cry from the Hollywood hysteria of “Geostorm.” Yet decisions about geoengineering research are already being made in real life. We probably won’t have satellite-based weather control any time soon. But if scientists intend to research technologies to deliberately intervene in our climate system, we need to start talking seriously about whether and how to collectively, and responsibly, move forward. (The Conversation)

Saturday, October 21, 2017

What Is The Value of the Great Reef?

                                                  Comments due by Oct 28, 2017
                                                           (A$: Australian dollar)

Deloitte Access Economics has valued the Great Barrier Reef at A$56 billion, with an economic contribution of A$6.4 billion per year. Yet this figure grossly underestimates the value of the reef, as it mainly focuses on tourism and the reef’s role as an Australian icon. When you include aspects of the reef that the report excludes, such as the ecosystem services provided by coral reefs, you find that the reef is priceless.

Putting a price on the Great Barrier Reef buys into the notion that a cost-benefit analysis is the right way to make decisions on policies and projects that may affect the reef. For example, the environmental cost of the extension to the Abbot Point coal terminal can be compared to any economic benefits. But as the reef is both priceless and irreplaceable, this is the wrong approach. Instead, the precautionary principle should be used to make decisions regarding the reef. Policies and projects that may damage the reef cannot go ahead.

How do you value the Great Barrier Reef?

The Deloitte report uses what’s known as a “contingent valuation” approach. This is a survey-based methodology, and is commonly used to measure the value of non-market environmental assets such as endangered species and national parks – as well as to calculate the impact of events such as oil spills. In valuing the reef, surveys were used to elicit people’s willingness to pay for it, such as through a tax or levy. This was found to be A$67.60 per person per year. The report also uses the travel-cost method, which estimates willingness to pay for the Great Barrier Reef, based on the time and money that people spend to visit it. Again, this is commonly used in environmental economics to value national parks and the recreational value of local lakes.

Of course, all methods of valuing environmental assets have limitations. For example, it is difficult to make sure that respondents are stating realistic amounts in their willingness to pay.
Respondents may act strategically if they think they really will be slugged with a Great Barrier Reef levy. They may conflate this environmental issue with all environmental issues. But more importantly, the methodology in the report leaves out the most important non-market values that the reef provides, known as ecosystem services. For example, coral reefs provide storm protection and erosion protection, and they are the nurseries for 25% of all marine animals, which themselves have commercial and existence value.

The Deloitte report even cites (but does not reference) a 2014 study that values the ecosystem services provided by coral reefs at US$352,249 per hectare per year. The Great Barrier Reef Marine Park covers 35 million hectares with 2,900 individual reefs of varying sizes. This means the ecosystem services it provides are worth trillions of dollars per year. That is, it is essentially priceless.

The problem with putting a value on the Reef

Valuing the environment at all is contentious in economics. Valuation is performed so that all impacts from, say, a new development, can be expressed in a common metric – in this case dollars. This allows a cost-benefit analysis to be performed. But putting a price on the Great Barrier Reef hides the fact that it is irreplaceable, and as such its value is not commensurate with the values of other assets. For instance, using Deloitte’s figure, The Australian newspaper compared the reef to the value of 12 Sydney Opera Houses. But while they are both icons, the Opera House can be rebuilt. The Great Barrier Reef cannot. Any loss is irreversible.

When environmental assets are irreplaceable and their loss irreversible, a more appropriate decision-making framework is the Precautionary Principle. The Precautionary Principle suggests that when there is uncertainty regarding the impacts of a new development on an environmental asset, decision makers should be cautious and minimise the maximum loss.
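The “trillions of dollars per year” claim above can be checked with a one-line calculation from the figures quoted in the text. Note one simplifying assumption, which is ours rather than the article’s: the per-hectare value is applied uniformly to the whole marine park, even though not all of its 35 million hectares are coral reef, so this is an upper bound.

```python
# Back-of-envelope check of the "trillions per year" claim, using only
# figures quoted in the text. Assumes (our simplification) that the
# per-hectare value applies uniformly across the entire marine park.

VALUE_PER_HA_USD = 352_249    # 2014 study's ecosystem-service value, per hectare per year
PARK_AREA_HA = 35_000_000     # Great Barrier Reef Marine Park area in hectares

total_per_year = VALUE_PER_HA_USD * PARK_AREA_HA
print(f"Upper-bound ecosystem-service value: ${total_per_year / 1e12:.1f} trillion per year")
# -> roughly $12.3 trillion per year
```

Even heavily discounted for the non-reef area of the park, the result lands in the trillions, which is the article’s point.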
For example, if it is even remotely possible that the extension to the Abbot Point coal terminal could lead to massive destruction of the reef, then precaution suggests that it shouldn’t go ahead. Assigning a value to the reef might still be appropriate under the Precautionary Principle, to estimate the maximum loss. But it would require the pricing of all values, and especially ecosystem services.

While the Precautionary Principle has been much maligned due to its perceived bias against development, it is a key element of the definition of Ecologically Sustainable Development in Australia’s Environment Protection and Biodiversity Conservation Act 1999. For a priceless asset like the Great Barrier Reef, it is perhaps better to leave it as “priceless” and to act accordingly. After all, if the Precautionary Principle is ever going to be used when assessing Ecologically Sustainable Development, in contrast with cost-benefit analysis and valuations, it is surely for our main environmental icon.

Ultimately, the protection and prioritisation of the Great Barrier Reef is a political issue that requires political will, and not one that can be solved by pricing and economics.

Saturday, October 14, 2017

Pigouvian Taxes

                                                 Comments due by Oct 21, 2017

A LOUD conversation in a train carriage that makes concentration impossible for fellow-passengers. A farmer spraying weedkiller that destroys his neighbour’s crop. Motorists whose idling cars spew fumes into the air, polluting the atmosphere for everyone. Such behaviour might be considered thoughtless, anti-social or even immoral. For economists these spillovers are a problem to be solved.

Markets are supposed to organise activity in a way that leaves everyone better off. But the interests of those directly involved, and of wider society, do not always coincide. Left to their own devices, boors may ignore travellers’ desire for peace and quiet; farmers the impact of weedkiller on the crops of others; motorists the effect of their emissions. In all of these cases, the active parties are doing well, but bystanders are not. Market prices—of rail tickets, weedkiller or petrol—do not take these wider costs, or “externalities”, into account.

The examples so far are the negative sort of externality. Others are positive. Melodious music could improve everyone’s commute, for example; a new road may benefit communities by more than a private investor would take into account. Still others are more properly known as “internalities”. These are the overlooked costs people inflict on their future selves, such as when they smoke, or scoff so many sugary snacks that their health suffers.

The first to lay out the idea of externalities was Alfred Marshall, a British economist. But it was one of his students at Cambridge University who became famous for his work on the problem. Born in 1877 on the Isle of Wight, Arthur Pigou cut a scruffy figure on campus. He was uncomfortable with strangers, but intellectually brilliant. Marshall championed him and, with the older man’s support, Pigou succeeded him to become head of the economics faculty when he was just 30 years old.
In 1920 Pigou published “The Economics of Welfare”, a dense book that outlined his vision of economics as a toolkit for improving the lives of the poor. Externalities, where “self-interest will not…tend to make the national dividend a maximum”, were central to his theme. Although Pigou sprinkled his analysis with examples that would have appealed to posh students, such as his concern for those whose land might be overrun by rabbits from a neighbouring field, others reflected graver problems. He claimed that chimney smoke in London meant that there was only 12% as much sunlight as was astronomically possible. Such pollution imposed huge “uncharged” costs on communities, in the form of dirty clothes and vegetables, and the need for expensive artificial light. If markets worked properly, people would invest more in smoke-prevention devices, he thought.

Pigou was open to different ways of tackling externalities. Some things should be regulated—he scoffed at the idea that the invisible hand could guide property speculators towards creating a well-planned town. Other activities ought simply to be banned. No amount of “deceptive activity”—adulterating food, for example—could generate economic benefits, he reckoned. But he saw the most obvious forms of intervention as “bounties and taxes”. These measures would use prices to restore market perfection and avoid strangling people with red tape. Seeing that producers and sellers of “intoxicants” did not have to pay for the prisons and policemen associated with the rowdiness they caused, for example, he recommended a tax on booze. Pricier kegs should deter some drinkers; the others will pay towards the social costs they inflict.

This type of intervention is now known as a Pigouvian tax. The idea is not just ubiquitous in economics courses; it is also a favourite of policymakers. The world is littered with apparently externality-busting taxes. The French government imposes a noise tax on aircraft at its nine busiest airports.
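The textbook logic behind such a tax—set it equal to the marginal external cost—can be sketched with a toy linear market. All numbers here are illustrative, not drawn from the article.

```python
# Toy market: linear demand P = 100 - Q, constant private marginal cost,
# and a constant external cost per unit that the market price ignores.
# The Pigouvian prescription sets the tax equal to the external cost.

DEMAND_INTERCEPT = 100.0   # willingness to pay at zero quantity
DEMAND_SLOPE = 1.0         # demand: P = 100 - Q
PRIVATE_COST = 20.0        # constant private marginal cost of supply
EXTERNAL_COST = 30.0       # harm each unit imposes on bystanders

def equilibrium_quantity(tax: float) -> float:
    """Quantity where willingness to pay equals private cost plus tax."""
    return (DEMAND_INTERCEPT - (PRIVATE_COST + tax)) / DEMAND_SLOPE

q_market = equilibrium_quantity(tax=0.0)            # externality ignored
q_social = equilibrium_quantity(tax=EXTERNAL_COST)  # externality priced in

print(q_market)  # 80.0 units: too many, each imposing 30 of unpriced harm
print(q_social)  # 50.0 units: the socially efficient output
```

The tax does not ban the activity; it shrinks it to the level at which the last unit’s total cost, private plus external, just equals what buyers are willing to pay.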
Levies on drivers to counterbalance the externalities of congestion and pollution are common in the Western world. Taxes to fix internalities, like those on tobacco, are pervasive, too. Britain will join other governments in imposing a levy on unhealthy sugary drinks starting next year. Pigouvian taxes are also a big part of the policy debate over global warming. Finland and Denmark have had a carbon tax since the early 1990s; British Columbia, a Canadian province, since 2008; and Chile and Mexico since 2014. By using prices as signals, a tax should encourage people and companies to lower their carbon emissions more efficiently than a regulator could by diktat. If everyone faces the same tax, those who find it easiest to lower their emissions ought to lower them the most.

Such measures do change behaviour. A tax on plastic bags in Ireland, for example, cut their use by over 90% (with some unfortunate side-effects of its own, as thefts of baskets and trolleys rose). Three years after a charge was introduced on driving in central London, congestion inside the zone had fallen by a quarter. British Columbia’s carbon tax reduced fuel consumption and greenhouse-gas emissions by an estimated 5-15%. And experience with tobacco taxes suggests that they discourage smoking, as long as they are high and smuggled substitutes are hard to find.

Champions of Pigouvian taxes say that they generate a “double dividend”. As well as creating social benefits by pricing in harm, they raise revenues that can be used to lower taxes elsewhere. The Finnish carbon tax was part of a move away from taxes on labour, for example; if taxes must discourage something, better that it be pollution than work. In Denmark the tax partly funds pension contributions.

Pigou flies

Even as policymakers have embraced Pigou’s idea, however, its flaws, both theoretical and practical, have been scrutinised. Economists have picked holes in the theory.
One major objection is the incompleteness of the framework, since it holds everything else in the economy fixed. The impact of a Pigouvian tax will depend on the level of competition in the market it is affecting, for example. If a monopoly is already using its power to reduce supply of its products, a new tax may not do any extra good. And if a dominant drinks firm absorbs the cost of an alcohol tax rather than passes it on, then it may not influence the rowdy. (A similar criticism applies to the idea of the double dividend: taxes on labour could cause people to work less than they otherwise might, but if an environmental tax raises the cost of things people spend their income on, it might also have the effect of deterring work.)

Another assault on Pigou’s idea came from Ronald Coase, an economist at the University of Chicago (whose theory of the firm was the subject of the first brief in this series). Coase considered externalities as a problem of ill-defined property rights. If it were feasible to assign such rights properly, people could be left to bargain their way to a good solution without the need for a heavy-handed tax. Coase used the example of a confectioner disturbing a quiet doctor working next door with his noisy machinery. Solving the conflict with a tax would make less sense than the two neighbours bargaining their way to a solution. The law could assign the right to be noisy to the sweet-maker, and if worthwhile, the doctor could pay him to be quiet.

In most cases, the sheer hassle of haggling would render this unrealistic, a problem that Coase was the first to admit. But his deeper point stands. Before charging in with a corrective tax, first think about which institutions and laws currently in place could fix things. Coase pointed out that laws against nuisance could help fix the problem of rabbits ravaging the land; quiet carriages today assign passengers to places according to their noise preferences.
Others reject Pigou’s approach on moral grounds. Michael Sandel, a political philosopher at Harvard University, has worried that relying on prices and markets to fix the world’s problems can end up legitimising bad behaviour. When in 1998 one school in Haifa tried to encourage parents to pick their children up on time by fining them, tardy pickups increased. It turned out that parental guilt was a more effective deterrent than cash; making payments seems to have assuaged the guilt.

Besides these more theoretical qualms about Pigouvian taxes, policymakers encounter all manner of practical ones. Pigou himself admitted that his prescriptions were vague; in “The Economics of Welfare”, though he believed taxes on damaging industries could benefit society, he did not say which ones. Nor did he spell out in much detail how to set the level of the tax. Prices in the real world are no help; their failure to incorporate social costs is the problem that needs to be solved. Getting people to reveal the precise cost to them of something like clogged roads is asking a lot.

In areas like these, policymakers have had to settle on a mixture of pragmatism and public acceptability. London’s initial £5 ($8) fee for driving into its city centre was suspiciously round for a sum meant to reflect the social cost of a trip. Inevitably, a desire to raise revenue also plays a role. It would be nice to believe that politicians set Pigouvian taxes merely in order to price in an externality, but the evidence, and common sense, suggests otherwise. Research may have guided the initial level of a British landfill tax, at £7 a tonne in 1996. But other considerations may have boosted it to £40 a tonne in 2009, and thence to £80 a tonne in 2014.

Things become even harder when it comes to divining the social cost of carbon emissions. Economists have diligently poked gigantic models of the global economy to calculate the relationship between temperature and GDP.
But such exercises inevitably rely on heroic assumptions. And putting a dollar number on environmental Armageddon is an ethical question, as well as a technical one, relying as it does on such judgments as how to value unborn generations. The span of estimates of the economic loss to humanity from carbon emissions is unhelpfully wide as a result, ranging from around $30 to $400 a tonne.

It’s the politics, stupid

The question of where Pigouvian taxes fall is also tricky. A common gripe is that they are regressive, punishing poorer people, who, for example, smoke more and are less able to cope with rises in heating costs. An economist might shrug: the whole point is to raise the price for whoever is generating the externality. A politician cannot afford to be so hard-hearted. When Australia introduced a version of a carbon tax in 2012, more than half of the money ended up being given back to pensioners and poorer households to help with energy costs. The tax still sharpened incentives; the handouts softened the pain.

A tax is also hard to direct very precisely at the worst offenders. Binge-drinking accounts for 77% of the costs of excessive alcohol use, as measured by lost workplace productivity and extra health-care costs, for example, but less than a fifth of Americans report drinking to excess in any one month. Economists might like to charge someone’s 12th pint of beer at a higher rate than their first, but implementing that would be a nightmare.

Globalisation piles on complications. A domestic carbon tax could encourage people to switch towards imports, or hurt the competitiveness of companies’ exports, possibly even encouraging them to relocate. One solution would be to apply a tax on the carbon content of imports and refund the tax to companies on their exports, as the European Union is doing for cement. But this would be fiendishly complicated to implement across the economy.
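To see how unhelpfully wide the $30–$400 social-cost-of-carbon range above really is, one can translate it into a tax at the petrol pump. The conversion factor of roughly 2.3 kg of CO2 emitted per litre of petrol is a standard combustion estimate and is our assumption, not a figure from the article.

```python
# Translate the social-cost-of-carbon range into an implied petrol tax.
# Assumption (ours): burning one litre of petrol emits roughly 2.3 kg CO2.

CO2_TONNES_PER_LITRE = 2.3 / 1000   # 2.3 kg expressed in tonnes

# Implied tax per litre at each end of the $30-$400 per tonne range.
implied_tax = {scc: scc * CO2_TONNES_PER_LITRE for scc in (30, 400)}

for scc, per_litre in implied_tax.items():
    print(f"SCC ${scc}/tonne -> ${per_litre:.2f} per litre of petrol")
```

The same underlying uncertainty thus spans everything from a levy of a few cents to nearly a dollar per litre, which is why model estimates alone cannot settle the tax's level.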
A global harmonised tax on carbon is the stuff of economists’ dreams, and set to remain so.

So, Pigou handed economists a problem and a solution, elegant in theory but tricky in practice. Politics and policymaking are both harder than the blackboard scribblings of theoreticians. He was sure, however, that the effort was worthwhile. Economics, he said, was an instrument “for the bettering of human life”. (Economist)

Friday, October 06, 2017

Cost-Benefit Analysis


                                                           Comments due by Oct. 14, 2017
The notion that a zero pollution objective is not necessarily ideal policy is one of the more difficult concepts for environmental economists to convey. After all, if pollution is bad shouldn’t we design policy to completely eliminate it? Many of us are drawn to the field based on a genuine concern for the environment and the belief that economics provides a powerful tool for helping solve environmental problems. Yet we are often in the position of recommending policies that appear on the surface to be anti-environmental. How can these observations be reconciled?

The answer lies in understanding scarcity: we have unlimited wants, but live in a world with limited means. Economists in general study how people make decisions when faced with scarcity. Scarcity implies that resources devoted to one end are not available to meet another; hence there is an opportunity cost of any action. This includes environmental policy. For example, funds used by a municipality to retrofit its water treatment plant to remove trace amounts of arsenic (a carcinogen) cannot also be used to improve local primary education. Environmental economists are tasked with recommending policies that reflect scarcity of this type at the society level. For both individuals and societies scarcity necessitates tradeoffs, and the reality of tradeoffs can make the complete elimination of pollution undesirable. Once this is acknowledged the pertinent question becomes how much pollution should be eliminated. How should we decide? Who gets to decide? To help provide answers economists use an analytical tool called cost-benefit analysis.

Cost-benefit analysis provides an organizational framework for identifying, quantifying, and comparing the costs and benefits (measured in dollars) of a proposed policy action.  The final decision is informed (though not necessarily determined) by a comparison of the total costs and benefits.  While this sounds logical enough, cost-benefit analysis has been cause for substantial debate when used in the environmental arena (see the online debate between Lisa Heinzerling, Frank Ackerman, and Kerry Smith).  The benefits of environmental regulations can include, for example, reduced human and wildlife mortality, improved water quality, species preservation, and better recreation opportunities.  The costs are usually reflected in higher prices for consumer goods and/or higher taxes.  The latter are market effects readily measured in dollars, while the former are nonmarket effects for which dollar values are not available.  In addition to complicating the practice of cost-benefit analysis (dollar values for the nonmarket effects must be inferred rather than directly observed) this raises ethical issues.  Should we assign dollar values to undisturbed natural places?  To human lives saved?  To the existence of blue whales and grey wolves?  If we decide such things are too ‘priceless’ to assign dollar values we lose the ability to use cost-benefit analysis to inform the decision.  What then is the alternative?  How do we decide?  Who gets to decide?
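The organizational framework described above reduces, at its core, to discounting each year's costs and benefits to present value and comparing the totals. The sketch below uses the arsenic-retrofit example from earlier in the post, but every figure and the 5% discount rate are hypothetical, chosen only to show the mechanics.

```python
# Minimal cost-benefit sketch: discount each year's (benefit - cost)
# stream to present value and sum. All figures are hypothetical.

def net_present_value(benefits, costs, discount_rate):
    """Sum of discounted (benefit - cost) for years t = 0, 1, 2, ..."""
    return sum(
        (b - c) / (1 + discount_rate) ** t
        for t, (b, c) in enumerate(zip(benefits, costs))
    )

# Hypothetical water-treatment retrofit: a large up-front cost, then
# monetized health benefits in later years (all in $ millions).
costs = [10.0, 1.0, 1.0, 1.0, 1.0]
benefits = [0.0, 4.0, 4.0, 4.0, 4.0]

npv = net_present_value(benefits, costs, discount_rate=0.05)
print(f"NPV: ${npv:.2f}m")  # positive -> benefits outweigh costs at 5%
```

Note that the hard part is not this arithmetic but filling in `benefits`: the nonmarket values discussed above must be inferred before they can appear in the stream at all, and the chosen discount rate can flip the sign of the answer.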
Environmental economists tend to favor cost-benefit analysis in the policy arena because of the discipline and transparency it provides in evaluating policy options. It is easy to evaluate absolutes. Most would agree that reducing nitrogen contamination of groundwater wells, limiting the occurrence of code red ozone alerts, and preserving habitat for grizzly bears are worthy goals. Determining the relative merits of any one of these compared to the others, or compared to non-environmental goals such as improving public education, is much more daunting. Because policy making is ultimately about evaluating the relative merits of different actions, some mechanism is needed to rank the alternatives. Without the discipline of cost-benefit analysis it is not clear how the interests, claims, and opinions of parties affected by a proposed regulation can be examined and compared. Criteria such as ‘moral’ or ‘fair’ do not lend themselves well to comparison and are subject to wide-ranging interpretation. Who gets to decide what is moral or fair? Cost-benefit analysis is far from perfect, but it demands a level of objectivity and specificity that are necessary components of good decision making.
To begin this post I described an apparent contradiction:  environmental economists who consider themselves ‘environmentalists’ will on occasion recommend environmental regulations that do not seek to completely eliminate pollution.  Hopefully it is now clear that this is really not a contradiction.  Environmentalists come in many forms, including activists, lobbyists, spokesmen, natural scientists, and even economists.  Economics provides a structured framework for evaluating outcomes absent hype and advocacy.  Cost-benefit analysis is a part of this.  By using the tools of their field environmental economists can contribute unbiased information that can lead to better policy decisions, and ultimately better environmental outcomes. 
                                                                                                                       (The Cromulent Econ)

Saturday, April 15, 2017

Are climate targets set in Paris achievable?


                                                          Comments due by Apr. 22, 2017

In order to have a good chance of meeting the limits set by the Paris Agreement, it will be necessary both to reduce greenhouse gas emissions and to preserve carbon sinks, with net emissions peaking within the next ten years, according to a new study.

Carbon dioxide (CO2) and other greenhouse gases in the atmosphere can be reduced in two ways: by cutting our emissions, or by removing them from the atmosphere, for example through uptake by plants, the ocean, and soil.
The historic Paris Agreement set a target of limiting the future global average temperature increase to well below 2°C, and of pursuing efforts to limit the increase even further, to 1.5°C above pre-industrial levels. Yet the timing and details of these efforts were left to individual countries.
In a new study, published in the journal Nature Communications, researchers from the International Institute for Applied Systems Analysis (IIASA) used a global model of the carbon system that accounts for carbon release and uptake through both natural and anthropogenic activities.
“The study shows that the combined energy and land-use system should deliver zero net anthropogenic emissions well before 2040 in order to assure the attainability of a 1.5°C target by 2100,” says IIASA Ecosystems Services and Management Program Director Michael Obersteiner, a study coauthor.
According to the study, fossil fuel consumption would likely need to be reduced to less than 25% of the global energy supply by 2100, compared to 95% today. At the same time, land use change, such as deforestation, must be decreased. This would lead to a 42% decrease in cumulative emissions by the end of the century compared to a business as usual scenario.
“This study gives a broad accounting of the carbon dioxide in our atmosphere, where it comes from and where it goes. We take into account not just emissions from fossil fuels, but also agriculture, land use, food production, bioenergy, and carbon uptake by natural ecosystems,” explains World Bank consultant Brian Walsh, who led the study while working as an IIASA researcher. 
Atmospheric carbon concentration [ppm] in the various scenarios, shown with CDIAC data and RCP projections © Walsh et al, 2017

The study compares four different scenarios for future energy development, with a range of mixtures of renewable and fossil energy.  In a “high-renewable” scenario where wind, solar, and bioenergy increase by around 5% a year, net emissions could peak by 2022, the study shows. Yet without substantial negative-emissions technologies, that pathway would still lead to a global average temperature rise of 2.5°C, missing the Paris Agreement target. 
Walsh notes that the high-renewable energy scenario is ambitious, but not impossible—global production of renewable energy grew 2.6% between 2013 and 2014, according to the IEA. In contrast, the study finds that continued reliance on fossil fuels (with growth rates of renewables between 2% and 3% per year), would cause carbon emissions to peak only at the end of the century, causing an estimated 3.5°C global temperature rise by 2100.
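The gap between the scenarios comes down to compound growth. As an illustration of how much a few percentage points per year matter over decades, here is a minimal sketch; the 5% and 2.6% growth rates are from the text, while the starting index of 100 and the 30-year horizon are hypothetical choices for the example:

```python
def compound(start, rate, years):
    """Grow `start` by `rate` (a fraction) per year for `years` years."""
    return start * (1 + rate) ** years

# Hypothetical baseline: renewable output indexed to 100 today.
base = 100.0
horizon = 30  # illustrative horizon, in years

high = compound(base, 0.05, horizon)      # ~5%/yr "high-renewable" scenario
observed = compound(base, 0.026, horizon) # ~2.6%/yr recent observed growth

print(round(high / base, 2))      # ≈ 4.32: more than quadruples in 30 years
print(round(observed / base, 2))  # ≈ 2.16: roughly doubles
```

The same small annual difference, compounded, roughly doubles the end-of-horizon renewable capacity, which is why the study's scenarios diverge so sharply by 2100.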
The authors note that not only the mix of energy matters, but also the overall amount of energy consumed. The study therefore also included ranges for high and low energy consumption.
The study adds to a large body of IIASA research on climate mitigation policy and the chances of achieving targets.
“Earlier work on mitigation strategies by IIASA has shown the importance of demand-side measures, including efficiency, conservation, and behavioral change. Success in these areas may explain the difference between reaching 1.5°C instead of 2°C,” says IIASA Energy Program Director Keywan Riahi, who also contributed to the new work.

Sunday, March 26, 2017

The Demand for Oil Will Drop, but Is That Enough?

                                                   Comments due by April 7, 2017

IT HAS BEEN a bad couple of years for those hoping for the death of driving. In America, where cars are an important part of the national psyche, a decade ago people had suddenly started to drive less, which had not happened since the oil shocks of the 1970s. Academics started to talk excitedly about “peak driving”, pointing to ageing baby-boomers, car-shy millennials, ride-sharing apps such as Uber and even the distraction of Facebook. Yet the causes may have been more prosaic: a combination of higher petrol prices and lower incomes in the wake of the 2008-09 financial crisis. Since the drop in oil prices in 2014, and a recovery in employment, the number of vehicle-miles travelled has rebounded, and sales of trucks and SUVs, which are less fuel-efficient than cars, have hit record highs.

This sensitivity to prices and incomes is important for global oil demand. More than half the world’s oil is used for transport, and of that, 46% goes into passenger cars. But the response to lower prices has been partially offset by dramatic improvements in fuel efficiency in America and elsewhere, thanks to standards like America’s Corporate Average Fuel Economy (CAFE) rules, the EU’s rules on CO2 emissions and those in place in China since 2012. The IEA says that such measures cut oil consumption in 2015 by a whopping 2.3m b/d. This is particularly impressive because interest in fuel efficiency usually wanes when prices are low. If best practice were applied to all the world’s vehicles, the savings would be 4.3m b/d, roughly equivalent to the crude output of Canada.

This helps explain why some forecasters think demand for petrol may peak within the next 10-15 years even if the world’s vehicle fleet keeps growing. Occo Roelofsen of McKinsey, a consultancy, goes further. 
He reckons that thanks to the decline in the use of oil in light vehicles, total consumption of liquid fuels will begin to fall within a decade, and that in the next few decades driving will be shaken up by electric vehicles (EVs), self-driving cars and car-sharing.

America’s Department of Energy (DoE) officials underline the importance of such a shift, given the need for “deep decarbonisation” enshrined in the Paris climate agreement. “We can’t decarbonise by mid-century if we don’t electrify the transportation sector,” says a senior official in Washington, DC. It is still unclear what effect Donald Trump’s election will have on this transition.

In a recent paper entitled “Will We Ever Stop Using Fossil Fuels?”, Thomas Covert and Michael Greenstone of the University of Chicago, and Christopher Knittel of the Massachusetts Institute of Technology, argue that several technological advances are needed to displace oil in the car industry. Even with oil at $100 a barrel, the price of batteries to power EVs would need to fall by a factor of three, and they would need to charge much faster. Moreover, the electricity used to power the cars would need to become far less carbon-intensive; for now, emissions from EVs powered by America’s electricity grid are higher than those from highly efficient petrol engines, say the authors.

My kingdom for a cheap battery

They calculate that at a battery’s current price of around $325 per kilowatt hour (kWh), oil prices would need to be above $350 a barrel for EVs to be cost-competitive in 2020. Even if they were to fall to the DoE’s target of $125 per kWh, they would still need an oil price of $115 a barrel to break even. But if battery prices fell that much, oil would probably become much cheaper, too, making petrol engines more attractive. Even with a carbon tax, the break-even oil price falls only to $90 a barrel. 
Those estimates may be too conservative, but the high cost of batteries and their short range help explain why EVs still make up only 0.1% of the global car fleet (though getting to 1m of them last year was a milestone). They are still mostly too expensive for all but wealthy clean-energy pioneers. Many experts dismiss the idea that EVs will soon be able seriously to disrupt oil demand.

Yet they may be missing something. Battery costs have fallen by 80% since 2008, and though the rate of improvement may be slowing, EV sales last year rose by 70%, to 550,000. They actually fell in America, probably because of low petrol prices, but tripled in China, which became the world’s biggest EV market. Next year Tesla aims to bring out its more affordable Model 3. It hopes that the cost of the batteries mass-produced at its new Gigafactory in Nevada will come down to below $100 per kWh by 2020 (see chart), and that they will offer a range of 215 miles (350km) on a single charge.

Countries that have offered strong incentives to switch to EVs have seen rapid growth in their use. Norway, for instance, offers lower taxes, free use of toll roads and access to bus lanes. Almost a quarter of the new cars sold there are now electric (ample hydroelectricity makes the grid unusually clean, too). This bodes well for future growth, especially if governments strengthen their commitment to electrification in the wake of the Paris accord. The Electric Vehicles Initiative (EVI), an umbrella group of 16 EV-using nations, has pledged to get to 20m by 2020. The IEA says that to stand a chance of hitting the 2°C global-warming target, there would need to be 700m EVs on the road by 2040. That seems hugely ambitious. It would put annual growth in EV sales on a par with Ford’s Model T—at a time when the car industry is also in a potentially epoch-making transition to self-driving vehicles. But imagine that the EVI’s forecast were achievable. 
By 2020 new EV sales would be running at around 7m a year, displacing the growth in sales of new petrol engines, says Kingsmill Bond of Trusted Sources, a research firm. Investors, focusing not just on total demand for oil but on the change in demand, might see that as something of a tipping point. As Mr Bond puts it: “Investors should not rely on the phlegmatic approach of historians who tell them not to worry about change.”

Sunday, March 19, 2017

The Story of Stuff

                                                   Comments due by March 31, 2017

The post for this week is slightly different than usual. Actually there is nothing to read; it is a 21-minute video that is 10 years old but is still one of the best efforts to explain its subject in plain language: The Story of Stuff. Give it a look. Enjoy.

Click on the above link and watch the 21-minute video. (If the link is dead then copy and paste.)