It is widely agreed that climate change is caused by the emission of greenhouse gases, chief among them CO2.
So, if we reduce CO2 emissions, we start fighting climate change and lessen the adverse effects of global warming.
How do we reduce CO2 emissions from man-made sources?
A significant portion of anthropogenic (simple English: from humans) CO2 emissions is due to industrial activities – industrial heating, cooling and power generation. In fact, our fossil-fuel-based power plants alone contribute about 30% of the total CO2 generated from man-made sources!
Thus, if we are able to make industries cut down on their CO2 emissions, we would have, to a certain extent, reduced global warming.
But how can one make industries reduce their CO2 emissions?
This is where carbon credits come in.
Carbon credit is the popular term for a system in which a company that generates fewer CO2 emissions per unit of production than the benchmark for its industry earns credits, which it can sell to a company that emits more than the threshold.
So, essentially, it is a system in which a company with lower CO2 emissions gets incentivised, with that incentive being funded by a penalty imposed on a company with higher CO2 emissions.
The idea behind the carbon credit system, of course, is that over time this will push the large CO2 emitters to start reducing the amount of CO2 they emit and thus fight climate change.
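The buy-and-sell mechanism above boils down to simple arithmetic against a benchmark. Here is a minimal sketch in Python; the benchmark figure, credit price, company names and emission numbers are all made up for illustration and do not reflect any real market or industry standard.

```python
# Illustrative sketch of the carbon credit idea, using made-up numbers.
BENCHMARK = 1.0          # hypothetical: tonnes of CO2 allowed per unit of production
PRICE_PER_CREDIT = 5.0   # hypothetical price, in dollars, per credit (1 tonne CO2)

def credit_balance(emissions_tonnes, units_produced):
    """Positive result = credits earned (emitted below the benchmark);
    negative result = credits that must be bought (emitted above it)."""
    allowed = BENCHMARK * units_produced
    return allowed - emissions_tonnes

# Clean Co. emits 800 t of CO2 while producing 1,000 units: earns 200 credits.
clean = credit_balance(800, 1000)    # 200.0

# Smokestack Inc. emits 1,300 t for the same output: must buy 300 credits.
dirty = credit_balance(1300, 1000)   # -300.0

# Smokestack Inc. pays for the credits it lacks; Clean Co. can sell its surplus.
penalty = -dirty * PRICE_PER_CREDIT  # 1500.0 dollars
```

The point of the sketch is just the sign convention: the under-emitter's surplus is exactly what the over-emitter has to pay for, which is where the incentive comes from.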
Until 2012, many developed countries that had signed an international agreement called the Kyoto Protocol had a compulsory carbon credit exchange for their industries. However, the Kyoto Protocol's first commitment period ended in December 2012. Since then, any company in these countries that buys carbon credits does so purely voluntarily. Not surprisingly, the price of a carbon credit (called a CER, short for Certified Emission Reduction) has plummeted and, as of 2016, it is almost worthless. Sad, but that is life!