Press "Enter" to skip to content

A blast of solar energy from the past

Over the past century, rising levels of carbon dioxide and other greenhouse gases in the atmosphere from the burning of fossil fuels have contributed to the heating of Earth’s climate system. This has challenged organizations and industries, both nationally and globally, to expeditiously deploy low-carbon energy technologies to mitigate the impact of climate change on our everyday lives. But as fossil fuels continue to dominate world energy consumption, the transition to low-carbon energy has proven an uphill battle. One might presume this is partly because renewable technologies like solar power are relatively new compared to the long and steady operation of the fossil fuel industry, but history shows otherwise.

The concept of harnessing sunlight to produce thermal energy dates back to at least the 7th century B.C., when people started fires by concentrating the sun’s energy with magnifying lenses. In 1839, French physicist Edmond Becquerel discovered the photovoltaic effect, explaining how semiconducting materials could eventually be used to capture sunlight and generate electric current. It would take almost half a century, however, to produce the first solar cell design. In 1883, American inventor Charles Fritts built the first photovoltaic cells from selenium wafers, a precursor to the silicon-based cells produced by Bell Laboratories in 1954 that solar panels still rely on today.

In 1905, Albert Einstein published his paper on the photoelectric effect, which earned solar broader attention and acceptance. Even so, early adoption was slow, owing to costly production and the competitiveness of fossil fuels during the second industrial revolution. In 1956, Bell Labs’s cells, operating at four percent efficiency (less than a quarter of what is possible today), became commercially available, but at an expensive rate of over $1,900 per watt. For this reason, militaries were the first to invest in solar projects: the U.S. and Soviet space programs installed solar technology on satellites during the 1950s and early 1960s.

The U.S. government’s first push for commercial solar power came in the early 1970s, at the start of a nationwide energy crisis triggered by the Arab Oil Embargo of 1973 and met with the Emergency Petroleum Allocation Act of 1973. Congress passed the Solar Energy Research, Development and Demonstration Act of 1974 to direct agencies such as the National Science Foundation and the Department of Housing and Urban Development to improve solar power technology. In 1977, the government began funding the Solar Energy Research Institute, known today as the National Renewable Energy Laboratory, and the following year Congress passed the Energy Tax Act in the hope that a commercial investment tax credit and a residential energy credit would give the public a financial incentive to invest in solar installations. In spite of this commitment, the tax credits failed to meaningfully increase America’s solar power use; solar still accounted for a negligible share of electricity generation at the time.

From their astronomical 1956 levels, solar panel prices have gradually fallen to less than $0.80 per watt today, and private companies such as General Electric and Tesla have made their own commitments to developing affordable solar and water utilities through the adoption of renewable tech. Yet one issue still standing in the way of solar supplying a larger share of the world’s electricity is the current state of grid infrastructure in the U.S. and abroad.

According to Yan Qin, a senior modeling analyst at the power-industry intelligence provider Thomson Reuters Point Carbon, national grids were built to carry fairly consistent levels of generation, so solar’s variability poses a challenge, and the work and investment required to adapt the grid has not come easily. Solar has also been held back by its capacity factor, the ratio of the electricity a plant actually produces to what it would produce running at full capacity around the clock. A nuclear power station can operate at a capacity factor above 90%, while solar panels typically manage only 10-25%.
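To make that comparison concrete, here is a minimal sketch of the capacity-factor arithmetic; the plant sizes and annual outputs are illustrative assumptions, not figures from the article:

```python
# Capacity factor = actual energy produced / energy the plant would
# produce at full nameplate capacity over the same period.
# The plant sizes and outputs below are illustrative assumptions.

HOURS_PER_YEAR = 8760

def capacity_factor(actual_mwh: float, nameplate_mw: float) -> float:
    """Fraction of the theoretical maximum output actually delivered."""
    return actual_mwh / (nameplate_mw * HOURS_PER_YEAR)

# A 1,000 MW nuclear plant delivering ~7.9 million MWh in a year:
print(f"nuclear: {capacity_factor(7_900_000, 1_000):.0%}")  # ~90%

# A 1,000 MW solar farm delivering ~1.75 million MWh in a year:
print(f"solar:   {capacity_factor(1_750_000, 1_000):.0%}")  # ~20%
```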

One solution, suggested by Dolf Gielen, director of the International Renewable Energy Agency’s Innovation and Technology Centre in Germany, is to combine solar with other sources such as hydro, wind, tidal, and geothermal to offset its inconsistency. Nuclear power could also play a large role in providing steady low-carbon energy. Gielen predicts solar will be able to supply 10% of the world’s electricity by 2050, given continued research and efficiency improvements along the way. A 2016 study by Oxford University researchers, published in the journal Research Policy, separately concluded that falling manufacturing costs could grow solar’s share of global electricity to 20% by 2027.

Only time will tell how far we will go with solar technology as we continue the pursuit of a world with zero carbon emissions.
