Autodesk Labs Software Engineer, Frederic Loranger, submitted this article.
The Computing Cloud Infrastructure
One of the big tasks of the Obama administration is to upgrade our outdated electric grid infrastructure to support, manage, and distribute renewable energy generated from wind and solar. The technology is ready, but unfortunately the infrastructure is not.
For instance, a solar power plant with thousands of panels can generate and store electric energy only during the day. How do you make sure that this energy gets distributed back to the grid during peak hours in the evening? Storing and distributing the energy across the grid is a huge challenge since the power output is not deterministic. Some days are sunny while others are cloudy, and demand can be high or low based on conditions that are hard to forecast.
We have a similar problem in the cloud computing world. There are a lot of untapped computing resources sitting in the cloud, burning fossil fuels even when utilization is below peak, whether they are the servers at Amazon or Microsoft, the servers of private enterprises, or even your own computers at home. Service providers were quick to realize that they could make money from their unused computing resources and the technology they had developed, so much so that they have built entire businesses around it.
So how could we make sure that all this computing power gets used efficiently? By sharing the resources of all these different cloud computing providers, the same way we do with the electric grid. With a Cloud Computing Infrastructure Standard, cloud providers would be able to tap into other pools of computing resources when they reach their own peak usage.
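The idea of tapping into another provider's pool at peak, often called "cloud bursting," can be sketched in a few lines. This is only an illustrative toy, not any real provider's API: the provider names, capacities, and the `dispatch` helper are all assumptions for the example.

```python
# Toy sketch of cloud bursting: jobs run on the home provider until it is
# at capacity, then overflow to partner providers. All names and numbers
# here are hypothetical, for illustration only.

class Provider:
    def __init__(self, name, capacity):
        self.name = name
        self.capacity = capacity  # max concurrent jobs this pool accepts
        self.load = 0             # jobs currently running

    def has_room(self):
        return self.load < self.capacity

    def run(self, job):
        self.load += 1
        return f"{job} -> {self.name}"

def dispatch(job, home, partners):
    """Place the job on the home provider, bursting to partners at peak."""
    for provider in [home] + partners:
        if provider.has_room():
            return provider.run(job)
    raise RuntimeError("all providers are at peak capacity")

home = Provider("home-dc", capacity=2)
partners = [Provider("partner-a", capacity=2)]

placements = [dispatch(f"job-{i}", home, partners) for i in range(4)]
# The first two jobs stay on home-dc; the next two burst to partner-a.
```

A shared infrastructure standard would amount to agreeing on the interface that `dispatch` assumes: a common way to ask any provider "do you have room?" and to hand it work.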
Thanks Frederic. Thinking green is alive in the Lab.