
October 2008

October 29, 2008

The Energy Impact of Grid Computing


I was exploring the Internet to see what had happened to SETI’s @home project - I was once a member and ran their client on a Windows 2000 machine in my office.  I was quite surprised to see that the project had evolved and was still alive and well.  More interesting still was the myriad of projects using distributed or "grid" computing.

It started me thinking about problem solving in general. If you need large amounts of computing power (e.g. a very expensive supercomputer and its infrastructure), you may limit which problems you try to solve.  However, if large amounts of computing power are available and inexpensive (or free), then those seeking to solve complex problems tend to take advantage of it.

For those of you not familiar with grid computing, here’s a brief tutorial.  Traditional computers solve problems in a linear or serial fashion, similar to working through long division: you work on one piece of the problem and, when it is complete, move on to the next.  The results of the first calculations feed into the next step, so the work proceeds serially. Distributed computing uses many computers to solve a problem by breaking it into tiny pieces.  Each piece is assigned to a single computer for processing, so they can all work in parallel, greatly speeding up the result.

Only certain types of problems can be solved this way. For instance, the long division example above does not segment well for distributed computing. It does segment well for vector computing, which works like an assembly line but requires dedicated processing elements. Problems like weather forecasting, computational fluid dynamics and certain mathematical problems like fractals can all be broken into small pieces and solved in this manner.  In the worlds of physics, pharmaceutical research, genome analysis and others there are many problems well suited to this type of computing.  The pieces are independent of one another and fairly easily isolated to individual computer elements.

Typically, distributed computing is done with supercomputers designed around many processors, such as those built by IBM and other vendors.  These systems have anywhere from 64 to over 1024 independent computing elements (in many cases using multi-core processor chips, which multiply the processing capability even further).  This effectively provides thousands of times the computing power available from a single high-speed computer.

Now, imagine millions of computers tied together into one massive supercomputer.  You no longer have merely 1000 computing elements, but millions of them.  By tying together average home computers (which are usually pretty snappy by today’s standards) over the Internet, this is exactly what you have. This is grid computing: tying together computing elements with a communication grid such as the Internet. The computers are not tightly coupled as in dedicated cluster supercomputers; they can come and go as users take tasks on and off line.  By the sheer number of computers in the grid, large problems can be solved using this technique.

It does take software to coordinate the distribution of data sets and the collection of results.  One such technology is BOINC, which stands for Berkeley Open Infrastructure for Network Computing. The BOINC platform allows projects to distribute their problems over the grid and collect the results.  Many projects, such as SETI@home, have moved to BOINC.

While looking at several of the grid computing sites, I started thinking about the power consumed by the computers in the grid.  Typically, the clients are screen savers that scavenge computer time when the user is away (or not using the machine).  When you walk away from your computer and the screen saver kicks in, instead of moving images around to prevent CRT burn-in (old school), it starts up the compute tasks assigned to the machine. 

If you’re like me, you rarely turn off your computers, to get around the time it takes to "boot up" the machine.  Instead, I set the computer to enter a sleep mode while idle, which greatly reduces its power consumption. In this mode, small parts of the system stay powered up to monitor mouse or keyboard activity and alert the computer to “wake up” into full-power mode.  This can dramatically lower the power consumption from 150 watts (full speed with the LCD on) to 20 watts or less (LCD off, hard drives powered down, graphics off-line, processor speed reduced to a crawl, etc.).

Looking at this, 20 watts may still seem high considering it is drawn around the clock, but compared to 150 watts it’s a considerable savings - sleep mode draws only about 13% of the full-power level (I think I’m going to do some measurements and report in a later post).  If you consider that a single grid system like BOINC has over 500,000 computers in its cluster, the additional compute time increases the overall power consumption dramatically. If you assume a computer sleeps 70% of the day while the user is away from the machine, each computer consumes about 1.4 kW-hrs per day, and for 500,000 units the total daily energy consumption is roughly 708,000 kW-hrs. And if all 500,000 machines enter sleep mode at once, the instantaneous draw of the fleet drops by over 65 megawatts of power!
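The sleep-mode arithmetic above can be sketched in a few lines (a back-of-the-envelope calculation using the figures assumed in this post: 150 W active, 20 W asleep, 70% of the day idle, 500,000 machines):

```python
# Back-of-the-envelope energy math for a computer that sleeps 70% of the day.
# All figures are the assumptions stated in the post, not measurements.
ACTIVE_W = 150.0        # full-power draw, LCD on
SLEEP_W = 20.0          # sleep-mode draw
SLEEP_FRACTION = 0.70   # fraction of the day the user is away
MACHINES = 500_000

hours_active = 24 * (1 - SLEEP_FRACTION)   # 7.2 hours
hours_asleep = 24 * SLEEP_FRACTION         # 16.8 hours

kwh_per_day = (ACTIVE_W * hours_active + SLEEP_W * hours_asleep) / 1000
print(f"Per machine: {kwh_per_day:.2f} kWh/day")                         # 1.42
print(f"Fleet total: {MACHINES * kwh_per_day:,.0f} kWh/day")             # 708,000
# Instantaneous reduction when every machine sleeps at once:
print(f"Peak reduction: {MACHINES * (ACTIVE_W - SLEEP_W) / 1e6:.0f} MW") # 65
```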

Since the BOINC client does not allow the computer to enter sleep mode, the power consumption of the machine stays relatively flat all day.  Only the LCD display can be powered down (or enter stand-by mode). To calculate the average active power of the system, let’s assume the LCD is allowed to enter stand-by mode while the client runs.  A modern LCD such as the Dell E228WFP consumes roughly 40 watts while running and 2 watts in stand-by. So the LCD can power down, but the computer is still running at full power, reading and writing to the hard drive and doing intensive calculations.  The system power is only reduced to roughly 112 watts by the LCD entering stand-by (see diagram below).

Grid Computing Comparison  

If you now consider that each machine running the client draws roughly 112 watts for 70% of the time (and the full 150 watts the rest), each machine uses a little over 2.9 kW-hrs per day (compared with 1.4 kW-hrs per day for a non-grid computer). At US$0.16 per kilowatt-hour, that’s an increase in cost of only US$0.24 per day (US$7.20 per month) for any one user.  However, the grid now consumes 1.481 million kW-hrs per day compared to 708,000 kW-hrs, an increase of 773,000 kW-hrs per day.
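Extending the same arithmetic to an always-on grid client (again using this post's assumed numbers: 112 W with only the LCD in stand-by, US$0.16 per kWh):

```python
# Compare a computer that sleeps with one running a grid client all day.
# Assumptions from the post: 150 W active, 20 W asleep, 112 W with the LCD
# in stand-by, the user away 70% of the day, 500,000 machines in the grid.
PRICE_PER_KWH = 0.16
MACHINES = 500_000

def kwh_per_day(away_watts, active_watts=150.0, away_fraction=0.70):
    """Daily energy in kWh given the draw while the user is away vs. active."""
    return (active_watts * 24 * (1 - away_fraction)
            + away_watts * 24 * away_fraction) / 1000

sleeping = kwh_per_day(away_watts=20.0)    # normal sleep mode
gridding = kwh_per_day(away_watts=112.0)   # client keeps the machine awake

extra_kwh = gridding - sleeping
print(f"Per machine: {sleeping:.2f} vs {gridding:.2f} kWh/day")
print(f"Extra cost per user: ${extra_kwh * PRICE_PER_KWH:.3f}/day")
print(f"Grid-wide increase: {MACHINES * extra_kwh:,.0f} kWh/day")
```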

If you assume an average U.S. household consumes roughly 30 kW-hrs per day, that increase is the equivalent of adding 25,700 average homes to the power grid. This is not necessarily a bad thing, since solving incredibly compute-intensive problems could lead to a better world for humanity... but it does make you think "wow, that’s a lot of power!" Till next time...

October 16, 2008

Micro Power Stations – You Too Can Be a Utility Company


As noted in one of my recent blog posts, "The Quest for Energy Independence" (June 30, 2008), I get a sick feeling every month when I open my power bill - a feeling many of you probably share.  As most of you have experienced, the price of electricity has gone through the roof, with possibly no end in sight.  As fuel costs continue to surge, so does the price of a kilowatt-hour. We are so dependent on electricity that there’s no going back - and who would want to?  Yet we continue to blame the power companies for the rising costs and for the fact that we are now large consumers of the stuff.  Like anything that was once inexpensive, we used great amounts of it, never thinking we’d run out.

In reality, the power company has nothing to do with my consumption - they didn’t knock on my door and say, "hey, like to try some kilowatt-hours man... you’ll really like it!"  They are simply the supplier of a required commodity that used to be a lot cheaper than it is today. They are also at the mercy of the raw material suppliers - coal, oil and natural gas prices have all soared recently, so they have passed that burden on to us to maintain their profits, as any business in their position would do.

Now, when I built my current home, I estimated the cost of all support systems, including electricity... I even added an inflation rate that nicely aligned with my estimated cost-of-living increases as I looked into my crystal ball to predict the future.  When a monthly expense is only US$10 and it doubles, triples or quadruples, it might make you angry, but it will rarely force you to drastically change your lifestyle.  However, when that expense was US$190 in 2001 and seven years later the same expense is US$700+, it drastically affects the way you live and the way you view the future.

Had I known that the cost of electrical power would rise to such an extent in such a short timeframe, I would have made different choices during construction of our home.  For example, the house would have been constructed like a thermos bottle (at a much larger expense - initially).  HVAC equipment (heat pumps) is among the largest consumers of power in a home, rivaled only by refrigeration and lighting - especially in hot climates such as Florida.

The cooling systems in our current home had the highest Seasonal Energy Efficiency Ratio (SEER) available at the time. This rating is defined as the total British Thermal Units (BTUs) of cooling delivered per season divided by the total electrical energy input in watt-hours.  Our home required three heat pump units: two 2.5-ton units and one 1-ton unit.  The combined requirement for the house by the contracted design at peak temperatures (in Florida) was 72,000 BTU/hour of cooling!

To keep the house cool, the systems run during peak months for 15 hours per day (1800 hours per season), which results in a consumption of 10,800 kWh of energy!  At US$0.16 per kWh, that is US$1,728.00 per season (4 months), or over US$430 a month just to keep the house cool.  Add in lighting, refrigerators and various pumps (e.g. irrigation) and the cost skyrockets! Additionally, high-energy users get hit with surcharges from the power company. They don’t give you a discount for using more... they charge more per kilowatt-hour, and the break-point is the first 1000 kilowatt-hours, which we quickly surpass during hot months.
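The cooling-cost arithmetic works out as follows (a sketch assuming a SEER of 12, which is consistent with the 10,800 kWh figure above; the actual rating of the installed units isn't stated):

```python
# Seasonal cooling cost for the house described above.
# SEER is roughly BTU of cooling per watt-hour of electricity; 12 is an
# assumed rating chosen to match the 10,800 kWh figure in the post.
BTU_PER_HOUR = 72_000
SEER = 12.0
HOURS_PER_DAY = 15
SEASON_DAYS = 120          # ~4 peak months
PRICE_PER_KWH = 0.16

draw_kw = BTU_PER_HOUR / SEER / 1000        # electrical draw: 6 kW
season_hours = HOURS_PER_DAY * SEASON_DAYS  # 1800 hours per season
season_kwh = draw_kw * season_hours         # 10,800 kWh

print(f"Season: {season_kwh:,.0f} kWh -> ${season_kwh * PRICE_PER_KWH:,.2f}")
print(f"Monthly: ${season_kwh * PRICE_PER_KWH / 4:,.2f}")
```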

To address the problem, you can go back and improve insulation, plant shade trees, apply solar film to your windows, and more.  However, somewhere along the line nothing will stop heat transfer - after all, most houses are not designed as Dewar bottles - not yet, anyway. So you’ve done your part, but the cost of a kilowatt-hour will continue to rise, and until electrical energy is as accessible as dirt, there will be no end in sight.

So what can you do to battle the rising costs beyond conservation?  How about building your own power plant!  That’s right... you and your neighbors.  Technology can be a wonderful thing.  Today there are micro wind turbines such as those made by Helix Wind, low-cost solar photovoltaic panels such as those being pioneered by Nanosolar, and even moderate-sized hydroelectric systems designed for large streams (some dam building required), like those from Canyon Industries, for those who live in mountainous areas.

I was particularly excited about the small wind systems and imagined a corner of my neighborhood with dozens of these turbines generating power.  My thought was for our homeowners association to purchase and install the turbines and sell the power back to the community at (hopefully) a much lower cost than the power company’s.  The other method would be to meter (via the local utility) what goes back into the grid and apply credits equally to all the homeowners!  These systems are virtually maintenance-free and designed for a 30-year lifespan. What’s even more interesting is the availability of tax credits or grants for installing alternative energy generation capability!

So if you’ve conserved as much as possible, think about generating your own electricity! If I didn’t have to drill 5 miles down to reach temperatures high enough to generate superheated steam, I might even consider a backyard geothermal plant - so low-cost solar panels will probably be next.  Till next time...

October 10, 2008

Double Farming the Land... Get Two For The Price Of One!


If you’ve never been to Kansas, you should go - especially the outlying towns that sprang up in the early 1900s to handle the grain produced by the heartland of America.  Recently I was on a family vacation to visit relatives and found myself driving for miles, seeing nothing but fields of various grains in every direction.  This is the flattest yet most beautiful countryside I’ve ever seen.  Everywhere you look you see various crops, and once in a while a lonely group of cows or an abandoned rail line.

While driving the endless expanses of the Kansas plains, we would periodically come across a farmhouse with an outcropping of trees.  In every instance I noticed something unusual... all the branches and leaves of the trees pointed in the same direction, as if they had been growing toward the horizon.  As we drove through little towns such as Chase (which didn’t even have a stop sign), the scene repeated itself - more trees with growth in only one direction... weird.

When we arrived at Great Bend (our first destination), I stepped out of the car and was immediately pushed off my feet by a strong wind - my giant brain suddenly realized why the trees were all growing in the same direction... a never-ending breeze across the plains.  After we settled into our hotel, I started thinking about all the flat, open land we had just driven over and wondered why no one had thought of building wind farms here.  After all, crops can easily grow under a large wind turbine - unlike under a giant solar array, which would screen out the sunlight.

Then a problem occurred to me: this area is somewhat arid and requires irrigation.  Most farmers use center-pivot irrigation, which uses a deep well and a large mobile arm that traverses an arc or a complete circle.  As it moves it irrigates the crop below, and it can easily be moved out of the way for harvesting.  If you built wind turbine towers in this area, you’d have to be strategic about where you locate them.

My initial thought was that the towers would interfere with the farmers’ ability to use these irrigation machines - but with a little thought I realized you could build the turbine towers in the corners of the square property plots and never interfere with irrigation or harvesting - effectively double farming the land for both crops and energy (see illustration).

Wind Turbine in Farm Land
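The corner idea holds up with a little geometry: a full-circle center pivot sweeps a circle inscribed in the square plot, so the four corners are never irrigated anyway (a rough sketch, ignoring corner-arm attachments that some pivots use):

```python
import math

# Fraction of a square plot left dry by a full-circle center-pivot system.
# The pivot sweeps a circle inscribed in the square (radius = half the side),
# so the ratio is independent of the plot size.
def unirrigated_corner_fraction():
    # circle area / square area = pi * (s/2)^2 / s^2 = pi/4
    return 1 - math.pi / 4

print(f"Unirrigated corner area: {unirrigated_corner_fraction():.1%}")  # 21.5%
```

Roughly a fifth of each plot is dead ground for the pivot - plenty of room for a tower footing in each corner.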

The only other problem I could see was the lack of infrastructure to either store or transport the energy produced. Building the wind farms would be straightforward and could probably be done in very little time following harvest.  However, the high-tension lines required to move the newly generated power could be a problem - you’d need rights-of-way for those large towers and high-power lines...

The other option is to use the power locally, but that would require some form of storage. The weak link of both solar and wind is the current inability to effectively store the energy produced.  If you could store it and run machinery on it (e.g. crack water and run farm equipment on the hydrogen produced), you might find people more willing to have these large fans sitting on their property.  All things considered, I believe there are natural energy resources just waiting to be tapped - this is just another example. Till next time...

October 01, 2008

Performance and Energy Consumption... Are They Exclusive?


In my position I hear a great deal of discussion regarding the physical trade-offs between performance and power consumption.  "If you want to accelerate quickly in a car, you need power to overcome inertia."  I agree... but increasing the size of the power plant in a car isn’t the only way to get it to accelerate faster.  Acceleration for a given force is a function of mass (F = ma), so by decreasing the mass, you can get faster acceleration with the same power plant.

This is a very common approach to improving either performance or fuel economy in today’s modern sports cars, as well as jets, boats and other vehicles. These principles apply to electronic systems as well.  Complementary Metal Oxide Semiconductor (CMOS) devices define modern digital and mixed-signal electronics, and in the very design of these devices are power issues that grow as performance is increased.  For example, DRAM designs have capitalized on the supply-voltage term in the CMOS power equation to reduce the power consumed (see the equation below).

P = C × V² × f  (dynamic power, where C is the switched capacitive load, V the supply voltage and f the switching frequency)

This equation shows that the frequency and capacitive load terms contribute linearly to the power consumption.  Reduce the frequency by half and the power will also be cut in half.  However, the supply voltage is a square term, so by reducing the supply voltage from 1.8V (DDR2 memory) to 1.5V (DDR3 memory), the power consumption is reduced by roughly 30%, which is a major savings.
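The scaling is easy to check numerically (a sketch of the standard dynamic-power relationship; the capacitance and frequency values are arbitrary since only the ratios matter):

```python
# Dynamic CMOS power scales linearly with capacitance and frequency,
# but with the SQUARE of the supply voltage: P = C * V^2 * f.
def dynamic_power(c_load, v_dd, freq):
    return c_load * v_dd**2 * freq

base = dynamic_power(c_load=1.0, v_dd=1.8, freq=1.0)  # DDR2 supply voltage
ddr3 = dynamic_power(c_load=1.0, v_dd=1.5, freq=1.0)  # DDR3 supply voltage

print(f"1.8V -> 1.5V scaling: {ddr3 / base:.2f}")  # 0.69, a ~30% saving
print(f"Halving frequency: {dynamic_power(1.0, 1.8, 0.5) / base:.2f}")  # 0.50
```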

As process geometries continue to shrink, the conduction channel gets shorter (good) and the gate insulator gets thinner (bad). To reduce leakage (electrons that "tunnel" through the thin insulator), manufacturers have moved to lower supply voltages: reducing the voltage across the transistor also reduces the electric field between the gate and the conduction channel.  New materials such as nitrided hafnium silicates (HfSiON) are being used to replace silicon dioxide in an effort to curb leakage and electron tunneling (Intel is already shipping processors using a hafnium-based high-k dielectric in its 45 nm process).

No matter how you slice the problem, when you have a billion transistors each using a tiny amount of power, you end up with a large amount of power being consumed.  Processors and digital systems require large numbers of transistors, and for the foreseeable future densities will only increase.  To increase performance, there must be another way...

My impression is that the industry can take several paths in an effort to increase performance while minimizing power consumption.  One path (currently the path of choice) is to continue shrinking process geometries to 20 nm or below, which becomes extremely hard to fabricate; this allows more transistors on the same size die and sub-one-volt supply voltages.  Another avenue is to migrate away from silicon processes altogether and find another way to make transistors.  There is ongoing research into quantum-well transistors made in indium antimonide, which may be the next step for higher-performance digital functions at extremely low power - one tenth of today’s consumption.  Because of the large capital investment in integrated circuit fabrication technology, the least painful next step will need to be similar to silicon-based manufacturing.  Research is also being done on diamond-based semiconductors and carbon nanotube technologies to reduce power while improving performance.

But what about revolutionary change?  What if we abandon semiconductors altogether and move to optical computing and analog functions based on non-linear crystals?  Is this even possible at the scale at which we currently build processors, analog-to-digital converters, amplifiers and other electronic components? Maybe our industry needs to take a step back and consider the new horizon in front of us...  a world where energy consumption is as much a factor as how fast we go... something to think about.  Till next time...