
July 2008

July 29, 2008

Achieving Zero Carbon – Self Powering Electronic Systems


What if someone were to tell you that you could build a product that, following manufacturing, would use zero power and produce no carbon emissions?  My first response would be, "yes, if there is abundant energy in the product’s environment that can be harvested efficiently".  So what "energy" is there to harvest?  Let’s take a look at the available sources and methods for creating a zero power system, and at the limitations that constrain performance today.

First, we need to have energy of some sort to power our device.  Energy resources will limit the performance of our system, so knowing our energy budget is the first step in determining what is possible.  Here’s a list of candidate sources:

- Solar radiation
- Vibration (roadways, machinery)
- Thermal gradients
- Flowing water
- Ambient RF from radio and TV broadcasts
- ZPF (Zero Point Fluctuations)

OK, the ZPF (Zero Point Fluctuations) line item is a stretch - but I included it since, at the nano-scale, it may actually be a source of energy and it does exist.  Out of the list above, solar is probably the best for most applications.  Depending on the season and where on earth you are located, the sun rains down anywhere from 2 kWh/m^2 to over 7 kWh/m^2 of energy on the surface each day.  For applications such as buried highway sensors, vibration could be a good source of power (or even solar).  Water running down a drain could also be tapped to supply power to small monitoring systems.  And if the equipment is really frugal with its power, emissions from the many radio and TV stations could be harvested to supply microwatts (possibly charging capacitors for later use).
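To put the solar numbers in perspective, here’s a quick back-of-the-envelope sketch of what a small panel could deliver over a day.  The 10 cm x 10 cm panel size and 15% cell efficiency are my own assumed figures, not from the post:

```python
# Rough daily energy harvest from a small solar panel.
# Assumptions (hypothetical): 10 cm x 10 cm panel, 15% cell efficiency.
# Insolation range (2-7 kWh/m^2 per day) is from the text above.
PANEL_AREA_M2 = 0.10 * 0.10  # 10 cm x 10 cm
EFFICIENCY = 0.15            # typical-ish crystalline cell, assumed

def daily_harvest_wh(insolation_kwh_per_m2: float) -> float:
    """Energy captured per day, in watt-hours."""
    return insolation_kwh_per_m2 * 1000 * PANEL_AREA_M2 * EFFICIENCY

low = daily_harvest_wh(2.0)   # winter / high latitude
high = daily_harvest_wh(7.0)  # summer / low latitude
print(f"{low:.1f} to {high:.1f} Wh per day")
```

Even a panel this small yields a few watt-hours a day - plenty for a frugal sensor node that spends most of its life asleep.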

To understand what it would take, let’s use an example of a system I saw many years ago that really needed to be completely independent of any power source - a golf course sprinkler head.  This was a neat idea that a major irrigation manufacturer dreamt up (and you know who you are...).  Here was the problem: a golf course must have water to stay "green", and the system must easily adapt to changes in the location of traps and greens.  Conventional zoned systems that use wires cannot easily be moved.  Each head needs to be controlled and requires power, yet each head is used only a few hours each week.

So the solution was to make each irrigation head a self-contained system that was totally independent and could be installed anywhere along a pipe without requiring any external power.  Each head would communicate wirelessly to repeaters (or a mesh network) controlled from a central PC.  Moisture sensors could also be spread around the network to control the irrigation more accurately so as not to waste water.  To solve the power problem, a small solar panel at the top of each head charged a battery (the weak link - batteries require replacement) capable of opening the valve and running the electronics.  If the grass started to grow over the solar panel while the head was off, or the battery required replacement, the head could signal a problem to the central computer.

The overall function was to water the grass under control of a central system, require no external power source, and rarely need service - a very nice application of a completely autonomous system harvesting energy to perform its function.  Could it be improved?  How about using the high-pressure water during the watering cycle to run a small turbine and charge the battery - that helps if there’s not enough sun.  Of course, low-power electronics makes this all possible.  If the electronics draw more power than the solar panel can supply (and the head is small), then the battery will go dead (along with the grass).  Additionally, the valve must use hydraulic pressure to help hold itself open to minimize the energy required - a common feature of irrigation valves.

Let’s take a look at another system - this one hypothetical: an attic temperature sensor.  The idea here is to monitor the attic temperature to control an exhaust fan that helps reduce the cooling cost of the home.  Now there’s a new problem - no direct sunlight and limited ambient light.  There are several sources of power in the attic, most notably a large differential temperature (in summer or winter).  So the question is, how could this sensor be built?  I’m going to suggest a thermal generator charging a battery or capacitor.  During times of peak temperature differential, such as mid-day in summer, the generator charges the battery while the device monitors the temperature, communicating wirelessly with the controller in the fan.  The actual device could be shaped like a spike that is thrust through the ceiling wall-board and protrudes through the insulation.  The cold side is inside the house (in the ceiling) and the hot side is in the attic... could it work?  You tell me... till next time!
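For the curious, here’s a back-of-the-envelope check of the thermal-generator idea.  All the numbers are my own hypothetical assumptions (a small thermoelectric module with Seebeck coefficient S and internal resistance R, and a 30 K attic-to-ceiling temperature difference), not figures from the post:

```python
# Rough feasibility check of a thermoelectric generator (TEG) in the attic.
# All values are hypothetical assumptions for illustration only.
S = 0.05   # V/K, effective module-level Seebeck coefficient (assumed)
R = 2.0    # ohms, module internal resistance (assumed)
dT = 30.0  # K, hot (attic) side minus cold (ceiling) side (assumed)

v_open = S * dT                 # open-circuit voltage
p_max = v_open ** 2 / (4 * R)   # max power delivered into a matched load
print(f"{v_open:.2f} V open-circuit, {p_max * 1000:.0f} mW into a matched load")
```

A few hundred milliwatts at peak differential would be far more than a duty-cycled sensor and radio need, so the idea looks plausible - the hard part is sustaining the gradient across the spike.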

July 22, 2008

The Leaky Bucket Syndrome


Residential Energy Leakage
Most people have heard about "vampire power" - the leakage power that continues to flow even when a device is turned off.  Many home entertainment systems can draw several watts while in stand-by mode - the equivalent of "off".  Many set-top cable boxes keep most of their circuitry powered even when the unit is not "on"; this is required for the cable infrastructure to maintain communications with the box.

More curious still is the power that appliances draw when "off".  For instance, a 1000 watt microwave oven with a digital display may actually use more energy when not cooking than in actual use.  This is due to the electronics drawing power continuously while the oven itself is used only periodically.  If the oven is used for an average of 4 minutes per day, it will use roughly 24 kW-hrs of energy in a year.  The electronics use about 3 watts and are on 24/7/365, so that component consumes roughly 26 kW-hrs of energy - a bit more than the oven.
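The microwave comparison works out like this (the 1000 W, 3 W and 4 minutes/day figures are from the paragraph above):

```python
# Microwave oven: cooking energy vs. year-round standby electronics.
OVEN_WATTS = 1000       # rated cooking power
STANDBY_WATTS = 3       # display/electronics draw while "off"
MINUTES_PER_DAY = 4     # average daily cooking time

cooking_kwh = OVEN_WATTS * (MINUTES_PER_DAY / 60) * 365 / 1000   # ~24 kWh/yr
standby_kwh = STANDBY_WATTS * 24 * 365 / 1000                    # ~26 kWh/yr
print(f"cooking: {cooking_kwh:.1f} kWh/yr, standby: {standby_kwh:.1f} kWh/yr")
```

The always-on 3 watts really does edge out the 1000 watt oven element over a year.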

If you have 4 incandescent night lights that each use 3 watts of power and you leave them on 24/7 (many people do), in a year they add up to 105 kW-hrs of energy consumed.  Switching to 1 watt LED-based units with daylight sensors would save over 88 kW-hrs of energy in a year.  The LED units will last for 100,000 hours, effectively never requiring replacement.  If you consider that incandescent bulbs require periodic replacement, the payback for the LED units, with their improved efficiency and reliability, would be under a year.

OK, so saving a watt here or a watt there is like saving a penny here or a penny there - does it really make a difference?  Here’s a similar analogy: does dropping some loose change into a charity’s bucket during the holidays really make a difference?  If 10 million people dropped an average of 25 cents into buckets across the country, that would add up to 2.5 million dollars!  If every household in America dropped an average of 10 watts from their daily consumption (240 watt-hours per day), with over 100 million households in the US that would add up to over 24 million kW-hrs of energy per day.  That’s the equivalent output of a 1 gigawatt power plant!
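The household arithmetic, worked out in code (the 10 W and 100 million household figures are from the paragraph above):

```python
# "Leaky bucket" arithmetic: 10 W shaved per household, around the clock,
# across roughly 100 million US households.
HOUSEHOLDS = 100_000_000
WATTS_SAVED = 10

wh_per_household_per_day = WATTS_SAVED * 24                    # 240 Wh/day
total_kwh_per_day = HOUSEHOLDS * wh_per_household_per_day / 1000
gw_plant_equivalent = total_kwh_per_day / 24 / 1_000_000       # continuous GW
print(f"{total_kwh_per_day / 1e6:.0f} million kWh/day, "
      f"~{gw_plant_equivalent:.0f} GW plant equivalent")
```

24 million kWh per day spread over 24 hours is 1 GW of continuous output - one whole power plant’s worth.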

Industrial Size Problems
So if saving a few watts here and there can really add up, imagine the potential savings for industrial users, whose consumption can be thousands of times higher than a single family home’s.  It is estimated that 65% of industrial electricity use goes to powering motors.  Motors are most efficient when run at their rated loads; however, they quickly lose efficiency when run at lighter loads.  This is similar to the efficiency loss seen in switching power supplies run at lighter-than-designed loads.  A Department of Energy (DoE) study of nearly 2000 industrial motors from various applications nationwide showed 44% of them operating at less than 40% of their recommended loading.

So what can be done to improve motor efficiency?  One method is to replace direct drive systems with VSD, or Variable Speed Drive, systems.  It turns out that the speed of an AC synchronous electric motor is proportional to the frequency of the AC supply.  By implementing a variable speed drive, savings of anywhere from 5% to 50% can be realized.  With the cost of electricity increasing, the equipment cost can be quickly amortized and true savings realized.
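The frequency-speed relationship is the standard synchronous speed formula, n = 120 * f / poles (the 4-pole example is my own illustration):

```python
# Synchronous speed of an AC motor follows directly from line frequency:
#   n_rpm = 120 * f / poles
# A VSD exploits this by synthesizing a lower frequency to slow the motor
# (and the load) instead of throttling the output mechanically.
def sync_speed_rpm(freq_hz: float, poles: int) -> float:
    """Synchronous shaft speed in RPM for a given supply frequency."""
    return 120 * freq_hz / poles

print(sync_speed_rpm(60, 4))  # full 60 Hz line, 4-pole motor
print(sync_speed_rpm(30, 4))  # drive halves the frequency, halving the speed
```

For fan and pump loads, the required power falls roughly with the cube of speed, which is why even modest speed reductions can produce the 5% to 50% savings quoted above.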

What about lighting?  We talked about saving energy in our homes with dimmers and CFLs last week (Living With Less - Are Dimmers Better than CFLs?).  Now imagine how much could be saved by moving to active systems that lower the light level of fluorescent or HID (High Intensity Discharge) units near windows when it’s sunny.  Natural light coming through glass adds to the total available lighting in buildings.  If the lighting system can monitor that amount and make subtle changes to the artificial sources, a tremendous amount of energy can be saved.  Take, for instance, a parking garage.  During the day, only the interior HID units need to be working, since the units near the open periphery receive natural daylight.  Only at night would the units near the periphery need to be turned on.  Additionally, systems that can dim the HID units could gradually raise their brightness as daylight fades.

For example, a 4 story above-ground parking garage might use approximately 280 HID units (each consuming around 215 watts of power).  Each floor would have 70 units arranged in a 7 x 10 matrix.  So on each story there would be 30 units along the periphery that could be turned off completely during daylight and 22 units in the second ring that could be dimmed.  Each story could save 215 watts on each of the outer 30 units and 107 watts on each of the 22 units in the next ring.  This would save roughly 423 kW-hrs per day, or about 154,400 kW-hrs a year.  At US$0.10 per kilowatt-hour, this would save the garage owner over US$15,400 per year!  This does not include the savings in lamp replacement due to reduced wear - just a few thoughts on stopping the leaks in your buckets.
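Here’s the garage example recomputed.  The unit counts and wattages are from the paragraph above; the 12 daylight hours per day is my assumption, chosen because it reproduces the post’s 423 kW-hrs/day figure:

```python
# Parking garage HID daylight-dimming savings.
FLOORS = 4
HID_WATTS = 215        # full power per HID unit
OUTER_UNITS = 30       # periphery units turned fully off in daylight
RING_UNITS = 22        # second-ring units dimmed
RING_SAVED_WATTS = 107 # savings per dimmed unit
DAYLIGHT_HOURS = 12    # assumed hours/day the savings apply

watts_saved = FLOORS * (OUTER_UNITS * HID_WATTS + RING_UNITS * RING_SAVED_WATTS)
kwh_per_day = watts_saved * DAYLIGHT_HOURS / 1000
annual_dollars = kwh_per_day * 365 * 0.10  # at US$0.10 per kWh
print(f"{kwh_per_day:.0f} kWh/day, ${annual_dollars:,.0f}/yr")
```

About 35 kW shed for half of every day adds up to roughly US$15,400 a year before counting reduced lamp wear.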

Till next time...

July 14, 2008

Living With Less – Are Dimmers Better than CFLs?


Have you ever wondered if you installed a dimmer whether you’d save any energy in your home?  I have tons of networked dimmers installed throughout our house on every incandescent light bulb we have - including floor rope lights used for night time lighting.  "Why?" you might ask.  Besides being completely crazy about controlling and monitoring things around my house, it makes good sense to adapt the energy consumption of a particular light to the current requirements.  The interesting argument is, "how much do I save and are they better than CFLs?"

I’m going to propose a standard house.  One scenario will use incandescent bulbs and no dimmers, one will use Compact Fluorescent Lights (CFLs) without dimming, and the last will use incandescent bulbs with dimmers.  We will then run a simulation of a standard day’s usage pattern to find out which of these makes the most sense in reducing a household’s lighting energy consumption.

OK, so we need a standard house.  The US average energy consumption for homes is around 900 kW-hrs per month.  The US Department of Energy (DoE) states that around 8% of that is consumed by lighting (on average).  The rest is HVAC, refrigeration, water heating, TVs, electric appliances, pumps, etc.  So the energy consumed by the lights in a month would be around 72 kW-hrs, which gives us roughly 2.4 kW-hrs per day for lighting.

Figures 1 and 2 below show two separate scenarios based on a probable usage pattern of a family of 3 (i.e., husband, wife and teenager) during a normal weekday for a house with 16 bulbs.  The blocks indicate 30 minute periods to simplify the charts.  Figure 1 was filled to roughly 2400 W-hrs for 65W incandescent bulbs and no dimmers, which is roughly our standard US household.  Replacing all 16 bulbs with CFLs drops the total energy consumption to 622.5 W-hrs - a 75% savings in lighting energy.  Figure 2 shows the effect of adding dimmers to the same lights and lowering the brightness according to various tasks.  Many times lights are left on simply to navigate through a house and rarely need to be at full brightness.  Also, while watching TV, lowering the lights to a comfortable viewing level makes it easier to see.  TVs with adaptive brightness may also lower their backlight or projection brightness, saving power as well.  The calculation with the dimmers drops the energy consumption to 1837.5 W-hrs.  This is a 26% savings - much less than the 75% savings of the CFLs.

You can download the Excel spreadsheet I created by clicking here so you can run your own scenarios.

So in a year’s time, how much money does that save?  For a normal year of days like those above (365.25 of them), 2490 W-hrs * 365.25 equals 909.5 kW-hrs.  At an average rate of $0.10 per kW-hr, the incandescent power would cost approximately US$90 per year.  The CFLs’ power would cost only US$23 per year, and with dimmers we’d spend US$67.  Converting the incandescent bulbs to CFL units would probably cost around US$64, which would pay for itself in the first year.  Dimmers can cost anywhere from US$4 to over US$100 each, so the payback (best case) would be 2.78 years...
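The annual cost and payback numbers come straight from the daily watt-hour totals above; here they are in code (the US$64 conversion cost and $0.10/kWh rate are the post’s figures):

```python
# Annualized lighting cost for the three scenarios, at $0.10/kWh.
RATE = 0.10   # $/kWh
DAYS = 365.25

def annual_cost(daily_wh: float) -> float:
    """Dollars per year for a given daily lighting consumption."""
    return daily_wh / 1000 * DAYS * RATE

incandescent = annual_cost(2490)    # ~$91/yr
cfl = annual_cost(622.5)            # ~$23/yr
dimmed = annual_cost(1837.5)        # ~$67/yr

cfl_payback_years = 64 / (incandescent - cfl)        # $64 to swap 16 bulbs
dimmer_payback_years = 64 / (incandescent - dimmed)  # 16 dimmers at $4 each
```

The CFL swap pays back in under a year; the dimmers take closer to three (the post’s 2.78 years comes from the rounded $90 and $67 figures).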

Additional considerations are the environmental factors of CFLs - they all contain mercury, which is extremely toxic.  Newer versions use less, but the mercury is required for the bulb to operate.  Non-toxic LED bulbs will eventually emerge and drop in price, enabling cost savings as well as an environmentally safe solution.

Got a comment?  Drop me an email or comment here on the blog.  Till next time...

Figure 1 - No Dimming, Incandescent and CFL bulbs
Figure 2 - Dimming, incandescent bulbs

July 07, 2008

The Whole Can Be Less than the Sum – Adaptively Reducing Power


The title of this week’s blog sounds incorrect.  Isn’t it called synergy when combining pieces results in something greater than all the parts?  That is usually true, but today I’m going to discuss increasing energy efficiency through active means - that is, what happens when you combine parts in a design and the resulting power consumption of the system is actually lower (a great deal lower) than that of all the individual components.  How can this be?  It’s actually quite simple, so let’s take a look.

Here’s the concept - put together a bunch of components that monitor some power-consuming process and continuously "adapt" to the current required conditions to lower the energy consumed.  An example could be the back-light power supply for a personal media player.  While watching a video the player’s power supply is fed ambient light information from a photodiode.  As the ambient light changes, the drive current to the white LED backlight is adjusted.  It rarely needs to be at full power (direct sun light), so it adapts to the current surroundings.  In this way, the battery life is increased and the total energy consumed is reduced.

Now you may argue, "yes, but in direct sunlight the overall power consumption is actually larger due to the additional circuitry required to monitor the ambient light".  I will concede that argument is true.  However, let’s use some statistics (I love math!) to prove my point.  My theory is that if you were to survey the ambient lighting conditions of a large number of personal mobile devices (especially media players), you’d find most of them (around 84%) playing video in much less than full sunlight.  I arrived at this fairly large number by assuming a Gaussian distribution of ambient lighting conditions - from completely dark to full sunlight.  Taking everything up to one standard deviation above the mean (the mean-to-+1σ slice plus the entire remaining distribution on "the dark side" - couldn’t resist the pun) covers about 84% of users, none of whom are watching video in full sun (or near full sun).  Actually, most of the time users are in less than full sun for other reasons, such as preventing a sunburn or simply being comfortable - I live in Florida and I should know!

So if you accept my theory on the usage patterns, then you must agree that a PMP designed simply for full-sunlight viewing will use considerably more power than a device designed to "adapt" to the ambient conditions.  The next question is how much power is saved by being adaptive... this requires a bit more math.  First, let’s again assume a Gaussian distribution of light conditions during playback over normal usage.  We’ll use the standard Gaussian distribution formula as part of our calculations, shown in Equation 1 (with x measured in units of σ):

Equation 1:  f(x) = (1/√(2π)) · e^(−x²/2)

We’ll also assume a 3σ (standard deviation) spread to set the limits for full darkness and full brightness (daylight viewing with the LEDs at 100%).  This includes 99.7% of the usage cases (with a 0.3% error).  To clarify, -3σ corresponds to 0% ambient brightness (complete darkness) and +3σ to 100% ambient brightness (full sun) - see Drawing 1 and Equation 2.  We will also assume a continuously variable drive to the LED backlights (as opposed to a stepped approach) that tracks the ambient conditions but never drops below 20%, even in complete darkness:

Equation 2:  B(x) = 0.2 + 0.8 · (x + 3σ)/(6σ)

By applying the backlight level function in Equation 2 to the distribution function in Equation 1, we can calculate the total percentage of full power used by the adaptive backlight:

Equation 3:  P = ∫ from −3σ to +3σ of B(x) · f(x) dx ≈ 59.8%

This calculation covers the entire probability distribution, including very bright and full sun conditions.  If we instead evaluate the integral from −3σ to +1σ, which represents the 84% of the population mentioned before, the power drops from 59.8% down to 47.1% of full power - an incredible savings.
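The integral can be checked numerically.  This sketch assumes x is measured in units of σ and uses the 20%-to-100% linear backlight mapping described above:

```python
# Numerical check of the adaptive-backlight math: brightness maps linearly
# from 20% at -3 sigma to 100% at +3 sigma, weighted by a standard normal
# distribution of ambient-light usage.
import math

def phi(x):
    """Standard normal density."""
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

def brightness(x):
    """Backlight drive: 0.2 at x = -3, rising linearly to 1.0 at x = +3."""
    return 0.2 + 0.8 * (x + 3) / 6

def avg_power(lo, hi, n=100_000):
    """Trapezoidal integral of brightness(x) * phi(x) over [lo, hi]."""
    h = (hi - lo) / n
    total = sum(brightness(lo + i * h) * phi(lo + i * h) for i in range(n + 1))
    total -= 0.5 * (brightness(lo) * phi(lo) + brightness(hi) * phi(hi))
    return total * h

print(f"all users: {avg_power(-3, 3):.1%}")           # ~59.8% of full power
print(f"-3 sigma to +1 sigma: {avg_power(-3, 1):.1%}")
```

The first integral lands on 59.8%; the second comes out around 47.2%, agreeing with the post’s 47.1% to within rounding.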

Now let’s take a look at the system impact of the backlight savings on the total run time of the device.  Assume the backlight LEDs represent 40% of the device’s total power consumption at full brightness, and assume a run time of 2 hours with a non-adaptive backlight.  If we reduce the backlight power by 40% on average across all users (the 59.8% figure above), the overall run time improves to 2 hours and 23 minutes.  That’s an overall system improvement of 19%.  Now, if we reduce the backlight consumption by 53% for the 84% of the population that never watches video in bright light, the run time goes up to 2 hours and 32 minutes - a 27% improvement in performance.
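The run-time figures follow from a simple power-budget ratio (the 40% backlight share and 2 hour baseline are from the paragraph above):

```python
# Run-time impact: the backlight is 40% of total draw at full brightness,
# so cutting backlight power stretches the battery proportionally.
BASE_HOURS = 2.0        # run time with a non-adaptive, full-power backlight
BACKLIGHT_SHARE = 0.40  # backlight fraction of total power at full brightness

def run_time(backlight_fraction: float) -> float:
    """Playback hours when the backlight averages this fraction of full power."""
    return BASE_HOURS / (1 - BACKLIGHT_SHARE * (1 - backlight_fraction))

print(run_time(0.598))  # whole population average: ~2.38 h (2 h 23 min)
print(run_time(0.471))  # the dimmer 84% of users: ~2.54 h (2 h 32 min)
```

The same formula shows why near-dark viewing (backlight at 20%) buys roughly an extra hour of playback.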

In this discussion we have not considered the power consumed by the adaptive circuit itself, so we’ll assume it’s negligible relative to the backlight; in many cases it takes very little additional circuitry to perform these types of tasks.  As you can imagine, if you watch video in almost complete darkness with this adaptive PMP, you could probably get an additional hour of play from it.  Just some thought-provoking ideas...  If you want to know more about adaptive power reduction, check out National’s PowerWise® Solutions page.

Till next time...