February 2009

February 26, 2009

AC vs. DC – The Westinghouse / Edison War Continues...


Did you know that if Edison had had his way, all generation and transmission of electrical power, including the outlets in your house, would provide direct current (DC) instead of the alternating current (AC) we have today?  Around the turn of the 20th century, Nikola Tesla developed AC generation, AC transmission and the AC induction motor.  He then licensed his patents to George Westinghouse and the war with Edison began.  Edison went as far as electrocuting animals with AC power to show how lethal it was compared to direct current. The fact is, ANY electrical current can be fatal. It does take more current to put your heart into fibrillation with DC than with AC (around 60 milliamps for line-power AC, and 300 to 500 milliamps for DC). Above 200 milliamps, muscles contract so violently that the heart cannot pump at all...  That's why you should always switch off the circuit breaker when working on an electrical project... I do (well, most of the time).

We all know that Tesla and Westinghouse won the battle. AC power has the advantage of being easily "transformed" to higher and lower voltages, allowing transmission over vast distances.  Stepping the voltage up cuts the current needed to carry the same power, and it is the current that determines how much is lost as heat in the wires.  Edison's low-voltage DC suffered badly from Ohm's Law (R = V / I, where "R" is the resistance of the wire in ohms, "V" is the voltage drop across the length of the wire in volts, and "I" is the current flowing through the wire in amperes).  To calculate the power lost in a wire, you simply combine Ohm's Law with the power equation (P = I * V) to get P = I^2 * R, where P is power in watts.  Consider a transmission line carrying DC power at 10,000 amperes through a line resistance of only 0.1 ohm: you would be losing 10 million watts of power! There would also be a voltage drop of 1,000 volts from one end of the line to the other. Depending on the length of the wire, it will either get warm, catch fire or explode! Since real wires have resistance well above zero ohms (unless they are made from superconducting materials), low-voltage DC transmission was considered impractical and abandoned. But interestingly, the battle still rages on in pockets of our industry.
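If you want to check the arithmetic yourself, a few lines of Python will do it. This is a minimal sketch using the figures from the paragraph above, not measurements from any real transmission line:

```python
# Quick check of the I^2 * R example above (illustrative numbers from the post).
current = 10_000.0       # amperes flowing through the line
resistance = 0.1         # ohms of total line resistance

power_loss = current ** 2 * resistance   # P = I^2 * R, in watts
voltage_drop = current * resistance      # V = I * R, in volts

print(f"Power lost as heat: {power_loss / 1e6:.1f} MW")   # 10.0 MW
print(f"Voltage drop:       {voltage_drop:.0f} V")        # 1000 V
```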

There are complexities with AC power, namely maintaining the correct frequency (50 or 60 hertz, depending on your country) and phase synchronization.  When generators are brought on-line, they must exactly match the phase and frequency of the "grid", otherwise "seriously bad things happen".  Consider what would occur if a 100 megawatt generator were switched into the grid with as little as 1 degree of phase difference between the generator and the grid. A phase error of 1 degree (measured at the zero crossing, the point where the sine wave passes through zero before reversing) works out to over 1.74 megawatts of mismatched power! Well, in reality the power wouldn't be lost... it would show up somewhere you wouldn't want it to - like a high-voltage transmission transformer (imagine a large boom followed by much panic). That's why our transmission grids have safeguards - like high-power circuit breakers the size of automobiles. There are other problems with large distributed networks that span a nation - the phase of the power will be different along the grid, and there is always the issue of power factor.
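The 1.74 megawatt figure comes from scaling the generator's rating by the sine of the phase error. Here is a short sketch of that back-of-the-envelope calculation; the simple sin(delta) scaling is a simplification for illustration, not a full synchronization study:

```python
import math

# Rough sketch of the 1-degree phase-error example above. A real study would
# model the machine and grid impedances; treat the result as illustrative only.
generator_power_watts = 100e6    # 100 megawatt generator
phase_error_degrees = 1.0        # phase misalignment at the moment of switch-in

mismatch = generator_power_watts * math.sin(math.radians(phase_error_degrees))
print(f"Power dumped into the mismatch: {mismatch / 1e6:.2f} MW")   # ~1.75 MW
```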

With all the problems associated with AC power, our modern world still runs on it.  What's interesting is that in most homes, the electronics (including your PC) immediately convert the AC power into high-voltage DC and then use a switching power supply to step it down to the lower DC voltages required by the system.  Most electronic subsystems run on DC voltages ranging from less than 1 volt to around 48 volts.  There are losses in converting from one DC voltage to another, but most designs achieve about 80% efficiency, with many above 90%.  To learn more about switching power supplies, check out National's Analog University tutorial on switching power.  Also check out their WEBENCH tools, which allow you to design a complete switching power supply on-line.
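Because each conversion stage loses a little power, the losses compound from the wall outlet to the load. Here is a small sketch of that compounding using hypothetical stage efficiencies in the range mentioned above (my own illustrative numbers, not data for any particular supply):

```python
# Back-of-the-envelope look at how conversion losses stack up in a typical
# PC power chain. Stage efficiencies are illustrative assumptions.
stage_efficiencies = {
    "AC -> high-voltage DC (rectifier/PFC)":   0.95,
    "High-voltage DC -> 12 V (main switcher)": 0.90,
    "12 V -> 1.2 V (point-of-load regulator)": 0.85,
}

overall = 1.0
for stage, efficiency in stage_efficiencies.items():
    overall *= efficiency
    print(f"{stage}: {efficiency:.0%}")

print(f"Overall wall-to-load efficiency: {overall:.0%}")   # roughly 73%
```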

Another reason for converting to DC is the ever-increasing need for alternative energy sources such as wind and solar. For instance, photovoltaic panels used in solar installations supply DC power, which must then be converted to AC. And as LED lighting begins to overtake traditional incandescent bulbs and CFLs, it too will require direct current, again supplied by switching power supplies that convert the incoming power into a constant level of direct current for the LEDs.

But this raises the question, "what about our existing infrastructure?"  I doubt anyone would say, "sure, come on over and tear up my entire house and rewire it for DC power."  The issue with appliances alone is enough to stall any initiative.  However, a dual power system might actually have some merit.  For those systems that can benefit from DC power (such as charging your electric vehicle's batteries), a DC gateway into the home might provide some benefits.  You would have one very efficient power supply that converts the incoming AC line power down to around 48 volts DC.  Then, any appliance or electronics that require DC could start from that 48-volt point and easily convert it to whatever the system requires.

There is a quiet movement back to DC power, for some of the reasons above, at least at the final destination. I seriously doubt that Edison will finally win the war, which is pretty much over at this point. But as applications for direct current emerge in the home, a master DC home gateway may one day show up in your garage.  Something to think about... till next time...
 

February 08, 2009

Will Energy Costs Revive Home Automation?


Once upon a time there was a geeky guy who loved computers and electronics. Alas, he was unmarried, sans children and a social life.  So, one day he decided it would be a good idea to connect up two disconnected things - his home computer and the X-10 based lighting modules he had purchased from Radio Shack.  This required some embedded design, assembly code, printed circuit boards, several severe shocks and some application software.  At the end of his quest, he had effectively created a tiny home automation system capable of turning lights on and off at sunrise or sunset, detecting when he was home (which was very frequent) and setting the lighting mood.  When the evening ended, the system made sure everything (including the always-in-use coffee maker) was turned off.  That was me in 1994... needless to say, I'm not so lonely anymore! I have a beautiful wife, two children, a house, a dog, car payments, etc... how things have changed!

But whatever happened to my dream of a completely automated home?  My thought in 1994 was that technology (namely the Internet) would reach into the last few feet of my house by the end of the century and every wall switch and appliance would have an IP address... it just never seemed to materialize - for the masses, that is.  Yes, there are spectacularly expensive home automation systems available from several manufacturers, along with application software that requires a PC in every room.  These "high end" technologies are only available to those who can drop $40,000 or more to control their homes.  I'm not one of those people... and I don't know very many who are.

So what happened?  Why didn’t the technology ever find its way into our new homes? I believe the answer is quite simple - there was no "need" for it.  Manual rocker switches or dimmers are fine for just about everyone and the cost is hard to beat.  You can buy a dimmer switch at Home Depot or other hardware supply stores for under $10.00 (a basic model) and install it yourself (if you’re careful).  There is simply no need to automate your washing machine or your refrigerator - that is until now.

In the late 20th century, electrical costs were around 6-10 cents per kilowatt-hour (in most areas of the U.S.) - a very reasonable price for what you received, considering the overhead of the electrical grid and the generation plants.  An average home might use 800 to 900 kilowatt-hours per month, so the monthly bill would be under $90.  Everyone was happy...

Now, let's look at a hypothetical world - maybe only 10 years away.  In the world of 2019, electrical power is sporadic due to an aging infrastructure and extremely costly at 35 cents per kilowatt-hour.  Tariffs have been added to cover "time of use", which accounts not only for how much power was used, but for when it was used. These tariffs were extended to residential users in an attempt to keep the aging grid from collapsing during peak hours. Consuming energy during the peak period (7:00 AM until 7:00 PM) adds another 20 cents per kilowatt-hour (the equivalent of 55 cents per kilowatt-hour). So the same home using 800 to 900 kilowatt-hours now has a monthly bill of around $400 - and nobody is happy anymore...
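To see how that roughly $400 figure falls out, here is a small sketch of the bill arithmetic. The rates come from the scenario above; the 60/40 peak/off-peak usage split is an assumption added purely for illustration:

```python
# Hypothetical 2019 electric bill from the scenario above.
monthly_kwh   = 850      # midpoint of the 800-900 kWh example
base_rate     = 0.35     # dollars per kilowatt-hour
peak_adder    = 0.20     # extra per kWh consumed between 7:00 AM and 7:00 PM
peak_fraction = 0.60     # assumed share of usage during peak hours

peak_kwh     = monthly_kwh * peak_fraction
off_peak_kwh = monthly_kwh - peak_kwh

bill = peak_kwh * (base_rate + peak_adder) + off_peak_kwh * base_rate
print(f"Monthly bill: ${bill:.2f}")   # about $400
```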

In this future world, co-generation from solar and wind sources would help defray the cost of grid-connected power. Government subsidies would help put these technologies on many of the rooftops and in the back yards of residential consumers. Competition as well as new developments would also reduce the cost, but not enough to offset the demand.

With the cost of energy soaring in the world of 2019, new technologies could finally flourish that help reduce the consumption and improve overall efficiency.  A light switch that is monitored and controlled from a central computer could finally retail for $60 and people would buy it.  Appliances that communicate with the house metering system to know when to use power and have goals to reduce cost would now make sense - and every major white goods manufacturer would be scrambling to add those features to their latest products. Conservation and efficiency would become the mantra of the day since building new carbon based generation capability would have been outlawed - an interesting future it could be...

Economic pressure can move mountains.  As the cost of energy continues to increase - and it will continue - technologies will emerge to manage and conserve power.  Home automation systems may finally become as common as plumbing in an effort to conserve and manage energy. Industrial users have been closely watching their electrical meters for a very long time - now it’s time for residential users to start watching theirs.  See my previous blog, "Metering Your Power Consumption" for more ideas on monitoring where your energy is going.  Till next time...

February 02, 2009

The Role of Semiconductors in Energy Conservation


I've been hearing a great deal about how various technologies will be deployed to help reduce our carbon footprint as well as provide a sustainable energy future for all... these include alternative energy generation, smart grids, new solid-state lighting, and more.  The most interesting thing is that underlying all of those technologies (and many others) are the semiconductors that provide the computational engines, the sensing and signal conditioning, as well as the power conversion.  It is the humble "chip" that defines the semiconductor industry and that has made such amazing strides in the 50 years since its debut.  Now it's time to leverage that technology to save energy - not just consume it.

Without semiconductors, very few of our modern technologies would exist.  They would either be impossible to manufacture or simply too complex to implement (think "mechanical or vacuum tube" based computers).  Today, with energy on everyone's mind, conservation is at the forefront along with improved efficiency.  Considering that almost no one owned a computer in 1981 (except geeks like me), the conversion efficiency of the power supply was not a major issue - cost might have been a higher priority.  Today, however, just about everyone has at least one computer, and the energy consumption of the system is a high priority. Building computing platforms that use less energy is a focus for the major microprocessor vendors as well as the system designers.

Extending the view out into the Internet, the picture becomes cloudy as to exactly where the power is going.  It is going somewhere, however, and in gigantic quantities.  Yahoo and Google are both building new data centers in the Pacific Northwest to move closer to sources of hydroelectric power, which is (for now) plentiful and less expensive.  With the growth of the Internet continuing for the foreseeable future, the power consumed by this infrastructure will continue to climb.  Estimates are that by 2050, an additional 300 gigawatts' worth of power plants (coal, nuclear, natural gas, etc.) will need to be built to support the increasing consumption of electrical power.

So, if the semiconductor industry has enabled so much through higher levels of integration and performance, why can't the next big challenge be making these systems more energy efficient? I have no doubt that is exactly the thought on everyone's mind.  In the past, the goal was to put as many active devices on a single "chip" as possible.  Today, a billion transistors is standard operating procedure.  Now the goal is to reduce how much energy each transistor uses to do the same job. New technology such as quantum-well transistors holds the promise of reducing the energy consumed to around 1/10th that of today's modern CMOS (Complementary Metal Oxide Semiconductor) transistors.  Other technologies such as carbon nanotubes will also play a role, but we may not see the fruits of these technologies for another 5-10 years.

So, remember while you're talking on your cellular phone or watching that brand new 50" flat panel HDTV... without the semiconductor industry, most of modern life would not exist... and the future is more dependent on the success of that industry than most people think.  Till next time...