
Thursday, April 28, 2011

THE MACHINE FOR DREAMING

Thomas Schenkel and his colleague T.C. Shen of the Accelerator and Fusion Research Division (Berkeley Lab) are working with an electron-beam ion trap to develop a quantum computer based on single-electron transistors. (Photo: Roy Kaltschmidt)

Moore’s law


If you were shrewd enough to invest $15,000 in Microsoft ten years ago you’d be a millionaire by now. The engine of growth that has driven the computer industry, and looks set to make Bill Gates the world’s first trillionaire, is no lucky fluke. Underpinning it have been sweeping advances in fundamental electronics. The first computers used bulky vacuum tubes and needed entire buildings to house them. Then in the 1960s along came the transistor, which in turn gave way to the incredible shrinking microchip. But this is not the end of the story. Yet more exotic technologies are in the pipeline, and they promise to have as great an impact on the information industry as did the invention of the original computer.



The commercial success of computers stems from the fact that with each technological leap, the processing power of computers has soared and the costs have plummeted, allowing manufacturers to penetrate new markets and hugely expand production. The inexorable rise of the computer’s potency is expressed in a rough and ready rule known as Moore’s Law, after Gordon Moore, the co-founder of Intel. According to this dictum, the processing power of computers doubles every 18 months. But how long can it go on?


Moore’s law is a direct consequence of the “small is beautiful” philosophy. By cramming ever more circuitry into a smaller and smaller volume, faster information processing can be achieved. Like all good things, it can’t go on forever: there is a limit to how small electronic parts can be. On current estimates, in less than 15 years, chip components will approach atomic size. What happens then?


The problem is not so much with the particulate nature of atoms as such. Rather it lies with the weird nature of physics that applies in the atomic realm. Here, the dependable laws of Newtonian mechanics dissolve away into a maelstrom of fuzziness and uncertainty.


To understand what this means for computation, picture a computer chip as a glorified network of switches linked by wires in such a way as to represent strings of 1’s and 0’s – so-called binary numbers. Every time a switch is flipped, a bit of information gets processed; for example, a 0 becomes a 1. Computers are reliable because in every case a switch is either on or off; there can be no ambiguity. But for decades physicists have known that on an atomic scale, this either/or property of physical states is fundamentally compromised.
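The switch picture above can be sketched in a few lines of code (an illustrative toy, not any particular chip's logic):

```python
# Toy model of a chip as a row of unambiguous on/off switches.
# Flipping one switch processes one bit: a 0 becomes a 1.

def flip(register, position):
    """Flip the switch at `position` using XOR: 0 <-> 1."""
    register[position] ^= 1
    return register

bits = [0, 1, 0, 1]   # each bit is definitely 0 or 1 -- no ambiguity
flip(bits, 0)
print(bits)           # [1, 1, 0, 1]
```

Run it twice on the same input and you get the same output every time; it is exactly this determinism that atomic-scale physics threatens.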



The source of the trouble lies with something known as Heisenberg’s uncertainty principle. Put crudely, it says there is an inescapable vagueness, or indeterminism, in the behaviour of matter on the micro-scale. For example, today an atom in a certain state may do such-and-such, tomorrow an identical atom could do something completely different. According to the uncertainty principle, it’s generally impossible to know in advance what will actually happen – only the betting odds of the various alternatives can be given. Essentially, nature is reduced to a game of chance.

Atomic uncertainty

Atomic uncertainty is a basic part of a branch of science known as quantum mechanics, and it’s one of the oddest products of twentieth-century physics. So odd, in fact, that no less a scientist than Albert Einstein flatly refused to believe it. “God does not play dice with the universe,” he famously retorted. Einstein hated to think that nature is inherently and fundamentally indeterministic. But it is. Einstein notwithstanding, it is now an accepted fact that, at the deepest level of reality, the physical world is irreducibly random.

When it comes to atomic-scale information processing, the fact that the behaviour of matter is unreliable poses an obvious problem. The computer is the very epitome of a deterministic system: it takes some information as input, processes it, and delivers a definite output. Repeat the process and you get the same output. A computer that behaved whimsically, giving haphazard answers to identical computations, would be useless for most purposes. But try to compute at the atomic level and that’s just what is likely to happen. To many physicists, it looks like the game will soon be up for Moore’s Law.

Although the existence of a fundamental physical limit to the power of computation has been recognized for many years, it was only in 1981 that the American theoretical physicist Richard Feynman confronted the problem head-on. In a visionary lecture delivered at the Massachusetts Institute of Technology (MIT), Feynman speculated that perhaps the sin of quantum uncertainty could be turned into a virtue. Suppose, he mused, that instead of treating quantum processes as an unavoidable source of error to classical computation, one instead harnessed them to perform the computations themselves? In other words, why not use quantum mechanics to compute?

It took only a few years for Feynman’s idea of a “quantum computer” to crystallize into a practical project. In a trail-blazing paper published in 1985, Oxford theoretical physicist David Deutsch set out the basic framework for how such a device might work. Today, scientists around the world are racing to be the first to make it happen. At stake is far more than a perpetuation of Moore’s Law. The quantum computer has implications as revolutionary as any piece of technology in history. If such a machine could be built, it would transform not just the computer industry, but our experience of physical existence itself. In a sense, it would lead to a blending of real and virtual reality.

At the heart of quantum computation lies one of the strangest and most baffling concepts in the history of science. It is known technically as ‘superposition’. A simple example concerns the way an electron circles the nucleus of an atom. The rules of quantum mechanics permit the electron to orbit only in certain definite energy levels. An electron may jump abruptly from one energy level to a higher one if enough energy is provided. Conversely, left to itself, an electron will spontaneously drop down from a higher level to a lower one, giving off energy in the process. That is the way atoms emit light, for example.

Because of the uncertainty principle, it’s normally impossible to say exactly when the transition will occur. If the energy of the atom is measured, however, the electron is always found to be either in one level or the other, never in between. You can’t catch it changing places.


Quantum superpositions

Now comes the weird bit. Suppose a certain amount of energy is directed at the atom, but not enough to make it jump quickly to an excited state. According to the bizarre rules of quantum mechanics, the atom enters a sort of limbo in which it is somehow in both excited and unexcited states at once. This is the all-important superposition of states. In effect, it is a type of hybrid reality in which both possibilities, excited and unexcited atom, co-exist. Such a ghostly amalgam of two alternative worlds is not some sort of mathematical fiction, but genuinely real. Physicists routinely create quantum superpositions in the laboratory, and some electronic components are even designed to exploit them in order to produce desired electrical effects.

For 70 years physicists have argued over what to make of quantum superpositions. What really happens to an electron or an atom when it assumes a schizophrenic identity? How can an electron be in two places at once? Though there is still no consensus, the most popular view is that a superposition is best thought of as two parallel universes that are somehow both there, overlapping in a sort of dual existence. In the case of the atom, there are two alternative worlds, or realities, one with the electron in the excited state, the other with the electron in the unexcited state. When the atom is put into a superposition, both worlds exist side-by-side.

Some physicists think of the alternative worlds in a superposition as mere phantom realities, and suppose that when an observation is made it has the effect of transforming what is only a potential universe into an actual one. Because of the uncertainty principle, the observer can’t know in advance which of the two alternative worlds will be promoted to concrete existence by the act of observation, but in every case a single reality is revealed – never a hybrid world. Other physicists are convinced that both worlds are equally real. Since a general quantum state consists of a superposition of not just two, but an unlimited number of alternative worlds, the latter interpretation implies an outlandish picture of reality: there isn’t just one universe, but an infinity of different universes, existing in parallel, and linked through quantum processes. Bizarre though the many-universes theory may seem, it should not be dismissed lightly. After all, its proponents include such luminaries as Stephen Hawking and Murray Gell-Mann, and entire international conferences are devoted to its ramifications.

How does all this relate to computation? The fact that an atom can be in either an excited or an unexcited state can be used to encode information: 0 for unexcited, 1 for excited. A quantum leap between the two states will convert a 1 to a 0 or vice versa. Atomic transitions can therefore be used as switches, or gates, for computation.
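Written in the standard notation of quantum mechanics, the 0/1 encoding and the quantum leap look like the following sketch (using NumPy; the names ket0 and ket1 are just this example's labels for the two energy levels):

```python
import numpy as np

# The two energy levels as the standard qubit basis vectors.
ket0 = np.array([1, 0])   # |0>: electron in the lower, unexcited level
ket1 = np.array([0, 1])   # |1>: electron in the excited level

# The quantum leap acts as a NOT gate (the Pauli-X matrix),
# swapping the two levels.
X = np.array([[0, 1],
              [1, 0]])

print(X @ ket0)   # [0 1] -- the 0 has become a 1
print(X @ ket1)   # [1 0] -- and vice versa
```

Applying the gate twice returns the original state, just as flipping a classical switch twice does.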

The true power of a quantum computer comes, however, from the ability to exploit superpositions in the switching processes. The key step is to apply the superposition principle to states involving more than one electron. To get an idea of what is involved, imagine a row of coins, each of which can be in one of two states: either heads or tails facing up. Coins too could be used to represent a number, with 0 for heads and 1 for tails.


Heads and Tails

Two coins can exist in four possible states: heads-heads, heads-tails, tails-heads and tails-tails, corresponding to the numbers 00, 01, 10 and 11. Similarly three coins can have 8 configurations, 4 can have 16 and so on. Notice how the number of combinations escalates as more coins are considered.

Now imagine that instead of the coins we have many electrons, each of which can exist in one of two states. This is close to the truth, as many subatomic particles when placed in a magnetic field can indeed adopt only two configurations: parallel or antiparallel to the field. Quantum mechanics allows that the state of the system as a whole can be a superposition of all possible such “heads/tails” alternatives. With even a handful of electrons, the number of alternatives making up the superposition is enormous, and each one can be used to process information at the same time as all the others. To use the jargon, a quantum superposition allows for massive parallel computation. In effect, the system can compute simultaneously in all the parallel universes, and then combine the results at the end of the calculation. The upshot is an exponential increase in computational power. A quantum computer with only 300 electrons, for example, would have more components in its superposition than all the atoms in the observable universe!
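The escalation is easy to check directly; the figure of roughly 10^80 atoms in the observable universe used below is the commonly quoted rough estimate:

```python
# Each extra two-state particle doubles the number of
# configurations in the superposition: n particles -> 2**n.

for n in (2, 3, 4, 20):
    print(n, "particles:", 2**n, "configurations")

atoms_in_universe = 10**80           # rough standard estimate
print(2**300 > atoms_in_universe)    # True: 2**300 is about 2 x 10**90
```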

Achieving superpositions of many-particle states is not easy (the particles don’t have to be electrons). Quantum superpositions are notoriously fragile, and tend to be destroyed by the influence of the environment, a process known as decoherence. Maintaining a superposition is like trying to balance a pencil on its point. So far physicists have been able to attain quantum computational states involving only two or three particles at a time, but researchers in several countries are hastily devising subtle ways to improve on this and to combat the degenerative effects of decoherence. Gerard Milburn of the University of Queensland and Robert Clark of the University of New South Wales are experimenting with phosphorus atoms embedded in silicon, using the orientation of the phosphorus nuclei as the quantum equivalent of heads and tails.

The race to build a functioning quantum computer is motivated by more than a curiosity to see if it can work. If we had such a machine at our disposal, it could perform tasks that no conventional computer could ever accomplish. A famous example concerns the very practical subject of cryptography. Many government departments, military institutions and businesses keep their messages secret using a method of encryption based on multiplying prime numbers. (A prime number is one that cannot be divided by any whole number except one and itself.) Multiplying two primes is relatively easy. Most people could quickly work out that, say, 137 x 293 = 40141. But going backwards is much harder. Given 40141 and asked to find the prime factors, it could take a lot of trial and error before you hit on 137 and 293. Even a computer finds the reverse process hard, and if the two prime numbers have 100 digits each, the task is effectively impossible even for a supercomputer.
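The asymmetry can be felt in a few lines of code. This toy version uses trial division and two small primes chosen purely for illustration; real cryptographic primes are vastly larger:

```python
# Multiplying two primes is one operation; recovering them by
# trial division takes roughly sqrt(n) operations.

def factor(n):
    """Return the smallest prime factor of n and its cofactor."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 1
    return n, 1  # n itself is prime

p, q = 10007, 10009       # two small primes (illustrative choice)
product = p * q           # easy direction: instant
print(product)            # 100160063
print(factor(product))    # hard direction: ~10,000 trial divisions
```

With 100-digit primes the trial count becomes astronomically large, which is exactly the hardness that the encryption relies on.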

In 1994 Peter Shor, now at AT&T Labs in Florham Park, New Jersey, demonstrated that a quantum computer could make short work of the arduous task of factorising large numbers into their prime factors. At this stage governments and military organizations began to take an interest, since it implied that a quantum computer would render much encrypted data insecure.





Quantum computers

Research projects were started at defence labs such as Los Alamos in New Mexico. NATO and the U.S. National Security Agency began pumping millions of dollars into research. Oxford University set up a special Centre for Quantum Computation.

Soon mathematicians began to identify other problems that looked vulnerable to solution by quantum computation. Most of them fall in the category of search algorithms – various forms of finding needles in haystacks. Locating a friend’s phone number in a directory is easy, but if what you have is a number and you want to work backwards to find the name, you are in for a long job.

A celebrated challenge of this sort is known as the travelling salesman problem. Suppose a salesman has to visit four cities once and only once, and the company wishes to keep down the travel costs. The problem is to determine the routing that involves minimal mileage. In the case of four cities, A, B, C and D, it wouldn’t take long to determine the distance travelled in the various alternative itineraries – ABCD, ACBD, ADCB and so on. But for twenty cities the task becomes formidable, and soars further as additional cities are added.
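A brute-force check of every itinerary, the only sure-fire classical approach, might look like this (the mileage figures are invented for illustration):

```python
from itertools import permutations
import math

# Distances between the four cities (invented for this example).
dist = {('A', 'B'): 10, ('A', 'C'): 15, ('A', 'D'): 20,
        ('B', 'C'): 35, ('B', 'D'): 25, ('C', 'D'): 30}

def leg(x, y):
    return dist[(x, y)] if (x, y) in dist else dist[(y, x)]

def tour_length(order):
    route = ('A',) + order + ('A',)          # start and finish at A
    return sum(leg(a, b) for a, b in zip(route, route[1:]))

best = min(permutations(('B', 'C', 'D')), key=tour_length)
print(best, tour_length(best))               # ('B', 'D', 'C') 80

# With the start fixed, n cities mean (n - 1)! itineraries:
print(math.factorial(3))    # 6 routes for 4 cities
print(math.factorial(19))   # ~1.2e17 routes for 20 cities
```

The factorial growth in the last two lines is why the twenty-city version is already formidable for exhaustive search.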

It is too soon to generalise on how effectively quantum computers will be able to short-circuit these sorts of mega-search problems, but the expectation is that they will lead to a breathtaking increase in speed. At least some problems that would take a conventional supercomputer longer than the age of the universe should be solvable on a quantum computer in next to no time. The practical consequences of this awesome computational power have scarcely been glimpsed.

Some scientists see an altogether deeper significance in the quest for the quantum computer. Ultimately, the laws of the universe are quantum mechanical. The fact that we normally encounter weird quantum effects only at the atomic level has blinded us to the fact that - to paraphrase Einstein - God really does play dice with the universe. The main use of computers is to simulate the real world, whether it is a game of Nintendo, a flight simulator or a calculation of the orbit of a spacecraft. But conventional computers recreate the non-quantum world of daily experience. They are ill suited to dealing with the world of atoms and molecules. Recently, however, a group at MIT succeeded in simulating the behaviour of a quantum oscillator using a prototype quantum computer consisting of just four particles.

 But there is more at stake here than practical applications, as first pointed out by David Deutsch. A quantum computer, by its very logical nature, is in principle capable of simulating the entire quantum universe in which it is embedded. It is therefore the ultimate virtual reality machine. In other words, a small part of reality can in some sense capture and embody the whole. The fact that the physical universe is constructed in this way – that wholes and parts are mutually enfolded in mathematical self-consistency – is a stunning discovery that impacts on philosophy and even theology. By achieving quantum computation, mankind will lift a tiny corner of the veil of mystery that shrouds the ultimate nature of reality. We shall finally have captured the vision elucidated so eloquently by William Blake two centuries ago:


To see a World in a grain of sand,

And a Heaven in a wild flower,

Hold infinity in the palm of your hand,

And eternity in an hour.


by Paul Davies, 10/31/2002


source http://www.physicspost.com

MORE AT    http://youtu.be/I56UugZ_8DI



(CLICKING ON THE TITLED LINK REDIRECTS TO A WIKI LIST OF QC SIMULATORS)


Wednesday, April 20, 2011

ADVANCED ENERGY TECHNOLOGIES

DEAR FELLOW READERS, HELLO,
OUR TEAM PARTICIPATED IN:

THE EU SUSTAINABLE ENERGY WEEK
11-15 APRIL 2011

DURING THE VARIOUS SESSIONS WE SAW INTERESTING PRESENTATIONS, WHICH ENLIGHTENED US ABOUT THE FUTURE ENERGY POLICIES THAT SHOULD BE CREATED.
BEING MORE PRECISE, WE FOLLOWED:

A) ENERGY EFFICIENCY IS NOT SUFFICIENT. WHAT ARE POSSIBLE SUFFICIENCY STRATEGIES?
ON 9/4/11 AT ULB (UNIVERSITE LIBRE DE BRUXELLES)

B)HIGH LEVEL OPENING OF THE EU CONFERENCE ON SUSTAINABLE ENERGY POLICY
ON 12/4/11 AT EUROPEAN COMMISSION - CHARLEMAGNE

C) ENERGY DAYS PIVEX PLATFORM AND SMART ENERGY NETWORKS
MINISTRY OF ENVIRONMENT AND FORESTS ROMANIA
ON 13/4/11 AT RESIDENCE HOTEL PALACE

D)COMPARING MARINE ENERGY TECHNOLOGIES
UNIVERSITY OF EDINBURGH,EUROPEAN OCEAN ENERGY ASSOCIATION
ON 14/4 AT CHARLEMAGNE

E)ICT FOR ENERGY EFFICIENCY
ON 14/4/11 AT COMMITTEE OF THE REGIONS

GENERALLY SPEAKING, A LOT OF IMPORTANT STAKEHOLDERS GATHERED AND DISCUSSED THE ENERGY PROBLEM IN DEPTH

THANK YOU ,
HAPPY EASTER TO OUR CHRISTIAN BROTHERS

AGGELOS CHARLAFTIS


BELOW WE PRESENT AN INTERESTING ESSAY:

An informational guide to available high-tech, efficient energy systems

It is no mystery now that depleting fossil fuel reserves and greenhouse gas emissions are problems that must be dealt with, especially in urban areas. The first of many steps is being taken by setting goals for cleaner and more efficient power generation; however, meeting these goals is beyond the capability of current methods. The answer to reaching and surpassing expectations may be to implement cutting-edge and innovative technologies. In this informational guide, some of the newest technologies will be explained in a manner that can reach those who do not have a technical background. The intention is to inform people of some of the technologies currently available, whose information may not be easily accessible. We hope that education will aid in the development and implementation of these innovative solutions and lead us into a cleaner and greener future.

Focusing on the London Borough of Merton, we will discuss several technologies that could aid in solving some of the problems of CO2 emissions, high fuel prices and waste disposal:
A) Combined Heat and Power
B) Hydrogen Fuel Cells
C) Pyrolysis
D) Anaerobic Digestion


Combined Heat  and Power



The combined heat and power (CHP) concept is quite simply the generation of heat and electricity from a single source; it represents the most efficient way to generate heat and electricity. During conventional power generation, excess heat is usually wasted. CHP systems utilize the waste heat, achieving overall machine efficiencies of 80% and more. Energy costs can be significantly reduced while being environmentally friendly, as GHG emissions are also reduced. Furthermore, many manufacturers today engineer machines to utilize a variety of fuel sources, including renewable bio-fuels.
In addition to the reduction in energy use and carbon emissions, there are a number of commercial benefits, including government funding and avoidance of the Climate Change Levy.
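A back-of-envelope energy balance shows where the 80%+ figure comes from. The 35/48 split below is an assumed round-number example, not data from any particular machine:

```python
# Illustrative energy balance per 100 kWh of fuel energy.
fuel_in = 100.0
electricity = 35.0        # kWh of electrical output (assumed)
recovered_heat = 48.0     # kWh of heat a CHP unit captures (assumed)

conventional_eff = electricity / fuel_in            # heat discarded
chp_eff = (electricity + recovered_heat) / fuel_in  # heat recovered

print(f"conventional: {conventional_eff:.0%}")  # 35%
print(f"CHP overall:  {chp_eff:.0%}")           # 83%
```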
 


BASIC PRINCIPLE

TYPES
 Steam Turbine
 Gas Turbine
 Combined Cycle Gas/Steam Turbine
 Reciprocating Engine

 
ENGINE GENERAL SPECIFICATIONS



DIVERSITY OF FUEL
Natural Gas - Biogas - Diesel  - Propane


CASE STUDY: WOKING
Since 1991, the London Borough of Woking has installed over 60 independent reciprocating-engine CHP machines across the borough. The machines are connected by a private wire network owned by the energy services company Thamesway Energy Ltd, which is 100% owned by the borough. The borough also incorporates renewable sources such as photovoltaics into the network; by 2003 the borough was 99.85% independent of the national grid. As a result, from 1991 to 2002 Woking reduced energy consumption by 43.8% (170,170,665 kWh) and cut carbon emissions by 71.5% (96,588 tonnes). Nitrogen oxide (NOx) and sulphur dioxide (SO2) emissions have been cut by 68% and 73.4% respectively. Total savings for the borough over 11 years have amounted to £4.9 million.

For more information visit:
http://www.aircogen.co.uk/
http://www.clarke-energy.co.uk/
http://www.cogenco.co.uk/
http://www.energ.co.uk/chp.asp


Hydrogen Fuel Cells

Fuel cells are among the most promising technologies for the future of power generation. Fuel cells represent the cleanest production of heat and electricity currently available. Operating through a non-combustion-based, non-mechanical process, fuel cells are able to achieve very low GHG emissions and excellent efficiency.
They are versatile and fuel-flexible, suiting almost any size of application, and deliver consistent, reliable power, even from renewable fuels. There is currently large-scale research and development in many countries to overcome the difficulties of commercialization; however, the technology is still largely immature and remains expensive compared to other, mature technologies.



BASIC PRINCIPLE

Hydrogen fuel cells operate on a principle originally demonstrated in 1839 by Welsh scientist Sir William Grove. He discovered an electrochemical process involving hydrogen and oxygen in a cell that produces electricity and heat.





BASIC PROCESS


1. Hydrogen-rich fuel flows into the anode, the negative terminal
2. Air flows into the cathode, the positive terminal
3. The electrochemical reaction is induced by the catalyst and occurs across the electrolyte
4. DC electricity is produced and is fed to the load (light bulb, motor, grid network)
5. Heat, water and CO2 (if pure hydrogen is not used) are exhausted

TYPES

There are many types of fuel cells; however, four have proven to be well suited for stationary power and cogeneration:
A) Polymer Electrolyte Membrane (PEM)
B) Phosphoric Acid Fuel Cell (PAFC)
C) Molten Carbonate Fuel Cell (MCFC)
D) Solid Oxide Fuel Cell (SOFC)
Each type of fuel cell offers different characteristics:






CASE STUDY: WOKING PARK

In September 2003 the London Borough of Woking installed a UTC PC25 PAFC fuel cell to provide heat and electricity to the leisure centre and pool area. The fuel cell has performed as expected, operating at 37% electrical efficiency. The overall efficiency has been less than expected, at 57%, as not all heat output has been utilized. The fuel cell has nonetheless brought great results to the borough in terms of fuel consumption and carbon emissions:
1) Carbon emission savings of over 1,000 tonnes/yr (compared to fossil fuel combustion methods)
2) 1 million litres of surplus pure water
Each PC25 fuel cell is rated to generate 200 kW of electrical power and 270 kW of thermal power. This is enough power for approximately 57 three-bedroom households.
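As a quick sanity check on the households figure quoted above:

```python
# 200 kW of electrical output shared across 57 homes.
electrical_kw = 200
households = 57
print(round(electrical_kw / households, 1), "kW per household")  # 3.5
```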


For more information visit:
http://www.eere.energy.gov/hydrogenandfuelcells/
http://www.fuelcelltoday.com
http://www.utcfuelcells.com


Pyrolysis

One of many new alternatives to typical waste disposal methods is pyrolysis. Pyrolysis is a quickly developing waste-to-energy technology that is cleaner and more efficient than methods such as incineration and landfilling. It is an advanced thermal treatment that uses extremely high temperatures in the absence of oxygen to break down waste and other organic material into more useful fuel products, including syngas, pyrolysis oil, and char.
With the expected growth in waste generation and reduction in landfill availability, pyrolysis is an appealing economic and environmental solution for urban areas and municipalities working to reduce the amount of waste being sent to landfills. Pyrolysis is designed not only to help minimize waste, but to generate fuel for local energy production in use with CHP and to reduce greenhouse gas (GHG) emissions.





WASTE TREATMENT




BASIC PROCESS



BY- PRODUCTS

1) Synthetic Gas (Syngas) - Gas by-product made up of carbon monoxide, hydrogen, carbon dioxide, and methane. Syngas can be used as a fuel to generate heat and/or electricity, or as a chemical for industrial use.
2) Pyrolysis Oil (bio-fuel) - Liquid residue that can be used as a fuel to generate heat and/or electricity, or as a chemical for industrial use, fertilization, etc.
3) Char - Solid residue containing carbon and ash. Char is typically disposed of but may be used as an alternative fuel or recycled.





CASE STUDY: BURGAU, GERMANY

In 1983, WasteGen UK supplied a Materials Energy and Recovery plant to Burgau, Germany. The plant is a unique combination of a pyrolysis plant and power generation plant and was designed to treat municipal solid waste (MSW). It was built just outside the city on approximately 1 hectare of land, and began full operation in 1984. The plant currently processes around 34,000 tonnes of MSW a year from 120,000 residents.
Any solid by-products produced by the plant are disposed of in a nearby landfill. Gas, however, is typically used to generate energy: syngas is burned in a gas boiler to create steam, which drives a 2.2 MW steam turbine for electricity production. This is enough electricity to power over 4,000 residential homes. Any excess steam is piped to a neighbouring greenhouse for heating.


Anaerobic Digestion

Anaerobic digestion (AD) is a growing technology in Europe and around the US for the treatment of waste and biomass. It is most commonly referred to as biological treatment or a waste-to-energy technology. Unlike typical methods of waste disposal, AD uses naturally occurring bacteria to break down biodegradable organic waste in the absence of oxygen and convert it into more useful by-products, including biogas, liquid digestate, and fibre digestate.
Commercial manufacture and availability of AD plants has only begun to increase in the past few decades, along with system designs for the treatment of municipal solid waste. However, with the projected growth in waste generation and reduction in landfill space availability, anaerobic digestion is becoming a much more attractive and economically feasible alternative for municipal solid waste disposal in urban areas.





WASTE TREATMENT




THE AD PROCESS

Step 1: Pre-Treatment: Materials not suitable for digestion are removed from the incoming waste.
Step 2: Waste Digestion: Incoming waste is moved into a large, enclosed tank, known as a digester, which is heated and rid of all oxygen. Bacteria grow inside the digester and break down complex waste matter into simpler materials.
Step 3: Gas Recovery: 30-60% of the incoming waste is converted to a biogas by-product, which is cleaned, collected, and stored until it can be used.
Step 4: Residue Treatment: Bioliquid and biosolid by-products are collected and treated for use as soil conditioners or composting material.

BY-PRODUCTS


I) Biogas - A gas made up of 60% methane and 40% carbon dioxide that can be burned to generate heat and/or electricity.
II) Bioliquid (Liquid Residue) - Liquid by-product that can be used as fertilizer to improve soils.
III) Biosolid (Fibre Residue) - Solid by-product that can be used as a soil conditioner or compost.



*Based on various sources


CASE STUDY: VALORGA PLANT

In 1994, Organic Waste Systems (OWS) began operation of the Valorga plant in Tilburg, Netherlands. The plant is located next to a landfill on 1.6 hectares of land and currently takes in waste from approximately 380,000 people. It has the potential for an annual waste capacity of 52,000 tonnes of VGF (vegetable, fruit, and garden waste), but usually takes in around 42,000 tonnes of VGF per year.
Studies have shown that the plant produces around 18,000 tonnes of compost yearly and 82-106 m3 of biogas per tonne of waste. The biogas is refined to a quality comparable to natural gas and burned to generate around 18 GWh of energy a year. 3.3 GWh of this is used to heat the AD plant, while the remaining 14.7 GWh is sold to gas distributors. The initial investment in the plant was £12 million, but the plant now brings in an average annual revenue of £2.2 million.
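A rough cross-check of the quoted figures, taking the midpoint of the stated 82-106 m3/tonne biogas yield:

```python
# Midpoint yield from the stated 82-106 m3 of biogas per tonne.
tonnes_per_year = 42_000
yield_m3_per_tonne = (82 + 106) / 2              # 94 m3/tonne
biogas_m3 = tonnes_per_year * yield_m3_per_tonne

stated_gwh = 18                                   # annual generation quoted
kwh_per_m3 = stated_gwh * 1e6 / biogas_m3

print(round(biogas_m3 / 1e6, 1), "million m3 of biogas a year")  # 3.9
print(round(kwh_per_m3, 1), "kWh per m3")                        # 4.6
```

The implied 4-5 kWh per cubic metre is a plausible heating value for a methane-rich biogas, so the quoted numbers hang together.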


 CHP References:
Greenpeace Briefing. (2006). Decentralising energy the Woking case study. Retrieved April 21, 2006 from  http://www.greenpeace.org.uk/MultimediaFiles/Live/FullReport/7468.pdf
Taking Stock: Managing our impact. (n.d). Case Study 2: Woking Borough Council Energy Services. Retrieved April 19, 2006 from  http://www.takingstock.org/Downloads/Case_Study_2-Woking.pdf
Cogenco Team (2006). CHP: An Overview. Retrieved April 21, 2006 from http://cogenco.co.uk/English/an_overview.html


Hydrogen Fuel Cell References:
MTU-Friedrichshafen. (2003). The high temperature fuel cell: combined power heat energy generation for the future. MTU CFC Solutions. Retrieved February 05, 2006 from http://www.mtu-friedrichshafen.com/cfc/en/cfcs/cfcs.htm#
Rocky Mountain Institute. (2005). Energy: Fuel Cells. Retrieved February 4, 2006 from http://www.rmi.org/sitepages/pid315.php
U.S. Department of Energy. (2005). Energy Efficiency and Renewable Energy: Hydrogen, Fuel Cells, and Infrastructure Technologies Program. Retrieved February 4, 2006 from http://www.eere.energy.gov/hydrogenandfuelcells/
United Technologies Company. (2006). Pure Cell 200 Power Solution. UTC Power: Our Solutions. Retrieved February 05, 2006  from http://www.utcpower.com/fs/com/bin/fs_com_Page/0,5433,03100,00.html

Pyrolysis References:
BTG Biomass Technology Group. (2005). Bio-oil Applications. Retrieved April 20, 2006 from
http://www.btgworld.com/2005/html/technologies/bio-oil-applications.html
Compact Power. (n.d.). Renewable Energy from Waste. Retrieved April 20, 2006 from http://www.compactpower.co.uk/index.php
European Environment Agency. (January 2002). Biodegradable Municipal Waste Management in Europe. Part 3: Technology and  Market Issues [Electronic Version]. Retrieved April 20, 2006 from http://www.environmental-center.com/articles/article1156/part3.pdf
Friends of the Earth. (October 2002). Briefing: Pyrolysis and Gasification [Electronic Version]. Retrieved February 5, 2006 from
http://www.foe.co.uk/resource/briefings/gasification_pyrolysis.pdf
Fortuna, F., Cornacchia, M., Mincarini, M., and Sharma, V. K. (1997). Pilot Scale Experimental Pyrolysis Plant: Mechanical and Operational Aspects. Journal of Analytical and Applied Pyrolysis, 40-41, 403-417.
Gale, Steve. (2001). Modern Residuals Processing in Theory and Practice. Retrieved February 5, 2006 from
http://www.hatch.ca/Sustainable_Development/Articles/organics_processing_2001.pdf
Juniper Consultancy Services Ltd. (2003). Pyrolysis and Gasification Factsheet. Technology Reviews for the Waste, Environmental,
and Renewable Energy Sectors. Retrieved February 5, 2006 from
Smith, G. (October 2004). Pyrolysis Facility. Landfilling Our Resources is a Waste. Retrieved April 21, 2006 from
http://www.lacity.org/council/cd12/pdf/
Landfilling_Resources_MPA_Pyrolysis_Facility.pdf
WasteGen UK, Ltd. (n.d.). Generating Value from Waste: Pyrolysis Energy Recovery. Retrieved February 5, 2006 from
http://www.wastegen.com/template.htm

Anaerobic Digestion References:
Duerr, M., Gair, S., Cruden, A, McDonald, J. (2005). The Design of a Hydrogen Organic Fuel Source/Fuel Cell Plant. International
Hydrogen Energy Congress and Exhibition. Scotland, UK: University of Strathclyde.
Friends of the Earth. (November 2004). Briefing: Anaerobic Digestion [Electronic Version]. Retrieved February 11, 2006 from
http://www.foe.co.uk/resource/briefings/anaerobic_digestion.pdf
IEA Bioenergy. (July 2001). Biogas and More! System and Markets Overview of Anaerobic Digestion [Electronic Version].
Oxfordshire, UK: AEA Technology Environment.
Maunder, D.H., Brown, K.A., and Richards, K.M. (August 1995). Generating Electricity from Biomass and Waste.
Power Engineering Journal, 9(4), 188-196.
Ostrem, K. (May 2004). Greening Waste: Anaerobic Digestion for Treating the Organic Fraction of Municipal Solid Wastes.
New York: Columbia University.
Verma, S. (May 2002). Anaerobic Digestion of Biodegradable Organics in Municipal Solid Wastes. New York: Columbia University.
Wannholt, L. (1999). Biological Treatment of Domestic Waste in Closed Plants in Europe – Plant Visit Reports.
RVF Report, 98:8. Malmo: RVF.
Waste. (May 2005). Fact Sheet: Anaerobic Digestion. Retrieved April 20, 2006 from http://www.waste.nl/page/248

SOURCE  http://www.wpi.edu
