THE EUROPEAN RESPONSE MIGHT BE THE SYNNET
DEAR FELLOW READERS, IRENE, PEACE, SALAM,
IT SEEMS THAT THE UPCOMING EVENTS WILL BE VERY CRUCIAL FOR OUR PAN-EUROPEAN AND MEDITERRANEAN ZONE, FOR WHICH WE HAVE BEEN WORKING FOR SEVERAL YEARS.
MEANWHILE, AND UNTIL THAT TIME, RESEARCH CONTINUES:
2/5/12
RESEARCH IN FUTURE CLOUD COMPUTING,
AT THE CHARLEMAGNE BUILDING, BRUSSELS, BELGIUM.
THE BEST MULTINATIONAL COMPANIES WERE THERE:
IBM, MICROSOFT, ATOS, ALCATEL-LUCENT, TO NAME A FEW.
IMPORTANT SCIENTIFIC AND COMMERCIAL STAKEHOLDERS ALSO PARTICIPATED IN THE EVENT.
TO OUR SURPRISE, WE HEARD FOR THE FIRST TIME OF, AND BRIEFLY MET WITH, THE REPRESENTATIVE OF THE SOLE AMBITIOUS EUROPEAN SEMICONDUCTOR INDUSTRY.
3/5/12
THE ECONOMIC AND SOCIETAL IMPACT OF FUTURE INTERNET TECHNOLOGIES, SERVICES AND APPLICATIONS,
AT THE DG ICT PREMISES, BRUSSELS, BELGIUM.
THE FINAL WORKSHOP OF A STUDY CONDUCTED BY
IDC EMEA (EUROPEAN GOVERNMENT CONSULTING), WIK-CONSULT, RAND EUROPE,
AND NOKIA.
HERE IT WAS DECLARED ONCE AGAIN (see the article of 22/3/12):
THE NEED FOR THE EUROPEAN COMMISSION TO FINANCE AND ASSIST THE RESEARCH, STUDY AND IMPLEMENTATION REQUIRED
SO THAT WE EUROPEANS CAN HAVE OUR OWN INTERNET, WHICH MIGHT BE NAMED THE SYNNET.
THE INTERNET IS CONTROLLED BY THE U.S. MILITARY THROUGH SEVERAL STAKEHOLDERS AND N.P.O.s. THIS LEAVES EUROPEAN CITIZENS WITH A LACK OF SECURITY AND PRIVACY.
WHY DID EUROPE INVEST IN SATELLITE POSITIONING TECHNOLOGY, GALILEO?
THERE WERE ALREADY TWO SUCH SYSTEMS, DEVELOPED BY THE DEFENSE SECTORS OF THE U.S.A. (GPS) AND THE FORMER U.S.S.R. (GLONASS).
THE GALILEO PROJECT, WHICH TODAY IS THE BEST ONE, WAS DEVELOPED BY CIVILIANS FROM THE SCIENTIFIC, INDUSTRIAL AND SOCIETAL SECTORS.
IN THE SAME WAY WE COULD DEVELOP OUR OWN HIGH-TECH PROTOCOLS, SINCE THE INTERNET PROTOCOL AND ARCHITECTURE TECHNOLOGIES (TCP/IP ETC.) ARE BASED ON SCIENTIFIC RESEARCH FROM THE 1950s AND 1960s.
SUCH AN ACTION, UNDER THE RESEARCH CAPABILITIES AND RESPONSIBILITY OF THE UNIVERSITIES AND IN COOPERATION WITH OTHER INTERESTED COMMUNITIES, COULD ON THE ONE HAND CREATE NEW RESEARCH AND INDUSTRY POSITIONS, AND ON THE OTHER HAND WOULD SURELY CREATE COMMERCIAL, TECHNOLOGICAL AND OTHER COMPETITION BETWEEN THE TWO DIFFERENT SYSTEMS (INTERNET AND SYNNET), FOR THE CITIZENS' BENEFIT.
THE TWO SYSTEMS COULD, ON A CLIENT'S REQUEST, BE BRIDGED TO EXCHANGE DATA AND INFORMATION, AS IN THE SKETCH BELOW.
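AS A ROUGH ILLUSTRATION OF SUCH A BRIDGE, HERE IS A MINIMAL PYTHON SKETCH OF A GATEWAY THAT FORWARDS A MESSAGE FROM A HYPOTHETICAL SYNNET ADDRESSING SCHEME OVER ORDINARY TCP/IP. EVERYTHING IN IT (THE ADDRESS FORMAT, THE MAPPING TABLE, THE CLASS NAMES) IS INVENTED FOR DEMONSTRATION, SINCE NO SYNNET SPECIFICATION EXISTS.

```python
# HYPOTHETICAL SKETCH ONLY: "SynNet" does not exist. The address format,
# the mapping table and all names below are invented for illustration.
import socket
from dataclasses import dataclass

@dataclass
class SynNetMessage:
    dest: str       # imaginary SynNet address, e.g. "eu.gr.athens/node-17"
    payload: bytes

# An imaginary mapping table maintained by the bridge operator, translating
# SynNet addresses to ordinary IP endpoints (a TEST-NET address is used here).
SYNNET_TO_IP = {
    "eu.gr.athens/node-17": ("203.0.113.10", 8080),
}

def bridge_to_internet(msg: SynNetMessage) -> None:
    """Forward a SynNet message over plain TCP/IP when the client asks for bridging."""
    host, port = SYNNET_TO_IP[msg.dest]
    with socket.create_connection((host, port), timeout=5) as sock:
        sock.sendall(msg.payload)
```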
TODAY A EUROPEAN INDUSTRY BARELY EXISTS, AND WHERE IT DOES, IT IS NO LONGER IN OUR OWN HANDS.
LET ME ALSO QUOTE SOME SPECIFIC SENTENCES FROM AN ARTICLE PUBLISHED TODAY BY AN AMERICAN MAGAZINE.
Frankovsky wrote that some new hardware designs included Facebook's
"vanity-free" storage server and motherboard designs contributed by chip
makers AMD and Intel. AMD's motherboard is about 16-inches by
16.5-inches (40.6 centimeters by 41.9 centimeters), and is for
high-performance computing and general purpose installations such as
cloud deployments. On the software side, VMware will certify its vSphere
virtualization platform to run on hardware based on the OpenRack
specification.
If the Facebook design becomes a de-facto industry standard for
cloud and Web 2.0 data centers, it could make the deployment and
management of systems easier over time, said Charles King, principal
analyst at Pund-IT.
"The Facebook standard seems to be a data-center centric effort
where you are looking to establish a system-design standard that can
populate a data center with hundreds of thousands of servers," King
said.
(MORE ...)
THANKS TO ALL FOR YOUR ATTENTION.
BELOW ARE SOME RESEARCH ARTICLES WHICH MAY INDICATE SOME PATHS FOR THE EUROPEAN ICT RESEARCH INDUSTRY.
Researchers Use Diamonds to Boost Computer Memory
Johns Hopkins University engineers are using diamonds to change the properties of an alloy used in phase-change memory, a change that could lead to the development of higher-capacity storage systems that store data more quickly and last longer than current media.
The process, explained this month in the online edition of Proceedings of the National Academy of Sciences (PNAS), focused on changes to the inexpensive GST phase-change memory alloy that's composed of germanium, antimony and tellurium.
"This phase-change memory is more stable than the material used in current flash drives. It works 100 times faster and is rewritable about 100,000 times," said the study's lead author, Ming Xu, a doctoral student at the Whiting School of Engineering at Johns Hopkins University.
"Within about five years, it could also be used to replace hard drives in computers and give them more memory," he suggested.
GST has been in use for two decades and today is widely used in rewritable optical media, including CD-RW and DVD-RW discs.
IBM and others are already developing solid-state chip technology using phase-change memory, which IBM says can sustain up to 5 million write cycles. High-end NAND flash memory systems used today can sustain only about 100,000 write cycles.
By using diamond-tipped tools to apply pressure to the GST, the researchers found they could change the properties of the alloy from an amorphous to a crystalline state and thus reduce the electrical resistivity by about four orders of magnitude. By slowing down the change from an amorphous state to a crystalline state, the scientists were also able to produce many varying states allowing more data to be stored on the alloy.
GST is called a phase-change material because, when exposed to heat, an area of the alloy can change from an amorphous state, in which the atoms lack an ordered arrangement, to a crystalline state, in which the atoms are neatly lined up in a long-range order.
[Figure: An illustration of how the diamond-tipped tools were used to compress GST]
The two states are then used to represent the computer digital language of ones and zeros.
In its amorphous state, GST is more resistant to electric current. In its crystalline state, it is less resistant.
The two phases of GST, amorphous and crystalline, also reflect light differently, allowing the surface of a DVD to be read by a tiny laser.
While GST has been used for some time, the precise mechanics of its ability to switch from one state to another have remained something of a mystery because it happens in nanoseconds once the material is heated.
To solve this mystery, Xu and his research team used the pressure from diamond tools to cause the change to occur more slowly.
The team used a method known as X-ray diffraction, along with a computer simulation, to document what was happening to the material at the atomic level. By recording the changes in "slow motion," the researchers found that they could actually tune the electrical resistivity of the material during the time between its change from amorphous to crystalline form.
"Instead of going from black to white, it's like finding shades or a shade of gray in between," said En Ma, a professor of materials science and engineering, and a co-author of the PNAS paper. "By having a wide range of resistance, you can have a lot more control. If you have multiple states, you can store a lot more data."
SOURCE http://www.pcworld.com/
U.S. Lags in Internet Connectivity Speeds
Akamai's "The State of the Internet" survey reveals that the rest of the world is speeding past the United States in Internet connectivity.
Akamai, one of the companies that can actually pull off a survey like this, has released its quarterly "The State of the Internet" report that outlines the relative speeds and penetrations of Internet connectivity in the U.S. and around the world.
It lists today's poor performers, which are actually stunningly fast by the standards of the Internet a decade ago when everyone was struggling. Many of us can recall the days of the fractional T-1. These megabit-per-second shared lines were considered the Rolls-Royce of connectivity.
The 600-page report delves into everything from IPv6 adoption to sources of Internet attack traffic. For anyone who follows the news, there are few revelations except for the fact that apparently Washington, D.C. has the worst Internet overall speeds of any place in the country. This also happens to be where the FCC lives and where Congressional hearings discuss topics like Internet speed and connectivity. It's all too hilarious.
After D.C., the worst states in the U.S., as far as overall Internet speed is concerned, are: Missouri, Georgia, Alaska, Illinois, Iowa, Colorado, Ohio, and Texas.
Korea leads the world with a 17.5 Mbps average. Japan and Hong Kong are next, with 9.1 Mbps. The U.S. is number 13 on the list after Holland, Latvia, Switzerland, Czech Republic, and Romania. We average 5.8 Mbps. The average connection speed globally is 2.3 Mbps. This is humiliating, but the U.S. has not even been in the top 10 for more than a decade.
Number one city in the world is Taegu, South Korea with an average connection speed of 21.8 Mbps. The top six cities were all in Korea and the next eight were in Japan. An American city, Boston, finally makes the list at number 51 with 8.4 Mbps followed by North Bergen, New Jersey at number 52. Jersey City, New Jersey comes in number 58.
Curiosities, as far as I'm concerned, were Monterey Park, California at number 59 with 8.2 Mbps and Manchester, New Hampshire at number 68 with 7.8 Mbps.
It proves that, if you do some research, you can find some hotspots around the world where you can do business at high speed. At that point, you have to be concerned with uptime and backup connectivity. Places like New Caledonia, in the southwest Pacific Ocean, are highlighted for their initiatives such as soon-to-be implemented fiber to the home. Nobody even talks about fiber to the home in California anymore, at least since the housing market collapse.
The report is peppered with enticing snippets, too, such as these cocktail party facts:
"Fully one quarter of Moroccan households boast a broadband connection - up from just two percent in 2004."
"Argentina has one of the most developed broadband markets in Latin America, with some of the fastest and least expensive plans on offer."
"Romania is one of the leading countries in the world in terms of high speed internet access in larger cities, but fixed broadband internet penetration is still low in Romanian regions and rural areas."
There is a lot to be learned from reports like these, but the main lesson is that the world is moving fast on Internet technology and we need to be a leader, not a follower. This means everyone in Washington, D.C. should look over this data carefully—assuming, of course, they can actually manage to download it with their slow connections.
SOURCE http://www.pcmag.com
Labels: Research
THE MACHINE FOR DREAMING
[Photo caption: Thomas Schenkel and his colleague T.C. Shen of the Accelerator and Fusion Research Division (Berkeley Lab) are working with an electron-beam ion trap to develop a quantum computer based on single-electron transistors. (Photo: Roy Kaltschmidt)]
Moore’s law
If you were shrewd enough to invest $15,000 in Microsoft ten years ago you’d be a millionaire by now. The engine of growth that has driven the computer industry, and looks set to make Bill Gates the world’s first trillionaire, is no lucky fluke. Underpinning it have been sweeping advances in fundamental electronics. The first computers used bulky vacuum tubes and needed entire buildings to house them. Then in the 1960’s along came the transistor, which in turn gave way to the incredible shrinking microchip. But this is not the end of the story. Yet more exotic technologies are in the pipeline, and they promise to have as great an impact on the information industry as did the invention of the original computer.
The commercial success of computers stems from the fact that with each technological leap, the processing power of computers has soared and the costs have plummeted, allowing manufacturers to penetrate new markets and hugely expand production. The inexorable rise of the computer’s potency is expressed in a rough and ready rule known as Moore’s Law, after Gordon Moore, the co-founder of Intel. According to this dictum, the processing power of computers doubles every 18 months. But how long can it go on?
Moore’s law is a direct consequence of the “small is beautiful” philosophy. By cramming ever more circuitry into a smaller and smaller volume, faster information processing can be achieved. Like all good things, it can’t go on forever: there is a limit to how small electronic parts can be. On current estimates, in less than 15 years, chip components will approach atomic size. What happens then?
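As a back-of-the-envelope check of that timeline, the sketch below simply applies the stated doubling period of 18 months over a 15-year horizon; the 15-year figure comes from the text, while the starting feature size mentioned in the comments is an assumption for illustration.

```python
# A back-of-the-envelope reading of Moore's Law as stated in the text
# (processing power / density doubles every 18 months).

DOUBLING_PERIOD_YEARS = 1.5

def density_growth(years: float) -> float:
    """Factor by which transistor density grows after `years`."""
    return 2 ** (years / DOUBLING_PERIOD_YEARS)

years = 15
growth = density_growth(years)      # 2**10 = 1024x more components
feature_shrink = growth ** 0.5      # linear feature sizes shrink ~32x (assuming density ~ 1/size^2)
print(f"After {years} years: {growth:.0f}x density, "
      f"features ~{feature_shrink:.0f}x smaller")
# Assumption for scale: with features around a hundred nanometres at the time
# of writing, a further ~32x shrink approaches atomic dimensions - the limit
# the article points to.
```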
The problem is not so much with the particulate nature of atoms as such. Rather it lies with the weird nature of physics that applies in the atomic realm. Here, the dependable laws of Newtonian mechanics dissolve away into a maelstrom of fuzziness and uncertainty.
To understand what this means for computation, picture a computer chip as a glorified network of switches linked by wires in such a way as to represent strings of 1’s and 0’s – so-called binary numbers. Every time a switch is flipped, a bit of information gets processed; for example, a 0 becomes a 1. Computers are reliable because in every case a switch is either on or off; there can be no ambiguity. But for decades physicists have known that on an atomic scale, this either/or property of physical states is fundamentally compromised.
The source of the trouble lies with something known as Heisenberg’s uncertainty principle. Put crudely, it says there is an inescapable vagueness, or indeterminism, in the behaviour of matter on the micro-scale. For example, today an atom in a certain state may do such-and-such; tomorrow an identical atom could do something completely different. According to the uncertainty principle, it’s generally impossible to know in advance what will actually happen – only the betting odds of the various alternatives can be given. Essentially, nature is reduced to a game of chance.
Atomic uncertainty
Atomic uncertainty is a basic part of a branch of science known as quantum mechanics, and it’s one of the oddest products of twentieth century physics. So odd, in fact, that no less a scientist than Albert Einstein flatly refused to believe it. “God does not play dice with the universe,” he famously retorted. Einstein hated to think that nature is inherently and fundamentally indeterministic. But it is. Einstein notwithstanding, it is now an accepted fact that, at the deepest level of reality, the physical world is irreducibly random.
When it comes to atomic-scale information processing, the fact that the behaviour of matter is unreliable poses an obvious problem. The computer is the very epitome of a deterministic system: it takes some information as input, processes it, and delivers a definite output. Repeat the process and you get the same output. A computer that behaved whimsically, giving haphazard answers to identical computations, would be useless for most purposes. But try to compute at the atomic level and that’s just what is likely to happen. To many physicists, it looks like the game will soon be up for Moore’s Law.
Although the existence of a fundamental physical limit to the power of computation has been recognized for many years, it was only in 1981 that the American theoretical physicist Richard Feynman confronted the problem head-on. In a visionary lecture delivered at the Massachusetts Institute of Technology (MIT), Feynman speculated that perhaps the sin of quantum uncertainty could be turned into a virtue. Suppose, he mused, that instead of treating quantum processes as an unavoidable source of error to classical computation, one instead harnessed them to perform the computations themselves? In other words, why not use quantum mechanics to compute?
It took only a few years for Feynman’s idea of a “quantum computer” to crystallize into a practical project. In a trail-blazing paper published in 1985, Oxford theoretical physicist David Deutsch set out the basic framework for how such a device might work. Today, scientists around the world are racing to be the first to make it happen. At stake is far more than a perpetuation of Moore’s Law. The quantum computer has implications as revolutionary as any piece of technology in history. If such a machine could be built, it would transform not just the computer industry, but our experience of physical existence itself. In a sense, it would lead to a blending of real and virtual reality.
At the heart of quantum computation lies one of the strangest and most baffling concepts in the history of science. It is known technically as ‘superposition’. A simple example concerns the way an electron circles the nucleus of an atom. The rules of quantum mechanics permit the electron to orbit only in certain definite energy levels. An electron may jump abruptly from one energy level to a higher one if enough energy is provided. Conversely, left to itself, an electron will spontaneously drop down from a higher level to a lower one, giving off energy in the process. That is the way atoms emit light, for example.
Because of the uncertainty principle, it’s normally impossible to say exactly when the transition will occur. If the energy of the atom is measured, however, the electron is always found to be either in one level or the other, never in between. You can’t catch it changing places.
Quantum superpositions
Now comes the weird bit. Suppose a certain amount of energy is directed at the atom, but not enough to make it jump quickly to an excited state. According to the bizarre rules of quantum mechanics, the atom enters a sort of limbo in which it is somehow in both excited and unexcited states at once. This is the all-important superposition of states. In effect, it is a type of hybrid reality, in which both possibilities – excited and unexcited atom - co-exist. Such a ghostly amalgam of two alternative worlds is not some sort of mathematical fiction, but genuinely real. Physicists routinely create quantum superpositions in the laboratory, and some electronic components are even designed to exploit them in order to produce desired electrical effects.
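A toy classical simulation can illustrate the statistics described here: a two-level superposition that, whenever measured, always yields one definite outcome or the other, with frequencies set by the squared amplitudes. The equal amplitudes below are an arbitrary choice for demonstration.

```python
# A toy classical simulation of measuring a two-level superposition.
# The equal amplitudes are an assumption for illustration; any alpha, beta
# with |alpha|^2 + |beta|^2 = 1 would do.
import random

alpha = beta = 2 ** -0.5      # equal superposition of "unexcited" (0) and "excited" (1)
p_excited = abs(beta) ** 2    # Born rule: probability of finding the excited state

counts = {0: 0, 1: 0}
for _ in range(10_000):
    outcome = 1 if random.random() < p_excited else 0   # each measurement yields 0 OR 1
    counts[outcome] += 1

print(counts)   # roughly 5000/5000 - never a value "in between"
```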
For 70 years physicists have argued over what to make of quantum superpositions. What really happens to an electron or an atom when it assumes a schizophrenic identity? How can an electron be in two places at once? Though there is still no consensus, the most popular view is that a superposition is best thought of as two parallel universes that are somehow both there, overlapping in a sort of dual existence. In the case of the atom, there are two alternative worlds, or realities, one with the electron in the excited state, the other with the electron in the unexcited state. When the atom is put into a superposition, both worlds exist side-by-side.
Some physicists think of the alternative worlds in a superposition as mere phantom realities, and suppose that when an observation is made it has the effect of transforming what is only a potential universe into an actual one. Because of the uncertainty principle, the observer can’t know in advance which of the two alternative worlds will be promoted to concrete existence by the act of observation, but in every case a single reality is revealed – never a hybrid world. Other physicists are convinced that both worlds are equally real. Since a general quantum state consists of a superposition of not just two, but an unlimited number of alternative worlds, the latter interpretation implies an outlandish picture of reality: there isn’t just one universe, but an infinity of different universes, existing in parallel, and linked through quantum processes. Bizarre though the many-universes theory may seem, it should not be dismissed lightly. After all, its proponents include such luminaries as Stephen Hawking and Murray Gell-Mann, and entire international conferences are devoted to its ramifications.
How does all this relate to computation? The fact that an atom can be in either an excited or an unexcited state can be used to encode information: 0 for unexcited, 1 for excited. A quantum leap between the two states will convert a 1 to a 0 or vice versa. Atomic transitions can therefore be used as switches or gates for computation.
The true power of a quantum computer comes, however, from the ability to exploit superpositions in the switching processes. The key step is to apply the superposition principle to states involving more than one electron. To get an idea of what is involved, imagine a row of coins, each of which can be in one of two states: either heads or tails facing up. Coins too could be used to represent a number, with 0 for heads and 1 for tails.
Heads and Tails
Two coins can exist in four possible states: heads-heads, heads-tails, tails-heads and tails-tails, corresponding to the numbers 00, 01, 10 and 11. Similarly three coins can have 8 configurations, 4 can have 16 and so on. Notice how the number of combinations escalates as more coins are considered.
Now imagine that instead of the coins we have many electrons, each of which can exist in one of two states. This is close to the truth, as many subatomic particles when placed in a magnetic field can indeed adopt only two configurations: parallel or antiparallel to the field. Quantum mechanics allows that the state of the system as a whole can be a superposition of all possible such “heads/tails” alternatives. With even a handful of electrons, the number of alternatives making up the superposition is enormous, and each one can be used to process information at the same time as all the others. To use the jargon, a quantum superposition allows for massive parallel computation. In effect, the system can compute simultaneously in all the parallel universes, and then combine the results at the end of the calculation. The upshot is an exponential increase in computational power. A quantum computer with only 300 electrons, for example, would have more components in its superposition than all the atoms in the observable universe!
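The exponential growth behind the coin analogy can be made explicit with a small sketch: the state of n two-level systems is described by 2^n amplitudes, and the 300-electron comparison can be checked directly against a rough 10^80 estimate for the number of atoms in the observable universe (a commonly quoted order of magnitude, not a figure from the article).

```python
# Sketch: the state of n two-level systems is a vector of 2**n amplitudes,
# which is why its description grows exponentially with n.

def equal_superposition(n: int) -> list[float]:
    """Amplitudes of n qubits all placed in an equal superposition."""
    dim = 2 ** n
    return [dim ** -0.5] * dim   # one amplitude per classical bit pattern

state = equal_superposition(3)
print(len(state))                # 8 amplitudes: 000, 001, ..., 111

# The article's 300-electron claim, checked against a rough 10**80 atom estimate:
print(2 ** 300 > 10 ** 80)       # True - more terms than atoms in the observable universe
```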
Achieving superpositions of many-particle states is not easy (the particles don’t have to be electrons). Quantum superpositions are notoriously fragile, and tend to be destroyed by the influence of the environment, a process known as decoherence. Maintaining a superposition is like trying to balance a pencil on its point. So far physicists have been able to attain quantum computational states involving only two or three particles at a time, but researchers in several countries are hastily devising subtle ways to improve on this and to combat the degenerative effects of decoherence. Gerard Milburn of the University of Queensland and Robert Clark of the University of New South Wales are experimenting with phosphorus atoms embedded in silicon, using the orientation of the phosphorus nuclei as the quantum equivalent of heads and tails.
The race to build a functioning quantum computer is motivated by more than a curiosity to see if it can work. If we had such a machine at our disposal, it could perform tasks that no conventional computer could ever accomplish. A famous example concerns the very practical subject of cryptography. Many government departments, military institutions and businesses keep their messages secret using a method of encryption based on multiplying prime numbers. (A prime number is one that cannot be divided evenly by any whole number except one and itself.) Multiplying two primes is relatively easy. Most people could quickly work out that, say, 137 x 293 = 40141. But going backwards is much harder. Given 40141 and asked to find the prime factors, it could take a lot of trial and error before you hit on 137 and 293. Even a computer finds the reverse process hard, and if the two prime numbers have 100 digits, the task is effectively impossible even for a supercomputer.
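A small sketch of this asymmetry: multiplying the two primes is a single operation, while recovering them by naive trial division already takes over a hundred attempts for a five-digit product. Trial division is used here purely for illustration; real cryptographic keys are far larger and resist much better classical algorithms too.

```python
# Sketch of the asymmetry: multiplying two primes is one step, while recovering
# them by trial division means testing many candidate divisors.

def factor_by_trial_division(n: int) -> tuple[int, int, int]:
    """Return (p, q, number_of_divisions_tried) for the smallest factor found."""
    trials = 0
    d = 2
    while d * d <= n:
        trials += 1
        if n % d == 0:
            return d, n // d, trials
        d += 1
    return n, 1, trials            # n is prime

p, q = 137, 293
n = p * q                          # the "easy" direction: one multiplication gives 40141
print(n)
print(factor_by_trial_division(n))  # (137, 293, 136): 136 divisions for a 5-digit number
```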
In 1994 Peter Shor, now at AT&T Labs in Florham Park, New Jersey, demonstrated that a quantum computer could make short work of the arduous task of factorising large numbers into their prime factors. At this stage governments and military organizations began to take an interest, since it implied that a quantum computer would render much encrypted data insecure.
Quantum computers
Research projects were started at defence labs such as Los Alamos in New Mexico. NATO and the U.S. National Security Agency began pumping millions of dollars into research. Oxford University set up a special Centre for Quantum Computation.
Soon mathematicians began to identify other problems that looked vulnerable to solution by quantum computation. Most of them fall in the category of search algorithms – various forms of finding needles in haystacks. Locating a friend’s phone number in a directory is easy, but if what you have is a number and you want to work backwards to find the name, you are in for a long job.
A celebrated challenge of this sort is known as the travelling salesman problem. Suppose a salesman has to visit four cities once and only once, and the company wishes to keep down the travel costs. The problem is to determine the routing that involves minimal mileage. In the case of four cities, A, B, C and D, it wouldn’t take long to determine the distance travelled in the various alternative itineraries – ABCD, ACBD, ADCB and so on. But for twenty cities the task becomes formidable, and soars further as additional cities are added.
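The brute-force approach described above can be sketched in a few lines: enumerate every itinerary, score it against a distance table, and keep the cheapest. The four cities and distances below are made up, and the tour is assumed to start and end at a home city; the point is only that the number of itineraries grows factorially.

```python
# Brute-force illustration of the travelling salesman problem from the text.
# The cities and distances are hypothetical.
from itertools import permutations
from math import factorial

# Hypothetical symmetric distance table between cities A, B, C, D.
DIST = {
    ("A", "B"): 10, ("A", "C"): 15, ("A", "D"): 20,
    ("B", "C"): 35, ("B", "D"): 25, ("C", "D"): 30,
}

def d(x: str, y: str) -> int:
    return DIST[(x, y)] if (x, y) in DIST else DIST[(y, x)]

def tour_length(order: tuple) -> int:
    stops = ("A",) + order + ("A",)          # start and end at the home city A
    return sum(d(a, b) for a, b in zip(stops, stops[1:]))

best = min(permutations(("B", "C", "D")), key=tour_length)
print(best, tour_length(best))               # cheapest of the 3! = 6 itineraries

# Why brute force stops scaling: itineraries for n cities beyond the home base.
for n in (3, 19):                            # 4 cities vs. 20 cities
    print(n, factorial(n))                   # 6 vs. 121,645,100,408,832,000
```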
It is too soon to generalise on how effectively quantum computers will be able to short-circuit these sorts of mega-search problems, but the expectation is that they will lead to a breathtaking increase in speed. At least some problems that would take a conventional supercomputer longer than the age of the universe should be solvable on a quantum computer in next to no time. The practical consequences of this awesome computational power have scarcely been glimpsed.
Some scientists see an altogether deeper significance in the quest for the quantum computer. Ultimately, the laws of the universe are quantum mechanical. The fact that we normally encounter weird quantum effects only at the atomic level has blinded us to the fact that - to paraphrase Einstein - God really does play dice with the universe. The main use of computers is to simulate the real world, whether it is a game of Nintendo, a flight simulator or a calculation of the orbit of a spacecraft. But conventional computers recreate the non-quantum world of daily experience. They are ill suited to dealing with the world of atoms and molecules. Recently, however, a group at MIT succeeded in simulating the behaviour of a quantum oscillator using a prototype quantum computer consisting of just four particles.
But there is more at stake here than practical applications, as first pointed out by David Deutsch. A quantum computer, by its very logical nature, is in principle capable of simulating the entire quantum universe in which it is embedded. It is therefore the ultimate virtual reality machine. In other words, a small part of reality can in some sense capture and embody the whole. The fact that the physical universe is constructed in this way – that wholes and parts are mutually enfolded in mathematical self-consistency – is a stunning discovery that impacts on philosophy and even theology. By achieving quantum computation, mankind will lift a tiny corner of the veil of mystery that shrouds the ultimate nature of reality. We shall finally have captured the vision elucidated so eloquently by William Blake two centuries ago:
To see a World in a grain of sand,
And a Heaven in a wild flower,
Hold infinity in the palm of your hand,
And eternity in an hour.
by Paul Davies 10/31/2002
source http://www.physicspost.com
MORE AT http://youtu.be/I56UugZ_8DI
(CLICKING ON THE TITLE LINK REDIRECTS TO A WIKI LIST OF QC SIMULATORS)
Labels: Research