
Friday, December 31, 2021

the global panoptikon deal is very clear and well understood

POLITES, CITIZENS, METAPOLITANS,

MATRIOTS & PATRIOTS, DEMOCRATS,

DEAR READERS, CHAIRESTHAI,
GAEA-EARTH HAS COMPLETED ITS (HER) ANNUAL TURN AROUND HELIOS-SUN, THE 2021st ACCORDING TO THE LATEST EUROPEAN HUMAN CALENDAR, THE GREGORIAN ONE.
THINGS ON THE SPHERE HAVEN'T CHANGED MUCH OVER THE LAST 2 YEARS, AS THE PROBLEMS STANDING IN THE WAY OF THE CHANGE TO A NEW ERA ARE STILL THE SAME:

A) THE HIDDEN ECONOMIC COLLAPSE

B) CLIMATE ISSUES, MAINLY CAUSED BY THE OLD ECONOMIC POLICIES AND THEORIES FOLLOWED BY THE ELITES AND THEIR POLITICAL PARTY SYSTEM ESTABLISHMENT

AND
C) THE LATEST SIGN OF A CHANGE OF DIRECTION, WHICH IS A PUBLIC HEALTH ISSUE (STUDYING HISTORY, THE MOST RELIABLE SOURCE, WE CAN READ THAT WHEN A SYSTEM CHANGE OCCURS, A PANDEMIC HAPPENS)

THE WRITER OF THIS ARTICLE, AND OTHERS TOO, HAVE BEEN POINTING OUT THESE THINGS FOR THE LAST 40 YEARS, MANY TIMES AT RISK OF LOSING OUR MINDS, HEALTH AND SELF-RESPECT, APART FROM THE CONSEQUENCES FOR OUR PERSONAL LIVING STANDARDS.

NOWADAYS WE SEEM TO BE RIGHT.

The Great Reset Is Actually a Great Purge Against Humanity

THIS DOESN'T MEAN MUCH FOR PEOPLE LIKE ME, BECAUSE THE SIMPLE AVERAGE CITIZEN IS HYPNOTISED BY A POWERFUL OLIGARCHY, AND EVEN THOUGH (S)HE GRASPS THE MENTIONED POINTS, (S)HE DOESN'T REACT.

THAT MEANS THE OLD REGIMES ARE BUILDING THEIR NEW REALITY, WHICH THEY CALL THE
new age, STEP BY STEP with the great reset.




ADELPHOI, THEIR REALITY IS OBVIOUS NOT ONLY TO THE CAREFUL RESEARCHER BUT THESE DAYS ALSO TO THE MAJORITY OF THE POPULATION.
IF WE DON'T REACT, ONLY IN PEACE AND BY SPECIFIC METHODS, OUR CHILDREN ARE GOING TO LIVE IN THE
ANIMAL FARM.

THIS IS OUR MESSAGE FOR 2022

JOIN-RESPECT-ACT in peace

IT IS WISHED THAT IN THE NEW YEAR THE FLOWERS OF OUR WORK WILL FLOURISH AND HYGEIA WILL COME TO OUR PLANET.

EUCHARISTOO FOR YOUR SUPPORT


PS. IT IS ADVISED TO WATCH ALL THE EPISODES OF THIS DW DOCUMENTARY SERIES.



Thursday, February 13, 2014

SCIENCE, RESEARCH AND REPLICATION

How can we cut down on research that doesn't replicate?


In this week's Nature, Francis Collins and Lawrence Tabak have written a policy statement about the problem of biomedical research results that cannot be replicated by independent studies. Collins is the director of the U.S. National Institutes of Health, and the essay is an official statement of how the agency is attempting to reduce the incidence of sensationalized findings that do not have real clinical validity.

Science has long been regarded as 'self-correcting', given that it is founded on the replication of earlier work. Over the long term, that principle remains true. In the shorter term, however, the checks and balances that once ensured scientific fidelity have been hobbled. This has compromised the ability of today's researchers to reproduce others' findings.
Although Collins and Tabak do not cite his work, their essay is a response to the findings of John Ioannidis, who has famously claimed that most published research findings are false. The basic reason is statistics. In many fields, studies are published when the results accord with a statistical threshold (often a 1 in 20 chance of being produced by chance). But across a whole field, everyone is selecting just these results for publication while discarding results that do not meet this threshold. Consider two studies of exactly the same phenomenon. One reaches the 1-in-20 threshold and is published. The other fails to reach the threshold -- maybe the result could be obtained once in every two trials -- and is not published. If both results could be compiled together, the phenomenon would fail to meet the test. But only one of the studies is published; the other falls into obscurity. It may never leave the pages of the lab notebook. Indeed, the results may never really be compiled when it is clear they will be non-significant.
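To make the selection effect concrete, here is a minimal simulation sketch (an editor's illustration, not part of Hawks' post; it assumes Python with NumPy and SciPy available). Every study below compares two samples drawn from the same distribution, so there is no real effect, yet roughly one study in twenty clears the conventional p < 0.05 bar and would be the only one to reach publication.

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_studies, n_per_group = 1000, 20
published = []
for _ in range(n_studies):
    a = rng.normal(0, 1, n_per_group)   # "treatment" group: no true effect
    b = rng.normal(0, 1, n_per_group)   # control group, same distribution
    t_stat, p_value = stats.ttest_ind(a, b)
    if p_value < 0.05:                  # the 1-in-20 publication threshold
        published.append(p_value)

print(f"{len(published)} of {n_studies} null studies cleared p < 0.05, "
      f"all of them false positives.")

Compiling all 1000 studies together would correctly show no effect; publishing only the threshold-crossers leaves a literature made entirely of spurious findings.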
What happens with replication studies is that research groups suddenly have an incentive to publish negative results. They might have obtained negative results on many, many different phenomena, but those would not be interesting enough to publish. We probably wouldn't be too concerned about any single case. But across many, many different areas of science, it appears that replication studies are consistently failing to reproduce the effects claimed by prior research. That failure reflects a systemic bias in favor of publishing "statistically significant" results. "Significance" presupposes that the studies are independent, when in fact publication of studies across a field is highly non-independent.
The Economist published an informative article about Ioannidis' work last year, which is cited by Collins and Tabak: "Trouble at the lab".

The pitfalls Dr Stodden points to get deeper as research increasingly involves sifting through untold quantities of data. Take subatomic physics, where data are churned out by the petabyte. It uses notoriously exacting methodological standards, setting an acceptable false-positive rate of one in 3.5m (known as the five-sigma standard). But maximising a single figure of merit, such as statistical significance, is never enough: witness the “pentaquark” saga. Quarks are normally seen only two or three at a time, but in the mid-2000s various labs found evidence of bizarre five-quark composites. The analyses met the five-sigma test. But the data were not “blinded” properly; the analysts knew a lot about where the numbers were coming from. When an experiment is not blinded, the chances that the experimenters will see what they “should” see rise. This is why people analysing clinical-trials data should be blinded to whether data come from the “study group” or the control group. When looked for with proper blinding, the previously ubiquitous pentaquarks disappeared.
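As a quick check of the figure quoted above, the five-sigma false-positive rate is simply the upper-tail probability of a standard normal distribution at z = 5; a short sketch (editor's addition, assuming SciPy) reproduces the "one in 3.5m" number.

from scipy.stats import norm

p_five_sigma = norm.sf(5)   # one-sided tail beyond 5 standard deviations
print(f"P(z > 5) = {p_five_sigma:.3g}, about 1 in {1 / p_five_sigma:,.0f}")
# prints roughly 2.87e-07, i.e. about 1 in 3.5 million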
Other data-heavy disciplines face similar challenges. Models which can be “tuned” in many different ways give researchers more scope to perceive a pattern where none exists. According to some estimates, three-quarters of published scientific papers in the field of machine learning are bunk because of this “overfitting”, says Sandy Pentland, a computer scientist at the Massachusetts Institute of Technology.
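The "overfitting" Pentland describes can be shown with a toy sketch (editor's illustration, not from the article): fit polynomials of increasing degree to pure noise and compare the error on the data used for fitting with the error on fresh data. The heavily tuned model looks impressive on its own data and typically falls apart on new data.

import numpy as np

rng = np.random.default_rng(1)
x_train, x_test = rng.uniform(-1, 1, 20), rng.uniform(-1, 1, 20)
y_train, y_test = rng.normal(0, 1, 20), rng.normal(0, 1, 20)   # pure noise, no real pattern

for degree in (1, 15):                       # modest model vs heavily "tuned" model
    coeffs = np.polyfit(x_train, y_train, degree)
    mse_train = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    mse_test = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    print(f"degree {degree:2d}: train MSE {mse_train:.2f}, test MSE {mse_test:.2f}")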
I worry about paleoanthropology. Traditionally, the field has been data-poor. There are only a handful of fossils that represent any particular anatomical detail in any particular ancient species of hominins. That makes for small samples. But because the field is of very high interest, many paleoanthropological papers can report negative results and still be publishable in relatively high-profile journals. Indeed, several of my own papers have been essentially based on negative results -- failure to disprove a null hypothesis, or failure to show significant change over time. Those results are interesting when we are trying to test the pattern of our evolution.
Today, ancient DNA has begun to provide vastly more data about some parts of our evolution. But comparing an ancient genome to the genomes of hundreds or thousands of living people is not straightforward. We require fairly sophisticated models to understand the evolutionary changes in these samples.
Models introduce the problem of overfitting. And models require assumptions, which are often hidden away in the supplementary information of high-impact papers. As we've seen recently, many of the initial conclusions about ancient genomes, made in the wake of the Neandertal and Denisovan discoveries in 2010, were overhyped. Along with some other anthropologists, I raised concerns about these at the time, pointing out which conclusions were very solid, and which other ones we should treat more cautiously. And I'll continue to do that. But many people who are applying sophisticated models to ancient DNA data are not quite so cautious -- they are looking for their publishable results. Negative results are, at the moment, less interesting or publishable in this field. I worry that the level of scrutiny at top journals may be relaxing.
Collins and Tabak are not concerned with paleoanthropology; they are interested in biomedical research. But many areas of human genetics face the same challenges as paleoanthropology -- a sea of new data, with new methods of analysis, and high-profile papers being published that heavily depend on models. They point out some of the problems of this environment:

Factors include poor training of researchers in experimental design; increased emphasis on making provocative statements rather than presenting technical details; and publications that do not report basic elements of experimental design. Crucial experimental design elements that are all too frequently ignored include blinding, randomization, replication, sample-size calculation and the effect of sex differences. And some scientists reputedly use a 'secret sauce' to make their experiments work — and withhold details from publication or describe them only vaguely to retain a competitive edge. What hope is there that other scientists will be able to build on such work to further biomedical progress?
I wish I had a "secret sauce"! In any event, the NIH has adopted a radical solution: Alter the format of the biosketch.

Perhaps the most vexed issue is the academic incentive system. It currently over-emphasizes publishing in high-profile journals. No doubt worsened by current budgetary woes, this encourages rapid submission of research findings to the detriment of careful replication. To address this, the NIH is contemplating modifying the format of its 'biographical sketch' form, which grant applicants are required to complete, to emphasize the significance of advances resulting from work in which the applicant participated, and to delineate the part played by the applicant. Other organizations such as the Howard Hughes Medical Institute have used this format and found it more revealing of actual contributions to science than the traditional list of unannotated publications. The NIH is also considering providing greater stability for investigators at certain, discrete career stages, utilizing grant mechanisms that allow more flexibility and a longer period than the current average of approximately four years of support per project.
I read that and said aloud, "What?" Talk about a toothless policy.
I don't understand why people who apply for federal research money don't have their funding revoked when they don't follow agency policies. If a lab publishes hyped research, the principal investigator should be downgraded for future funding decisions. If the lab doesn't archive data and make it available for replication, the lab should be downgraded for future funding decisions.
And if intervention at the level of the lab is not sufficient -- as for advanced PIs who may be on their last funding cycle -- then the university or research organization should be downgraded. Simple as that.
Fortunately, we don't have too many data sharing problems in paleoanthropology.

References:

Collins, FS and Tabak, LA. 2014. Policy: NIH plans to enhance reproducibility. Nature 505, 612–613. doi:10.1038/505612a URL: http://www.nature.com/news/policy-nih-plans-to-enhance-reproducibility-1.14586

Why the Economist is wrong about science reproducibility


Word about the Economist’s two posts on reproducibility spread fast through the JoVE office on Friday morning. And while we were delighted to see a major news outlet recognizing science’s reproducibility issue, the Economist failed to emphasize the true matter at hand. Science doesn’t have a scientist problem, science has a communication problem.

On October 17th and 19th, the Economist published a video and an article (“Is science wrong? Human after all,” and then “Unreliable research: Trouble at the lab”) discussing at length the problems behind Amgen and Bayer’s claims that only about 10-30% of published scientific experiments are reproducible. Correctly concluding that these findings present a serious problem for modern science, the article provides several reasons for low reproducibility. But this is where the Economist gets it wrong.
The Economist states there are several reasons behind the reproducibility problem, but their main argument has to do with two causes. They point to career motivations that result in pressure to produce positive results, and they say scientists’ weak knowledge of statistics leads to the publishing of false-positive results. In other words, the journal states that scientists lie, and that scientists don’t know what they are doing.
But every practicing scientist knows that the reality is different. While it is true that there are some unethical individuals among scientists too, there is no massive fraud in academic research. And yes, some scientists, especially biologists, do not have a strong knowledge of statistical analysis, but you don’t have to be a statistician to draw a correct conclusion when you see a 4-fold difference between your experiment and the control. The problem is that even when results are valid they are very difficult to reproduce—regardless of whether or not they were published in the most prestigious and rigorous science journals.
While traditional text-based science journals are sufficient to describe the results of scientific studies (for example, showing that gene X regulates cell-division in cancer cells), they are incompetent when it comes to unambiguously describing the methods used to achieve those results (how exactly cells and gene X were manipulated in the lab). Complex experiments can be reproduced only after visual demonstration by their original authors. Reproducibility comes with actually seeing how an experiment is performed—be it from flying across the world to have it demonstrated first hand, or from viewing it through video publication, for example, through JoVE. Unlike reading, visualization eliminates the errors of interpretation, or in this case, misinterpretation.
“…[Video] is particularly important for observing experimental procedures, which usually are over-simplified in most of the traditional journals,” says Dr Pinfen Yang, Associate Professor, Marquette University. “In fact, only videos could adequately convey many procedures. For example, I learned how to inject zebrafish embryos and mate green algae by watching videos published in JoVE over and over again.”
The Economist could have found this simple truth if only their journalists had bothered to speak with the graduate students and postdocs who actually do the work in experimental labs with their own hands. Instead, they preferred to do the finger-pointing and offer administrative solutions that will not change the core problem—the format of science communication.
But just as the information age has shed light onto irreproducibility, it has shed light onto one solution: video.
by Phil Meagher

SOURCE  http://johnhawks.net/  ,  http://www.jove.com/


Tuesday, November 19, 2013

HUMAN GENETIC MODIFICATION PROVES THE GAIAN PLASTICITY AND EVOLUTIONARY SOLAR FORMATION

(BY CLICKING ON THE TITLE WE SHALL BE REDIRECTED TO AN ARTICLE ABOUT SOMETHING YOU NEED TO KNOW)

Genetically Modified Babies

In October 2013, the US Food and Drug Administration will hold a two-day public meeting to discuss genetic modification within the human egg, changes which will be passed on generationally.  The United Kingdom is also moving to allow GM babies.
Human gene therapy has been ongoing since 1990, but most of that involved non-heritable genes, called somatic (non-sex cell) gene therapy.  Somatic modifications only affect the individual and are not passed on, and so do not affect the human genome.
The game changed with the successful birth of at least 30 genetically modified babies by 2001. Half of the babies engineered from one clinic developed defects, so the FDA stepped in and asserted jurisdiction over “the use of human cells that receive genetic material by means other than the union of gamete nuclei” (sperm and egg nuclei).
Now the FDA is considering going forward with “oocyte modification” which involves genetic material from a second woman, whereby offspring will carry the DNA from three parents.  These kinds of genetic changes (“germline modification”) alter the human genome.
With ooplasmic transfer, the technique injects healthy mitochondrial DNA from a donor into the egg of an infertile woman.  Mitochondrial DNA floats outside a cell’s nucleus which contains the regular DNA, and is only inherited from the mother.
This is the first such meeting ever to be held in public by the FDA, reports Biopolitical Times (BPT), speculating that the meeting will likely include discussing a mitochondrial replacement technique developed by Shoukhrat Mitalipov at Oregon Health and Science University (OHSU).
Notes the BPT, “mitochondrial replacement is a form of inheritable genetic modification.”  This type of gene therapy is the source of much controversy, because it permanently changes the human genome and risks unforeseeable changes in growth and development, and aging.
As late as 2008, all germline modification therapies and enhancements were banned in 83% of the 30 nations making up the OECD (Organization for Economic Cooperation and Development), including the US and UK, reports the Center for Genetics and Society (CGS).
In June of this year, the United Kingdom reversed its long-standing policy against germline modification, and decided to go ahead with three-parent babies. Regulations on the procedure are now being drafted and Members of Parliament are expected to vote on the issue in 2014.
Testifying before the US House Foreign Affairs Committee, Subcommittee on Terrorism, Nonproliferation and Trade in 2008, CGS Executive Director Richard Hayes advised:
“Most people strongly support therapeutic applications of genetic science, but they also realize that the manipulation of inheritable genetic traits crosses a consequential barrier. In the great majority of instances, couples at risk of passing on a serious genetic disease can ensure that their child is disease-free by means of medically-related trait selection, thus obviating the need for the far more complex and risk-prone intervention that germline modification would entail.”
 Making humans better, smarter, stronger has long been the goal of eugenicists.  Hayes warns:
“Germline enhancement has also been seriously proposed as a means of creating people with such novel cognitive, psychological, and behavioral traits that they would constitute a new, ‘post-human’ species, incapable of interbreeding with ‘normal’ humans.”
 Paul Knoepfler, Associate Professor of Cell Biology and Human Anatomy at the University of California, Davis School of Medicine, commented that:
“Moving one oocyte nucleus into the enucleated oocyte of another person could trigger all kinds of devastating problems (most likely through epigenetic changes) that might not manifest until you try to make a human being out of it. Then it’s too late.”
 BPT shares in this opposition:
“If the FDA gives the OHSU researchers a green light to move towards human clinical trials, it will be the first instance of regulatory approval for human germline modification ever anywhere in the world.
 ”Given the current regulatory void in the United States and the paucity of safety data, allowing scientists to experiment with creating permanent changes to the human genome is a genie that must be kept in the bottle.”
As with genetically modified crops, a host of unforeseen and deleterious consequences may develop when we begin modifying humans with genes their children will inherit. GM feed is linked with infertility and spontaneous abortions in livestock, and crops modified to be insecticidal are linked to declining pollinator populations, especially bees, moths and bats.
But another argument against germline modification is that it will lead to designer babies and a new class of underdogs – those who cannot afford genetic enhancement.
Eugenicists and futurists like Ray Kurzweil (The Singularity Is Near, 2006) foresee and welcome the convergence of the NBIC fields that can improve human performance: nanotechnology, biotechnology, information technology and cognitive science.
In 2001, over 50 policy makers and scientists from a range of fields contributed to a National Science Foundation-sponsored workshop on converging NBIC technologies. Within the individual, group and societal level discussions, they addressed key areas of human activity: working, learning, aging, group interaction and human evolution. The consensus reached was to focus a national R&D priority on human enhancement.
In re-opening the allowance for GM babies, whose genetic changes will be passed on to future generations, the FDA is taking the next steps toward toeing the line on genetic human enhancement.
In addition to accepting written comments, the FDA, in collaboration with the Office of Cellular, Tissue and Gene Therapies, Center for Biologics Evaluation and Research, will also provide a free webcast of the two-day discussion.  The meeting may be rescheduled without notice, the FDA warns.
 By Rady Ananda
SOURCE GLOBAL RESEARCH

Warning: Genetically Modified Humans

ANATOLIA, 9,000BC – The rising sun advanced over the hills, engulfing the arid land in a blaze of warmth. Below the amber sky lay a patchwork of wheat fields, in which a scattering of stooped figures silently harvested their crops.
Later, their harvest would be scrutinised, and only the largest grains selected for planting in the autumn.
A revolution was occurring. For the first time in 3.6 billion years, life had subverted the evolutionary process and begun to steer it not with natural selection, but with artificial selection. Selection pressures became synonymous with the needs of the architects: the farmers. The technique led to a widespread transition from hunter-gathering to agriculture, a shift that would transform human culture and lay the foundations for the first civilisations. Moreover, in their efforts to permanently remodel the characteristics of a species, early farmers were pioneers of genetic modification.
The modification of plants would later be followed by the domestication of animals, and perhaps eventually, human beings.
From the promotion of eugenics to justify genocide in Nazi Germany, to the mass-produced and homogenous population of Aldous Huxley’s dystopian future in the novel ‘Brave New World’, to ‘Frankenfood’, genetic engineering has amassed a reputation as a treacherous pursuit. However, a recent development appears to have slipped under the public radar: human pre-natal diagnosis. Screening foetal genomes to eliminate genetic ‘defects’ may lead to incremental changes in the human genetic reservoir, a permanent shift in our characteristics and eventually, self-domestication.
The technique involves testing for diseases in a human embryo or foetus, and may be performed to determine if it will be aborted, or in high-risk pregnancies, to enable the provision of immediate medical treatment on delivery. Until recently, pre-natal screening required invasive procedures such as amniocentesis, in which the fluid from the sac surrounding the foetus, the amnion, is sampled and the DNA examined for genetic abnormalities. The procedure can only be performed after the 15th week of pregnancy, and carries a 1% risk of miscarriage and the possibility of complications. In the light of such limitations and risks, the technique hasn’t gained widespread popularity.
However, a research group based at the University of Washington in Seattle has developed an alternative. Their simple test can be performed weeks earlier than current pre-natal screening, and crucially, requires only a maternal blood sample and DNA from both parents. The technique exploits the fragments of foetal DNA in the mother’s blood plasma, which can be strung together by sequencing each nucleotide many times, and then differentiated from maternal and paternal DNA by statistical comparison. It’s quick, harmless, and may soon become widely available. Therein lies the problem. Such a tool is a powerful new route for gleaning information about unborn offspring. The object of the exercise: to identify foetuses with the earmarks of genetic disease as candidates for abortion.
Inevitably, the technique is vulnerable to abuse and will empower parents to discriminate the characteristics of their progeny pre-emptively, in a step towards ‘designer babies’. Nevertheless, there is a more immediate concern. Screening for inheritable disorders requires knowledge of their genetic basis, which can be dangerously precarious. Some conditions, such as Down’s syndrome, characterised by the presence of an extra chromosome, are glaringly obvious. Others have more subtle and complex genetic origins. Just as the invention of vaccines to prevent infectious diseases was followed by attempts at total eradication, our efforts to eliminate genetic characteristics may have permanent consequences.
Autism spectrum disorder (ASD) has already been singled out as a potential target for the screening technology. The disorder, which is characterised by difficulties in communication and social interaction, and repetitive or stereotyped behaviours and interests, has a strong but elusive genetic basis. Intriguingly, there has been much speculation that the genes involved in the development of ASD may be linked to mathematical and scientific ability.
The theory has roots in the overlap between certain useful aptitudes in technical professions, and behaviour typical of ASD. An obsessive attention to detail, the ability to understand predictable rule-based systems (‘systemising’), and a narrow range of interests, are traits characteristic of both groups. Professor Baron-Cohen of the University of Cambridge is a strong proponent of the idea, and has suggested that scientist couples are more likely to have children with the disorder. It’s a compelling idea with intuitive plausibility, but the evidence isn’t there (yet). Until we know better, perhaps restraint is needed in eliminating these potentially important genes from our gene pool. There has been speculation that Einstein and Newton were ‘on the spectrum’ - what if we inadvertently ‘cured’ the future world of similar talent?
Will our descendants be less than human? Another candidate for remedy with reproductive technology is schizophrenia. The disorder affects cognition, and can lead to chronic problems with emotional responsiveness. The 1% prevalence of schizophrenia makes it an apt target for prevention. However, the globally consistent and high incidence of this disease may be an indicator of its association with advantageous genetic characteristics. The ‘social brain hypothesis’, the main theory to explain the evolution of schizophrenia, suggests that the human brain evolved to select for genes associated with schizophrenia in a trade for higher order cognitive traits. These include language and the ability to interpret the thoughts and emotions of others. Schizophrenia is the cost that humans pay for being able to communicate, and as such, the genes responsible may be an essential component of the human gene pool. As with ASD, the elimination of the disease may have unintended consequences, and permanently alter the social dynamics within our species.
This mechanism, termed a ‘heterozygote advantage’, can arise from the benefits of carrying two different forms, or ‘alleles’, of a gene, as opposed to two copies of the same variant. The phenomenon has been proposed for a wide variety of genetic diseases; however, usefulness is often dependent on environmental context. Because human lifestyles have diversified to such an extent from those of our ancestors, certain advantages may be outdated. The malaria protection conferred by carrying a single sickle-cell gene is hardly worth the risk of debilitating anaemia if you end up with two - especially in a modern world where anti-malarial medication is widely available. The systematic eradication of this disorder, and many others, will be a welcome and significant medical advancement. But caution is needed.
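The logic of a heterozygote advantage can be made explicit with the textbook one-locus model (an editor's sketch with illustrative, hypothetical fitness values, not figures from the article): if the genotype fitnesses are AA = 1 - s, Aa = 1 and aa = 1 - t, selection holds the 'disease' allele a at an equilibrium frequency of s / (s + t) rather than eliminating it.

def equilibrium_allele_frequency(s: float, t: float) -> float:
    """Equilibrium frequency of allele a when the heterozygote is fittest
    (fitnesses AA = 1 - s, Aa = 1, aa = 1 - t)."""
    return s / (s + t)

# Hypothetical costs: a modest disadvantage s for non-carriers (e.g. malaria
# susceptibility) and a severe disadvantage t for aa homozygotes (anaemia).
print(equilibrium_allele_frequency(s=0.1, t=0.9))   # -> 0.1: the allele persists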
Following a recent project to build a comprehensive map of the functional elements in the human genome, ENCODE, a function was assigned to 80% of our DNA sequence. However, our genomes are still poorly understood. Many sequences are multi-functional, and knowledge of mechanisms of gene expression is essential to any meaningful model.
We urgently need a regulatory framework for the use of procedures such as pre-natal screening, and to exercise restraint in gene eradication. A detailed assessment and forecast of the long-term consequences is essential before a potentially corrosive procedure becomes entrenched in modern society. The alternative: we might just end up domesticating ourselves.
By Zaria Gorvett
SOURCE  http://blogs.scientificamerican.com



Sunday, September 30, 2012

WORKING SCIENTIFICALLY SHOWS THE WAY TOWARDS GNOOSIS





CITIZENS, READERS AND FELLOWS, CHAIRESTHAI :-)
ANOTHER ECO-NOMIC CYCLE HAS BEGUN, ANOTHER TURN AROUND HELIOS, WHO DONATES TO THE BEINGS OF HIS RADIUS ONLY RESPECT.
MEANWHILE OUR EPAPHOS TEAMWORK ENTERED THIS NEW INDIKTUS BY PARTICIPATING IN THE FOLLOWING EVENTS:



A) FUTURE INTERNET PPP, 12/9/12 (EC ICT PREMISES)

AN INFORMATION CALL DAY, PROMOTING SYNERGIES BETWEEN THE PUBLIC AND PRIVATE SECTORS CONCERNING THE MENTIONED SCOPE.
VARIOUS STAKEHOLDERS (CZECH ACADEMY OF SCIENCES, AALTO UNI, SOUTHAMPTON UNI, JAUME UNI, THALES GROUP, AND OTHER ESTEEMED ORGS) PRESENTED IDEAS FOR COLLABORATION IN INTELLIGENT CITIES, SMART REGIONS, MANUFACTURING SMEs, E-HEALTH, CLOUD MOBILE APPS, DARK FIBRES AND OPTICAL FACILITIES.


B) E-PROCUREMENT - NEW CHALLENGES AND OPPORTUNITIES, 13/9/12 (EU ECONOMIC AND SOCIAL COMMITTEE PREMISES)

LATELY THE DISCUSSIONS ON THIS THEME HAVE BEEN INCREASING, BECAUSE THE IMPLEMENTATION OF A COMMON PLATFORM WILL SURELY BRING EFFICIENCY AND SAVINGS TO THE VARIOUS TRANSACTIONS OF PUBLIC ENTITIES THROUGHOUT EUROPE.
AS OUR TEAMWORK DECLARED, WE SHOULDN'T BUILD ON SAND; THAT IS WHY OUR THESIS FOR AN INDEPENDENT EUROPEAN INTERNET IS A MUST FROM A PRIVACY, SECURITY AND MARKET PERSPECTIVE.


C) NEW VISUALIZATION SYSTEMS WITHIN CUNEIFORM STUDIES, 14/9/12 (ROYAL BELGIAN MUSEUM FOR ARTS AND SCIENCES)




KU LEUVEN, OXFORD, SOUTHAMPTON AND HEIDELBERG UNIVERSITIES, TOGETHER WITH THE MUSEUM (FP6-FP7), ARE DEVELOPING 3D SCANNING MACHINES FOR VARIOUS ARCHAEOLOGICAL PURPOSES. TECHNICALLY WE HAVE OFFERED OUR EXPERTISE BY ADVISING ON WAYS OF USING THESE MACHINES, SO THAT THE CLASSICAL STUDIES COME CLOSER, ON THE ONE HAND, TO THE PEOPLE AND THE MARKET AND, ON THE OTHER, TO EACH OTHER.

 
D) PRECOMMERCIAL PROCUREMENT, 21/9/12 (SHERATON AIRPORT HOTEL)
 
 
 
THE VERY CAPABLE PRACE GROUP IS THINKING ABOUT INVESTING 10 MILLION EURO IN ENERGY-EFFICIENT COMPUTER GRID SYSTEMS. THE QUESTION WHICH EPAPHOS POSED WAS WHERE THE COMPETITION FROM THE OTHER ZONES (FAR EAST, NORTH AMERICA ETC.) STANDS.
IN THAT WAY WE THE EUROPEANS SHALL BE IN A POSITION TO FINANCE A PROJECT WHICH WILL CONTRIBUTE A LOT TO OUR INNOVATIVE STRATEGIES.
THE ROYALTIES ISSUE FOR VARIOUS PROJECTS, WHICH WAS RAISED, IS BELIEVED LIKELY TO CAUSE LIMITATIONS FOR SEVERAL INNOVATIVE SOLUTIONS AND APPLICATIONS WHICH THE PRECOMMERCIAL-PROCUREMENT INITIATIVE NEEDS SO MUCH IN ORDER TO EXPAND.
PRECOMMERCIAL PROCUREMENT SHOULD BE DIRECTED TOWARDS, AND ENCOURAGE, EUROPEAN RESEARCH SOLUTIONS FOR SPECIFIC PROBLEMS WHICH AREN'T YET BEING OFFERED ANYWHERE ELSE BY THE INTERNATIONAL RESEARCH SECTOR.
 
 
 
 
 
E) UNLOCKING THE GROWTH POTENTIAL, 25-26/9/12 (CROWNE PLAZA HOTEL)
 


DURING THAT IMPORTANT EVENT, AFTER YEARS OF LOW ACTIVITY IN THE SECTOR, OUR ADVISORY GUIDANCE INDICATED THREE MAIN ISSUES WHICH WEREN'T MENTIONED AT ALL CONCERNING THE OFFICIAL PORTS POLICIES:

1) SAFETY AND SECURITY, SINCE PORTS ARE MAIN PILLARS OF THE EUROPEAN BORDERS
2) COOPERATION AND PARTICIPATION IN THE PORTS' DECISION-MAKING BODIES OF LOCAL COMMUNITIES AND CIVIL SOCIETY, APART FROM THE PUBLIC AND PRIVATE SECTORS.
3) ROBOTICS IS ENTERING CITIZENS' EVERYDAY LIVES.
A SOCIO-ECONOMIC STUDY MIGHT URGENTLY BE CONDUCTED TO RESEARCH HOW THE ROBOTIZING OF PORT OPERATIONS, COMBINED WITH THE AUTOMATION PROCESSES THAT HAVE ALREADY OCCURRED, WILL AFFECT THE WHOLE ECONOMIC AND SOCIETAL CIRCLE.


OUR GROUP THINKS THAT THE ABOVE-MENTIONED POLICIES, WHICH ASSIST DEMOCRATIC PARTICIPATION, MODERNIZATION AND SAFETY, WILL CONTRIBUTE TO THE LOCAL AND NATIONAL DEVELOPMENT OF EUROPEAN PORTS.



ST) FUTURE INTERNET ARCHITECTURE AGENDA, 27/9/12 (EC ICT)

LOTS OF INTERESTING TECHNICAL DISCUSSIONS OCCURRED DURING THE EVENT.
IT SEEMS THAT THINGS ARE MOVING RAPIDLY. THE SUGGESTION FROM OUR SIDE WAS THAT A RELIABLE TECHNICAL STUDY BE CONDUCTED REGARDING THE PREVIOUS AND CURRENT ATTEMPTS AT CONSTRUCTING A NEW INTERNET. THIS WILL DEVELOP THE SECTOR FURTHER, WHICH IS STUCK, AND IT WILL GIVE THE RESEARCHERS A GOOD TECHNICAL MANUAL WITH A PLETHORA OF HISTORICAL APPROACHES AND DATA.


CLOSING OUR POST, WE WISH YOU A SUCCESSFUL ECONOMIC YEAR AND WE WOULD LIKE TO THANK ALL THOSE WHO, IN THEIR OWN WAYS, ARE FACILITATING OUR TEAMWORK'S ACTIVITIES.

HYGEIAINETE
A.CH.




Sunday, May 06, 2012

THE EUROPEAN RESPONSE MIGHT BE THE SYNNET


DEAR FELLOW READERS, IRENE, PEACE, SALAM,
IT SEEMS THAT THE UPCOMING EVENTS WILL BE VERY CRUCIAL FOR OUR PAN-EUROPEAN AND MEDITERRANEAN ZONE, FOR WHICH WE HAVE BEEN WORKING FOR SEVERAL YEARS.
MEANWHILE, AND UP TO THAT TIME, RESEARCH CONTINUES:


2/5/12  RESEARCH IN FUTURE CLOUD COMPUTING,
AT THE CHARLEMAGNE BUILDING, BRU, BE.

THE BEST MULTINATIONALS WERE THERE:
IBM, MICROSOFT, ATOS, ALCATEL-LUCENT ETC., TO NAME SOME OF THEM.
ALSO IMPORTANT SCIENTIFIC AND COMMERCIAL STAKEHOLDERS PARTICIPATED IN THE EVENT.
TO OUR SURPRISE, WE HEARD FOR THE FIRST TIME OF, AND MET BRIEFLY WITH, THE REPRESENTATIVE OF THE SOLE AMBITIOUS EUROPEAN SEMICONDUCTOR INDUSTRY.


3/5/12  THE ECONOMIC AND SOCIETAL IMPACT OF FUTURE INTERNET TECHNOLOGIES, SERVICES AND APPLICATIONS,
AT DG ICT PREMISES, BRU, BE.

A FINAL WORKSHOP FOR THE CONDUCT OF A STUDY BY
IDC EMEA (EUROPEAN GOVERNMENT CONSULTING), WIK-CONSULT, RAND EUROPE
AND NOKIA.
HERE IT WAS DECLARED ONCE AGAIN (see article of 22/3/12):

THE NEED FOR THE EUROPEAN COMMISSION TO FINANCE AND ASSIST THE RESEARCH, STUDY AND IMPLEMENTATION SO THAT

WE THE EUROPEANS HAVE OUR OWN INTERNET, WHICH MIGHT BE NAMED THE SYNNET.
THE INTERNET IS CONTROLLED BY THE U.S.A. MILITARY, THROUGH SEVERAL STAKEHOLDERS AND N.P.O.'S. THIS CREATES FOR EUROPEAN CITIZENS A LACK OF SECURITY AND PRIVACY.
WHY DID EUROPE INVEST IN SATELLITE POSITIONING TECHNOLOGY,
THE GALILEO?
THERE WERE ALREADY TWO SUCH SYSTEMS, DEVELOPED BY THE DEFENSE SECTOR OF THE U.S.A. (GPS) AND THE FORMER U.S.S.R. (GLONASS).
THE GALILEO PROJECT, WHICH TODAY IS THE BEST ONE, WAS DEVELOPED BY CIVILIANS FROM THE SCIENTIFIC, INDUSTRY AND SOCIETAL SECTORS.
IN THE SAME WAY WE COULD DEVELOP OUR OWN HIGH-TECH PROTOCOLS, SINCE THE INTERNET PROTOCOL AND ARCHITECTURE TECHNOLOGIES (TCP/IP ETC.) ARE BASED ON SCIENTIFIC RESEARCH FROM THE '50s AND '60s.

SUCH AN ACTION, UNDER THE UNIVERSITIES' RESEARCH CAPABILITIES AND RESPONSIBILITY, COMBINED WITH THE COOPERATION OF OTHER INTERESTED COMMUNITIES, COULD ON THE ONE HAND CREATE NEW RESEARCH AND INDUSTRY POSITIONS, AND ON THE OTHER HAND IT WILL SURELY CREATE COMMERCIAL, TECHNOLOGICAL AND OTHER COMPETITION BETWEEN THE TWO DIFFERENT SYSTEMS (INTERNET AND SYNNET), FOR THE CITIZENS' BENEFIT.
THE TWO SYSTEMS, ACCORDING TO A CLIENT'S REQUEST, COULD BE BRIDGED AND EXCHANGE DATA.
TODAY EUROPEAN INDUSTRY HARDLY EXISTS, AND WHERE IT DOES EXIST, IT IS NO LONGER IN OUR OWN HANDS.
LET ME ALSO POINT OUT SOME SPECIFIC SENTENCES FROM AN ARTICLE WHICH WAS PUBLISHED TODAY BY AN AMERICAN MAGAZINE.



Frankovsky wrote that some new hardware designs included Facebook's "vanity-free" storage server and motherboard designs contributed by chip makers AMD and Intel. AMD's motherboard is about 16-inches by 16.5-inches (40.6 centimeters by 41.9 centimeters), and is for high-performance computing and general purpose installations such as cloud deployments. On the software side, VMware will certify its vSphere virtualization platform to run on hardware based on the OpenRack specification.

If the Facebook design becomes a de-facto industry standard for cloud and Web 2.0 data centers, it could make the deployment and management of systems easier over time, said Charles King, principal analyst at Pund-IT.

"The Facebook standard seems to be a data-center centric effort where you are looking to establish a system-design standard that can populate a data center with hundreds of thousands of servers," King said. 
(MORE ...)
THANKS TO ALL FOR YOUR ATTENTION



BELOW ARE PRESENTED SOME RESEARCH ARTICLES WHICH MAY INDICATE SOME PATHS FOR THE EUROPEAN ICT RESEARCH INDUSTRY.
 

Researchers Use Diamonds to Boost Computer Memory

Johns Hopkins University engineers are using diamonds to change the properties of an alloy used in phase-change memory, a change that could lead to the development of higher-capacity storage systems that save data more quickly and last longer than current media.
The process, explained this month in the online edition of Proceedings of the National Academy of Sciences (PNAS), focused on changes to the inexpensive GST phase-change memory alloy that's composed of germanium, antimony and tellurium.
"This phase-change memory is more stable than the material used in current flash drives. It works 100 times faster and is rewritable about 100,000 times," said the study's lead author, Ming Xu, a doctoral student at the Whiting School of Engineering at Johns Hopkins University.
"Within about five years, it could also be used to replace hard drives in computers and give them more memory," he suggested.
GST has been in use for two decades and today is widely used in rewritable optical media, including CD-RW and DVD-RW discs.
IBM and others are already developing solid-state chip technology using phase-change memory, which IBM says can sustain up to 5 million write cycles. High-end NAND flash memory systems used today can sustain only about 100,000 write cycles.
By using diamond-tipped tools to apply pressure to the GST, the researchers found they could change the properties of the alloy from an amorphous to a crystalline state and thus reduce the electrical resistivity by about four orders of magnitude. By slowing down the change from an amorphous state to a crystalline state, the scientists were also able to produce many varying states allowing more data to be stored on the alloy.
GST is called a phase-change material because, when exposed to heat, an area of the alloy can change from an amorphous state, in which the atoms lack an ordered arrangement, to a crystalline state, in which the atoms are neatly lined up in a long-range order.
[Illustration: how the diamond-tipped tools were used to compress GST]
The two states are then used to represent the computer digital language of ones and zeros.
In its amorphous state, GST is more resistant to electric current. In its crystalline state, it is less resistant.
The two phases of GST, amorphous and crystalline, also reflect light differently, allowing the surface of a DVD to be read by a tiny laser.
While GST has been used for some time, the precise mechanics of its ability to switch from one state to another have remained something of a mystery because it happens in nanoseconds once the material is heated.
To solve this mystery, Xu and his research team used the pressure from diamond tools to cause the change to occur more slowly.
The team used a method known as X-ray diffraction, along with a computer simulation, to document what was happening to the material at the atomic level. By recording the changes in "slow motion," the researchers found that they could actually tune the electrical resistivity of the material during the time between its change from amorphous to crystalline form.
"Instead of going from black to white, it's like finding shades or a shade of gray in between," said En Ma, a professor of materials science and engineering, and a co-author of the PNAS paper. "By having a wide range of resistance, you can have a lot more control. If you have multiple states, you can store a lot more data."
By Lucas Mearian

SOURCE  http://www.pcworld.com/




U.S. Lags in Internet Connectivity Speeds

Akamai's "The State of the Internet" survey reveals that the rest of the world is speeding past the United States in Internet connectivity.
Akamai, one of the companies that can actually pull off a survey like this, has released its quarterly "The State of the Internet" report that outlines the relative speeds and penetrations of Internet connectivity in the U.S. and around the world.
It lists today's poor performers, which are actually stunningly fast by the standards of the Internet a decade ago when everyone was struggling. Many of us can recall the days of the fractional T-1. These megabit-per-second shared lines were considered the Rolls-Royce of connectivity.
The 600-page report delves into everything from IPv6 adoption to sources of Internet attack traffic. For anyone who follows the news, there are few revelations except for the fact that apparently Washington, D.C. has the worst Internet overall speeds of any place in the country. This also happens to be where the FCC lives and where Congressional hearings discuss topics like Internet speed and connectivity. It's all too hilarious.
After D.C., the worst states in the U.S., as far as overall Internet speed is concerned, are: Missouri, Georgia, Alaska, Illinois, Iowa, Colorado, Ohio, and Texas.
Korea leads the world with a 17.5 Mbps average. Japan and Hong Kong are next, with 9.1 Mbps. The U.S. is number 13 on the list after Holland, Latvia, Switzerland, Czech Republic, and Romania. We average 5.8 Mbps. The average connection speed globally is 2.3 Mbps. This is humiliating, but the U.S. has not even been in the top 10 for more than a decade.
Number one city in the world is Taegu, South Korea with an average connection speed of 21.8 Mbps. The top six cities were all in Korea and the next eight were in Japan. An American city, Boston, finally makes the list at number 51 with 8.4 Mbps followed by North Bergen, New Jersey at number 52. Jersey City, New Jersey comes in number 58.
Curiosities, as far as I'm concerned, were Monterey Park, California at number 59 with 8.2 Mbps and Manchester, New Hampshire at number 68 with 7.8 Mbps.

It proves that, if you do some research, you can find some hotspots around the world where you can do business at high speed. At that point, you have to be concerned with uptime and backup connectivity. Places like New Caledonia, in the southwest Pacific Ocean, are highlighted for their initiatives such as soon-to-be implemented fiber to the home. Nobody even talks about fiber to the home in California anymore, at least since the housing market collapse.
The report is peppered with enticing snippets, too, such as these cocktail party facts:
"Fully one quarter of Moroccan households boast a broadband connection - up from just two percent in 2004."
"Argentina has one of the most developed broadband markets in Latin America, with some of the fastest and least expensive plans on offer."
"Romania is one of the leading countries in the world in terms of high speed internet access in larger cities, but fixed broadband internet penetration is still low in Romanian regions and rural areas."
There is a lot to be learned from reports like these, but the main lesson is that the world is moving fast on Internet technology and we need to be a leader, not a follower. This means everyone in Washington, D.C. should look over this data carefully—assuming, of course, they can actually manage to download it with their slow connections.

SOURCE   http://www.pcmag.com


Thursday, April 28, 2011

THE MACHINE FOR DREAMING

Thomas Schenkel and his colleague T.C. Shen of the Accelerator and Fusion Research Division (BERKELEY LAB) are working with an electron-beam ion trap to develop a quantum computer based on single-electron transistors. (Photo Roy Kaltschmidt)

Moore’s law


If you were shrewd enough to invest $15,000 in Microsoft ten years ago you’d be a millionaire by now. The engine of growth that has driven the computer industry, and looks set to make Bill Gates the world’s first trillionaire, is no lucky fluke. Underpinning it have been sweeping advances in fundamental electronics. The first computers used bulky vacuum tubes and needed entire buildings to house them. Then in the 1960’s along came the transistor, which in turn gave way to the incredible shrinking microchip. But this is not the end of the story. Yet more exotic technologies are in the pipeline, and they promise to have as great an impact on the information industry as did the invention of the original computer.



The commercial success of computers stems from the fact that with each technological leap, the processing power of computers has soared and the costs have plummeted, allowing manufacturers to penetrate new markets and hugely expand production. The inexorable rise of the computer’s potency is expressed in a rough and ready rule known as Moore’s Law, after Gordon Moore, the co-founder of Intel. According to this dictum, the processing power of computers doubles every 18 months. But how long can it go on?


Moore’s law is a direct consequence of the “small is beautiful” philosophy. By cramming ever more circuitry into a smaller and smaller volume, faster information processing can be achieved. Like all good things, it can’t go on forever: there is a limit to how small electronic parts can be. On current estimates, in less than 15 years, chip components will approach atomic size. What happens then?
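Before moving on, the arithmetic of that trajectory is worth a quick sketch (editor's addition, not part of Davies' article): a doubling every 18 months compounds to roughly a thousandfold increase over the 15 years just mentioned.

def moores_law_factor(years: float, doubling_months: float = 18) -> float:
    """Processing-power multiple after `years` of doubling every `doubling_months`."""
    return 2 ** (years * 12 / doubling_months)

print(moores_law_factor(15))   # 2**10 = 1024, i.e. about a thousandfold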


The problem is not so much with the particulate nature of atoms as such. Rather it lies with the weird nature of physics that applies in the atomic realm. Here, the dependable laws of Newtonian mechanics dissolve away into a maelstrom of fuzziness and uncertainty.


To understand what this means for computation, picture a computer chip as a glorified network of switches linked by wires in such a way as to represent strings of 1’s and 0’s – so-called binary numbers. Every time a switch is flipped, a bit of information gets processed; for example, a 0 becomes a 1. Computers are reliable because in every case a switch is either on or off; there can be no ambiguity. But for decades physicists have known that on an atomic scale, this either/or property of physical states is fundamentally compromised.



The source of the trouble lies with something known as Heisenberg’s uncertainty principle. Put crudely, it says there is an inescapable vagueness, or indeterminism, in the behaviour of matter on the micro-scale. For example, today an atom in a certain state may do such-and-such, tomorrow an identical atom could do something completely different. According to the uncertainty principle, it’s generally impossible to know in advance what will actually happen – only the betting odds of the various alternatives can be given. Essentially, nature is reduced to a game of chance.

Atomic uncertainty

Atomic uncertainty is a basic part of a branch of science known as quantum mechanics, and it’s one of the oddest products of twentieth century physics. So odd, in fact, that no less a scientist than Albert Einstein flatly refused to believe it. “God does not play dice with the universe,” he famously retorted. Einstein hated to think that nature is inherently and fundamentally indeterministic. But it is. Einstein notwithstanding, it is now an accepted fact that, at the deepest level of reality, the physical world is irreducibly random.

When it comes to atomic-scale information processing, the fact that the behaviour of matter is unreliable poses an obvious problem. The computer is the very epitome of a deterministic system: it takes some information as input, processes it, and delivers a definite output. Repeat the process and you get the same output. A computer that behaved whimsically, giving haphazard answers to identical computations, would be useless for most purposes. But try to compute at the atomic level and that’s just what is likely to happen. To many physicists, it looks like the game will soon be up for Moore’s Law.

Although the existence of a fundamental physical limit to the power of computation has been recognized for many years, it was only in 1981 that the American theoretical physicist Richard Feynman confronted the problem head-on. In a visionary lecture delivered at the Massachusetts Institute of Technology (MIT), Feynman speculated that perhaps the sin of quantum uncertainty could be turned into a virtue. Suppose, he mused, that instead of treating quantum processes as an unavoidable source of error to classical computation, one instead harnessed them to perform the computations themselves? In other words, why not use quantum mechanics to compute?

It took only a few years for Feynman’s idea of a “quantum computer” to crystallize into a practical project. In a trail-blazing paper published in 1985, Oxford theoretical physicist David Deutsch set out the basic framework for how such a device might work. Today, scientists around the world are racing to be the first to make it happen. At stake is far more than a perpetuation of Moore’s Law. The quantum computer has implications as revolutionary as any piece of technology in history. If such a machine could be built, it would transform not just the computer industry, but our experience of physical existence itself. In a sense, it would lead to a blending of real and virtual reality.

At the heart of quantum computation lies one of the strangest and most baffling concepts in the history of science. It is known technically as ‘superposition’. A simple example concerns the way an electron circles the nucleus of an atom. The rules of quantum mechanics permit the electron to orbit only in certain definite energy levels. An electron may jump abruptly from one energy level to a higher one if enough energy is provided. Conversely, left to itself, an electron will spontaneously drop down from a higher level to a lower one, giving off energy in the process. That is the way atoms emit light, for example.

Because of the uncertainty principle, it’s normally impossible to say exactly when the transition will occur. If the energy of the atom is measured, however, the electron is always found to be either in one level or the other, never in between. You can’t catch it changing places.


Quantum superpositions

Now comes the weird bit. Suppose a certain amount of energy is directed at the atom, but not enough to make it jump quickly to an excited state. According to the bizarre rules of quantum mechanics, the atom enters a sort of limbo in which it is somehow in both excited and unexcited states at once. This is the all-important superposition of states. In effect, it is a type of hybrid reality, in which both possibilities – excited and unexcited atom  - co-exist. Such a ghostly amalgam of two alternative worlds is not some sort of mathematical fiction, but genuinely real. Physicists routinely create quantum superpositions in the laboratory, and some electronic components are even designed to exploit them in order to produce desired electrical effects.

For 70 years physicists have argued over what to make of quantum superpositions. What really happens to an electron or an atom when it assumes a schizophrenic identity? How can an electron be in two places at once? Though there is still no consensus, the most popular view is that a superposition is best thought of as two parallel universes that are somehow both there, overlapping in a sort of dual existence. In the case of the atom, there are two alternative worlds, or realities, one with the electron in the excited state, the other with the electron in the unexcited state. When the atom is put into a superposition, both worlds exist side-by-side.

Some physicists think of the alternative worlds in a superposition as mere phantom realities, and suppose that when an observation is made it has the effect of transforming what is only a potential universe into an actual one. Because of the uncertainty principle, the observer can’t know in advance which of the two alternative worlds will be promoted to concrete existence by the act of observation, but in every case a single reality is revealed – never a hybrid world. Other physicists are convinced that both worlds are equally real. Since a general quantum state consists of a superposition of not just two, but an unlimited number of alternative worlds, the latter interpretation implies an outlandish picture of reality: there isn’t just one universe, but an infinity of different universes, existing in parallel, and linked through quantum processes. Bizarre though the many-universes theory may seem, it should not be dismissed lightly. After all, its proponents include such luminaries as Stephen Hawking and Murray Gell-Mann, and entire international conferences are devoted to its ramifications.

How does all this relate to computation? The fact that an atom can be in either an excited or an unexcited state can be used to encode information: 0 for unexcited, 1 for excited. A quantum leap between the two states will convert a 1 to a 0 or vice versa. Atomic transitions can therefore be used as switches or gates for computation.

The true power of a quantum computer comes, however, from the ability to exploit superpositions in the switching processes. The key step is to apply the superposition principle to states involving more than one electron. To get an idea of what is involved, imagine a row of coins, each of which can be in one of two states: either heads or tails facing up. Coins too could be used to represent a number, with 0 for heads and 1 for tails.


Heads and Tails

Two coins can exist in four possible states: heads-heads, heads-tails, tails-heads and tails-tails, corresponding to the numbers 00, 01, 10 and 11. Similarly three coins can have 8 configurations, 4 can have 16 and so on. Notice how the number of combinations escalates as more coins are considered.

Now imagine that instead of the coins we have many electrons, each of which can exist in one of two states. This is close to the truth, as many subatomic particles when placed in a magnetic field can indeed adopt only two configurations: parallel or antiparallel to the field. Quantum mechanics allows that the state of the system as a whole can be a superposition of all possible such “heads/tails” alternatives. With even a handful of electrons, the number of alternatives making up the superposition is enormous, and each one can be used to process information at the same time as all the others. To use the jargon, a quantum superposition allows for massive parallel computation. In effect, the system can compute simultaneously in all the parallel universes, and then combine the results at the end of the calculation. The upshot is an exponential increase in computational power. A quantum computer with only 300 electrons, for example, would have more components in its superposition than all the atoms in the observable universe!
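The counting behind that last claim is easy to verify (editor's sketch, using the common order-of-magnitude estimate of 10**80 atoms in the observable universe): n two-state particles span 2**n configurations, and 2**300 is already around 2 x 10**90.

for n in (2, 3, 4, 300):
    print(f"{n:3d} two-state particles -> {2 ** n} possible configurations")

atoms_in_observable_universe = 10 ** 80   # common order-of-magnitude estimate
print(2 ** 300 > atoms_in_observable_universe)   # True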

Achieving superpositions of many-particle states is not easy (the particles don’t have to be electrons). Quantum superpositions are notoriously fragile, and tend to be destroyed by the influence of the environment, a process known as decoherence. Maintaining a superposition is like trying to balance a pencil on its point. So far physicists have been able to attain quantum computational states involving only two or three particles at a time, but researchers in several countries are hastily devising subtle ways to improve on this and to combat the degenerative effects of decoherence. Gerard Milburn of the University of Queensland and Robert Clark of the University of New South Wales are experimenting with phosphorus atoms embedded in silicon, using the orientation of the phosphorus nuclei as the quantum equivalent of heads and tails.

The race to build a functioning quantum computer is motivated by more than a curiosity to see if it can work. If we had such a machine at our disposal, it could perform tasks that no conventional computer could ever accomplish. A famous example concerns the very practical subject of cryptography. Many government departments, military institutions and businesses keep their messages secret using a method of encryption based on multiplying prime numbers. (A prime number is one that cannot be divided by any whole number except one and itself.) Multiplying two primes is relatively easy. Most people could quickly work out that, say, 137 x 293 = 40141. But going backwards is much harder. Given 40141 and asked to find the prime factors, it could take a lot of trial and error before you hit on 137 and 293. Even a computer finds the reverse process hard, and if the two prime numbers have 100 digits, the task is effectively impossible even for a supercomputer.
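The asymmetry can be felt even at this toy scale (editor's sketch, not part of Davies' article): multiplying the two primes is a single operation, while recovering them by trial division already takes over a hundred attempts here, and becomes hopeless for 100-digit primes.

def factor_by_trial_division(n: int):
    """Return the prime factors of n and the number of trial divisions used."""
    factors, tries, d = [], 0, 2
    while d * d <= n:
        tries += 1
        if n % d == 0:
            factors.append(d)
            n //= d
        else:
            d += 1
    if n > 1:
        factors.append(n)
    return factors, tries

print(137 * 293)                        # multiplying: one step
print(factor_by_trial_division(40141))  # factoring: ([137, 293], 136 trial divisions)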

In 1994 Peter Shor, now at AT&T Labs in Florham Park, New Jersey, demonstrated that a quantum computer could make short work of the arduous task of factorising large numbers into their prime factors. At this stage governments and military organizations began to take an interest, since it implied that a quantum computer would render much encrypted data insecure.





Quantum computers

Research projects were started at defence labs such as Los Alamos in New Mexico. NATO and the U.S. National Security Agency began pumping millions of dollars into research. Oxford University set up a special Centre for Quantum Computation.

Soon mathematicians began to identify other problems that looked vulnerable to solution by quantum computation. Most of them fall in the category of search algorithms – various forms of finding needles in haystacks. Locating a friend’s phone number in a directory is easy, but if what you have is a number and you want to work backwards to find the name, you are in for a long job.

A celebrated challenge of this sort is known as the travelling salesman problem. Suppose a salesman has to visit four cities once and only once, and the company wishes to keep down the travel costs. The problem is to determine the routing that involves minimal mileage. In the case of four cities, A, B, C and D, it wouldn’t take long to determine the distance travelled in the various alternative itineraries – ABCD, ACBD, ADCB and so on. But for twenty cities the task becomes formidable, and soars further as additional cities are added.
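A brute-force sketch (editor's illustration with made-up city coordinates, not from the article) shows both sides of this: four cities are checked exhaustively in an instant, yet the number of possible itineraries grows factorially, which is what defeats conventional search as cities are added.

from itertools import permutations
from math import dist, factorial

cities = {"A": (0, 0), "B": (3, 0), "C": (3, 4), "D": (0, 4)}   # hypothetical coordinates

def route_length(route):
    """Total distance of a round trip visiting the cities in the given order."""
    legs = zip(route, route[1:] + route[:1])
    return sum(dist(cities[p], cities[q]) for p, q in legs)

best = min(permutations(cities), key=route_length)
print(best, route_length(best))   # shortest of the 4! = 24 possible round trips
print(factorial(20))              # itineraries for 20 cities: about 2.4e18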

It is too soon to generalise on how effectively quantum computers will be able to short-circuit these sorts of mega-search problems, but the expectation is that they will lead to a breathtaking increase in speed. At least some problems that would take a conventional supercomputer longer than the age of the universe should be solvable on a quantum computer in next to no time. The practical consequences of this awesome computational power have scarcely been glimpsed.

Some scientists see an altogether deeper significance in the quest for the quantum computer. Ultimately, the laws of the universe are quantum mechanical. The fact that we normally encounter weird quantum effects only at the atomic level has blinded us to the fact that - to paraphrase Einstein - God really does play dice with the universe. The main use of computers is to simulate the real world, whether it is a game of Nintendo, a flight simulator or a calculation of the orbit of a spacecraft. But conventional computers recreate the non-quantum world of daily experience. They are ill suited to dealing with the world of atoms and molecules. Recently, however, a group at MIT succeeded in simulating the behaviour of a quantum oscillator using a prototype quantum computer consisting of just four particles.

 But there is more at stake here than practical applications, as first pointed out by David Deutsch. A quantum computer, by its very logical nature, is in principle capable of simulating the entire quantum universe in which it is embedded. It is therefore the ultimate virtual reality machine. In other words, a small part of reality can in some sense capture and embody the whole. The fact that the physical universe is constructed in this way – that wholes and parts are mutually enfolded in mathematical self-consistency – is a stunning discovery that impacts on philosophy and even theology. By achieving quantum computation, mankind will lift a tiny corner of the veil of mystery that shrouds the ultimate nature of reality. We shall finally have captured the vision elucidated so eloquently by William Blake two centuries ago:


To see a World in a grain of sand,

And a Heaven in a wild flower,

Hold infinity in the palm of your hand,

And eternity in an hour.


by  Paul Davies  10/31/2002


source http://www.physicspost.com

MORE AT    http://youtu.be/I56UugZ_8DI



(BY CLICKING ON THE TITLED LINK WE ARE REDIRECTED TO A WIKI LIST OF QC SIMULATORS)
