Wednesday, May 26, 2010

Blog: Electron 'Spin' in Silicon Will Lead to Revolutionary Quantum Chips

Electron 'Spin' in Silicon Will Lead to Revolutionary Quantum Chips
University of Southampton (ECS) (05/26/10) Lewis, Joyce

Researchers in the United Kingdom say they are building the world's first silicon-based integrated single-spin quantum bit system. The nanoscale system will be able to initialize, manipulate, and read the spin states of single electrons. Using electron spin rather than electronic charge is advantageous because spin states can maintain coherence and are more resistant to interference in materials such as silicon or graphene. The integrated single-spin technology could lead to novel nanospintronic devices that use the spin of individual electrons to transmit, store, and process information. Such devices have the potential to dramatically improve the scaling of functional density and performance while reducing the energy dissipated per functional operation. The technology could increase the processing power of conventional computers and could also be used in quantum computers. "This project is a paradigm shift in information and communication technology," says University of Southampton professor Hiroshi Mizuta. "It is not just an extension of existing silicon technology; we have introduced a completely new principle based on quantum mechanics, which will make it possible for industry to continue to use silicon as devices get smaller."

View Full Article

Monday, May 24, 2010

Blog: Seven Atom Transistor Sets the Pace for Future PCs

Seven Atom Transistor Sets the Pace for Future PCs
BBC News (05/24/10)

A team in Australia has built a working transistor that contains only seven atoms. The researchers, led by University of New South Wales professor Michelle Simmons, developed the atomic-scale transistor as part of a project to create a quantum computer. Using a scanning tunneling microscope, they replaced atoms in a silicon crystal with phosphorus atoms. "Now we have just demonstrated the world's first electronic device in silicon systematically created on the scale of individual atoms," Simmons says. The working transistor was handmade, so a process for producing the devices in large numbers still needs to be developed. The researchers say the device could lead to chips with components up to 100 times smaller than those on current processors, and Simmons says the development could result in an "exponential" leap in processing power.

View Full Article

Sunday, May 23, 2010

Blog: Silicon Replacement: Gallium Arsenide?

Silicon Replacement: Gallium Arsenide?
PC World (05/23/10) Mulroy, James

Gallium arsenide (GaAs) could be used to develop electronic devices that are more efficient and less expensive than those built on chips made from silicon wafers. Researchers studying GaAs have developed a new process for producing chips from the expensive but efficient semiconductor. The University of Illinois at Urbana-Champaign's John Rogers says stacks of thin semiconductor films are grown on a wafer, and then each film is peeled off individually and placed onto a cheaper substrate, or base, that supports the ultra-thin film. The technique eliminates the excess material thickness of larger-diameter wafers, making the resulting devices less expensive and more efficient. The Illinois team used the process to "build devices--including transistors, solar cells, and infrared cameras--on the substrates, leaving the wafer intact and ready for a new batch of film," Rogers says. The researchers say that using the technique to manufacture computer chips would result in faster and less expensive computers.

View Full Article

Friday, May 21, 2010

Blog: Paper Supercapacitor Could Power Future Paper Electronics

Paper Supercapacitor Could Power Future Paper Electronics
PhysOrg.com (05/21/10) Zyga, Lisa

Stanford University researchers have developed an onboard power source for paper transistors and paper displays. The paper supercapacitor is made by printing carbon nanotubes onto a treated piece of paper, so all the necessary components are integrated onto a single sheet in the form of single-walled carbon nanotubes (SWNTs). At first, the researchers found that the SWNTs penetrated the paper through micron-sized pores, which would cause the device to short-circuit. To solve this problem, they coated both sides of the paper with polyvinylidene fluoride, which blocked the pores but still allowed electrolytes to be transported through the paper; the treated paper thus functions as both an electrolyte membrane and a separator without short-circuiting. The researchers then printed SWNTs on both sides of single sheets of paper and added electrolytes to form a supercapacitor. The integrated structure allows for high-speed printing, which greatly reduces fabrication costs and brings disposable, flexible, and lightweight paper electronics closer to reality.

View Full Article

Thursday, May 20, 2010

Blog: Protecting Websites From Shared Code

Protecting Websites From Shared Code
Technology Review (05/20/10) Naone, Erica

Code sharing between Web sites can be an Achilles heel if third-party programs have security weaknesses, but the new ConScript browser extension could remove this vulnerability by giving developers and site owners an easier way to control what third-party code can do on their sites. ConScript works by adding a relatively small amount of code to the browser, which then analyzes the JavaScript commands the browser is processing. By injecting extra code, ConScript prevents JavaScript from attempting tasks that have been configured to be blocked, and it knows which behavior to enforce according to a set of policies selected by the site's owner. Microsoft researcher Ben Livshits says ConScript offers a way for developers and browser makers to promote the ways that sites use JavaScript without endangering security. University of California, Berkeley researcher Leo Meyerovich says the extension's design should permit developers to use older code without having to modify it, even if it contains known security vulnerabilities.
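
The policy-enforcement idea is easier to see in code. The sketch below is a language-neutral illustration in Python of the interception pattern: a sensitive operation is wrapped so that every call is checked against a policy before it runs. ConScript itself operates on JavaScript inside the browser, and the policy and function names here are hypothetical.

```python
# Language-neutral sketch of the interception pattern behind ConScript-style
# policy enforcement: wrap a sensitive operation so every call is checked
# against a site-owner policy before it executes. ConScript itself works on
# JavaScript in the browser; this Python decorator is only an illustration.
def enforce(policy):
    def wrap(func):
        def guarded(*args, **kwargs):
            if not policy(func.__name__, args, kwargs):
                raise PermissionError(f"blocked by policy: {func.__name__}{args}")
            return func(*args, **kwargs)
        return guarded
    return wrap

# Hypothetical policy: third-party code may not touch anything under /account.
def no_account_access(name, args, kwargs):
    return not any("/account" in str(arg) for arg in args)

@enforce(no_account_access)
def fetch(url):
    return f"fetched {url}"          # stand-in for a real network request

print(fetch("/public/news"))          # allowed
try:
    fetch("/account/settings")        # blocked by the policy
except PermissionError as err:
    print("denied:", err)
```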

View Full Article

Tuesday, May 18, 2010

Blog: Machines That Learn Better

Machines That Learn Better
MIT News (05/18/10) Hardesty, Larry

Massachusetts Institute of Technology (MIT) researchers have developed Church, a probabilistic programming language designed to cut the time it takes to build a machine-learning system to a matter of hours instead of months. Church is based on an inference algorithm, which instructs a machine-learning system on how to draw conclusions from the data presented to it. The algorithms currently used in probabilistic programming are designed to handle discrete data but struggle with continuous data. However, at the recent International Conference on Artificial Intelligence and Statistics, MIT student Daniel Roy presented a paper in which he and MIT instructor Cameron Freer describe an inference algorithm that can handle large classes of problems involving continuous data. Their work could be especially useful for artificial intelligence systems whose future behavior is dependent on their past behavior, says Rutgers University computer scientist Chung-chieh Shan.
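
Church itself is a Lisp-like language, but the core idea of sampling-based inference over a continuous quantity can be sketched in a few lines of Python. The observations, noise level, and flat prior below are illustrative assumptions, not the MIT algorithm.

```python
# Minimal sketch of sampling-based inference over a continuous parameter
# (random-walk Metropolis). This is not Church or the Roy/Freer algorithm;
# it only illustrates drawing conclusions about a continuous quantity from data.
import math
import random

observations = [4.8, 5.1, 5.3, 4.9, 5.2]   # hypothetical continuous data
NOISE_SD = 0.5                              # assumed known measurement noise

def log_likelihood(mu):
    """Log-probability of the observations if the true mean were mu."""
    return sum(-0.5 * ((x - mu) / NOISE_SD) ** 2 for x in observations)

def metropolis(n_samples=5000, step=0.2):
    """Random-walk Metropolis sampler for the posterior over mu (flat prior assumed)."""
    mu, samples = 0.0, []
    for _ in range(n_samples):
        proposal = mu + random.gauss(0.0, step)
        # Accept with probability min(1, likelihood ratio); the flat prior cancels out.
        accept_log_prob = min(0.0, log_likelihood(proposal) - log_likelihood(mu))
        if random.random() < math.exp(accept_log_prob):
            mu = proposal
        samples.append(mu)
    return samples

posterior = metropolis()
print("posterior mean estimate:", sum(posterior[1000:]) / len(posterior[1000:]))
```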

View Full Article

Friday, May 14, 2010

Blog: Cyber Challenge: 10,000 Security Warriors Wanted

Cyber Challenge: 10,000 Security Warriors Wanted
Campus Technology (05/14/10) Schaffhauser, Dian

The goal of the U.S. Cyber Challenge is to recognize and train a cohort of 10,000 cybersecurity experts to help address the skills gap in government and industry. Program director Karen Evans says the concept behind the initiative is to cultivate participants' skills and provide access to training and practice. She envisions the challenge having three core elements--community building for participants, "rack and stack" for recognizing skills and interests, and matching individuals with government agencies offering scholarships and with industry offering internships and jobs. An alpha run of the Cyber Challenge is being conducted this summer, in which participants in California, New York, and Delaware can test a free online treasure hunt developed by the SANS Institute. Successful participants will be invited to a summer camp, where they will get a week of training from SANS and university faculty and students. At the week's conclusion, participants will be divided into teams for a capture-the-flag competition, finding vulnerabilities in their opponents' systems while protecting their own.

View Full Article

Wednesday, May 12, 2010

Blog: U.S. Struggles to Ward Off Evolving Cyber Threat

U.S. Struggles to Ward Off Evolving Cyber Threat
Reuters (05/12/10) Stewart, Phil; Wolf, Jim

More than 100 foreign spy agencies, as well as criminal organizations and terrorist groups, are probing U.S. computer systems thousands of times per day and scanning them millions of times daily, says U.S. Department of Defense official James Miller. He says authorities have failed to stay ahead of the cyberattacks, which have resulted in the loss of an enormous amount of data. Miller says the problem is compounded by the fact that the U.S. does not fully understand the vulnerabilities that hackers are taking advantage of. However, he says there are several steps the U.S. could take to improve cybersecurity, including working with private industry to protect potential vulnerabilities in vital infrastructure such as power grids and financial markets. Miller also says the U.S. needs to focus on developing more computer programmers, since countries such as China and India are expected to produce many more computer scientists than the U.S. will over the next 20 to 30 years.

View Full Article

Tuesday, May 11, 2010

Blog: W3C Launches XProc Spec

W3C Launches XProc Spec
eWeek (05/11/10) Taft, Darryl K.

The World Wide Web Consortium (W3C) has released XProc, an Extensible Markup Language (XML) pipeline specification for managing XML-rich processes. W3C says the specification "provides a standard framework for composing XML processes [and] streamlines the automation, sequencing, and management of complex computations involving XML." XML is used throughout enterprise computing environments because it provides a standard way to manipulate data. "What we haven't had is any standard way to describe how to combine [XML functions] to accomplish any particular task," says XProc specification co-editor Norman Walsh of Mark Logic. "That's what XProc provides." University of Edinburgh reader Henry Thompson says "XProc exemplifies what W3C does best: We looked at existing practice--people have been using a number of similar-but-different XML-based languages--and we produced a consensus standard, creating interoperability and critical mass." W3C notes that XProc includes a test suite that covers all of the required and optional steps of the language, as well as all of its static and dynamic errors.
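
XProc pipelines are written in an XML vocabulary, but the underlying composition idea can be illustrated with a short Python sketch: a document flows through an ordered sequence of XML processing steps, each consuming the previous step's output. The steps and sample document below are hypothetical, not part of the XProc specification.

```python
# Language-neutral illustration of the pipeline idea behind XProc: an XML
# document passes through an ordered sequence of processing steps. XProc
# expresses such pipelines declaratively in XML; these steps are made up
# purely for illustration.
import xml.etree.ElementTree as ET
from functools import reduce

def mark_processed(root):
    root.set("processed", "true")
    return root

def count_items(root):
    root.set("item-count", str(len(root.findall(".//item"))))
    return root

PIPELINE = [mark_processed, count_items]

def run_pipeline(root, steps=PIPELINE):
    """Feed the document through each step in order, like an XProc pipeline."""
    return reduce(lambda doc, step: step(doc), steps, root)

source = ET.fromstring("<order><item/><item/></order>")
print(ET.tostring(run_pipeline(source), encoding="unicode"))
```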

View Full Article

Blog: Lining Up "Nanodot" Memory

Lining Up "Nanodot" Memory
Technology Review (05/11/10) Graham-Rowe, Duncan

North Carolina State University (NCSU) researchers have developed a method for growing magnetic nanoparticles that could lead to much denser computer memory devices. The technique arranges magnetic nanodots, particles about six nanometers wide, in orderly arrays, making it easier to use them to store bits of information magnetically. A nanodot chip measuring one centimeter square could, in theory, store a terabit of data, says NCSU professor Jay Narayan. "The primary innovation is that we can keep all these dots ordered and aligned in the same way," he says. The technique, called domain-matching epitaxy, involves depositing a very thin layer of titanium nitride onto a substrate, which serves as a template for the nanodots. The size and spacing of the dots can be controlled by varying the growth conditions, such as temperature. The nickel-based nanodots require low temperatures to function, but the researchers are working on making them out of iron-platinum, which should enable them to operate at room temperature.
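
The terabit figure is easy to sanity-check. Assuming, for illustration, one bit per nanodot on a roughly 10-nanometer center-to-center pitch (the article gives only the 6-nanometer dot size), a one-square-centimeter chip holds about a trillion dots:

```python
# Back-of-the-envelope check of the "terabit per square centimeter" claim.
# The ~10 nm dot-to-dot pitch is an assumption for illustration; the article
# only states that the dots themselves are about six nanometers wide.
CHIP_SIDE_NM = 1e7        # 1 cm expressed in nanometers
PITCH_NM = 10             # assumed center-to-center spacing of the nanodots

dots_per_side = CHIP_SIDE_NM / PITCH_NM
total_bits = dots_per_side ** 2          # one bit stored per dot
print(f"approximate bits per square centimeter: {total_bits:.0e}")   # ~1e12
```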

View Full Article

Blog: Carnegie Mellon Study of Twitter Sentiments Yields Results Similar to Public Opinion Polls

Carnegie Mellon Study of Twitter Sentiments Yields Results Similar to Public Opinion Polls
Carnegie Mellon News (05/11/10) Spice, Byron

Carnegie Mellon University (CMU) researchers analyzed the sentiments expressed in a billion Twitter messages during 2008-2009 relating to consumer confidence and presidential job approval ratings and found that they were similar to those of well-established public opinion polls. The research suggests that studying tweets could become an inexpensive, rapid way of gauging public opinion on some subjects, says CMU professor Noah Smith. "With seven million or more messages being tweeted each day, this data stream potentially allows us to take the temperature of the population very quickly," Smith says. The Twitter-derived sentiment measurements were much more volatile day-to-day than the polling data, but when the researchers looked at the results over a period of days, they often correlated closely with the polling data. The researchers say that improved natural-language processing tools, query-driven analysis, and the use of demographic and time stamp data could enhance the sophistication and reliability of the measurements.
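
The day-to-day volatility and multi-day smoothing the researchers describe can be sketched simply: average each day's sentiment score over a trailing window before comparing it with a poll series. The toy data, window length, and use of Pearson correlation below are illustrative assumptions, not the CMU methodology.

```python
# Illustrative sketch: smooth noisy daily sentiment scores over a trailing
# window of days, then compare both the raw and smoothed series against a
# slower-moving poll series. Data and window size are made up for illustration.
def moving_average(values, window):
    return [sum(values[max(0, i - window + 1):i + 1]) /
            len(values[max(0, i - window + 1):i + 1]) for i in range(len(values))]

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

daily_sentiment = [0.42, 0.55, 0.38, 0.60, 0.47, 0.52, 0.44, 0.58, 0.41, 0.53]
poll_series     = [0.48, 0.49, 0.49, 0.50, 0.50, 0.50, 0.49, 0.50, 0.49, 0.50]

smoothed = moving_average(daily_sentiment, window=5)
print("raw vs. polls:     ", round(pearson(daily_sentiment, poll_series), 3))
print("smoothed vs. polls:", round(pearson(smoothed, poll_series), 3))
```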

View Full Article

Wednesday, May 5, 2010

Blog: Microsoft Researches Low Latency Operating System for Multicores

Microsoft Researches Low Latency Operating System for Multicores
ITPro (05/05/10) Scott, Jennifer

The Microsoft Research Lab in Cambridge, England, has developed BarrelFish, an operating system designed to overcome the latency problem in multi-core computers. BarrelFish's key concept is to restrict communication between the cores to create a better timeline of actions, says Microsoft Labs' Andrew Herbert. "The operating system understands the relationship between the resources and where the bottlenecks can be, and it takes its scheduling decisions about how it organizes the work in the machine to respect those resources and those bottlenecks," Herbert says. The awareness of where the latency occurs and the correct scheduling of how applications run has led to impressive results in Microsoft's labs. "The kind of performance graphs we get out of systems like BarrelFish [means] the latency holds much more constant even as the number of cores go up, so you win both on increased throughput and also not having to sacrifice the latency to achieve it," he says.

View Full Article

Blog: N.Y. Bomb Plot Highlights Limitations of Data Mining

N.Y. Bomb Plot Highlights Limitations of Data Mining
Computerworld (05/05/10) Vijayan, Jaikumar

The recent failed bombing attempt in New York City shows the limitations of data-mining technology when used in security applications. Since the terror attacks of Sept. 11, the U.S. government has spent tens of millions of dollars on data-mining programs that are used by agencies to identify potential terrorists. The Department of Homeland Security's Automated Targeting System assigns terror scores to U.S. citizens, and the Transportation Security Administration's Secure Flight program analyzes airline passenger data. However, it is unclear how effective these programs have been in identifying and stopping potential terrorist threats. Using data mining to search for potential terrorists is similar to looking for a needle in a haystack, says BT chief security officer Bruce Schneier. "Data mining works best when there's a well-defined profile you're searching for, a reasonable number of attacks per year, and a low cost of false alarms," Schneier says. However, even the most accurate and finely tuned data-mining system will generate one billion false alarms for each real terrorist plot it uncovers, he says.
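
Schneier's point is the classic base-rate problem, and a few lines of arithmetic make it concrete. The detection rate, false-positive rate, and event counts below are assumptions chosen for illustration; they are not the figures behind the one-billion-false-alarms estimate.

```python
# Base-rate illustration of why mining for extremely rare events drowns in
# false alarms. All rates and counts are assumptions for illustration only.
events_per_year     = 1e12   # assumed events scanned (messages, trips, transactions)
real_plots          = 10     # assumed genuine plots hidden among them
true_positive_rate  = 0.99   # assumed chance a real plot is flagged
false_positive_rate = 1e-4   # assumed chance an innocent event is flagged

flagged_real  = real_plots * true_positive_rate
flagged_false = (events_per_year - real_plots) * false_positive_rate

print(f"real plots flagged:  {flagged_real:.1f}")
print(f"false alarms:        {flagged_false:.2e}")
print(f"false alarms per real plot: {flagged_false / flagged_real:.2e}")
```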

View Full Article

Tuesday, May 4, 2010

Blog: New Data Analysis System Could Do Double Duty

New Data Analysis System Could Do Double Duty
UT Dallas News (05/04/10) Moore, David

Researchers at the University of Texas (UT) at Dallas have developed a new system for identifying potential Internet threats. Designed to analyze behavioral data, the system monitors network traffic and issues an alert when it notices worrisome deviations from normal activity. "We proposed a novel platform that thoroughly analyzes network traffic behavior to identify potential Internet threats," says UT Dallas professor Mehrdad Nourani. The technology uses two subsystems that work in parallel to achieve high speed and efficient memory use, allowing faster results and optimal use of resources. The system builds a bell-shaped curve of normal activity and can achieve nearly zero false positives and false negatives when flagging abnormalities that fall outside the curve. Nourani says the technology also could be used to analyze health data and detect conditions such as heart arrhythmia, sleep apnea, or epileptic seizures.
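
A minimal sketch of the bell-curve approach: fit a mean and standard deviation to a baseline of traffic measurements and flag anything that falls far outside that range. The threshold and sample numbers are illustrative assumptions, not the UT Dallas design.

```python
# Minimal sketch of bell-curve (Gaussian) anomaly detection: model "normal"
# traffic with a mean and standard deviation, then flag measurements that
# fall far outside that range. Threshold and data are illustrative only.
import statistics

baseline = [980, 1010, 1005, 995, 1020, 990, 1000, 1015, 985, 1002]  # e.g., packets/sec
mean = statistics.mean(baseline)
stdev = statistics.stdev(baseline)

def is_anomalous(value, threshold=3.0):
    """Flag a measurement more than `threshold` standard deviations from the mean."""
    return abs(value - mean) > threshold * stdev

for sample in (1003, 1180, 640):
    print(sample, "anomalous" if is_anomalous(sample) else "normal")
```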

View Full Article

Blog: Army of Smartphone Chips Could Emulate the Human Brain

Army of Smartphone Chips Could Emulate the Human Brain
New Scientist (05/04/10) Marks, Paul

University of Manchester computer scientist Steve Furber wants to build a silicon-based brain that contains one billion neurons. "We're using bog-standard, off-the-shelf processors of fairly modest performance," Furber says. The silicon brain, called the Spiking Neural Network Architecture (SpiNNaker), is based on a processor Furber helped design in 1987. SpiNNaker's chips contain 20 ARM processor cores, each modeling 1,000 neurons; with 20,000 neurons per chip, Furber needs 50,000 chips to reach his goal of one billion neurons. A memory chip next to each processor stores the changing synaptic weights as numbers that represent the importance of a given connection. As the system becomes more developed, the only computer able to compute the connections will be the machine itself, Furber says. SpiNNaker relies on a controller to direct spike traffic, much as a router directs Internet traffic. The researchers have built a small version of the silicon brain with 50 neurons and have created a virtual environment in which SpiNNaker controls a Pac-Man-like program that learns to find a virtual doughnut.
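
The chip arithmetic checks out (20 cores x 1,000 neurons x 50,000 chips is one billion neurons), and the kind of per-neuron work involved can be sketched with a simplified leaky integrate-and-fire update. The model, constants, and inputs below are illustrative assumptions, not SpiNNaker's actual neuron code.

```python
# Simplified leaky integrate-and-fire neuron: incoming spikes are weighted by
# stored synaptic weights and accumulated, and the neuron emits a spike when
# its membrane potential crosses a threshold. Constants are illustrative only.
THRESHOLD = 1.0
LEAK = 0.9          # fraction of potential retained each time step

def simulate(weights, spike_trains, steps):
    """weights[i] weights input i; spike_trains[i][t] is 1 if input i spikes at step t."""
    potential, output_spikes = 0.0, []
    for t in range(steps):
        potential *= LEAK                                   # leak toward rest
        potential += sum(w * train[t] for w, train in zip(weights, spike_trains))
        if potential >= THRESHOLD:                          # fire and reset
            output_spikes.append(t)
            potential = 0.0
    return output_spikes

weights = [0.3, 0.5, 0.2]
spikes = [[1, 0, 1, 1, 0, 1],   # input 0
          [0, 1, 1, 0, 1, 0],   # input 1
          [1, 1, 0, 0, 0, 1]]   # input 2
print("output spike times:", simulate(weights, spikes, steps=6))
```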

View Full Article

Blog: New Technology Generates Database on Spill Damage

New Technology Generates Database on Spill Damage
New York Times (05/04/10) Wheaton, Sarah

A New Orleans advocacy group is using crowd-sourcing technology to map the damage of the Gulf of Mexico oil spill on the Gulf Coast and create a database on its impact on the region. In 2008, volunteers developed Ushahidi as an open source platform to help Kenyans track political violence. Ushahidi has since been used to map election irregularities in Sudan, crime in Atlanta, and earthquake victims in Haiti. The Louisiana Bucket Brigade is receiving texts, tweets, and email messages about odors, unemployed oystermen, oily birds, and more. The database of reports appears as a rainbow of dots on a map on its Web site. "You hear one anecdote and then another anecdote, but hopefully this will give a more global perspective of the damage," says Bucket Brigade director Anne Rolfes. "We will have absolutely crystal-clear data about people affected, and that should certainly inform policy makers."

View Full Article

Monday, May 3, 2010

Blog: Yale Scientists Explain Why Computers Crash But We Don't

Yale Scientists Explain Why Computers Crash But We Don't
Yale University (05/03/10) Hathaway, Bill

Yale University researchers have described why computers tend to malfunction more than living organisms by analyzing the control networks of both an E. coli bacterium and the Linux operating system. Both systems are arranged in hierarchies, but with some key differences in how they achieve operational efficiency. The molecular networks in the bacterium form a pyramid, with a limited number of master regulator genes at the top controlling a wide base of specialized functions. The Linux operating system is organized more like an inverted pyramid, with many different top-level routines controlling a few generic functions at the bottom. This organization arises because software engineers tend to save money and time by building on existing routines rather than starting systems from scratch, says Yale professor Mark Gerstein. "But it also means the operating system is more vulnerable to breakdowns because even simple updates to a generic routine can be very disruptive," Gerstein says.

View Full Article

Blog: Computer Science Shows How 'Twitter-Bombs' Wield Influence

Computer Science Shows How 'Twitter-Bombs' Wield Influence
Wellesley College (05/03/10) Corday, Arlie

Wellesley College computer science professor P. Takis Metaxas says "Twitter bombs"--sending many Tweets from a large number of Twitter accounts within a short period of time--are being used to affect the outcome of elections. Metaxas says Twitter bombs were used against U.S. Senate candidate Martha Coakley in the recent Massachusetts senatorial election. A Twitter bomb reaches many people very quickly. "In addition, because Google is displaying Twitter trends in a prominent place, you influence Google search results," Metaxas says. The result of the Twitter bomb was "disproportionate exposure to personal opinions, fabricated content, unverified events, lies, and misrepresentations that would otherwise not find their way in the first page (of Google search results), giving them the opportunity to spread virally," he says. In an analysis of the Coakley Twitter bomb, the researchers found that the attack was launched by the American Future Fund, the same group that attacked John Kerry's record during his 2004 presidential campaign. Metaxas is developing software to detect Twitter bombs in real time.
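
The article does not describe Metaxas's detector in detail, but one obvious signal can be sketched: many distinct accounts posting identical text within a short window. The window length and account threshold below are assumptions for illustration, not the software under development.

```python
# Illustrative sketch of one signal a Twitter-bomb detector might use: many
# distinct accounts posting the same text within a short time window. The
# thresholds are assumptions; this is not the software described above.
from collections import defaultdict

WINDOW_SECONDS = 600     # 10-minute window (assumed)
MIN_ACCOUNTS = 25        # distinct senders needed to flag a "bomb" (assumed)

def find_bombs(tweets):
    """tweets: list of (timestamp_seconds, account, text) tuples, sorted by time."""
    by_text = defaultdict(list)                    # text -> [(timestamp, account), ...]
    for ts, account, text in tweets:
        by_text[text.strip().lower()].append((ts, account))

    flagged = []
    for text, posts in by_text.items():
        for i, (start_ts, _) in enumerate(posts):  # slide a window over the posts
            senders = {acct for ts, acct in posts[i:] if ts - start_ts <= WINDOW_SECONDS}
            if len(senders) >= MIN_ACCOUNTS:
                flagged.append(text)
                break
    return flagged
```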

View Full Article
