Wednesday, December 28, 2011

Blog: Five Open Source Technologies for 2012

Five Open Source Technologies for 2012
IDG News Service (12/28/11) Joab Jackson

Five open source projects could become the basis for new businesses and industries in 2012. Nginx, a Web server program, could see wider adoption thanks to its ability to handle high-volume traffic with ease. Nginx is already used on highly trafficked Web sites, and the next release, due in 2012, will be better suited to shared hosting environments. The OpenStack cloud computing platform has gained support from several technology firms because of its scalability. "We're not talking about [using OpenStack to run a] cloud of 100 servers or even 1,000 servers, but tens of thousands of servers," says the OpenStack Project Policy Board's Jonathan Bryce. Stig, a distributed data store, was designed for the unique workloads of social networking sites, according to its developers. Its architecture allows for inferential searching, enabling users and applications to look for connections between disparate pieces of information. Linux Mint was designed specifically for users who want a desktop operating system and do not want to learn more about how Linux works, and it is now the fourth most popular desktop operating system in the world. GlusterFS is one of the fastest-growing storage software systems on the market, with downloads up 300 percent in the last year.
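
Nginx's capacity for high traffic comes from its event-driven, non-blocking architecture, which multiplexes thousands of connections over a handful of threads. A rough sketch of that pattern in Python follows; it is an illustration of the design only, not Nginx's actual C implementation:

# Minimal event-driven echo server illustrating the non-blocking I/O
# pattern behind servers like Nginx (illustrative only).
import selectors
import socket

sel = selectors.DefaultSelector()

def accept(server_sock):
    conn, _addr = server_sock.accept()      # new client connection
    conn.setblocking(False)                 # never block the event loop
    sel.register(conn, selectors.EVENT_READ, handle)

def handle(conn):
    data = conn.recv(4096)
    if data:
        conn.sendall(data)                  # echo the bytes back
    else:                                   # client closed the connection
        sel.unregister(conn)
        conn.close()

server = socket.socket()
server.bind(("localhost", 8080))
server.listen(128)
server.setblocking(False)
sel.register(server, selectors.EVENT_READ, accept)

while True:                                 # single-threaded event loop
    for key, _mask in sel.select():
        key.data(key.fileobj)               # dispatch the registered callback

Because the single loop never blocks on any one client, this design scales to many simultaneous connections where a thread-per-connection server would not.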

Wednesday, December 21, 2011

Blog: Computer Scientists Create Algorithm That Measures Human Pecking Order

Computer Scientists Create Algorithm That Measures Human Pecking Order
Technology Review (12/21/11)

Cornell University's Jon Kleinberg, who developed the Hyperlink-Induced Topic Search (HITS) algorithm that led to Google's PageRank search algorithm, has developed a method for measuring power differences between individuals using the patterns of words they speak or write. "We show that in group discussions, power differentials between participants are subtly revealed by how much one individual immediately echoes the linguistic style of the person they are responding to," Kleinberg says. The key to the technique is linguistic coordination, in which speakers naturally copy the style of their interlocutors. The Cornell researchers focused on function words, which provide a grammatical framework for sentences but carry little meaning themselves, such as articles, auxiliary verbs, conjunctions, and high-frequency adverbs. The researchers studied editorial discussions between Wikipedia editors and transcripts of oral arguments in the U.S. Supreme Court. By looking at the changes in linguistic style that occur when people move from non-admin to admin roles on Wikipedia, the researchers show that the pattern of linguistic coordination changes too. A similar effect occurs in the Supreme Court. "Our work is the first to identify connections between language coordination and social power relations at large scales, and across a diverse set of individuals and domains," Kleinberg says.
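
The underlying measure can be sketched simply: coordination is how much more likely a replier is to use a class of function words when the previous speaker has just used it. A minimal Python illustration in the spirit of the Cornell work (the marker set and probability estimates here are simplified stand-ins, not the paper's exact method):

# Hedged sketch of a linguistic-coordination score on one function-word
# class (articles). Positive scores mean B echoes A's style on this marker.
ARTICLES = {"a", "an", "the"}

def uses_marker(utterance, marker_words):
    return any(w in marker_words for w in utterance.lower().split())

def coordination(exchanges, marker_words):
    """exchanges: list of (utterance_by_A, reply_by_B) pairs."""
    replies_with_m = sum(uses_marker(b, marker_words) for _a, b in exchanges)
    triggered = [(a, b) for a, b in exchanges if uses_marker(a, marker_words)]
    echoed = sum(uses_marker(b, marker_words) for _a, b in triggered)
    p_base = replies_with_m / len(exchanges)              # P(B uses marker)
    p_cond = echoed / len(triggered) if triggered else p_base
    return p_cond - p_base        # P(B uses marker | A used it) - P(B uses marker)

pairs = [("The ruling is clear.", "Yes, the precedent supports it."),
         ("Proceed carefully.", "Understood.")]
print(coordination(pairs, ARTICLES))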

Sunday, December 11, 2011

Blog: Multi-Purpose Photonic Chip Paves the Way to Programmable Quantum Processors

Multi-Purpose Photonic Chip Paves the Way to Programmable Quantum Processors
University of Bristol News (12/11/11)

University of Bristol researchers have developed an optical chip that generates, manipulates, and measures two quantum phenomena, entanglement and mixture, which are essential for building quantum computers. The researchers showed that entanglement can be generated, manipulated, and measured on a silicon chip. The chip can also measure mixture, which can be used to characterize quantum circuits. "To build a quantum computer, we not only need to be able to control complex phenomena, such as entanglement and mixture, but we need to be able to do this on a chip, so that we can scalably and practically duplicate many such miniature circuits--in much the same way as the modern computers we have today," says Bristol professor Jeremy O'Brien. "Our device enables this and we believe it is a major step forward towards optical quantum computing." The chip consists of a network of tiny channels that guide, manipulate, and interact with single photons. "It's exciting because we can perform many different experiments in a very straightforward way, using a single reconfigurable chip," says Bristol's Peter Shadbolt. The researchers are now scaling up the complexity of the device for use as a building block for quantum computers.
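
The link between the two phenomena can be illustrated numerically: a maximally entangled two-photon state is pure as a whole, yet either photon viewed alone is maximally mixed. A small numpy sketch of that textbook calculation (this is not the Bristol chip's measurement scheme, just the standard mathematics):

# Purity Tr(rho^2) is 1 for a pure state and 0.5 for a maximally mixed qubit.
import numpy as np

bell = np.array([1, 0, 0, 1]) / np.sqrt(2)      # (|00> + |11>) / sqrt(2)
rho = np.outer(bell, bell.conj())               # full two-photon density matrix

# Partial trace over the second photon: rho_A[i,j] = sum_k rho[ik, jk]
rho_A = rho.reshape(2, 2, 2, 2).trace(axis1=1, axis2=3)

print(np.trace(rho @ rho).real)      # 1.0 -> the joint state is pure
print(np.trace(rho_A @ rho_A).real)  # 0.5 -> the subsystem is maximally mixed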

Thursday, December 8, 2011

Blog: Streamlining Chip Design

Streamlining Chip Design
MIT News (12/08/11) Larry Hardesty

Massachusetts Institute of Technology (MIT) researchers have developed a system that enables hardware designers to specify, in a single programming language, all of the functions they want a device to perform. The system allows chip designers to designate which functions should run in hardware and which in software, and it automatically produces the corresponding circuit descriptions and computer code. The system is based on Bluespec, a chip-design language in which designers specify a set of rules that the chip must follow; the Bluespec toolchain then converts those specifications into Verilog code. The MIT researchers expanded the Bluespec instruction set so that it can describe more elaborate operations that are possible only in software. "What we're trying to give people is a language where they can describe the algorithm once and then play around with how the algorithm is partitioned," says MIT student Myron King.
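
King's "describe once, partition later" idea can be mimicked, very loosely, in ordinary software: one description of the algorithm, with the execution target chosen separately. A toy Python sketch, purely conceptual (the MIT system operates at the Bluespec/Verilog level, and run_in_hardware_sim here is a hypothetical stand-in for a generated circuit):

# Toy illustration of "describe the algorithm once, then play around with
# how it is partitioned." Conceptual only; not the MIT toolchain.
def fir_filter(samples, coeffs):
    # The single, shared description of the algorithm (a small FIR filter).
    n = len(coeffs)
    return [sum(coeffs[j] * samples[i - j] for j in range(n))
            for i in range(n - 1, len(samples))]

def run_in_software(samples, coeffs):
    return fir_filter(samples, coeffs)

def run_in_hardware_sim(samples, coeffs):
    # Same specification, different execution target (simulated here).
    return fir_filter(samples, coeffs)

PARTITION = {"filter": run_in_hardware_sim}   # flip targets without rewriting the algorithm
print(PARTITION["filter"]([1, 2, 3, 4], [0.5, 0.5]))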

Wednesday, December 7, 2011

Blog: White House Sets Cybersecurity R&D Priorities

White House Sets Cybersecurity R&D Priorities
InformationWeek (12/07/11) Elizabeth Montalbano

The White House has published a cybersecurity research and development (R&D) roadmap developed by the U.S. Office of Science and Technology Policy. The roadmap, the product of a seven-year effort by both public- and private-sector experts, lists four areas of R&D concentration. The first priority is inducing change by applying game-changing themes to understanding the underlying causes of current cybersecurity vulnerabilities, and devising ways to address them by disrupting the status quo. The next research priority focuses on the development of scientific foundations for cybersecurity, including laws, hypothesis testing, repeatable experimental designs, standardized data collection techniques, metrics, and common terminology. The third area of concentration entails maximizing research impact by ensuring interagency collaboration, coordination, and integration of cybersecurity improvement operations. The final priority is shortening the time it takes to put cybersecurity research into practice. "Given the magnitude and pervasiveness of cyberspace threats to our economy and national security, it is imperative that we fundamentally alter the dynamics in cybersecurity through the development of novel solutions and technologies," say U.S. chief technology officer Aneesh Chopra and White House cybersecurity coordinator Howard Schmidt.

Tuesday, December 6, 2011

Blog: System Would Monitor Feds for Signs They're 'Breaking Bad'

System Would Monitor Feds for Signs They're 'Breaking Bad'
Government Computer News (12/06/11) Kevin McCaney

Georgia Tech researchers, in collaboration with researchers at Oregon State University, the University of Massachusetts, and Carnegie Mellon University, are developing the Proactive Discovery of Insider Threats Using Graph Analysis and Learning (PRODIGAL) system. PRODIGAL is designed to scan up to 250 million text messages, emails, and file transfers to identify insider threats: employees who are about to turn against their organization. The system will integrate graph processing, anomaly detection, and relational machine learning to create a prototype Anomaly Detection at Multiple Scales (ADAMS) system. PRODIGAL, which initially would be used to monitor communications in civilian, government, and military organizations in which employees have agreed to be monitored, is intended to identify rogue individuals, according to the researchers. "Our goal is to develop a system that will provide analysts for the first time a very short, ranked list of unexplained events that should be further investigated," says Georgia Tech professor David Bader.
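
One ingredient such systems combine is plain statistical anomaly detection over communication volumes. A hedged Python sketch of that piece alone (the real system layers graph processing and relational machine learning on top; the user names and counts below are invented):

# Rank users by how far their daily message volume deviates from the
# population baseline (simple z-scores), yielding a short ranked list
# of unexplained events in the spirit of Bader's description.
import statistics

def rank_anomalies(messages_per_user):
    """messages_per_user: dict of user -> daily message count."""
    counts = list(messages_per_user.values())
    mu = statistics.mean(counts)
    sigma = statistics.stdev(counts) or 1.0     # guard against zero spread
    scores = {u: abs(c - mu) / sigma for u, c in messages_per_user.items()}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

print(rank_anomalies({"alice": 40, "bob": 35, "carol": 38, "dave": 910}))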

Monday, December 5, 2011

Blog: Creating Artificial Intelligence Based on the Real Thing

Creating Artificial Intelligence Based on the Real Thing
New York Times (12/05/11) Steve Lohr

Researchers from Cornell University, Columbia University, the University of Wisconsin, the University of California, Merced, and IBM are developing computing technology modeled on biological systems. The project recently received $21 million in funding from the U.S. Defense Advanced Research Projects Agency (DARPA), which helped lead to the development of prototype neurosynaptic microprocessors that function more like neurons and synapses than conventional semiconductors. The prototype chip has 256 neuron-like nodes, surrounded by more than 262,000 synaptic memory modules. A computer running the prototype chip has learned how to play the video game Pong and to identify the numbers one through 10 written by a human on a digital pad. The project aims to find designs, concepts, and techniques that might be borrowed from biology to push the limits of computing. The research is "the quest to engineer the mind by reverse-engineering the brain," says IBM's Dharmendra S. Modha. DARPA wants the project to produce technology that is self-organizing, that learns instead of merely responding to programming commands, and that runs on very little power. "It seems that we can build a computing architecture that is quite general-purpose and could be used for a large class of applications," says Cornell professor Rajit Manohar.
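
Neuron-like nodes of this kind are commonly approximated in software with the leaky integrate-and-fire model: each node accumulates input, leaks charge over time, and fires a spike when it crosses a threshold. A brief Python sketch of those textbook dynamics (not IBM's actual chip design; the parameters are arbitrary):

# Leaky integrate-and-fire neuron: integrate input, leak, spike, reset.
import random

def simulate_lif(input_current, threshold=1.0, leak=0.95):
    v = 0.0                               # membrane potential
    spikes = []
    for t, i_in in enumerate(input_current):
        v = leak * v + i_in               # integrate input, leak charge
        if v >= threshold:                # potential crosses the threshold
            spikes.append(t)              # emit a spike...
            v = 0.0                       # ...and reset
    return spikes

random.seed(0)
print(simulate_lif([random.uniform(0.0, 0.2) for _ in range(100)]))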

Friday, December 2, 2011

Blog: First Demonstration of Opto-Electronic Reservoir Computing

First Demonstration of Opto-Electronic Reservoir Computing
Technology Review (12/02/11)

Université Libre de Bruxelles researchers have developed a form of computing that exploits feedback loops to perform extremely fast analog calculations. The researchers found that a nonlinear feedback mechanism is essentially an information processor, because it takes a certain input and processes it to generate an output. The feedback loop acts as a kind of memory that stores information about the system's recent history, making this form of processing an analysis of a small segment of the recent past. The researchers, led by Yvan Paquot, are working on reservoir computing, in which a large number of nodes are randomly connected. Each node is a kind of nonlinear feedback loop; the inputs are fed into randomly chosen nodes in the reservoir, and the outputs are taken from other randomly chosen nodes. The researchers say the reservoir network is similar to a neural network, except that only the output weights are adjusted during training, which makes training much simpler than for a conventional neural network. "Our experiment is the first implementation of reservoir computing fast enough for real time information processing," Paquot says.
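
A software analogue, the echo state network, makes the training shortcut concrete: the reservoir's weights are random and fixed, and only a linear readout is fit. A minimal numpy sketch (the Brussels experiment realizes the reservoir in opto-electronic hardware rather than code; the sizes and the 3-step recall task below are arbitrary choices):

# Echo state network: random fixed reservoir, trained linear readout.
import numpy as np

rng = np.random.default_rng(1)
n_res, T = 100, 1000

u = rng.uniform(-1, 1, T)                       # input signal
target = np.roll(u, 3)                          # task: recall the input 3 steps back

W_in = rng.uniform(-0.5, 0.5, n_res)            # fixed random input weights
W = rng.normal(0, 1, (n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W))) # keep spectral radius below 1

x = np.zeros(n_res)
states = np.empty((T, n_res))
for t in range(T):                              # the nonlinear feedback loop:
    x = np.tanh(W @ x + W_in * u[t])            # state mixes input with recent history
    states[t] = x

# Only the readout is trained (ridge regression); reservoir weights never change.
washout = 50                                    # discard initial transient
A, y = states[washout:], target[washout:]
W_out = np.linalg.solve(A.T @ A + 1e-6 * np.eye(n_res), A.T @ y)
print(np.mean((A @ W_out - y) ** 2))            # small error on the recall task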

Blog: U.S. Intelligence Group Seeks Machine Learning Breakthroughs

U.S. Intelligence Group Seeks Machine Learning Breakthroughs
Network World (12/02/11) Michael Cooney

The U.S. Intelligence Advanced Research Projects Activity (IARPA) announced that it is looking for new ideas that may become the basis of cutting-edge machine-learning projects. "In many application areas, the amount of data to be analyzed has been increasing exponentially [sensors, audio and video, social network data, Web information], stressing even the most efficient procedures and most powerful processors," according to IARPA. "Most of these data are unorganized and unlabeled and human effort is needed for annotation and to focus attention on those data that are significant." IARPA's request for information asks about proposed methods for the automation of architecture and algorithm selection and combination, feature engineering, and training data scheduling, as well as compelling reasons to use such approaches in a scalable multi-modal analytic system and whether supporting technologies are readily available. IARPA says that innovations in hierarchical architectures such as Deep Belief Nets and hierarchical clustering will be needed for useful automatic machine-learning systems. It wants to identify promising areas for investment and plans to hold a machine learning workshop in March 2012.
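
The automation the RFI describes, choosing architectures and algorithms by measured performance rather than analyst judgment, looks roughly like this in miniature. A hedged scikit-learn sketch (the RFI names no toolkit; the candidate models and parameter grids below are invented for illustration):

# Automated algorithm and hyperparameter selection by cross-validated score.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, random_state=0)

candidates = [
    (LogisticRegression(max_iter=1000), {"C": [0.1, 1.0, 10.0]}),
    (DecisionTreeClassifier(), {"max_depth": [3, 5, None]}),
]

best = None
for model, grid in candidates:
    search = GridSearchCV(model, grid, cv=5).fit(X, y)   # try each configuration
    if best is None or search.best_score_ > best.best_score_:
        best = search                                    # keep the top performer
print(best.best_estimator_, best.best_score_)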
