Monday, April 2, 2012

Blog: UMass Amherst Computer Scientist Leads the Way to the Next Revolution in Artificial Intelligence

UMass Amherst Computer Scientist Leads the Way to the Next Revolution in Artificial Intelligence
University of Massachusetts Amherst (04/02/12) Janet Lathrop

University of Massachusetts Amherst researchers are translating "Super-Turing" computation into an adaptable computational system that learns and evolves, using input from the environment the same way human brains do. The model "is a mathematical formulation of the brain's neural networks with their adaptive abilities," says Amherst computer scientist Hava Siegelmann. When installed in a new environment, the Super-Turing model exhibits an exponentially greater set of behaviors than a classical computer or the original Turing model. The researchers say the new Super-Turing machine will be flexible, adaptable, and economical. "The Super-Turing framework allows a stimulus to actually change the computer at each computational step, behaving in a way much closer to that of the constantly adapting and evolving brain," Siegelmann says.

Friday, March 30, 2012

Blog: Sanjeev Arora Named Winner of 2011 ACM-Infosys Award

Sanjeev Arora Named Winner of 2011 ACM-Infosys Award
CCC Blog (03/30/12) Erwin Gianchandani

Princeton University professor Sanjeev Arora has received the 2011 ACM-Infosys Foundation Award in Computing Sciences for his contributions to computational complexity, algorithms, and optimization. "Arora's research revolutionized the approach to essentially unsolvable problems that have long bedeviled the computing field, the so-called NP-complete problems," according to an ACM-Infosys press release. Arora is an ACM Fellow and won the Gödel Prize in 2001 and 2010, as well as the ACM Doctoral Dissertation Award in 1995. Arora also is the founding director of Princeton's Center for Computational Intractability, which addresses the phenomenon that many problems seem inherently impossible to solve on current computational models. "With his new tools and techniques, Arora has developed a fundamentally new way of thinking about how to solve problems," says ACM President Alain Chesnais. "In particular, his work on the PCP theorem is considered the most important development in computational complexity theory in the last 30 years. He also perceived the practical applications of his work, which has moved computational theory into the realm of real world uses." The ACM-Infosys Foundation Award recognizes personal contributions by young scientists and system developers to a contemporary innovation and includes a $175,000 prize.

Tuesday, March 27, 2012

Blog: Algorithm Spells the End for Professional Musical Instrument Tuners

Algorithm Spells the End for Professional Musical Instrument Tuners
Technology Review (03/27/12)

University of Würzburg researcher Haye Hinrichsen says he has developed an algorithm that makes it possible for electronic tuners to match the performance of the best human tuners. Hinrichsen's algorithm is based on a process known as entropy minimization. It starts from an equal-temperament tuning and divides the audio spectrum into bins at a resolution matching that of the human ear. The method then measures the entropy of the resulting spectrum, applies a small random change to the frequency of one note, and measures the entropy again, keeping the change whenever the entropy decreases. Hinrichsen says the algorithm's results are comparable to the work of a professional tuner, and he notes that the software can be added to the features of relatively inexpensive electronic tuners. "The implementation of the method is very easy," Hinrichsen says.
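
A minimal sketch of that minimization loop is below. The partial model (a 1/k amplitude rolloff and a small string-stiffness constant) and all numeric parameters are illustrative assumptions, not Hinrichsen's actual values:

```python
import numpy as np

rng = np.random.default_rng(0)

# 88 piano keys in 12-tone equal temperament, A4 (key 49) = 440 Hz.
freqs = 440.0 * 2.0 ** ((np.arange(88) - 48) / 12.0)
BINS = np.geomspace(20.0, 20000.0, 12001)  # log-spaced, roughly ear-resolution

def spectrum_entropy(freqs, n_partials=10, stiffness=1e-4):
    """Shannon entropy of the combined power spectrum of all notes.

    Partials follow f_k = k*f*sqrt(1 + stiffness*k^2) with a 1/k
    amplitude rolloff; both choices are illustrative assumptions."""
    power = np.zeros(len(BINS) - 1)
    k = np.arange(1, n_partials + 1)
    for f in freqs:
        partials = k * f * np.sqrt(1.0 + stiffness * k ** 2)
        idx = np.searchsorted(BINS, partials) - 1
        ok = (idx >= 0) & (idx < len(power))
        np.add.at(power, idx[ok], (1.0 / k[ok]) ** 2)
    p = power / power.sum()
    p = p[p > 0]
    return -np.sum(p * np.log(p))

# Minimization loop: nudge one note by about a cent, keep the change
# only if the entropy of the overall spectrum decreases.
best = spectrum_entropy(freqs)
for _ in range(2000):
    i = rng.integers(88)
    trial = freqs.copy()
    trial[i] *= 2.0 ** (rng.normal() / 1200.0)
    e = spectrum_entropy(trial)
    if e < best:
        freqs, best = trial, e

print("entropy after tuning:", round(best, 4))
```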

Monday, March 26, 2012

Blog: Robots to Organise Themselves Like a Swarm of Insects

Robots to Organise Themselves Like a Swarm of Insects
The Engineer (United Kingdom) (03/26/12)

A swarm of insects is the inspiration for a warehouse transport system that makes use of autonomous robotic vehicles. Researchers at the Fraunhofer Institute for Material Flow and Logistics (IML) have developed autonomous Multishuttle Moves vehicles to organize themselves like insects. The team is testing 50 shuttles at a replica warehouse. When an order is received, the shuttles communicate with one another via a wireless Internet connection and the closest free vehicle takes over and completes the task. "We rely on agent-based software and use ant algorithms based on the work of [swarm robotics expert] Marco Dorigo," says IML's Thomas Albrecht. The vehicles move around using a hybrid sensor concept based on radio signals, distance and acceleration sensors, and laser sensors to calculate the shortest route to any destination and avoid collisions. Albrecht says the system is more flexible and scalable because it can be easily adapted for smaller or larger areas based on changes in demand. "In the future, transport systems should be able to perform all of these tasks autonomously, from removal from storage at the shelf to delivery to a picking station," says IML professor Michael ten Hompel.
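
Dorigo-style ant algorithms of the kind Albrecht mentions choose routes probabilistically in proportion to pheromone deposited on short paths. A toy sketch follows; the warehouse layout, edge weights, and parameters are invented, not IML's system:

```python
import random

random.seed(7)

# Toy warehouse layout as a weighted graph: nodes are waypoints,
# edge weights are travel times (all values illustrative).
GRAPH = {
    "dock": {"a": 2.0, "b": 4.0},
    "a": {"dock": 2.0, "shelf": 5.0},
    "b": {"dock": 4.0, "shelf": 1.0},
    "shelf": {"a": 5.0, "b": 1.0},
}
pheromone = {(u, v): 1.0 for u in GRAPH for v in GRAPH[u]}

def ant_walk(start, goal):
    """One ant walks from start to goal, biased by pheromone/distance."""
    path, node = [start], start
    while node != goal:
        options = [v for v in GRAPH[node] if v not in path] or list(GRAPH[node])
        weights = [pheromone[(node, v)] / GRAPH[node][v] for v in options]
        node = random.choices(options, weights=weights)[0]
        path.append(node)
    return path

def path_cost(path):
    return sum(GRAPH[u][v] for u, v in zip(path, path[1:]))

for _ in range(200):
    path = ant_walk("dock", "shelf")
    deposit = 1.0 / path_cost(path)   # shorter paths earn more pheromone
    for u, v in zip(path, path[1:]):
        pheromone[(u, v)] += deposit
    for e in pheromone:               # evaporation keeps the system adaptive
        pheromone[e] *= 0.99

print(ant_walk("dock", "shelf"))      # usually dock -> b -> shelf
```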

Wednesday, March 21, 2012

Blog: Computer Model of Spread of Dementia Can Predict Future Disease Patterns Years Before They Occur in a Patient

Computer Model of Spread of Dementia Can Predict Future Disease Patterns Years Before They Occur in a Patient
Cornell News (03/21/12) Richard Pietzak

Weill Cornell Medical College researchers have developed software that tracks the manner in which different forms of dementia spread within a human brain. The model can be used to predict where and when a person's brain will suffer from the spread of toxic proteins, a process that underlies all forms of dementia. The findings could help patients and their families confirm a diagnosis of dementia and prepare for future cognitive decline. "Our model, when applied to the baseline magnetic resonance imaging scan of an individual brain, can similarly produce a future map of degeneration in that person over the next few years or decades," says Cornell's Ashish Raj. The computational model validates the idea that dementia is caused by proteins that spread through the brain along networks of neurons. Raj says the program models the same process by which any gas diffuses in air, except that in the case of dementia, the diffusion process occurs along connected neural fiber tracts in the brain. "While the classic patterns of dementia are well known, this is the first model to relate brain network properties to the patterns and explain them in a deterministic and predictive manner," he says.
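
Diffusion along a network, as Raj describes it, has the standard form x(t) = exp(-beta*L*t) x(0), where L is the graph Laplacian of the connectivity network. A hedged toy sketch; the random graph and diffusivity constant are illustrative stand-ins for the paper's tractography data:

```python
import numpy as np

# Toy symmetric "connectome": weighted adjacency over n brain regions
# (random and illustrative; the real model uses tractography networks).
rng = np.random.default_rng(1)
n = 6
A = rng.random((n, n)); A = (A + A.T) / 2.0; np.fill_diagonal(A, 0.0)

L = np.diag(A.sum(axis=1)) - A        # graph Laplacian of the network
beta = 0.5                            # diffusivity constant (assumed)
x0 = np.zeros(n); x0[0] = 1.0         # seed pathology in one region

# Diffusion along fiber tracts: x(t) = exp(-beta * L * t) @ x0,
# computed via the eigendecomposition of the symmetric Laplacian.
w, U = np.linalg.eigh(L)
def pattern(t):
    return U @ (np.exp(-beta * w * t) * (U.T @ x0))

for t in (1.0, 5.0, 20.0):
    print(f"t={t:>5}: {np.round(pattern(t), 3)}")
```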

Wednesday, December 21, 2011

Blog: Computer Scientists Create Algorithm That Measures Human Pecking Order

Computer Scientists Create Algorithm That Measures Human Pecking Order
Technology Review (12/21/11)

Cornell University's Jon Kleinberg, who developed the Hyperlink-Induced Topic Search (HITS) algorithm that led to Google's PageRank search algorithm, has developed a method for measuring power differences between individuals using the patterns of words they speak or write. "We show that in group discussions, power differentials between participants are subtly revealed by how much one individual immediately echoes the linguistic style of the person they are responding to," Kleinberg says. The key to the technique is linguistic coordination, in which speakers naturally copy the style of their interlocutors. The Cornell researchers focused on function words that provide a grammatical framework for sentences but carry little meaning themselves, such as articles, auxiliary verbs, conjunctions, and high-frequency adverbs. The researchers studied editorial discussions between Wikipedia editors and transcripts of oral arguments in the U.S. Supreme Court. By looking at the changes in linguistic style that occur when people make the transition from non-admin to admin roles on Wikipedia, the researchers show that the pattern of linguistic coordination changes too. A similar effect occurs in the Supreme Court. "Our work is the first to identify connections between language coordination and social power relations at large scales, and across a diverse set of individuals and domains," Kleinberg says.
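
A simplified coordination measure can be computed as the lift in a replier's marker use when the previous speaker used that marker. The marker sets and toy exchanges below are illustrative, not the Cornell study's data:

```python
FUNCTION_WORDS = {
    "article": {"a", "an", "the"},
    "conjunction": {"and", "but", "or", "so"},
    "auxiliary": {"is", "are", "was", "be", "have", "do"},
}

def uses(utterance, marker):
    return bool(set(utterance.lower().split()) & FUNCTION_WORDS[marker])

def coordination(exchanges, marker):
    """How much replier b echoes marker use by speaker a, over
    (a_utterance, b_reply) pairs: P(b uses m | a used m) - P(b uses m)."""
    given_a = [uses(b, marker) for a, b in exchanges if uses(a, marker)]
    overall = [uses(b, marker) for a, b in exchanges]
    if not given_a:
        return 0.0
    return sum(given_a) / len(given_a) - sum(overall) / len(overall)

exchanges = [
    ("the motion is granted", "the court agrees and so orders"),
    ("proceed", "we will proceed"),
    ("is the record complete", "it is, and the brief was filed"),
]
for m in FUNCTION_WORDS:
    print(m, round(coordination(exchanges, m), 3))
```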

Saturday, November 19, 2011

Blog: Google's Search Algorithm Challenged

Google's Search Algorithm Challenged
IDG News Service (11/19/11) Philip Willan

Padua University professor Massimo Marchiori is leading the development of Volunia, a new search engine that could challenge Google's search algorithm and lead to radically different search engines in the future. "It's not just Google plus 10 percent. It's a different perspective," says Marchiori, who contributed to the development of Google's search algorithm. "It's a new radical view of what a search engine of the future could be." Volunia's Web site allows visitors to sign up for a chance to test the beta version of the search engine, which will be launched in 12 languages by the end of the year. "If I didn't think it was something big, capable of competing with the giants of online search, I would never have got involved," Marchiori says. The project is headquartered in Padua, with funding being supplied by Sardinian entrepreneur Mariano Pireddu. "The difference of our search engine is what will enable us to emerge," Marchiori says. Pireddu says the Volunia researchers are not attempting to build a better search engine than Google's, but rather they are trying to create a different kind of search engine that can work alongside Google's.

Thursday, November 17, 2011

Blog: Smart Swarms of Bacteria Inspire Robotics Researchers

Smart Swarms of Bacteria Inspire Robotics Researchers
American Friends of Tel Aviv University (11/17/11)

Tel Aviv University (TAU) researchers have developed a computational model that describes how bacteria move in a swarm, a discovery they say could be applied to computers, artificial intelligence, and robotics. The model shows how bacteria collectively gather information about their environment and find an optimal plan for growth. The research could enable scientists to design smart robots that can form intelligent swarms, help in the development of medical micro-robots, or decode social network systems to find information on consumer preferences. "When an individual bacterium finds a more beneficial path, it pays less attention to the signals from the other cells, [and] since each of the cells adopts the same strategy, the group as a whole is able to find an optimal trajectory in an extremely complex terrain," says TAU Ph.D. student Adi Shklarsh. The model shows how a swarm can perform optimally with only simple computational abilities and short-term memory, Shklarsh says. He notes that understanding the secrets of bacteria swarms can provide crucial hints toward the design of robots that are programmed to perform adjustable interactions without needing as much data or memory.
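
The rule Shklarsh describes, that agents sensing improvement ignore the group while struggling agents follow it, can be sketched as follows. The nutrient field, the weights, and the all-to-all communication are simplifying assumptions, not the TAU model's details:

```python
import numpy as np

rng = np.random.default_rng(2)

def food(pos):
    """Smooth nutrient field with a single peak at (5, 5) (illustrative)."""
    return np.exp(-np.sum((pos - 5.0) ** 2, axis=-1) / 10.0)

n, steps = 30, 200
pos = rng.random((n, 2)) * 2.0
vel = rng.normal(size=(n, 2))
vel /= np.linalg.norm(vel, axis=1, keepdims=True)
prev = food(pos)

for _ in range(steps):
    cur = food(pos)
    improving = cur > prev            # is my own path getting better?
    mean_dir = vel.mean(axis=0)
    mean_dir /= np.linalg.norm(mean_dir) + 1e-12
    # Agents on a beneficial path pay little attention to the group;
    # struggling agents mostly follow the group's average heading.
    w = np.where(improving, 0.1, 0.9)[:, None]
    vel = (1 - w) * vel + w * mean_dir
    vel /= np.linalg.norm(vel, axis=1, keepdims=True) + 1e-12
    pos += 0.1 * vel
    prev = cur

print("mean distance to food peak:", np.linalg.norm(pos - 5.0, axis=1).mean())
```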

Friday, November 11, 2011

Blog: Stanford Joins BrainGate Team Developing Brain-Computer Interface to Aid People With Paralysis

Stanford Joins BrainGate Team Developing Brain-Computer Interface to Aid People With Paralysis
Stanford University (11/11/11) Tanya Lewis

Stanford University researchers have joined the BrainGate research project, which is investigating the feasibility of people with paralysis using a technology that interfaces directly with the brain to control computer cursors, robotic arms, and other assistive devices. The project is based on technology developed by researchers at Brown and Harvard universities, Massachusetts General Hospital, and the Providence Veterans Affairs Medical Center. BrainGate is a hardware/software-based system that senses electrical signals in the brain that control movement. Computer algorithms translate the signals into instructions that enable users with paralysis to control external devices. "One of the biggest contributions that Stanford can offer is our expertise in algorithms to decode what the brain is doing and turn it into action," says Stanford's Jaimie Henderson. He is working with Stanford professor Krishna Shenoy, who is focusing on understanding how the brain controls movement and translating that knowledge into neural prosthetic systems controlled by software. "The BrainGate program has been a model of innovation and teamwork as it has taken the first giant steps toward turning potentially life-changing technology into a reality," Shenoy says. The researchers recently showed that the system allowed a patient to control a computer cursor more than 1,000 days after implantation.
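
As a rough illustration of the decoding step Henderson mentions, the sketch below fits a linear (ridge-regression) map from firing rates to cursor velocity. BrainGate's actual decoders are more sophisticated (Kalman-filter variants among them), and all data here are synthetic:

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy training data: firing rates of 96 channels over T time bins,
# paired with the 2-D cursor velocity being produced (simulated here;
# in a real system these come from recorded calibration sessions).
T, C = 500, 96
true_W = rng.normal(size=(C, 2))
rates = rng.poisson(5.0, size=(T, C)).astype(float)
vel = rates @ true_W / C + rng.normal(0.0, 0.1, size=(T, 2))

# Ridge-regression decoder: velocity ~ rates @ W
lam = 1.0
W = np.linalg.solve(rates.T @ rates + lam * np.eye(C), rates.T @ vel)

# Decode a new bin of neural activity into a cursor command.
new_rates = rng.poisson(5.0, size=(1, C)).astype(float)
print("decoded velocity:", new_rates @ W)
```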

Tuesday, October 25, 2011

Blog: How Revolutionary Tools Cracked a 1700s Code

How Revolutionary Tools Cracked a 1700s Code
New York Times (10/25/11) John Markoff

A cipher dating back to the 18th century that was considered uncrackable was finally decrypted by a team of Swedish and U.S. linguists by using statistics-based translation methods. After a false start, the team determined that the Copiale Cipher was a homophonic cipher and attempted to decode all the symbols in German, as the manuscript was originally discovered in Germany. Their first step was finding regularly occurring symbols that might stand for the common German pair "ch." Once a potential "c" and "h" were found, the researchers used patterns in German to decode the cipher one step at a time. Language translation techniques such as expected word frequency were used to guess a symbol's equivalent in German. However, there are other, more impenetrable ciphers that have thwarted even the translators of the Copiale Cipher. The Voynich manuscript has been categorized as the most frustrating of such ciphers, but one member of the team that cracked the Copiale manuscript, the University of Southern California's Kevin Knight, co-published an analysis of the Voynich document pointing to evidence that it contains patterns that match the structure of natural language.
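
The first statistical step, aligning frequent cipher symbols with frequent plaintext letters before refining with digraphs such as "ch," might look like this sketch. The symbol names and the frequency table are illustrative, not the team's data:

```python
from collections import Counter

# Illustrative ciphertext over abstract symbols (the real Copiale
# Cipher uses ~90 glyphs; these token names are made up).
ciphertext = "s1 s2 s3 s1 s4 s2 s1 s5 s2 s3 s1 s2".split()

# Rough German letter frequencies (percent), most common first.
GERMAN_FREQ = [("e", 17.4), ("n", 9.8), ("i", 7.6), ("s", 7.3), ("r", 7.0)]

counts = Counter(ciphertext)
total = sum(counts.values())

# First-pass hypothesis: match the most frequent cipher symbols to the
# most frequent plaintext letters, then refine with digraph patterns.
for (sym, c), (letter, f) in zip(counts.most_common(), GERMAN_FREQ):
    print(f"{sym}: {100 * c / total:.1f}%  ->  candidate '{letter}' ({f}%)")
```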

Wednesday, October 12, 2011

Blog: Cops on the Trail of Crimes That Haven't Happened

Cops on the Trail of Crimes That Haven't Happened
New Scientist (10/12/11) Melissae Fellet

The Santa Cruz, Calif., police department recently started field-testing Santa Clara University-developed software that analyzes where crime is likely to be committed. The software uses the locations of past incidents to highlight likely future crime scenes, enabling police to target and patrol those areas with the hope that their presence might stop the crimes from happening in the first place. The program, developed by Santa Clara researcher George Mohler, predicted the location and time of 25 percent of burglaries that occurred on any particular day in an area of Los Angeles in 2004 and 2005, using just the data on burglaries that had occurred before that day. The Santa Cruz police department is using the software to monitor 10 areas for residential burglaries, auto burglaries, and auto theft. If the program proves to be effective in thwarting crime in areas that are known for their high crime rates, it can be applied to other cities, says University of California, Los Angeles researcher Jeffrey Brantingham, who collaborated on the algorithm's development.
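
Mohler's published crime models are self-exciting point processes, in which each past incident temporarily raises the predicted rate nearby. A hedged sketch with made-up events and unfitted parameters:

```python
import math

# Past burglaries: (x, y, t) in km and days (made-up data).
events = [(0.0, 0.0, 0.0), (0.1, 0.2, 2.0), (5.0, 5.0, 1.0), (0.2, 0.1, 4.0)]

def intensity(x, y, t, mu=0.1, theta=0.5, omega=0.3, sigma=0.3):
    """Self-exciting (Hawkes-style) crime rate: a background rate plus
    a contribution from each past event that decays in time and space.
    All parameters here are illustrative, not fitted values."""
    lam = mu
    for xi, yi, ti in events:
        if ti < t:
            dt = t - ti
            d2 = (x - xi) ** 2 + (y - yi) ** 2
            lam += (theta * omega * math.exp(-omega * dt)
                    * math.exp(-d2 / (2 * sigma ** 2))
                    / (2 * math.pi * sigma ** 2))
    return lam

# Rank candidate patrol cells by predicted intensity for tomorrow.
cells = [(i * 0.5, j * 0.5) for i in range(12) for j in range(12)]
hot = sorted(cells, key=lambda c: intensity(c[0], c[1], t=5.0), reverse=True)
print("top cells to patrol:", hot[:3])
```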

Tuesday, October 11, 2011

Blog: "Ghostwriting" the Torah?

"Ghostwriting" the Torah?
American Friends of Tel Aviv University (10/11/11)

Tel Aviv University (TAU) researchers have developed a computer algorithm that could help identify the different sources that contributed to the individual books of the Bible. The algorithm, developed by TAU professor Nachum Dershowitz, recognizes linguistic cues, such as word preference, to divide texts into probable author groupings. The researchers focused on writing style instead of subject or genre to avoid some of the problems that have vexed Bible scholars in the past, such as a lack of objectivity and complications caused by the multiple genres and literary forms found in the Bible. The software searches for and compares details that human scholars might have difficulty detecting, such as the frequency of the use of function words and synonyms, according to Dershowitz. The researchers tested the software by randomly mixing passages from the Hebrew books of Jeremiah and Ezekiel, and instructing the computer to separate them. The program was able to separate the passages with 99 percent accuracy, in addition to separating "priestly" materials from "non-priestly" materials. "If the computer can find features that Bible scholars haven't noticed before, it adds new dimensions to their scholarship," Dershowitz says.
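
One way to see the idea is to represent each passage as a vector of function-word frequencies and cluster the vectors. The sketch below uses English stand-in passages and a tiny 2-means loop; the real system worked on Hebrew text with richer features such as synonym choice:

```python
import numpy as np

FUNCTION_WORDS = ["the", "and", "of", "in", "to", "that", "with", "for"]

def profile(passage):
    """Frequency of each function word per word of text."""
    words = passage.lower().split()
    return np.array([words.count(w) for w in FUNCTION_WORDS]) / max(len(words), 1)

# Mixed passages from two hypothetical sources (stand-ins for the
# shuffled Jeremiah/Ezekiel experiment).
passages = [
    "the word of the lord came to me in the night",
    "and it came to pass that the city fell",
    "for the hand of the lord was with him in that place",
    "and the people cried out and the walls shook",
]
X = np.stack([profile(p) for p in passages])

# Tiny 2-means clustering to split passages into two author groups.
rng = np.random.default_rng(4)
centers = X[rng.choice(len(X), 2, replace=False)]
for _ in range(20):
    labels = np.argmin(((X[:, None] - centers) ** 2).sum(axis=2), axis=1)
    for k in range(2):
        if (labels == k).any():
            centers[k] = X[labels == k].mean(axis=0)
print("author grouping:", labels)
```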

Wednesday, September 21, 2011

Blog: Novel High-Performance Hybrid System for Semantic Factoring of Graph Databases

Novel High-Performance Hybrid System for Semantic Factoring of Graph Databases
Pacific Northwest National Laboratory (09/21/11) Kathryn Lang; Christine Novak

Researchers at the Pacific Northwest National Laboratory, Sandia National Laboratories, and Cray have developed an application that can analyze gigabyte-sized datasets. The application uses semantic factoring to organize data, revealing hidden connections and threads. The researchers used the application to analyze the massive datasets for the Billion Triple Challenge, an international competition aimed at demonstrating capability and innovation for dealing with very large semantic graph databases. The researchers utilized the Cray XMT architecture, which allowed all 624 gigabytes of input data to be held in RAM. The researchers are now developing a prototype that can be adapted to a variety of application domains and datasets, including working with the bio2rdf.org and future billion-triple-challenge datasets in prototype testing and evaluation.
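
The summary does not spell out the factoring algorithm, but the flavor of the approach, dictionary-encoding terms and splitting a triple graph by relation type, can be sketched as follows (all triples illustrative):

```python
from collections import defaultdict

# A few RDF-style triples (subject, predicate, object); illustrative.
triples = [
    ("alice", "knows", "bob"),
    ("bob", "knows", "carol"),
    ("alice", "worksAt", "acme"),
    ("carol", "worksAt", "acme"),
]

# Dictionary-encode terms to integers so a billion-triple graph can be
# stored as compact integer arrays (the Cray XMT held the whole input
# in RAM; this mimics that layout in miniature).
ids, terms = {}, []
def enc(t):
    if t not in ids:
        ids[t] = len(terms)
        terms.append(t)
    return ids[t]

# "Factor" the graph by predicate: one edge list per relation type,
# which exposes structure such as everyone linked through "worksAt".
factored = defaultdict(list)
for s, p, o in triples:
    factored[p].append((enc(s), enc(o)))

for p, edges in factored.items():
    print(p, [(terms[s], terms[o]) for s, o in edges])
```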

Thursday, August 4, 2011

Blog: Wireless Network in Hospital Monitors Vital Signs

Wireless Network in Hospital Monitors Vital Signs
Washington University in St. Louis (08/04/11) Diana Lutz

Washington University in St. Louis researchers launched a prototype sensor network in Barnes-Jewish Hospital, with the goal of creating a wireless virtual intensive-care unit in which the patients are free to move around. When the system is fully operational, sensors will monitor the blood oxygenation level and heart rate of at-risk patients once or twice a minute. The data is transmitted to a base station and combined with other data in the patient's electronic medical record. The data is analyzed by a machine-learning algorithm that looks for signs of clinical deterioration, alerting nurses to check on patients when those signs are found. The clinical warning system is part of a new wireless health field that could change the future of medicine, says Washington University in St. Louis computer scientist Chenyang Lu. In developing the system, the researchers focused on ensuring that the network would never fail. The relay nodes are programmed as a self-organizing mesh network, so that if one node fails the data will be rerouted along another path. At the end of the trial, the researchers found that data were reliably received more than 99 percent of the time and the sensing reliability was 81 percent.
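
The alerting step might look like the sketch below: score each new reading with a model trained on past patients and page a nurse above a threshold. The logistic model and its coefficients are assumptions standing in for whatever classifier the team actually trained:

```python
import math

# Assumed coefficients: risk rises as SpO2 falls and heart rate climbs.
W_SPO2, W_HR, BIAS = -0.4, 0.1, 26.0

def deterioration_risk(spo2, heart_rate):
    """Logistic risk score from two vital signs (illustrative model)."""
    z = W_SPO2 * spo2 + W_HR * heart_rate + BIAS
    return 1.0 / (1.0 + math.exp(-z))

def on_reading(patient_id, spo2, heart_rate, threshold=0.8):
    risk = deterioration_risk(spo2, heart_rate)
    if risk > threshold:
        print(f"ALERT: check patient {patient_id} (risk {risk:.2f})")

on_reading("bed-12", spo2=97, heart_rate=72)   # normal vitals: no alert
on_reading("bed-12", spo2=88, heart_rate=118)  # deteriorating: alert
```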

Thursday, July 21, 2011

Blog: Prof Says Tech Entering the Age of the Algorithm

Prof Says Tech Entering the Age of the Algorithm
University of Texas at Dallas (TX) (07/21/11) David Moore

University of Texas at Dallas (UTD) professor Andras Farago thinks that as algorithms become more important to software development, educational and career opportunities will follow. Farago says the rise in the importance of algorithms mirrors the life cycle of software, which originally was viewed as a secondary feature to hardware. "In a sense, algorithms up until very recently have had the same relationship to software implementation as software previously had to hardware: Icing on the cake," he says. However, Farago says there recently have been more cases, such as the Heritage Provider Network's $3 million prize, in which the hardest part is finding the perfect algorithm. "Once it is found, the implementation can be done by any skilled team, and I believe this may show the emergence of a trend in which the industry starts recognizing the real, hard value of sophisticated algorithms," he says. As part of the Heritage contest, participants are trying to design the algorithm that best predicts which people are more likely to require hospitalization in the future.

Tuesday, June 7, 2011

Blog: Watson's Lead Developer: 'Deep Analysis, Speed, and Results'

Watson's Lead Developer: 'Deep Analysis, Speed, and Results'
Computing Community Consortium (06/07/11) Erwin Gianchandani

The challenge of creating a machine that could compete on Jeopardy! was the genesis of IBM's Watson supercomputer, said Watson lead developer David Ferrucci in a keynote speech at ACM's 2011 Federated Computing Research Conference. He said the challenge involved balancing computer programs that are natively explicit, fast, and exacting in their calculations with natural language that is implicit, innately contextual, and frequently inaccurate. Ferrucci also said that it offered a "compelling and notable way to drive and measure technology of automatic question answering along five key dimensions: Broad, open domain; complex language; high precision; accurate confidence; and high speed." An analysis of tens of thousands of randomly sampled Jeopardy! clues revealed that the most common clue type occurred only 3 percent of the time, and this finding led to several guiding principles for Watson's development, including the unsuitability of specific large, hand-crafted models, the need to derive intelligence from many diverse techniques, and the primary requirement of massive parallelism. Ferrucci's team eventually designed a system that produces and scores numerous hypotheses using thousands of natural language processing, information retrieval, machine learning, and reasoning algorithms in combination. The research expanded the field of computer science by pursuing goal-oriented system-level metrics and longer-term incentives.
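
The generate-and-score pattern Ferrucci describes can be caricatured in a few lines: independent scorers judge every candidate answer, and a weighted combination yields a best answer plus a confidence. The scorers, weights, and evidence below are invented placeholders, not Watson's:

```python
# Illustrative "evidence" passages retrieved per candidate answer;
# the real system retrieved these from its corpus at question time.
EVIDENCE = {
    "Paris": "Paris hosted the 1900 world's fair",
    "Chicago": "Chicago hosted the 1893 world's fair",
}
KNOWN_CITIES = {"Paris", "Chicago"}

def evidence_overlap(clue, candidate):
    passage = EVIDENCE.get(candidate, "")
    return len(set(clue.lower().split()) & set(passage.lower().split()))

def type_match(clue, candidate):
    # Crude answer-type check: the clue asks for a city.
    return 1.0 if "city" in clue.lower() and candidate in KNOWN_CITIES else 0.0

SCORERS, WEIGHTS = [evidence_overlap, type_match], [0.7, 0.3]

def answer(clue, candidates):
    scored = sorted(
        ((sum(w * f(clue, c) for w, f in zip(WEIGHTS, SCORERS)), c)
         for c in candidates),
        reverse=True)
    # Answer only when confident: gap between best and runner-up.
    confidence = scored[0][0] - (scored[1][0] if len(scored) > 1 else 0.0)
    return scored[0][1], confidence

print(answer("this city hosted the 1900 world's fair", ["Paris", "Chicago"]))
```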

Thursday, May 12, 2011

Blog: How to Control Complex Networks

How to Control Complex Networks
MIT News (05/12/11) Anne Trafton

Researchers at the Massachusetts Institute of Technology (MIT) and Northeastern University have developed a computational model that can analyze any type of complex network and find the critical points that can be used to control the entire system. The method can be used to reprogram adult cells and identify new drug targets, among other applications, says MIT professor Jean-Jacques Slotine. The researchers developed an algorithm that determines how many nodes in a network need to be controlled for total network control. They then adapted the algorithm to show how many points are needed and where those points are located. The number of points needed depends on the network's degree distribution, which describes the number of connections per node. The researchers applied their model to several real-life networks, including cell phone networks, social networks, and neuronal networks, calculating the percentage of points that need to be controlled in order to gain total control of the system. The researchers found that sparse networks require a higher number of controlled points than denser networks. "The area of control of networks is a very important one, and although much work has been done in this area, there are a number of open problems of outstanding practical significance," says Northeastern professor Adilson Motter.
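
The published method behind this work reduces the question to a maximum matching on a bipartite copy of the network: nodes whose in-copies are left unmatched must be driven directly. A sketch on a toy graph, assuming the networkx library is available:

```python
import networkx as nx

# Toy directed network (illustrative).
edges = [(1, 2), (2, 3), (3, 4), (2, 5), (5, 6)]
nodes = {1, 2, 3, 4, 5, 6}

# Bipartite representation: an out-copy of each node on the left, an
# in-copy on the right. Matched edges can "pass control" downstream;
# any node whose in-copy is unmatched needs its own control input.
B = nx.Graph()
B.add_nodes_from((f"out{u}" for u in nodes), bipartite=0)
B.add_nodes_from((f"in{v}" for v in nodes), bipartite=1)
B.add_edges_from((f"out{u}", f"in{v}") for u, v in edges)

matching = nx.bipartite.maximum_matching(B, top_nodes={f"out{u}" for u in nodes})
matched_targets = {v for v in matching if v.startswith("in")}
drivers = {v for v in nodes if f"in{v}" not in matched_targets}
print("driver nodes:", sorted(drivers) or "any single node")
```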

Wednesday, April 13, 2011

Blog: Programming Regret for Google

Programming Regret for Google
American Friends of Tel Aviv University (04/13/11)

Tel Aviv University researchers recently launched a project aimed at developing new algorithms that will help computers minimize the distance between a desired outcome and the actual outcome, or what Tel Aviv professor Yishay Mansour calls regret. Google plans to fund the research, which is on the cutting edge of computer science and game theory. "If the servers and routing systems of the Internet could see and evaluate all the relevant variables in advance, they could more efficiently prioritize server resource requests, load documents, and route visitors to an Internet site, for instance," Mansour says. His algorithm, which is based on machine learning, minimizes the amount of virtual regret a computer might experience. "Compared to human beings, help systems can much more quickly process all the available information to estimate the future as events unfold--whether it's a bidding war on an online auction site, a sudden spike of traffic to a media Web site, or demand for an online product," Mansour says.
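
A classic no-regret method of the kind this field studies is multiplicative weights: keep a weight per action and downweight actions that incur loss. A minimal sketch follows; the actions and losses are made up, and this is a generic textbook algorithm, not Mansour's specific one:

```python
import math
import random

random.seed(5)

# Actions could be candidate servers to route a request to.
ACTIONS = ["server-a", "server-b", "server-c"]
eta = 0.1
weights = {a: 1.0 for a in ACTIONS}
total_loss = 0.0
fixed_loss = {a: 0.0 for a in ACTIONS}  # loss of always picking one action

for t in range(1000):
    z = sum(weights.values())
    choice = random.choices(ACTIONS, weights=[weights[a] / z for a in ACTIONS])[0]
    loss = {a: random.random() for a in ACTIONS}  # simulated losses in [0, 1]
    total_loss += loss[choice]
    for a in ACTIONS:
        fixed_loss[a] += loss[a]
        weights[a] *= math.exp(-eta * loss[a])    # downweight bad actions

# Regret: our loss minus the best single action in hindsight.
regret = total_loss - min(fixed_loss.values())
print(f"regret after 1000 rounds: {regret:.1f}")  # grows sublinearly in t
```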

Thursday, March 31, 2011

Blog: Targeted Results; determining a mathematical graph's maximal independent set

Targeted Results
MIT News (03/31/11) Larry Hardesty

Massachusetts Institute of Technology (MIT) and Tel Aviv University researchers recently met at the Innovations in Computer Science conference at Tsinghua University to present a mathematical framework for finding localized solutions to complex calculations. The researchers say the framework could be used to solve classic computer science problems involving mathematical abstractions known as graphs. Graphs can represent any type of data, but it is often useful to determine a graph's maximal independent set: a set of vertices, no two of which are joined by an edge, that cannot be enlarged by adding any other vertex. A graph can have more than one maximal independent set. The researchers developed an algorithm to efficiently determine which vertices are and are not included in at least one of the graph's maximal independent sets. Although the research is theoretical, the problem of calculating independent sets cuts across a variety of disciplines, including artificial intelligence, bioinformatics, and scheduling and networking. "There have been lots of related concepts that have been sort of floating around," but the MIT and Tel Aviv researchers "have formalized it in an interesting way and, I think, the correct way," says Sandia National Labs' Seshadhri Comandur.
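
For reference, one maximal independent set can be built greedily, which makes the definition concrete (the MIT/Tel Aviv contribution is answering membership queries locally, without a global pass like this):

```python
# Greedy construction of one maximal independent set: repeatedly pick
# a vertex, add it, and delete it and its neighbors; when no vertices
# remain, the picked set is independent and cannot be enlarged.
def maximal_independent_set(adj):
    remaining = set(adj)
    mis = set()
    while remaining:
        v = min(remaining)           # any tie-breaking rule works
        mis.add(v)
        remaining -= {v} | adj[v]    # drop v and all its neighbors
    return mis

# 6-cycle: vertices 0..5, each adjacent to its two ring neighbors.
adj = {i: {(i - 1) % 6, (i + 1) % 6} for i in range(6)}
print(maximal_independent_set(adj))  # e.g. {0, 2, 4}
```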

Wednesday, March 23, 2011

Blog: Fruit Flies Could Hold Key to Future Internet

Fruit Flies Could Hold Key to Future Internet
Internet Evolution (03/23/11) Michael Kassner

Developing truly effective distributed computing algorithms for parallel processors is a major challenge for computer scientists. Carnegie Mellon University (CMU) researchers are tackling the problem by studying fruit flies. Fruit flies are very good at solving maximal independent set problems, in which the goal is to identify a subset of nodes that every other node in the network connects to, providing the network with structure, says CMU professor Ziv Bar-Joseph. Fruit flies solve similar problems naturally because a process called sensory organ precursor selection occurs during their brain development. Since the flies can solve the problem without knowing how many neighbors each network node has, determining how they achieve this could lead to robust and efficient computational methods, Bar-Joseph says. Sensor networks also could benefit from the new algorithm, according to Bar-Joseph. He says that "our fruit fly-derived algorithm is more efficient than any known method," and could become the method of choice for sensor-network applications.
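
The broadcast scheme the flies suggest, in which nodes fire at random with a slowly rising probability, a node that fires when no neighbor does joins the set, and its neighbors withdraw, can be simulated as below; the ring topology and probabilities are illustrative assumptions:

```python
import random

random.seed(6)

# Ring network: each node hears only its neighbors' broadcasts and
# does not know how many neighbors it has (the constraint the
# fly-derived method handles).
n = 12
neighbors = {i: {(i - 1) % n, (i + 1) % n} for i in range(n)}

undecided, leaders = set(range(n)), set()
p = 0.05
while undecided:
    # Each undecided node fires with probability p; p ramps up over
    # time but stays below 1 so lone firings remain possible.
    fired = {v for v in undecided if random.random() < p}
    for v in fired:
        if not (fired & neighbors[v]):      # fired alone in its area
            leaders.add(v)
    # Leaders and their neighbors leave the protocol.
    settled = leaders | {u for v in leaders for u in neighbors[v]}
    undecided -= settled
    p = min(2 * p, 0.5)

print("MIS leaders:", sorted(leaders))
```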
