Thursday, June 30, 2011

Blog: UCL Researchers Develop 'Darwinian' Software to Test Car Computers

UCL Researchers Develop 'Darwinian' Software to Test Car Computers
Science Business (06/30/11)

Researchers from University College London's Centre for Research on Evolution, Search and Testing (CREST) are collaborating with a team from Berner & Mattner to improve search-based testing techniques. The researchers are using Darwinian evolution to breed scenarios for testing automotive software, with the test scenarios competing against each other in a virtual world. The idea is to breed the strongest and most demanding scenarios for exercising the up to 10 million lines of software code that modern cars now contain, to make sure everything in the vehicle works correctly. "Search-based testing techniques have the potential to fully automate testing of embedded systems," says Berner & Mattner's Joachim Wegener. "This will allow significant cost savings and increased product quality." Further development of the evolutionary testing techniques would enable their use in industrial practice.
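
The evolve-and-compete idea can be illustrated with a minimal genetic algorithm. This is only a sketch: the scenario parameters (speed, brake trigger distance, road friction), the simulated stopping-distance formula, and the fitness function are all hypothetical stand-ins, not details of the Berner & Mattner work.

```python
import random

random.seed(0)  # deterministic run for the sketch

# A "scenario" is a vector of test parameters (hypothetical here):
# initial speed (m/s), brake trigger distance (m), road friction coefficient.
BOUNDS = [(0.0, 60.0), (1.0, 100.0), (0.1, 1.0)]

def random_scenario():
    return [random.uniform(lo, hi) for lo, hi in BOUNDS]

def fitness(scenario):
    # Stand-in objective: the most "demanding" scenario is the one that
    # brings the simulated system closest to its safety boundary, i.e.
    # the brake triggers right at the idealized stopping distance.
    speed, trigger, friction = scenario
    stopping = speed * speed / (2 * 9.81 * friction)
    return -abs(trigger - stopping)  # tightest margin scores highest

def evolve(generations=50, pop_size=30, mutation=0.2):
    pop = [random_scenario() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]        # selection: fittest half
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)
            child = [random.choice(pair) for pair in zip(a, b)]  # crossover
            if random.random() < mutation:      # occasional random mutation
                i = random.randrange(len(child))
                lo, hi = BOUNDS[i]
                child[i] = random.uniform(lo, hi)
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

best = evolve()
```

After 50 generations the surviving scenario sits close to the safety boundary, which is exactly the kind of demanding test case the breeding process is meant to surface.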

View Full Article

Friday, June 24, 2011

Blog: Mozilla Eyes Hassle-Free PDFs on the Web

Mozilla Eyes Hassle-Free PDFs on the Web
CNet (06/24/11) Stephen Shankland

Mozilla is developing pdf.js, a PDF reader that uses Web technology to render PDFs in the browser. Whereas Google's PDF software is native code compiled for a specific processor, Mozilla's system will run on the browser's own engine. "Our most immediate goal is to implement the most commonly used PDF features so we can render a large majority of the PDFs found on the Web," says Mozilla's Andreas Gal. The project uses JavaScript to interpret the PDF format. Gal says this will bring a substantial usability gain as well as a security improvement, since pdf.js uses only safe Web languages and contains no native code that attackers can exploit. Mozilla also could pair the renderer with Scalable Vector Graphics (SVG) to overcome the shortcomings of Canvas, says Mozilla's Chris Jones: the team wants to render a quick version using Canvas, then swap in a more sophisticated SVG-based version. Mozilla expects that pdf.js will improve users' experience with PDFs while gradually making dedicated PDF software unnecessary. "We hope that a browser-native PDF renderer written on the Web platform allows Web technologies to subsume PDF," Gal says.

View Full Article

Thursday, June 23, 2011

Blog: CERN Experiments Generating One Petabyte of Data Every Second

CERN Experiments Generating One Petabyte of Data Every Second
V3.co.uk (06/23/11) Dan Worth

CERN researchers generate a petabyte of data every second as they work to discover the origins of the universe by smashing particles together at close to the speed of light. However, the researchers, led by Francois Briard, store only about 25 petabytes per year, because they use filters to save just the results they are interested in. "To analyze this amount of data you need the equivalent of 100,000 of the world's fastest PC processors," says CERN's Jean-Michel Jouanigot. "CERN provides around 20 percent of this capability in our data centers, but it's not enough to handle this data." The researchers worked with the European Commission to develop the Grid, which provides access to computing resources from around the world. CERN draws on data centers from 11 different providers on the Grid, including providers in the United States, Canada, Italy, France, and Britain. The data comes from four detectors on the Large Hadron Collider that monitor the particle collisions, which transmit data at 320 Mbps, 100 Mbps, 220 Mbps, and 500 Mbps, respectively, to the CERN computer center.
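
The filtering step that reduces a petabyte per second to roughly 25 petabytes per year is, in essence, a trigger: events are discarded on the fly unless they pass a selection cut. A minimal sketch, assuming a made-up energy distribution and threshold; real LHC triggers are multi-stage hardware and software pipelines, not a single cut.

```python
import random

random.seed(42)  # reproducible toy stream

ENERGY_THRESHOLD = 95.0  # hypothetical trigger cut, arbitrary units

def detector_events(n):
    """Simulate a stream of collision events, mostly low-energy background."""
    for _ in range(n):
        yield {"energy": random.expovariate(1 / 20.0)}

def trigger(events):
    """Keep only events above the threshold; everything else is dropped
    immediately, so the full stream is never stored."""
    for event in events:
        if event["energy"] > ENERGY_THRESHOLD:
            yield event

kept = list(trigger(detector_events(100_000)))
reduction = 100_000 / max(len(kept), 1)  # how many events discarded per kept
```

Because both stages are generators, the sketch never holds the full stream in memory, mirroring why CERN can filter at rates it could never store.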

View Full Article

Tuesday, June 21, 2011

Blog: Carnegie Mellon Methods Keep Bugs Out of Software for Self-Driving Cars

Carnegie Mellon Methods Keep Bugs Out of Software for Self-Driving Cars
Carnegie Mellon University (06/21/11) Byron Spice

Carnegie Mellon University (CMU) researchers have developed a method to verify the safety of driver assistance technologies, such as adaptive cruise control and automatic braking. The researchers developed a model of a car-control system in which computers and sensors in each car combine to control acceleration, braking, and lane changes, and used mathematical algorithms to formally verify that the system would keep cars from crashing into each other. "The system we created is in many ways one of the most complicated cyber-physical systems that has ever been fully verified formally," says CMU professor Andre Platzer. The safety verification systems must take into account both physical laws and the capabilities of the system's hardware and software. The researchers showed that they could verify the safety of their adaptive cruise control system by breaking the problem into modular pieces and organizing the pieces in a hierarchy. Platzer says that automated driving systems have the potential to save many lives and billions of dollars by preventing accidents, but developers must be certain that they are safe. "The dynamics of these systems have been beyond the scope of previous formal verification techniques, but we've had success with a modular approach to detecting design errors in them," he says.
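
The CMU work is formal verification: a symbolic proof over the system's dynamics, not runtime testing. Still, the core safety invariant, that the following car must always be able to stop within the gap even if the leader brakes hard, can be sketched numerically. All parameter values below (braking rates, reaction time) are illustrative assumptions, not figures from the CMU model.

```python
def safe_gap(v_follow, v_lead, b_follow, b_lead, reaction=0.5):
    """Minimum gap (m) letting the follower stop without hitting the leader,
    in the worst case: the leader brakes at b_lead (m/s^2) while the follower
    coasts for `reaction` seconds before braking at b_follow. Simplified
    straight-line kinematics."""
    d_follow = v_follow * reaction + v_follow**2 / (2 * b_follow)
    d_lead = v_lead**2 / (2 * b_lead)
    return max(d_follow - d_lead, 0.0)

def controller_is_safe(gap, v_follow, v_lead, b_follow=6.0, b_lead=8.0):
    """Safety invariant checked at a control step: the current gap exceeds
    the worst-case stopping requirement."""
    return gap > safe_gap(v_follow, v_lead, b_follow, b_lead)
```

With these numbers, a 40 m gap at 30 m/s passes the invariant while a 20 m gap fails. A formally verified controller goes further: it proves mathematically that every control action preserves the invariant, rather than merely checking it at runtime.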

View Full Article

Tuesday, June 7, 2011

Blog: Watson's Lead Developer: 'Deep Analysis, Speed, and Results'

Watson's Lead Developer: 'Deep Analysis, Speed, and Results'
Computing Community Consortium (06/07/11) Erwin Gianchandani

The challenge of creating a machine that could compete on Jeopardy! was the genesis of IBM's Watson supercomputer, said Watson lead developer David Ferrucci in a keynote speech at ACM's 2011 Federated Computing Research Conference. He said the challenge involved balancing computer programs that are natively explicit, fast, and exacting in their calculations with natural language that is implicit, innately contextual, and frequently inaccurate. Ferrucci also said the game offered a "compelling and notable way to drive and measure technology of automatic question answering along five key dimensions: Broad, open domain; complex language; high precision; accurate confidence; and high speed." An analysis of tens of thousands of randomly sampled Jeopardy! clues revealed that the most common clue type occurred only 3 percent of the time, a finding that led to several guiding principles for Watson's development: the unsuitability of specific large, hand-crafted models, the need to derive intelligence from many diverse techniques, and the primary requirement of massive parallelism. Ferrucci's team eventually designed a system that produces and scores numerous hypotheses using thousands of natural language processing, information retrieval, machine learning, and reasoning algorithms in combination. The research expanded the field of computer science by pursuing goal-oriented, system-level metrics and longer-term incentives.
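
The generate-and-score pattern Ferrucci describes can be miniaturized: many independent algorithms each assign evidence scores to candidate answers, and a merging step combines them into a single confidence used for ranking. Watson's actual merging model is learned from data; the fixed weights and toy scores below are stand-ins, not anything from the IBM system.

```python
def combine(scores, weights):
    """Weighted combination of per-algorithm evidence scores into one
    confidence value (a stand-in for a learned merging model)."""
    return sum(w * s for w, s in zip(weights, scores))

def rank_candidates(evidence, weights):
    """evidence maps each candidate answer to a list of scores, one per
    scoring algorithm; return candidates sorted by combined confidence."""
    ranked = sorted(evidence.items(),
                    key=lambda kv: combine(kv[1], weights),
                    reverse=True)
    return [(cand, combine(scores, weights)) for cand, scores in ranked]

# Toy evidence from three hypothetical scorers (e.g. answer-type match,
# passage support, source popularity), weighted by presumed reliability.
evidence = {
    "Toronto": [0.2, 0.4, 0.9],
    "Chicago": [0.9, 0.8, 0.7],
}
weights = [0.5, 0.3, 0.2]
ranking = rank_candidates(evidence, weights)
```

The point of the pattern is that no single scorer decides the answer; confidence emerges from many weak, diverse signals, which is also why Watson can report calibrated confidence rather than just a top answer.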

View Full Article

Wednesday, June 1, 2011

Blog: Quantum Knowledge Cools Computers

Quantum Knowledge Cools Computers
ETH Zurich (06/01/11) Simone Ulmer

Researchers at ETH Zurich and the National University of Singapore have found that, under certain conditions, deleting data generates cold instead of heat. This happens when the memory's contents are known "more than completely," as during quantum-mechanical entanglement, which carries more information than a classical copy of the data. Landauer's Principle states that energy is always released as heat when data is deleted. "According to Landauer's Principle, if a certain number of computing operations per second is exceeded, the heat generated can no longer be dissipated," which will put supercomputers at a critical limit in the next 10 to 20 years, says ETH Zurich professor Renato Renner. However, the new study shows that with quantum entanglement the deletion operation becomes a reversible process, and a suitably generalized Landauer's Principle still holds. The researchers proved this mathematically by combining the entropy concepts of information theory and thermodynamics. "We have now shown that the notion of entropy actually describes the same thing in both cases," Renner says. The results show that in a quantum computer the entropy would be negative, meaning heat would be withdrawn from the environment and the machine would cool down.
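
In its classical form, Landauer's Principle puts a lower bound of kT ln 2 joules of released heat on each erased bit, where k is Boltzmann's constant and T the temperature. A quick calculation shows how small that bound is at room temperature:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def landauer_heat(bits, temperature=300.0):
    """Minimum heat (J) released when erasing `bits` of classical
    information at the given temperature, per Landauer's Principle."""
    return bits * K_B * temperature * math.log(2)

# Heat released erasing one gigabyte (8e9 bits) at room temperature:
q = landauer_heat(8e9)  # about 2.3e-11 J, far below today's chip dissipation
```

The bound only matters when operation rates become enormous, which is Renner's point about supercomputers hitting a limit. The quantum result flips the sign: when the conditional entropy of the data is negative, as under entanglement, erasure can extract rather than release heat.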

View Full Article

Blog: The Million-Dollar Puzzle That Could Change the World [determining whether P equals NP]

The Million-Dollar Puzzle That Could Change the World
New Scientist (06/01/11) Jacob Aron

The single biggest problem in computer science, for which the Clay Mathematics Institute is offering a $1 million prize to whoever solves it, is determining whether P equals NP, a question of whether computation has a fundamental, innate limitation that goes beyond hardware. The complexity class NP, for nondeterministic polynomial time, comprises problems whose solutions may be difficult to find but can be verified in polynomial time. Every problem in the class P also lies in NP, and the core of the P=NP? question is whether the reverse holds as well. If P turns out not to equal NP, it would demonstrate that some problems are inherently so complex that computing their solutions efficiently is impossible, and the proof would yield insight into the performance of modern computing hardware, which divides computations across multiple parallel processors, says the University of Massachusetts, Amherst's Neil Immerman. The field of cryptography also would be affected, because a polynomial-time algorithm for solving NP problems would crack even the toughest codes. On the other hand, finding a polynomial-time algorithm for any NP-complete problem would enable every NP problem to be solved in polynomial time, in effect a universal efficient solution. That could support the creation of algorithms that perform near-perfect speech recognition and language translation, and that process visual information as well as humans do.
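
The asymmetry at the heart of P versus NP is easy to exhibit with subset sum, a standard NP-complete problem: checking a proposed solution takes time polynomial in the input size, while the only known general ways to find one take exponential time. A minimal sketch:

```python
from itertools import combinations

def verify(nums, target, indices):
    """Polynomial-time certificate check (the easy direction of NP):
    the given indices must be distinct and their elements sum to target."""
    return (len(set(indices)) == len(indices)
            and sum(nums[i] for i in indices) == target)

def search(nums, target):
    """Brute-force search over all subsets: exponential in len(nums).
    No polynomial-time algorithm is known unless P = NP."""
    for r in range(len(nums) + 1):
        for combo in combinations(range(len(nums)), r):
            if sum(nums[i] for i in combo) == target:
                return list(combo)
    return None

nums = [3, 34, 4, 12, 5, 2]
cert = search(nums, 9)  # finds indices of 4 and 5
```

A polynomial-time `search` for any NP-complete problem would, via the standard reductions, make `verify`-style checking and solving equally easy for every problem in NP, which is exactly what makes the question so consequential.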

View Full Article
