Monday, June 30, 2008

Blog: Serial Computing Is Dead; the Future Is Parallelism - Issues that force serial execution may become more difficult to handle

Serial Computing Is Dead; the Future Is Parallelism
SearchDataCenter.com (06/30/08) Botelho, Bridget

Serial computing is extinct and the future belongs to parallel computing, argued Dave Patterson, head of the University of California, Berkeley's Parallel Computing Laboratory, during his keynote speech at the Usenix conference. Parallel processing can now be executed on a single chip across multiple cores thanks to the emergence of multicore chips, and Patterson contended that serial computing has reached its limits in terms of memory and power. He maintained that programmers who require greater performance must write programs capable of leveraging multiple cores via parallelism, and researchers at his lab have concentrated on applications that ought to be parallelized. Properly written and implemented parallel programs can address power issues and performance bottlenecks. But computer scientist Andrew S. Tanenbaum, a recipient of the Usenix Lifetime Achievement Award, said that writing parallel applications can give rise to more problematic software, not less. "Sequential programming is really hard, and parallel programming is a step beyond that," he said. "I have a great fear that we will have all of these cores, and our software programs will be even worse." Patterson noted that the success of parallel computing depends on its ability to improve efficiency, accuracy, and productivity, but he cautioned that the majority of programmers are not ready to write suitable parallel programs.
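
As a rough illustration of the kind of restructuring Patterson is calling for, the sketch below splits an embarrassingly parallel workload across cores using Python's standard multiprocessing module; the workload and chunking scheme are illustrative assumptions, not anything taken from the keynote.

# Minimal sketch: spread an embarrassingly parallel workload across cores.
# The workload (summing squares over ranges) is an illustrative assumption.
from multiprocessing import Pool, cpu_count

def sum_of_squares(bounds):
    lo, hi = bounds
    return sum(i * i for i in range(lo, hi))

if __name__ == "__main__":
    n = 10000000
    cores = cpu_count()
    step = n // cores
    # Split the range [0, n) into one chunk per core.
    chunks = [(i * step, n if i == cores - 1 else (i + 1) * step) for i in range(cores)]
    with Pool(processes=cores) as pool:
        total = sum(pool.map(sum_of_squares, chunks))  # each chunk can run on its own core
    print(total)

Whether such clean decompositions exist for less regular problems is exactly the difficulty Tanenbaum points to.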
Click Here to View Full Article

Monday, June 23, 2008

Blog: Scripting Languages Spark New Programming Era

Scripting Languages Spark New Programming Era
InfoWorld (06/23/08) Krill, Paul

Scripting or dynamic languages are opening up a new age of programming for the masses thanks to their flexibility and ease of use, says analyst Michael Cote. The Perl Foundation's Joshua McAdams says scripting languages have become viable platforms thanks to the growing power of computers. JavaScript, which is popular for facilitating rich client activities in browsers, is regarded by many as the leading scripting language, while PHP is dominant on the server side. Both JavaScript and PHP deliver more simplicity than older languages, and core PHP developer Andi Gutmans says PHP is superior to Java in terms of time to completion and cost. McAdams emphasizes the Perl scripting language's flexibility and speed of development, as well as its access to extensions through the Comprehensive Perl Archive Network. Meanwhile, core Python developer Raymond Hettinger cites the language's readability and reliability, adding that it is significantly easier to program with than compiled languages and substantially more concise. FiveRuns software developer Bruce Williams says Ruby is "a very elegant language, it's easy to work with, and because it's not compiled, it's also very quick." The Ruby on Rails Web framework complements the Ruby language, with Williams listing rapid development as its major advantage.
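
As a rough illustration of the conciseness these developers describe, a word-frequency count, a task that runs to dozens of lines in C or Java, fits in a few lines of Python; the file name below is a placeholder used only for illustration.

# Count the most common words in a text file in a few lines of a scripting language.
# "input.txt" is a placeholder file name for illustration.
import collections

with open("input.txt") as f:
    counts = collections.Counter(f.read().lower().split())

for word, n in counts.most_common(10):
    print(word, n)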
Click Here to View Full Article

Friday, June 13, 2008

Blog: BU Prof Unlocks a Business Algorithm; Simplex algorithm (linear programming)

BU Prof Unlocks a Business Algorithm (the Linear Programming Simplex Algorithm)
Boston University (06/13/08) Daniloff, Caleb

Boston University professor Shanghua Teng and Yale University professor Daniel Spielman will receive ACM's Gödel Prize at the International Colloquium on Automata, Languages, and Programming (ICALP). The prize is awarded by ACM's Special Interest Group on Algorithms and Computation Theory and the European Association for Theoretical Computer Science, and the $5,000 award is given for outstanding papers in theoretical computer science. Teng and Spielman are being recognized for their 2004 paper in the Journal of the ACM titled "Smoothed Analysis of Algorithms: Why the Simplex Algorithm Usually Takes Polynomial Time." The paper explains why a common algorithm, used to solve efficiency problems in fields ranging from airlines to online games, works so well in practice, particularly in business. The simplex algorithm, developed in 1947 by George Dantzig, has practical applications in almost all areas of business, including advertising, distribution, pricing, production planning, and transportation. The simplex algorithm is designed to find a solution in a reasonable amount of time, but scientists have been able to construct worst-case scenarios by introducing abnormalities that cause the algorithm's running time to grow exponentially, creating situations where it could take virtually forever to find a solution. Smoothed analysis explains why the simplex method behaves so well in practice despite the danger of its worst-case complexity. Teng and Spielman's work also represents an advance in predicting the performance of algorithms and heuristics.
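
For readers unfamiliar with the kind of problem the simplex method solves, the sketch below states a toy production-planning linear program and solves it with SciPy's linprog routine; the numbers are invented, and SciPy's default solver is used rather than a particular simplex implementation.

# Toy linear program: maximize profit 3x + 5y subject to resource limits.
# The data are invented for illustration; linprog's default solver is used,
# not necessarily Dantzig's simplex method.
from scipy.optimize import linprog

c = [-3, -5]              # linprog minimizes, so negate to maximize 3x + 5y
A_ub = [[1, 2],           # machine hours:  x + 2y <= 14
        [3, -1],          # material:      3x -  y <= 0
        [1, -1]]          # labor:          x -  y <= 2
b_ub = [14, 0, 2]

result = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print(result.x, -result.fun)  # optimal plan (x, y) = (2, 6) with profit 36

Smoothed analysis asks how the running time of such a solver changes when the input data are perturbed by a small amount of random noise, which is why it can explain good typical-case behavior despite bad worst-case examples.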
Click Here to View Full Article

Wednesday, June 11, 2008

Blog: Standardization of Rule Based Technologies Moving Forward to Enable the Next Generation Web

Standardization of Rule Based Technologies Moving Forward to Enable the Next Generation Web
Pressemitteilung Web Service (06/11/2008)

RuleML is following up its first industry-oriented gathering last year with the International RuleML Symposium on Rule Interchange and Applications (RuleML-2008). This year's event is scheduled for Oct. 30-31 at the Buena Vista Palace in the Walt Disney World Resort in Orlando, Fla., and will give business and technology professionals, researchers, and standardization representatives another opportunity to focus on the increasing performance and applicability of rule technologies. The international umbrella organization for Web rule research, standardization, and adoption will make every topic related to rules a focus of the symposium, from the engineering and use of rule-based systems, the integration of rules with other Web technologies, languages and frameworks for rule representation and processing, and rule-related Web standards to the interoperation of rule-based systems and the incorporation of rule technology into enterprise architectures. RuleML-2008 will offer peer-reviewed paper presentations, invited talks, software demonstrations, and social events. A challenge session will give participants the opportunity to show their commercial and open source tools, use cases, and applications, and to compete for best-application prizes. The symposium will be co-located with the Business Rules Forum to bring greater attention to the connection between rule and business logic technologies. ACM is among the sponsors and partner organizations supporting RuleML-2008.
Click Here to View Full Article

Tuesday, June 10, 2008

Blog: 'Saucy' Software Update Finds Symmetries Dramatically Faster; graph searching

'Saucy' Software Update Finds Symmetries Dramatically Faster
University of Michigan News Service (06/10/08) Moore, Nicole Casal

University of Michigan computer scientists have developed open-source software that reduces the time it takes to find symmetries in complicated equations from days to a few seconds. Finding symmetries can reveal shortcuts to answers that, for example, verify the safety of train schedules, find bugs in software and hardware designs, or improve common search queries. The algorithm updates a program called "saucy" that the researchers developed in 2004. The software's applications include artificial intelligence and logistics. In complicated equations, symmetries reveal repeated branches of the search for solutions that only need to be solved once. Existing programs that search for symmetries can take days to return results, even when no symmetries are present. The new method can finish in seconds even when there are millions of variables. An artificial intelligence system capable of recognizing symmetries could quickly generate a plan or an optimal schedule because it would know when the order of tasks is interchangeable. The algorithm converts a complicated equation into a graph and searches for similarities in the arrangement of the vertices. It narrows the search by exploiting "sparsity," the fact that each node in the graph is connected to only a few other nodes. Other symmetries can be derived from sparse symmetries, and the number of distinct symmetries can grow exponentially with the size of the system.
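
The brute-force sketch below conveys the underlying idea, checking which vertex permutations of a small graph preserve its edges; the example graph is an assumption for illustration, and saucy itself replaces this exhaustive search with aggressive pruning and sparsity-aware techniques.

# Brute-force toy: enumerate the automorphisms (symmetries) of a tiny graph.
# The 4-cycle below is invented for illustration; real tools such as saucy
# prune the search and exploit sparsity instead of trying every permutation.
from itertools import permutations

edges = {frozenset(e) for e in [(0, 1), (1, 2), (2, 3), (3, 0)]}  # a 4-cycle
vertices = range(4)

def is_symmetry(perm):
    # A permutation is a symmetry if it maps every edge onto an edge.
    return all(frozenset((perm[u], perm[v])) in edges for u, v in edges)

symmetries = [p for p in permutations(vertices) if is_symmetry(p)]
print(len(symmetries), "symmetries found")  # the 4-cycle has 8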
Click Here to View Full Article

Sunday, June 8, 2008

Blog: Can Machines Be Conscious?

Can Machines Be Conscious?
IEEE Spectrum (06/08) Vol. 45, No. 6, P. 55; Koch, Christof; Tononi, Giulio

Some people are convinced that a conscious machine could be constructed within a few decades, including Caltech professor Christof Koch and University of Wisconsin, Madison professor Giulio Tononi, who write that the emergence of an artificially created consciousness may not take the form of the most popular speculations. They note that consciousness requires neither sensory input nor motor output, as exemplified by the phenomenon of dreaming, and emotions are not a necessary component for consciousness, either. Koch and Tononi also cite clinical data to suggest that other traditional elements of consciousness--explicit or working memory, attention, self-reflection, language--may not be essential, while the necessary properties of consciousness depend on the amount of integrated information that an organism or machine can produce. The authors offer the integrated information theory of consciousness as a framework for measuring different neural architectures' effectiveness at generating integrated information and achieving consciousness, and this framework outlines what they describe as "a Turing Test for consciousness." One test would be to ask the machine to concisely describe a scene in a manner that efficiently differentiates the scene's key features from the vast spectrum of other possible scenes. Koch and Tononi suggest that the building of a conscious machine could involve the evolution of an abstracted mammal-like architecture into a conscious entity.
Click Here to View Full Article

Blog: Information Accountability

Information Accountability
Communications of the ACM (06/08) Vol. 51, No. 6, P. 82; Weitzner, Daniel J.; Abelson, Harold; Berners-Lee, Tim

Accountability for the misuse of personal information must be enforced by systems and statutes, as the openness of the information environment makes protection via encryption and access control impossible. "Information accountability means the use of information should be transparent so it is possible to determine whether a particular use is appropriate under a given set of rules and that the system enables individuals and institutions to be held accountable for misuse," write the authors. Rules are needed, both in the United States and internationally, to address the permissible use of certain types of information, in addition to simple access and collection restrictions. The authors say that the information-accountability framework is more reflective of the relationship between the law and human behavior than the various initiatives to enforce policy compliance via access control over information. Supporting information accountability requires a technical architecture that features policy-aware transaction logs, a common framework for representing policy rules, and policy-reasoning tools. "One possible approach to designing accountable systems is to place a series of accountable appliances throughout the system that communicate through Web-based protocols," the authors suggest. The authors conclude that perfect compliance should not be the standard for evaluating laws and systems that aid the enforcement of information accountability. "Rather we should ask how to build systems that encourage compliance and maximize the possibility of accountability for violations," they write.
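
A minimal sketch of the policy-aware transaction log the authors describe might look like the following; the rule format, purposes, and audit logic here are assumptions made for illustration, not the authors' design.

# Toy policy-aware transaction log: every use of personal data is recorded
# with its purpose, and an audit pass flags uses outside the permitted
# purposes. The rules and purposes are illustrative assumptions.
ALLOWED_PURPOSES = {"medical_record": {"treatment", "billing"},
                    "email_address": {"account_notification"}}

log = []

def record_use(data_type, purpose, user):
    log.append({"data_type": data_type, "purpose": purpose, "user": user})

def audit():
    # Accountability after the fact: report uses that violate the rules.
    return [entry for entry in log
            if entry["purpose"] not in ALLOWED_PURPOSES.get(entry["data_type"], set())]

record_use("medical_record", "treatment", "dr_smith")
record_use("medical_record", "marketing", "vendor_x")
print(audit())  # flags the marketing use for accountability review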
Click Here to View Full Article - Web Link to Publication Homepage
