Tuesday, April 29, 2008

Blog: Microsoft Says SQL-Injection Attacks Not Due to Flaws in Their Products; Rather Due To Application Programming Errors

Microsoft Says SQL-Injection Attacks Not Due to Flaws in Their Products; Rather Due To Application Programming Errors

SANS NewsBites Vol. 10 Num. 34 (fwd); 4/29/2008 10:46 AM

Microsoft Says SQL-Injection Attacks Not Due to Flaws in Their Products; Rather Due To Application Programming Errors (April 27 & 28, 2008) Microsoft maintains that the SQL-injection attacks spreading to hundreds of thousands of web pages are not due to new or unknown vulnerabilities in its Internet Information Server (IIS) or SQL Server. The Microsoft Security Response Center's Bill Sisk said the attacks are the result of SQL injection exploits and proffered a set of industry best practices for organizations to follow to protect themselves from such attacks.

http://www.computerworld.com/action/article.do?command=viewArticleBasic&articleId=9080678&source=rss_topic17

http://www.news.com/8301-10784_3-9929861-7.html?part=rss&subj=news&tag=2547-1_3-0-20

http://www.heise-online.co.uk/security/Microsoft-offers-assistance-to-combat-mass-SQL-injection--/news/110616

http://blogs.technet.com/msrc/archive/2008/04/25/questions-about-web-server-attacks.aspx

[Editor's Note (Paller): The Microsoft guidance for programmers on how to avoid programming errors that enable SQL Injection attacks (posted at http://msdn2.microsoft.com/en-us/library/ms998271.aspx) is excellent.

These guidelines reflect the skills that are now being tested for Java and soon for .NET programmers. If you have more than 300 programmers, you can have up to 10 of them use the free online skills assessment to find their skills gaps. Email spa@sans.org ]
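The programming error at the root of these attacks is building SQL statements by concatenating untrusted input into the query string; the guidance Microsoft points to centers on using parameterized queries instead, so user input is always bound as data rather than interpreted as SQL. A minimal illustration of the difference, using Python's built-in sqlite3 module for brevity (the same pattern applies to parameterized commands in ADO.NET and other database APIs):

```python
import sqlite3

# Throwaway in-memory database with a single user record.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

# Attacker-controlled input attempting a classic injection.
user_input = "alice' OR '1'='1"

# Vulnerable pattern: concatenation lets the embedded quote close the
# string literal, so the OR clause becomes part of the SQL and
# matches every row.
vulnerable = conn.execute(
    "SELECT name FROM users WHERE name = '" + user_input + "'"
).fetchall()

# Safe pattern: a parameterized query binds the whole input as one
# value, so the injection string is compared literally and matches
# nothing.
safe = conn.execute(
    "SELECT name FROM users WHERE name = ?", (user_input,)
).fetchall()

print(vulnerable)  # [('alice',)]: the injection succeeded
print(safe)        # []: the input was treated as data, not SQL
```

The fix costs nothing at runtime; it simply moves the boundary between code and data out of the programmer's string-handling and into the database driver.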

Monday, April 28, 2008

Research: Prize-Winning Scientist Wins Another Prize; Daphne Koller - Artificial Intelligence

Prize-Winning Scientist Wins Another Prize
Wall Street Journal (04/28/08) Clark, Don

Stanford University computer scientist Daphne Koller has won the first-ever ACM-Infosys Foundation Award for her ground-breaking research in artificial intelligence. Koller's work unites two disciplines to help solve difficult computing problems. The first field, sometimes identified with databases and relational logic, traditionally focused on representing complex relationships between groups of objects. The second field uses theories about probabilities to project outcomes of situations that involve significant uncertainty. "The two communities each had valid points," Koller says. "They were almost in conflict with each other." Combining the two approaches makes it possible to sort through massive amounts of data to find new insights. Koller has been particularly focused on developing ways of analyzing significant amounts of genetic data to find explanations for how genes function, as well as working on large sets of data from sensors and cameras with the goal of improving machine vision to help robots navigate. Koller won a MacArthur fellowship "genius grant" in 2004.
Click Here to View Full Article

Research: Beating the Codebreakers With Quantum Cryptography

Beating the Codebreakers With Quantum Cryptography
ICT Results (04/28/08)

Cryptography has long been an arms race, with codemakers and codebreakers constantly updating their arsenals, but quantum cryptography could theoretically give codemakers the upper hand. Even strong classical encryption schemes such as RSA can, in principle, be cracked given sufficient computing power. Quantum cryptography, however, could make uncrackable codes possible using quantum key distribution (QKD). Modern cryptography relies on digital keys to encrypt data before sending it over a network so it can be decrypted by the recipient. QKD promises a theoretically uncrackable key exchange, one that can be easily distributed and remain transparent. Additionally, the nature of quantum mechanics ensures that if an eavesdropper tries to intercept or read the transmission, both the sender and the receiver will know, allowing them to discard the compromised key and securely generate a new one. QKD had its first real-world application in Geneva, where quantum cryptography was used to secure the electronic voting system. Not only did QKD guarantee that the poll was secure, but it also ensured that no votes were lost in transmission, because the undisturbed quantum states confirmed that the transmitted data had not been altered. The SECOQC project, which did the work for the voting system, says the goal is to establish network-wide quantum encryption that can work over longer distances and between multiple parties.
Click Here to View Full Article
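The eavesdropper-detection property described above is the heart of QKD schemes such as BB84 (the article does not name the specific protocol SECOQC uses, so this is an illustrative sketch, simulated classically). Measuring a photon in the wrong basis randomizes its value, so an interceptor who measures and resends unavoidably corrupts part of the key:

```python
import random

N = 400  # photons sent per run

def measure(bit, send_basis, recv_basis):
    """A matching basis reads the bit faithfully; a mismatched
    basis yields a uniformly random outcome."""
    return bit if send_basis == recv_basis else random.randint(0, 1)

def bb84(eavesdrop):
    a_bits = [random.randint(0, 1) for _ in range(N)]
    a_bases = [random.randint(0, 1) for _ in range(N)]
    photon_bits, photon_bases = a_bits, a_bases
    if eavesdrop:
        # Eve measures each photon in a random basis and resends it,
        # disturbing every state whose basis she guessed wrong.
        e_bases = [random.randint(0, 1) for _ in range(N)]
        photon_bits = [measure(b, sb, eb)
                       for b, sb, eb in zip(a_bits, a_bases, e_bases)]
        photon_bases = e_bases
    b_bases = [random.randint(0, 1) for _ in range(N)]
    b_bits = [measure(b, pb, rb)
              for b, pb, rb in zip(photon_bits, photon_bases, b_bases)]
    # Sifting: keep only positions where sender and receiver happened
    # to choose the same basis; these bits form the shared key.
    pairs = [(a, b) for a, b, sa, sb in
             zip(a_bits, b_bits, a_bases, b_bases) if sa == sb]
    errors = sum(a != b for a, b in pairs)
    return errors / len(pairs)

random.seed(7)
clean = bb84(eavesdrop=False)
tapped = bb84(eavesdrop=True)
print(clean)   # 0.0: an undisturbed channel gives identical keys
print(tapped)  # about 0.25: Eve's wrong-basis measurements show up
```

Comparing a random sample of the sifted key over a public channel therefore reveals any interception before the key is ever used.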

Friday, April 25, 2008

Software: Blog: Interview With Donald Knuth; problems with modern programming "fads," "literate programming"

Interview With Donald Knuth
InformIT (04/25/08) Binstock, Andrew

Computer scientist Donald E. Knuth, winner of ACM's A.M. Turing Award in 1974, says in an interview that open-source code has yet to reach its full potential, and he anticipates that open-source programs will start to be totally dominant as the economy makes a migration from products to services, and as increasing numbers of volunteers come forward to tweak the code. Knuth admits that he is unhappy about the current movement toward multicore architecture, complaining that "it looks more or less like the hardware designers have run out of ideas, and that they're trying to pass the blame for the future demise of Moore's Law to the software writers by giving us machines that work faster only on a few key benchmarks!" He acknowledges the existence of important parallelism applications but cautions that they need dedicated code and special-purpose methods that will have to be significantly revised every several years. Knuth maintains that software produced via literate programming was "significantly better" than software whose development followed more traditional methodologies, and he speculates that "if people do discover nice ways to use the newfangled multithreaded machines, I would expect the discovery to come from people who routinely use literate programming." Knuth cautions that software developers should be careful when it comes to adopting trendy methods, and expresses strong reservations about extreme programming and reusable code. He says the only truly valuable thing he gets out of extreme programming is the concept of working in teams and reviewing each other's code. Knuth deems reusable code to be "mostly a menace," and says that "to me, 're-editable code' is much, much better than an untouchable black box or toolkit."
Click Here to View Full Article

Wednesday, April 23, 2008

Blog: Patches Pose Significant Risk, Researchers Say

Patches Pose Significant Risk, Researchers Say
SecurityFocus (04/23/08) Lemos, Robert

A team of computer scientists has developed a technique that exploits patches and updates by automatically comparing the vulnerable and repaired versions of a program and creating attack code. The technique, which the researchers call automatic patch-based exploit generation (APEG), can generate attack code for most major vulnerabilities in minutes by automatically analyzing a patch designed to fix a flaw. If Microsoft does not change how it distributes patches to customers, attackers could create a system that attacks the flaws in unpatched systems minutes after an update is sent out, says Carnegie Mellon computer science PhD candidate David Brumley. The technique builds on methods used by many security researchers, who reverse engineer patches to find the vulnerabilities fixed by an update. Normally that process takes hours or even days, but Brumley and his colleagues were able to use APEG to create exploits for five recent Microsoft patches, in under six seconds each time. The system does not create fully weaponized exploits and may not work on all types of vulnerabilities, but it shows that developing exploits from patches can be done in minutes. The researchers suggest that Microsoft could increase the likelihood that customers receive patches before attackers can reverse engineer them by obfuscating the code, by encrypting patches and withholding the decryption key until distribution is complete, and by using peer-to-peer networks to speed the distribution of patches.
Click Here to View Full Article
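The intuition behind APEG can be sketched with a deliberately toy example. The function names and the 8-byte limit below are invented for illustration, and the real system works on binary patches using constraint solving rather than source diffs, but the core idea is the same: the diff between the vulnerable and patched versions exposes the check the patch added, and any input violating that new check is a candidate exploit.

```python
def parse_v1(data):
    """Vulnerable version: copies input into a fixed buffer with no
    length check, so oversized input overruns it."""
    buf = bytearray(8)
    for i, byte in enumerate(data):
        buf[i] = byte
    return bytes(buf)

def parse_v2(data):
    """Patched version: identical except for one added bounds check."""
    if len(data) > 8:
        raise ValueError("input too long")
    return parse_v1(data)

# The "diff" between the versions is the new predicate len(data) > 8.
# Any input satisfying it is a candidate exploit for unpatched code.
candidate = bytes(9)

try:
    parse_v2(candidate)
    patched_rejects = False
except ValueError:
    patched_rejects = True

try:
    parse_v1(candidate)
    crashed = False
except IndexError:  # the overrun surfaces as a crash in this toy
    crashed = True

print(patched_rejects, crashed)  # True True
```

This is why the researchers' distribution suggestions all aim at the same thing: denying attackers early, analyzable access to the patch itself.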

Tuesday, April 22, 2008

Security: To Defeat a Malicious Botnet, Build a Friendly One

To Defeat a Malicious Botnet, Build a Friendly One
New Scientist (04/22/08) Inman, Mason

University of Washington computer scientists want to create swarms of good computers to neutralize hostile computers, which they say is an inexpensive way to handle botnets of any size. Current botnet countermeasures are being overwhelmed by the growing size of botnets, the researchers say, but creating swarms of good computers could neutralize distributed denial-of-service attacks. The UW system, called Phalanx, uses its own large network of computers to shield the protected server. Instead of accessing the server directly, all information passes through the herd of "mailbox" computers. The good botnet computers only pass information when the server requests it, allowing the server to work at its own pace instead of being flooded by requests. Phalanx also requires computers requesting information from the server to solve a computational puzzle, which takes a small amount of time for a normal Web user but significantly slows down a zombie computer that sends numerous requests. The researchers simulated an attack by a million-computer botnet on a server protected by a network of 7,200 mailbox computers running Phalanx. Even when the majority of mailbox computers were under attack, the server was able to run normally.
Click Here to View Full Article
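The computational puzzle Phalanx requires is essentially a client proof-of-work scheme. The sketch below is a generic hashcash-style puzzle, not the actual Phalanx protocol: the server issues a random challenge, and the client must find a nonce whose hash has a required number of leading zero bits before its request is served. Solving is a noticeable but small cost for one legitimate request, while verification costs the server a single hash; a zombie issuing thousands of requests pays the solving cost thousands of times over.

```python
import hashlib
import itertools

DIFFICULTY = 12  # required leading zero bits; tunes the per-request cost

def solve(challenge: bytes) -> int:
    """Client side: brute-force a nonce satisfying the puzzle."""
    for nonce in itertools.count():
        digest = hashlib.sha256(challenge + nonce.to_bytes(8, "big")).digest()
        if int.from_bytes(digest, "big") >> (256 - DIFFICULTY) == 0:
            return nonce

def verify(challenge: bytes, nonce: int) -> bool:
    """Server side: checking a solution costs one hash."""
    digest = hashlib.sha256(challenge + nonce.to_bytes(8, "big")).digest()
    return int.from_bytes(digest, "big") >> (256 - DIFFICULTY) == 0

challenge = b"server-issued-random-bytes"
nonce = solve(challenge)
print(verify(challenge, nonce))  # True
```

Raising DIFFICULTY by one bit doubles the expected solving work without changing verification cost, which lets the defender throttle harder as an attack intensifies.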

Friday, April 18, 2008

Research: Universal 'Babelfish' Could Translate Alien Tongues

Universal 'Babelfish' Could Translate Alien Tongues
New Scientist (04/18/08) Reilly, Michael

A linguist and anthropologist in the United States believes it is possible to build a universal translator that would enable humans to communicate with intelligent aliens, if contact was ever made. University of California, Berkeley's Terrence Deacon believes language develops from the need to describe the physical world, which would restrict the construction of a language. Even if an alien race used scents to communicate, the language would still have an underlying universal code that could be deciphered, as in mathematics. Words serve as symbols, and no matter how abstract they are, their reference to a physical object limits their relationship to other symbol words, which would define the grammatical structure that emerges from putting words together. As a result, researchers one day might be able to develop devices that use sophisticated software to translate alien language on the spot. Florida Atlantic University's Denise Herzing believes the theory can be tested by studying dolphins.
Click Here to View Full Article

Wednesday, April 9, 2008

Research: Supercomputer Beats Go Master

Supercomputer Beats Go Master
HPC Wire (04/09/08)

The MoGo artificial intelligence engine defeated professional 5th-dan player Catalin Taranu in a 9x9 game of Go during the Go Tournament in Paris in late March. The victory, the first officially sanctioned "non-blitz" victory for a machine over a Go master, is considered a significant achievement because the game relies more on human-like intuition than chess does and its possible combinations exceed the number of particles in the universe. Taranu says the system was close to dan-level performance. The computer did lose to Taranu in a 19x19 game despite a nine-stone handicap. The French National Institute for Research in Computer Science and Control (INRIA) developed the artificial intelligence engine. "The software used in this victory--the result of a collaboration between INRIA, the CNRS, LRI, and CMAP--is based on innovative technologies that can be used in numerous different areas, particularly in the conservation of resources, which is such a vital issue when it comes to tackling environmental problems," says INRIA researcher Olivier Teytaud, who led the MoGo team.
Click Here to View Full Article
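MoGo is known for its Monte-Carlo (UCT) approach, though the article does not describe it: a candidate move is scored by the win rate of many randomized games played out from the resulting position, rather than by a handcrafted evaluation function. The toy sketch below applies the same idea to a far smaller game, single-heap Nim (take 1 to 3 stones; taking the last stone wins), since a Go board would not fit in a few lines. The playout statistics recover the winning move with no game-specific knowledge:

```python
import random

def random_playout(stones, my_turn):
    """Finish the game with uniformly random moves; return True if
    the player we care about takes the last stone."""
    while stones > 0:
        stones -= random.randint(1, min(3, stones))
        if stones == 0:
            return my_turn
        my_turn = not my_turn
    return not my_turn  # heap already empty: the previous mover won

def best_move(stones, playouts=2000):
    """Score each legal move by its random-playout win rate and
    return the move with the highest rate."""
    scores = {}
    for take in range(1, min(3, stones) + 1):
        wins = sum(random_playout(stones - take, my_turn=False)
                   for _ in range(playouts))
        scores[take] = wins / playouts
    return max(scores, key=scores.get)

random.seed(0)
# From 5 stones the winning move is to take 1, leaving a multiple
# of 4; the statistics find it with no Nim-specific logic.
print(best_move(5))  # 1
```

Scaling this idea to Go mainly means a legal-move generator, a board, and a tree policy (UCT) that focuses playouts on promising moves, which is where the supercomputer's parallelism pays off.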

Friday, April 4, 2008

Research: McCormick Researchers Take Step Toward Creating Quantum Computers

McCormick Researchers Take Step Toward Creating Quantum Computers
Northwestern University (04/04/08)

Northwestern University researchers have demonstrated one of the basic building blocks for distributed quantum computing using entangled photons generated in optical fibers. "Because it is done with fiber and the technology that is already globally deployed, we think that it is a significant step in harnessing the power of quantum computers," says Northwestern professor Prem Kumar. The superposition of a quantum bit, or qubit, would allow a quantum computer to process significantly more information than a traditional computer. Kumar's group, which uses photons as qubits, found that they can entangle two indistinguishable photons in an optical fiber by using the fiber's inherent nonlinear response. The researchers also found that no matter how far the two photons are separated in standard transmission fibers, they remain entangled and "mysteriously" connected to each other's quantum state. Kumar and his team used the fiber-generated indistinguishable photons to implement the most basic quantum computer task, a controlled-NOT gate, which allows two photonic qubits to interact. DARPA has funded the group's next research effort, which will study how to implement a quantum network for physically demonstrating efficient public goods strategies, such as government contract auctions that would be able to find the most inexpensive contract arrangements by pairing contractors that have previous experience working together.
Click Here to View Full Article
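The controlled-NOT gate mentioned above flips the target qubit exactly when the control qubit is 1. On a two-qubit state, written as four amplitudes over the basis states |00>, |01>, |10>, |11>, it simply swaps the last two amplitudes. A minimal sketch in plain Python of the gate's logic (the photonic implementation is, of course, far subtler):

```python
# Two-qubit state: four amplitudes over |00>, |01>, |10>, |11>.
CNOT = [
    [1, 0, 0, 0],  # |00> -> |00>  (control 0: target untouched)
    [0, 1, 0, 0],  # |01> -> |01>
    [0, 0, 0, 1],  # |10> -> |11>  (control 1: target flipped)
    [0, 0, 1, 0],  # |11> -> |10>
]

def apply(gate, state):
    """Matrix-vector product giving the new amplitude vector."""
    return [sum(g * s for g, s in zip(row, state)) for row in gate]

# Control qubit set: |10> becomes |11>.
print(apply(CNOT, [0, 0, 1, 0]))  # [0, 0, 0, 1]

# Applied to the superposition (|00> + |10>)/sqrt(2), CNOT yields
# (|00> + |11>)/sqrt(2): a Bell state whose two qubits always give
# perfectly correlated measurement outcomes, i.e. entanglement.
amp = 2 ** -0.5
bell = apply(CNOT, [amp, 0, amp, 0])
```

This is why CNOT is singled out as a basic building block: together with single-qubit rotations it suffices, in principle, to build any quantum computation.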

Wednesday, April 2, 2008

Research: Usability or User Experience--What's the Difference?

Usability or User Experience--What's the Difference?
E-Consultancy (04/02/08) Stewart, Tom

User experience is often contrasted with usability, the latter frequently defined as a system's ease of use while the former serves as a blanket term for the relationship between people and technology, writes Tom Stewart, chair of the ISO subcommittee responsible for the International Standard for Human-Centered Design. He says ISO's definition of usability is much closer to the concept of user experience as encompassing issues that include usefulness, desirability, credibility, and accessibility, and the new version of ISO 13407 will employ the term user experience. "In the revised standard we define [user experience] as 'all aspects of the user's experience when interacting with the product, service, environment or facility' and we point out that 'it is a consequence of the presentation, functionality, system performance, interactive behavior, and assistive capabilities of the interactive system,'" Stewart says. He hopes that incorporating user experience within the human-centered design process will keep it from being marginalized and turn it into a primary business motivator for a wide array of systems. "Whatever we call it, getting the relationship between people and technology right is critical to a project's success, and the intelligent application of a structured, people-centered approach to design can only be a step in the right direction," Stewart says.
Click Here to View Full Article

Tuesday, April 1, 2008

Research: Hypercubes Could Be Building Blocks of Nanocomputers

Hypercubes Could Be Building Blocks of Nanocomputers
PhysOrg.com (04/01/08) Zyga, Lisa

Multi-dimensional structures called hypercubes could serve as the building blocks in future nanocomputers. University of Oklahoma researchers Samuel Lee and Loyd Hook say tomorrow's nanoelectronic-based devices will be dominated by quantum properties that will require new architectures and structures. "Compared to today's microcomputers, the main advantages of future nanocomputers are higher circuit density, lower power consumption, faster computation speed, and more parallel and distributed computing capabilities," Lee says. For example, while current integrated circuits process information as a continual flow of electrons, nano-integrated circuits would process individual electrons. Lee and Hook are working on a variant of the hypercube called the M-hypercube, which could provide a higher-dimensional layout to support the three-dimensional integrated circuits needed for nanocomputers. M-hypercubes are composed of nodes, which act as gates that receive and pass electrons, and links that act as the paths that electrons travel along. "The unique structure of hypercubes, including M-hypercubes, has been shown to be effective in parallel computing and communication networks and provides a unique ideal intrinsic structure which fulfills many of the needs of future nanocomputing systems," Lee says. "These needs include massively parallel and distributed processing architecture with simple and robust communication linkages."
Click Here to View Full Article
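The classical hypercube structure behind this work is easy to state: label the 2^n nodes of an n-dimensional hypercube with n-bit addresses and link every pair of nodes whose addresses differ in exactly one bit. Each node then has only n links, yet any two nodes are at most n hops apart, and routing amounts to fixing the differing address bits one at a time, which is why the topology suits massively parallel layouts. A sketch of the plain hypercube (the M-hypercube variant adds structure not modeled here):

```python
def neighbors(node: int, n: int):
    """Nodes adjacent to `node` in an n-dimensional hypercube:
    flip each of the n address bits in turn."""
    return [node ^ (1 << i) for i in range(n)]

def route(src: int, dst: int):
    """Greedy routing: correct differing address bits one at a time.
    The path length equals the number of bits in which src and dst
    differ (their Hamming distance)."""
    path = [src]
    diff = src ^ dst
    i = 0
    while diff:
        if diff & 1:
            path.append(path[-1] ^ (1 << i))
        diff >>= 1
        i += 1
    return path

# 3-dimensional cube: node 000 connects to 001, 010, and 100.
print(neighbors(0b000, 3))  # [1, 2, 4]
# Routing from 000 to 111 takes exactly 3 hops.
print(route(0b000, 0b111))  # [0, 1, 3, 7]
```

Doubling the node count adds only one link per node and one hop to the worst-case route, which is the scaling property the researchers are after.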

Security: National Institute of Standards and Technology Shows On-Card Fingerprint Match Is Secure, Speedy; getting better, but ...

National Institute of Standards and Technology Shows On-Card Fingerprint Match Is Secure, Speedy
NIST Tech Beat (04/01/08) Brown, Evelyn

Researchers at the National Institute of Standards and Technology say a new fingerprint identification technology for use in personal identification verification (PIV) cards is both fast and secure. As part of the authentication process for the technology, the cardholder enters a personal identification number to authorize the reading of fingerprint data from the card, and a card reader matches the stored data against the newly scanned image of the cardholder's fingerprints. In one model, biometric data on the card would travel across a secure wireless interface, which would eliminate the need to insert the card into a reader. In a second model, biometric data from the fingerprint scanner would be sent to the PIV smart card for matching by a processor chip embedded in the card, and the stored data would never leave the card. "If your card is lost and then found in the street, your fingerprint template cannot be copied," says computer scientist Patrick Grother. Ten cards with a standard 128-byte-long key and seven cards that use a more secure 256-byte key passed the security and timing test using wireless, but only one of three teams met NIST's criteria for accuracy. A new round of tests on the technology, which offers an improvement in protection against identity theft, will begin shortly.
Click Here to View Full Article
