Tuesday, October 25, 2011

Blog: How Revolutionary Tools Cracked a 1700s Code

New York Times (10/25/11) John Markoff

A cipher dating back to the 18th century that was long considered uncrackable was finally decrypted by a team of Swedish and U.S. linguists using statistics-based translation methods. After a false start, the team determined that the Copiale Cipher was a homophonic cipher and attempted to decode its symbols as German, since the manuscript was originally discovered in Germany. Their first step was finding regularly occurring symbols that might stand for the common German letter pair "ch." Once a potential "c" and "h" were found, the researchers used patterns in German to decode the cipher one step at a time. Machine-translation techniques, such as expected word frequencies, were used to guess each symbol's German equivalent. However, other, more impenetrable ciphers have thwarted even the translators of the Copiale Cipher. The Voynich manuscript has been called the most frustrating of these, but one member of the team that cracked the Copiale manuscript, the University of Southern California's Kevin Knight, co-published an analysis of the Voynich document pointing to evidence that it contains patterns matching the structure of natural language.
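
The first step described above is classical frequency analysis. As a minimal sketch of that step, assuming hypothetical symbol names and a toy sample (the team's actual statistics-based methods were far more sophisticated), the Python below ranks cipher symbols by frequency and pairs them with common German letters:

```python
from collections import Counter

# Toy illustration of a first step in attacking a homophonic cipher:
# rank cipher symbols by frequency and pair them with the most common
# German letters. Symbol names and the sample are hypothetical; the
# Copiale team's actual pipeline was far more sophisticated.

# Most common German letters, roughly in descending frequency order.
GERMAN_FREQ_ORDER = list("enisratdhulcgmobwfkz")

def rank_symbols(symbols):
    """Order cipher symbols from most to least frequent."""
    return [sym for sym, _ in Counter(symbols).most_common()]

def initial_guesses(symbols):
    """Naive one-to-one pairing of frequent symbols with frequent letters.

    Only a starting hypothesis: in a homophonic cipher several symbols
    map to one letter, so each guess must be tested against expected
    German patterns (e.g., candidate "c" and "h" co-occurring often).
    """
    return dict(zip(rank_symbols(symbols), GERMAN_FREQ_ORDER))

sample = ["lip", "tri", "lam", "tri", "arr", "tri", "lip", "gam"]
print(initial_guesses(sample))
```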

Monday, October 24, 2011

Blog: XML Encryption Cracked, Exposing Real Threat to Online Transactions

Government Computer News (10/24/11) William Jackson

Ruhr-University Bochum researchers have demonstrated a technique for breaking the encryption used to secure data in online transactions, posing a serious threat to all current implementations of XML Encryption. The attack can recover 160 bytes of a plaintext message in 10 seconds and decrypt larger amounts of data at the same pace, according to the researchers. It exploits weaknesses in the cipher-block chaining (CBC) mode of operation that is commonly used with many cryptographic algorithms, which means the attack can also be mounted against non-XML implementations. "I would not be surprised to see variants of this attack applied to other protocols, when CBC mode is used in similar context," says the World Wide Web Consortium's (W3C's) Thomas Roessler. The researchers recommend fixing existing CBC implementations or developing secure new implementations rather than changing the XML Encryption standard itself. Roessler says such a change should be simple because the standard is not tied to any particular algorithm or mode of operation. He notes that W3C's XML Security Working Group is updating the set of mandatory algorithms for XML Encryption to include only non-CBC modes of operation.
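
The CBC property the attack exploits is simple to show in miniature: each recovered plaintext block is the decryption of a ciphertext block XORed with the previous ciphertext block (or, for the first block, the IV), so an attacker who flips a bit there flips the matching plaintext bit. The sketch below is a toy demonstration of that malleability only, using a stand-in XOR "block cipher"; it does not reproduce the researchers' actual attack on real XML Encryption deployments:

```python
# CBC malleability demo with a toy "block cipher" (XOR with a fixed
# key). This is NOT real cryptography; the stand-in cipher just makes
# the structural weakness visible. In CBC, decryption computes
#     P[0] = Decrypt(C[0]) XOR IV,   P[i] = Decrypt(C[i]) XOR C[i-1],
# so flipping bit j of the IV flips bit j of the recovered P[0] --
# a property that also holds for real block ciphers such as AES.

BLOCK = 16
KEY = bytes(range(BLOCK))  # toy key; a real system would use AES

def xor(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

def encrypt_cbc(plaintext, iv):
    out, prev = [], iv
    for i in range(0, len(plaintext), BLOCK):
        c = xor(xor(plaintext[i:i + BLOCK], prev), KEY)  # toy "Encrypt"
        out.append(c)
        prev = c
    return b"".join(out)

def decrypt_cbc(ciphertext, iv):
    out, prev = [], iv
    for i in range(0, len(ciphertext), BLOCK):
        c = ciphertext[i:i + BLOCK]
        out.append(xor(xor(c, KEY), prev))  # toy "Decrypt", then XOR
        prev = c
    return b"".join(out)

iv = bytes(BLOCK)  # all-zero IV, for simplicity
msg = b"pay 00000100 EUR to recipient A ".ljust(2 * BLOCK)
ct = encrypt_cbc(msg, iv)

# An attacker who never learns KEY flips one bit of the IV...
tampered_iv = bytearray(iv)
tampered_iv[9] ^= 0x02  # targets byte 9 of plaintext block 0 ('1' -> '3')
# ...and the decrypted amount silently becomes 00000300.
print(decrypt_cbc(ct, bytes(tampered_iv)))
```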

Thursday, October 20, 2011

Blog: RAMCloud: When Disks and Flash Memory Are Just Too Slow

HPC Wire (10/20/11) Michael Feldman

Stanford University researchers have developed RAMCloud, a scalable, high-performance storage approach that keeps data in dynamic random access memory (DRAM) and aggregates the memory resources of an entire data center. The researchers say this combination of scalability and performance makes RAMCloud a candidate for high-performance computing, especially for data-intensive applications. "If RAMCloud succeeds, it will probably displace magnetic disk as the primary storage technology in data centers," according to the researchers, who are led by professor John Ousterhout. RAMCloud's two most important features are its ability to scale across thousands of servers and its extremely low latency, roughly 1,000 times lower than disk and about five times lower than flash. In addition, the researchers predict that RAMClouds as large as 500 terabytes can be built. Although there is no set timeline for turning RAMCloud into a commercial offering, the researchers do not foresee any technological hurdles.
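
As a loose sketch of the aggregation idea only, with hypothetical class and method names rather than RAMCloud's actual interfaces, the Python below pools the DRAM of many simulated nodes into one logical key-value store by hashing each key to a node:

```python
import hashlib

# Loose illustration of the "aggregate DRAM across many servers into
# one logical store" idea. Names here are hypothetical; the real
# RAMCloud design adds durability (replication and logging), very
# low-latency RPC, and fast crash recovery.

class MemoryNode:
    """Stands in for one server's DRAM-resident hash table."""
    def __init__(self):
        self.table = {}

class RAMCluster:
    """Routes each key to a node by hashing, pooling all nodes' DRAM."""
    def __init__(self, num_nodes):
        self.nodes = [MemoryNode() for _ in range(num_nodes)]

    def _node_for(self, key):
        digest = hashlib.sha1(key.encode()).digest()
        return self.nodes[int.from_bytes(digest[:4], "big") % len(self.nodes)]

    def write(self, key, value):
        self._node_for(key).table[key] = value

    def read(self, key):
        return self._node_for(key).table.get(key)

cluster = RAMCluster(num_nodes=1000)  # thousands of servers at RAMCloud's target scale
cluster.write("user:42", b"profile bytes")
print(cluster.read("user:42"))
```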

Wednesday, October 12, 2011

Blog: Cops on the Trail of Crimes That Haven't Happened

New Scientist (10/12/11) Melissae Fellet

The Santa Cruz, Calif., police department recently started field-testing software, developed at Santa Clara University, that analyzes where crime is likely to be committed. The software uses the locations of past incidents to highlight likely future crime scenes, enabling police to target and patrol those areas in the hope that their presence will stop crimes before they happen. The program, developed by Santa Clara researcher George Mohler, predicted the location and time of 25 percent of the burglaries that occurred on any particular day in an area of Los Angeles in 2004 and 2005, using only data on burglaries that had occurred before that day. The Santa Cruz police department is using the software to monitor 10 areas for residential burglaries, auto burglaries, and auto theft. If the program proves effective in thwarting crime in areas known for high crime rates, it could be applied to other cities, says University of California, Los Angeles researcher Jeffrey Brantingham, who collaborated on the algorithm's development.
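
Mohler's published model is more sophisticated than a news summary can convey. As a deliberately simplified stand-in for "rank areas by past incidents," the sketch below (hypothetical incident data, an ad hoc kernel rather than Mohler's point-process model) scores grid cells by the recency and proximity of past burglaries:

```python
import math

# Simplified hot-spot scoring: weight each past incident by recency and
# proximity, then rank grid cells for patrol. This ad hoc kernel is a
# stand-in for the general idea only; Mohler's published approach uses
# a self-exciting point-process model, not this heuristic.

# Hypothetical incident records: (x_km, y_km, days_ago).
incidents = [(1.0, 1.2, 2), (1.1, 1.0, 5), (4.0, 3.5, 1), (1.05, 1.1, 10)]

def cell_score(cx, cy, events, space_scale=0.5, time_scale=7.0):
    """Sum of Gaussian-in-space, exponential-in-time incident weights."""
    score = 0.0
    for x, y, days_ago in events:
        d2 = (x - cx) ** 2 + (y - cy) ** 2
        score += math.exp(-d2 / (2 * space_scale ** 2)) * math.exp(-days_ago / time_scale)
    return score

cells = [(i * 0.5, j * 0.5) for i in range(10) for j in range(10)]
ranked = sorted(cells, key=lambda c: cell_score(c[0], c[1], incidents), reverse=True)
print("top patrol cells:", ranked[:3])
```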

Tuesday, October 11, 2011

Blog: Father of SSL Says Despite Attacks, the Security Linchpin Has Lots of Life Left

Network World (10/11/11) Tim Greene

Despite high-profile exploits, secure sockets layer/transport layer security (SSL/TLS), the protocol that safeguards e-commerce security, can remain viable through proper upgrades as they become necessary, says SSL co-creator Taher Elgamal in an interview. He says the problem is rooted not in SSL/TLS itself, but in the surrounding trust framework and the difficulties it creates when the protocol must be patched to correct vulnerabilities. "If there is a way that we can separate who we trust from the vendor of the browsers, then that would be the best thing to do," Elgamal notes. "And the root of the trust should be the Internet with its built-in reputation ecosystem." In such a scenario, Elgamal says, if people noticed that a specific certificate authority was issuing bad certificates, the reputation system would jettison it immediately, with no need to issue patches. What is needed is an automatic update mechanism, and Elgamal believes the technology to support self-updating already exists. "I hope people look for these things because honestly, every protocol will have roles for self-updating things," he notes. "Nothing will remain secure forever."

Blog: "Ghostwriting" the Torah?

"Ghostwriting" the Torah?
American Friends of Tel Aviv University (10/11/11)

Tel Aviv University (TAU) researchers have developed a computer algorithm that could help identify the different sources that contributed to the individual books of the Bible. The algorithm, developed by TAU professor Nachum Dershowitz, recognizes linguistic cues, such as word preference, to divide texts into probable author groupings. The researchers focused on writing style instead of subject or genre to avoid some of the problems that have vexed Bible scholars in the past, such as a lack of objectivity and complications caused by the multiple genres and literary forms found in the Bible. The software searches for and compares details that human scholars might have difficulty detecting, such as the frequency of function words and the choice of synonyms, according to Dershowitz. The researchers tested the software by randomly mixing passages from the Hebrew books of Jeremiah and Ezekiel and instructing the computer to separate them. The program separated the passages with 99 percent accuracy, and it was also able to distinguish "priestly" materials from "non-priestly" materials. "If the computer can find features that Bible scholars haven't noticed before, it adds new dimensions to their scholarship," Dershowitz says.
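
As a bare-bones illustration of the function-word idea, with a hypothetical English word list and a simple cosine comparison rather than the TAU team's actual method, the sketch below builds style profiles for two text chunks and measures how similar they are:

```python
import math
from collections import Counter

# Bare-bones stylometry: represent each text chunk as a vector of
# function-word frequencies and compare chunks by cosine similarity.
# The English word list is a hypothetical stand-in; the TAU work
# operated on Hebrew texts with a more elaborate grouping method.

FUNCTION_WORDS = ["the", "and", "of", "to", "in", "that", "it", "for"]

def profile(text):
    """Relative frequency of each function word in the chunk."""
    words = text.lower().split()
    counts = Counter(w for w in words if w in FUNCTION_WORDS)
    total = max(len(words), 1)
    return [counts[w] / total for w in FUNCTION_WORDS]

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    norms = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norms if norms else 0.0

chunk_a = "and it came to pass that the word of the lord came in that day"
chunk_b = "for the law and the commandment that it may be to them for a sign"
# Chunks with similar profiles are grouped under one putative author.
print(cosine(profile(chunk_a), profile(chunk_b)))
```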

Monday, October 10, 2011

Blog: Google Launches Dart as a JavaScript Killer

IDG News Service (10/10/11) Joab Jackson

Google has announced a preview version of Dart, an object-oriented Web programming language with capabilities that resemble JavaScript's but that also addresses some of JavaScript's scalability and organizational shortcomings. Google software engineer Lars Bak describes Dart as "a structured yet flexible language for Web programming." Dart is designed both for quickly cobbling together small projects and for developing larger-scale Web applications. Programmers can declare variables without specifying a data type, or specify types when they want more structure. The preview version includes a compiler, a virtual machine, and a set of basic libraries. Initially, programmers will have to compile their Dart programs to JavaScript, using a tool included in the Dart package, to run them in browsers. However, Google would like future Web browsers to include a native Dart virtual machine for running Dart programs.
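
Because the code sketches in this archive use Python, the optional-typing idea is shown below with Python's optional annotations; this is a conceptual parallel only, not Dart syntax:

```python
# Conceptual parallel to Dart's optional typing, shown with Python
# annotations rather than Dart syntax: the same function written
# without any declared types and with types declared, both runnable.

def greet_untyped(name):             # types omitted, quick-script style
    return "Hello, " + name

def greet_typed(name: str) -> str:   # types declared, larger-app style
    return "Hello, " + name

print(greet_untyped("Dart"))
print(greet_typed("Dart"))
```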
