Monday, August 24, 2009

Blog: Bing, Wolfram Alpha agree on licensing deal

Bing, Wolfram Alpha agree on licensing deal

By Tom Krazit, CNET News

Posted on ZDNet News: Aug 24, 2009 5:00:23 AM


Microsoft's Bing search engine and Wolfram Alpha have reached a licensing deal that allows Bing to present some of the specialized scientific and computational content that Wolfram Alpha generates, according to a source familiar with the deal.

Representatives from Microsoft and Wolfram Research declined to comment on the deal.

Wolfram Alpha's unique blend of computational input and curated output has not taken the world by storm, but it is considered an interesting enough take on the business of internet search to attract high-profile attention within the industry.

Wolfram Alpha does not return the usual list of links to pages containing the search keywords, instead providing direct answers to queries on topics such as stock prices and complex mathematical formulas, with mixed results.

Bing, on the other hand, has enjoyed a solid start in the three months since its debut, steadily gaining users, and following a long-awaited deal it will eventually become the default search experience on Yahoo's highly trafficked pages.

It is not clear whether Bing results will carry Wolfram's branding (that is, results 'Powered By Wolfram Alpha'), but there will be some sort of presence.


Thursday, August 20, 2009

Blog: Millionths of a Second Can Cost Millions of Dollars: A New Way to Track Network Delays

Millionths of a Second Can Cost Millions of Dollars: A New Way to Track Network Delays
University of California, San Diego (08/20/09) Kane, Daniel

Researchers at the University of California, San Diego and Purdue University have developed the Lossy Difference Aggregator, a method for diagnosing delays in data center networks in as little as tens of millionths of a second. Delays in data center networks can result in multimillion-dollar losses for investment banks that run automatic stock-trading systems, and similar delays can hold up parallel processing in high-performance cluster computing applications. The Lossy Difference Aggregator can diagnose fine-grained delays of as little as tens of microseconds, and packet loss as infrequent as one in a million, at every router within a data center network. The researchers say their method could be used to modernize router designs with almost no cost in terms of router hardware and no performance penalty. The data centers that run automated stock-trading systems are large, and the performance of the routers within them is difficult to monitor. Delays in these routers, called latencies, can add microseconds to the time a message spends in the network and potentially cost millions of dollars. The traditional way of measuring latency is to track when each packet arrives at and leaves a router. However, instead of tracking every packet, the new system randomly splits incoming packets into groups and adds up the arrival and departure times for each group. As long as the number of losses is smaller than the number of groups, at least one group will provide an accurate estimate. For the groups that suffered no loss, subtracting the sum of arrival times from the sum of departure times and dividing by the number of packets gives the estimated average delay. By implementing this system on every router, a data center manager could quickly identify slow routers. The research was presented at the recent ACM SIGCOMM 2009 conference. Purdue University professor Ramana Kompella says the next step will be to build the hardware implementation.
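
The grouping-and-subtraction step is easy to sketch in code. The Python below is a loose illustration of the idea described above, not the authors' actual design: the bucket count, the hashing of packet IDs into groups, and the simulated traffic and loss rate are all assumptions made for the example.

import random

NUM_BUCKETS = 16  # more buckets tolerate more lost packets

def bucket_of(packet_id):
    # Both measurement points must map a packet to the same bucket.
    return hash(packet_id) % NUM_BUCKETS

class LDACounter:
    """State kept independently at the entry and exit measurement points."""
    def __init__(self):
        self.count = [0] * NUM_BUCKETS       # packets seen per bucket
        self.time_sum = [0.0] * NUM_BUCKETS  # sum of timestamps per bucket

    def record(self, packet_id, timestamp):
        b = bucket_of(packet_id)
        self.count[b] += 1
        self.time_sum[b] += timestamp

def estimate_average_delay(entry, exit_):
    # Buckets whose packet counts match on both sides were loss-free, so
    # subtracting their timestamp sums yields the total delay for that bucket.
    total_delay, total_packets = 0.0, 0
    for b in range(NUM_BUCKETS):
        if entry.count[b] == exit_.count[b] and entry.count[b] > 0:
            total_delay += exit_.time_sum[b] - entry.time_sum[b]
            total_packets += entry.count[b]
    return total_delay / total_packets if total_packets else None

# Simulate 10,000 packets crossing a router with roughly 20-microsecond
# delays and about a one-in-ten-thousand loss rate.
entry, exit_ = LDACounter(), LDACounter()
now = 0.0
for pid in range(10_000):
    now += random.uniform(1e-6, 5e-6)
    entry.record(pid, now)
    if random.random() > 1e-4:  # the packet was not lost
        exit_.record(pid, now + random.uniform(15e-6, 25e-6))

print("estimated average delay (seconds):", estimate_average_delay(entry, exit_))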

Tuesday, August 18, 2009

Blog: Desktop Multiprocessing: Not So Fast

Desktop Multiprocessing: Not So Fast
Computerworld (08/18/09) Wood, Lamont

The continuance of Moore's Law--the axiom that the number of devices that can be economically installed on a processor chip doubles every other year--will mainly result in a growing population of cores, but exploiting those cores requires extensive rewriting of software. "We have to reinvent computing, and get away from the fundamental premises we inherited from [John] von Neumann," says Microsoft technical fellow Burton Smith. "He assumed one instruction would be executed at a time, and we are no longer even maintaining the appearance of one instruction at a time." Although vendors offer the possibility of higher performance by adding more cores to the central processing unit, achieving that performance depends on software that is aware of those cores and uses them to run code segments in parallel. However, Amdahl's Law dictates that the anticipated speedup from parallelization is 1 divided by the sum of the fraction of the task that cannot be parallelized and the improved (parallel) run time of the remaining fraction. "It says that the serial portion of a computation limits the total speedup you can get through parallelization," says Adobe Systems' Russell Williams. Consultant Jim Turley maintains that consumer operating systems overall "don't do anything very smart" with multiple cores, and he points out that the ideal tool--a compiler that takes older source code and distributes it across multiple cores--remains elusive. The public is adjusting to multicore faster than application vendors are, with hardware vendors saying that today's buyers are counting cores rather than gigahertz.
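
For concreteness, here is a minimal sketch of the arithmetic behind Amdahl's Law; the 10 percent serial fraction used below is an arbitrary illustrative figure, not one taken from the article.

# Amdahl's Law as stated above, with s the serial fraction of the task and
# N the number of cores: speedup = 1 / (s + (1 - s) / N).
def amdahl_speedup(serial_fraction: float, cores: int) -> float:
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

for cores in (2, 4, 8, 64):
    print(f"{cores:2d} cores -> {amdahl_speedup(0.10, cores):.2f}x speedup")
# Even with 64 cores, a task that is 10 percent serial speeds up less than
# 9x, which is why adding cores alone does not make software faster.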

View Full Article

Blog: A-Z of Programming Languages: Scala

A-Z of Programming Languages: Scala
Computerworld Australia (08/18/09) McConnachie, Dahna

The Scala programming language, which runs on the Java Virtual Machine, could become the preferred language of the modern Web 2.0 startup, according to a Twitter developer. Scala creator Martin Odersky says the name Scala "means scalable language in the sense that you can start very small but take it a long way." He says he developed the language out of a desire to integrate functional and object-oriented programming. This combination brings together functional programming's ability to build interesting things out of simple elements and object-oriented programming's ability to organize a system's components and to extend or adapt complex systems. "The challenge was to combine the two so that it would not feel like two languages working side by side but would be combined into one single language," Odersky says. The challenge lay in identifying constructs from the functional programming side with constructs from the object-oriented programming side, he says. Odersky lists the creation of the compiler technology as a particularly formidable challenge he faced in Scala's development. He notes that support of interoperability entailed mapping everything from Java to Scala, while another goal of the Scala developers was making the language fun to use. "This is a very powerful tool that we give to developers, but it has two sides," Odersky says. "It gives them a lot of freedom but with that comes the responsibility to avoid misuse."

View Full Article

Blog: FTC Rule Expands Health Data Breach Notification Responsibility to Web-Based Entities

FTC Rule Expands Health Data Breach Notification Responsibility to Web-Based Entities

SANS NewsBites Vol. 11 Num. 66 (August 18, 2009)

The US Federal Trade Commission has issued a final rule on health care breach notification. The rule will require web-based businesses that store or manage health care information to notify customers in the event of a data security breach. Such entities are often not bound by the requirements of the Health Insurance Portability and Accountability Act (HIPAA); this rule addresses that discrepancy.

http://www.darkreading.com/security/government/showArticle.jhtml?articleID=219400484

[Editor's Note (Pescatore): If my kids grow up to be government agencies, I hope they turn out to be the FTC. Any government agency is my kind of government agency when it issues press releases with headlines like "FTC Says Mortgage Broker Broke Data Security Laws: Dumpster Wrong Place for Consumers' Personal Information."]

Monday, August 17, 2009

Blog: International Win for Clever Dataminer; Weka data-mining software

International Win for Clever Dataminer; Weka data-mining software
University of Waikato (08/17/09)

The first-place finisher in the 2009 Student Data Mining Contest, run by the University of California, San Diego, used the Weka data-mining software to predict anomalies in e-commerce transaction data. Quan Sun, a University of Waikato computer science student, says it took about a month to find the answer. The contest drew more than 300 entries from students in North America, Europe, Asia, and Australasia. "I couldn't have done it without Weka," Sun says of the open source software that was developed at Waikato. "Weka is like the Microsoft Word of data-mining, and at least half of the competitors used it in their entries." ACM's Special Interest Group on Knowledge Discovery and Data Mining gave the Weka software its Data Mining and Knowledge Discovery Service Award in 2005. Weka has more than 1.5 million users worldwide.

View Full Article

Wednesday, August 12, 2009

Blog: Safer Software

Safer Software
The Engineer Online (08/12/09)

Researchers at Australia's Information and Communications Technology Research Centre of Excellence (NICTA) have developed the Secure Embedded L4 (seL4) microkernel, for which they say they have produced the world's first formal machine-checked correctness proof of a general-purpose operating system kernel. The researchers say the seL4 work provides the ability to mathematically prove that software governing critical safety and security systems in aircraft and motor vehicles is free of large classes of errors. The microkernel has potential applications in military, security, and other industries in which the flawless operation of complex embedded systems is critical. "Proving the correctness of 7,500 lines of C code in an operating system's kernel is a unique achievement, which should eventually lead to software that meets currently unimaginable standards of reliability," says Cambridge University Computer Laboratory professor Lawrence Paulson. NICTA principal researcher Gerwin Klein says the researchers have created a general, functional correctness proof, which he says is unprecedented for real-world, high-performance software of such a large size and complexity. The NICTA team invented new techniques in formal machine-checked proofs, made advancements in the mathematical understanding of real-world programming languages, and developed new methodologies for rapid prototyping of operating system kernels. "The project has yielded not only a verified microkernel but a body of techniques that can be used to develop other verified software," Paulson says. The research will be presented at the 22nd ACM Symposium on Operating Systems Principles, which takes place Oct. 11-14 in Big Sky, Montana.

View Full Article

Tuesday, August 11, 2009

Blog: Twenty Critical Controls ("the CAG") Update

Twenty Critical Controls ("the CAG") Update

SANS NewsBites Vol. 11 Num. 63 (August 11, 2009)

(1) V2.1 To Be Released This Week

On Friday of this week, Version 2.1 of the 20 Critical Controls for Effective Cyber Defense will be published at the CSIS site. This update reflects input from more than 100 organizations that reviewed the initial draft and contains the mapping, requested by NIST, of the 20 Critical Controls to the revised NIST 800-53 controls.

(2) Search for Effective Automation Tools Begins

This release also signals the launch of the search for tools that automate one or more of the controls. The authors have already received seven submissions from vendors that believe their tools provide effective automation for the implementation and continuous monitoring of several controls. The new search will last until August 31. Any user who has automated elements of the 20 Critical Controls, and any vendor that has tools that automate those controls, should send a submission to cag@sans.org before August 31. Tools that are demonstrated to actually work will be posted and may be included in the first National Summit on Planning and Implementing the 20 Critical Controls, to be held at the Reagan Center in November. If you are wondering whether your tools meet the needs, you can find a draft at http://www.sans.org/cag/guidelines.php

(3) A 60-minute webcast on Thursday, August 13, 1PM - 2PM EDT:

"Three Keys To Understanding and Implementing the Twenty Critical Controls for Improved Security in Federal Agencies" with James Tarala and Eric Cole. For free registration, visit

https://www.sans.org/webcasts/show.php?webcastid=92748

Monday, August 10, 2009

Blog: The A-Z of Programming Languages: Clojure

The A-Z of Programming Languages: Clojure
Computerworld Australia (08/10/09) Edwards, Kathryn

Clojure programming language creator Rich Hickey says Clojure came out of his desire for "a dynamic, expressive, functional language, native on the [Java Virtual Machine/Common Language Runtime (JVM/CLR)]." He says Clojure is designed to support the writing of simple, fast, and robust programs. Hickey says he elected to develop another Lisp dialect rather than extend an existing one because he wanted the language's appeal to reach beyond existing Lisp users, and to support design decisions that would have broken backward compatibility with the existing Scheme and Common Lisp programs. "I originally targeted both the JVM and CLR, but eventually decided I wanted to do twice as much, rather than everything twice," Hickey says. "I chose the JVM because of the much larger open source ecosystem surrounding it, and it has proved to be a good choice." Hickey stresses that solid concurrency support is a key feature of Clojure. "All of the core data structures in Clojure are immutable, so right off the bat you are always working with data that can be freely shared between threads with no locking or other complexity whatsoever, and the core library functions are free of side effects," he says. "But Clojure also recognizes the need to manage values that differ over time."

View Full Article

Thursday, August 6, 2009

Blog: XML Library Flaws Affect Numerous Applications

XML Library Flaws Affect Numerous Applications

SANS NewsBites Vol. 11 Num. 62 (August 6, 2009)

Researchers have uncovered a significant number of flaws in Extensible Markup Language (XML) libraries that could be exploited to crash machines and execute malicious code. The flaws affect large numbers of applications that use the libraries in question. Sun Microsystems, Apache, and Python products are known to be vulnerable.

http://www.securecomputing.net.au/News/152193,researchers-find-largescale-xml-library-flaws.aspx

http://www.theregister.co.uk/2009/08/06/xml_flaws/

http://voices.washingtonpost.com/securityfix/2009/08/researchers_xml_security_flaw.html

[Editor's Note (Northcutt): Uh Oh. This is not good. XML is behind the scenes in almost everything. I wonder whether XML gateways could be used to mitigate the problem to some extent.]

Blog: 5 lessons from the dark side of cloud computing

5 lessons from the dark side of cloud computing

InfoWorld: Robert Lemos, CIO.com; August 6, 2009

While many companies are considering moving applications to the cloud, the security of the third-party services still leaves much to be desired, security experts warned attendees at last week's Black Hat Security Conference.

The current economic downturn has made cloud computing a hot issue, with startups and smaller firms rushing to save money using virtual machines on the Internet and larger firms pushing applications such as customer relationship management to the likes of Salesforce.com. Yet companies need to be more wary of the security pitfalls in moving their infrastructure to the cloud, experts say.

Wednesday, August 5, 2009

Blog: Warning Issued on Web Programming Interfaces

Warning Issued on Web Programming Interfaces
Technology Review (08/05/09) Naone, Erica

Application programming interfaces (APIs), software specifications that allow Web sites and services to interact with each other, have been a major factor in the rapid growth of Web applications, but security experts at the DEFCON hacking conference revealed ways of exploiting APIs to attack different sites and services. APIs have been key to the success of many social sites. John Musser, founder of Programmable Web, a Web site for users of mashups and APIs, says that the traffic driven to Twitter through APIs, such as from desktop clients, is four to eight times greater than the traffic that comes through Twitter's Web site. However, Nathan Hamiel from Hexagon Security Group and Shawn Moyer from Agura Digital Security say that APIs could be exploited by hackers. The security researchers note that several APIs are often stacked on top of each other. Hamiel says this kind of stacking could lead to security problems on several layers, and that APIs can open sites to new kinds of threats. In the presentation, Hamiel demonstrated that an attacker might be able to use an API in unintended ways to gain access to parts of a Web site that should not be visible to the public. Hamiel says whenever a site adds functionality it increases its attack surface, and the same thing that makes APIs powerful often makes them vulnerable. Musser says any site that builds an API on top of another site's API is relying on someone else's security, and it is difficult to determine how well that underlying layer was built and how well it handles security. WhiteHat Security founder and chief technology officer Jeremiah Grossman says sites that publish APIs can find it difficult to discover security flaws in their own APIs, and it is often hard to tell how a third-party site is using an API and whether that site has been compromised by an attacker.
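
As a contrived illustration of that point, the hypothetical Python sketch below shows an API layered on the same backend as an HTML page: the page filters out a private field, while the API does not, quietly widening the attack surface. The data, function names, and leak are invented for the example and do not describe any real site's API.

USERS = {
    "alice": {"name": "Alice", "bio": "Researcher", "email": "alice@example.com"},
}

def render_profile_page(username):
    # The public web page deliberately shows only non-sensitive fields.
    user = USERS[username]
    return f"<h1>{user['name']}</h1><p>{user['bio']}</p>"

def profile_api(username):
    # The newer API reuses the same backend record but forgets to filter it,
    # exposing data the site never meant to make public.
    return dict(USERS[username])

print(render_profile_page("alice"))   # no email address here
print(profile_api("alice"))           # the email address leaks through the API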

View Full Article

Tuesday, August 4, 2009

Blog: New Epidemic Fears: Hackers

New Epidemic Fears: Hackers
The Wall Street Journal (08/04/09) P. A6; Worthen, Ben

Under the economic stimulus bill and other U.S. federal government proposals, hospitals and doctors' offices that invest in electronic records systems may receive compensation from part of a $29 billion fund. However, such systems can be vulnerable to security breaches. Last year health organizations publicly disclosed 97 data breaches, up from 64 in 2007, including lost laptops with patient data on them, misconfigured Web sites that accidentally disclosed confidential information, insider theft, and outside hackers breaking into a network. Because most healthcare organizations keep patients' names, Social Security numbers, dates of birth, and payment information such as insurance and credit cards, criminals often target these organizations for identity theft. "Healthcare is a treasure trove of personally identifiable information," says Secure Works researcher Don Jackson. The U.S. Federal Trade Commission says medical fraud is involved in about 5 percent of all identity theft. Smaller practices can become easier targets, as they rarely have a technology professional or security specialists, and often lack a security plan or proper tools. As part of the stimulus bill, the government plans to release guidelines over the next year describing what a secure information system should look like, but critics warn that data encryption and other security functions are worthless if they are not correctly used. "If you take a digital system and implement it in a sloppy way, it doesn't matter how good the system is," says World Privacy Forum executive director Pam Dixon. "You're going to introduce risk."

View Full Article

Monday, August 3, 2009

Blog: NIST Issues Final Version of SP 800-53; Enables Rapid Adoption of the Twenty Critical Controls (Consensus Audit Guidelines)

NIST Issues Final Version of SP 800-53; Enables Rapid Adoption of the Twenty Critical Controls (Consensus Audit Guidelines)

SANS NewsBites Vol. 11 Num. 61 (August 3, 2009)

The National Institute of Standards and Technology (NIST) has published the final version of SP 800-53, Revision 3, "Recommended Security Controls for Federal Information Systems and Organizations." The document is the first major revision of the guidelines for implementing the Federal Information Security Management Act (FISMA) since 2005. Among the changes in this updated version are "A simplified, six-step Risk Management Framework; Recommendations for prioritizing security controls during implementation or deployment; and Guidance on using the Risk Management Framework for legacy information systems and for external information system services providers." The new version of 800-53 solves three fatal problems in the old version - calling for common controls (rather than system-by-system controls), continuous monitoring (rather than periodic certifications), and prioritizing controls (rather than asking IGs to test everything). Those are the three drivers for the 20 Critical Controls (CAG). In at least five agencies, contractors that previously did 800-53 evaluations are being re-assessed on their ability to implement and measure the effectiveness of the 20 Critical Controls in those agencies. One Cabinet-level Department has proven that implementing the 20 Critical Controls with continuous monitoring reduced the overall risk by 84% across all departmental systems worldwide.

http://gcn.com/Articles/2009/08/03/NIST-release-of-800-53-rev-3-080309.aspx

http://csrc.nist.gov/publications/nistpubs/800-53-Rev3/sp800-53-rev3-final.pdf

[Editor's Note (Paller): This is very good news. John Gilligan reports that a new version of the 20 Critical Controls document will be released next week with a table, put in the document at NIST's request, showing how the 20 Critical Controls are a proper subset of the priority one controls in the revised 800-53. A course on implementing and testing the 20 Critical Controls will be run in San Diego next month and in Chicago in October: https://rr.sans.org/ns2009/description.php?tid=3467.]

Blog: NCSA Researchers Receive Patent for System that Finds Holes in Knowledge Bases

NCSA Researchers Receive Patent for System that Finds Holes in Knowledge Bases
University of Illinois at Urbana-Champaign (08/03/09) Dixon, Vince

Researchers at the National Center for Supercomputing Applications (NCSA) at the University of Illinois, Urbana-Champaign, have received a patent for a method of determining the completeness of a knowledge base by mapping the corpus and locating weak links and gaps between important concepts. NCSA research programmer Alan Craig and former NCSA staffer Kalev Leetaru were building databases using automatic Web crawling and needed a way of knowing when to stop adding to the collection. "So this is a method to sort of help figure that out and also direct that system to go looking for more specific pieces of information," says Craig. Using any collection of information, the technique graphs the data, analyzes conceptual distances within the graph, and identifies parts of the corpus that are missing important documents. The system then suggests what concepts may best fill those gaps, creating a link between two related concepts that might otherwise not have been found. Leetaru says this system helps users complete knowledge bases with information they are initially unaware of. Leetaru says the applications for this method are limitless, as the corpus does not have to be computer-based and the method can be applied to any situation involving a collection of data that users are not sure is complete.
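
The article does not give implementation details, but the general idea of flagging distant concepts in a graph can be sketched in Python as follows; the toy documents, co-occurrence graph, and distance threshold are assumptions made for illustration, not the patented NCSA method.

from collections import deque

documents = {
    "doc1": {"supercomputing", "visualization"},
    "doc2": {"visualization", "virtual reality"},
    "doc3": {"supercomputing", "data mining"},
    "doc4": {"data mining", "web crawling"},
}

# Build an undirected concept graph from co-occurrence within documents.
graph = {}
for concepts in documents.values():
    for a in concepts:
        for b in concepts:
            if a != b:
                graph.setdefault(a, set()).add(b)

def distance(src, dst):
    # Breadth-first search for the shortest conceptual distance.
    seen, queue = {src}, deque([(src, 0)])
    while queue:
        node, d = queue.popleft()
        if node == dst:
            return d
        for nxt in graph.get(node, ()):
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, d + 1))
    return float("inf")  # unreachable concepts are an obvious gap

# Flag concept pairs that sit far apart as candidate gaps in the knowledge base.
concepts = sorted(graph)
for i, a in enumerate(concepts):
    for b in concepts[i + 1:]:
        if distance(a, b) >= 3:
            print(f"possible gap: no close connection between '{a}' and '{b}'")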

View Full Article

Blog: Computers Unlock More Secrets of the Mysterious Indus Valley Script

Computers Unlock More Secrets of the Mysterious Indus Valley Script
UW News (08/03/09) Hickey, Hannah

A team of Indian and U.S. researchers, led by University of Washington professor Rajesh Rao, is attempting to decipher the script of the ancient Indus Valley civilization. Some researchers have questioned whether the script's symbols are actually a language, or are instead pictograms of political or religious icons. The researchers are using computers to extract patterns from the ancient Indus symbols. The researchers have uncovered several distinct patterns in the symbols' placement in sequences, which has led to the development of a statistical model for the unknown language. "The statistical model provides insights into the underlying grammatical structure of the Indus script," Rao says. "Such a model can be valuable for decipherment, because any meaning ascribed to a symbol must make sense in the context of other symbols that precede or follow it." Calculations show that the order of the symbols is meaningful, as taking one symbol from a sequence and changing its position creates a new sequence that has a much lower probability of belonging to the language. The researchers say the presence of such distinct rules for sequencing provides support for the theory that the unknown script represents a language. The researchers used a Markov model, a statistical model that estimates the likelihood of a future event, such as inscribing a particular symbol, based on previously observed patterns. One application uses the statistical model to fill in missing symbols on damaged artifacts, which can increase the pool of data available for deciphering the writings.
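
A toy version of such a bigram Markov model fits in a few lines of Python; the symbol names and sequences below are invented for illustration and are not actual Indus script data, and the real model is considerably more sophisticated (handling sparse data, smoothing, and a much larger corpus).

from collections import defaultdict

corpus = [
    ["fish", "jar", "arrow"],
    ["fish", "jar", "man"],
    ["jar", "arrow", "man"],
]

# Count how often each symbol follows another in the corpus.
transitions = defaultdict(lambda: defaultdict(int))
for sequence in corpus:
    for prev, nxt in zip(sequence, sequence[1:]):
        transitions[prev][nxt] += 1

def next_probability(prev, nxt):
    total = sum(transitions[prev].values())
    return transitions[prev][nxt] / total if total else 0.0

def sequence_probability(sequence):
    # Multiply transition probabilities; sequences that violate the learned
    # ordering patterns score much lower, as the article describes.
    p = 1.0
    for prev, nxt in zip(sequence, sequence[1:]):
        p *= next_probability(prev, nxt)
    return p

print(sequence_probability(["fish", "jar", "arrow"]))   # follows the patterns
print(sequence_probability(["arrow", "fish", "jar"]))   # reordered: probability 0

# Filling in a damaged inscription: pick the most likely symbol after "fish".
print(max(transitions["fish"], key=transitions["fish"].get))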

View Full Article
