Tuesday, March 31, 2009

Blog: New Architects of Service-Oriented Computing, or SOC for Short!

New Architects of Service-Oriented Computing, or SOC for Short!
ICT Results (03/31/09)

European researchers working on the Sensoria project have developed service-oriented computing (SOC) tools for building robust software on top of service-oriented architectures. The project's researchers have developed a Software Development Environment, which they say makes composing services easier through the use of graphical design tools. Services can be located dynamically and triggered by other services, and the relationships between services remain loose and flexible as a result of the nature of the SOC architecture. Service-oriented applications are designed using the standard Unified Modeling Language or domain-specific modeling languages as needed. At the back end, mathematical analysis helps reveal bottlenecks, errors, and violations of service contracts. The project has also developed mathematical foundations, techniques, and approaches for more pragmatic and reliable software engineering. The Sensoria developers say the key to scalable, cost-effective SOC is the ability to "compose" existing services so they perform higher-level functions that form new services in their own right and can be re-orchestrated into even higher-level compositions.

View Full Article
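
The composition idea in that last sentence is easy to illustrate. Below is a minimal, hypothetical Python sketch (the service names and functions are invented for illustration and are not Sensoria APIs): two low-level services are orchestrated into a higher-level "book trip" service, which could itself be published and composed again.

```python
# Hypothetical illustration of service composition: two low-level services
# are orchestrated into a higher-level service that can itself be reused.

def flight_service(origin: str, destination: str) -> dict:
    """Stand-in for a remote flight-booking service."""
    return {"flight": f"{origin}->{destination}", "price": 120.0}

def hotel_service(city: str, nights: int) -> dict:
    """Stand-in for a remote hotel-booking service."""
    return {"hotel": f"Hotel {city}", "price": 80.0 * nights}

def book_trip(origin: str, destination: str, nights: int) -> dict:
    """Higher-level composite service built from the two services above."""
    flight = flight_service(origin, destination)
    hotel = hotel_service(destination, nights)
    return {
        "itinerary": [flight, hotel],
        "total": flight["price"] + hotel["price"],
    }

if __name__ == "__main__":
    print(book_trip("Munich", "Pisa", nights=2))
```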

Sunday, March 29, 2009

Blog: Vast Spy System Loots Computers in 103 Countries

Vast Spy System Loots Computers in 103 Countries
New York Times (03/29/09) Markoff, John

Researchers at the University of Toronto's Munk Center for International Studies say a massive electronic spying operation has stolen documents from hundreds of government and private offices around the world. The researchers say the system was controlled from computers located almost exclusively in China, but they cannot conclusively say the Chinese government is involved. The researchers were asked by the office of the Dalai Lama to examine its computers for signs of malware and discovered a vast operation that, in less than two years, had infiltrated at least 1,295 computers in 103 countries, including machines belonging to many embassies, foreign ministries, other government offices, and the Dalai Lama's Tibetan exile centers in India, Brussels, London, and New York. The researchers say that in addition to spying on the Dalai Lama, the system, which they named GhostNet, also focused on governments in South Asian and Southeast Asian countries. In terms of the number of countries affected, GhostNet is by far the largest spying operation yet exposed, and it is believed to be the first time researchers have been able to uncover the workings of a computer system used for intrusions of such magnitude. The researchers say GhostNet continues to infect and monitor more than a dozen new computers a week. The malware not only "phishes" for unwary victims but also "whales" for specific, important targets. It can even turn on the video and audio features of an infected computer, enabling its operators to see and hear what goes on in front of the machine. The researchers have notified international law enforcement agencies of the spying operation, which they believe exposes shortcomings in the legal structure of cyberspace.

View Full Article

Saturday, March 28, 2009

Blog: A New Step Towards Quantum Computers

A New Step Towards Quantum Computers
Ruhr-University Bochum (Germany) (03/28/09)

Researchers from Dortmund, St. Petersburg, Washington, and the Ruhr-Universitaet Bochum (RUB) in Germany have succeeded in aligning electron spins. The researchers were also able to rotate the spin, using a laser pulse, in any desired direction at any time, as well as read out the direction with another laser pulse. "This is the first, important step toward addressing these 'quantum bits,' which will form an integral part of data transfer systems and processors in the future," says RUB professor Andreas Wieck. By applying an external magnetic field, an electron's spin can be accelerated or decelerated, causing it to precess and rotate its axis to virtually any desired angle. If these variations could be used to carry information, it would be possible to store more than just 0s and 1s in an electron. A single electron has a very small measurable effect, requiring highly sensitive instruments, but by grouping electrons into ensembles the researchers created signals that are stronger by roughly six orders of magnitude, making them very robust and easy to record. The team managed to confine nearly one million electrons, each in a virtually identical indium arsenide island, or quantum dot, improving their measurable effect.

View Full Article
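
For context, the "wavering" described above is ordinary spin precession. As a rough, textbook-level illustration (not taken from the paper itself), an electron spin in a magnetic field $B$ precesses at the Larmor frequency, so the rotation angle is set by the field strength and the elapsed time:

$$\omega_L = \frac{g\,\mu_B\,B}{\hbar}, \qquad \varphi(t) = \omega_L\, t,$$

where $g$ is the electron g-factor and $\mu_B$ the Bohr magneton. Timing the laser read-out pulse therefore selects which spin orientation is measured.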

Friday, March 20, 2009

Blog: Multicore Chips Pose Next Big Challenge for Industry

Multicore Chips Pose Next Big Challenge for Industry
IDG News Service (03/20/09) Shah, Agam

Increasing the number of processing cores has become the main way of improving the performance of server and PC chips, but any added benefits will be significantly reduced if the industry is unable to overcome hardware and programming challenges, according to participants at the recent Multicore Expo. Most modern software is written for single-core chips and will need to be rewritten or updated to capitalize on the increasing number of cores that chip manufacturers are adding to their products, says analyst Linley Gwennap. Off-the-shelf applications can run faster on central processing units with up to four processor cores, but beyond that performance levels stall, and may even decrease as additional cores are added, Gwennap says. Chip manufacturers and system builders are working to educate software developers and provide them with better tools for multicore programming. Intel and Microsoft have provided $20 million to open two research centers at U.S. universities dedicated to multicore programming. Gwennap says the lack of multicore programming tools for mainstream developers may be the industry's biggest obstacle. Nevertheless, some software vendors are developing parallel code for simple tasks, such as image and video processing, Gwennap says. For example, Adobe has rewritten Photoshop so the program can assign duties to specific x86 cores, improving performance three- to four-fold, he says.

View Full Article
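
The Photoshop example boils down to splitting pixel data into independent chunks and assigning each chunk to a core. Here is a minimal sketch of that pattern in Python (illustrative only; Adobe's actual implementation targets x86 cores directly and is not shown here).

```python
# Minimal sketch of data-parallel image processing: split an image into
# row chunks and brighten each chunk on a separate core.
from multiprocessing import Pool

def brighten_rows(rows):
    """Brighten one chunk of rows (each pixel is a 0-255 grayscale value)."""
    return [[min(255, px + 40) for px in row] for row in rows]

def brighten_image(image, workers=4):
    chunk = max(1, len(image) // workers)
    chunks = [image[i:i + chunk] for i in range(0, len(image), chunk)]
    with Pool(processes=workers) as pool:
        processed = pool.map(brighten_rows, chunks)  # one chunk per worker
    return [row for part in processed for row in part]

if __name__ == "__main__":  # guard required by multiprocessing on some platforms
    img = [[(x + y) % 256 for x in range(640)] for y in range(480)]
    out = brighten_image(img)
    print(len(out), len(out[0]), out[0][0])
```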

Thursday, March 19, 2009

Blog: Will HIPAA changes torpedo health IT stimulus?

Will HIPAA changes torpedo health IT stimulus?

Posted by Dana Blankenhorn; March 19th, 2009 @ 9:51 am

The industry charged with scaring physicians about HIPAA requirements (and avoiding automation like the plague) has gone into overdrive over changes to the law created by the Obama stimulus.

The stimulus, by the way, is now called the American Recovery and Reinvestment Act. The part dealing with health IT is called Health Information Technology for Economic and Clinical Health (HITECH — get it?).

In brief, the new act extends the definition of “covered entities” to include all those a physician’s practice does business with — lawyers, accountants, suppliers, etc.

So if you’re handing your lawyer patient records (as in a malpractice suit) that exchange of data is now covered under HIPAA. They can’t spread it around as part of your defense.

HITECH also tells all “covered entities” they have to notify authorities if data is lost. Previously only Arkansas and California had this requirement — apparently everywhere else doctors were dropping laptops with patient data into trash cans and keeping it a secret.

Wednesday, March 18, 2009

Blog: Stimulus Package Includes Changes to HIPAA Privacy Rules

SANS NewsBites Vol. 11 Num. 23 (3/24/2009)

Stimulus Package Includes Changes to HIPAA Privacy Rules (March 18, 2009)

The federal stimulus package includes amended rules regarding the Health Insurance Portability and Accountability Act (HIPAA). The new provisions require doctors to keep records of when they disclose patient information. The previous regulations allowed doctors to share patient information for treatment, payment or healthcare reasons without noting when the information was shared. The new provisions do not take effect until January 2014. Medical practices are also required to post notices of data security breaches if 10 or more patients are affected. If the number of affected patients is 500 or more, the practice must notify all affected patients, a media outlet and the US Department of Health and Human Services (HHS).

http://www.aafp.org/online/en/home/publications/news/news-now/government-medicine/20090318hipaa-security-rules.html
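
As a quick illustration of the layered notification rule described above (the thresholds come straight from the summary; the function itself is a hypothetical sketch, not legal guidance):

```python
# Hypothetical sketch of the notification tiers described in the summary.
def breach_notifications(patients_affected: int) -> list[str]:
    actions = []
    if patients_affected >= 10:
        actions.append("post public notice of the breach")
    if patients_affected >= 500:
        actions += ["notify all affected patients",
                    "notify a media outlet",
                    "notify the US Department of Health and Human Services"]
    return actions

print(breach_notifications(12))    # smaller breach: public notice only
print(breach_notifications(750))   # large breach: notice plus the 500+ duties
```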

[Editor's Note (Cole): If you work in health care now is the time to act, even though the new laws will not take effect for 5 more years. As systems and networks are re-designed, start to incorporate detailed logging, concise access lists and control of patient information. It is easier to design security in than try to fix it later.]

Tuesday, March 17, 2009

Blog: Hadoop, a Free Software Program, Finds Uses Beyond Search

Hadoop, a Free Software Program, Finds Uses Beyond Search
New York Times (03/17/09) P. B3; Vance, Ashlee

Hadoop software has quickly become widely used by the top search engines and other Web sites to analyze and access the unprecedented amounts of data created by the Internet. The free program maps information over thousands of computers and offers a simpler method for writing analytical queries, thus enabling users to explore data by simply asking a question. "It's a breakthrough," says Lawrence Livermore National Laboratory's Mark Seager. "I think this type of technology will solve a whole new class of problems and open new services." Hadoop is based on MapReduce technology developed by Google. MapReduce, when paired with the file management technology Google uses to catalog the Web, can be used to index the entire Internet on a regular basis and analyze the vast amounts of information to determine the quality of search results and how people use the company's various services. MapReduce makes it possible to break large data sets into small pieces, spread those pieces across thousands of computers, ask the computers questions, and receive cohesive answers. Google has largely kept the MapReduce technology a secret, but the company published papers on some of the underlying techniques, which software consultant Doug Cutting used to create Hadoop. Hadoop can track people's behavior to see what types of stories and content they view, and then match ads with that content. Microsoft uses Hadoop to improve its search system, and Facebook uses the program to determine how closely linked people are based on who appears in users' photographs.

View Full Article
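
The map/reduce split is easiest to see in a word-count example. The sketch below follows the style of Hadoop Streaming (where mappers and reducers read standard input and write standard output); it is a conceptual illustration, not code from any of the companies mentioned.

```python
# Conceptual word count in the Hadoop Streaming style: the mapper emits
# (word, 1) pairs; Hadoop sorts them by key; the reducer sums per word.
from itertools import groupby

def mapper(lines):
    for line in lines:
        for word in line.split():
            yield word.lower(), 1

def reducer(pairs):
    # Pairs arrive sorted by key, so consecutive equal keys can be grouped.
    for word, group in groupby(pairs, key=lambda kv: kv[0]):
        yield word, sum(count for _, count in group)

if __name__ == "__main__":
    text = ["hadoop maps data over thousands of computers",
            "hadoop offers a simpler method for writing queries"]
    shuffled = sorted(mapper(text))          # stands in for Hadoop's shuffle/sort
    for word, total in reducer(shuffled):
        print(f"{word}\t{total}")
```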

Friday, March 13, 2009

Blog: New System for Improving Decision Support Systems

New System for Improving Decision Support Systems
Universidad Politecnica de Madrid (03/13/09)

Universidad Politecnica de Madrid School of Computing researchers have developed a system designed to improve decision-making processes in complex situations. The system was tested on the restoration of Lake Svyatoye in Belarus, which was contaminated by the Chernobyl accident. Professors Antonio Jimenez, Alfonso Mateos, and Sixto Rios, from the Department of Artificial Intelligence's Decision Analysis and Statistics Group, aimed to account for incomplete information and any possible effects those gaps could have on decision making. Multi-Attribute Utility Theory is often used to solve decision-making problems. Under the theory, the decision maker builds a hierarchy of objectives, identifies a set of alternatives and each alternative's impact on those objectives, and then quantifies his or her preferences. The new system uses two approaches to manage incomplete information, which occurs when the impacts of some alternatives on some attributes are unknown. The first approach redistributes the weights of criteria with missing values or impacts throughout the objectives hierarchy and across the other criteria, which means the criteria hierarchy and its assigned weights vary when each alternative is analyzed, depending on the criteria with missing values. The second approach associates the criterion range, the set of possible values, as the impact for a criterion with missing values, which means the entire range of values is considered possible and equally likely.

View Full Article
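
The additive utility model and the first missing-value strategy can be sketched in a few lines. The Python example below is a simplified illustration of the general idea (renormalizing weights over the criteria an alternative actually has values for); the criteria, weights, and alternatives are made up and are not the group's actual model.

```python
# Simplified additive multi-attribute utility with missing impacts handled
# by redistributing the weight of the missing criteria over the rest.

def utility(impacts, weights):
    """impacts: criterion -> utility in [0, 1], or None if unknown."""
    known = {c: u for c, u in impacts.items() if u is not None}
    if not known:
        return None
    remaining = sum(weights[c] for c in known)     # weight mass that is left
    return sum(weights[c] / remaining * u for c, u in known.items())

weights = {"cost": 0.5, "dose_reduction": 0.3, "social_impact": 0.2}
alternatives = {
    "no_action":     {"cost": 1.0, "dose_reduction": 0.0, "social_impact": 0.2},
    "countermeasure": {"cost": 0.6, "dose_reduction": 0.7, "social_impact": None},
}
for name, impacts in alternatives.items():
    print(name, round(utility(impacts, weights), 3))
```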

Blog: Society's Vital Networks Prone to 'Explosive' Changes

Society's Vital Networks Prone to 'Explosive' Changes
New Scientist (03/13/09) Barras, Colin

Researchers led by University of California, Santa Cruz professor Dimitris Achlioptas have discovered that controlling the development of random networks could lead to a better understanding of how to slow or stop the spread of diseases or make delivery networks more efficient. Networks that grow randomly, such as the connections between computers that create the Internet, often quickly gain a central backbone of connections that makes it easy to travel between any two points. The researchers used simulations to find a way of growing a network randomly while delaying the emergence of the backbone. However, they found that when the network becomes fully connected it tends to occur in an explosive manner. Random networks usually grow by selecting two random nodes that become connected. Instead, the researchers picked two pairs of random nodes, but only connected the pair with the fewest pre-existing connections to other nodes. The result is a network that grows steadily but does not become fully connected for a longer time. Eventually, the addition of a single connection triggers an instantaneous phase change and the network becomes fully connected. A variation of this technique makes it possible to execute the opposite process and accelerate the development of a network's backbone. "We know that for some networks, like the Internet, connectivity is a fundamental desired property," says research team member Raissa D'Souza from the University of California, Davis. "For others, like a virus spreading through a network of humans or computers, connectivity is a liability."

View Full Article
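
The growth rule described above (draw two candidate node pairs, keep the "less connected" one) can be simulated directly. The rough Python sketch below tracks the largest connected component with union-find so the late, sharp jump in connectivity can be watched; it is a simplified reading of the rule as stated in the article, not the researchers' exact model.

```python
# Rough simulation of delayed connectivity: at each step, draw two candidate
# edges and keep the one whose endpoints have fewer existing connections.
import random

def find(parent, x):
    while parent[x] != x:
        parent[x] = parent[parent[x]]   # path compression
        x = parent[x]
    return x

def simulate(n=20000, seed=1):
    random.seed(seed)
    degree = [0] * n
    parent = list(range(n))
    size = [1] * n                      # component sizes at union-find roots
    largest = 1
    for step in range(1, int(1.2 * n) + 1):
        a, b = random.randrange(n), random.randrange(n)
        c, d = random.randrange(n), random.randrange(n)
        # Keep the candidate pair with the fewer pre-existing connections.
        u, v = (a, b) if degree[a] + degree[b] <= degree[c] + degree[d] else (c, d)
        degree[u] += 1
        degree[v] += 1
        ru, rv = find(parent, u), find(parent, v)
        if ru != rv:
            parent[ru] = rv
            size[rv] += size[ru]
            largest = max(largest, size[rv])
        if step % (n // 4) == 0:
            print(f"edges={step:6d}  largest component={largest / n:.2%}")

if __name__ == "__main__":
    simulate()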

Thursday, March 12, 2009

Blog: Berners-Lee: Semantic Web Will Have Privacy Built-In

Berners-Lee: Semantic Web Will Have Privacy Built-In
ZDNet UK (03/12/09) Espiner, Tom

World Wide Web Consortium director Sir Tim Berners-Lee says the Semantic Web will improve online privacy protection by allowing Internet users to control who can access their data. Researchers have warned that the combination of personal information and a semantic Web could lead to privacy problems, including increased data mining. However, Berners-Lee says that teams working on the Semantic Web project are working to ensure that privacy principles are built into the Semantic Web's architecture. "The Semantic Web project is developing systems which will answer where data came from and where it's going to--the system will be architectured for a set of appropriate uses," he says. Berners-Lee also says the Semantic Web will be based on the principle that people who make a Web request for information held by third parties, such as a company or a government agency, will be able to see all the data those organizations will keep on them. The Semantic Web project will include accountable data-mining components, which enable people to know who is mining data on them, and it is exploring making the Web adhere to privacy preferences set by the users.

View Full Article

Tuesday, March 10, 2009

Blog: Application Security Best Practices: A New Maturity Model for Building Security In

SANS NewsBites Vol. 11 Num. 20 (3/13/2009)

Application Security Best Practices: A New Maturity Model for Building Security In

(March 9 & 10, 2009)

The Building Security In Maturity Model (BSIMM) is "a set of best practices developed by Cigital and Fortify" that draws together data from nine software security initiatives to help software developers build more secure products. The model "breaks down" the best practices into 12 areas, including strategy and metrics, security features and design, and configuration and vulnerability management.

http://www.csoonline.com/article/print/483716

http://www.scmagazineuk.com/Secrets-of-the-providers-detailed-in-new-report/article/128448/

http://blogs.wsj.com/digits/2009/03/04/new-effort-hopes-to-improve-software-security/

http://bsi-mm.com/

[Editor's Note (Pescatore): Good stuff, but the real value is in the listed best practices and being able to see which are common practice and which are best practice, vs. the idea of maturity levels.

(Paller): John Pescatore is exactly right (as usual). The value here is in the common, best practices that can instruct other organizations that want to learn from these leaders. We talked at length with two of the biggest participants to better understand what they have learned about security education for programmers. They explained that security awareness training was not helpful at all unless it was complemented by actual secure coding training, often including the use of libraries that make secure coding easy.]

Blog: An Upgrade for the Web; HTML5

An Upgrade for the Web
Technology Review (03/10/09) Naone, Erica

HTML 5 will make the latest high-bandwidth Internet applications run even better, and could help lead the way to an application-enabled Web. Currently, Web applications are limited because Web browsers were not designed to run full desktop-style programs. For example, most browsers can only run a single piece of JavaScript code at a time, which limits the functionality of Web applications. Furthermore, different browsers react differently to existing Web standards. HTML 5 is designed to solve these problems. "We're trying to find ways for people to be able to take the live, programmable documents that make up the Web and start integrating them with all these other pieces outside the scope of the browser," says the Mozilla Foundation's Christopher Blizzard. World Wide Web Consortium HTML working group member Michael Smith says the most important part of HTML 5 has been creating specifications to ensure that different browsers perform more tasks in the same manner. To help browsers run demanding Web applications, HTML 5 has a feature called worker threads, which allows a browser to manage heavier computations by running JavaScript in the background while the user interacts with the application. HTML 5 also features new video and audio capabilities. The Canvas feature enables developers to create HTML graphics that match graphics built using Adobe's Flash software. HTML 5 also places a greater emphasis on enabling Web applications to work offline.

Blog: Cyberattack Mapping Could Alter Security Defense Strategy

Cyberattack Mapping Could Alter Security Defense Strategy
SearchSecurity.com (03/10/09) Howard, Alexander B.

During a recent seminar at Harvard University, researchers from Sandia National Laboratories presented maps they developed of massive cyberattacks against large computer networks. The maps—which are made up of a series of colored dots, lines, and graphs—simulate a type of cyberattack known as a root attack, in which hackers try to gain control of a computer at its most basic level. Sandia's Steven Y. Goldsmith says the maps could help IT security professionals protect their networks from attacks. Goldsmith also has created intelligent white hat software agents that look for suspicious requests from internal or external sources. When the agents detect an attack, they cut off malicious agents from the group, which only authorizes authenticated data. He says the technology will enable networks to defend themselves. Goldsmith says that both aspects of Sandia's research could someday be used together to improve the effectiveness of enterprise intrusion-detection software.

View Full Article

Monday, March 9, 2009

Blog: NIST Suggests Areas for Further Security Metrics Research

NIST Suggests Areas for Further Security Metrics Research
Government Computer News (03/09/09) Jackson, William

Scientists at the National Institute of Standards and Technology's (NIST's) Computer Security Division have identified several areas that need to be researched to spur the creation of useful security metrics. One key area is the creation of formal models of security measurement and metrics. NIST scientists say the absence of these models and other formalisms has made it difficult to create security metrics that are useful in practice. Another area that needs to be researched is historical data collection and analysis. The scientists say that predictive estimates of the security of software components and applications that are being examined should be able to be derived from historical data collected about the characteristics of similar types of software and the vulnerabilities those applications experienced. The scientists observe that insights into security metrics could be gained by using analytical techniques on historical data in order to identify trends and correlations, discover unexpected relationships, and uncover other predictive interactions. Finally, the scientists say the development of computing components that are designed for measurement would be a significant step toward developing effective security metrics.

View Full Article

Sunday, March 8, 2009

Blog: Wolfram Alpha: 'A new paradigm for using computers and the web'

Wolfram Alpha: 'A new paradigm for using computers and the web'

Posted by Larry Dignan; March 8th, 2009@ 12:29 pm

Another week, another Google killer. Last week, it was Twitter as Google killer. This week it's Wolfram Alpha. The difference with Wolfram Alpha is that it has the pedigree, engineering heft and perhaps a better mousetrap to actually live up to the billing.

Techmeme is all aflutter with talk of Wolfram Alpha. Dan Farber notes that Stephen Wolfram is a scientist who has recorded a few breakthroughs and a little controversy. In a nutshell, Wolfram Alpha blends natural language, a new search model and an algorithm that takes all the data on the Web and makes it "computable." Wolfram recently outlined his latest creation and added:

I think it's going to be pretty exciting. A new paradigm for using computers and the web.

Dan writes about Wolfram:

He received his Ph.D. in theoretical physics from Caltech in 1979 when he was 20 and has focused most of his career on probing complex systems. In 1988 he launched Mathematica, powerful computational software that has become the gold standard in its field. In 2002, Wolfram produced a 1,280-page tome, A New Kind of Science, based on a decade of exploration in cellular automata and complex systems.

In May, Wolfram will launch Wolfram Alpha, which is dubbed a computational knowledge engine. ...

For the brainiacs in the house, Nova Spivack has a long post outlining Wolfram Alpha (it's a must read). Simply put, if Spivack's outline is only half on target, Wolfram Alpha could be big.

Saturday, March 7, 2009

Blog: Noise Could Mask Web Searchers' IDs

Noise Could Mask Web Searchers' IDs
New Scientist (03/07/09) Marks, Paul

Microsoft researchers say that adding noise to search engine records could protect Web users' identities, and that implementing such a technique would be a major step toward provable privacy. Records of Web searches are extremely useful to software engineers looking to improve search technology, and can provide valuable insight for scientists exploring digital search behaviors. However, attempts to make search data anonymous have been mostly unsuccessful. Microsoft researchers Krishnaram Kenthapadi, Nina Mishra, Alex Ntoulas, and Aleksandra Korolova say they have developed a safe way to release search data. The researchers propose publishing data associated only with the most popular queries, so that specific, rarely performed searches, such as for individual names or unique interests, cannot be used to identify people. The researchers also insert noise into the data by adding small random values to the published figures. Korolova says that adding the noise gives the data provable privacy, and the amount of noise added defines the level of privacy that can be guaranteed. She says the added noise strikes a balance between guaranteeing privacy and providing useful data sets.

View Full Article
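
A toy version of the two safeguards described (publish only popular queries, and perturb the published counts with random noise) might look like the sketch below. The threshold and noise scale are made-up parameters, not the values used by the Microsoft researchers, and the provable-privacy guarantee in their work depends on carefully calibrated noise rather than the simple Gaussian used here.

```python
# Toy sketch: release only frequent queries, with noisy counts.
import random

def noisy_release(query_counts, threshold=50, noise_scale=10.0, seed=7):
    random.seed(seed)
    released = {}
    for query, count in query_counts.items():
        if count < threshold:                  # rare queries are too identifying
            continue
        noise = random.gauss(0, noise_scale)   # calibrated (e.g., Laplace) in practice
        released[query] = max(0, round(count + noise))
    return released

log = {"weather boston": 420, "jane q public 123 elm st": 1, "python tutorial": 96}
print(noisy_release(log))
```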

Thursday, March 5, 2009

Blog: Computer Scientists Deploy First Practical Web-Based Secure, Verifiable Voting System

Computer Scientists Deploy First Practical Web-Based Secure, Verifiable Voting System
Harvard University School of Engineering and Applied Sciences (03/05/09) Rutter, Michael Patrick

The Harvard School of Engineering and Applied Sciences' Center for Research on Computation and Society (CRCS) and scientists at the Universite Catholique de Louvain in Belgium deployed a Web-based, secure, verifiable voting system for the Belgian university's election of its president, held in early March. Called Helios, the system was developed by CRCS fellow Ben Adida. "Helios allows any participant to verify that their ballot was correctly captured, and any observer to verify that all captured ballots were correctly tallied," Adida says. "We call this open-audit voting because the complete auditing process is now available to any observer." The open source software uses advanced cryptographic techniques to maintain ballot secrecy while providing a mathematical proof that the election tally was correctly computed. Helios uses public-key homomorphic encryption, a method in which a public key is used to encrypt a message, in this case a vote. Homomorphic encryption allows messages to be combined while still encrypted, which works for counting votes, and decrypting the combined message, here the election tally, requires multiple private keys. In an election, voters receive a tracking number for each of their votes, and each vote is encrypted with the election public key before leaving the voter's browser. Voters can then use their tracking numbers to verify that their ballot was correctly captured by the voting system, which publishes a list of all tracking numbers received before tallying. Finally, the voter, or any observer, can verify that the tracking numbers and votes were tallied appropriately. Adida says the encryption allows the entire verification process to take place without revealing the contents of each vote.

View Full Article
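
The "combine while encrypted" property is the crux. The toy Python sketch below uses additively homomorphic (exponential) ElGamal with insecurely small, hard-coded parameters to show how a yes/no tally can be computed from encrypted ballots; real Helios additionally uses threshold keys, zero-knowledge proofs, and proper parameters, none of which are shown here.

```python
# Toy exponential ElGamal: the product of the ciphertexts decrypts to the
# SUM of the votes, so the tally is computed without opening any ballot.
import random

p = 2**61 - 1          # small prime modulus -- insecure, illustration only
g = 3                  # fixed base; adequate for this arithmetic demo
x = random.randrange(2, p - 2)      # private key
y = pow(g, x, p)                    # public key

def encrypt(vote):                  # vote is 0 or 1
    r = random.randrange(2, p - 2)
    return pow(g, r, p), (pow(g, vote, p) * pow(y, r, p)) % p

def tally(ciphertexts):
    c1 = c2 = 1
    for a, b in ciphertexts:        # multiply component-wise = add the votes
        c1, c2 = (c1 * a) % p, (c2 * b) % p
    gm = (c2 * pow(c1, p - 1 - x, p)) % p   # strip the randomness: g^(sum of votes)
    total = 0                                # brute-force the small discrete log
    while pow(g, total, p) != gm:
        total += 1
    return total

ballots = [encrypt(v) for v in [1, 0, 1, 1, 0]]
print("yes votes:", tally(ballots))   # -> 3
```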

Tuesday, March 3, 2009

Blog: Google Launches Google Code Labs

Google Launches Google Code Labs
eWeek (03/03/09) Taft, Darryl

Google has created a new Web site that will give outside developers an opportunity to contribute to the development of the company's products. Google Code Labs already offers more than 60 application programming interfaces (APIs) and tools that are in their early stages of development, says Google's Tom Stocky. Google will look to graduate APIs and tools from Google Code Labs, and will offer deprecation policies and other critical support services. The first set of graduates includes App Engine, Google Web Toolkit, AJAX Search API, Maps API, Earth API, Calendar Data API, and YouTube APIs. For example, each version of the Visualization API terms, Contacts Data API terms, and Picasa Web Albums Data API terms will be supported for at least three years from when they are deprecated or a newer version is introduced. Google also will require a dedicated, ongoing engineering team and comprehensive test suite to graduate an API from Code Labs. Some graduated products may have experimental features that allow them to be changed or removed at any time.

View Full Article

Monday, March 2, 2009

Blog: Koobface Variant Spreading Through Social Networking Sites

SANS NewsBites Vol. 11 Num. 17 (3/3/2009)

Koobface Variant Spreading Through Social Networking Sites (March 2, 2009)

A variant of the Koobface worm has been spreading through social networking communities such as Facebook and MySpace. The malware spreads by sending messages that appear to come from friends, asking them to click on a link to watch a video. When the users reach the malicious website, they receive a message that they need to install an Adobe Flash plug-in to view the clip properly. If they agree to install the plug-in, a Trojan horse program is installed on the computer instead, giving attackers control over the machine. This Koobface variant also sends out invitations to watch the bogus clip to contacts through the social networking account. In addition, two rogue Facebook applications have been attempting to steal user data.

http://voices.washingtonpost.com/securityfix/2009/03/koobface_worm_resurfaces_on_fa.html

[Editor's Note (Skoudis): Get used to this. I think we'll see a steady stream of these kinds of stories with malware propagating via social networking contacts throughout the next few years. And, given the increasingly flexible APIs the social network sites are implementing, bad guys will be able to mine this information for attacks far more effectively.]

Blog: A New World Record in Go Established by PRACE Prototype and French Software

A New World Record in Go Established by PRACE Prototype and French Software
Partnership for Advanced Computing in Europe (03/02/09)

The Dutch national supercomputer Huygens defeated two human Go professionals in an official match at the recent Taiwan Open 2009 tournament. The Huygens supercomputer was running the MoGo TITAN application, which was developed by the INRIA research organization in France and Maastricht University. Go has replaced chess as a test bed for artificial intelligence (AI) research because it is one of the last board games in which humans are still better players than computers. However, since 2006, when a new algorithm called Monte-Carlo Tree Search was developed, the level of Go programs has rapidly improved. The Huygens supercomputer running MoGo TITAN achieved its first victory in August 2008 at the 24th Annual Go Congress, where it defeated a professional Go player in an official match. At the Taiwan Open in February, MoGo TITAN set a world record by winning two matches against professional players. "This new milestone in AI research once again clearly demonstrates the great potential of Huygens in many nontraditional areas of usage of supercomputing," says Anwar Osseyran, managing director of SARA Computing and Networking Services in Amsterdam, where the Huygens supercomputer is located.

View Full Article
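
Monte-Carlo Tree Search, mentioned above, is simple enough to sketch generically. The Python below is a bare-bones UCT loop over an abstract two-player game interface (clone, legal_moves, play, winner, and to_move are assumed methods and attributes invented for the sketch); MoGo itself adds many Go-specific refinements and massive parallelism that are not shown here.

```python
# Bare-bones Monte-Carlo Tree Search (UCT). The game object is assumed to
# offer: clone(), legal_moves(), play(move), winner() -> player or None for
# a draw, and a 'to_move' attribute -- an invented interface for illustration.
import math, random

class Node:
    def __init__(self, parent=None, move=None, player_just_moved=None):
        self.parent, self.move = parent, move
        self.player_just_moved = player_just_moved
        self.children, self.wins, self.visits = [], 0.0, 0
        self.untried = None                    # legal moves not yet expanded

    def uct_child(self, c=1.4):
        # UCB1: exploit high win rates, explore rarely visited children.
        return max(self.children, key=lambda ch: ch.wins / ch.visits
                   + c * math.sqrt(math.log(self.visits) / ch.visits))

def mcts(root_state, iterations=1000):
    root = Node()
    root.untried = root_state.legal_moves()
    for _ in range(iterations):
        node, state = root, root_state.clone()
        # 1. Selection: descend through fully expanded nodes by UCB1.
        while not node.untried and node.children:
            node = node.uct_child()
            state.play(node.move)
        # 2. Expansion: try one untried move from this node.
        if node.untried:
            move = node.untried.pop(random.randrange(len(node.untried)))
            mover = state.to_move
            state.play(move)
            child = Node(parent=node, move=move, player_just_moved=mover)
            child.untried = state.legal_moves()
            node.children.append(child)
            node = child
        # 3. Simulation: random playout to the end of the game.
        while state.legal_moves():
            state.play(random.choice(state.legal_moves()))
        # 4. Backpropagation: credit wins to the player who moved into each node.
        result = state.winner()
        while node is not None:
            node.visits += 1
            if result is not None and result == node.player_just_moved:
                node.wins += 1
            node = node.parent
    return max(root.children, key=lambda ch: ch.visits).move
```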
