Thursday, November 27, 2008

Blog: Srizbi Bots Seek Alternate Command-and-Control Servers; believed to generate half of all worldwide spam

SANS NewsBites Vol. 10 Num. 94 (fwd)

Tue, 2 Dec 2008

--Srizbi Bots Seek Alternate Command-and-Control Servers (November 26 & 27, 2008)

The Srizbi botnet, which was disabled by the shutdown of web hosting company McColo several weeks ago, appeared to be back online early last week. Srizbi includes an algorithm that generates new domain names the malware can contact for instructions should the initial connection be severed. The botnet suffered another setback when the Estonian Internet service provider (ISP) that had hosted its command and control servers for a short period of time also cut off service to those servers. Srizbi is estimated to comprise more than 450,000 PCs, and it is believed that half of all spam generated worldwide comes through the Srizbi botnet. The reason Srizbi was kept at bay for several weeks was that researchers reverse engineered the Srizbi code, figured out which domains the bots would be searching for, and then registered those domains themselves so the bot masters could not regain control of the army of infected machines.
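
The article does not publish Srizbi's algorithm, but the general technique (a date-seeded domain generation algorithm that both the bots and anyone who has reverse engineered the code can compute) can be sketched as follows; the hashing scheme and domain shape here are invented purely for illustration:

```python
# Minimal sketch of a date-seeded domain generation algorithm (DGA) --
# the general technique described above, NOT Srizbi's actual algorithm.
# A bot and a defender who has reverse engineered the code can both
# derive the same candidate domains for a given date.
import hashlib
from datetime import date

def candidate_domains(day: date, count: int = 4, tld: str = ".com") -> list[str]:
    domains = []
    for i in range(count):
        seed = f"{day.isoformat()}-{i}".encode()
        digest = hashlib.md5(seed).hexdigest()
        # Map hex digits to letters so the label looks like a hostname.
        label = "".join(chr(ord("a") + int(c, 16) % 26) for c in digest[:12])
        domains.append(label + tld)
    return domains

if __name__ == "__main__":
    # Defenders who know the algorithm can register tomorrow's candidate
    # domains before the bot master does, which is how Srizbi was kept at bay.
    print(candidate_domains(date(2008, 11, 27)))
```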

http://www.computerworld.com/action/article.do?command=viewArticleBasic&articleId=9121758&source=rss_topic17

http://www.theregister.co.uk/2008/11/26/srizbi_returns_from_dead/

[Editor's Note (Pescatore): The bot clients' strategies for finding command and control centers have become increasingly devious. New techniques use mechanisms that are very similar to old-style spycraft, the cyber equivalent of spy numbers stations and chalk Xs on mailboxes.

The needed security breakthrough here is being able to tell automated actions from user-driven actions at the network level, rather than relying on blocking communications to command and control centers. ]

Tuesday, November 18, 2008

Blog: An Algorithm With No Secrets; need for a new security hash function

An Algorithm With No Secrets
Technology Review (11/18/08) Naone, Erica

The National Institute of Standards and Technology (NIST) is organizing a competition to find an algorithm to replace the Secure Hash Algorithm 2 (SHA-2), which is becoming outdated. NIST plans to release a short list of the best entries by the end of November, the beginning of a four-year-long process to find the overall winner. In 2005, Tsinghua University Center for Advanced Study professor Xiaoyun Wang found weaknesses in several related hashing algorithms, and since then Wang and others have found faults in several other hashing schemes, causing officials to worry that SHA-2 also may eventually be found to be vulnerable. A hash algorithm creates a digital fingerprint for a message that keeps it secure during transit, but the algorithm is only considered secure if there is no practical way of running it backward to recover the original message from the fingerprint. There also cannot be a way of producing two messages with the exact same fingerprint. The weaknesses discovered by Wang and others relate to this problem, which cryptographers call a collision. It is impossible to completely avoid collisions, but the best algorithms make collisions extremely hard to produce. "Hash functions are the most widely used and the most poorly understood cryptographic primitives," says BT Counterpane's Bruce Schneier. "It's possible that everything gets broken here, simply because we don't really understand how hash functions work." NIST already has received 64 entries and is counting on cryptographers to narrow the list.
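
As a concrete illustration of the fingerprint idea, the snippet below hashes two nearly identical messages with SHA-256 (a member of the SHA-2 family under discussion) and shows that the digests differ completely; a collision would require two different messages with identical digests:

```python
# Illustration of the "digital fingerprint" idea using SHA-256, a member
# of the SHA-2 family discussed above.
import hashlib

m1 = b"Pay Alice $10"
m2 = b"Pay Alice $10."   # one-character change

d1 = hashlib.sha256(m1).hexdigest()
d2 = hashlib.sha256(m2).hexdigest()

print(d1)
print(d2)
# The two fingerprints bear no resemblance to each other. A "collision"
# would be two *different* messages whose digests are identical; a secure
# hash makes such pairs computationally infeasible to find.
assert d1 != d2
```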

View Full Article

Monday, November 17, 2008

Blog: UT Trainees Tackle Health Information Technology Issues; health IT can cause a loss in efficiency and an increase in medical errors

UT Trainees Tackle Health Information Technology Issues
University of Texas Health Science Center at Houston (11/17/08) Cahill, Rob

University of Texas School of Health Information Sciences at Houston (SHIS) has received a $1.3 million grant from the Agency for Healthcare Research and Quality to support research by six trainees in health information technology. SHIS principal investigator Todd R. Johnson says a huge effort is underway to use health information technology to improve health care, but notes that current technology is not designed to efficiently support the needs of clinicians. As a result, Johnson says there have been numerous cases where the introduction of health IT causes a loss in efficiency and an increase in medical errors. A report by the Institute of Medicine estimates that medical errors cost the United States approximately $38 billion a year and that as many as 98,000 people die in hospitals each year as a result of a medical error. Training grant co-director Eric Thomas says that most medical errors are partially because of information overload. The trainees are working on projects that will increase patient safety. One trainee is focusing on how to design information communications systems so physicians do not miss abnormal test result notifications, while another trainee is looking to improve emergency room decision making. Another trainee is using radio identification technology to track the movements of both emergency room personnel and supplies, and another is using information from simulation studies to develop an IT solution to improve the effectiveness and timeliness of clinical decisions in emergency rooms.

View Full Article

Blog: 'Six Degrees of Kevin Bacon' Game Provides Clue to Efficiency of Complex Networks

'Six Degrees of Kevin Bacon' Game Provides Clue to Efficiency of Complex Networks
University of California, San Diego (11/17/08) Froelich, Warren; Zverina, Jan

The small-world paradigm, discovered by sociologist Stanley Milgram in the 1960s and popularized by the "Six Degrees of Kevin Bacon" game, has become a source of inspiration for researchers studying the Internet as a global complex network. A study published in Nature Physics reveals a previously unknown mathematical model called "hidden metric space," which could explain the small-world phenomenon and its relationship to both man-made and natural networks such as human language, as well as gene regulation or neural networks that connect neurons to organs and muscles within our bodies. The concept of an underlying hidden space also may be of interest to researchers working to remove mounting bottlenecks within the Internet that threaten the smooth passage of digital information. "Internet experts are worried that the existing Internet routing architecture may not sustain even another decade," says Dmitri Krioukov, a researcher at the Cooperative Association for Internet Data Analysis (CAIDA), based at the University of California, San Diego (UCSD). CAIDA director and UCSD professor Kimberly Claffy says the discovery of a metric space hidden beneath the Internet could point toward architectural innovations that would remove this bottleneck, and Krioukov says the reconstruction of hidden metric spaces underlying a variety of real complex networks could have other practical applications. For example, hidden spaces in social or communications networks could lead to new, efficient strategies for searching for specific content or individuals.
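
One way to see why an underlying metric space matters for routing is greedy forwarding: if every node knows its own and its neighbors' hidden coordinates, it can hand a message to whichever neighbor is closest to the destination, without a global map. The toy sketch below uses made-up coordinates and links purely to illustrate that idea; it is not the model from the Nature Physics paper:

```python
# Toy sketch of greedy forwarding over a hidden metric space: each node
# passes the message to whichever neighbor is closest to the destination
# in the underlying (hidden) coordinates. The coordinates and links here
# are invented for illustration only.
import math

coords = {  # hidden positions of nodes A..E
    "A": (0, 0), "B": (1, 1), "C": (2, 0), "D": (3, 1), "E": (4, 0),
}
links = {   # who is directly connected to whom
    "A": ["B", "C"], "B": ["A", "D"], "C": ["A", "D"],
    "D": ["B", "C", "E"], "E": ["D"],
}

def dist(u, v):
    (x1, y1), (x2, y2) = coords[u], coords[v]
    return math.hypot(x1 - x2, y1 - y2)

def greedy_route(src, dst):
    path, node = [src], src
    while node != dst:
        # Forward to the neighbor geometrically closest to the destination.
        node = min(links[node], key=lambda n: dist(n, dst))
        path.append(node)
    return path

print(greedy_route("A", "E"))   # ['A', 'C', 'D', 'E'] -- no routing table needed
```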

View Full Article

Blog: Burned Once, Intel Prepares New Chip Fortified by Constant Tests (formal methods)

Burned Once, Intel Prepares New Chip Fortified by Constant Tests
The New York Times (11/17/08) P. B3; Markoff, John

Despite rigorous stress testing on dozens of computers, Intel's John Barton is still nervous about the upcoming release of Intel's new Core i7 microprocessor. Even after months of testing, Barton knows that it is impossible to predict exactly how the chip will function once it is installed in thousands of computers running tens of thousands of programs. The new chip, which has 731 million transistors, was designed for use in desktop computers, but the company hopes that it will eventually be used in everything from powerful servers to laptops. The design and testing of an advanced microprocessor is one of the most complex endeavors humans have undertaken. Intel now spends $500 million annually to test its chips before selling them. However, it still is impossible to test more than a fraction of the total number of "states" that the new Core i7 chip can be programmed in. "Now we are hitting systemic complexity," says Synopsys CEO Aart de Geus. "Things that came from different angles that used to be independent have become interdependent." In an effort to produce error-free chips, Intel in the 1990s turned to a group of mathematical theoreticians in the computer science field who had developed advanced techniques for evaluating hardware and software, known as formal methods. In another effort to minimize chip errors, the Core i7 also contains software that can be changed after the microprocessors are shipped, giving Intel the ability to correct flaws after the product's release.
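
The article does not describe Intel's verification flow in detail, but the flavor of formal methods (proving a design correct for every reachable state rather than sampling states with tests) can be conveyed with a toy sketch; the two-bit adder and its specification below are invented for illustration and are vastly simpler than industrial model checking or theorem proving:

```python
# A toy flavor of formal verification: exhaustively prove that a gate-level
# 2-bit adder matches its arithmetic specification for *every* input state.
# Real formal methods scale this idea far beyond brute force.
from itertools import product

def full_adder(a, b, cin):
    s = a ^ b ^ cin
    cout = (a & b) | (cin & (a ^ b))
    return s, cout

def two_bit_adder(a1, a0, b1, b0):
    s0, c = full_adder(a0, b0, 0)
    s1, c = full_adder(a1, b1, c)
    return (c << 2) | (s1 << 1) | s0        # implementation under test

def spec(a1, a0, b1, b0):
    return (2 * a1 + a0) + (2 * b1 + b0)    # arithmetic specification

# Check all 16 input states -- a complete verification, not a sampled test.
for bits in product([0, 1], repeat=4):
    assert two_bit_adder(*bits) == spec(*bits)
print("2-bit adder verified for all input states")
```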

View Full Article

Blog: 11 "Laws of IT Physics"

November 17th, 2008

11 “Laws of IT Physics”

Posted by Michael Krigsman @ 6:02 am

Given high rates of failed IT projects, it’s helpful to examine first principles that underlie successful technology deployments. Even though devils live in the details, understanding basic dynamics behind successful project setup, execution, and management is important.

During testimony before Congress, in a hearing titled “The Dismal State of Federal Information Technology Planning,” Norm V. Brown, Executive Director of the Center for Program Transformation, an IT failures think tank, presented what he calls “Laws of IT Physics™.”

These “Laws” highlight hidden pitfalls that hurt many projects and help explain why some projects succeed while others fail. They recognize that successful IT project delivery is primarily about managing people, process, and deliverables. Yes, technology is important, but the people side comes first.

Thursday, November 13, 2008

Blog: Working at 99% CPU utilization

November 13th, 2008

Working at 99% CPU utilization

Posted by Paul Murphy @ 12:15 am

The key metric used in data processing management is system utilization - and that same metric has been adopted to push the idea that consolidation through virtualization makes sense.

The key factor driving the evolution of system utilization as a management metric was the high cost of equipment - first in the 1920s when data processing was getting started, and again in the 1960s when the transition from electro-mechanical to digital equipment got under way.

What made it possible, however, was something else: the fact that batch processing of after-the-fact data was the natural norm for the gear meant that user management expected printed reports to arrive on their desks on a predictable schedule, wholly divorced from the processes generating the data.

Thus what drove data processing management to use its gear 24 x 7 was money - but what made it possible to do this was user management’s acceptance of the time gaps implicit in an overall process consisting of distinct data collection, data processing, and information reporting phases.

Almost a hundred years later, data processing still operates in much the same way - and the utilization metric is still its most fundamental internal performance measure. And, since its costs are still high too, the financial incentive for high utilization continues to provide justificatory power.

Wednesday, November 12, 2008

Blog: Computer Science Outside the Box; research project requirements

Computer Science Outside the Box
Computing Community Consortium (11/12/08) Lazowska, Ed

Computing Community Consortium (CCC) chairman Ed Lazowska attended the "Computer Science Outside the Box" workshop hosted by several organizations, including the National Science Foundation and the CCC. One of the insights he took away from the event was that smart decisions can be aided by a model of computer science's evolution depicted as an ever-expanding sphere. "Even when working inside the sphere, we've got to be looking outward," Lazowska writes. "And at the edges of the sphere, we've got to be embracing others, because that's how we reinvent ourselves." Lazowska observes that most computer science work is driven both by usage concerns and a longing to evolve precepts of lasting value, and he contends that researchers may be ascribing too little worth to research without obvious utility and may also be too hesitant to spurn work that concentrates on applications where it may not be apparent that their own field will move forward. This assumption applies to both the interfaces between computer science and other disciplines as well as the intersections between sub-disciplines of computer science. "We've got to produce students who are comfortable at these interfaces," Lazowska says. A research project must be challenging enough to sustain the participant's interest, yet easy enough to be achievable, he says, and the project must possess a long-term vision and be undertaken in increments.

View Full Article

Tuesday, November 11, 2008

Blog: Why Veins Could Replace Fingerprints and Retinas as Most Secure Form of ID

Why Veins Could Replace Fingerprints and Retinas as Most Secure Form of ID
Times Online (UK) (11/11/08) Harvey, Mike

Finger vein authentication is starting to gain traction in Europe. Easydentic Group in France says it will use finger vein security for door access systems in the United Kingdom and other European markets. The advanced biometric system, which verifies identities based on the unique patterns of veins inside the finger, has been widely introduced by Japanese banks in thousands of cash machines over the last two years. Hitachi developed the technology, which captures the pattern of blood vessels by transmitting near-infrared light at different angles through the finger, and then turns it into a digital code to match it against preregistered profiles. Veins are difficult to forge and impossible to manipulate because they are inside the body, according to Hitachi. The company also says finger vein technology is more affordable than iris scanning or face/voice recognition and has a lower false rejection rate than fingerprinting. Finger vein authentication is primarily used in Japan for ATMs, door access systems, and computer log-in systems.
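
The article does not disclose how the digital codes are actually compared, but the matching step can be sketched in general terms: treat the enrolled profile and a live scan as binary codes and accept when they differ in few enough bits. The codes and threshold below are assumed for illustration and are not Hitachi's algorithm:

```python
# Hedged sketch of the "digital code matching" step: compare an enrolled vein
# code and a live scan with a normalized Hamming distance. The bit patterns
# and threshold are invented; the real encoding is not described in the article.

def hamming_ratio(code_a: str, code_b: str) -> float:
    """Fraction of differing bits between two equal-length binary codes."""
    assert len(code_a) == len(code_b)
    return sum(a != b for a, b in zip(code_a, code_b)) / len(code_a)

enrolled = "1011001110001011"     # pre-registered profile (toy 16-bit code)
live     = "1011001010001011"     # fresh scan; one bit differs

THRESHOLD = 0.10                  # accept if at most 10% of bits differ (assumed)
match = hamming_ratio(enrolled, live) <= THRESHOLD
print("access granted" if match else "access denied")
```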

View Full Article

Monday, November 10, 2008

Blog: Study Shows How Spammers Cash In

Study Shows How Spammers Cash In
BBC News (11/10/08)

Researchers at the University of California, Berkeley and the University of California, San Diego (UCSD) hijacked a working spam network to analyze its economic value. The analysis found that spammers can make money by getting just one response for every 12.5 million emails sent. However, the researchers say that spam networks may be susceptible to attacks that make it more costly to send junk email. The researchers, led by UCSD professor Stefan Savage, took over a piece of the Storm spam network and created several proxy bots that acted as conduits of information between the command and control system for Storm and the hijacked home PCs that send the junk mail. The researchers used the machines to conduct their own fake spam campaigns. Two types of fake spam were sent, one that mimicked the way Storm spreads using viruses and the other aimed at tempting people to visit a fake pharmacy site and buy an herbal remedy to boost their libido. The fake pharmacy site always returned an error message when potential buyers clicked a button to submit their credit card details. The researchers sent about 469 million spam messages, and after 26 days and almost 350 million email messages, only 28 "sales" were made. The response rate for the campaign was less than 0.00001 percent, and would have resulted in revenues of $2,731.88, just over $100 a day for the measurement period. The researchers say that spam's small profit margin indicates that spammers would be economically susceptible to any disruptions in their networks.
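
The reported economics are easy to sanity-check; the short calculation below simply reproduces the arithmetic from the figures quoted above (the message count for the pharmacy campaign is the article's approximate number):

```python
# Reproducing the spam economics reported above from the study's raw numbers.
emails_sent = 350_000_000     # pharmacy-campaign messages (approximate)
sales = 28
revenue = 2731.88             # USD, as reported
days = 26

print(f"emails per sale: {emails_sent / sales:,.0f}")        # ~12,500,000
print(f"response rate: {sales / emails_sent * 100:.7f}%")    # ~0.000008%, below 0.00001%
print(f"revenue per day: ${revenue / days:.2f}")             # ~$105, just over $100/day
print(f"revenue per sale: ${revenue / sales:.2f}")           # ~$97.57
```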

View Full Article

Saturday, November 8, 2008

Blog: Computers checking mathematical proofs?

November 8th, 2008

Computers checking mathematical proofs?

Posted by Roland Piquepaille @ 8:55 am

Computer-assisted mathematical proofs are not new. For example, computers were used to confirm the so-called ‘four color theorem.’ In a short release, ‘Proof by computer,’ the American Mathematical Society (AMS) reports it has published a special issue of one of its journals dedicated to computer-aided proofs. ‘The four Notices articles explore the current state of the art of formal proof and provide practical guidance for using computer proof assistants.’ And as one of the researchers involved said, ’such a collection of proofs would be akin to the sequencing of the mathematical genome.’ But read more…

Let’s first look at how mathematical theorems were proven in the past. “When mathematicians prove theorems in the traditional way, they present the argument in narrative form. They assume previous results, they gloss over details they think other experts will understand, they take shortcuts to make the presentation less tedious, they appeal to intuition, etc. The correctness of the arguments is determined by the scrutiny of other mathematicians, in informal discussions, in lectures, or in journals. It is sobering to realize that the means by which mathematical results are verified is essentially a social process and is thus fallible.”

This is why the concept of ‘formal proof’ is now being used. “To get around these problems, computer scientists and mathematicians began to develop the field of formal proof. A formal proof is one in which every logical inference has been checked all the way back to the fundamental axioms of mathematics. Mathematicians do not usually write formal proofs because such proofs are so long and cumbersome that it would be impossible to have them checked by human mathematicians. But now one can get ‘computer proof assistants’ to do the checking. In recent years, computer proof assistants have become powerful enough to handle difficult proofs.”
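
For a concrete taste of what "every logical inference checked by machine" means, here is a tiny example in Lean, a modern proof assistant used here purely as an illustration (the Notices articles survey proof assistants generally; this snippet is not drawn from them):

```lean
-- A tiny formal proof in Lean 4. The kernel verifies every step back to
-- the foundations; nothing is "left to the reader."
theorem add_comm_example (a b : Nat) : a + b = b + a :=
  Nat.add_comm a b

-- Even a trivial fact is checked rather than assumed:
example : 2 + 2 = 4 := rfl
```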

Here is a link to this special issue of the Notices of the American Mathematical Society (December 2008, Volume 55, Issue 11).

Thursday, November 6, 2008

Blog: Proof by Computer

Proof by Computer
EurekAlert (11/06/08)

New developments in the use of formal proof in mathematics are investigated by a series of articles by leading experts, published in the December 2008 edition of the Notices of the American Mathematical Society. The traditional method of proving theorems involves mathematicians presenting the argument in narrative form, and the correctness of the arguments is decided by the analysis of other mathematicians, in informal discussions, in lectures, or in journals. Problems of reliability inevitably crop up because the process of verifying mathematical results is basically social and therefore prone to error, and so the field of formal proof was developed to circumvent these shortcomings. A formal proof is one in which every single logical presumption has been checked all the way back to the basic mathematical precepts, and technology has advanced in recent years so that computer proof assistants that perform such checking are sufficiently powerful to accommodate challenging proofs. If the use of these assistants proliferates, the practice of mathematics could be dramatically revised. One vision is to have formal proofs of all central mathematical proofs, which Thomas Hales of the University of Pittsburgh compares to "the sequencing of the mathematical genome." The quartet of articles analyzes the current state of the art of formal proof and offers practical guidance for employing computer proof assistants.

View Full Article

Wednesday, November 5, 2008

Blog: Yahoo's Hadoop Software Transforming the Way Data Is Analyzed

Yahoo's Hadoop Software Transforming the Way Data Is Analyzed
SiliconValley.com (11/05/08) Ackerman, Elise

Yahoo!'s Hadoop open source data-mining program is capable of searching through the entire Library of Congress in less than 30 seconds. Universities also are using Hadoop, which is part of Yahoo!'s huge computing grid. "It makes it possible to actually take advantage of all the computers we have hooked up together," says Yahoo!'s Larry Heck. Hadoop improves the relevance of ads Yahoo! displays on the Internet by analyzing the company's endless flow of data, which is now more than 10 terabytes a day, in real time. As users navigate through Yahoo!, Hadoop determines which ads are likely to catch their attention. Yahoo! also will be using Hadoop on the sites owned by the 796 members of a newspaper consortium that is working with Yahoo! to sell more advertising at better prices. Hadoop was first used to build Yahoo!'s Web index. Since then, the software has been adjusted by engineers and researchers both inside and outside of the company for use in experiments with giant data sets. Amazon, Facebook, and Intel developers are using Hadoop for tasks ranging from log analysis to modeling earthquakes. "We are leveraging not only the contribution that we are giving to the software, but the contributions from the larger community as well, everybody wins from it," Heck says.
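
Hadoop itself is a Java framework, but the MapReduce pattern it distributes across a grid can be sketched on a single machine; the toy word count below illustrates only the map/shuffle/reduce idea, not Hadoop's actual API:

```python
# Minimal single-machine sketch of the MapReduce pattern that Hadoop runs
# across thousands of machines -- word count, the canonical example.
from collections import defaultdict

documents = [
    "hadoop makes it possible to analyze huge data sets",
    "yahoo uses hadoop to analyze more than ten terabytes a day",
]

# Map phase: emit (key, value) pairs independently for each input record.
mapped = [(word, 1) for doc in documents for word in doc.split()]

# Shuffle phase: group values by key.
groups = defaultdict(list)
for word, count in mapped:
    groups[word].append(count)

# Reduce phase: combine each key's values into a result.
word_counts = {word: sum(counts) for word, counts in groups.items()}
print(word_counts["hadoop"], word_counts["analyze"])   # 2 2
```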

View Full Article

Tuesday, November 4, 2008

Blog: Multicore: New Chips Mean New Challenges for Developers

Multicore: New Chips Mean New Challenges for Developers
IDG News Service (11/04/08) Krill, Paul

The development of multicore processors is forcing software developers to work on getting software to be processed across multiple cores to fully utilize the new performance capabilities available in the hardware. However, this task has proven to be quite challenging, with developers struggling with issues surrounding concurrency and potential performance bottlenecks. Already, 71 percent of organizations are developing multithreaded applications for multicore hardware, according to a recent IDC survey. IDC analyst Melinda Ballou says developers need to approach multicore with a level of commitment to better practices throughout an organization and from a project perspective. Multicore processors are becoming increasingly common as single-core chips reach their limits and as power-consumption issues become more important. As hardware continues to change, the pressure will be on software developers to adapt and capitalize on the new capabilities. Developers must learn new techniques and use new tools to maximize performance. Intel's James Reinders acknowledges that developing multicore applications requires a much more complicated thought process about software design than most developers understand. "By and large, the majority of programmers don't have experience with it and are in need of tools and training and so forth to help take advantage of it," Reinders says. Intel is offering its Threading Building Blocks template library to help C++ programmers with parallel programming. The Intel Thread Checker helps find nondeterministic programming errors, and the Intel Thread Profiler helps visualize a program to check what each core is doing.
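
The tools the article names (Threading Building Blocks, Thread Checker, Thread Profiler) are C++ products; as a language-neutral illustration of the underlying idea of splitting independent work across cores, here is a small sketch using Python's standard process pool, with a prime-counting task invented for the example:

```python
# Hedged sketch of spreading an embarrassingly parallel task across cores.
# This is not Intel's Threading Building Blocks API, only the general idea.
from concurrent.futures import ProcessPoolExecutor

def count_primes(bounds):
    lo, hi = bounds
    def is_prime(n):
        if n < 2:
            return False
        return all(n % d for d in range(2, int(n ** 0.5) + 1))
    return sum(1 for n in range(lo, hi) if is_prime(n))

if __name__ == "__main__":
    # Split the range into independent chunks, one per worker process;
    # each chunk runs on its own core with no shared state to coordinate.
    chunks = [(i, i + 250_000) for i in range(0, 1_000_000, 250_000)]
    with ProcessPoolExecutor() as pool:
        total = sum(pool.map(count_primes, chunks))
    print(total)   # 78498 primes below 1,000,000
```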

View Full Article

Monday, November 3, 2008

Blog: Microsoft Security Intelligence Report for First Half of 2008 (November 3, 2008)

SANS NewsBites Vol. 10 Num. 87 (fwd)

Tue, 4 Nov 2008

--Microsoft Security Intelligence Report for First Half of 2008 (November 3, 2008)

According to Microsoft's most recent semi-annual Security Intelligence Report, while machines running Windows Vista are less likely to be infected with malware than their Windows XP counterparts, ActiveX browser plug-ins still pose a threat to the newer operating system.

During the first six months of 2008, for each thousand times Microsoft's Malicious Software Removal Tool (MSRT) was executed, it scrubbed malware from three Vista SP1 machines, 10 Windows XP SP2 machines and eight Windows XP SP3 machines. Of the top 10 browser-based attacks against Vista during that same period, eight were ActiveX vulnerabilities. The report also found that 90 percent of disclosed vulnerabilities were in applications, while just six percent were in operating systems.

http://www.computerworld.com/action/article.do?command=viewArticleBasic&articleId=9118879&source=rss_topic17

http://news.cnet.com/8301-1009_3-10080428-83.html?part=rss&subj=news&tag=2547-1009_3-0-20

[Editor's Note (Pescatore): There are far more applications than there are operating systems, so that last bit is not very surprising. The most meaningful data in this report is the chart that shows what types of installed malware the MSRT found and removed. It shows that Trojans and "potentially unwanted software" are getting through desktop defenses pretty easily - the signature and patch-centric approach to protecting desktops isn't dealing with the new, targeted threats that aim at the user, not unpatched PCs.]

Blog: Profile: Luis von Ahn; human-assisted computation

Profile: Luis von Ahn
BusinessWeek (11/03/08) Scanlon, Jessie

Carnegie Mellon University computer science professor Luis von Ahn has developed digitization software that could put the New York Times' entire archive, which dates back to 1851, online by late 2009. The newspaper has been using typists to digitize its archive, and in 10 years they have been able to digitize 27 years of articles. Von Ahn's software will process 129 years in less than 24 months. Von Ahn's research focuses on what he calls "human computation." He develops Web-based programs that take advantage of human abilities, such as reading or knowing common-sense facts, and then aggregate that knowledge to solve large-scale, ongoing problems in computer science. Von Ahn's first breakthrough technology, the Captcha, developed with his thesis advisor Manuel Blum in 2000, is used by 60,000 Web sites to verify that the entity filling out a Web registration form is in fact a human. Von Ahn later released ReCaptcha, an updated version of the technology that replaces Captcha's random letters with words that computers cannot read, drawn from library archives and now from the New York Times' archive, so that each solved challenge helps complete digitization projects. Another example of human computation is von Ahn's ESP Game, in which two players are shown the same image and asked to type in descriptive labels. When the labels match, the players are awarded points and shown another image. The game helps generate tags for online images.
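
The ESP Game's core mechanic (trusting only labels that both players independently supply) can be sketched in a few lines; the labels and scoring below are invented for illustration, and the real game adds taboo words, timers, and large-scale aggregation:

```python
# Toy sketch of the ESP Game's core mechanic: two players label the same
# image, and only labels both players typed are kept as trusted tags.

def agreed_labels(player1: set[str], player2: set[str]) -> set[str]:
    """Labels both players independently entered for the same image."""
    return {w.lower() for w in player1} & {w.lower() for w in player2}

p1 = {"dog", "grass", "Frisbee"}
p2 = {"dog", "park", "frisbee"}

tags = agreed_labels(p1, p2)
print(tags)                                  # {'dog', 'frisbee'} become image tags
print(f"points awarded: {10 * len(tags)}")   # per-match score is assumed
```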

View Full Article
