Friday, December 28, 2007

Software: Java Is Becoming the New Cobol

Java Is Becoming the New Cobol
InfoWorld (12/28/07) Snyder, Bill
Java is becoming less popular with developers as many switch to Ruby on Rails, PHP, AJAX, and Microsoft's .Net to develop rich Internet applications. Many developers feel that Java slows them down. Peter Thoeny, CEO of Twiki.net, which produces a certified version of the open source TWiki wiki-platform software, says Java promised to solve incompatibility problems across platforms, but the different versions and downloads of Java are creating complications of their own. Ofer Ronen, CEO of Sendori, which routes domain traffic to online advertisers and ad networks, says languages such as Ruby offer pre-built structures, such as shopping carts, that would have to be built from scratch in Java. Zephyr CEO Samir Shah says Java's user-interface capabilities and memory footprint simply do not measure up, putting it at a serious disadvantage in mobile application development. Nevertheless, developers and analysts agree that Java is still going strong in internally developed enterprise applications. "On the back end, there is still a substantial amount of infrastructure available that makes Java a very strong contender," Shah says.
Click Here to View Full Article

Saturday, December 22, 2007

Security: Wi-Fi Routers Are Vulnerable to Viruses

Wi-Fi Routers Are Vulnerable to Viruses
New Scientist (12/22/07) Merali, Zeeya
Indiana University Bloomington researcher Steven Myers has been investigating how a virus could spread between wireless routers. "We forget that routers are mini-computers," Myers says. "They have memory, they are networked, and they are programmable." Yet routers are not usually scanned for viruses or protected by firewalls, and while Myers says there are no known viruses that target routers, they remain easy targets. Routers within about 100 meters of one another could pass viruses among themselves, creating a vast network for malware. Routers do not normally communicate with each other, but it would be easy for hackers to write a virus that makes them do so. Myers used records of the locations of Wi-Fi routers around Chicago, Manhattan, San Francisco, Boston, and parts of Indianapolis to simulate how a router attack might spread. In each simulated city, the virus was able to jump between routers lacking high-security encryption that sat within 45 meters of each other. It spread surprisingly fast, with most of the tens of thousands of routers becoming infected within 48 hours. The geography of each city affected the spread, with rivers and bays acting as "natural firewalls." Routers can be protected by changing the password from the default setting and enabling high-security WPA encryption. University of Cambridge computer scientist Ross Anderson says the study exposes a more significant problem: all electronics, including phones, routers, and even microwaves, are being built with software that could potentially become infected.
Click Here to View Full Article
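
The study's simulation code isn't public, but the epidemic dynamic it describes is easy to sketch. The toy Python model below is an illustration, not the researchers' model; every parameter (router count, density, share of weakly encrypted routers) is an assumption, with only the 45-meter hop radius taken from the article.

    import math
    import random

    random.seed(1)
    N, AREA, HOP = 500, 1000.0, 45.0   # routers; square side (m); infection radius (m)
    P_WEAK = 0.7                       # assumed share lacking high-security encryption

    # Each router: (x, y, weakly_encrypted)
    routers = [(random.uniform(0, AREA), random.uniform(0, AREA),
                random.random() < P_WEAK) for _ in range(N)]
    infected = {0}                     # one initially compromised router

    for hour in range(1, 49):          # the article's 48-hour window
        newly = set()
        for i in infected:
            xi, yi, _ = routers[i]
            for j, (xj, yj, weak) in enumerate(routers):
                if j not in infected and weak and math.hypot(xi - xj, yi - yj) <= HOP:
                    newly.add(j)
        if not newly:
            break
        infected |= newly
        print(f"hour {hour}: {len(infected)} of {N} routers infected")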

Tuesday, December 11, 2007

Security: DNS Attack Could Signal Phishing 2.0

DNS Attack Could Signal Phishing 2.0
Robert McMillan, IDG News Service

Researchers at Google and the Georgia Institute of Technology are studying a virtually undetectable form of attack that quietly controls where victims go on the Internet.

The study, set to be published in February, takes a close look at "open recursive" DNS servers, which are used to tell computers how to find each other on the Internet by translating domain names like google.com into numerical Internet Protocol addresses. Criminals are using these servers in combination with new attack techniques to develop a new generation of phishing attacks.

The Georgia Tech and Google researchers estimate that as many as 0.4 percent of open-recursive DNS servers, roughly 68,000 machines, are behaving maliciously, returning false answers to DNS queries. They estimate that another 2 percent provide questionable results. Collectively, these servers are beginning to form a "second secret authority" for DNS that is undermining the trustworthiness of the Internet, the researchers warn.

http://www.computerworld.com/action/article.do?command=viewArticleBasic&articleId=9052198
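
The detection the researchers automated can be approximated by asking a suspect open resolver and a trusted resolver for the same name and comparing answers (real measurement has to tolerate CDNs and load balancing, which legitimately vary results). A sketch assuming the third-party dnspython package; the suspect address below is a documentation placeholder, not a server from the study.

    import dns.resolver

    def answers(nameserver, name):
        """A-record addresses for `name` according to one specific resolver."""
        res = dns.resolver.Resolver(configure=False)
        res.nameservers = [nameserver]
        return {rr.address for rr in res.resolve(name, "A")}

    TRUSTED = "8.8.8.8"      # a resolver you already trust
    SUSPECT = "192.0.2.53"   # placeholder open recursive server

    for name in ["www.google.com", "www.paypal.com"]:
        good, seen = answers(TRUSTED, name), answers(SUSPECT, name)
        if not (seen & good):
            print(f"{name}: suspect answered {seen}, trusted answered {good}")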

Saturday, December 1, 2007

Software: Computing in a Parallel Universe

Computing in a Parallel Universe
American Scientist (12/07) Vol. 95, No. 6, P. 476; Hayes, Brian
Multicore chips that facilitate parallel processing will require a major rethinking of program design, writes Brian Hayes. Software for parallel processors is vulnerable to subtle bugs that cannot manifest in strictly sequential programs. Writing correct concurrent programs is possible, but a key challenge is that running the same program on the same inputs can produce different results depending on the precise timing of events. One approach to this problem is to have the operating system manage the allocation of tasks to processors and balance the workload, which is currently the chief strategy with time-sliced multiprocessing and dual-core chips. Another is to assign this responsibility to the compiler; like the first strategy, this leaves the problem to expert programmers. But making the most of parallel computing will require all programmers to grapple with creating programs that run efficiently and correctly on multicore systems. "We have a historic opportunity to clean out the closet of computer science, to throw away all those dusty old sorting algorithms and the design patterns that no longer fit," Hayes concludes. "We get to make a fresh start."
Click Here to View Full Article
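
Hayes's point that identical runs can yield different results is easy to demonstrate. In the sketch below, two threads increment a shared counter with an unsynchronized read-modify-write; updates are lost, and the total varies from run to run.

    import threading
    import time

    counter = 0

    def worker(n):
        global counter
        for _ in range(n):
            value = counter      # read
            time.sleep(0)        # yield the interpreter, inviting an interleaving
            counter = value + 1  # write back, possibly clobbering another thread

    threads = [threading.Thread(target=worker, args=(10_000,)) for _ in range(2)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    print(counter)  # expected 20000; typically prints less, and differently each run

    # The classic fix is to make the update atomic, e.g. guard it with a
    # threading.Lock so only one thread performs the read-modify-write at a time.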

Wednesday, November 28, 2007

Blog: Zero Days: How to protect yourself; includes a pretty good list of "should do's"

Zero Days: How to protect yourself; Larry Dignan: November 28th, 2007

The SANS Institute released its top 20 security risks for 2007, which documents the security arms race between cyber-criminals and the folks playing defense. But let's focus on the big scourge -- zero day attacks.
READ FULL STORY

Monday, November 26, 2007

Medical Software: Design of Patient Tracking Tools May Have Unintended Consequences

Design of Patient Tracking Tools May Have Unintended Consequences
University at Buffalo News (11/26/07) Goldbaum, Ellen
A new field study by researchers at the University at Buffalo, the University of Rochester, and the University of Florida, Jacksonville found that properly designing computational tools is critical for the successful use of such tools in patient-care applications, particularly in hospital emergency rooms. The study examined the use and efficiency of new electronic patient-status boards in the emergency departments of two busy, university-affiliated hospitals. Overall, the researchers found that computational tracking systems affect how health care providers communicate information and track activities regarding patient care, which can change how providers work. The results provide an important example of what can happen when new technologies are not developed by designers with a sufficient understanding of how the technology will be used, says UB professor Ann Bisantz. "Research in human factors, the study of the interactions between humans and technology, has shown that in complex workplaces where safety is critical, such mismatches between the way practitioners work and the technologies that are supposed to support them can have unintended consequences, including inefficiencies and workarounds, where the technology demands that people change their work method," Bisantz says. During observations, focus groups, and interviews with nurses, physicians, secretaries, IT specialists, and administrators, the researchers found that the computerized systems are unable to match the functionality of the manual, erasable whiteboards traditionally used in emergency departments. "If you don't understand the underlying structure of the work that is being done in a particular setting, then you cannot design the technology that will best support it," Bisantz says.
Click Here to View Full Article

Monday, November 19, 2007

Research: Simplicity Sought for Complex Computing [Wolfram]

Simplicity Sought for Complex Computing
Chicago Tribune (11/19/07) Van, Jon
Stephen Wolfram says people building complex computers and writing complicated software may achieve more by studying nature. Wolfram says his company is exploring the "computational universe" to find simpler solutions to complex problems that are currently handled by complex software. "Nature has a secret it uses to make this complicated stuff," Wolfram says. "Traditionally, we're not taking advantage of that secret. We create things that go around things nature is doing." Wolfram believes nature has created a molecule that could be used as a computer, if people ever manage to isolate and program it. University of Chicago Department of Computer Science Chairman Stuart Kurtz says many computer scientists are fascinated by finding simple systems capable of producing complex results. For example, a University of Southern California professor has proposed using recombinant DNA for computing. While DNA computers are largely theoretical, computer scientists take them quite seriously, Kurtz says. "People are used to the idea that making computers is hard," Wolfram says. "But we're saying you can make computers out of small numbers of components, with very simple rules."
Click Here to View Full Article
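
Wolfram's claim that very simple rules can produce complex, even computational, behavior has a concrete textbook example: the elementary cellular automaton Rule 110, whose entire "program" is an eight-entry lookup table, has been proved capable of universal computation. A minimal Python sketch:

    RULE = 110
    # Map each 3-cell neighborhood (left, center, right) to the rule's output bit.
    table = {tuple(int(b) for b in f"{i:03b}"): (RULE >> i) & 1 for i in range(8)}

    cells = [0] * 40 + [1] + [0] * 40          # start from a single live cell
    for _ in range(30):
        print("".join("#" if c else "." for c in cells))
        padded = [0] + cells + [0]
        cells = [table[(padded[i - 1], padded[i], padded[i + 1])]
                 for i in range(1, len(padded) - 1)]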

Thursday, November 1, 2007

Research: 'Suicide Nodes' Defend Networks From Within

'Suicide Nodes' Defend Networks From Within
New Scientist (11/01/07) Marks, Paul
University of Cambridge researchers have developed a computer defense system that mimics how bees sacrifice themselves for the greater good of the hive. The approach starts by giving all the devices on a network, or nodes, the ability to destroy themselves, taking down any nearby malevolent devices with them. This self-sacrifice provision provides a defense against malicious nodes attacking clean ones. "Bee stingers are a relatively strong defense mechanism for protecting a hive, but whenever the bee stings, it dies," says University of Cambridge security engineer Tyler Moore. "Our suicide mechanism is similar in that it enables simple devices to protect a network by removing malicious devices--but at the cost of its own participation." The technique, called "suicide revocation," allows a single node to quickly decide that a nearby node's behavior is malevolent and to shut down the bad node, at the cost of deactivating itself. The node also sends an encrypted message announcing that it and the malevolent node have been shut down. The purpose of the suicide system is to protect networks as they become increasingly distributed and less centralized. Similar systems allow nodes to "blackball" malicious nodes by taking a collective vote before ostracizing them, but that process is slow, and malicious nodes can outvote legitimate ones. "Nodes must remove themselves in addition to cheating ones to make punishment expensive," says Moore. "Otherwise, bad nodes could remove many good nodes by falsely accusing them of misbehavior."
Click Here to View Full Article
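
The core decision can be made concrete with a toy sketch (an illustration of the idea, not the Cambridge design, which uses proper signatures rather than a shared key): a node that observes misbehavior broadcasts one authenticated revocation naming both the accused node and itself, so every accusation costs the accuser its own participation.

    import hashlib
    import hmac
    import json

    NETWORK_KEY = b"placeholder-shared-key"   # assumed pre-shared key material
    active = {"node-1", "node-2", "node-3", "node-4"}

    def suicide_revocation(accuser, accused):
        """Accuser signs one message revoking the accused node and itself."""
        claim = json.dumps({"revoked": sorted([accuser, accused])}).encode()
        tag = hmac.new(NETWORK_KEY, claim, hashlib.sha256).hexdigest()
        return {"claim": claim, "tag": tag}

    def apply_revocation(msg):
        """Any node verifies the tag, then drops both named nodes."""
        expect = hmac.new(NETWORK_KEY, msg["claim"], hashlib.sha256).hexdigest()
        if hmac.compare_digest(expect, msg["tag"]):
            for node in json.loads(msg["claim"])["revoked"]:
                active.discard(node)

    apply_revocation(suicide_revocation("node-2", "node-3"))  # node-2 accuses node-3
    print(sorted(active))  # ['node-1', 'node-4']: accuser and accused are both gone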

Monday, October 29, 2007

Security: AT&T Invents Programming Language for Mass Surveillance; dynamic data mining

AT&T Invents Programming Language for Mass Surveillance
Wired News (10/29/07) Singel, Ryan

AT&T researchers have developed Hancock, a C-based programming language designed to mine the company's telephone and Internet records for surveillance data. A recently discovered AT&T research paper published in 2001 shows how the phone company uses Hancock-based software to process tens of millions of long-distance phone records to create "communities of interest"--calling circles that show who people are talking to. Hancock was developed in the late 1990s to generate marketing leads and as a security tool to see whether new customers called the same numbers as previously disconnected fraudsters, which the research paper called "guilt by association." Hancock-based programs work by analyzing data as it enters a data warehouse, a significant difference from traditional data-mining tools that look for patterns in static databases. A 2004 paper published in ACM Transactions on Programming Languages and Systems demonstrates how Hancock can sort through calling card records, long-distance calls, IP addresses and Internet traffic dumps, and even track the movement of a cell phone as it switches between signal towers.
Click Here to View Full Article
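
Hancock's distinguishing feature, per the article, is folding each record into running state as it streams past rather than querying a static database. The Python sketch below is an analogy to that style, not Hancock itself: it maintains a crude "community of interest" per number, updated one hypothetical call record at a time.

    from collections import Counter, defaultdict

    circles = defaultdict(Counter)   # caller -> how often each callee is contacted

    def observe(caller, callee):
        """Fold one call record into the running state as it arrives."""
        circles[caller][callee] += 1

    def community(caller, k=3):
        """The k numbers this caller contacts most: its calling circle."""
        return [callee for callee, _ in circles[caller].most_common(k)]

    # Hypothetical call-detail records streaming in:
    records = [("555-0101", "555-0202"), ("555-0101", "555-0202"),
               ("555-0101", "555-0303"), ("555-0404", "555-0101")]
    for caller, callee in records:
        observe(caller, callee)
    print(community("555-0101"))     # ['555-0202', '555-0303']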


Wednesday, October 24, 2007

Research: Simplest 'Universal Computer' Wins Student $25,000; 2-state, 3-symbol Turing machine

Simplest 'Universal Computer' Wins Student $25,000
New Scientist (10/24/07) Giles, Jim

University of Birmingham computer science student Alex Smith has won a $25,000 prize for solving the simplest "universal computer" problem, proving that a remarkably simple mathematical machine can serve as a universal computing machine. The proof concerns Turing machines, idealized mathematical computers, some of which are "universal computers" that, given enough time and memory, can solve almost any mathematical problem. In May 2007, mathematician Stephen Wolfram announced a contest to see if anyone could prove that a particularly simple Turing machine, one with just two states and three symbols, is also a universal computer. Smith, who is 20 years old and knows 20 different programming languages, including six he describes as "esoteric," completed the proof by showing that the machine is equivalent to another mathematical device already known to be a universal computer. Wolfram says proving that even so simple a machine is capable of universal computation suggests that equally simple molecular versions could some day be the foundation for new kinds of computing. "We are also at the end of a quest that has spanned more than half a century to find the very simplest universal Turing machine," says Wolfram.
Click Here to View Full Article
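
In concrete terms, a Turing machine is nothing but a transition table over (state, symbol) pairs driving a read/write head along a tape, which is what makes a two-state, three-symbol machine so strikingly small. The generic simulator below runs the classic two-state busy beaver as its demo machine; it is not Smith's 2,3 machine, whose rule table appears in Wolfram's announcement.

    from collections import defaultdict

    # (state, symbol read) -> (symbol to write, head move, next state); "H" halts.
    RULES = {
        ("A", 0): (1, +1, "B"), ("A", 1): (1, -1, "B"),
        ("B", 0): (1, -1, "A"), ("B", 1): (1, +1, "H"),
    }

    tape = defaultdict(int)          # unbounded tape, blank = 0
    state, head, steps = "A", 0, 0
    while state != "H":
        write, move, state = RULES[(state, tape[head])]
        tape[head] = write
        head += move
        steps += 1
    print(steps, sorted(tape.items()))   # halts after 6 steps with four 1s written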


Security: Password-Cracking Chip Causes Security Concerns

Password-Cracking Chip Causes Security Concerns
New Scientist (10/24/07) Brandt, Andrew

Russia's Elcomsoft has filed a U.S. patent application for a technique that cracks computer passwords using inexpensive, off-the-shelf graphics hardware. Using an inexpensive graphics card, Elcomsoft was able to increase its password-cracking speed by a factor of 25, says Elcomsoft's Vladimir Katalov. The most difficult passwords, such as those used to log onto a Windows Vista computer, would normally take months of continuous processing on a conventional central processing unit, but Katalov says they can be cracked in as little as three to five days using a graphics processing unit. Less complex passwords can be cracked in minutes rather than hours or days. The speedup comes from the GPU's massively parallel design, which can test many candidate passwords at once. Password cracking is an effective way to access information on a computer but is generally ineffective against online banking services, whose Web sites often require multiple passwords and lock accounts after several incorrect attempts. Cryptography Research's Benjamin Jun says the technique is an impressive achievement that required elegant, intelligent design, and while the ability to crack passwords using GPUs is concerning, it is not a cause for panic. Advances in cryptographic keys and the growing trend of encrypting entire hard drives are making sensitive data harder to reach. "Should I throw away my Web server and run for the hills?" asks Jun. "I don't think so."
Click Here to View Full Article
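
Stripped of the hardware, password cracking is a guessing loop: hash each candidate and compare against the stolen hash. The GPU's contribution is running millions of such hash computations in parallel. A serial Python sketch (SHA-1 over a tiny hypothetical wordlist; real Windows logons use their own hash algorithms):

    import hashlib

    def crack(target_hash, wordlist):
        """Return the candidate whose SHA-1 matches the target, if any."""
        for candidate in wordlist:
            if hashlib.sha1(candidate.encode()).hexdigest() == target_hash:
                return candidate
        return None

    target = hashlib.sha1(b"letmein").hexdigest()   # stand-in for a stolen hash
    print(crack(target, ["password", "123456", "letmein", "qwerty"]))  # letmein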


Tuesday, October 23, 2007

Security: Identity Theft: Costs More, Tech Less; median loss - $30K+

Identity Theft: Costs More, Tech Less
Network Computing (10/23/07) Claburn, Thomas

A study by Utica College's Center for Identity Management and Information Protection (CIMIP) revealed that the median actual dollar loss for victims of identity theft is $31,356, a much higher figure than suggested by past studies. However, earlier studies primarily concentrated on consumer losses, whereas Utica's study reviewed 517 cases investigated by the U.S. Secret Service, which tend to be major incidents, not minor scams. Indeed, the CIMIP study is the first to review the Secret Service's closed case files, and as such aims to provide empirical data. The report showed that companies as well as individuals are affected by identity theft. The study also found that the Internet is not always an essential tool for identity thieves: of the 517 cases reviewed, 102 involved Internet use and 106 involved non-technological means, such as mail rerouting. In other instances, criminals used mail theft to access sensitive information and then used Internet-related tools to create fake documents. Another unanticipated finding was that in the 274 cases with identifiable points of compromise, businesses were the starting point for half of the breaches. Moreover, one-third of the identity-theft cases reviewed implicated insiders. Finally, the study's results challenged the belief that most identity thieves are white males, as roughly 50 percent of the offenders were black and roughly 40 percent were white. CIMIP researches identity management, information sharing, and data protection with corporate, government, and academic partners, including the Carnegie Mellon University Software Engineering Institute, Indiana University's Center for Applied Cybersecurity Research, and Syracuse University's CASE Center.
Click Here to View Full Article


Sunday, October 21, 2007

Security: 'Half-Quantum' Cryptography Promises Total Security; quantum-encrypted key only

'Half-Quantum' Cryptography Promises Total Security
New Scientist (10/21/07) Marks, Paul

Many cryptographers have believed that the only way to achieve complete security when transmitting information is to use quantum cryptography to share the key used for encryption. However, researchers now say they can achieve the same level of security even if one party stays in the world of classical physics. In conventional quantum cryptography, a sender, dubbed Alice, generates a string of 0s and 1s and encodes each bit on a photon polarized in either the computational basis, in which 0 and 1 are represented by vertical and horizontal polarizations, or the diagonal basis, in which 1 and 0 are represented by 45-degree and negative-45-degree polarizations. When the photons arrive at their destination, the receiver, dubbed Bob, chooses either the computational or the diagonal basis to measure each one, telling Alice which he has chosen. If the chosen basis is wrong, Alice tells Bob to discard that bit; the bits measured in the correct basis form the secret key. If an eavesdropper intercepts any photons, she disturbs them, destroying Bob's ability to read some photons he would otherwise have read correctly. The increase in unreadable photons tells Bob the communication channel has been compromised. Researchers at the Israel Institute of Technology in Haifa and the University of Montreal have now demonstrated that only Alice needs to be quantum-equipped. Alice encodes the bits as usual, but Bob can only use the computational basis. Bob randomly measures some of the received photons and returns the rest to Alice untouched; the bits read in the computational basis form the key. The system remains secure because an eavesdropper cannot know which photons will be returned to Alice unmeasured.
Click Here to View Full Article
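
The bookkeeping of the conventional protocol described above is easy to simulate classically (no quantum behavior or eavesdropper here, just the sifting): Alice picks random bits and bases, Bob measures in random bases, and the positions where their bases agree become the key.

    import random

    n = 16
    alice_bits  = [random.randint(0, 1) for _ in range(n)]
    alice_bases = [random.choice("+x") for _ in range(n)]  # "+" computational, "x" diagonal
    bob_bases   = [random.choice("+x") for _ in range(n)]

    # Bob reads the bit correctly when bases match; otherwise he gets a coin flip.
    bob_bits = [bit if ab == bb else random.randint(0, 1)
                for bit, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

    key = [bit for bit, ab, bb in zip(bob_bits, alice_bases, bob_bases) if ab == bb]
    print("sifted key:", key)  # on average, half the positions survive sifting

In the semi-quantum variant, bob_bases would be all "+": Bob measures only in the computational basis and returns the photons he does not measure.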


Wednesday, October 17, 2007

Security: Rebinding Attacks Unbound; DNS rebinding vulnerability

Rebinding Attacks Unbound
Security Focus (10/17/07) Biancuzzi, Federico

Stanford University Ph.D. student Adam Barth participated in a study that determined that Web browsers are still vulnerable to DNS rebinding. He explains in an interview that rebinding attacks succeed because browsers and plug-ins use DNS host names to distinguish between different origins, but browsers do not actually communicate with hosts by name: they must first use DNS to resolve the host name to an IP address and then communicate with the host through that address. DNS rebinding can be used to bypass firewalls or to temporarily commandeer a client's IP address to send spam email or defraud pay-per-click advertisers. Barth says the solution used to fix the classic DNS rebinding vulnerability, DNS pinning, no longer effectively defends against it because today's browsers contain many different technologies that allow network access, such as Java and Flash. These technologies maintain separate pin databases but are allowed to communicate within the browser. Barth says an effective defense against firewall circumvention is configuring DNS resolvers not to bind external host names to internal IP addresses, while host name authorization can prevent DNS rebinding vulnerabilities in the longer term. "I'm hopeful the vendors will reach a consensus to fix these issues using host name authorization, but this requires the vendors to cooperate with each other," he notes. Barth says DNSSEC offers no protection against DNS rebinding attacks because it is designed to prevent pharming, not rebinding. Barth and fellow members of the Stanford Web Security Lab are presenting a paper on DNS rebinding at the 2007 ACM Conference on Computer and Communications Security, Oct. 29-Nov. 2, in Alexandria, Va. For more information about the conference, visit http://www.sigsac.org/ccs.html
Click Here to View Full Article
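
The resolver-side defense Barth describes can be shown in miniature: a filter in front of the resolver refuses to let external host names bind to internal addresses. A sketch using only the standard-library ipaddress module (the host name and addresses are illustrative):

    import ipaddress

    def filter_answers(hostname, addresses):
        """Drop A records that would bind an external name to an internal IP."""
        safe = []
        for addr in addresses:
            ip = ipaddress.ip_address(addr)
            if ip.is_private or ip.is_loopback or ip.is_link_local:
                print(f"dropping {hostname} -> {addr}: internal address")
            else:
                safe.append(addr)
        return safe

    # A rebinding attempt pointing attacker.example at an intranet host is filtered:
    print(filter_answers("attacker.example", ["203.0.113.5", "192.168.1.10"]))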


Monday, October 1, 2007

Security: Hacker Curriculum: How Hackers Learn Networking

Hacker Curriculum: How Hackers Learn Networking
IEEE Distributed Systems Online (10/07) Bratus, Sergey
The hacker community has devised effective methods for the analysis, reverse engineering, testing, and modification of software and hardware, and it behooves leaders in industry and academia to understand this culture and be cognizant of its values, unique strengths, and weaknesses, writes Dartmouth College's Sergey Bratus. He observes that many quirks of the hacker culture are rooted in frustration with certain industry and academic trends (pressure to follow standard solutions, a limited perspective of the API, a dearth of tools for studying the state of a system, etc.), which he believes contribute to the current abundance of software vulnerabilities. This in turn fuels the hacker culture's impetus to fully comprehend underlying standards and systems, which largely formalize hackers' learning and work ethic. Among the sources hackers tap to acquire skills are classic textbooks highly rated by fellow hackers, electronic magazines, online forums dedicated to specific technical areas, source code from released tools, talks and private communications at hacker conventions, and IRC communities. Hackers have a tendency to adopt a cross-layer approach that tracks data through multiple tiers of interfaces, in accordance with three guiding principles. Bratus lists these principles as inspecting the system state or network on all levels down to the bit level; injecting arbitrary data into the system or network; and identifying and second-guessing deployment peculiarities. The author concludes that in many respects, hacker culture "produces impressive results that enrich other computing cultures, and its influence and exchange of ideas with these other cultures are growing. So, understanding the hacker learning experience and approaches is becoming more important day by day."
Click Here to View Full Article

Sunday, July 1, 2007

Security: DHS and OMB Paper on Data Security Risk and Mitigation for Federal Agencies; 1st comment includes some risk assessment metrics advice

--DHS and OMB Paper on Data Security Risk and Mitigation for Federal Agencies (July 2007)

[in SANS NewsBites Vol. 9 Num. 57 of July 20, 2007]

The US Department of Homeland Security (DHS) and the Office of Management and Budget (OMB) have released a paper called "Common Risks Impeding the Adequate Protection of Government Information." The "paper identifies common risks or 'mistakes'" agencies make when protecting sensitive data. Each risk is accompanied by a list of best practices to avoid the pitfalls and a list of resources from which agencies can draw support and obtain concrete information.

http://www.fcw.com/article103240-07-17-07-Web&printLayout

http://csrc.nist.gov/pcig/document/Common-Risks-Impeding-Adequate-Protection-Govt-Info.pdf

[Editor's Note (Kreitner): This document contains solid guidance for managing the security of information, but its implementation and effectiveness will be unknown without tracking a few well-chosen enterprise performance metrics, particularly results-oriented metrics. I hope OMB and DHS will follow this up with an effort to devise some key metrics. Metrics that highlight the root causes of security incidents are a good place to start. Examples: percent of incidents that involved third parties; percent of intrusions for which security controls were known but not implemented that would have prevented the intrusion. If enterprise management knows what is causing its security incidents, it can apply its attention to eliminating those causes. Several years ago, a subcommittee convened the Corporate Information Security Working Group (CISWG), which developed a pretty good set of information security metrics that provide some suggestions. See
http://www.cisecurity.org/Documents/BPMetricsTeamReportFinal111704Rev11005.pdf

(Honan): While this may prove to be an excellent resource, I always worry when people title reports outlining recommendations with the word "Adequate". I prefer my security, like my steak dinners, to be more than "adequate".]
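
Kreitner's example metrics are straightforward to compute once incident records carry the right fields. A toy sketch over hypothetical records:

    incidents = [
        {"id": 1, "third_party": True,  "preventable_by_known_control": False},
        {"id": 2, "third_party": False, "preventable_by_known_control": True},
        {"id": 3, "third_party": True,  "preventable_by_known_control": True},
    ]

    def pct(field):
        """Share of incidents (percent) for which the given flag is set."""
        return 100.0 * sum(rec[field] for rec in incidents) / len(incidents)

    print(f"incidents involving third parties: {pct('third_party'):.0f}%")
    print(f"intrusions preventable by known controls: "
          f"{pct('preventable_by_known_control'):.0f}%")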

Wednesday, May 16, 2007

Security: No "Natural Proof" that certain computational problems used in cryptography are hard to solve

ACM Group Honors Research Team for Rare Finding in Computer Security
AScribe Newswire (05/16/07)

ACM's Special Interest Group on Algorithms and Computation Theory (SIGACT) announced that Alexander A. Razborov and Steven Rudich, two computer scientists whose work addresses the P vs. NP problem, will receive the 2007 Gödel Prize for outstanding papers in theoretical computer science at the ACM Symposium on Theory of Computing, which takes place June 11-13, 2007, in San Diego. P vs. NP is a fundamental question underlying computer and network security as well as many optimization techniques. For years, the questions about the limits of proof and computation raised by P vs. NP have confounded computer scientists, and they bear directly on the complex mathematical problems underlying security solutions for ATM cards, computer passwords, and electronic commerce. In their paper "Natural Proofs," originally presented at the ACM Symposium on Theory of Computing in 1994, Razborov and Rudich addressed what is widely considered the most important question in computing theory, one of the seven Millennium Prize Problems, each carrying a $1 million reward, posed by the Clay Mathematics Institute in Cambridge, Mass. The question asks: if the solution to a problem is easy to check, is the problem also easy to solve? Razborov and Rudich proved that there is no "natural proof" that certain computational problems often used in cryptography are hard to solve; although such problems are widely believed to be unbreakable, no proof of this natural form can establish that they are secure. Such cryptographic methods are critical to electronic commerce. Razborov is the leading researcher at the Russian Academy of Sciences' Steklov Mathematical Institute in Moscow, and Rudich is an associate professor of computer science at Carnegie Mellon University.
Click Here to View Full Article
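
For reference, the prize question has a compact formal statement (standard textbook definitions, not taken from the article): P is the class of decision problems solvable in polynomial time, and NP is the class whose yes-instances have certificates checkable in polynomial time.

    \[
      \mathrm{P} \;=\; \bigcup_{k \ge 1} \mathrm{TIME}\bigl(n^{k}\bigr),
      \qquad
      \mathrm{NP} \;=\; \bigcup_{k \ge 1} \mathrm{NTIME}\bigl(n^{k}\bigr),
      \qquad
      \text{the question: } \mathrm{P} \overset{?}{=} \mathrm{NP}.
    \]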
