Tuesday, November 30, 2010

Blog: IBM Chip Breakthrough May Lead to Exascale Supercomputers

IBM Chip Breakthrough May Lead to Exascale Supercomputers
Computerworld (11/30/10) Agam Shah

IBM's new CMOS Integrated Silicon Nanophotonics technology boosts the data transfer rate between computer chips using pulses of light, a development that could increase the performance of supercomputers by a thousand times or more. CMOS Integrated Silicon Nanophotonics combines electrical and optical components on one piece of silicon, and it can replace the copper wires used in most chips today. The integrated silicon converts electrical signals into pulses of light, making communication between chips faster, says IBM researcher Will Green. He says the photonics technology could boost supercomputing calculations to speeds approaching an exaflop, and IBM hopes to build an exaflop computer by 2020. "This is an interesting milestone for system builders [who are] looking at building ... exascale systems in 10 years," Green says. IBM also plans to use the optics technology to develop new types of transistors. "The nice thing about it is we have a platform which allows us to address many different places simultaneously," he says.

View Full Article

Tuesday, November 16, 2010

Blog: ECS Researcher Highlights Need for Transparency on the Web

ECS Researcher Highlights Need for Transparency on the Web
University of Southampton (United Kingdom) (11/16/10) Joyce Lewis

The complex flows of information on the Web make it difficult to determine where information originates, says University of Southampton professor Luc Moreau. "This is a challenge since we want to be able to establish the exact source of information, we want to decide whether information has been altered, and by whom, we want to corroborate and possibly reproduce such information, and ultimately we want to decide whether the information comes from a trustworthy source," Moreau says. The solution lies in provenance, which focuses on establishing that an object has not been forged or altered, and could apply to computer-generated data. He says enabling users to determine where data comes from and decide if it is trustworthy will lead to a new generation of Web services that are capable of producing trusted information. Moreau notes that systems would become transparent as a result of provenance. "Our aim, with the community of researchers, is to establish a standard method to ascertain the provenance of information on the Web," he says.
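
To make the idea concrete, here is a toy Python sketch of the kind of record a provenance-aware service might keep and trace; the field names, sources, and agents are invented for illustration and do not follow any particular standard.

    # A toy sketch (not any particular provenance standard) of the kind of
    # record that lets a consumer trace a Web artifact back to its sources.
    RECORDS = {
        "report.html": {"derived_from": ["dataset.csv"], "by": "aggregator.example.org"},
        "dataset.csv": {"derived_from": ["sensor-feed"], "by": "agency.example.gov"},
        "sensor-feed": {"derived_from": [], "by": "field-sensor-17"},
    }

    def trace(artifact, records=RECORDS):
        """Walk the provenance chain and report every agent that touched the data."""
        chain = []
        frontier = [artifact]
        while frontier:
            current = frontier.pop()
            record = records.get(current)
            if record is None:
                chain.append((current, "unknown origin"))  # a gap a reader should distrust
                continue
            chain.append((current, record["by"]))
            frontier.extend(record["derived_from"])
        return chain

    for step in trace("report.html"):
        print(step)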

View Full Article

Blog: 'Chaogates' Hold Promise for the Semiconductor Industry

'Chaogates' Hold Promise for the Semiconductor Industry
EurekAlert (11/16/10) Jason Socrates Bardi

Researchers have created alternative logic gates, dubbed chaogates, by selecting desired patterns offered by a chaotic system, and using a subset to map system inputs to desired outputs. The process offers a way to use the richness of nonlinear dynamics to design computing devices with the capacity to reconfigure into a range of logic gates. "Chaogates are the building block of new, chaos-based computer systems that exploit the enormous pattern formation properties of chaotic systems for computation," says Arizona State University's William Ditto. "Imagine a computer that can change its own internal behavior to create a billion custom chips a second based on what the user is doing that second--one that can reconfigure itself to be the fastest computer for that moment, for your purpose." Ditto says chaogates offer advantages for gaming, secure computer chips, and custom, morphable gaming chips. He notes that integrated circuits using chaogates can be manufactured using existing production systems, and they can incorporate standard logic, memory, and chaogates on the same device.
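
As a rough illustration of the underlying idea (not Ditto's hardware), the sketch below searches a chaotic map's parameter space for settings that make one encode-iterate-threshold circuit behave as AND, OR, or XOR; the map, grid, and constants are arbitrary choices for the demo.

    # Sketch: using a chaotic map as a reconfigurable logic gate. Inputs are
    # encoded as offsets from an operating point, the map is iterated, and the
    # result is thresholded; different settings realize different gates.
    import itertools

    def logistic(x, a=4.0):
        return a * x * (1.0 - x)

    def gate_output(bits, x_star, delta, threshold, iterations=1):
        x = x_star + delta * sum(bits)
        for _ in range(iterations):
            x = logistic(x)
        return 1 if x > threshold else 0

    TRUTH_TABLES = {
        "AND": {(0, 0): 0, (0, 1): 0, (1, 0): 0, (1, 1): 1},
        "OR":  {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 1},
        "XOR": {(0, 0): 0, (0, 1): 1, (1, 0): 0, (1, 1): 1},
    }

    def find_settings(gate, steps=50):
        # Brute-force search for (x_star, delta, threshold) reproducing the gate.
        grid = [i / steps for i in range(steps + 1)]
        for x_star, delta, threshold in itertools.product(grid, repeat=3):
            if all(gate_output(bits, x_star, delta, threshold) == want
                   for bits, want in TRUTH_TABLES[gate].items()):
                return x_star, delta, threshold
        return None

    for gate in TRUTH_TABLES:
        print(gate, find_settings(gate))

Reconfiguring the "gate" is just a matter of changing the three settings, which is the morphability the article describes.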

View Full Article

Monday, November 15, 2010

Blog: Rensselaer Team Shows How to Analyze Raw Government Data

Rensselaer Team Shows How to Analyze Raw Government Data
RPI News (11/15/10) Marshall Hoffman; Mark Marchand

Researchers at Rensselaer Polytechnic Institute's Tetherless World Research Constellation have developed a method for finding relationships buried within government data, using mash-up technology that can combine separate data sets to reveal new relationships. "We're working on designing simple yet robust Web technologies that allow someone with absolutely no expertise in Web Science or semantic programming to pull together data sets from Data.gov and elsewhere and weave them together in a meaningful way," says Rensselaer professor Deborah McGuinness. The approach also enables U.S. government agencies to share information more readily. The researchers developed a Web site that provides examples of what the approach can accomplish. The RPI researchers used Semantic Web technologies, enabling multiple data sets to be linked even when the underlying structure is different. "Data.gov mandates that all information is accessible from the same place, but the data is still in a hodgepodge of different formats using differing terms, and therefore challenging at best to analyze and take advantage of," says Rensselaer professor James Hendler. "We are developing techniques to help people mine, mix, and mash-up this treasure trove of data, letting them find meaningful information and interconnections."
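
A minimal sketch of the linking idea, assuming the rdflib library and made-up agency vocabularies and data (not the RPI team's actual code): because two agencies' facts share the same subject URI, a single query can join them even though the terms differ.

    # Two data sets about the same places, published with different vocabularies,
    # joined in one SPARQL query once they are expressed as linked data.
    from rdflib import Graph, Literal, Namespace

    EPA = Namespace("http://example.org/epa/")   # hypothetical vocabularies
    CDC = Namespace("http://example.org/cdc/")
    EX = Namespace("http://example.org/place/")

    g = Graph()
    # "Agency A" publishes air-quality readings per county.
    g.add((EX["county42"], EPA.pm25Level, Literal(38.0)))
    # "Agency B" publishes asthma rates for the same counties under different terms.
    g.add((EX["county42"], CDC.asthmaRate, Literal(11.2)))

    results = g.query(
        """
        SELECT ?place ?pm25 ?asthma WHERE {
            ?place <http://example.org/epa/pm25Level>  ?pm25 .
            ?place <http://example.org/cdc/asthmaRate> ?asthma .
        }
        """
    )
    for place, pm25, asthma in results:
        print(place, pm25, asthma)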

View Full Article

Friday, November 12, 2010

Blog: Time to blow up best practices myths

Time to blow up best practices myths

By Dennis Howlett | November 12, 2010, 11:25am PST

I’m not sure if this is a trend but I am noticing the term ‘best practices’ turning up in many a presentation. It kinda goes like this: ‘We know your implementation is failing but if you follow these best practices, then hey presto, good karma will be magically restored to all.’ The polite way of characterizing this is ‘piffle.’ The less polite way is per this Tweet:

@SAP_Jarret The term ‘best practice’ is - Grade A BS. A best practice might exist for 1,2 co’s rarely if ever for everyone

This Tweet prompted SAP consultant Nathan Genez to get up on his hind legs on the SAP Community Network and lay into the expression. He starts:

Best Practices are merely a guide or process that is believed to be more effective than alternative processes. Note that they are not *the* solution even though most everyone in the industry equates one with the other. When interpreted from this more moderate (realistic?) viewpoint, they serve as a good reference point for SAP customers. Considering the large number of SAP projects that fail to live up to pre-implementation expectations and deliver sub-optimal solutions, it would seem that the industry would be falling over itself to continually refine these best practices. Part of that process would be to correctly interpret the phrase from the get-go, but the industry doesn’t seem to care about that.

… But then in come the consultants. As Nathan describes:

I dislike the phrase because I routinely see consultants using it as a shield. By that, I mean that they use the phrase “it’s best practice” as a way to justify what is, in fact, just their opinion. This seems to come mostly from people who can’t justify their answer on their own. They can’t explain the rationale behind why their solution is better / easier / quicker / more stable / etc. Either they don’t fully understand the functionality or process in question, or they aren’t aware of all of the alternative solutions that are available, and therefore can’t justify their answer based on merit. They take the easy way out… they recommend a course of action based on the little that they know and then append “it’s best practice” to it as if this will legitimize the inaccuracies of their answer. Then they sweat bullets as they pray that the other party won’t press them on the issue.

Nathan’s argument takes us to a level I believe is sorely underestimated. When you look at the footprint that an ERP covers, it may, and I say may, reach 30-45% of required functionality. It should therefore be obvious that what masquerades as a claimed best practice needs careful examination. Too often, customers are blinded by Jar-Gon and then wonder what went wrong.

READ FULL STORY

Blog: Rats to Robots--Brain's Grid Cells Tell Us How We Navigate

Rats to Robots--Brain's Grid Cells Tell Us How We Navigate
Queensland University of Technology (11/12/10) Niki Widdowson

Queensland University of Technology (QUT) robotics researchers have formulated a theory on how the brain combines separate pieces of information to map out familiar environments and navigate them. The theory was prompted by practical improvements made to the navigation systems of robots that were struggling with some navigational tasks. QUT's Michael Milford says that Norwegian researchers recently discovered new cells in the brains of rats that are arranged in a grid and fire every time a rat is in one of a number of locations. Preliminary evidence also suggests that other animals, including humans, have certain cells that fire only when they are in a certain place. A person who is not paying attention when exiting an elevator would only begin to think he or she is on the second floor after seeing a Coke machine and then a photocopier. "We are postulating that the 'grid cells' help put these two pieces of information together to tell you you're on the second floor," Milford says. "In this study we are able to enhance our understanding of the brain by providing insights into how the brain might solve a common problem faced by both mobile robots and animals."
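
A toy sketch of the elevator example, with invented floor layouts (illustrative only, not the QUT team's model of grid cells): any single landmark is ambiguous, but a short ordered sequence of landmarks picks out the right floor.

    # Disambiguating a place by matching an observed landmark sequence against
    # stored layouts; a single landmark matches several floors, a sequence does not.
    FLOOR_LANDMARKS = {
        "floor 1": ["reception desk", "coke machine", "mail room"],
        "floor 2": ["coke machine", "photocopier", "kitchen"],
        "floor 3": ["photocopier", "server room", "kitchen"],
    }

    def score(observed, layout):
        # Count observed landmarks that appear, in order, in this floor's layout.
        positions = [layout.index(o) for o in observed if o in layout]
        in_order = all(a < b for a, b in zip(positions, positions[1:]))
        return len(positions) if in_order else 0

    def locate(observed):
        return max(FLOOR_LANDMARKS, key=lambda f: score(observed, FLOOR_LANDMARKS[f]))

    print(locate(["coke machine", "photocopier"]))   # -> "floor 2"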

View Full Article

Blog: Algorithm Pioneer Wins Kyoto Prize

Algorithm Pioneer Wins Kyoto Prize
EE Times (11/12/10) R. Colin Johnson

Eotvos Lorand University professor Laszlo Lovasz, who has solved several information technology (IT) problems using graph theory, has been awarded the Kyoto Prize. "Graph theory represents a different approach to optimization problems that uses geometry to compute results instead of differential equations," says Lovasz. "It turns out that very large networks in many different fields can be described by graphs, from cryptography to physical systems." His work has led to breakthroughs in RSA encryption technology and 4G channel capacity, extended Claude Shannon's point-to-point information theory, and resolved the weak perfect graph conjecture. Lovasz may be best known for the breakthrough principles called the "Lovasz local lemma" and the "LLL algorithm," which are widely used in cryptography, and for the multiple-input and multiple-output wireless communications scheme. The Kyoto Prize was founded by Kyocera chairman Kazuo Inamori in 1984 and comes with a $550,000 award.
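
For readers unfamiliar with it, the symmetric form of the Lovasz local lemma is usually stated along these lines (a textbook statement, not quoted from the article):

    If events $A_1, \dots, A_n$ each satisfy $\Pr[A_i] \le p$, and each $A_i$ is
    mutually independent of all but at most $d$ of the others, then
    \[
        e \, p \, (d + 1) \le 1 \quad \Longrightarrow \quad
        \Pr\Big[\bigcap_{i=1}^{n} \overline{A_i}\Big] > 0,
    \]
    i.e., with positive probability none of the "bad" events occurs.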

View Full Article

Monday, November 8, 2010

Blog: The Ethical Robot

The Ethical Robot
University of Connecticut (11/08/10) Christine Buckley; Bret Eckhardt

University of Connecticut professor Susan Anderson and University of Hartford computer scientist Michael Anderson have programmed a robot to behave ethically. Their work is part of a relatively new field of research known as machine ethics. "There are machines out there that are already doing things that have ethical import, such as automatic cash withdrawal machines, and many others in the development stages, such as cars that can drive themselves and eldercare robots," says Susan Anderson. Machine ethics combines artificial intelligence with ethical theory to determine how to program machines to behave ethically. The robot, called Nao, is programmed with an ethical principle that determines how often to remind people to take their medicine and when to notify a doctor when they do not comply. "We should think about the things that robots could do for us if they had ethics inside them," says Michael Anderson. Interacting with robots that have been programmed to behave ethically could inspire humans to behave more ethically, says Susan Anderson.
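
Purely as a hypothetical illustration (the duty names, weights, and scores below are invented, not the Andersons' actual principle), an ethical principle of this kind can be pictured as scoring candidate actions against weighted duties and picking the best one:

    # Toy ethical decision procedure: weigh how each action satisfies (+) or
    # violates (-) a set of duties, then choose the action with the best total.
    DUTIES = ("benefit", "non_maleficence", "respect_autonomy")
    WEIGHTS = {"benefit": 1.0, "non_maleficence": 2.0, "respect_autonomy": 1.5}

    # Situation: the patient skipped a dose several hours ago.
    ACTIONS = {
        "wait":          {"benefit": -1, "non_maleficence": -1, "respect_autonomy": +2},
        "remind_again":  {"benefit": +1, "non_maleficence":  0, "respect_autonomy": -1},
        "notify_doctor": {"benefit": +2, "non_maleficence": +1, "respect_autonomy": -2},
    }

    def choose(actions=ACTIONS, weights=WEIGHTS):
        def total(scores):
            return sum(weights[d] * scores[d] for d in DUTIES)
        return max(actions, key=lambda a: total(actions[a]))

    print(choose())   # with these made-up numbers, notifying the doctor wins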

View Full Article

Blog: Part Moth, Part Machine: Cyborgs Are on the Move

Part Moth, Part Machine: Cyborgs Are on the Move
New Scientist (11/08/10) Duncan Graham-Rowe

Researchers are developing methods to produce complex behavior from robots by tapping into the nervous system of living organisms and using algorithms that already exist in nature. For example, Tokyo Institute of Technology researchers have developed a cyborg moth that uses chemical plume tracking to locate the source of certain pheromones. The researchers immobilized a moth on a small wheeled robot and placed two recording electrodes into nerves running down its neck to monitor the commands the moth uses to steer. By rerouting these signals to motors in the robot, the researchers found that they could emulate the moth's plume-tracking behavior. Researchers also hope to recreate biological circuits in silicon, says Northwestern University's Ferdinando Mussa-Ivaldi. Scientists have made progress toward this goal with central pattern generators (CPGs), behavioral circuits in the brain and spinal cord that carry out routine tasks, such as walking or grasping an object, with little or no conscious input. Johns Hopkins University's Ralph Etienne-Cummings has used recordings of CPGs taken from a lamprey to generate walking motions in a pair of robotic legs.
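
A minimal sketch of the CPG abstraction, assuming nothing about the Johns Hopkins hardware: two coupled phase oscillators settle into anti-phase, the basic rhythm behind alternating legs.

    # A toy central pattern generator: two phase oscillators coupled so that
    # they lock half a cycle apart, producing an alternating left/right rhythm.
    import math

    def cpg_step(phases, dt=0.01, freq=1.0, coupling=2.0):
        # Each leg's phase advances at its natural frequency and is pulled
        # toward being half a cycle away from the other leg.
        left, right = phases
        d_left = 2 * math.pi * freq + coupling * math.sin(right - left - math.pi)
        d_right = 2 * math.pi * freq + coupling * math.sin(left - right - math.pi)
        return (left + d_left * dt, right + d_right * dt)

    phases = (0.0, 0.3)          # start well away from anti-phase
    for _ in range(2000):
        phases = cpg_step(phases)

    # A joint command could simply be a function of phase, e.g. sin(phase);
    # after settling, the two outputs are roughly equal and opposite.
    print(math.sin(phases[0]), math.sin(phases[1]))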

View Full Article - May Require Free Registration

Friday, November 5, 2010

Blog: Gartner Report: The Future of Information Security is Context Aware and Adaptive

Note the futility of following the static approach to security. Another important issue, probably covered in the report, is the false sense of security that comes from depending on a static security environment.

--Peter




Dear Peter,

Does your current intrusion prevention system (IPS) provide you with contextual awareness to ensure you can accurately identify real threats in real-time? In the Gartner report, The Future of Information Security is Context Aware and Adaptive, Neil MacDonald discusses how today's static security infrastructure no longer protects networks against growing dynamic threats and recommends that companies "begin the transformation to context-aware and adaptive security infrastructure now as you replace legacy static security infrastructure, such as firewalls, and Web security gateway and endpoint protection platforms."

The report also describes why network security providers must move to contextually aware and adaptive security.

Click the link below to download the report:

The Future of Information Security is Context Aware and Adaptive

The Sourcefire® next-generation IPS not only provides context to vulnerabilities and attacks, but it also adapts to the network in real time. The Sourcefire next-generation IPS provides IT professionals with 100% network visibility and the ability to reduce actionable events by up to 99.99%.

Requirements: Traditional IPS vs. Sourcefire Next-generation IPS

Contextual Network Awareness
Traditional IPS: Static; blindly enforces predefined policies and lacks contextual information to know which events are relevant and which are not.
Sourcefire Next-generation IPS: Monitors the network in real time and collects contextual information regarding the devices, applications, and services deployed to identify and prioritize potential vulnerabilities with speed and accuracy.

Adaptive Security
Traditional IPS: Closed architecture with a one-size-fits-all approach.
Sourcefire Next-generation IPS: Open architecture with customized, automated tuning and impact assessment based on real-time network changes.

Behavior Awareness
Traditional IPS: Unable to detect network behavior.
Sourcefire Next-generation IPS: Establishes "normal" traffic baselines and detects network anomalies in real time.

Application Awareness
Traditional IPS: Unable to detect applications.
Sourcefire Next-generation IPS: Capable of detecting hundreds of applications accessing network resources; provides a policy enforcement point that enables users to block specific applications.

Identity Awareness
Traditional IPS: Does not provide identity information, forcing users to manually search for it.
Sourcefire Next-generation IPS: Enables users to correlate threat, end-point, and network intelligence with user identity information.

Virtual Security
Traditional IPS: Static security based on physical attributes.
Sourcefire Next-generation IPS: Extends visibility and security into the far corners of the network through a virtual management console and IPS.


Click here to view the full report


©2010 Sourcefire, Inc.

The Gartner Report described above (ID Number:G00200385, 14 May 2010) represents data, research opinion or viewpoints published, as part of a syndicated subscription service available only to clients, by Gartner, Inc., a corporation organized under the laws of the State of Delaware, USA, and its subsidiaries ("Gartner"), and are not representations of fact. Each Gartner Report speaks as of its original publication date (and not as of the date of this research report) and the opinions expressed in the Gartner Reports are subject to change without notice. Gartner is not responsible, nor shall it have any liability, to any reader of this research report for errors, omissions or inadequacies in, or for any interpretations of, or for any calculations based upon data contained in, the Gartner Reports or any excerpts thereof.


Thursday, November 4, 2010

Blog: Metasploit and SCADA exploits: dawn of a new era?

Metasploit and SCADA exploits: dawn of a new era?

By Ryan Naraine | November 4, 2010, 11:23am PDT

On October 18, 2010, a significant event occurred concerning threats to SCADA (supervisory control and data acquisition) environments: a zero-day exploit for the RealFlex RealWin SCADA software product was added to the Metasploit repository. Let's think through the ramifications.

Wednesday, November 3, 2010

Blog: The bigger the system, the greater the chance of failure

The bigger the system, the greater the chance of failure

By Joe McKendrick

November 3, 2010, 7:00pm PDT

IT projects will see more success when delivered in smaller, bite-size chunks, especially since software has reached a level of complexity far beyond the grasp of any single individual.

READ FULL STORY

Blog: New Google Tool Makes Websites Twice as Fast

New Google Tool Makes Websites Twice as Fast
Technology Review (11/03/10) Erica Naone

Google has released mod_pagespeed, free software for Apache servers that could make many Web sites load twice as fast. Once installed, the software automatically determines ways to optimize a Web site's performance. "We think making the whole Web faster is critical to Google's success," says Google's Richard Rabbat. The tool could be especially useful to small Web site operators and anyone who uses a content management system to run a Web site, since they often lack the technical savvy and time needed to make their own speed improvements to Web server software. During testing, mod_pagespeed was able to make some Web sites load three times faster, depending on how much optimization had already been done. The program builds on Google's existing Page Speed program, which measures the speed at which Web sites load and offers suggestions on how to make them load faster.

View Full Article

Tuesday, November 2, 2010

Blog: A Software Application Recognizes Human Emotions From Conversation Analysis

A Software Application Recognizes Human Emotions From Conversation Analysis
Universidad Politecnica de Madrid (Spain) (11/02/10) Eduardo Martinez

Researchers at the Universidad Politecnica de Madrid have developed an application that can recognize human emotion from automated voice analysis. The program, based on a fuzzy logic tool called RFuzzy, analyzes a conversation and can determine whether the speaker is sad, happy, or nervous. If the emotion is unclear, the program can specify how close the speaker is to each emotion in terms of a percentage. RFuzzy also can reason with subjective concepts such as high, low, fast, and slow. The researchers say RFuzzy, which was written in Prolog, also could be used in conversation analysis and robot intelligence applications. For example, RFuzzy was used to program robots that participated in the RoboCupSoccer league. Because RFuzzy's logical mechanisms are flexible, its analysis can be interpreted based on logic rules that use measurable parameters, such as volume, position, distance from the ball, and speed.
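
The sketch below gives the flavor of that fuzzy reasoning with invented membership functions and rules; the real RFuzzy tool is written in Prolog, so this Python version is purely illustrative.

    # Triangular fuzzy membership over measurable speech features (volume, rate),
    # combined into graded degrees of emotion rather than a hard yes/no label.
    def triangular(x, left, peak, right):
        # Membership rises linearly to 1 at `peak` and falls back to 0 at `right`.
        if x <= left or x >= right:
            return 0.0
        if x <= peak:
            return (x - left) / (peak - left)
        return (right - x) / (right - peak)

    def emotion_degrees(volume, rate):
        loud = triangular(volume, 0.4, 0.8, 1.0)
        quiet = triangular(volume, 0.0, 0.2, 0.5)
        fast = triangular(rate, 0.5, 0.8, 1.0)
        slow = triangular(rate, 0.0, 0.2, 0.5)
        return {
            "nervous": min(loud, fast),   # fuzzy AND taken as the minimum
            "happy":   min(loud, slow),
            "sad":     min(quiet, slow),
        }

    # Mostly "nervous", expressed as a percentage-like degree rather than a label.
    print(emotion_degrees(volume=0.7, rate=0.9))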

View Full Article

Monday, November 1, 2010

Blog: New Help on Testing for Common Cause of Software Bugs

New Help on Testing for Common Cause of Software Bugs
Government Computer News (11/01/10) William Jackson

As part of the Automated Combinatorial Testing for Software (ACTS) program, the U.S. National Institute of Standards and Technology (NIST) has developed algorithms for automated testing of the multiple variables in software that can cause security faults. Research has shown that at least 89 percent of security faults are caused by combinations of no more than four variables, and nearly all are caused by no more than six variables, according to NIST. "This finding has important implications for testing because it suggests that testing combinations of parameters can provide highly effective fault detection," NIST says. The ACTS program is a collaborative effort by NIST, the U.S. Air Force, the University of Texas at Arlington, George Mason University, Utah State University, the University of Maryland, and North Carolina State University to produce methods and tools to generate tests for any number of variable combinations.
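
As a rough illustration of combinatorial testing (a greedy toy, not NIST's ACTS tooling), the sketch below builds a small set of test cases that covers every pairwise combination of parameter values:

    # Greedy pairwise (2-way) test generation: repeatedly pick the full
    # configuration that covers the most not-yet-covered value pairs.
    from itertools import combinations, product

    def pairwise_tests(parameters):
        names = list(parameters)
        uncovered = {
            (p1, v1, p2, v2)
            for p1, p2 in combinations(names, 2)
            for v1 in parameters[p1]
            for v2 in parameters[p2]
        }
        tests = []
        while uncovered:
            best, best_covered = None, set()
            for values in product(*(parameters[n] for n in names)):
                test = dict(zip(names, values))
                covered = {
                    (p1, test[p1], p2, test[p2])
                    for p1, p2 in combinations(names, 2)
                } & uncovered
                if len(covered) > len(best_covered):
                    best, best_covered = test, covered
            tests.append(best)
            uncovered -= best_covered
        return tests

    configs = {"os": ["linux", "windows"],
               "browser": ["firefox", "chrome", "ie"],
               "ipv6": [True, False]}
    for t in pairwise_tests(configs):
        print(t)

With these (made-up) parameters, a handful of test cases covers all value pairs, far fewer than the twelve exhaustive combinations.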

View Full Article
