Tuesday, November 30, 2010
IBM Chip Breakthrough May Lead to Exascale Supercomputers
Computerworld (11/30/10) Agam Shah
IBM's new CMOS Integrated Silicon Nanophotonics technology boosts the data transfer rate between computer chips using pulses of light, a development that could increase the performance of supercomputers by a thousand times or more. CMOS Integrated Silicon Nanophotonics combines electrical and optical components on one piece of silicon, and can replace the copper wires used in most chips today. The integrated silicon converts electrical signals into pulses of light, making communication between chips faster, says IBM researcher Will Green. He says the photonics technology could boost supercomputing calculations to speeds approaching an exaflop; IBM hopes to build such an exascale computer by 2020. "This is an interesting milestone for system builders [who are] looking at building ... exascale systems in 10 years," Green says. IBM also plans to use the optics technology to develop new types of transistors. "The nice thing about it is we have a platform which allows us to address many different places simultaneously," he says.
Tuesday, November 16, 2010
ECS Researcher Highlights Need for Transparency on the Web
University of Southampton (United Kingdom) (11/16/10) Joyce Lewis
The complex flows of information on the Web make it difficult to determine where information originates, says University of Southampton professor Luc Moreau. "This is a challenge since we want to be able to establish the exact source of information, we want to decide whether information has been altered, and by whom, we want to corroborate and possibly reproduce such information, and ultimately we want to decide whether the information comes from a trustworthy source," Moreau says. The solution lies in provenance, which focuses on establishing that an object has not been forged or altered, and which could apply equally to computer-generated data. He says enabling users to determine where data comes from and to decide whether it is trustworthy will lead to a new generation of Web services capable of producing trusted information. Moreau notes that provenance would make such systems transparent. "Our aim, with the community of researchers, is to establish a standard method to ascertain the provenance of information on the Web," he says.
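To make the idea concrete, here is a minimal Python sketch of a provenance chain; it is an illustration of the general concept, not Moreau's system or the W3C PROV model. Each record names its source and agent and carries a hash that covers its parent, so later tampering anywhere in the chain is detectable.

# Hypothetical provenance record: each derivation step records its
# source and the hash of the record it was derived from.
import hashlib, json

def make_record(data, source, agent, parent=None):
    record = {"data": data, "source": source, "agent": agent,
              "parent_hash": parent["hash"] if parent else None}
    payload = json.dumps({k: record[k] for k in ("data", "source", "agent", "parent_hash")},
                         sort_keys=True)
    record["hash"] = hashlib.sha256(payload.encode()).hexdigest()
    return record

def verify(record):
    # recompute the hash; any edit to the record (or its parent link) breaks it
    payload = json.dumps({k: record[k] for k in ("data", "source", "agent", "parent_hash")},
                         sort_keys=True)
    return record["hash"] == hashlib.sha256(payload.encode()).hexdigest()

original = make_record("raw census figures", "data.gov", "alice")
derived = make_record("aggregated by county", "internal ETL", "bob", parent=original)
assert verify(original) and verify(derived)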
'Chaogates' Hold Promise for the Semiconductor Industry
EurekAlert (11/16/10) Jason Socrates Bardi
Researchers have created alternative logic gates, dubbed chaogates, by selecting desired patterns offered by a chaotic system and using a subset of them to map system inputs to desired outputs. The process offers a way to use the richness of nonlinear dynamics to design computing devices that can reconfigure themselves into a range of logic gates. "Chaogates are the building block of new, chaos-based computer systems that exploit the enormous pattern formation properties of chaotic systems for computation," says Arizona State University's William Ditto. "Imagine a computer that can change its own internal behavior to create a billion custom chips a second based on what the user is doing that second--one that can reconfigure itself to be the fastest computer for that moment, for your purpose." Ditto says chaogates offer advantages for applications such as secure computer chips and custom, morphable gaming chips. He notes that integrated circuits using chaogates can be manufactured with existing production systems, and that they can incorporate standard logic, memory, and chaogates on the same device.
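A toy Python illustration of the chaogate idea, assuming a logistic map as the chaotic element; the parameter values below are chosen for illustration, not taken from Ditto's hardware.

def chaogate(a, b, x0, delta, threshold, r=4.0):
    x = x0 + delta * (a + b)   # encode the two logic inputs as a perturbation
    x = r * x * (1 - x)        # one iteration of the chaotic logistic map
    return int(x > threshold)  # threshold the response to get a logic output

# The same "hardware" morphs between gates by shifting one parameter:
for gate, x0 in (("AND", 0.0), ("NAND", 0.325)):
    table = [chaogate(a, b, x0, delta=0.25, threshold=0.75) for a in (0, 1) for b in (0, 1)]
    print(gate, table)   # AND -> [0, 0, 0, 1], NAND -> [1, 1, 1, 0]

Shifting x0 moves the chaotic response relative to the fixed threshold, which is what lets a single nonlinear element stand in for many different gates.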
Monday, November 15, 2010
Rensselaer Team Shows How to Analyze Raw Government Data
RPI News (11/15/10) Marshall Hoffman; Mark Marchand
Researchers at Rensselaer Polytechnic Institute's Tetherless World Research Constellation have developed a method for finding relationships buried within government data, using mash-up technology that combines disparate data sets to reveal new relationships. "We're working on designing simple yet robust Web technologies that allow someone with absolutely no expertise in Web Science or semantic programming to pull together data sets from Data.gov and elsewhere and weave them together in a meaningful way," says Rensselaer professor Deborah McGuinness. The approach also enables U.S. government agencies to share information more readily. The researchers developed a Web site that provides examples of what the approach can accomplish. The RPI researchers used Semantic Web technologies, enabling multiple data sets to be linked even when their underlying structures differ. "Data.gov mandates that all information is accessible from the same place, but the data is still in a hodgepodge of different formats using differing terms, and therefore challenging at best to analyze and take advantage of," says Rensselaer professor James Hendler. "We are developing techniques to help people mine, mix, and mash-up this treasure trove of data, letting them find meaningful information and interconnections."
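As a rough sketch of the linking technique, the following Python fragment uses the rdflib library; the two "agency" datasets and their vocabulary are invented for illustration. Each dataset uses its own terms, but expressing both as RDF about a shared identifier lets a single SPARQL query span them.

from rdflib import Graph

agency_a = """
@prefix ex: <http://example.org/> .
ex:county42 ex:population 9800 .
"""
agency_b = """
@prefix ex: <http://example.org/> .
ex:county42 ex:cleanupFunds 1200000 .
"""

g = Graph()
g.parse(data=agency_a, format="turtle")   # dataset from one agency
g.parse(data=agency_b, format="turtle")   # dataset from another

q = """
PREFIX ex: <http://example.org/>
SELECT ?county ?pop ?funds WHERE {
  ?county ex:population ?pop ;
          ex:cleanupFunds ?funds .
}
"""
for row in g.query(q):
    print(row.county, row.pop, row.funds)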
Friday, November 12, 2010
Time to blow up best practices myths
By Dennis Howlett | November 12, 2010, 11:25am PST
I’m not sure if this is a trend but I am noticing the term ‘best practices’ turning up in many a presentation. It kinda goes like this: ‘We know your implementation is failing but if you follow these best practices, then hey presto, good karma will be magically restored to all.’ The polite way of characterizing this is ‘piffle.’ The less polite way is per this Tweet:
@SAP_Jarret The term ‘best practice’ is - Grade A BS. A best practice might exist for 1,2 co’s rarely if ever for everyone
This Tweet prompted SAP consultant Nathan Genez to get up on his hind legs on the SAP Community Network and lay into the expression. He starts:
Best Practices are merely a guide or process that is believed to be more effective than alternative processes. Note that they are not *the* solution, even though most everyone in the industry equates one with the other. When interpreted from this more moderate (realistic?) viewpoint, they serve as a good reference point for SAP customers. Considering the large number of SAP projects that fail to live up to pre-implementation expectations and deliver sub-optimal solutions, it would seem that the industry would be falling over itself to continually refine these best practices. Part of that process would be to correctly interpret the phrase from the get-go, but the industry doesn't seem to care about that.
… But then in come the consultants. As Nathan describes:
I dislike the phrase because I routinely see consultants using it as a shield. By that, I mean that they use the phrase "it's best practice" as a way to justify what is, in fact, just their opinion. This seems to come mostly from people who can't justify their answer on their own. They can't explain the rationale behind why their solution is better / easier / quicker / more stable / etc. Either they don't fully understand the functionality or process in question, or they aren't aware of all of the alternative solutions that are available, and therefore can't justify their answer based on merit. They take the easy way out… they recommend a course of action based on the little that they know and then append "it's best practice" to it as if this will legitimize the inaccuracies of their answer. Then they sweat bullets as they pray that the other party won't press them on the issue.
Nathan's argument takes us to a level I believe is sorely underestimated. When you look at the footprint that an ERP covers, it may, and I say may, reach 30-45% of required functionality. It should therefore be obvious that what masquerades as a claimed best practice needs careful examination. Too often, customers are blinded by Jar-Gon and then wonder what went wrong.
Rats to Robots--Brain's Grid Cells Tell Us How We Navigate
Queensland University of Technology (11/12/10) Niki Widdowson
Queensland University of Technology (QUT) robotics researchers have formulated a theory on how the brain combines separate pieces of information to map out familiar environments and navigate them. The theory was prompted by practical improvements made to the navigation system of robots that were struggling with certain navigational tasks. QUT's Michael Milford says that Norwegian researchers recently discovered cells in the brains of rats that are arranged in a grid and fire every time a rat is in one of a number of locations. Preliminary evidence also suggests that other animals, including humans, have certain cells that fire only when they are in a certain place. For example, a person exiting an elevator without paying attention might conclude that he or she is on the second floor only after seeing a familiar Coke machine and then a photocopier. "We are postulating that the 'grid cells' help put these two pieces of information together to tell you you're on the second floor," Milford says. "In this study we are able to enhance our understanding of the brain by providing insights into how the brain might solve a common problem faced by both mobile robots and animals."
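A common idealized model of a grid cell (a textbook abstraction, not QUT's actual navigation code) treats its firing rate as a rectified sum of three plane waves 60 degrees apart, which yields the triangular firing lattice observed in rat entorhinal cortex. A minimal Python version:

import math

def grid_cell_rate(x, y, spacing=0.5, phase=(0.0, 0.0)):
    k = 4 * math.pi / (math.sqrt(3) * spacing)   # wave number for the lattice
    total = 0.0
    for angle in (0, math.pi / 3, 2 * math.pi / 3):
        kx, ky = k * math.cos(angle), k * math.sin(angle)
        total += math.cos(kx * (x - phase[0]) + ky * (y - phase[1]))
    return max(0.0, total / 3.0)                 # rectified; peaks on the grid

# The cell fires strongly whenever the animal crosses a lattice vertex:
print(grid_cell_rate(0.0, 0.0))   # on a vertex -> 1.0
print(grid_cell_rate(0.2, 0.1))   # off the grid -> 0.0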
Algorithm Pioneer Wins Kyoto Prize
EE Times (11/12/10) R. Colin Johnson
Eotvos Lorand University professor Laszlo Lovasz, who has solved several information technology (IT) problems using graph theory, has been awarded the Kyoto Prize. "Graph theory represents a different approach to optimization problems that uses geometry to compute results instead of differential equations," says Lovasz. "It turns out that very large networks in many different fields can be described by graphs, from cryptography to physical systems." His work has led to breakthroughs in RSA encryption technology, in 4G channel capacity (extending Claude Shannon's point-to-point information theory), and in resolving the weak perfect graph conjecture. Lovasz may be best known for the "Lovasz local lemma" and the "LLL algorithm," which are widely used in cryptography, and for contributions underpinning multiple-input, multiple-output wireless communications. The Kyoto Prize was founded by Kyocera chairman Kazuo Inamori in 1984 and comes with a $550,000 award.
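For a flavor of the local lemma, here is its symmetric form reduced to a checkable condition in Python (the numbers below are illustrative): if every "bad" event has probability at most p and depends on at most d others, and e*p*(d+1) <= 1, then with positive probability no bad event occurs.

import math

def lll_condition_holds(p, d):
    # symmetric Lovasz local lemma condition: e * p * (d + 1) <= 1
    return math.e * p * (d + 1) <= 1

# Events of probability 1/16, each depending on at most 4 others:
print(lll_condition_holds(1 / 16, 4))   # True:  e * (1/16) * 5 ~= 0.85
print(lll_condition_holds(1 / 4, 4))    # False: e * (1/4) * 5 ~= 3.40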
Monday, November 8, 2010
The Ethical Robot
University of Connecticut (11/08/10) Christine Buckley; Bret Eckhardt
University of Connecticut professor Susan Anderson and University of Hartford computer scientist Michael Anderson have programmed a robot to behave ethically. Their work is part of a relatively new field of research known as machine ethics. "There are machines out there that are already doing things that have ethical import, such as automatic cash withdrawal machines, and many others in the development stages, such as cars that can drive themselves and eldercare robots," says Susan Anderson. Machine ethics combines artificial intelligence with ethical theory to determine how to program machines to behave ethically. The robot, called Nao, is programmed with an ethical principle that determines how often to remind people to take their medicine and when to notify a doctor if they do not comply. "We should think about the things that robots could do for us if they had ethics inside them," says Michael Anderson. Interacting with robots that have been programmed to behave ethically could inspire humans to behave more ethically, says Susan Anderson.
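The Andersons' approach weighs competing prima facie duties against one another; the Python sketch below is a toy rendering of that idea, with invented duty names, weights, and scores rather than their learned principle.

def score(action, harm, benefit, autonomy, weights=(2.0, 1.0, 1.0)):
    # weighted sum of how well an action satisfies each duty
    w_harm, w_benefit, w_autonomy = weights
    return w_harm * harm[action] + w_benefit * benefit[action] + w_autonomy * autonomy[action]

actions = ["remind_now", "wait", "notify_doctor"]
# Duty satisfaction in [-2, 2] for a patient who skipped a critical dose:
harm     = {"remind_now": 1, "wait": -2, "notify_doctor": 2}   # prevent harm
benefit  = {"remind_now": 1, "wait": 0,  "notify_doctor": 1}   # do good
autonomy = {"remind_now": 0, "wait": 2,  "notify_doctor": -1}  # respect choice

best = max(actions, key=lambda a: score(a, harm, benefit, autonomy))
print(best)   # -> notify_doctor, once the risk of harm dominates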
Part Moth, Part Machine: Cyborgs Are on the Move
New Scientist (11/08/10) Duncan Graham-Rowe
Researchers are developing methods to produce complex behavior from robots by tapping into the nervous system of living organisms and using algorithms that already exist in nature. For example, Tokyo Institute of Technology researchers have developed a cyborg moth that uses chemical plume tracking to locate the source of certain pheromones. The researchers immobilized a moth on a small wheeled robot and placed two recording electrodes into nerves running down its neck to monitor commands the moth uses to steer. By rerouting these signals to motors in the robot, the researchers found that they could emulate the moth's plume-tracking behavior. Researchers also hope to recreate biological circuits in silicon, says Northwestern University's Ferdinando Mussa-Ivaldi. Scientists have made progress toward this goal with central pattern generators (CPGs), which are a type of behavioral circuit in the human brain and spine that carry out routine tasks with little or no conscious input, such as walking or grasping an object. Johns Hopkins University's Ralph Etienne-Cummings has used recordings of CPGs taken from a lamprey to generate walking motions in a pair of robotic legs.
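A central pattern generator can be sketched as a pair of coupled phase oscillators that settle into anti-phase, producing the alternating rhythm of two walking legs with no ongoing external input. The Python below is a minimal illustration of that principle, not Etienne-Cummings' lamprey-derived controller.

import math

def simulate_cpg(steps=1000, dt=0.01, freq=1.0, coupling=2.0):
    phase = [0.0, 0.3]   # slightly offset start
    for _ in range(steps):
        # each oscillator advances at its natural frequency and is pulled
        # toward a half-cycle (pi) offset from its partner
        d0 = 2 * math.pi * freq + coupling * math.sin(phase[1] - phase[0] - math.pi)
        d1 = 2 * math.pi * freq + coupling * math.sin(phase[0] - phase[1] - math.pi)
        phase[0] += d0 * dt
        phase[1] += d1 * dt
    return [math.sin(p) for p in phase]   # motor drive for each leg

left, right = simulate_cpg()
print(left, right)   # roughly equal and opposite: the legs lock in anti-phase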
Friday, November 5, 2010
Gartner Report: The Future of Information Security is Context Aware and Adaptive
Note the futility of following a static approach to security. Another important issue, probably covered in the report, is the false sense of security that comes from depending on a static security environment.
--Peter
Thursday, November 4, 2010
Metasploit and SCADA exploits: dawn of a new era?
By Ryan Naraine | November 4, 2010, 11:23am PDT
On October 18, 2010, a significant event occurred concerning threats to SCADA (supervisory control and data acquisition) environments: a zero-day exploit for the RealFlex RealWin SCADA software product was added to the Metasploit repository. Let's think through the ramifications.
Wednesday, November 3, 2010
The bigger the system, the greater the chance of failure
By Joe McKendrick | November 3, 2010, 7:00pm PDT
IT projects will see more success as smaller, bite-size chunks, especially since software has reached a state of complexity far beyond the ability of any individual.
New Google Tool Makes Websites Twice as Fast
Technology Review (11/03/10) Erica Naone
Google has released mod_pagespeed, free software for Apache servers that could make many Web sites load twice as fast. Once installed, the software automatically determines ways to optimize a Web site's performance. "We think making the whole Web faster is critical to Google's success," says Google's Richard Rabbat. The tool could be especially useful to small Web site operators and anyone who uses a content management system to run a Web site, since such operators often lack the technical savvy and time needed to make their own speed improvements to Web server software. During testing, mod_pagespeed was able to make some Web sites load three times faster, depending on how much optimization had already been done. The program builds on Google's existing Page Speed program, which measures the speed at which Web sites load and offers suggestions on how to make them load faster.
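For site operators who want to sanity-check the claimed speedup themselves, a rough before-and-after timing can be done with a few lines of Python. This is a crude sketch: it measures only the HTML fetch, not full page rendering, and the URL is a placeholder.

import time
import urllib.request

def fetch_time(url, runs=5):
    # best-of-n fetch time, to reduce network noise
    times = []
    for _ in range(runs):
        start = time.perf_counter()
        urllib.request.urlopen(url).read()
        times.append(time.perf_counter() - start)
    return min(times)

# Run once before and once after enabling mod_pagespeed, then compare:
print(fetch_time("http://example.com/"))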
Tuesday, November 2, 2010
A Software Application Recognizes Human Emotions From Conversation Analysis
Universidad Politecnica de Madrid (Spain) (11/02/10) Eduardo Martinez
Researchers at the Universidad Politecnica de Madrid have developed an application that can recognize human emotions through automated voice analysis. The program, based on a fuzzy logic tool called RFuzzy, analyzes a conversation and can determine whether the speaker is sad, happy, or nervous. If the emotion is unclear, the program can specify how close the speaker is to each emotion as a percentage. RFuzzy also can reason with subjective concepts such as high, low, fast, and slow. The researchers say RFuzzy, which was written in Prolog, also could be used in conversation analysis and robot intelligence applications. For example, RFuzzy was used to program robots that participated in the RoboCupSoccer league. Because RFuzzy's logical mechanisms are flexible, its analysis can be interpreted based on logic rules that use measurable parameters, such as volume, position, distance from the ball, and speed.
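The flavor of that fuzzy reasoning can be sketched in Python rather than RFuzzy's actual Prolog syntax; the membership functions and emotion rules below are invented for illustration, but they show how a voice sample can match each emotion to a degree rather than all-or-nothing.

def triangular(x, low, peak, high):
    # classic triangular fuzzy membership function, returns a degree in [0, 1]
    if x <= low or x >= high:
        return 0.0
    return (x - low) / (peak - low) if x < peak else (high - x) / (high - peak)

def classify(volume, speed):
    # degree to which the voice matches each emotion, as a percentage
    degrees = {
        "nervous": min(triangular(volume, 0.5, 0.8, 1.0), triangular(speed, 0.6, 0.9, 1.0)),
        "happy":   min(triangular(volume, 0.4, 0.7, 1.0), triangular(speed, 0.3, 0.5, 0.8)),
        "sad":     min(triangular(volume, 0.0, 0.2, 0.5), triangular(speed, 0.0, 0.2, 0.5)),
    }
    return {emotion: round(100 * d) for emotion, d in degrees.items()}

print(classify(volume=0.75, speed=0.45))   # -> mostly "happy" for these toy rules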
Monday, November 1, 2010
New Help on Testing for Common Cause of Software Bugs
Government Computer News (11/01/10) William Jackson
As part of the Automated Combinatorial Testing for Software (ACTS) program, the U.S. National Institute of Standards and Technology (NIST) has developed algorithms for automated testing of the multiple variables in software that can cause security faults. Research has shown that at least 89 percent of security faults are caused by combinations of no more than four variables, and nearly all are caused by no more than six variables, according to NIST. "This finding has important implications for testing because it suggests that testing combinations of parameters can provide highly effective fault detection," NIST says. The ACTS program is a collaborative effort by NIST, the U.S. Air Force, the University of Texas at Arlington, George Mason University, Utah State University, the University of Maryland, and North Carolina State University to produce methods and tools to generate tests for any number of variable combinations.
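The payoff is easy to demonstrate: the Python sketch below greedily builds a 2-way (pairwise) test suite. It is a naive construction for illustration, not NIST's ACTS algorithms, and the parameters are invented, but it covers every pair of parameter values with far fewer tests than exhaustive enumeration.

from itertools import combinations, product

def pairs_in(test):
    # every pair of (parameter, value) settings this test covers
    return {frozenset([(a, test[a]), (b, test[b])]) for a, b in combinations(test, 2)}

def pairwise_suite(parameters):
    names = list(parameters)
    candidates = [dict(zip(names, vals)) for vals in product(*(parameters[n] for n in names))]
    uncovered = set().union(*(pairs_in(t) for t in candidates))
    suite = []
    while uncovered:
        # greedily pick the candidate covering the most uncovered pairs
        best = max(candidates, key=lambda t: len(pairs_in(t) & uncovered))
        suite.append(best)
        uncovered -= pairs_in(best)
    return suite

# Four parameters with three values each: 81 exhaustive combinations.
params = {p: ["a", "b", "c"] for p in ("p1", "p2", "p3", "p4")}
suite = pairwise_suite(params)
print(len(suite), "tests cover every 2-way combination, versus 81 exhaustive")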