Java Is Becoming the New Cobol
InfoWorld (12/28/07) Snyder, Bill
Java is becoming less popular with developers as many switch to Ruby on Rails, PHP, AJAX, and Microsoft's .Net to develop rich Internet applications. Many developers feel that Java slows them down. Peter Thoeny, CEO of Twiki.net, which produces a certified version of the open source Twiki wiki-platform software, says Java promised to solve incompatibility problems across platforms, but the different versions and different downloads of Java are creating complications. Ofer Ronen, CEO of Sendori, which routes domain traffic to online advertisers and ad networks, says languages such as Ruby offer pre-built structures, such as shopping carts, that would have to be built from scratch in Java. Zephyr CEO Samir Shah says Java's user-interface capabilities and memory footprint simply do not measure up, putting it at a serious disadvantage in mobile application development. Nevertheless, developers and analysts agree that Java is still going strong in internally developed enterprise apps. "On the back end, there is still a substantial amount of infrastructure available that makes Java a very strong contender," Shah says.
Friday, December 28, 2007
Saturday, December 22, 2007
Security: Wi-Fi Routers Are Vulnerable to Viruses
New Scientist (12/22/07) Merali, Zeeya
Indiana University Bloomington researcher Steven Myers has been investigating how a virus could spread between wireless routers. "We forget that routers are mini-computers," Myers says. "They have memory, they are networked, and they are programmable." However, routers are not usually scanned for viruses or protected by firewalls, and while Myers says there are no known viruses that target routers, they are still easy targets. Routers within about 100 meters of one another could pass viruses back and forth, creating a vast network for viruses to travel across. While routers normally do not communicate with each other, it would be easy for hackers to create a virus that enables routers to communicate. Myers used records on the location of Wi-Fi routers around Chicago, Manhattan, San Francisco, Boston, and parts of Indianapolis to create a simulation of how a router attack might spread. In each simulated city, viruses were able to jump between routers lacking high-security encryption within 45 meters of each other. The virus spread surprisingly fast, with most of the tens of thousands of routers becoming infected within 48 hours. The geography of the cities affected how the virus spread, with rivers and bays acting as "natural firewalls." Routers can be protected by changing the password from the default setting and enabling high-security WPA encryption. University of Cambridge computer scientist Ross Anderson says the study exposes a more significant problem in that all electronics, including phones, routers, and even microwaves, are being built with software that could potentially become infected.
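The article does not publish the researchers' model, but the idea of a geometric epidemic is easy to sketch: scatter routers at random points, connect any pair within the infection radius (45 meters for weakly secured routers in the study), and let the infection advance one hop per time step. Everything here, from the router count to the area, is an illustrative assumption, not the paper's actual parameters.

```python
import math
import random
from collections import deque

def simulate_spread(n=500, area=1000.0, radius=45.0, seed=1):
    """Toy geometric-spread model: routers are random points in an
    area x area square of meters; an infected router infects every
    unpatched neighbor within `radius` meters, one hop per step.
    Returns the final infected set and the infection count per step."""
    rng = random.Random(seed)
    pts = [(rng.uniform(0, area), rng.uniform(0, area)) for _ in range(n)]

    def neighbors(i):
        xi, yi = pts[i]
        return [j for j in range(n)
                if j != i and math.hypot(pts[j][0] - xi, pts[j][1] - yi) <= radius]

    infected = {0}               # router 0 is the initial victim
    history = [len(infected)]    # infected count after each time step
    frontier = deque([0])
    while frontier:
        nxt = deque()
        for i in frontier:
            for j in neighbors(i):
                if j not in infected:
                    infected.add(j)
                    nxt.append(j)
        frontier = nxt
        history.append(len(infected))
    return infected, history
```

Running this with a denser `n` shows the behavior the study reports: once router density pushes most nodes into one connected component, a single infection sweeps nearly the whole city, while sparse regions (the "natural firewalls") stay untouched.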
Tuesday, December 11, 2007
Security: DNS Attack Could Signal Phishing 2.0
Robert McMillan, IDG News Service
Researchers at Google and the Georgia Institute of Technology are studying a virtually undetectable form of attack that quietly controls where victims go on the Internet.
The study, set to be published in February, takes a close look at "open recursive" DNS servers, which are used to tell computers how to find each other on the Internet by translating domain names like google.com into numerical Internet Protocol addresses. Criminals are using these servers in combination with new attack techniques to develop a new generation of phishing attacks.
…
The Georgia Tech and Google researchers estimate that as many as 0.4 percent of open-recursive DNS servers, roughly 68,000 of them, are behaving maliciously, returning false answers to DNS queries. They also estimate that another two percent provide questionable results. Collectively, these servers are beginning to form a "second secret authority" for DNS that is undermining the trustworthiness of the Internet, the researchers warned.
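One way such lying resolvers are caught, in the spirit of the study, is to send the same question to a suspect open resolver and to a trusted one and compare the answers. The DNS wire format (RFC 1035) is simple enough to build by hand; the sketch below constructs a minimal A-record query packet that could be sent over UDP port 53 to each resolver. This is an illustrative implementation, not the researchers' actual measurement code.

```python
import struct

def build_dns_query(domain, qtype=1, txid=0x1234):
    """Build a minimal DNS query packet (RFC 1035 wire format).
    qtype=1 requests an A record; txid is the transaction ID the
    resolver must echo back in its response."""
    # Header: id, flags (RD=1, recursion desired), QDCOUNT=1,
    # ANCOUNT=0, NSCOUNT=0, ARCOUNT=0
    header = struct.pack(">HHHHHH", txid, 0x0100, 1, 0, 0, 0)
    # QNAME: each label length-prefixed, terminated by a zero byte
    qname = b"".join(
        bytes([len(label)]) + label.encode("ascii")
        for label in domain.split(".")
    ) + b"\x00"
    # QTYPE followed by QCLASS=1 (IN)
    question = qname + struct.pack(">HH", qtype, 1)
    return header + question
```

Sending this packet for, say, a bank's domain to an open-recursive server and to a known-good resolver, then comparing the returned IP addresses, flags the kind of silent redirection the researchers describe: a malicious resolver answers with an attacker-controlled address instead of the real one.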
http://www.computerworld.com/action/article.do?command=viewArticleBasic&articleId=9052198
Saturday, December 1, 2007
Software: Computing in a Parallel Universe
American Scientist (12/07) Vol. 95, No. 6, P. 476; Hayes, Brian
Multicore chips that facilitate parallel processing will require a major rethinking of program design, writes Brian Hayes. Software for parallel processors is vulnerable to subtle bugs that cannot occur in strictly sequential programs. Writing correct concurrent programs is possible, but a key challenge is that running the same set of programs on the same set of inputs can produce different results depending on the precise timing of events. One approach to this problem is to have the operating system manage the allocation of tasks to processors and balance the workload, which is currently the chief strategy with time-sliced multiprocessing and dual-core chips. Another is to assign this responsibility to the compiler, which, like the first strategy, would leave the problem to expert programmers. But making the most of parallel computing requires all programmers to deal with the problems of creating programs that run efficiently and properly on multicore systems. "We have a historic opportunity to clean out the closet of computer science, to throw away all those dusty old sorting algorithms and the design patterns that no longer fit," Hayes concludes. "We get to make a fresh start."
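The timing-dependent bugs Hayes describes are easy to reproduce: a shared read-modify-write is the textbook case. A minimal sketch, with all names and parameters illustrative, shows the fix with a mutual-exclusion lock; remove the lock and the same program can return a different, smaller total on each run, because threads overwrite each other's updates.

```python
import threading

def count_with_lock(n_threads=8, increments=10_000):
    """Each thread increments a shared counter. The lock serializes
    the read-modify-write of `total`, making the result deterministic:
    exactly n_threads * increments. Without the lock, interleaved
    updates can be lost, and the result varies with event timing."""
    total = 0
    lock = threading.Lock()

    def worker():
        nonlocal total
        for _ in range(increments):
            with lock:  # remove this line to see timing-dependent results
                total += 1

    threads = [threading.Thread(target=worker) for _ in range(n_threads)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return total
```

The lock restores the determinism of the sequential version at the cost of serializing the hot path, which is exactly the efficiency-versus-correctness tension the article says every programmer, not just experts, will now have to manage.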