Saturday, October 30, 2010
Blog: In D.C.'s Web Voting Test, the Hackers Were the Good Guys
In D.C.'s Web Voting Test, the Hackers Were the Good Guys
Washington Post (10/30/10) Jeremy Epstein; David Jefferson; Barbara Simons
Washington, D.C., held an Internet voting experiment in September during which a team of University of Michigan hackers successfully penetrated election computers and rigged the electoral outcome, demonstrating the extreme national security hazards of online voting, write SRI International computer scientist Jeremy Epstein, Lawrence Livermore National Laboratory researcher David Jefferson, and former ACM president Barbara Simons. They say the test verifies that Internet voting systems can be assaulted from anywhere by any malicious individual or entity, and effective defense is a virtual impossibility. Worse still, a cyberattack against an election may be completely invisible to election officials. The computer security community agrees that no secure Internet voting framework exists for public elections, and that simply correcting the problems highlighted by the recent experiment will not ensure the system's security. Epstein, Jefferson, and Simons contend that the hacker team "has done our nation an enormous service" by calling attention to the dangers of Internet voting, but they lament that more than 30 U.S. states are permitting the use of online voting systems in the midterm elections, despite the lessons learned from the D.C. experiment.
Thursday, October 28, 2010
Blog: Computer Scientists Make Progress on Math Puzzle
Computer Scientists Make Progress on Math Puzzle
UT Dallas News (10/28/10) David Moore
University of Texas at Dallas (UTD) professors Linda Morales and Hal Sudborough have made progress on the Topswops mathematical puzzle. Stanford University computer scientist Donald Knuth previously proved an exponential upper bound on the number of Topswops steps, but Morales and Sudborough proved a lower bound that is better than that proposed in Knuth's conjecture. "What I find fascinating about a problem such as bounding the Topswops function is connected to its simplicity, to its fundamental nature, and to the complexity and difficulty of finding an answer," Sudborough says. "Our research uncovered permutations whose iterate sequences have a fascinating structure, which upon analysis have revealed hitherto unknown lower bounds for the problem," Morales says. Knuth called their proof technique both "elegant" and "amazing." "There is much more to learn from the problem," Morales says. "We have tantalizing hints of more revelations just waiting to be uncovered."
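For context on the puzzle itself: a Topswops deck is a permutation of the cards 1 through n; at each step, if the top card shows k, the top k cards are reversed, and play stops once 1 reaches the top. The open question concerns how many steps this can take in the worst case. The minimal Python sketch below (an illustration for this blog, not code from the UTD researchers) brute-forces that maximum for small decks:

from itertools import permutations

def topswops_steps(deck):
    """Count Topswops steps: while the top card is k != 1, reverse the top k cards."""
    deck = list(deck)
    steps = 0
    while deck[0] != 1:
        k = deck[0]
        deck[:k] = reversed(deck[:k])
        steps += 1
    return steps

def max_topswops(n):
    """Brute-force the worst case over all n! starting decks."""
    return max(topswops_steps(p) for p in permutations(range(1, n + 1)))

for n in range(1, 9):
    print(n, max_topswops(n))  # maxima: 0, 1, 2, 4, 7, 10, 16, 22

Knuth's exponential upper bound and the new lower bound both concern how fast this maximum grows as n increases.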
Monday, October 25, 2010
Blog: 7 Programming Languages on the Rise
7 Programming Languages on the Rise
InfoWorld (10/25/10) Peter Wayner
Seven increasingly popular niche programming languages offer features that cannot be found in the dominant languages. For example, Python has gained popularity in scientific labs. "Scientists often need to improvise when trying to interpret results, so they are drawn to dynamic languages which allow them to work very quickly and see results almost immediately," says Python creator Guido van Rossum. Many Wall Street firms also rely on Python because they like to hire university scientists to work on complex financial analysis problems. Meanwhile, Ruby is becoming popular for prototyping, and many Ruby sites are devoted to cataloging data that can be stored in tables. MATLAB was originally designed to help mathematicians solve systems of linear equations, but it has also found a following in the enterprise because of the large volumes of data that organizations need to analyze. Although JavaScript is not a new programming language, new applications for it are constantly in development. For example, CouchDB uses JavaScript's Map and Reduce functions to help bring harmony to both client- and server-side programming. Other popular niche languages include R (also known as S and S-Plus), Erlang, Cobol, and CUDA.
Sunday, October 24, 2010
Blog: As E-Voting Comes of Age, Security Fears Mount
As E-Voting Comes of Age, Security Fears Mount
Agence France-Presse (10/24/10) Rob Lever
New technologies that allow voters to cast ballots using the Internet or other electronic means are gaining popularity in the United States and elsewhere, despite growing security concerns. Thirty-three U.S. states are allowing some email, fax, or online ballots in 2010, according to the Verified Voting Foundation (VVF). These systems have the potential to increase voter participation, but their security remains in question. For example, University of Michigan computer scientists recently hacked into a Washington, D.C., pilot Internet voting system and modified it to play the university fight song after each ballot was cast. "Within the first three hours or so of looking at the code we found the first open door and within 36 hours we had taken control of the system," according to Michigan professor Alex Halderman. He says that during the attack they discovered that hackers from Iran and China also were trying to break into the system. "After this, there can be no doubt that the burden of proof in the argument over the security of Internet voting systems has definitely shifted to those who claim that the systems can be made secure," says VVF chairman David Jefferson.
Friday, October 22, 2010
Blog: D.C. Hacking Raises Questions About Future of Online Voting
D.C. Hacking Raises Questions About Future of Online Voting
Stateline.org (10/22/10) Sean Greene
Washington, D.C.'s failure to prevent a team of University of Michigan computer scientists from taking control of its online voting Web site has called into question the future of electronic voting. Michigan professor J. Alex Halderman noted that his team had taken complete control of the elections board's server. Although some experts say the incident proves that the Internet, in its current state, cannot support secure online voting, others still see potential in the technology. For example, Arizona and eight counties in West Virginia are planning to go ahead with online voting experiments on November 2. "All an attacker has to find is one hole in a system to mount a serious attack," warns University of California, Berkeley researcher Joe Hall. Washington, D.C., used a system based on open source software, believing that it provides the transparency necessary for elections. The West Virginia counties are using proprietary software that officials say should be more resistant to hackers. Rokey Suleman, executive director of the D.C. board of elections, says the hacking incident is an opportunity to improve the technology. "We are not disappointed that this occurred," Suleman says. "It is an opportunity for the computer science community to work with us."
Wednesday, October 20, 2010
Blog: New Search Method Tracks Down Influential Ideas
New Search Method Tracks Down Influential Ideas
Princeton University (10/20/10) Chris Emery
Princeton University computer scientists have developed a method that uses computer algorithms to trace the origins and spread of ideas, which they say could make it easier to measure the influence of scholarly papers, news stories, and other information sources. The algorithms analyze how language changes over time within a group of documents and determine which documents had the most influence. "The point is being able to manage the explosion of information made possible by computers and the Internet," says Princeton professor David Blei. He says the search method could eventually enable historians, political scientists, and other scholars to study how ideas originate and spread. The Princeton method enables computers to analyze the actual text of documents, instead of focusing on citations, to see how the language changes over time. "We are also exploring the idea that you can find patterns in how language changes over time," Blei says.
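As a rough intuition only (the Princeton method is far more sophisticated and based on probabilistic models of text), one can picture scoring a document by whether the corpus's vocabulary in later years drifts toward that document's own word choices. The toy Python sketch below makes that idea concrete; the overlap measure and example texts are illustrative assumptions, not Blei's algorithm:

from collections import Counter

def word_dist(texts):
    """Relative word frequencies over a list of text strings."""
    counts = Counter(w for t in texts for w in t.lower().split())
    total = sum(counts.values())
    return {w: c / total for w, c in counts.items()}

def overlap(p, q):
    """Histogram intersection between two word distributions (0 to 1)."""
    return sum(min(p.get(w, 0.0), q.get(w, 0.0)) for w in set(p) | set(q))

def influence_score(doc, earlier_texts, later_texts):
    """Positive if later language looks more like the document than earlier language did."""
    d = word_dist([doc])
    return overlap(d, word_dist(later_texts)) - overlap(d, word_dist(earlier_texts))

earlier = ["markets rallied on strong earnings", "the committee raised interest rates"]
later = ["neural networks learn representations", "deep neural models beat benchmarks"]
print(influence_score("neural networks and learning", earlier, later))  # > 0: language drifted toward the doc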
Monday, October 18, 2010
Blog: Analyzing Almost 10 Million Tweets, Research Finds Public Mood Can Predict Dow Days in Advance
Analyzing Almost 10 Million Tweets, Research Finds Public Mood Can Predict Dow Days in Advance
IU News Room (10/18/10) Steve Chaplin
Indiana University researchers found that by analyzing millions of tweets they can predict the movement of the Dow Jones Industrial Average (DJIA) up to a week in advance with nearly 90 percent accuracy. Indiana professor Johan Bollen and Ph.D. candidate Huina Mao used two mood-tracking tools to analyze the text content of more than 9.8 million tweets and compared the public mood to the DJIA's closing values. One tool, called OpinionFinder, analyzed the tweets to give a positive or negative daily time series of public mood. The other tool, Google-Profile of Mood States, measures the mood of tweets in six dimensions--calm, alert, sure, vital, kind, and happy. The two tools gave the researchers seven public mood time series that could be matched against a similar daily time series of DJIA closing values. "What we found was an accuracy of 87.6 percent in predicting the daily up and down changes in the closing values of the Dow Jones Industrial Average," Bollen says. The researchers demonstrated that public mood can significantly improve the accuracy of the basic models currently used to predict DJIA closing values by implementing a prediction model called a Self-Organizing Fuzzy Neural Network.
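To illustrate the kind of comparison involved (a simplified stand-in, not the study's statistical analysis or its Self-Organizing Fuzzy Neural Network), the Python sketch below checks how often the direction of a lagged daily mood series matches the direction of the next day's index move; the random series here are placeholders, not real data:

import random

def directional_accuracy(mood, closes, lag=3):
    """Fraction of days on which the sign of the mood change `lag` days earlier
    matches the sign of the day-over-day change in the closing value."""
    hits = total = 0
    for t in range(lag + 1, len(closes)):
        mood_change = mood[t - lag] - mood[t - lag - 1]
        price_change = closes[t] - closes[t - 1]
        if mood_change == 0 or price_change == 0:
            continue
        total += 1
        hits += (mood_change > 0) == (price_change > 0)
    return hits / total if total else float("nan")

random.seed(0)
daily_mood = [random.gauss(0, 1) for _ in range(250)]            # placeholder for a daily "calm" score
daily_close = [10000 + random.gauss(0, 50) for _ in range(250)]  # placeholder for DJIA closes
print(directional_accuracy(daily_mood, daily_close))             # ~0.5 for unrelated random series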
Sunday, October 17, 2010
Blog: HIMSS Analytics, the 8 stages to creating a paperless patient record environment
HIMSS Analytics, the authoritative source on EMR adoption trends, devised the EMR Adoption Model (EMRAM) to track electronic medical record (EMR) progress at hospitals and health systems. The EMRAM scores hospitals in the HIMSS Analytics Database on their progress in completing the 8 stages to creating a paperless patient record environment.
Thursday, October 14, 2010
Blog: Faster Websites, More Reliable Data
Faster Websites, More Reliable Data
MIT News (10/14/10) Larry Hardesty
Massachusetts Institute of Technology (MIT) researchers have developed TxCache, a database caching system that eliminates certain types of data inconsistency while making database caches easier to program. TxCache is designed to solve the problem of making sure that data cached on local servers is as current as the data stored in the master database. The MIT system can handle transactions, sets of computations that are treated as a block, meaning that none of the computations will be performed unless all of them can be performed. TxCache makes it easier for programmers to manage caches, says MIT graduate student Dan Ports, who led the system's development along with professor Barbara Liskov, who received the 2008 ACM A.M. Turing Award. Ports says that with TxCache, programmers change just one line of code to mark a computation as cacheable, and the cached copies are automatically kept up to date everywhere. The system has to track what data are cached where, and which data depend on each other, Liskov says. The researchers say that during testing, Websites using TxCache were more than five times faster.
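As a rough illustration of the underlying idea (a generic Python sketch under invented names, not TxCache's actual interface or protocol), a cache can tag each stored result with the database version it was computed against, so that a transaction pinned to one snapshot never mixes results from different versions:

class SnapshotCache:
    """Toy get-or-compute cache that refuses to mix data from different database versions."""

    def __init__(self):
        self.store = {}       # key -> (version, value)
        self.db_version = 0   # bumped whenever a write commits

    def note_commit(self):
        self.db_version += 1

    def cached(self, key, compute, txn_version):
        """Return a value valid at txn_version, recomputing if the entry is from another version."""
        entry = self.store.get(key)
        if entry is not None and entry[0] == txn_version:
            return entry[1]
        value = compute()
        self.store[key] = (txn_version, value)
        return value

cache = SnapshotCache()
txn = cache.db_version                                       # transaction pinned to a snapshot
total = cache.cached("order_total:42", lambda: 199.99, txn)  # computed once, reused within the snapshot
cache.note_commit()                                          # a later write commits; new transactions recompute
print(total, cache.db_version)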
Blog: Five tips to learn from failure
Five tips to learn from failure
By Michael Krigsman, October 14, 2010, 5:56am PDT
This five-point advisory list offers a great start to organizations that want to improve IT project success rates.
Wednesday, October 6, 2010
Blog: W3C: Hold Off on Deploying HTML5 in Websites
W3C: Hold Off on Deploying HTML5 in Websites
InfoWorld (10/06/10) Paul Krill
The World Wide Web Consortium (W3C) says that HTML5, which has gained the support of Microsoft, Apple, and Google, is still not ready for deployment to Web sites. "The problem we're facing right now is there is already a lot of excitement for HTML5, but it's a little too early to deploy it because we're running into interoperability issues," including differences in video support across devices, says W3C's Philippe Le Hegaret. Companies can now deploy HTML5 in their applications or on intranets, where a rendering engine can be controlled, but it is not feasible on the open Web right now, Le Hegaret says. When finished, HTML5 will support a variety of modern Web applications, and Le Hegaret notes it is now being viewed as a "game changer." "What's happening is the industry is realizing that HTML5 is going to be real," he says. However, Le Hegaret says W3C still plans to make some API changes, and notes that HTML5 also does not yet work across all browsers. "We basically want to be feature-complete by mid-2011," he says.
Blog: D.C. Web Voting Flaw Could Have Led to Compromised Ballots
D.C. Web Voting Flaw Could Have Led to Compromised Ballots
Computerworld (10/06/10) Jaikumar Vijayan
University of Michigan researchers recently found a major security flaw in Washington, D.C.'s new Digital Vote by Mail system that enabled them to access, modify, and replace marked ballots in the system. The shell injection flaw in the ballot upload function allowed the researchers to access usernames, passwords, and the public key used to encrypt ballots, according to Michigan professor Alex Halderman. He also says the researchers were able to install a backdoor on the server, which enabled them to view the recorded votes and the names of the voters. "If this particular problem had not existed, I'm confident that we would have found another way to attack the system," Halderman says. The Digital Vote by Mail system is designed to let military personnel and overseas U.S. civilians receive and cast ballots over the Internet, using a pre-provided PIN to authenticate themselves. In response to the discovery of the security flaws, D.C.'s Board of Elections and Ethics announced that voters will not be allowed to use Digital Vote by Mail to send back ballots.
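The article does not publish the vulnerable code, but the general bug class is well known: building a shell command string from an attacker-controlled filename lets the attacker smuggle in extra commands. The hypothetical Python sketch below contrasts the unsafe pattern with a version that avoids the shell and validates the name; the "election-key" recipient is an invented placeholder, not the D.C. system's configuration:

import subprocess

def encrypt_ballot_unsafe(uploaded_name):
    # BAD: a filename such as "ballot.pdf; rm -rf /" makes the shell run a second command.
    subprocess.run(f"gpg --encrypt --recipient election-key {uploaded_name}", shell=True)

def encrypt_ballot_safer(uploaded_name):
    # Better: validate the name and pass arguments as a list so no shell is involved.
    if "/" in uploaded_name or uploaded_name.startswith("-") or ";" in uploaded_name:
        raise ValueError("rejected suspicious filename")
    subprocess.run(["gpg", "--encrypt", "--recipient", "election-key", uploaded_name], check=True)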
Blog: Stopping Malware: BLADE Software Eliminates "Drive-By Downloads" From Malicious Websites
Stopping Malware: BLADE Software Eliminates "Drive-By Downloads" From Malicious Websites
Georgia Tech Research News (10/06/10) Abby Vogel Robinson
Georgia Tech researchers have developed Block All Drive-By Download Exploits (BLADE), a browser-independent tool that eliminates drive-by download threats. "BLADE is an effective countermeasure against all forms of drive-by download malware installs because it is vulnerability and exploit agnostic," says Georgia Tech professor Wenke Lee. In testing, BLADE blocked all drive-by malware installation attempts from the more than 1,900 malicious Web sites tested. "BLADE monitors and analyzes everything that is downloaded to a user's hard drive to cross-check whether the user authorized the computer to open, run, or store the file on the hard drive," says Georgia Tech graduate student Long Lu. Testing found that Adobe Reader, Java, and Adobe Flash were the most frequently targeted applications. "BLADE requires a user's browser to be configured to require explicit consent before executable files are downloaded, so if this option is disabled by the user, then BLADE will not be able to protect that user's Web surfing activities," Lee notes.
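A highly simplified way to picture that consent cross-check (a toy Python sketch, not BLADE's actual implementation): record the moment a user explicitly agrees to save a file, and only release a downloaded executable from quarantine if a matching consent event exists shortly before the file appeared on disk.

from dataclasses import dataclass

@dataclass
class ConsentEvent:
    filename: str
    timestamp: float  # seconds since epoch, logged when the user clicks "Save"

def allow_execution(downloaded_file, downloaded_at, consents, window=30.0):
    """Permit the file only if the user consented to this filename within `window` seconds before it landed."""
    return any(
        c.filename == downloaded_file and 0 <= downloaded_at - c.timestamp <= window
        for c in consents
    )

consents = [ConsentEvent("report.pdf", 1000.0)]
print(allow_execution("report.pdf", 1010.0, consents))   # True: the user clicked Save
print(allow_execution("update.exe", 1012.0, consents))   # False: drive-by download, no consent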
Friday, October 1, 2010
Blog: Professor Wendy Hall Speaks [on Web Science; why Semantic Web failed to emerge]
Professor Wendy Hall Speaks
Inquirer (UK) (10/01/10) Wendy M. Grossman
Wendy Hall, dean of the University of Southampton's new Faculty of Physical and Applied Sciences, helped found the Web Science Research Initiative for the purpose of facilitating a formal understanding of the Web. "Partly what I'm passionate about in Web science ... [is] what makes the Web what it is, how it evolves and will evolve, what are the scenarios that could kill it or change it in ways that would be detrimental to its use," she says. Hall says the need for the initiative was demonstrated when she discussed with Web pioneer Tim Berners-Lee why the semantic Web has failed to emerge. Berners-Lee determined that the semantic Web concept was commandeered by the artificial intelligence community for the purpose of solving massive problems, when what was needed was for the data to be freed up so that people's use of it could be observed. "What creates the Web are us who put the content on it, and that's not natural or engineered," Hall says. Among the subjects she says can be explored through Web science is how to construct a better future Web by predicting what people would and would not do with it.
Blog: Regulators Blame Computer Algorithm for Stock Market 'Flash Crash'
Regulators Blame Computer Algorithm for Stock Market 'Flash Crash'
Computerworld (10/01/10) Lucas Mearian
A joint investigation by the U.S. Securities and Exchange Commission and the Commodity Futures Trading Commission has issued a report attributing the May 6 stock market flash crash to an automated trade execution system that inundated the Chicago Mercantile Exchange's Globex electronic trading platform with a major sell order, triggering a nearly 1,000-point plunge in the Dow Jones Industrial Average within half an hour. "[A] large fundamental trader chose to execute this sell program via an automated execution algorithm that was programmed to feed orders into the June 2010 E-Mini market to target an execution rate set to 9 percent of the trading volume calculated over the previous minute, but without regard to price or time," the report says. "The execution of this sell program resulted in the largest net change in daily position of any trader in the E-Mini since the beginning of the year." The study found that under strained market conditions, the automated execution of a big sell order can induce extreme price movements, particularly if the automated execution algorithm does not factor in price. "Moreover, the interaction between automated execution programs and algorithmic trading strategies can quickly erode liquidity and result in disorderly markets," the report concludes.
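To make the quoted description concrete, the Python sketch below implements a bare-bones percentage-of-volume schedule: each minute it sells 9 percent of the previous minute's traded volume, with no price or time constraint. The volumes and order size are made-up numbers, and real execution algorithms are far more involved:

def pov_sell(total_contracts, minute_volumes, participation=0.09):
    """Yield (minute, contracts_sold) until the sell order is filled."""
    remaining = total_contracts
    prev_volume = minute_volumes[0]
    for minute, volume in enumerate(minute_volumes[1:], start=1):
        if remaining <= 0:
            break
        slice_size = min(remaining, int(participation * prev_volume))
        remaining -= slice_size
        yield minute, slice_size
        # The algorithm's own selling adds to traded volume, which in turn raises
        # the next minute's target -- the feedback loop the report describes.
        prev_volume = volume

volumes = [50_000, 80_000, 140_000, 200_000]   # contracts traded per minute (illustrative)
for minute, sold in pov_sell(75_000, volumes):
    print(f"minute {minute}: sold {sold} contracts")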