Wednesday, June 30, 2010
Blog: Computer Automatically Deciphers Ancient Language
Computer Automatically Deciphers Ancient Language
MIT News (06/30/10) Hardesty, Larry
Researchers at the Massachusetts Institute of Technology (MIT) and the University of Southern California have developed a computer system that deciphered much of the ancient Semitic language of Ugaritic in a matter of hours. The researchers say archeologists can use the system to decipher other ancient languages, and it also could help expand the number of languages that automated translation systems can handle. "We iterate through the data hundreds of times, thousands of times, and each time, our guesses have higher probability, because we're actually coming closer to a solution where we get more consistency," says MIT's Ben Snyder. To decipher a language, the system makes several assumptions: that the language is related to another, known language; that there is a way to systematically map its alphabet onto the known one; and that its words share at least some roots with those of the known language. "Each language has its own challenges," says MIT professor Regina Barzilay. "Most likely, a successful decipherment would require one to adjust the method for the peculiarities of a language."
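The iterate-and-rescore idea Snyder describes can be sketched as a toy search over letter mappings, where each hypothesis is scored by how consistently it turns ciphertext words into words of the known, related language. Everything below (the vocabularies, the exhaustive search standing in for the probabilistic iteration) is illustrative, not MIT's actual model.

```python
import itertools

# Words of the known, related language.
KNOWN_VOCAB = {"ab", "ba", "aab"}
# The same words written in an unknown alphabet.
CIPHER_WORDS = ["xy", "yx", "xxy"]

def score(mapping):
    """Count ciphertext words that decode to known-vocabulary words."""
    decoded = ["".join(mapping[c] for c in w) for w in CIPHER_WORDS]
    return sum(1 for w in decoded if w in KNOWN_VOCAB)

def best_mapping(cipher_letters="xy", known_letters="ab"):
    # Exhaustive search stands in for the iterative probability updates:
    # each pass keeps the hypothesis with the highest consistency score.
    return max(
        (dict(zip(cipher_letters, perm))
         for perm in itertools.permutations(known_letters)),
        key=score,
    )
```

On this toy data the search settles on the mapping under which every decoded word is a known word, mirroring the "more consistency" criterion from the quote.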
Tuesday, June 22, 2010
Blog: Data Mining Algorithm Explains Complex Temporal Interactions Among Genes
Data Mining Algorithm Explains Complex Temporal Interactions Among Genes
Virginia Tech News (06/22/10) Trulove, Susan
Researchers at Virginia Tech (VT), New York University (NYU), and the University of Milan have developed Gene Ontology based Algorithmic Logic and Invariant Extractor (GOALIE), a data-mining algorithm that can automatically reveal how biological processes are coordinated in time. GOALIE reconstructs temporal models of cellular processes from gene expression data. The researchers developed and applied the algorithm to time-course gene expression datasets from budding yeast. "A key goal of GOALIE is to be able to computationally integrate data from distinct stress experiments even when the experiments had been conducted independently," says VT professor Naren Ramakrishnan. NYU professor Bud Mishra notes GOALIE also can extract entire formal models that can then be used for posing biological questions and reasoning about hypotheses. The researchers hope the tool can be used to study disease progression, aging, host-pathogen interactions, stress responses, and cell-to-cell communication.
Monday, June 21, 2010
Blog: Blogs and Tweets Could Predict the Future
Blogs and Tweets Could Predict the Future
New Scientist (06/21/10) Giles, Jim
Forecasts about social and economic trends could be generated through the analysis of blogs and tweets, building on earlier research by Google and others to mine the frequency of specific search terms to outline purchasing patterns. With blogs and tweets added to the equation, trends other than buying behavior--such as political sentiment and stock market patterns--could possibly be predicted. For instance, researchers at the University of Illinois at Urbana-Champaign were able to forecast stock market behavior by using more than 20 million blog posts to build an "Anxiety Index" that measures the frequency with which a range of words associated with apprehension, such as "nervous," show up in the posts. The appearance of these terms correlated with lower stock prices. Tools that quantify the national mood could prove useful to stock traders, who will be more likely to refrain from taking risks if they know consumers are fraught with pessimism, for example. Researchers say that Web data analysis methods could be used to make even more accurate predictions as researchers devise more refined techniques for measuring the emotional content of blogs and tweets.
Friday, June 18, 2010
Blog: Is Cloud Computing Fast Enough for Science?
Is Cloud Computing Fast Enough for Science?
Government Computer News (06/18/10) Lipowicz, Alice
The U.S. Department of Energy's (DOE's) Magellan cloud computing testbed has shown that commercially available clouds suffer in performance when operating message passing interface (MPI) applications such as weather calculations. "For the more traditional MPI applications there were significant slowdowns, over a factor of 10," says National Energy Research Scientific Computing's Kathy Yelick. However, for computations that can be performed serially, such as genomics calculations, there was little or no deterioration in performance in a commercial cloud, Yelick says. DOE is using the Magellan project to explore a wide range of scientific issues regarding cloud computing, and to advise DOE on how to incorporate cloud computing into its research. "Our goal is to inform DOE and the scientists and industry what is the sweet spot for cloud computing in science; what do you need to do to configure a cloud for science, how do you manage it, what is the business model, and do you need to buy your own cloud," Yelick says.
Blog: Fighting Back Against Web Attacks
Fighting Back Against Web Attacks
BBC News (06/18/10)
The bugs and vulnerabilities in the kits used by cybercriminals could be exploited to identify hackers and even launch a counterattack, according to Tehtri Security's Laurent Oudot. The French computer security researcher has studied high-tech criminals' malware kits, which are widely available online, and concludes that the attack tools have vulnerabilities that should be easy to exploit. During a presentation at a recent security conference in Singapore, Oudot provided details on 13 unpatched loopholes in popular malware kits that have been used to attack Web sites. The vulnerabilities in the malware kits could be used to obtain more information on attackers, perhaps identify them, steal their tools and methods, or even follow their trail back to their own computers, according to Oudot. He acknowledges that using the loopholes might "lead to legal issues," but says the strategy should "open new way[s] to think about [information technology] security worldwide."
Wednesday, June 16, 2010
Blog: Why Can't Johnny Develop Secure Software?
Why Can't Johnny Develop Secure Software?
Dark Reading (06/16/10) Wilson, Tim
Despite a wealth of security knowledge and developers' access to advanced tools, many software security risks remain. Analysts say that vulnerabilities arise because many software developers do not understand how to build security into their code. "There's a lot more acceptance of security as part of the process now, but historically developers have never been responsible for security," says Fortify chief scientist Brian Chess. Although there have been several initiatives aimed at educating developers about secure software development practices, "the talent coming out of schools right now doesn't have the security knowledge it needs," says SAFECode executive director Paul Kurtz. Some organizations are implementing secure development frameworks, such as the Building Security In Maturity Model (BSIMM), which impose secure development best practices across the entire development team. "BSIMM is a good strategy if you have a formalized software development process," Chess says. The goal of the frameworks is to help developers identify the most common coding errors and fix them during development, rather than waiting until after the code is complete.
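One of the most common coding errors such frameworks target is building SQL from user input by string concatenation. A minimal before-and-after sketch, using Python's standard sqlite3 module with an illustrative `users` table:

```python
import sqlite3

def find_user_unsafe(conn, name):
    # Vulnerable: attacker-controlled `name` is spliced into the SQL text,
    # so input like "' OR '1'='1" rewrites the query itself.
    return conn.execute(
        "SELECT id FROM users WHERE name = '%s'" % name).fetchall()

def find_user_safe(conn, name):
    # Fixed: a parameterized query passes `name` as data, never as SQL.
    return conn.execute(
        "SELECT id FROM users WHERE name = ?", (name,)).fetchall()
```

The fix costs nothing at development time; finding and retrofitting it after release is precisely the expense the frameworks aim to avoid.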
Tuesday, June 15, 2010
Blog: 10 R&D Cybersecurity Initiatives Congress Seeks
10 R&D Cybersecurity Initiatives Congress Seeks
GovInfoSecurity.com (06/15/10) Chabrow, Eric
The Protecting Cyberspace as a National Asset Act of 2010, which was recently introduced in the U.S. Senate, lists 10 research and development (R&D) initiatives the government would support to secure information systems and networks. The bill would advance the development and deployment of secure versions of fundamental Internet protocols and architectures. A main goal is to improve technologies for detecting and analyzing attacks as well as improving mitigation and recovery methodologies. The bill also would back the development of infrastructure and tools to support cybersecurity R&D efforts. The government wants to assist the development of technology to reduce vulnerabilities in process control systems, as well as understand human behavioral factors that can affect cybersecurity technology and practices. Public officials also hope to test, evaluate, and facilitate the transfer of technologies associated with the engineering of less vulnerable software. Other research areas include assisting in the development of identity management and of technologies designed to increase the security of telecommunications networks. Finally, the government wants to advance the protection of privacy and civil liberties in cybersecurity technology and practices.
Blog: Get Smart: Targeting Phone Security Flaws
Get Smart: Targeting Phone Security Flaws
Wall Street Journal (06/15/10) P. B1; Ante, Spencer E.
As mobile devices such as smartphones and tablet computers become increasingly popular, computer researchers and hackers are discovering more security holes. Data from the National Vulnerability Database shows that last year security experts identified 30 security flaws in the software and operating systems of smartphones made by Apple, Nokia, and Research in Motion, nearly twice as many as the year before. "Manufacturers are not necessarily thinking of abuses and vulnerabilities," says Purdue University professor Eugene H. Spafford. "Instead, they are thinking of the opportunities and how to push adoption." The National Vulnerability Database also reveals that vulnerabilities in the networks and applications that run on mobile devices are on the rise. For example, the mobile version of Apple's Safari browser had 22 vulnerabilities, up from five in 2008. Smartphone manufacturers try to keep out hackers using sandboxing techniques, which prevent third-party applications from accessing specific data. But Swiss software engineer Nicolas Seriot recently published a paper indicating that such systems can be breached.
Monday, June 14, 2010
Blog: What Is IBM's Watson?
What Is IBM's Watson?
New York Times Magazine (06/14/10) Thompson, Clive
IBM artificial intelligence (AI) researchers have labored for the last three years to create a machine that can understand questions in natural language and respond quickly with precise, factual answers. The result is the Watson supercomputer, which has been pitted against human players in Jeopardy tournaments. Watson is the product of IBM's grand challenge to meet the real-world necessity of precise question-answering, and key to its development was the shift in AI research toward statistical computation of vast corpora of documents. This shift was facilitated by the decreasing cost of computing power, an explosion of online text generation, and the development of linguistic tools that helped machines puzzle through language. Watson was fed millions of documents, and the system can tackle a Jeopardy clue thousands of times concurrently using more than 100 algorithms simultaneously. However, creative wordplay can sometimes trip Watson up. On the other hand, its lack of emotion and stress is an advantage over human players. IBM's David Ferrucci believes Watson may be modeling certain ways that the human brain processes language. IBM's John Kelly thinks that Watson's question-answering capabilities could aid greatly in rapid decision-making.
Friday, June 11, 2010
Blog: Python Language Upgrade Slithers Toward Final Release
Python Language Upgrade Slithers Toward Final Release
InfoWorld (06/11/10) Krill, Paul
Developers of Python 2.7 offered a release candidate for the last upgrade in the legacy 2.x dynamic language line earlier in June, and plan to make a finished version available July 3, says Python Software Foundation chairman Steve Holden. "We anticipate a long period of end-of-life support--most likely at least five years but certainly beyond the normal two years," he says. Many Python programmers have already moved to the 3.x line because developers wanted to make some dramatic changes to the language while still maintaining its essence. "A number of 3.1 features have been back-ported [to Python 2.7], including set literals, dictionary and set comprehensions--an easy way of programmatically generating data--and the new 'io' module," Holden says. The 3.x line is not compatible with the 2.x line, but the back-ported features will help ease the migration when the time comes to move to 3.x. Developers are working on Python 3.2, which will include a rewrite of the Global Interpreter Lock to ensure thread consistency, with a final release expected in December.
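The back-ported 3.1 features Holden lists can all be shown in a few lines that run unchanged on both Python 2.7 and the 3.x line:

```python
import io

# Set literal (back-ported from 3.x to 2.7).
primes = {2, 3, 5, 7}

# Set comprehension: programmatically generate a set.
squares = {n * n for n in primes}

# Dictionary comprehension: programmatically generate a dict.
square_of = {n: n * n for n in primes}

# The new 'io' module, whose stream classes match 3.x semantics.
buf = io.BytesIO()
buf.write(b"hello")
```

Code written against these constructs in 2.7 needs no changes for this part of a 3.x port, which is the migration-easing effect the article describes.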
Thursday, June 10, 2010
Blog: AI That Picks Stocks Better Than the Pros
AI That Picks Stocks Better Than the Pros
Technology Review (06/10/10) Mims, Christopher
Iona College's Robert P. Schumaker and University of Arizona's Hsinchun Chen have developed AZFinText, a system for predicting stock price fluctuations by analyzing large quantities of financial news stories along with minute-by-minute stock price data. AZFinText will buy every stock it believes will move more than one percent beyond its current price in the next 20 minutes. Schumaker and Chen minimized the amount of text the system has to parse by reducing all the financial articles into words falling into specific categories of information. The system focuses on proper nouns and combines information about their frequency with stock prices at the moment the news article is released. Using a machine-learning algorithm on historical data, AZFinText looks for correlations that can be used to predict future stock prices.
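The two mechanical pieces of the pipeline described above, reducing an article to proper-noun counts and acting only on predicted moves of more than one percent, can be sketched simply. The capitalized-word heuristic and the function names below are illustrative simplifications, not AZFinText's actual feature extractor.

```python
import re
from collections import Counter

def proper_nouns(text):
    # Crude stand-in for proper-noun detection: capitalized words.
    return Counter(re.findall(r"\b[A-Z][a-z]+\b", text))

def should_trade(predicted_price, current_price, threshold=0.01):
    # Trade only when the predicted move exceeds 1% of the current price,
    # matching the rule described in the article.
    return abs(predicted_price - current_price) / current_price > threshold
```

In the full system, the noun counts are combined with the stock price at the article's release time and fed to a machine-learning model trained on historical data.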
Wednesday, June 9, 2010
Blog: Protecting Privacy: Make the Data 'Fade Away'
Protecting Privacy: Make the Data 'Fade Away'
University of Twente (Netherlands) (06/09/10)
Personal information can be protected by having it gradually fade away over time "like footprints in the sand," says the University of Twente's Harold van Heerde, who discusses storage structures, indexing methods, and log mechanisms in his dissertation, "Privacy-Aware Data Management by Means of Data Degradation: Making Private Information Less Sensitive Over Time." The dissertation shows that data degradation can be implemented with an acceptable loss of performance. Van Heerde believes total security might be impossible to achieve, so he wants to shift the discussion to what information should be stored, why it should be stored, and how long it should be stored. Web sites would have to consider the usefulness of the data they want to hold as they make prior agreements with their users. Although current databases are optimized for long-term data storage and access, new techniques would be needed to enable the information to be efficiently and irretrievably erased.
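Van Heerde's footprints-in-the-sand idea can be modeled as a generalization ladder: as a record ages past each threshold, its stored value is replaced by a coarser one until nothing identifying remains. The hierarchy and thresholds below are illustrative, not from the dissertation.

```python
# Ever-coarser views of one stored fact; None means fully erased.
GENERALIZATIONS = ["Main St 12, Amsterdam", "Amsterdam", "Netherlands", None]
THRESHOLDS_DAYS = [7, 30, 365]  # degrade one level after each threshold

def degraded_value(age_days):
    """Return the most specific value still permitted at this age."""
    level = sum(1 for t in THRESHOLDS_DAYS if age_days >= t)
    return GENERALIZATIONS[level]
```

The engineering challenge the dissertation addresses is making each step irreversible on real storage: the finer-grained value must be efficiently and irretrievably erased, not merely hidden, which is why new storage and indexing techniques are needed.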
Monday, June 7, 2010
Blog: The Grill: Fred Brooks
The Grill: Fred Brooks
Computerworld (06/07/10) Fitzgerald, Michael
Fred Brooks, who was project manager for the IBM System/360 and the lead designer of its operating system, says that software developers should plan on continuously iterating on their design. He says the central argument of his new book, "The Design of Design: Essays From a Computer Scientist," is that programmers would be wise to study issues beyond software. Brooks notes that "there are these invariants across mediums in which one designs. Let's try to identify these invariants and learn from the older design businesses." Brooks contends that the growing complexity of design has necessitated a shift toward team design, and he says the design of something new should begin with the selection of a chief designer who is granted authority over the design's parameters. Brooks laments the current state of U.S. computer science education, and cites a lack of preparation in elementary and especially middle school, where the cultivation of a foundation in mathematics is essential. He traces the dearth of educators to a lack of appropriate recognition and pay levels relative to other professions.
Blog: Researchers: Poor Password Practices Hurt Security for All
Researchers: Poor Password Practices Hurt Security for All
IDG News Service (06/07/10) Heichler, Elizabeth
University of Cambridge researchers recently completed a large study of password-protected Web sites and found that a lack of industry standards harms end-user security. Weak implementations of password authentication at low-security sites compromise the protections offered by higher-security sites because individuals reuse passwords, write Cambridge researchers Joseph Bonneau and Soren Preibusch. Attackers can use low-security Web sites such as news outlets to learn passwords associated with specific email addresses, and then use those passwords to access higher-security sites such as e-commerce vendors, Bonneau says. Based on data collected from 150 Web sites, the researchers say they found widespread, poor design choices, inconsistencies, and mistakes. "Sites' decisions to collect passwords can be viewed as a tragedy of the commons, with competing Web sites collectively depleting users' capacity to remember secure passwords," write the researchers. More than 75 percent of sites examined failed to provide users with feedback or advice on choosing a secure password. The researchers also found widespread weaknesses in how passwords are submitted to the server when users log in.
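The feedback that most surveyed sites failed to offer need not be elaborate. A minimal sketch of sign-up-time advice follows; the specific rules are illustrative, not the Cambridge researchers' recommendations.

```python
def password_feedback(pw):
    """Return a list of suggestions; an empty list means no objections."""
    problems = []
    if len(pw) < 8:
        problems.append("use at least 8 characters")
    if pw.isalpha() or pw.isdigit():
        problems.append("mix letters with digits or symbols")
    if pw.lower() == pw:
        problems.append("add an uppercase letter")
    return problems
```

Consistent feedback of this kind across sites is exactly the sort of industry norm whose absence the study identifies.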
Blog: Open Source Could Mean an Open Door for Hackers
Open Source Could Mean an Open Door for Hackers
Technology Review (06/07/10) Lemos, Robert
Flaws in open source software are exploited more quickly and more often than flaws in closed software systems, according to a paper by Boston College (BC) researchers that analyzed two years of attack data. "If you think about this whole thing as a game between the good guys and the bad guys, by reducing the effort for the bad guys, there is much greater incentive for them to exploit targets earlier and hit more firms," says BC professor Sam Ransbotham. The researchers used alert data taken from intrusion-detection systems managed on behalf of 960 companies by SecureWorks. Ransbotham also found a correlation between the existence of signatures, which are used by various security products to match a known pattern with a flaw, and earlier attacks, suggesting that the updates used to improve defenses actually help the attackers. "That tells me that there is something about having that signature that is helping people ... giving them a clue about how to exploit the vulnerability," he says.
Saturday, June 5, 2010
Blog: All Eyes and Ears on March of the Cyborgs
All Eyes and Ears on March of the Cyborgs
Sydney Morning Herald (Australia) (06/05/10) Smith, Deborah
Medical implants such as heart pacemakers, cochlear implants for the deaf, and brain implants for those with Parkinson's disease are just the beginning of the process of cyborgization, the development of high-tech implants and prostheses that will benefit many people, says Australian National University professor Roger Clarke. On the horizon are bionic eyes that enable the blind to see and muscle implants that could allow paraplegics to stand and even walk, says the University of Melbourne's Rob Shepherd. "The field of medical bionics is rapidly expanding," Shepherd says. "The thing we get excited about is that Australia is at the forefront." Researchers also are developing electrically conducting plastics that could stimulate and guide nerve fibers to repair spinal cords. "Cyborgization will give rise to demands for new rights," Clarke notes, such as using devices to enhance, not just restore, function.
Friday, June 4, 2010
Blog: Free, Open Virtual Laboratory for Infectious Diseases
Free, Open Virtual Laboratory for Infectious Diseases
ICT Results (06/04/10)
A European research team has developed a virtual laboratory designed to help doctors match drugs to patients and make treatments more effective. The ViroLab Virtual Laboratory uses machine learning, data mining, grid computing, modeling, and simulation technologies to convert the content of millions of scientific journal articles, databases, and patients' medical histories into knowledge that can be used for treatment. "ViroLab finds new pathways for treatment by integrating different kinds of data, from genetic information and molecular interactions within the body, measured in nanoseconds, up to sociological interactions on the epidemiological level spanning years of disease progression," says University of Amsterdam professor Peter Sloot. The system continuously crawls grid-connected databases of virological, immunological, clinical, genetic, and experimental data and extracts information from scientific journal articles. The ViroLab Virtual Laboratory also could be used to create personalized drug rankings to aid in the treatment of people suffering from diseases.
Wednesday, June 2, 2010
Blog: HP Researcher Predicts Memory-Centric Processors
HP Researcher Predicts Memory-Centric Processors
EE Times (06/02/10) Merritt, Rick
Hewlett-Packard (HP) researchers are studying ways to make memristor processors the centerpiece of future server designs. The researchers found that low-power processors are superior for some data center workloads and determined that various workloads need different kinds of designs. "Re-thinking the balance of compute, storage, and communications will happen, and it will have big implications," says HP researcher Partha Ranganathan. As part of that rebalancing, HP developed the nanostore, a three-dimensional stack of processor cores connected to nonvolatile memory cores such as HP's memristors. "We've run some experiments [on the nanostore concept], and found the new approach is a factor of 10 better [performance] for the same energy or dollars," Ranganathan says. HP Labs has identified three kinds of server designs that are optimized for different kinds of data center workloads and has created metrics to match workloads to the various designs. Nanostores could emerge to speed switching and routing of traffic running in virtual machines within two years, Ranganathan says.
Blog: Toshiba Invention Brings Quantum Computing Closer
Toshiba Invention Brings Quantum Computing Closer
Reuters (06/02/10) Hirschler, Ben
Researchers at Toshiba's research center in Cambridge, England, have designed a device that could open the way to super-fast quantum computing through the development of ultra-powerful semiconductors. Toshiba's Entangled Light Emitting Diode (ELED) is an easy-to-assemble device that can be connected to a battery to generate entangled light on an as-needed basis. Quantum computers based on optical processes require a large number of entangled photons, and producing entangled light has up to now been limited to bulky lasers. The ELED device employs standard semiconductor technology and is fashioned from gallium arsenide, a common material in optoelectronics. Although similar to conventional light-emitting diodes, the ELED contains a quantum dot that transforms electrical current into entangled light. "It's a big step because it means you can now start to integrate lots of devices on a single chip," says lead researcher Andrew Shields. He believes that basic quantum computing circuits that use ELED technology could be ready within five years.
Blog: DNA Logic Gates Herald Injectable Computers
DNA Logic Gates Herald Injectable Computers
New Scientist (06/02/10) McAlpine, Katie
Hebrew University of Jerusalem (HUJ) researchers have developed DNA-based logic gates that could carry out calculations inside the body and may lead to injectable biocomputers programmed to target diseases as they arise. "The biocomputer would sense biomarkers and immediately react by releasing counter-agents for the disease," says HUJ's Itamar Willner. The logic gates are formed from short strands of DNA and their complementary strands; in conjunction with some molecular machinery, they mimic their electronic equivalents. Two strands act as the input--each represents a one when present or a zero when absent. DNA computing allows calculations to be carried out in parallel, as different types of logic gates can be represented by different ingredients. The HUJ team also was able to create logic gates that calculate in sequence. The HUJ system reforms after every step, enabling long sequences of calculations to be carried out. "Being enzyme-free, it has potential in future diagnostic and medical applications," says the Weizmann Institute of Science's Benny Gil.
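At the logical level, the present-or-absent encoding and the sequential chaining described above reduce to ordinary Boolean evaluation. The sketch below models only that logic layer; it says nothing about the HUJ chemistry, and the function names are illustrative.

```python
def and_gate(strand_a_present, strand_b_present):
    # Each input strand encodes 1 when present, 0 when absent; the gate
    # "fires" only when both inputs are present.
    return strand_a_present and strand_b_present

def cascade(inputs):
    # Gates computed in sequence, each output feeding the next stage,
    # mirroring the system's ability to chain calculations because it
    # reforms after every step.
    out = inputs[0]
    for x in inputs[1:]:
        out = and_gate(out, x)
    return out
```

The parallelism the article mentions comes from the chemistry, not this logic: different gate types built from different ingredients can all react in the same tube at once.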