Showing posts with label software. Show all posts

Wednesday, April 18, 2012

Blog: New Julia Language Seeks to Be the C for Scientists

New Julia Language Seeks to Be the C for Scientists
InfoWorld (04/18/12) Paul Krill

Massachusetts Institute of Technology (MIT) researchers have developed Julia, a programming language designed for building technical applications. Julia already has been used for image analysis and linear algebra research. MIT developer Stefan Karpinski notes that Julia is a dynamic language, which he says makes it easier to program because it has a very simple programming model. "One of our goals explicitly is to have sufficiently good performance in Julia that you'd never have to drop down into C," Karpinski adds. Julia also is designed for cloud computing and parallelism, according to the Julia Web page. The programming language provides a simpler model for building large parallel applications via a global distributed address space, Karpinski says. Julia also could be good at handling predictive analysis, modeling problems, and graph analysis problems. "Julia's LLVM-based just-in-time compiler, combined with the language's design, allows it to approach and often match the performance of C and C++," according to the Julia Web page.

Tuesday, April 3, 2012

Blog: Programming Computers to Help Computer Programmers

Programming Computers to Help Computer Programmers
Rice University (04/03/12) Jade Boyd

Computer scientists from Rice University will participate in a project to create intelligent software agents that help people write code faster and with fewer errors. The Rice team will focus on robotic applications and how to verify that synthetic, computer-generated code is safe and effective, as part of the effort to develop automated program-synthesis tools for a variety of uses. "Programming is now done by experts only, and this needs to change if we are to use robots as helpers for humans," says Rice professor Lydia Kavraki. She also stresses that safety is critical. "You can only have robots help humans in a task--any task, whether mundane, dangerous, precise, or expensive--if you can guarantee that the behavior of the robot is going to be the expected one." The U.S. National Science Foundation is providing a $10 million grant to fund the five-year initiative, which is based at the University of Pennsylvania. Computer scientists at Rice and Penn have proposed a grand challenge robotic scenario of providing hospital staff with an automated program-synthesis tool for programming mobile robots to go from room to room, turn off lights, distribute medications, and remove medical waste.

Wednesday, March 28, 2012

Blog: Google Launches Go Programming Language 1.0

Google Launches Go Programming Language 1.0
eWeek (03/28/12) Darryl K. Taft

Google has released version 1.0 of its Go programming language, which was initially introduced as an experimental language in 2009. Google has described Go as an attempt to combine the development speed of working in a dynamic language such as Python with the performance and safety of a compiled language such as C or C++. "We're announcing Go version 1, or Go 1 for short, which defines a language and a set of core libraries to provide a stable foundation for creating reliable products, projects, and publications," says Google's Andrew Gerrand. He notes that Go 1 is the first release of Go available in supported binary distributions, which cover Linux, FreeBSD, Mac OS X, and Windows. Stability for users was the driving motivation for Go 1, and much of the work needed to bring programs up to the Go 1 standard can be automated with the go fix tool. A complete list of changes to the language and the standard library, documented in the Go 1 release notes, will be an essential reference for programmers migrating code from earlier versions of Go. There also is a new release of the Google App Engine SDK.

Tuesday, March 27, 2012

Blog: Google Working on Advanced Web Engineering

Google Working on Advanced Web Engineering
InfoWorld (03/27/12) Joab Jackson

Google is developing several advanced programming technologies to ease complex Web application development. "We're getting to the place where the Web is turning into a runtime integration platform for real components," says Google researcher Alex Russell. He says one major shortcoming of the Web is that technologies do not have a common component model, which slows code testing and reuse. Google wants to introduce low-level control elements without making the Web stack more confusing for novices. Google's efforts include creating a unified component model, adding classes to JavaScript, and creating a new language for Web applications. By developing a unified component model for Web technologies, Google is setting the stage for developers to "create new instances of an element and do things with it," Russell says. Google engineers also are developing a proposal to add classes to the next version of JavaScript. "We're getting to the place where we're adding shared language for things we're already doing in the platform itself," Russell says. Google also is developing a new language called Dart, which aims to provide an easy way to create small Web applications while providing the support for large, complex applications as well, says Google's Dan Rubel.

Wednesday, December 28, 2011

Blog: Five Open Source Technologies for 2012

Five Open Source Technologies for 2012
IDG News Service (12/28/11) Joab Jackson

Five open source projects could become the basis for new businesses and industries in 2012. Nginx, a Web server program, could become popular due to its ability to easily handle high-volume traffic. Nginx already is used on highly trafficked Web sites, and the next release, due in 2012, will be better suited to shared hosting environments. The OpenStack cloud computing platform has gained support from several technology firms due to its scalability. "We're not talking about [using OpenStack to run a] cloud of 100 servers or even 1,000 servers, but tens of thousands of servers," says OpenStack Project Policy Board's Jonathan Bryce. The Stig data store was designed for the unique workloads of social networking sites, according to its developers. Its architecture allows for inferential searching, enabling users and applications to look for connections between disparate pieces of information. Linux Mint was designed specifically for users who want a desktop operating system and do not want to learn more about how Linux works. Linux Mint is now the fourth most popular desktop operating system in the world. GlusterFS is one of the fastest-growing storage software systems on the market, with downloads up 300 percent in the last year.

Friday, November 11, 2011

Blog: HTML5: A Look Behind the Technology Changing the Web

HTML5: A Look Behind the Technology Changing the Web
Wall Street Journal (11/11/11) Don Clark

HTML5 is catching on as the online community embraces it. The Web standard allows data to be stored on a user's computer or mobile device so that Web apps can function without an Internet link. HTML5 also enables Web pages to boast jazzier images and effects, while objects can move on Web pages and respond to cursor movements. Audio plays without a plug-in under HTML5, and interactive three-dimensional effects can be created using a computer's graphics processor via WebGL technology. In addition, video can be embedded in a Web page without a plug-in, and interactive games can run in a Web browser alone, without installing other software or plug-ins. Silicon Valley investor Roger McNamee projects that HTML5 will enable artists, media firms, and advertisers to differentiate their Web offerings in ways that were previously impractical. Binvisions.com reports that about one-third of the 100 most popular Web sites used HTML5 in the quarter that ended in September. Google, Microsoft, the Mozilla Foundation, and Opera Software are adding momentum to HTML5 by building support for the standard into their latest Web browsers.

Wednesday, November 9, 2011

Blog: Technology-induced medical errors: the wave of the future?

Technology-induced medical errors: the wave of the future?
By Denise Amrich, RN | November 9, 2011, 4:45am PST

Summary: Tuesday’s federal report addresses the strong need for safety in health IT without irresponsibly discouraging progress.

Electronic healthcare management is a really fascinating, promising topic, and most of the time, you hear people focusing on the improvements in patient care, as well as cost and time savings, partly because it helps make a case to get healthcare organizations on board with change.

The dark side of the topic is, of course, the less-often discussed and more threatening aspect of safety and security. Sometimes these fears are inflated for shock and horror or PR value. Sometimes they are glossed over. Rarely are they given credence or discussed in a detailed, productive manner. Scant attention has been paid to what harm may come from the widespread IT-ing of healthcare.

Wednesday, November 2, 2011

Blog: Major Breakthrough Improves Software Reliability and Security

Major Breakthrough Improves Software Reliability and Security
Columbia University (11/02/11)

Columbia University researchers have developed Peregrine, software designed to improve the reliability and security of multithreaded computer programs. "Our main finding in developing Peregrine is that we can make threads deterministic in an efficient and stable way: Peregrine can compute a plan for allowing when and where a thread can 'change lanes' and can then place barriers between the lanes, allowing threads to change lanes only at fixed locations, following a fixed order," says Columbia professor Junfeng Yang. "Once Peregrine computes a good plan without collisions for one group of threads, it can reuse the plan on subsequent groups to avoid the cost of computing a new plan for each new group." The researchers say the program gets at the root cause of software problems, enabling Peregrine to address all of the issues that are caused by nondeterminism. They note that Peregrine can handle data races, a common class of concurrency bugs, is very fast, and works with current hardware and programming languages.
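
The "lanes and barriers" idea can be illustrated with a toy deterministic scheduler. This is a sketch of the general technique, not Peregrine's actual implementation: threads may touch shared state only at fixed points, in a precomputed order, so every run interleaves identically.

```python
# Sketch of deterministic multithreading: threads "change lanes" (access
# shared state) only at fixed points, following a fixed precomputed order.
import threading

class DeterministicTurns:
    """Grants access to shared state in a fixed, precomputed order."""
    def __init__(self, order):
        self.order = order          # fixed schedule of thread ids, e.g. [0, 1, 0, 1]
        self.pos = 0                # next slot in the schedule
        self.cond = threading.Condition()

    def barrier(self, tid):
        # Block until the schedule says it is this thread's turn.
        with self.cond:
            while self.pos < len(self.order) and self.order[self.pos] != tid:
                self.cond.wait()

    def done(self):
        # Advance the schedule and wake waiting threads.
        with self.cond:
            self.pos += 1
            self.cond.notify_all()

log = []
turns = DeterministicTurns(order=[0, 1, 0, 1])

def worker(tid):
    for _ in range(2):
        turns.barrier(tid)   # fixed location where a "lane change" is allowed
        log.append(tid)      # the shared-state access
        turns.done()

threads = [threading.Thread(target=worker, args=(t,)) for t in (0, 1)]
for t in threads: t.start()
for t in threads: t.join()
print(log)   # always [0, 1, 0, 1], regardless of OS scheduling
```

Because every access is gated by the schedule, the interleaving — and thus the program's behavior — is the same on every run, which is the property that makes reusing a known-good plan worthwhile.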

Wednesday, October 12, 2011

Blog: Cops on the Trail of Crimes That Haven't Happened

Cops on the Trail of Crimes That Haven't Happened
New Scientist (10/12/11) Melissae Fellet

The Santa Cruz, Calif., police department recently started field-testing Santa Clara University-developed software that analyzes where crime is likely to be committed. The software uses the locations of past incidents to highlight likely future crime scenes, enabling police to target and patrol those areas with the hope that their presence might stop the crimes from happening in the first place. The program, developed by Santa Clara researcher George Mohler, predicted the location and time of 25 percent of burglaries that occurred on any particular day in an area of Los Angeles in 2004 and 2005, using just the data on burglaries that had occurred before that day. The Santa Cruz police department is using the software to monitor 10 areas for residential burglaries, auto burglaries, and auto theft. If the program proves to be effective in thwarting crime in areas that are known for their high crime rates, it can be applied to other cities, says University of California, Los Angeles researcher Jeffrey Brantingham, who collaborated on the algorithm's development.
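
The underlying intuition — past incident locations raise the odds of nearby future incidents — can be sketched with a toy grid-based hotspot scorer. Mohler's actual model is far more sophisticated, and all coordinates below are invented for illustration.

```python
# Toy hotspot prediction: bucket past incidents into grid cells and
# patrol the cells with the highest historical counts.
from collections import Counter

def hotspot_cells(incidents, cell_size=1.0, top_k=2):
    """Score grid cells by past incident counts; return the top_k hottest."""
    counts = Counter(
        (int(x // cell_size), int(y // cell_size)) for x, y in incidents
    )
    return [cell for cell, _ in counts.most_common(top_k)]

past = [(0.2, 0.3), (0.7, 0.1), (0.4, 0.9),   # cluster in cell (0, 0)
        (5.1, 5.2), (5.8, 5.5),               # cluster in cell (5, 5)
        (9.0, 1.0)]                           # isolated incident
print(hotspot_cells(past))   # → [(0, 0), (5, 5)]
```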

Tuesday, October 11, 2011

Blog: "Ghostwriting" the Torah?

"Ghostwriting" the Torah?
American Friends of Tel Aviv University (10/11/11)

Tel Aviv University (TAU) researchers have developed a computer algorithm that could help identify the different sources that contributed to the individual books of the Bible. The algorithm, developed by TAU professor Nachum Dershowitz, recognizes linguistic cues, such as word preference, to divide texts into probable author groupings. The researchers focused on writing style instead of subject or genre to avoid some of the problems that have vexed Bible scholars in the past, such as a lack of objectivity and complications caused by the multiple genres and literary forms found in the Bible. The software searches for and compares details that human scholars might have difficulty detecting, such as the frequency of the use of function words and synonyms, according to Dershowitz. The researchers tested the software by randomly mixing passages from the Hebrew books of Jeremiah and Ezekiel, and instructing the computer to separate them. The program was able to separate the passages with 99 percent accuracy, in addition to separating "priestly" materials from "non-priestly" materials. "If the computer can find features that Bible scholars haven't noticed before, it adds new dimensions to their scholarship," Dershowitz says.

Monday, October 10, 2011

Blog: Google Launches Dart as a JavaScript Killer

Google Launches Dart as a JavaScript Killer
IDG News Service (10/10/11) Joab Jackson

Google announced the launch of a preview version of Dart, an object-oriented Web programming language that has capabilities resembling those of JavaScript but also addresses some of its scalability and organizational shortcomings. Google software engineer Lars Bak describes Dart as "a structured yet flexible language for Web programming." Dart is designed both for quickly cobbling together small projects and for developing larger-scale Web applications. Programmers can declare variables with or without explicit data types. A compiler and a virtual machine, along with a set of basic libraries, are part of the preview version. Initially, programmers will have to compile their Dart creations to JavaScript, using a tool included in the Dart package, to get them to run on browsers. However, Google would like future Web browsers to include a native Dart virtual machine for running Dart programs.

Friday, September 23, 2011

Blog: New Mathematical Model to Enable Web Searches for Meaning

New Mathematical Model to Enable Web Searches for Meaning
University of Hertfordshire (09/23/11) Paige Upchurch

University of Hertfordshire computer scientist Daoud Clarke has developed a mathematical model based on a theory of meaning that could revolutionize artificial intelligence technologies and enable Web searches to interpret the meaning of queries. The model is based on the idea that the meaning of words and phrases is determined by the context in which they occur. "This is an old idea, with its origin in the philosophy of Wittgenstein, and was later taken up by linguists, but this is the first time that someone has used it to construct a comprehensive theory of meaning," Clarke says. The model provides a way to represent words and phrases as sequences of numbers, known as vectors. "Our theory tells you what the vector for a phrase should look like in terms of the vectors for the individual words that make up the phrase," Clarke says. "Representing meanings of words using vectors allows fuzzy relationships between words to be expressed as the distance or angle between the vectors." He says the model could be applied to new types of artificial intelligence, such as determining the exact nature of a particular Web query.
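
A minimal sketch of the vector idea, under the stated assumption that context determines meaning: build each word's vector from co-occurrence counts over a tiny invented corpus, then measure relatedness as the angle (cosine) between vectors. This illustrates the general distributional approach, not Clarke's specific theory.

```python
# Distributional word vectors: a word is represented by counts of the
# words that appear near it; similar contexts give similar vectors.
import math
from collections import defaultdict

def context_vectors(sentences, window=1):
    """Map each word to counts of words appearing within `window` of it."""
    vecs = defaultdict(lambda: defaultdict(int))
    for sent in sentences:
        words = sent.lower().split()
        for i, w in enumerate(words):
            for j in range(max(0, i - window), min(len(words), i + window + 1)):
                if i != j:
                    vecs[w][words[j]] += 1
    return vecs

def cosine(u, v):
    """Cosine of the angle between two sparse count vectors."""
    dot = sum(c * v.get(k, 0) for k, c in u.items())
    nu = math.sqrt(sum(c * c for c in u.values()))
    nv = math.sqrt(sum(c * c for c in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

corpus = [
    "the cat chased the mouse",
    "the dog chased the mouse",
    "the cat ate fish",
    "the dog ate meat",
    "the stone lay on the road",
]
vecs = context_vectors(corpus)
# "cat" and "dog" share contexts (chased, ate), so their vectors are close;
# "cat" and "stone" share almost none, so the angle between them is wide.
```

Here the fuzzy relationship "cat is more like dog than like stone" falls out of the geometry rather than from any hand-written rule.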

Thursday, September 15, 2011

Blog: Intel Code Lights Road to Many-Core Future

Intel Code Lights Road to Many-Core Future
EE Times (09/15/11) Rick Merritt

Intel's release of open source code for a data-parallel version of JavaScript seeks to help mainstream programmers who use scripting languages tap the power of multicore processors. Intel's Justin Rattner says in an interview that there will be multiple programming models, and Parallel JS encompasses one such model. The language enhances performance for data-intensive, browser-based apps running on Intel chips, such as photo and video editing and three-dimensional gaming. Rattner describes Parallel JS as "a pretty important step that gets us beyond the prevailing view that once you are beyond a few cores, multicore chips are only for technical apps." A later iteration of Parallel JS also will exploit the graphics cores incorporated into Intel's latest processors. In addition, Intel is working on ways to enhance modern data-parallel tools that run general-purpose programs on graphics processors, and those tools could be released next year, Rattner says. He notes that beyond that, data-parallel methods will require a more fundamental change, becoming more asynchronous to boost power efficiency.

Monday, September 12, 2011

Blog: In Plane View; using cluster analysis to discover what's normal

In Plane View
MIT News (09/12/11) Jennifer Chu

Massachusetts Institute of Technology professor John Hansman and colleagues have developed a flight-data health detection tool that identifies flight glitches without knowing ahead of time what to look for. The method uses cluster analysis, a type of data mining that filters data into subsets to find common patterns. Flight data outside the clusters is labeled as abnormal, enabling analysts to further inspect those reports to determine the nature of the anomaly. The researchers developed a data set from 365 flights that took place over one month. "The beauty of this is, you don't have to know ahead of time what 'normal' is, because the method finds what's normal by looking at the cluster," Hansman says. The researchers mapped each flight at takeoff and landing and found several flights that fell outside the normal range, mostly due to crew mistakes rather than mechanical flaws, according to Hansman. "To make sure that systems are safe in the future, and the airspace is safe, we have to uncover precursors of aviation safety accidents [and] these [cluster-based] analyses allow us to do that," says the U.S. National Aeronautics and Space Administration's Ashok Srivastava.
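
The mechanism — let clusters define "normal," then flag points far from every cluster — can be sketched as follows. The feature values and threshold are invented, and the real tool works on full takeoff and landing profiles rather than two numbers per flight.

```python
# Toy cluster-based anomaly detection: a flight is abnormal if its
# feature vector is far from every cluster of historically normal flights.
import math

def nearest_center_distance(point, centers):
    """Distance from a point to its closest cluster center."""
    return min(math.dist(point, c) for c in centers)

def flag_anomalies(flights, centers, threshold):
    """Return indices of flights farther than `threshold` from every cluster."""
    return [i for i, f in enumerate(flights)
            if nearest_center_distance(f, centers) > threshold]

# Hypothetical features: (approach speed / 100 kt, descent rate / 1000 fpm).
centers = [(1.3, 0.7), (1.5, 0.9)]          # "normal" clusters from history
flights = [(1.31, 0.72), (1.49, 0.88),      # ordinary flights
           (2.10, 1.60)]                    # abnormal: fast, steep approach
print(flag_anomalies(flights, centers, threshold=0.3))   # → [2]
```

No one had to define "abnormal" in advance: the third flight is flagged simply because it lies far from where flights usually fall.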

Friday, September 9, 2011

Blog: Google to Unveil 'Dart' Programming Language

Google to Unveil 'Dart' Programming Language
eWeek (09/09/11) Darryl K. Taft

Google plans to introduce a new programming language called Dart at the upcoming Goto conference. Dart is described as a structured Web programming language, and Google engineers Lars Bak and Gilad Bracha are scheduled to present it at Goto, which takes place Oct. 10-12 in Aarhus, Denmark. Bracha is the creator of the Newspeak programming language, co-author of the Java Language Specification, and a researcher in the area of object-oriented programming languages. Bak has designed and implemented object-oriented virtual machines, and has worked on Beta, Self, Strongtalk, Sun's HotSpot, OOVM Smalltalk, and Google's V8 engine for the Chrome browser. In 2009, Google introduced the experimental language Go in an attempt to combine the development speed of working in a dynamic language, such as Python, with the performance and safety of a compiled language such as C or C++.

Tuesday, September 6, 2011

Blog: NSA Extends Label-Based Security to Big Data Stores (key/value data stores - NoSQL)

NSA Extends Label-Based Security to Big Data Stores
IDG News Service (09/06/11) Joab Jackson

The U.S. National Security Agency (NSA) recently submitted Accumulo, new label-based data store software, to the Apache Software Foundation, hoping that more parties will continue to develop the technology for use in future secure systems. "We have made much progress in developing this project over the past [three] years and believe both the project and the interested communities would benefit from this work being openly available and having open development," say the NSA developers. Accumulo, which is based on Google's BigTable design, is a key/value data store, in which providing the system with the key will return the data associated with that key. Accumulo also can be run on multiple servers, making it a good candidate for big data systems. The system's defining feature is the ability to tag each data cell with a label, and a section called column visibility that can store the labels. "The access labels in Accumulo do not in themselves provide a complete security solution, but are a mechanism for labeling each piece of data with the authorizations that are necessary to see it," the NSA says. The new label-based storage system could be the basis of other secure data store-based systems, which could be used by healthcare organizations, government agencies, and other parties with strict security and privacy requirements.
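
Cell-level visibility of this kind can be sketched with a toy key/value store in which each cell carries a label expression that the reader's authorizations must satisfy. Real Accumulo visibility expressions also support OR and grouping; this sketch handles only AND, and the keys and labels are invented.

```python
# Toy label-based data store: each cell stores (value, visibility), and a
# read succeeds only if the reader holds every label in the expression.
class LabeledStore:
    def __init__(self):
        self.cells = {}   # key -> (value, visibility expression)

    def put(self, key, value, visibility):
        self.cells[key] = (value, visibility)

    def get(self, key, authorizations):
        value, vis = self.cells[key]
        required = set(vis.split("&"))
        # Reader must hold every label named in the expression.
        return value if required <= set(authorizations) else None

store = LabeledStore()
store.put("patient:123:diagnosis", "record-7", "medical&staff")
store.put("patient:123:room", "214B", "staff")

print(store.get("patient:123:room", {"staff"}))                  # → 214B
print(store.get("patient:123:diagnosis", {"staff"}))             # → None
print(store.get("patient:123:diagnosis", {"staff", "medical"}))  # → record-7
```

As the NSA's caveat notes, labels alone are not a full security solution: the store still depends on something trustworthy assigning authorizations to readers in the first place.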

Friday, August 26, 2011

Blog: Hanging Can Be Life Threatening

Hanging Can Be Life Threatening
AlphaGalileo (08/26/11)

Although testing and static code analysis are used to detect and remove bugs in a system during development, problems can still occur once a software system is in place and being used in a real-world application. Such problems can cause one critical component of the system to hang without crashing the whole system and without being immediately obvious to operators and users until it is too late. Researchers at the Universita degli Studi di Napoli Federico II and at the Naples company SESM SCARL have developed a software tool that offers non-obtrusive monitoring of systems, based on multiple sources of data gathered at the operating system level. "Our experimental results show that this framework increases the overall capacity of detecting hang failures, it exhibits a 100 percent coverage of observed failures, while keeping low the number of false positives, less than 6 percent in the worst case," according to the researchers. They also say the response time, or latency, between a hang occurring and its detection is about 0.1 seconds on average, while the impact on computer performance of running the hang-detection software is negligible.
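
One simple form of such non-obtrusive monitoring is a heartbeat watchdog: components periodically report in, and the monitor flags any component that has been silent longer than a threshold. The sketch below is a generic illustration, not the researchers' framework; the component names and times are invented.

```python
# Toy hang detector: flag components whose last heartbeat is too old,
# without instrumenting or modifying the components themselves.
class HangMonitor:
    def __init__(self, timeout):
        self.timeout = timeout
        self.last_seen = {}

    def heartbeat(self, component, now):
        """Record that a component was observed alive at time `now`."""
        self.last_seen[component] = now

    def hung(self, now):
        """Components whose last heartbeat is older than `timeout`."""
        return sorted(c for c, t in self.last_seen.items()
                      if now - t > self.timeout)

mon = HangMonitor(timeout=0.1)            # seconds; echoes the ~0.1 s latency
mon.heartbeat("db-writer", now=0.00)
mon.heartbeat("net-io", now=0.00)
mon.heartbeat("net-io", now=0.25)         # net-io stays alive; db-writer goes quiet
print(mon.hung(now=0.30))                 # → ['db-writer']
```

A production tool would draw its liveness signals from OS-level sources (scheduler activity, I/O counters) rather than explicit heartbeats, which is what keeps the monitoring non-obtrusive.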

Friday, August 12, 2011

Blog: Robot 'Mission Impossible' Wins Video Prize

Robot 'Mission Impossible' Wins Video Prize
New Scientist (08/12/11) Melissae Fellet

Free University of Brussels researchers have developed Swarmanoid, a team of flying, rolling, and climbing robots that can work together to find and grab a book from a high shelf. The robot team includes flying eye-bots, rolling foot-bots, and hand-bots that can fire a grappling hook-like device up to the ceiling and climb the bookshelf. Footage of the team in action recently won the video competition at the Conference on Artificial Intelligence. The robotic team currently consists of 30 foot-bots, 10 eye-bots, and eight hand-bots. The eye-robots explore the rooms, searching for the target. After an eye-bot sees the target, it signals the foot-bots, which roll to the site, carrying the hand-bots. The hand-bots then launch the grappling hooks to the ceiling and climb the bookshelves. All of the bots have light-emitting diodes that flash different colors, enabling them to communicate with each other. Constant communication enables Swarmanoid to adjust its actions on the fly, compensating for broken bots by reassigning tasks throughout the team.

Monday, July 25, 2011

Blog: Sandia's CANARY Software Protects Water Utilities From Terrorist Attacks and Contaminants, Boosts Quality

Sandia's CANARY Software Protects Water Utilities From Terrorist Attacks and Contaminants, Boosts Quality
Sandia National Laboratories (07/25/11) Heather Clark

Researchers at Sandia National Laboratories and the U.S. Environmental Protection Agency have developed the CANARY Event Detection Software, an open source program that monitors public water systems to protect them from terrorist attacks or natural contaminants. The CANARY software tells utility operators whether something is wrong with their water system within minutes. CANARY can be customized for individual utility systems with their own sensors and software, according to Sandia's Sean McKenna. The researchers used algorithms to analyze data coming from multiple sensors and differentiate between natural variability and unusual patterns that indicate a problem. When new data is received, CANARY determines whether it is close enough to a known cluster to be considered normal or whether it is far enough away to be deemed anomalous. An unintended benefit of the software is that when utility operators better understood the data being sent by their sensors, they could make changes to the management of the water systems to improve their overall quality.


Thursday, July 21, 2011

Blog: Prof Says Tech Entering the Age of the Algorithm

Prof Says Tech Entering the Age of the Algorithm
University of Texas at Dallas (TX) (07/21/11) David Moore

University of Texas at Dallas (UTD) professor Andras Farago thinks that as algorithms become more important to software development, educational and career opportunities will follow. Farago says the rise in the importance of algorithms mirrors the life cycle of software, which originally was viewed as a secondary feature to hardware. "In a sense, algorithms up until very recently have had the same relationship to software implementation as software previously had to hardware: Icing on the cake," he says. However, Farago says there recently have been more cases, such as the Heritage Provider Network's $3 million prize, in which the hardest part is finding the perfect algorithm. "Once it is found, the implementation can be done by any skilled team, and I believe this may show the emergence of a trend in which the industry starts recognizing the real, hard value of sophisticated algorithms," he says. As part of the Heritage contest, participants are trying to design the algorithm that best predicts which people are more likely to require hospitalization in the future.

