Thursday, December 30, 2010
Blog: Movie Magic Conjured by Science
Movie Magic Conjured by Science
Discovery News (12/30/10) Eric Niiler
Filmmakers increasingly use technology based on fluid dynamics to create realistic scenes of violent oceans and falling buildings. "It used to be that the story was limited by the technology," says Digital Domain's Doug Roble. "Now we're getting to the point where there are no limits." Roble and others are leading the way with software built on algorithms that describe the physics of nature; the same mathematical formulas scientists use to model the natural world can drive movie special effects. "In order to simulate it accurately, you have to take extremely small time steps to move the simulation forward," Roble says. "[Digital filmmaking] has a lot in common with foundational work with applied mathematics and computational physics," says Exotic Matter's Robert Bridson. He notes that the continued convergence of computer science and filmmaking is possible due to new, inexpensive computing systems that can run the software. "We will start seeing more low-budget independent types of shops producing extraordinary effects," Bridson predicts. Roble says simulating human expressions is the next frontier. "The human face is extraordinarily tough," he says. "Right now, the research community is focused on muscles and skin."
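Roble's point about tiny time steps is easy to see in miniature: explicit integrators stay accurate only when the step is small relative to the stiffness of what they simulate. A toy sketch in Python, with a damped spring standing in for a full fluid solver (all constants are invented for illustration):

    # Illustrative only: stepping a stiff spring forward in time.
    def simulate(dt, steps, k=100.0, damping=2.0):
        x, v = 1.0, 0.0                   # position, velocity
        for _ in range(steps):
            a = -k * x - damping * v      # spring force plus damping
            v += a * dt
            x += v * dt
        return x

    print(simulate(dt=0.001, steps=10000))  # small steps: decays toward rest
    print(simulate(dt=0.3, steps=50))       # large steps: the simulation explodes

The same trade-off, scaled up to millions of fluid cells per frame, is why these shots demand so much computation.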
Monday, December 27, 2010
Blog: Algorithms Take Control of Wall Street [an example of Holland's Complex Adaptive Systems (CAS)]
Algorithms Take Control of Wall Street
Wired (12/27/10) Felix Salmon; Jon Stokes
The bulk of trading on Wall Street and an increasing volume of global trading is governed by algorithms, and this has engendered a market that is more efficient, faster, and more intelligent than any human. However, it also is unpredictable, highly volatile, and often defies human understanding. Some of the algorithms are designed to discover, purchase, and sell individual stocks, while others were developed to help brokers carry out large trades. Consequently, the trading arena is glutted with competing lines of code, each one trying to outsmart and counter the other. Interaction between the algorithms can give rise to unexpected behaviors, and this can lead to sudden market drops such as the May 6 flash crash. In the wake of this event, the U.S. Securities and Exchange Commission imposed or is considering imposing safeguards such as governors for trading algorithms that would restrict the size and speed at which trades can be executed. But such measures can only slow down or halt the algorithms for a short while. "Our financial markets have become a largely automated adaptive dynamical system, with feedback," says University of Pennsylvania professor Michael Kearns.
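A caricature of the feedback loop Kearns describes, in Python (all parameters invented; this is a cartoon, not a market model): rules that trade on recent price changes feed those changes back into the price, and past a certain gain the system runs away.

    import random

    def run(feedback, ticks=200, seed=6):
        rng = random.Random(seed)
        price, history = 100.0, []
        for _ in range(ticks):
            shock = rng.gauss(0, 0.2)                 # outside news
            trend = price - history[-5] if len(history) >= 5 else 0.0
            price += shock + feedback * trend         # algorithms chase the trend
            history.append(price)
        return price

    print(run(feedback=0.0))   # no feedback: the price meanders near 100
    print(run(feedback=0.3))   # trend-chasing feedback: the price can run away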
Saturday, December 25, 2010
Blog: 7 Programming Languages on the Rise
7 Programming Languages on the Rise
InfoWorld (12/25/10) Peter Wayner
Python, Ruby on Rails, MATLAB, JavaScript, R, Erlang, Cobol, and CUDA are among the niche programming languages that are becoming increasingly popular in enterprises. Python has a structure that makes it easy to scale in the cloud, and it is popular with scientists because it is flexible and allows users to improvise and quickly see results. Ruby on Rails is popular for prototyping and cataloging data that can be stored in tables. MATLAB was built to help mathematicians solve complex linear equations, but has found a market in the enterprise due to the large amounts of data that modern organizations must analyze. There are several open source alternatives to MATLAB, including Octave, Scilab, Sage, and PySci. Meanwhile, several new applications, including CouchDB and Node.js, have boosted the popularity of JavaScript. The R programming language carries out multiple functions for numerical and statistical analysis of large data sets, and is used to examine the feasibility of business or engineering projects. Erlang combines traditional aspects of functional programming with a modern virtual machine that organizes machine code. Cobol appeals to programmers who prefer syntax that reads more like a natural language. CUDA extensions are being used by enterprise programmers for machine vision, huge simulations, and complicated statistical computations.
Thursday, December 23, 2010
Blog: Meet the Data-Storing Bacteria [each cell can hold about 5 GB]
Meet the Data-Storing Bacteria
PC World (12/23/10) Elizabeth Fish
University of Hong Kong researchers have inserted 90 gigabytes (GB) of data into the DNA of a colony of 18 E. coli bacteria in a test of the organism's capacity for storing electronic data. Bacteria possess enormous storage capacities, considering a gram contains about 10 million cells and each cell can hold about 5 GB. Moreover, some types of cells are more radioresistant than others, which suggests that data in certain cells could survive a nuclear explosion. However, accessing that data is problematic. The researchers say that retrieving data from DNA cells currently is "tedious and expensive," and they note that stored data could be jeopardized because DNA can mutate. So far the team has tested the technique only with genetically modified bacteria, storing copyright information as sample data.
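The arithmetic behind such claims rests on encodings like the classic two-bits-per-base mapping used in many DNA-storage demonstrations (not necessarily the Hong Kong team's exact scheme), where each byte becomes four bases:

    # Illustrative 2-bits-per-base DNA encoding.
    BASE = {0b00: "A", 0b01: "C", 0b10: "G", 0b11: "T"}
    BITS = {b: n for n, b in BASE.items()}

    def encode(data: bytes) -> str:
        return "".join(BASE[(byte >> shift) & 0b11]
                       for byte in data for shift in (6, 4, 2, 0))

    def decode(strand: str) -> bytes:
        out = bytearray()
        for i in range(0, len(strand), 4):
            byte = 0
            for b in strand[i:i + 4]:
                byte = (byte << 2) | BITS[b]
            out.append(byte)
        return bytes(out)

    strand = encode(b"(c) 2010")          # e.g., a copyright notice
    assert decode(strand) == b"(c) 2010"
    print(strand)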
Blog: Software [using classical computing] Said to Match Quantum Computing Speed
Software Said to Match Quantum Computing Speed
IDG News Service (12/23/10) Joab Jackson
University of Waterloo researchers have shown that for some computing problems, using the right software algorithms could enable classical computing techniques to work just as well as quantum computing. The researchers demonstrated how a seldom-used algorithm could achieve new levels of problem-solving performance when used on contemporary computers and theoretically match quantum computing speeds. "One striking implication of this characterization is that it implies quantum computing provides no increase in computational power whatsoever over classical computing in the context of interactive proof systems," according to the paper. The researchers used the matrix multiplicative weights update method to devise a new algorithm, proving that quantum interactive proof systems are no more powerful than classical interactive proof systems, notes Massachusetts Institute of Technology professor Scott Aaronson. The algorithm provides a method for solving problems using parallel processes, matching the efficiency of quantum computing. The researchers illustrated that "for a certain class of semi-definite programs you can get not the exact answer but a very good approximate answer, using a very small amount of memory," Aaronson says.
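The matrix multiplicative weights method at the heart of the proof generalizes a simple scalar rule. A minimal sketch of that scalar "experts" version in Python (the actual proof works with density matrices and Hermitian loss matrices, which this toy omits):

    # Multiplicative weights over n "experts": repeatedly shrink the weight
    # of options that incur loss, then renormalize to a distribution.
    def mwu(losses, eta=0.1):
        w = [1.0] * len(losses[0])
        for round_losses in losses:
            w = [wi * (1 - eta * li) for wi, li in zip(w, round_losses)]
            total = sum(w)
            w = [wi / total for wi in w]
        return w

    # Expert 2 suffers the least loss, so its weight comes to dominate.
    print(mwu([[0.9, 0.5, 0.1]] * 50))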
Monday, December 20, 2010
Blog: DARPA Goal for Cybersecurity: Change the Game
DARPA Goal for Cybersecurity: Change the Game
DVIDS (12/20/10) Cheryl Pellerin
The U.S. Defense Advanced Research Projects Agency (DARPA) has developed programs that deal with cybersecurity threats by surprising the attackers. The agency created the Clean-slate Design of Resilient, Adaptive, Secure Hosts (CRASH) and Programming Computation on Encrypted Data (PROCEED) programs to enhance the agency's cybersecurity research, says DARPA's Kaigham Gabriel. CRASH aims to develop new computer systems that resist cyberattacks the same way organisms fight bacteria and viruses. Gabriel says the researchers are developing computer hardware that gives systems a kind of genetic diversity, making them more resistant to cyberinfections by learning from attacks and repairing themselves. He notes that over the last two decades, the number of lines of code in security software has increased from approximately 10,000 to about 10 million, while the number of lines of code in malware has remained constant at about 125. This analysis and others "led us to understand that many of the things we're doing are useful, but they're not convergent with the problem," Gabriel says. The PROCEED program is working to improve the efficiency of working on encrypted data that has not been decrypted. "If we were able to do relevant sorts of operations without ever having to decrypt, that would be a tremendous gain because ... whenever you decrypt into the open, you create vulnerability," he says.
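PROCEED's goal is commonly framed in terms of homomorphic encryption. As a toy illustration of operating on data without decrypting it (not PROCEED's actual technology), textbook RSA is multiplicatively homomorphic, so ciphertexts can be multiplied directly:

    # Tiny textbook RSA key: p=61, q=53, n=3233, e=17, d=2753.
    n, e, d = 3233, 17, 2753

    enc = lambda m: pow(m, e, n)
    dec = lambda c: pow(c, d, n)

    a, b = 12, 7
    product_cipher = (enc(a) * enc(b)) % n   # computed entirely on ciphertexts
    assert dec(product_cipher) == a * b      # decrypts to 84

Real systems need far more than this (padding, plus additive as well as multiplicative operations), which is the efficiency gap PROCEED targets.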
Sunday, December 19, 2010
Blog: Computers Help Social Animals to See Beyond Their Tribes
Computers Help Social Animals to See Beyond Their Tribes
New York Times (12/19/10) Noam Cohen
IBM's Center for Social Software is employing increasingly sophisticated computers to function as information advisers for users of social media. "I do think of computers as augmenting people, not replacing them," says center director Irene Greif. "We need help with the limits of the brain." The lab's scientists produce programs that spot patterns in the information flood, making it easier to choose which data or people are worth a user's full attention. For example, the researchers created the Many Bills Web site, which summarizes and displays congressional bills as they go through the legislative process via textual analysis, highlighting certain material of interest. Another tool designed for IBM employees, SaNDVis, can help search for expertise by displaying a web of relationships surrounding a search term to show who within IBM is an expert on a certain subject, mapping these links using writings, meetings attended, personal profile information, and previous work experience. IBM also performs data mining on its own workforce, with access to the full spectrum of internal social networking tools connected to an employee ID number. For business purposes, IBM is trying to escape the standard mode of social network use for navigating the data flood, which is interaction with like-minded friends that reinforces bias.
Friday, December 17, 2010
Blog: In 500 Billion Words, New Window on Culture [Steven Pinker involved]
In 500 Billion Words, New Window on Culture
New York Times (12/17/10) Patricia Cohen
Researchers from Google and Harvard University have developed an online database of 500 billion words taken from 5.2 million digitized books published between 1500 and 2008 in English, French, Spanish, German, Chinese, and Russian. The database offers a year-by-year count of how often certain words and phrases appear, data representations, and searching tools. Users can submit a string of up to five words and see a graph that displays the phrase's use over time. "The goal is to give an eight-year-old the ability to browse cultural trends throughout history, as recorded in books," says Harvard's Erez Lieberman Aiden. The database provides research opportunities to liberal arts professors, who have historically avoided quantitative analysis, in a new field dubbed culturomics. "We wanted to show what becomes possible when you apply very high-turbo data analysis to questions in the humanities," says Lieberman Aiden. The data set is downloadable and users can develop their own search tools. The researchers estimate that the English language has grown by 70 percent in the last 50 years and the new system could be used to update dictionaries by highlighting newly popular and underused words. The database and others like it will soon become universal in humanities research, says Harvard's Steven Pinker.
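The core "culturomics" query is simple to sketch: count a phrase's occurrences per publication year. Assuming a corpus of (year, text) records (the toy data below is invented; the real dataset is the downloadable 500-billion-word collection):

    from collections import Counter

    corpus = [
        (1999, "the slide rule was obsolete"),
        (2005, "the internet changed everything"),
        (2005, "searching the internet"),
        (2008, "the internet of things"),
    ]

    def counts_by_year(phrase):
        c = Counter()
        for year, text in corpus:
            c[year] += text.count(phrase)
        return sorted(c.items())

    print(counts_by_year("internet"))   # [(1999, 0), (2005, 2), (2008, 1)]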
Tuesday, December 14, 2010
Blog: JASON: Science of Cyber Security Needs More Work
JASON: Science of Cyber Security Needs More Work
Secrecy News (12/14/10) Steven Aftergood
The JASON independent scientific advisory panel has produced a report on cybersecurity for the U.S. Department of Defense (DoD) that says a fundamental understanding of the science of cybersecurity is needed to improve the country's security approaches. The panel says the science of cybersecurity "seems underdeveloped in reporting experimental results, and consequently in the ability to use them." The report notes that the science of cybersecurity is unique in that the background for events is almost completely created by humans and is digital, and there are good actors as well as adversaries who are purposeful and intelligent. The JASON report also addresses the importance of definitions, the need for a standard vocabulary to discuss the subject, and the need to devise experimental protocols for developing a reproducible experimental science of cybersecurity. "At the most abstract level, studying the immune system suggests that cybersecurity solutions will need to be adaptive, incorporating learning algorithms and flexible memory mechanisms," the report says. It also says the DoD should support a network of cybersecurity research centers in universities and elsewhere.
Monday, December 13, 2010
Blog: Cryptographers Chosen to Duke It Out in Final Fight [SHA-3]
Cryptographers Chosen to Duke It Out in Final Fight
New Scientist (12/13/10) Celeste Biever
The U.S. National Institute of Standards and Technology (NIST) has selected five Secure Hash Algorithm (SHA-3) entrants as finalists for its competition to find a replacement for the gold-standard security algorithm. The finalists include BLAKE, devised by a team led by Jean-Philippe Aumasson of the Swiss company Nagravision, and Skein, which is the work of computer security expert and blogger Bruce Schneier. "We picked five finalists that seemed to have the best combination of confidence in the security of the algorithm and their performance on a wide range of platforms" such as desktop computers and servers, says NIST's William Burr. "We wanted a set of finalists that were different internally, so that a new attack would be less likely to damage all of them, just as biological diversity makes it less likely that a single disease can wipe out all the members of a species." The finalists incorporate new design ideas that have arisen in recent years. The Keccak algorithm from a team led by STMicroelectronics' Guido Bertoni uses a novel idea called sponge hash construction to produce a final string of 1s and 0s. The teams have until Jan. 16, 2011, to tweak their algorithms, then an international community of cryptanalysts will spend a year looking for weaknesses. NIST will pick a winner in 2012.
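For a sense of what the competition is choosing between: a cryptographic hash maps input of any size to a fixed-size digest, and changing a single character should scramble every bit of the output. Today's Python exposes both SHA-3 (Keccak, the eventual winner) and BLAKE2 (a descendant of finalist BLAKE) through hashlib:

    import hashlib

    # One changed character yields an utterly different digest (the
    # "avalanche" property every finalist must have).
    print(hashlib.sha3_256(b"transfer $100").hexdigest())
    print(hashlib.sha3_256(b"transfer $900").hexdigest())
    print(hashlib.blake2b(b"transfer $100").hexdigest())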
Friday, December 10, 2010
Blog: Problem-Solving Ants Inspire Next Generation of Algorithms
Problem-Solving Ants Inspire Next Generation of Algorithms
University of Sydney (12/10/10) Katie Szittner
University of Sydney researchers have found that ants can solve difficult math problems as well as adapt the optimal solution to fit a changing problem. The researchers say their results will help computer scientists develop better software to solve logistical problems and maximize efficiency in different industries. The researchers tested the ants using a version of the Towers of Hanoi problem, a toy puzzle that asks players to move disks between rods while following certain rules and using the fewest possible moves. The researchers converted the puzzle into a maze in which the shortest path corresponds to the solution with the fewest moves in the toy puzzle. The findings suggest that when the ants use an exploratory pheromone they are much better at solving a problem in a changing environment, which is similar to many real-world human problems. "Discovering how ants are able to solve dynamic problems can provide new inspiration for optimization algorithms, which in turn can lead to better problem-solving software and hence more efficiency for human industries," says Sydney researcher Chris Reid.
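A minimal sketch of the pheromone feedback the study draws on (parameters invented, and far simpler than real ant behavior): ants prefer the trail with more pheromone, shorter trips deposit pheromone faster, and evaporation lets the colony forget stale solutions and adapt.

    import random

    lengths = {"short": 1.0, "long": 2.0}

    def run(trips=300, evaporation=0.05, seed=1):
        rng = random.Random(seed)
        pheromone = {"short": 1.0, "long": 1.0}
        for _ in range(trips):
            total = pheromone["short"] + pheromone["long"]
            path = "short" if rng.random() < pheromone["short"] / total else "long"
            for p in pheromone:
                pheromone[p] *= 1 - evaporation       # trails fade over time
            pheromone[path] += 1.0 / lengths[path]    # shorter trip, stronger trail
        return pheromone

    print(run())   # the short path ends up with most of the pheromone

Evaporation is the knob that matters for dynamic problems: without it, the colony locks onto the first good path and cannot respond when the maze changes.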
Thursday, December 9, 2010
Blog: Researchers Open the Door to Biological Computers
Researchers Open the Door to Biological Computers
University of Gothenburg (Sweden) (12/09/10) Anita Fors
University of Gothenburg researchers have developed genetically altered yeast cells that can communicate with each other like electronic circuits. They say the technology could lead to complex systems in which human cells help keep the body healthy. "In the future we expect that it will be possible to use similar cell-to-cell communication systems in the human body to detect changes in the state of health, to help fight illness at an early stage, or to act as biosensors to detect pollutants in connection with our ability to break down toxic substances in the environment," says Gothenburg researcher Kentaro Furukawa. The yeast cells can sense their surroundings based on predetermined criteria and send messages to other cells using signaling molecules. The different cells can be combined to build more complex circuits that mimic electronic functions. "Even though engineered cells can't do the same job as a real computer, our study paves the way for building complex constructions from these cells," Furukawa says.
Wednesday, December 8, 2010
Blog: UCSF Team Develops "Logic Gates" to Program Bacteria as Computers
UCSF Team Develops "Logic Gates" to Program Bacteria as Computers
UCSF News (12/08/10) Kristen Bole
University of California, San Francisco (UCSF) researchers have genetically engineered E. coli bacteria with a specific molecular circuitry that will enable scientists to program the cells to communicate and perform computations. The process can be used to develop cells that act like miniature computers that can be programmed to function in a variety of ways, says UCSF professor Christopher A. Voigt. "Here, we've taken a colony of bacteria that are receiving two chemical signals from their neighbors, and have created the same logic gates that form the basis of silicon computing," Voigt says. The technology will enable researchers to use cells to perform specific, targeted tasks, says UCSF's Mary Anne Koda-Kimble. The purpose of the research is to be able to utilize all of biology's tasks in a reliable, programmable way, Voigt says. He says the automation of biological processes will advance research in synthetic biology. The researchers also plan to develop a formal language for cellular computation that is similar to the programming languages used to write computer code, Voigt says.
Blog: Quantum Links Let Computers Understand Language
Quantum Links Let Computers Understand Language
New Scientist (12/08/10) Jacob Aron
University of Oxford researchers are using a form of graphical mathematics to develop an approach to linguistics that could enable computers to make sense of language. Oxford's Bob Coecke and Samson Abramsky used a graphical form of category theory, a branch of mathematics that links different objects within a collection, to formulate quantum mechanical problems more intuitively by providing a way to link quantum objects. The researchers are now using that graphical approach to create a universal theory of meaning in which language and grammar are encoded as mathematical rules. Most existing models of human language focus on deciphering the meaning of individual words, or on the rules of grammar, but not both. The researchers combined the existing models using the graphical approach originally designed for quantum mechanics. Coecke developed an algorithm that connects individual words. The Oxford team plans to teach the system using a billion pieces of text taken from legal and medical documents.
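In the concrete form of this model, nouns are vectors, a transitive verb is a matrix (in general, a higher-order tensor), and the grammar dictates how they contract into a sentence meaning. A toy sketch with invented two-dimensional "meanings":

    import numpy as np

    # Invented noun vectors over dimensions (animal-ness, object-ness).
    dog = np.array([1.0, 0.2])
    cat = np.array([0.9, 0.3])
    car = np.array([0.1, 1.0])

    # Invented verb matrix: how subject and object meanings interact.
    chases = np.array([[1.0, 0.1],
                       [0.1, 0.2]])

    def sentence(subj, verb, obj):
        return subj @ verb @ obj    # grammar = tensor contraction

    print(sentence(dog, chases, cat))  # higher score: plausible sentence
    print(sentence(car, chases, cat))  # lower score: implausible sentence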
Tuesday, December 7, 2010
Blog: How Rare Is that Fingerprint? Computational Forensics Provides the First Clues
How Rare Is that Fingerprint? Computational Forensics Provides the First Clues
UB News Services (12/07/10) Ellen Goldbaum
University at Buffalo researchers have developed a method to computationally determine how rare a particular fingerprint is and how likely it is to belong to a specific crime suspect. The Buffalo researchers created a probabilistic method to determine if a fingerprint would randomly match another in a database. The researchers say their study could help develop computational systems that quickly and objectively show how important fingerprints are to solving crimes. "Our research provides the first systematic approach for computing the rarity of fingerprints in a scientifically robust and reliable manner," says Buffalo professor Sargur N. Srihari. Determining the similarity between two sets of fingerprints and the rarity of a specific configuration of ridge patterns are the two main types of problems involved in fingerprint analysis, Srihari says. The Buffalo method relies on machine learning, statistics, and probability.
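The rarity question has a simple back-of-envelope core. Assuming, purely for illustration, that a latent print's feature configuration occurs in a random person with probability p, the chance of a coincidental hit somewhere in a database of N people grows quickly (the Buffalo model's real work is estimating p from learned minutiae distributions):

    p = 1e-7   # illustrative per-person random-match probability
    for n in (1_000, 1_000_000, 100_000_000):
        prob_hit = 1 - (1 - p) ** n
        print(f"N={n:>11,}: P(at least one chance match) = {prob_hit:.5f}")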
Wednesday, December 1, 2010
Blog: New Psychology Theory Enables Computers to Mimic Human Creativity at Rensselaer Polytechnic Institute
New Psychology Theory Enables Computers to Mimic Human Creativity at Rensselaer Polytechnic Institute
RPI News (12/01/10) Mary L. Martialay
Rensselaer Polytechnic Institute researchers are using the new Explicit-Implicit Interaction Theory to develop artificial intelligence (AI) models. The researchers say the theory, which explains how humans solve problems creatively, could provide a blueprint to building AI systems that perform tasks like humans. The model can be used "as the basis for creating future artificial intelligence programs that are good at solving problems creatively," says RPI professor Ron Sun. He worked with the University of California, Santa Barbara's Sebastien Helie to develop the CLARION cognitive architecture, a system based on the Explicit-Implicit theory that acts like a cognitive system. The researchers ran a logic test in which 35 percent of humans answered correctly after discussing their thinking and 45 percent answered correctly after working on another problem. In 5,000 trials of the same test, the CLARION system got the correct answer 35 percent of the time on the first try, and 45 percent of the time on the second try. "This tells us how creative problem solving may emerge from the interaction of explicit and implicit cognitive processes," Sun says.
Tuesday, November 30, 2010
Blog: IBM Chip Breakthrough May Lead to Exascale Supercomputers
IBM Chip Breakthrough May Lead to Exascale Supercomputers
Computerworld (11/30/10) Agam Shah
IBM's new CMOS Integrated Silicon Nanophotonics technology boosts the data transfer rate between computer chips using pulses of light, a development that could increase the performance of supercomputers by a thousand times or more. CMOS Integrated Silicon Nanophotonics combines electrical and optical components on one piece of silicon. The new technology can replace the copper wires that are used in most chips today. The integrated silicon converts electrical signals into pulses of light, making the communication between chips faster, says IBM researcher Will Green. He says the photonics technology could boost supercomputing calculations to speeds approaching an exaflop; IBM hopes to build an exascale computer by 2020. "This is an interesting milestone for system builders [who are] looking at building ... exascale systems in 10 years," Green says. IBM also plans to use the optics technology to develop new types of transistors. "The nice thing about it is we have a platform which allows us to address many different places simultaneously," he says.
Tuesday, November 16, 2010
Blog: ECS Researcher Highlights Need for Transparency on the Web
ECS Researcher Highlights Need for Transparency on the Web
University of Southampton (United Kingdom) (11/16/10) Joyce Lewis
The complex flows of information on the Web make it difficult to determine where information originates from, says University of Southampton professor Luc Moreau. "This is a challenge since we want to be able to establish the exact source of information, we want to decide whether information has been altered, and by whom, we want to corroborate and possibly reproduce such information, and ultimately we want to decide whether the information comes from a trustworthy source," Moreau says. The solution lies in provenance, which focuses on establishing that an object has not been forged or altered, and could apply to computer-generated data. He says enabling users to determine where data comes from and decide if it is trustworthy will lead to a new generation of Web services that are capable of producing trusted information. Moreau notes that systems would become transparent as a result of provenance. "Our aim, with the community of researchers, is to establish a standard method to ascertain the provenance of information on the Web," he says.
Blog: 'Chaogates' Hold Promise for the Semiconductor Industry
'Chaogates' Hold Promise for the Semiconductor Industry
EurekAlert (11/16/10) Jason Socrates Bardi
Researchers have created alternative logic gates, dubbed chaogates, by selecting desired patterns offered by a chaotic system, and using a subset to map system inputs to desired outputs. The process offers a way to use the richness of nonlinear dynamics to design computing devices with the capacity to reconfigure into a range of logic gates. "Chaogates are the building block of new, chaos-based computer systems that exploit the enormous pattern formation properties of chaotic systems for computation," says Arizona State University's William Ditto. "Imagine a computer that can change its own internal behavior to create a billion custom chips a second based on what the user is doing that second--one that can reconfigure itself to be the fastest computer for that moment, for your purpose." Ditto says chaogates offer advantages for gaming, secure computer chips, and custom, morphable gaming chips. He notes that integrated circuits using chaogates can be manufactured using existing production systems, and they can incorporate standard logic, memory, and chaogates on the same device.
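A sketch of the chaogate recipe using the logistic map as the chaotic element (an illustrative stand-in; real chaogates tune an analog circuit): logic inputs perturb the initial state, the map is iterated, and the output is thresholded. Searching the parameter space finds settings that realize a chosen gate, and changing the settings morphs the same element into a different gate.

    def gate_output(x1, x2, x_star, delta, threshold):
        x = x_star + delta * (x1 + x2)     # inputs perturb the initial state
        x = 4.0 * x * (1 - x)              # one step of the chaotic logistic map
        return 1 if x > threshold else 0

    def find_params(truth_table):
        grid = [i / 100 for i in range(101)]
        for x_star in grid:
            for delta in (0.05, 0.1, 0.2, 0.25):
                if x_star + 2 * delta > 1:
                    continue               # keep the state inside [0, 1]
                for threshold in grid:
                    if all(gate_output(a, b, x_star, delta, threshold) == out
                           for (a, b), out in truth_table.items()):
                        return x_star, delta, threshold

    AND = {(0, 0): 0, (0, 1): 0, (1, 0): 0, (1, 1): 1}
    XOR = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0}
    print("AND params:", find_params(AND))
    print("XOR params:", find_params(XOR))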
Monday, November 15, 2010
Blog: Rensselaer Team Shows How to Analyze Raw Government Data
Rensselaer Team Shows How to Analyze Raw Government Data
RPI News (11/15/10) Marshall Hoffman; Mark Marchand
Researchers at Rensselaer Polytechnic Institute's Tetherless World Research Constellation have developed a method for finding relationships buried within government data, using mash-up technology that combines separate data sets to reveal new relationships. "We're working on designing simple yet robust Web technologies that allow someone with absolutely no expertise in Web Science or semantic programming to pull together data sets from Data.gov and elsewhere and weave them together in a meaningful way," says Rensselaer professor Deborah McGuinness. The approach also enables U.S. government agencies to share information more readily. The researchers developed a Web site that provides examples of what the approach can accomplish. The RPI researchers used Semantic Web technologies, enabling multiple data sets to be linked even when the underlying structure is different. "Data.gov mandates that all information is accessible from the same place, but the data is still in a hodgepodge of different formats using differing terms, and therefore challenging at best to analyze and take advantage of," says Rensselaer professor James Hendler. "We are developing techniques to help people mine, mix, and mash-up this treasure trove of data, letting them find meaningful information and interconnections."
Friday, November 12, 2010
Blog: Time to blow up best practices myths
Time to blow up best practices myths
By Dennis Howlett | November 12, 2010, 11:25am PST
I’m not sure if this is a trend but I am noticing the term ‘best practices’ turning up in many a presentation. It kinda goes like this: ‘We know your implementation is failing but if you follow these best practices, then hey presto, good karma will be magically restored to all.’ The polite way of characterizing this is ‘piffle.’ The less polite way is per this Tweet:
@SAP_Jarret The term ‘best practice’ is - Grade A BS. A best practice might exist for 1,2 co’s rarely if ever for everyone
This Tweet prompted SAP consultant Nathan Genez to get up on his hind legs on the SAP Community Network and lay into the expression. He starts:
Best Practices are merely a guide or process that is believed to be more effective than alternative processes. Note that they are not *the* solution even though most everyone in the industry equates one with the other. When interpreted from this more moderate (realistic?) viewpoint, they serve as a good reference point for SAP customers. Considering the large number of SAP projects that fail to live up to pre-implementation expectations and deliver sub-optimal solutions, it would seem that the industry would be falling over itself to continually refine these best practices. Part of that process would be to correctly interpret the phrase from the get-go but the industry doesn’t seem to care about that.
… But then in come the consultants. As Nathan describes:
I dislike the phrase because I routinely see consultants using it as a shield. By that, I mean that they use the phrase “it’s best practice” as a way to justify what is, in fact, just their opinion. This seems to come mostly from people who can’t justify their answer on their own. They can’t explain the rationale behind why their solution is better / easier / quicker / more stable / etc. Either they don’t fully understand the functionality or process in question, or they aren’t aware of all of the alternative solutions that are available, and therefore can’t justify their answer based on merit. They take the easy way out… they recommend a course of action based on the little that they know and then append “it’s best practice” to it as if this will legitimize the inaccuracies of their answer. Then they sweat bullets as they pray that the other party won’t press them on the issue.
Nathan’s argument takes us to a level I believe is sorely under-estimated. When you look at the footprint that an ERP covers it may, and I say may, reach 30-45% of required functionality. It should therefore be obvious that what masquerades as a claimed best practice needs careful examination. Too often, customers are blinded by Jar-Gon and then wonder what went wrong.
Blog: Rats to Robots--Brain's Grid Cells Tell Us How We Navigate
Rats to Robots--Brain's Grid Cells Tell Us How We Navigate
Queensland University of Technology (11/12/10) Niki Widdowson
Queensland University of Technology (QUT) robotics researchers have formulated a theory on how the brain combines separate pieces of information to map out familiar environments and navigate them. The theory was prompted by practical improvements that were made to the navigation system of robots that were having problems with some navigational tasks. QUT's Michael Milford says that Norwegian researchers recently discovered new cells in the brains of rats that are arranged in a grid and fire every time a rat is in one of a number of locations. Preliminary evidence also suggests that other animals, including humans, have certain cells that fire only when they are in a certain place. For example, a person who is not paying attention when exiting an elevator may only conclude that he or she is on the second floor after seeing familiar cues such as a Coke machine and then a photocopier. "We are postulating that the 'grid cells' help put these two pieces of information together to tell you you're on the second floor," Milford says. "In this study we are able to enhance our understanding of the brain by providing insights into how the brain might solve a common problem faced by both mobile robots and animals."
Blog: Algorithm Pioneer Wins Kyoto Prize
Algorithm Pioneer Wins Kyoto Prize
EE Times (11/12/10) R. Colin Johnson
Eotvos Lorand University professor Laszlo Lovasz, who has solved several information technology problems using graph theory, has been awarded the Kyoto Prize. "Graph theory represents a different approach to optimization problems that uses geometry to compute results instead of differential equations," says Lovasz. "It turns out that very large networks in many different fields can be described by graphs, from cryptography to physical systems." His work has led to breakthroughs in RSA encryption technology, in 4G channel capacity (extending Claude Shannon's point-to-point information theory), and in settling the weak perfect graph conjecture. Lovasz may be best known for the breakthrough principles called the "Lovasz local lemma" and the "LLL-algorithm," which are widely used in cryptography, and for contributions to multiple-input, multiple-output wireless communications. The Kyoto Prize was founded by Kyocera chairman Kazuo Inamori in 1984 and comes with a $550,000 award.
Monday, November 8, 2010
Blog: The Ethical Robot
The Ethical Robot
University of Connecticut (11/08/10) Christine Buckley; Bret Eckhardt
University of Connecticut professor Susan Anderson and University of Hartford computer scientist Michael Anderson have programmed a robot to behave ethically. Their work is part of a relatively new field of research known as machine ethics. "There are machines out there that are already doing things that have ethical import, such as automatic cash withdrawal machines, and many others in the development stages, such as cars that can drive themselves and eldercare robots," says Susan Anderson. Machine ethics combines artificial intelligence with ethical theory to determine how to program machines to behave ethically. The robot, called Nao, is programmed with an ethical principle that determines how often to remind people to take their medicine and when to notify a doctor when they do not comply. "We should think about the things that robots could do for us if they had ethics inside them," says Michael Anderson. Interacting with robots that have been programmed to behave ethically could inspire humans to behave more ethically, says Susan Anderson.
Blog: Part Moth, Part Machine: Cyborgs Are on the Move
Part Moth, Part Machine: Cyborgs Are on the Move
New Scientist (11/08/10) Duncan Graham-Rowe
Researchers are developing methods to produce complex behavior from robots by tapping into the nervous system of living organisms and using algorithms that already exist in nature. For example, Tokyo Institute of Technology researchers have developed a cyborg moth that uses chemical plume tracking to locate the source of certain pheromones. The researchers immobilized a moth on a small wheeled robot and placed two recording electrodes into nerves running down its neck to monitor commands the moth uses to steer. By rerouting these signals to motors in the robot, the researchers found that they could emulate the moth's plume-tracking behavior. Researchers also hope to recreate biological circuits in silicon, says Northwestern University's Ferdinando Mussa-Ivaldi. Scientists have made progress toward this goal with central pattern generators (CPGs), which are a type of behavioral circuit in the human brain and spine that carry out routine tasks with little or no conscious input, such as walking or grasping an object. Johns Hopkins University's Ralph Etienne-Cummings has used recordings of CPGs taken from a lamprey to generate walking motions in a pair of robotic legs.
Friday, November 5, 2010
Blog: Gartner Report: The Future of Information Security is Context Aware and Adaptive
Note the futility of following the static approach to security. Another important issue, probably covered in the report, is the false sense of security that comes from depending on a static security environment.
--Peter
Thursday, November 4, 2010
Blog: Metasploit and SCADA exploits: dawn of a new era?
Metasploit and SCADA exploits: dawn of a new era?
By Ryan Naraine | November 4, 2010, 11:23am PDT
On 18 October, 2010, a significant event occurred concerning threats to SCADA (supervisory control and data acquisition) environments: a zero-day exploit for the RealFlex RealWin SCADA software product was added to the Metasploit repository. Let's think through the ramifications.
Wednesday, November 3, 2010
Blog: The bigger the system, the greater the chance of failure
The bigger the system, the greater the chance of failure
By Joe McKendrick | November 3, 2010, 7:00pm PDT
IT projects will see more success when delivered in smaller, bite-size chunks, especially since software has reached a state of complexity far beyond the grasp of any individual.
Blog: New Google Tool Makes Websites Twice as Fast
New Google Tool Makes Websites Twice as Fast
Technology Review (11/03/10) Erica Naone
Google has released mod_pagespeed, free software for Apache servers that could make many Web sites load twice as fast. Once installed, the software automatically determines ways to optimize a Web site's performance. "We think making the whole Web faster is critical to Google's success," says Google's Richard Rabbat. The tool could be especially useful to small Web site operators and anyone who uses content management systems to run a Web site, since they often lack the technical savvy and time needed to make their own speed improvements to Web server software. During testing, mod_pagespeed was able to make some Web sites load three times faster, depending on how much optimization had already been done. The program builds on Google's existing Page Speed program, which measures the speed at which Web sites load and offers suggestions on how to make them load faster.
Tuesday, November 2, 2010
Blog: A Software Application Recognizes Human Emotions From Conversation Analysis
A Software Application Recognizes Human Emotions From Conversation Analysis
Universidad Politecnica de Madrid (Spain) (11/02/10) Eduardo Martinez
Researchers at the Universidad Politecnica de Madrid have developed an application that can recognize human emotion from automated voice analysis. The program, based on a fuzzy logic tool called RFuzzy, analyzes a conversation and can determine whether the speaker is sad, happy, or nervous. If the emotion is unclear, the program can specify how close the speaker is to each emotion in terms of a percentage. RFuzzy also can reason with subjective concepts such as high, low, fast, and slow. The researchers say RFuzzy, which was written in Prolog, also could be used in conversation analysis and robot intelligence applications. For example, RFuzzy was used to program robots that participated in the RoboCupSoccer league. Because RFuzzy's logical mechanisms are flexible, its analysis can be interpreted based on logic rules that use measurable parameters, such as volume, position, distance from the ball, and speed.
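A sketch of the fuzzy-membership idea in Python rather than RFuzzy's Prolog (the rules and parameter ranges below are entirely invented): measurable speech parameters map to a degree of membership in each emotion, naturally expressed as a percentage.

    def triangular(x, lo, peak, hi):
        # Classic triangular membership function, returning 0..1.
        if x <= lo or x >= hi:
            return 0.0
        return (x - lo) / (peak - lo) if x <= peak else (hi - x) / (hi - peak)

    def classify(volume, speed):
        # Invented rule set: each emotion blends two parameter memberships.
        return {
            "nervous": 0.5 * triangular(speed, 120, 180, 240)
                     + 0.5 * triangular(volume, 60, 80, 100),
            "sad":     0.5 * triangular(speed, 40, 80, 120)
                     + 0.5 * triangular(volume, 20, 40, 60),
            "happy":   0.5 * triangular(speed, 90, 140, 190)
                     + 0.5 * triangular(volume, 50, 70, 90),
        }

    for emotion, degree in classify(volume=75, speed=170).items():
        print(f"{emotion}: {degree:.0%}")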
Monday, November 1, 2010
Blog: New Help on Testing for Common Cause of Software Bugs
New Help on Testing for Common Cause of Software Bugs
Government Computer News (11/01/10) William Jackson
As part of the Automated Combinatorial Testing for Software (ACTS) program, the U.S. National Institute of Standards and Technology (NIST) has developed algorithms for automated testing of the multiple variables in software that can cause security faults. Research has shown that at least 89 percent of security faults are caused by combinations of no more than four variables, and nearly all are caused by no more than six variables, according to NIST. "This finding has important implications for testing because it suggests that testing combinations of parameters can provide highly effective fault detection," NIST says. The ACTS program is a collaborative effort by NIST, the U.S. Air Force, the University of Texas at Arlington, George Mason University, Utah State University, the University of Maryland, and North Carolina State University to produce methods and tools to generate tests for any number of variable combinations.
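The payoff of the finding is that covering all small combinations takes far fewer tests than covering all complete configurations. A greedy sketch of 2-way (pairwise) coverage, much cruder than the ACTS tools but showing the idea:

    from itertools import combinations, product

    params = {"os": ["linux", "windows", "mac"],
              "browser": ["firefox", "chrome"],
              "protocol": ["http", "https"]}
    names = list(params)
    all_tests = [dict(zip(names, vals)) for vals in product(*params.values())]

    def pairs(test):
        return {((a, test[a]), (b, test[b])) for a, b in combinations(names, 2)}

    uncovered = set().union(*(pairs(t) for t in all_tests))
    suite = []
    while uncovered:
        best = max(all_tests, key=lambda t: len(pairs(t) & uncovered))
        suite.append(best)
        uncovered -= pairs(best)

    print(f"{len(suite)} tests cover every pair; exhaustive testing needs {len(all_tests)}")

On this tiny example the saving is modest (6 tests instead of 12), but it grows dramatically with more parameters and values, and the same greedy idea extends to the 3- to 6-way combinations the NIST result highlights.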
Saturday, October 30, 2010
Blog: In D.C.'s Web Voting Test, the Hackers Were the Good Guys
In D.C.'s Web Voting Test, the Hackers Were the Good Guys
Washington Post (10/30/10) Jeremy Epstein; David Jefferson; Barbara Simons
Washington, D.C., held an Internet voting experiment in September during which a team of University of Michigan hackers successfully penetrated election computers and rigged the electoral outcome, demonstrating the extreme national security hazards of online voting, write SRI International computer scientist Jeremy Epstein, Lawrence Livermore National Laboratory researcher David Jefferson, and former ACM president Barbara Simons. They say the test verifies that Internet voting systems can be assaulted from anywhere by any malicious individual or entity, and effective defense is a virtual impossibility. Worse still, a cyberattack against an election may be completely invisible to election officials. The computer security community agrees that no secure Internet voting framework exists for public elections, and that simply correcting the problems highlighted by the recent experiment will not ensure the system's security. Epstein, Jefferson, and Simons contend that the hacker team "has done our nation an enormous service" by calling attention to the dangers of Internet voting, but they lament that more than 30 U.S. states are permitting the use of online voting systems in the midterm elections, despite the lessons learned from the D.C. experiment.
Thursday, October 28, 2010
Blog: Computer Scientists Make Progress on Math Puzzle
Computer Scientists Make Progress on Math Puzzle
UT Dallas News (10/28/10) David Moore
University of Texas at Dallas (UTD) professors Linda Morales and Hal Sudborough have made progress on the Topswops mathematical puzzle. Stanford University computer scientist Donald Knuth previously proved an exponential upper bound on the number of Topswops steps, but Morales and Sudborough proved a lower bound that is better than that proposed in Knuth's conjecture. "What I find fascinating about a problem such as bounding the Topswops function is connected to its simplicity, to its fundamental nature, and to the complexity and difficulty of finding an answer," Sudborough says. "Our research uncovered permutations whose iterate sequences have a fascinating structure, which upon analysis have revealed hitherto unknown lower bounds for the problem," Morales says. Knuth called their proof technique both "elegant" and "amazing." "There is much more to learn from the problem," Morales says. "We have tantalizing hints of more revelations just waiting to be uncovered."
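Topswops itself is easy to state and simulate: if the top card of the deck shows k, reverse the top k cards, and repeat until a 1 surfaces; the question the bounds address is how the worst-case number of steps grows with the deck size n. A brute-force check of small decks in Python:

    from itertools import permutations

    def topswops_steps(deck):
        deck, steps = list(deck), 0
        while deck[0] != 1:
            k = deck[0]
            deck[:k] = reversed(deck[:k])   # flip the top k cards
            steps += 1
        return steps

    # Worst case over all orderings of n cards (feasible only for small n;
    # the interesting question is how fast this sequence grows).
    for n in range(1, 8):
        print(n, max(topswops_steps(p) for p in permutations(range(1, n + 1))))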
Monday, October 25, 2010
Blog: 7 Programming Languages on the Rise
7 Programming Languages on the Rise
InfoWorld (10/25/10) Peter Wayner
Seven increasingly popular niche programming languages offer features that cannot be found in the dominant languages. For example, Python has gained popularity in scientific labs. "Scientists often need to improvise when trying to interpret results, so they are drawn to dynamic languages which allow them to work very quickly and see results almost immediately," says Python's creator Guido van Rossum. Many Wall Street firms also rely on Python because they like to hire university scientists to work on complex financial analysis problems. Meanwhile, Ruby is becoming popular for prototyping. Ruby sites are devoted to cataloging data that can be stored in tables. MATLAB was originally designed for mathematicians to solve systems of linear equations, but it also has found a following in the enterprise because of the large volumes of data that organizations need to analyze. Although JavaScript is not a new programming language, new applications for JavaScript are constantly in development. For example, CouchDB uses JavaScript's Map and Reduce functions to help bring harmony to both client and server-side programming. Other popular niche languages include R, which also is known as S and S-Plus, Erlang, Cobol, and CUDA.
Sunday, October 24, 2010
Blog: As E-Voting Comes of Age, Security Fears Mount
As E-Voting Comes of Age, Security Fears Mount
Agence France-Presse (10/24/10) Rob Lever
New technologies that allow voters to cast ballots using the Internet or other electronic means are gaining popularity in the United States and elsewhere, despite growing security concerns. Thirty-three U.S. states are allowing some email, fax, or online ballots in 2010, according to the Verified Voting Foundation (VVF). These systems have the potential to increase voter participation, but their security remains in question. For example, University of Michigan computer scientists recently hacked into a Washington, D.C., pilot Internet voting system, changing passwords and directing the system to play the university fight song. "Within the first three hours or so of looking at the code we found the first open door and within 36 hours we had taken control of the system," according to Michigan professor Alex Halderman. He says that during the attack they discovered that hackers from Iran and China also were trying to break into the system. "After this, there can be no doubt that the burden of proof in the argument over the security of Internet voting systems has definitely shifted to those who claim that the systems can be made secure," says VVF chairman David Jefferson.
Friday, October 22, 2010
Blog: D.C. Hacking Raises Questions About Future of Online Voting
D.C. Hacking Raises Questions About Future of Online Voting
Stateline.org (10/22/10) Sean Greene
Washington, D.C.'s failure to prevent a team of University of Michigan computer scientists from taking control of its online voting Web site has called into question the future of electronic voting. Michigan professor J. Alex Halderman noted that his team had taken complete control over the elections board's server. Although some experts say the incident proves that the Internet, in its current state, cannot support secure online voting, others still see potential in the technology. For example, Arizona and eight counties in West Virginia are planning to go ahead with online voting experiments on November 2. "All an attacker has to find is one hole in a system to mount a serious attack," warns University of California, Berkeley researcher Joe Hall. Washington D.C. used a system based on open source software, believing that it provides the transparency necessary for elections. The West Virginia counties are using proprietary software that officials say should be more resistant to hackers. Rokey Suleman, executive director of the D.C. board of elections, says the hacking incident is an opportunity to improve the technology. "We are not disappointed that this occurred," Suleman says. "It is an opportunity for the computer science community to work with us."
Wednesday, October 20, 2010
Blog: New Search Method Tracks Down Influential Ideas
New Search Method Tracks Down Influential Ideas
Princeton University (10/20/10) Chris Emery
Princeton University computer scientists have developed a method that uses computer algorithms to trace the origins and spread of ideas, which they say could make it easier to measure the influence of scholarly papers, news stories, and other information sources. The algorithms analyze how language changes over time within a group of documents and determines which documents had the most influence. "The point is being able to manage the explosion of information made possible by computers and the Internet," says Princeton professor David Blei. He says the search method could eventually enable historians, political scientists, and other scholars to study how ideas originate and spread. The Princeton method enables computers to analyze the actual text of documents, instead of focusing on citations, to see how the language changes over time. "We are also exploring the idea that you can find patterns in how language changes over time," Blei says.
Monday, October 18, 2010
Blog: Analyzing Almost 10 Million Tweets, Research Finds Public Mood Can Predict Dow Days in Advance
Analyzing Almost 10 Million Tweets, Research Finds Public Mood Can Predict Dow Days in Advance
IU News Room (10/18/10) Steve Chaplin
Indiana University researchers found that by analyzing millions of tweets they can predict the movement of the Dow Jones Industrial Average (DJIA) up to a week in advance with near 90 percent accuracy. Indiana professor Johan Bollen and Ph.D. candidate Huina Mao used two mood-tracking tools to analyze the text content of more than 9.8 million Twitter feeds and compared the public mood to the DJIA's closing value. One tool, called OpinionFinder, analyzed the tweets to give a positive or negative daily time series of public mood. The other tool, Google-Profile of Mood States, measures the mood of tweets in six dimensions--calm, alert, sure, vital, kind, and happy. The two tools gave the researchers seven public mood time series that could be matched against a similar daily time series for the DJIA closing. "What we found was an accuracy of 87.6 percent in predicting the daily up and down changes in the closing values of the Dow Jones Industrial Average," Bollen says. The researchers demonstrated that public mood can significantly improve the accuracy of the basic models currently used to predict Dow Jones closing values by implementing a prediction model called a Self-Organizing Fuzzy Neural Network.
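The core test is easy to sketch: shift the mood series forward by k days and correlate it with the market series. All numbers below are invented, and the actual study fed the mood series into a fuzzy neural network rather than using plain correlation:

    def pearson(xs, ys):
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        sx = sum((x - mx) ** 2 for x in xs) ** 0.5
        sy = sum((y - my) ** 2 for y in ys) ** 0.5
        return cov / (sx * sy)

    calm = [0.2, 0.5, 0.1, 0.7, 0.6, 0.3, 0.8, 0.4, 0.5, 0.9]   # invented mood
    dow  = [10.0, 10.1, 10.4, 10.2, 10.8, 10.7, 10.3, 10.9, 10.5, 10.6]

    for lag in range(1, 5):   # today's mood vs. the Dow `lag` days later
        print(f"lag {lag} days: r = {pearson(calm[:-lag], dow[lag:]):+.2f}")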
Sunday, October 17, 2010
Blog: HIMSS Analytics, the 8 stages to creating a paperless patient record environment
HIMSS Analytics, the authoritative source on EMR adoption trends, devised the EMR Adoption Model (EMRAM) to track EMR progress at hospitals and health systems. The EMRAM scores hospitals in the HIMSS Analytics Database on their progress in completing the 8 stages to creating a paperless patient record environment.
Thursday, October 14, 2010
Blog: Faster Websites, More Reliable Data
Faster Websites, More Reliable Data
MIT News (10/14/10) Larry Hardesty
Massachusetts Institute of Technology (MIT) researchers have developed TxCache, a database caching system that eliminates certain types of asymmetric data retrieval while making database caches easier to program. TxCache is designed to solve the problem of making sure that data cached on local servers is as current as the data stored in the master database. The MIT system can handle transactions, sets of computations that are treated as a block, which means that none of the computations will be performed unless all of them can be performed. TxCache makes it easier for programmers to manage caches, says MIT graduate student Dan Ports, who led the system's development along with professor Barbara Liskov, who received the 2008 ACM A.M. Turing Award. Ports says TxCache ensures that programmers can change variables in a line of code just once, and have the cached copies be automatically updated everywhere. The system has to track what data are cached where, and which data depend on each other, Liskov says. The researchers say that during testing, Websites using TxCache were more than five times faster.
Blog: Five tips to learn from failure
Five tips to learn from failure
By Michael Krigsman | October 14, 2010, 5:56am PDT
This five-point advisory list offers a great start to organizations that want to improve IT project success rates.
Wednesday, October 6, 2010
Blog: W3C: Hold Off on Deploying HTML5 in Websites
W3C: Hold Off on Deploying HTML5 in Websites
InfoWorld (10/06/10) Paul Krill
The World Wide Web Consortium (W3C) says that HTML5, which has gained the support of Microsoft, Apple, and Google, is still not ready for deployment to Web sites. "The problem we're facing right now is there is already a lot of excitement for HTML5, but it's a little too early to deploy it because we're running into interoperability issues," including differences between video on devices, says W3C's Philippe Le Hegaret. Companies can now deploy HTML5 in their applications or in intranets, where a rendering engine can be controlled, but it is not feasible on the open Web right now, Le Hegaret says. When finished, HTML5 will support a variety of modern Web applications, and Le Hegaret notes it is now being viewed as a "game changer." "What's happening is the industry is realizing that HTML5 is going to be real," he says. However, Le Hegaret says W3C still plans to make some API changes, and notes that HTML5 also does not yet work across all browsers. "We basically want to be feature-complete by mid-2011," he says.
Blog: D.C. Web Voting Flaw Could Have Led to Compromised Ballots
D.C. Web Voting Flaw Could Have Led to Compromised Ballots
Computerworld (10/06/10) Jaikumar Vijayan
University of Michigan researchers recently found a major security flaw in Washington, D.C.'s new Digital Vote by Mail system, which enabled them to access, modify, and replace marked ballots in the system. The shell injection flaw in the ballot upload function allowed the researchers to access usernames, passwords, and the public key used to encrypt ballots, according to Michigan professor Alex Halderman. He also says the researchers were able to install a backdoor on the server, which enabled them to view the recorded votes and the names of the voters. "If this particular problem had not existed, I'm confident that we would have found another way to attack the system," Halderman says. The Digital Vote by Mail system is designed to let military personnel and overseas U.S. civilians receive and cast ballots over the Internet using a pre-provided PIN to authenticate themselves. In response to the discovery of the security flaws, D.C.'s Board of Election and Ethics announced that voters will not be allowed to use Digital Vote by Mail to send back ballots.
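The reported flaw is an instance of a classic pattern: splicing untrusted input into a shell command. The sketch below illustrates the general class of bug, not the D.C. system's actual code (the gpg command is a hypothetical stand-in):

    import shlex

    filename = "ballot.pdf; rm -rf /home/elections"    # attacker-chosen name

    vulnerable = f"gpg --encrypt {filename}"           # ';' smuggles in a 2nd command
    safe = f"gpg --encrypt {shlex.quote(filename)}"    # quoted: just an odd filename

    print("shell would see (vulnerable):", vulnerable)
    print("shell would see (safe):      ", safe)
    # Safer still is an argument list, e.g. subprocess.run(["gpg", "--encrypt",
    # filename]), which is never interpreted by a shell at all.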
Blog: Stopping Malware: BLADE Software Eliminates "Drive-By Downloads" From Malicious Websites
Stopping Malware: BLADE Software Eliminates "Drive-By Downloads" From Malicious Websites
Georgia Tech Research News (10/06/10) Abby Vogel Robinson
Georgia Tech researchers have developed Block All Drive-By Download Exploits (BLADE), a browser-independent tool that eliminates drive-by download threats. "BLADE is an effective countermeasure against all forms of drive-by download malware installs because it is vulnerability and exploit agnostic," says Georgia Tech professor Wenke Lee. In testing, BLADE blocked all drive-by malware installation attempts from the more than 1,900 malicious Web sites tested. "BLADE monitors and analyzes everything that is downloaded to a user's hard drive to cross-check whether the user authorized the computer to open, run, or store the file on the hard drive," says Georgia Tech graduate student Long Lu. Testing found that Adobe Reader, Java, and Adobe Flash were the most frequently targeted applications. "BLADE requires a user's browser to be configured to require explicit consent before executable files are downloaded, so if this option is disabled by the user, then BLADE will not be able to protect that user's Web surfing activities," Lee notes.
Friday, October 1, 2010
Blog: Professor Wendy Hall Speaks [on Web Science; why Semantic Web failed to emerge]
Professor Wendy Hall Speaks
Inquirer (UK) (10/01/10) Wendy M. Grossman
Wendy Hall, dean of the University of Southampton's new Faculty of Physical and Applied Sciences, helped found the Web Science Research Initiative for the purpose of facilitating a formal understanding of the Web. "Partly what I'm passionate about in Web science ... [is] what makes the Web what it is, how it evolves and will evolve, what are the scenarios that could kill it or change it in ways that would be detrimental to its use," she says. Hall says the need for the initiative was demonstrated when she discussed with Web pioneer Tim Berners-Lee why the semantic Web has failed to emerge. Berners-Lee determined that the semantic Web concept was commandeered by the artificial intelligence community for the purpose of solving massive problems, when what was needed was for the data to be freed up so that people's use of it could be observed. "What creates the Web are us who put the content on it, and that's not natural or engineered," Hall says. Among the subjects she says can be explored through Web science is how to construct a better future Web by predicting what people would and would not do with it.
Blog: Regulators Blame Computer Algorithm for Stock Market 'Flash Crash'
Regulators Blame Computer Algorithm for Stock Market 'Flash Crash'
Computerworld (10/01/10) Lucas Mearian
A joint investigation by the U.S. Securities and Exchange Commission and the Commodity Futures Trading Commission has issued a report attributing the May 6 stock market flash crash to an automated trade execution system that inundated the Chicago Mercantile Exchange's Globex electronic trading platform with a major sell order, triggering a nearly 1,000-point plunge in the Dow Jones Industrial Average in half an hour. "[A] large fundamental trader chose to execute this sell program via an automated execution algorithm that was programmed to feed orders into the June 2010 E-Mini market to target an execution rate set to 9 percent of the trading volume calculated over the previous minute, but without regard to price or time," the report says. "The execution of this sell program resulted in the largest net change in daily position of any trader in the E-Mini since the beginning of the year." The study found that under strained market conditions, the automated execution of a big sell order can induce extreme price movements, particularly if price is not factored in by the automated execution algorithm. "Moreover, the interaction between automated execution programs and algorithmic trading strategies can quickly erode liquidity and result in disorderly markets," the report concludes.
Thursday, September 30, 2010
Blog: 'Fabric' Would Tighten the Weave of Online Security [...a way to incorporate security in the programming language used to write computer programs]
'Fabric' Would Tighten the Weave of Online Security
Cornell Chronicle (09/30/10) Bill Steele
Cornell University professors Fred Schneider and Andrew Myers are developing a way to incorporate security in the programming language used to write computer programs, so that the systems are protected from the beginning. Until now, computer security has been reactive, Schneider says. "Our defenses improve only after they have been successfully penetrated," he says. Schneider and Myers developed Fabric, a computer platform that replaces multiple existing layers with a simpler programming interface that makes security reasoning more direct. Fabric is designed to create secure systems for distributed computing, such as systems that move money around or control medical records. Fabric's programming language, which is based on Java, builds in security as the program is written. Myers says most of what Fabric does is transparent to the programmer. "I think we can make life simpler and improve performance," he says.
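The article does not detail Fabric's security model, but label-based information flow is the general technique for building security into a language as a program is written. The Python sketch below (a simplified illustration with invented names, written in Python rather than Fabric's Java-based language) shows the flavor: values carry labels naming who may read them, labels propagate through computation, and release to an unauthorized principal is refused.

    # Simplified illustration of label-based information flow -- not
    # Fabric's actual API. Values carry labels naming who may read them;
    # labels propagate through computation; unauthorized release fails.

    class Labeled:
        """A value tagged with the set of principals allowed to read it."""
        def __init__(self, value, readers):
            self.value = value
            self.readers = frozenset(readers)

        def __add__(self, other):
            # Combining data restricts readers to those allowed by BOTH inputs.
            return Labeled(self.value + other.value,
                           self.readers & other.readers)

    def release(labeled, principal):
        """Runtime check; in a Fabric-style language much of this
        reasoning is enforced statically, as the program is written."""
        if principal not in labeled.readers:
            raise PermissionError(principal + " may not read this value")
        return labeled.value

    balance = Labeled(1200, readers={"alice", "bank"})
    fee = Labeled(-25, readers={"alice", "bank", "auditor"})
    total = balance + fee             # readers narrow to {"alice", "bank"}
    print(release(total, "alice"))    # ok: prints 1175
    try:
        release(total, "auditor")     # blocked: the label forbids it
    except PermissionError as err:
        print(err)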
Blog: Multicore May Not Be So Scary [dealing with the issue: ...at a certain point, adding more cores slowed the system down instead of speeding it up.]
Multicore May Not Be So Scary
MIT News (09/30/10) Larry Hardesty
Massachusetts Institute of Technology (MIT) researchers built a system consisting of eight six-core chips that simulates the performance of a 48-core chip, as a way to test whether adding more cores continues to boost computing performance. The researchers ran several applications on the model, activating the 48 cores one by one and observing the results. They found that at a certain point, adding more cores slowed the system down instead of speeding it up: the cores were wasting time contending over a single count shared by all of them. However, slightly rewriting the Linux code so that each core kept a local count greatly improved the system's overall performance. "There's a bunch of interesting research to be done on building better tools to help programmers pinpoint where the problem is," says MIT professor Frans Kaashoek. "The big question in the community is, as the number of cores on a processor goes up, will we have to completely rethink how we build operating systems," says University of Wisconsin professor Remzi Arpaci-Dusseau.
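The per-core fix reported here is a standard pattern: replace one globally shared counter, which every core must fight over, with per-core counts that are summed only when the total is actually needed. A minimal Python sketch of the structure follows; it is illustrative only, since Python threads do not exhibit the cache-line contention that made the original shared counter a bottleneck.

    # Conceptual sketch of the fix described above: replace one globally
    # shared count with per-core counts, summed only when the total is
    # needed. Structure only -- Python threads will not show the hardware
    # contention that made the original shared counter a bottleneck.

    import threading

    class ShardedCounter:
        def __init__(self, num_slots):
            # One independent slot (and lock) per core.
            self._counts = [0] * num_slots
            self._locks = [threading.Lock() for _ in range(num_slots)]

        def increment(self, core_id):
            # Touches only this core's slot: no cross-core contention.
            with self._locks[core_id]:
                self._counts[core_id] += 1

        def total(self):
            # The cross-core work happens only when a total is read.
            return sum(self._counts)

    counter = ShardedCounter(num_slots=48)
    threads = [threading.Thread(
                   target=lambda c=c: [counter.increment(c)
                                       for _ in range(1000)])
               for c in range(8)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    print(counter.total())  # 8000

The trade-off is that reads of the total become more expensive, which pays off when increments vastly outnumber reads, as with the kernel counts described in the article.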