Friday, July 30, 2010
Blog: Sites Feed Personal Details to New Tracking Industry
Wall Street Journal (07/30/10) Angwin, Julia; McGinty, Tom
Wall Street Journal (WSJ) investigators have found that the largest U.S. Web sites are installing new, intrusive consumer-tracking software on users' computers. The WSJ examined the 50 most popular Web sites in the United States, which account for about 40 percent of total U.S. page views, to measure the quantity and capabilities of the trackers installed on a user's computer. The 50 sites installed a total of 3,180 tracking files on a test computer used in the study. Only one site, Wikipedia.org, installed no trackers, while 12 sites installed more than 100 tracking tools each. The companies that installed the most tracking tools were Google, Microsoft, and Quantcast; all three say they do not track individuals by name and offer users a way to remove themselves from their tracking networks. Some tracking programs can record and analyze a user's keystrokes for content, tone, and clues to social connections. Other tools allowed data-gathering companies to build personal profiles that could include age, gender, race, zip code, income, marital status, and health concerns, in addition to recent purchases and favorite TV shows and movies. The growing use of tracking technologies has begun to raise regulatory concerns, and Congress is considering laws that would limit the practice.
Wednesday, July 28, 2010
Blog: IBM Scientists Create Most Comprehensive Map of the Brain's Network
KurzweilAI.net (07/28/10)
IBM researchers have successfully mapped the long-distance network of the macaque monkey brain, a result with significant ramifications for reverse-engineering the brain and creating a network of cognitive computing chips. "We have collated a comprehensive, consistent, concise, coherent, and colossal network spanning the entire brain and grounded in anatomical tracing studies that is a stepping stone to both fundamental and applied research in neuroscience and cognitive computing," says IBM Almaden's Dharmendra S. Modha. The researchers concentrated on the long-distance network of nearly 400 brain regions and more than 6,600 long-distance brain connections that pass through the brain's white matter. The work builds on the publicly available Collation of Connectivity data on the Macaque brain (CoCoMac) database. The scientists' ranking of brain regions uncovered evidence suggesting that the prefrontal cortex is a functionally central part of the brain that might serve as an integrator and distributor of information. They determined that the brain network does not appear to be scale-free like Web social networks, but is exponential--a discovery that will help inform the design of a cognitive computing chip network's routing architecture.
Tuesday, July 27, 2010
Blog: More Accurate Than Heisenberg Allows?
Ludwig-Maximilians-Universität München (07/27/10)
Quantum cryptography is regarded as the most secure method of data encryption because it exploits the fact that transmitted information can only be quantified with a strictly limited degree of precision. Scientists at ETH Zurich and Ludwig-Maximilians-Universität (LMU) in Munich have now shown how the use of a quantum memory affects this uncertainty. "The result not only enhances our understanding of quantum memories, it also provides us with a method for determining the degree of correlation between two quantum particles," says ETH Zurich professor Matthias Christandl. "Moreover, the effect we have observed could yield a means of testing the security of quantum cryptographic systems." Quantum mechanics dictates that the measurement of a parameter can itself disturb a particle's state, an effect harnessed by quantum cryptography to encrypt data and thwart eavesdropping. The LMU and ETH Zurich teams have demonstrated that the result of a measurement on a quantum particle can be predicted with greater accuracy if information about the particle is stored in a quantum memory, which can consist of atoms or ions.
Monday, July 26, 2010
Blog: XML Pioneer Pitches Functional Programming for Concurrency
InfoWorld (07/26/10) Krill, Paul
XML co-inventor Tim Bray says that functional programming, rather than threads, is the best option for programmers developing code for multicore processors. Programming for multicore chips requires developers to deal with concurrency, which presents its own issues, Bray says. "It involves a lot of problems that are very difficult to think about and reason about and understand," he says. However, functional programming, made possible with languages such as Erlang and Clojure, offers a way to handle concurrency. Erlang was originally designed for programming massive telephone switches with thousands of processors. Bray says that although it has no classes, objects, or variables, it is "bulletproof and reasonably high-performance." Clojure is a Lisp dialect that runs on the Java Virtual Machine and compiles to straight Java bytecode, which makes it very fast, Bray notes. "This is a super, super high-performance language," he says.
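Bray's argument is easy to illustrate: when functions are pure and data immutable, there is no shared mutable state to protect with locks, so work can be fanned out across threads or cores freely. A minimal Python sketch of that style (the article's languages are Erlang and Clojure; the function names here are illustrative, not from the article):

```python
from concurrent.futures import ThreadPoolExecutor

def word_lengths(line):
    # A pure function: no shared state, no side effects, so it is
    # safe to run concurrently on any number of workers.
    return tuple(len(w) for w in line.split())

def parallel_map(fn, items, workers=4):
    # Immutable inputs and outputs make execution order irrelevant;
    # only the order of the collected results is fixed here.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return tuple(pool.map(fn, items))

print(parallel_map(word_lengths, ("to be or", "not to be")))
# ((2, 2, 2), (3, 2, 2))
```

The same shape scales to process pools or distributed workers precisely because nothing in `word_lengths` mutates shared data.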
Blog: Bringing Data Mining Into the Mainstream
New York Times (07/26/10) Lohr, Steve
A record number of corporate researchers and university scientists are attending an ACM conference on knowledge discovery and data mining, which offers papers and workshops that apply data mining to everything from behavioral targeting to cancer research. Although data mining has become a growth industry, profitably probing large data sets is still costly for companies and difficult for users. According to conference executive director Usama Fayyad, an institutional mindset that recognizes the value of data is needed to bring modern data mining into the business mainstream. The executive level must view data as a new strategic asset that can create revenue streams and businesses. Fayyad also says a translation layer of technology is needed to democratize modern data mining, and the underlying software for handling large data sets should be linked to software that ordinary people can use. Using Microsoft's Excel spreadsheet as a metaphor, Fayyad says the sophisticated data-handling layer should be "built in ways that Excel can consume the data and people can browse it."
Friday, July 23, 2010
Blog: Machine Science
Science (07/23/10) Vol. 329, No. 5990, P. 399; Evans, James; Rzhetsky, Andrey
Manually tracking all the published science relevant to a scientist's research is an impossible task, but it is predicted that computers capable of generating many helpful hypotheses with little human input will emerge within a decade. New computational tools can broaden the range of concepts and relations used for producing automated hypotheses by tapping a greater portion of the massive archive of published science, and by synthesizing new higher- and lower-order concepts and relations from the existing body of knowledge. Researchers can productively trim the multitude of low-quality hypotheses generated by a bigger pool of concepts and relations by employing a selection process that draws on insights into the social, cultural, and cognitive creation of science. The number of possible hypotheses could be vastly enlarged if researchers could computationally map concepts across different scientific communities' distinct languages. This would flag parallels in theories from different domains, as well as changes in meaning over time and multiple meanings. These distinctions could be computationally mined to uncover unique conceptual connections. By assigning priority to hypotheses containing concepts spanning existing scientific theories, cultures, and languages, investigators could profitably concentrate on the most novel ones.
Blog: Neurons to Power Future Computers
BBC News (07/23/10)
University of Plymouth computer scientists led by Thomas Wennekers are developing novel computers that mimic the way neurons are built and how they communicate. Neural-based computers could lead to improvements in visual and audio processing. "We want to learn from biology to build future computers," Wennekers says. "The brain is much more complex than the neural networks that have been implemented so far." The researchers are collecting data about neurons and how they are connected in one part of the brain. The project is focusing on the laminar microcircuitry of the neocortex, which is involved in higher brain functions such as seeing and hearing. Meanwhile, Manchester University professor Steve Furber is using the neural blueprint to produce new hardware. Furber's project, called Spinnaker, is developing a computer optimized to run like biology does. Spinnaker aims to develop innovative computer processing systems and insights into the way that several computational elements can be connected. "The primary objective is just to understand what's happening in the biology," Furber says. "Our understanding of processing in the brain is extremely thin."
Thursday, July 22, 2010
Blog: Data Mining Made Faster
University of Utah (07/22/10) Ferebee, Kate
University of Utah researchers have developed a new multidimensional scaling method that they say enables simpler, faster data mining. "The challenge of data mining is dealing with the dimensionality of the data and the volume of it," says Utah professor Suresh Venkatasubramanian. "What our approach does is unify into one common framework a number of different methods for doing this dimensionality reduction," which simplifies high-dimensional data, he says. Using multidimensional scaling to simplify multidimensional data is an attempt "to reduce the dimensionality of data by finding key attributes defining most of the behavior," Venkatasubramanian says. "Prior methods on modern computers struggle with data from more than 5,000 people. Our method smoothly handles well above 50,000 people." The researchers' new approach uses one set of instructions to perform a wide variety of multidimensional scaling that previously required separate instructions. It can handle large amounts of data because "rather than trying to analyze the entire set of data as a whole, we analyze it incrementally, sort of person by person," Venkatasubramanian says.
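The Utah framework itself is not spelled out in this summary, but the classical form of multidimensional scaling it unifies and generalizes fits in a few lines of NumPy. A textbook sketch, not the researchers' incremental algorithm: given an n-by-n matrix of pairwise distances, recover point coordinates whose distances reproduce it.

```python
import numpy as np

def classical_mds(D, k=2):
    """Embed an n-by-n distance matrix D as n points in k dimensions."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n      # centering matrix
    B = -0.5 * J @ (D ** 2) @ J              # double-centered Gram matrix
    vals, vecs = np.linalg.eigh(B)           # eigenvalues in ascending order
    order = np.argsort(vals)[::-1][:k]       # keep the top-k eigenpairs
    scale = np.sqrt(np.maximum(vals[order], 0))
    return vecs[:, order] * scale

# Three collinear points at mutual distances 1-1-2 embed exactly on a line.
D = np.array([[0., 1., 2.],
              [1., 0., 1.],
              [2., 1., 0.]])
X = classical_mds(D, k=1)
```

The quadratic-size distance matrix is exactly why naive implementations stall on large populations, and why an incremental, person-by-person formulation like the one described above matters.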
Wednesday, July 21, 2010
Blog: Protein From Poplar Trees Can Be Used to Greatly Reduce Size of Memory Elements and Increase the Density of Computer Memory
Hebrew University of Jerusalem (07/21/10)
Genetically engineered poplar-derived protein complexes have the potential to increase the memory capacity of future computers. Scientists from the Hebrew University of Jerusalem have combined protein molecules obtained from the poplar tree with memory units based on silica nanoparticles. The team genetically engineered the poplar protein to form hybrids with the silica nanoparticles. Attached to the inner pore of a stable, ring-like protein, the hybrids are arranged in a large array of very closely spaced molecular memory units. Professor Danny Porath and graduate student Izhar Medalsy have successfully demonstrated the approach. They say genetically engineered poplar-derived protein complexes could lead to systems that require much less space for memory and functional logic elements. The researchers say the approach to miniaturizing memory elements is cost-effective and could replace standard fabrication techniques.
Tuesday, July 20, 2010
Blog: Self-Sustaining Robot Has an Artificial Gut
PhysOrg.com (07/20/10)
British researchers have developed an autonomous robot with an artificial stomach that enables it to fuel itself by eating and excreting. Bristol Robotics Laboratory researchers designed the robot, called Ecobot III, so that it consumes partially processed sewage, using the nutrients within the mixture for fuel and excreting the remains. The robot also drinks water to maintain power generation. The meal is processed by 24 microbial fuel cells (MFCs), which are held in a stack of two tiers in the robot's body. Undigested matter passes via a gravity feed to a central trough from which it is pumped back into the feeder tanks to be reprocessed in order to extract as much of the available energy as possible. The bacteria in the MFCs metabolize the organic mixture, producing hydrogen atoms in the process, which help produce an electric current. The robot has maintained itself unaided for up to seven days, but is so far extremely inefficient, using only one percent of the energy available within the food.
Monday, July 19, 2010
Blog: Passwords That Are Simple--and Safe
Technology Review (07/19/10) Garfinkel, Simson
Microsoft researchers have developed a new approach to creating passwords that retains the security of complex passwords but does away with their complexity requirements. The method ensures that no more than a few users can have the same password, which has a similar effect on overall security when employed by organizations with millions of users. The system counts how many times any user on the service chooses a given password, and when more than a small number of users pick a password, it is banned and no one else is allowed to choose it. "Replacing password creation rules with popularity limitations has the potential to increase both security and usability," write Microsoft researchers Cormac Herley and Stuart Schechter in a paper to be published at the upcoming Hot Topics in Security conference. "Since no passwords are allowed to become too common, attackers are deprived of the popular passwords they require to compromise a significant fraction of accounts using online guessing."
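The counting scheme described above can be sketched with a simple counter. This is a toy illustration only: Herley and Schechter's proposal is designed for services with millions of accounts (using compact probabilistic counting rather than a plain table), and no real system should hold plaintext passwords in memory.

```python
from collections import Counter

class PopularityPasswordPolicy:
    """Ban any password once too many users have chosen it."""

    def __init__(self, limit=3):
        self.limit = limit
        self.counts = Counter()

    def try_register(self, password):
        # A real deployment would count over salted hashes or a
        # count-min sketch, never plaintext strings like this.
        if self.counts[password] >= self.limit:
            return False          # too popular: reject and ban
        self.counts[password] += 1
        return True

policy = PopularityPasswordPolicy(limit=2)
print([policy.try_register("123456") for _ in range(3)])
# [True, True, False]
```

Once a password hits the limit, an attacker's best online-guessing strategy (trying the most popular passwords first) loses its payoff, which is the core of the argument quoted above.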
Blog: Virtual Universe Study Proves 80 Year Old Theory on How Humans Interact
Imperial College London (07/19/10) Goodchild, Lucy
A study by Imperial College London (ICL), in conjunction with the Medical University of Vienna and the Santa Fe Institute, offers large-scale evidence for the psychological theory known as Structural Balance Theory (SBT), which holds that some relationships are more stable than others in a society. In the ICL study, information about interactions between players in a virtual universe game called Pardus is more detailed than that from other electronic sources because it includes data on the types of relationships and whether they are positive or negative. The research shows that positive relationships form stable networks in society, confirming SBT at scale for the first time. "Our new study reveals in more detail than ever before the key ingredients that make these networks stable," says ICL's Renaud Lambiotte. The researchers analyzed the data from the game by examining individual networks and the interplay between all the networks. The researchers are now using the tools developed for the study to examine large, complex networks for patterns of communication between millions of people using mobile phone data.
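Structural balance has a crisp computational statement: a triangle of relationships is balanced when the product of its edge signs is positive. A small Python sketch of that test (illustrative only; the study's actual analysis over the Pardus network is far larger and richer):

```python
from itertools import combinations

def balanced_fraction(signs):
    """signs maps an undirected edge (a, b), with a < b, to +1 (friendly)
    or -1 (hostile). A triangle is balanced when the product of its three
    edge signs is positive: 'the friend of my friend is my friend' and
    'the enemy of my enemy is my friend' are both balanced patterns."""
    nodes = sorted({n for edge in signs for n in edge})
    balanced = total = 0
    for a, b, c in combinations(nodes, 3):
        edges = [(a, b), (a, c), (b, c)]
        if all(e in signs for e in edges):   # only count closed triangles
            total += 1
            if signs[(a, b)] * signs[(a, c)] * signs[(b, c)] > 0:
                balanced += 1
    return balanced / total if total else 0.0

# Two friends who share an enemy form a balanced triangle ...
print(balanced_fraction({(1, 2): +1, (1, 3): -1, (2, 3): -1}))  # 1.0
# ... while one hostile tie inside a friendly trio is unbalanced.
print(balanced_fraction({(1, 2): +1, (1, 3): +1, (2, 3): -1}))  # 0.0
```

SBT predicts that, over time, real social networks drift toward a high balanced fraction, which is the pattern the Pardus data let the researchers observe directly.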
Tuesday, July 13, 2010
Blog: Crunching Cancer With Numbers
New Scientist (07/13/10) Buchen, Lizzie
In 2009, the U.S. National Cancer Institute recruited scientists from a broad range of disciplines to apply their expertise and computational tools toward the discovery of simple laws governing the fate of cancer cells. This differs from the molecular-level approach that cancer research has concentrated on for the last several decades. The researchers are testing a series of interlocking computational models they have devised from fundamental precepts to describe and predict different aspects of cancer. Within five years they hope to have a single, all-encompassing model of mouse lymphoma that seamlessly aligns with the data. The ultimate goal is to generate a model that can anticipate an individual's response to various combinations of cancer therapies by feeding it with key parameters, such as gender, blood pressure, and genetic sequences. One of the researchers engaged in this effort is Paul Newton, project leader at the Scripps Research Institute's physical sciences oncology center. His goal is to unlock the underlying mechanics of metastasis by deconstructing the process into simple steps that can each be modeled using equations.
Tuesday, July 6, 2010
Blog: Cloud security: Google won't like the enterprise view, neither will Facebook
By Dennis Howlett | July 6, 2010, 6:00am PDT
I spent pretty much the whole of last week at the grandly titled Cloud Computing World Forum. Apart from chairing three sessions and participating in another, I wanted to get a feel for what EU people really think about this topic du jour.
The panels gave me an opportunity to explore issues around security/privacy with people who are living the issues as end-user representatives. Here's a potted round-up of views from some of those sitting in the check-writing seats:
- Generally, panelists are interested in and actively exploring opportunities presented by cloud services though they have plenty of caveats.
- Gordon Penfold, CTO of British Airways, for example explained that, as a representative of an industry we all love to hate, data location is a sensitive issue.
- Mary Hensher, CIO and IT partner, Deloitte UK and Switzerland, said her company has regulatory requirements that demand data be stored for up to eight years. She is under pressure to utilize cloud storage but is skeptical about whether such methods will allow her business to remain compliant. Many of the documents and emails that pass between clients and Deloitte are commercially sensitive, and having them subject to possible scrutiny by virtue of their storage location is not a risk the firm is prepared to take without a full understanding of what happens in a cloud environment.
- Robert Johnson, head of front office technology at Mitsubishi UFJ Securities International, was adamant that while he is prepared to consider cloud for many types of data, counterparty data is off limits.
- Miles Gray, hardware solutions architect, UK National Health Service argued that understanding how identity management impacts the various systems being integrated in the cloud is taking on greater importance.
Friday, July 2, 2010
Blog: International Conference Confronts Data Deluge
Scientific Computing (07/02/10)
At the recent International Conference on Scientific and Statistical Database Management, scientific domain experts, database researchers, practitioners, and developers discussed concepts, tools, techniques, and architectures for scientific and statistical database applications. Yale University professor Daniel Abadi says that as market demand for analyzing data sets of increasing variety and size continues to grow, the software options for performing this analysis also are starting to spread. Abadi says Yale is developing a hybrid database system called HadoopDB designed to combine the advantages of parallel databases and MapReduce-based systems. Microsoft's Roger Barga says that data-intensive scalable computing methods are expected to play a more important role in providing support for well-informed technical decisions and policies. He says that researchers and decision-makers increasingly are requiring new combinations of data, more sophisticated data analysis methods, and better ways to present results.
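The MapReduce model that HadoopDB combines with parallel databases boils down to two phases: a mapper emits key-value pairs, and a reducer aggregates them by key. A single-process toy sketch of the model (not Hadoop or HadoopDB themselves), using word counting as the classic example:

```python
from collections import defaultdict
from itertools import chain

def map_phase(docs):
    # Mapper: emit a (word, 1) pair for every word in every document.
    return chain.from_iterable(((w, 1) for w in d.split()) for d in docs)

def reduce_phase(pairs):
    # Reducer: group the emitted pairs by key and sum their counts.
    totals = defaultdict(int)
    for key, value in pairs:
        totals[key] += value
    return dict(totals)

counts = reduce_phase(map_phase(["big data", "big analysis"]))
print(counts)  # {'big': 2, 'data': 1, 'analysis': 1}
```

Because the mapper is stateless and the reducer only needs pairs grouped by key, both phases parallelize across machines, which is what makes the model attractive for the data volumes discussed at the conference.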