Data Mining: Consumers’ Convenience, Privacy’s End
12/09/2007 by Monika Ermert for Intellectual Property Watch

LINZ, AUSTRIA – How distant is the realisation of the “prevision” and prevention culture portrayed in the Steven Spielberg movie Minority Report? Telling the future is not as far off as critical minds might hope, US author and activist Brian Holmes warned at Ars Electronica, a traditional gathering of media artists and media researchers held in Linz, Austria last week.

Data mining and profiling in order to anticipate every move of the consumer – or of the suspicious citizen – are part of the business of the advertising industry and of terrorist hunters, respectively.

“Goodbye Privacy” was chosen as the main conference topic for Ars Electronica this year amidst growing concerns about the erosion of privacy and data protection by new European Union laws such as the much-debated EU Directive on Data Retention and new antiterrorism laws.

“Our movements, our speech, our emotions and even our dreams have become the informational message that is incessantly decoded, probed and reconfigured into statistical silhouettes, serving as targets for products, services, political slogans or interventions of the police,” said Holmes.
Dual-use technology financially supported by the United States government allows ever more precise mapping, profiling and predicting of the individual, he said. Holmes pointed for example to InferX real-time analytical software, a data-mining tool “made specifically for the Department of Homeland Security.” The software “inserts an ‘InferAgent’ program into the computer systems of institutions or corporations – banks, airports, ticketing agencies, harbour authorities, subways, department stores, etcetera – and then uses networked queries to perform real-time pattern recognition,” he said. Whenever there is a deviation from “normal patterns,” the software can flag it.

Another example quoted by Holmes was the Personicx customer relationship management system, which, according to a US Democratic Party campaigner, allowed precise targeting of the voter audience. “If I want to sit at my desk, pull up on the screen the state of Ohio, and say, ‘who in Ohio says that education is going to be the number one issue they’re going to vote on,’ six seconds later, 1.2 million names will pop up,” the campaigner said, according to Holmes.

Helen Nissenbaum, professor at New York University’s Department of Culture and Communication and one of the authors of the concept of contextual integrity, asked several times why one should worry about sorting done for economic reasons or for the enforcement of accepted rules. Contextual integrity, according to Nissenbaum’s concept, “ties adequate protection for privacy to norms of specific contexts, demanding that information gathering and dissemination be appropriate to that context and obey the governing norms of distribution within it.”

The director of the Surveillance Project at Canada’s Queen’s University, David Lyon, on the other hand warned against social sorting that is regularly opaque to the individual. People simply do not know whether they were put on hold when calling their bank because the company attached a negative score to their name.
In his opinion, what is wrong with sorting is that “we do not know the basis on which the lives of whole groups of people are changed.” Appeals against sorting are impossible, he added. The problem, Lyon said, “is not the lack of privacy, but the lack of public spaces where we can see and be seen without suspicion or pre-judgement.” Activists as well as legal experts see a grave danger in the trend to use once-stored data for ever more extensive investigation of individuals, beyond the original intent.

In Linz, several strategies for reaction were presented by artists, activists and experts. MediaShed, a UK initiative that allows people to experiment with free media, exemplified one possibility: a group of unemployed, homeless youngsters hacked the radio signals of public video surveillance cameras to create their own videos. The project did not undo the surveillance, but it made it much more transparent.

Transparency and civilian counter-surveillance is also a core motif for Slovenian media activist and artist Marko Peljhan. The Ars Electronica-featured artist presented, for example, his video-equipped unmanned aerial vehicle. It allows observation of areas from which citizens are kept out, or the creation of one’s own feed about a protest march and the police action there. Peljhan said drones will soon be used by every police and military unit. “We have to participate in the technology from the beginning, have to be there when the governments start legislating this,” said Peljhan.

Privacy DRMs

Another strategy described by several panellists at Ars Electronica is the use of digital rights management technology to protect one’s personal data. Yet “privacy DRM” has its problems, said Viktor Mayer-Schönberger, professor at the Kennedy School of Government at Harvard University.
“One problem is that so far DRM systems have always been easy to hack,” said Mayer-Schönberger, “and it is more painful for you when your personal data is out there than if there is an illegal copy of Pocahontas.” The other problem is that privacy DRM would have to capture every move of its user, making it an extremely sensitive data collection in itself, he said.

Mayer-Schönberger proposed something much simpler to act against the erosion of privacy. “Now, forgetting is the exception, and the storing of data is the rule. Let’s turn this around and make forgetting the data the default.” The idea is that, like milk or eggs, data stored by the state and/or companies should have a ‘best before’ date. Because, as one representative of a large telecommunications company put it: “The best way to protect data is to delete them. Deleted data are the best protected data by any means.”

Monika Ermert may be reached at firstname.lastname@example.org.

“Data Mining: Consumers’ Convenience, Privacy’s End” by Intellectual Property Watch is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.