Surveillance and privacy
Discussion of issues related to liberty and privacy, and especially how they are affected by and interrelated with data management and analytic technologies. Related subjects include:
Petabyte-scale data management
Privacy, censorship, and freedom (in The Monash Report)
8 not very technical problems with analytic technology
In a couple of talks, including last Thursday’s, I’ve rattled off a list of eight serious problems with analytic technology, all of them human or organizational much more than purely technical. At best, these problems stand in the way of analytic success, and at least one is a lot worse than that.
The bulleted list in my notes is:
- Individual-human
  - Expense of expertise
  - Limited numeracy
- Organizational
  - Limited budgets
  - Legacy systems
  - General inertia
- Political
  - Obsolete systems
  - Clueless lawmakers
  - Obsolete legal framework
I shall explain. Read more
Categories: Analytic technologies, Business intelligence, Data integration and middleware, Data warehousing, EAI, EII, ETL, ELT, ETLT, Surveillance and privacy | Leave a Comment |
Big Brother watching our parents?
Life as an elderly person can have Kafkaesque aspects. For example, whether you are allowed to continue to live independently in your own apartment can depend upon whether you are trusted to follow orders for your own good in areas such as:
- Taking medication
- Walking with proper care
- Keeping your feet elevated to let various medical conditions heal
Similarly, it can depend upon whether you are deemed likely, for whatever reason, to fall.
Note: All these examples are taken directly from my family’s very recent experience, although at the immediate time we have bigger problems than that.
This raises the subject of how the elderly can be provided with precious additional months or years of independent living when constantly attentive in-home nursing assistance isn’t affordable. Well, it won’t be long before technology can monitor all of those subjects and more, via a variety of video, audio, tactile, or motion-detecting sensors. In other words, an utter Big Brother set-up may be what allows the elderly some continued freedom.
Putting it that way illustrates that there are huge reasons to invent and commercialize this kind of technology. But clearly, once invented and deployed, that technology would be horrifically easy to abuse. That’s just one more reason we really, really need to get our collective liberty and privacy act together.
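For concreteness, here is a minimal, purely hypothetical sketch (in Python, with invented thresholds and data) of one small piece of such a monitoring set-up: a crude fall detector watching a stream of accelerometer readings. A real product would be far more sophisticated, which is exactly what makes this kind of technology both valuable and easy to abuse.

```python
# Purely illustrative, hypothetical sketch of the kind of monitoring the post
# anticipates: a crude fall detector over a stream of wearable-accelerometer
# readings (thresholds and data invented for illustration).

import math

def detect_fall(samples, impact_g=2.5, stillness_g=1.1, still_window=20):
    """Flag an impact spike followed by a sustained period of little movement.

    samples: list of (x, y, z) acceleration readings in g, at a fixed rate.
    """
    magnitudes = [math.sqrt(x*x + y*y + z*z) for x, y, z in samples]
    for i, m in enumerate(magnitudes):
        if m >= impact_g:  # sharp impact
            after = magnitudes[i + 1:i + 1 + still_window]
            if len(after) == still_window and all(a <= stillness_g for a in after):
                return True  # impact followed by lying still: alert a caregiver
    return False

# Normal motion hovers near 1 g; a spike followed by near-stillness triggers the alert.
quiet = [(0, 0, 1.0)] * 30
fall = quiet + [(2.0, 1.5, 1.8)] + [(0, 0, 0.95)] * 25
print(detect_fall(quiet), detect_fall(fall))  # False True
```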
Related links
- A 2003 post speculating about multiple uses to which home monitoring technology could be put.
- A couple of academic papers about home/health monitoring kinds of technology
Categories: Surveillance and privacy | 2 Comments |
I’ll be speaking in Washington, DC on May 6
My clients at Aster Data are putting on a series of conferences called “Big Data Summit(s)”, and wanted me to keynote one. I agreed to the one in Washington, DC, on May 6, on the condition that I could open with the same liberty and privacy themes I started my New England Database Summit keynote with. Since I already knew Aster to be one of the several companies in this industry that are responsibly concerned about the liberty and privacy threats we’re all helping cause, I expected them to agree to that condition immediately, and indeed they did.
On a rough-draft basis, my talk concept is:
Implications of New Analytic Technology in four areas:
- Liberty & privacy
- Data acquisition & retention
- Data exploration
- Operationalized analytics
I haven’t done any work yet on the talk besides coming up with that snippet, and probably won’t until the week before I give it. Suggestions are welcome.
If anybody actually has a link to a clear discussion of legislative and regulatory data retention requirements, that would be cool. I know they’ve exploded, but I don’t have the details.
Categories: Analytic technologies, Archiving and information preservation, Aster Data, Data warehousing, Presentations, Surveillance and privacy | 1 Comment |
Information found in public-facing social networks
Here are some examples illustrating two recent themes of mine, namely:
- Easily-available information reveals all sorts of things about us.
- Graph-based analysis is on the rise.
Pete Warden scraped all of Facebook’s social graph (at least for the United States), and put up a really interesting-looking visualization of same. Facebook’s lawyers came down on him, and he quickly agreed to destroy the data he’d scraped, but he also published ideas on how other people could duplicate his work.
Warden has since given an interview in which he outlines some of the things researchers hoped to do with this data: Read more
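To make the first theme concrete, here is a minimal, hypothetical Python sketch (all names and data invented) of the kind of inference even a crude analysis of a scraped social graph supports: guessing an attribute a user never declared from what that user’s friends declare.

```python
# Minimal, hypothetical sketch (names and data invented): one reason scraped
# social-graph data is sensitive. Even if a user declares nothing about
# themselves, simple graph analysis, here a majority vote over friends'
# declared attributes, often predicts the missing attribute (homophily).

from collections import Counter

# Toy friendship graph: user -> set of friends.
friends = {
    "alice": {"bob", "carol", "dave"},
    "bob":   {"alice", "carol"},
    "carol": {"alice", "bob", "dave"},
    "dave":  {"alice", "carol", "erin"},
    "erin":  {"dave"},
}

# Publicly declared home cities; "dave" declares nothing.
declared_city = {"alice": "Boston", "bob": "Boston",
                 "carol": "Boston", "erin": "Chicago"}

def guess_attribute(user, graph, declared):
    """Guess an undeclared attribute as the most common value among friends."""
    votes = Counter(declared[f] for f in graph[user] if f in declared)
    return votes.most_common(1)[0][0] if votes else None

print(guess_attribute("dave", friends, declared_city))  # -> "Boston"
```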
Categories: Analytic technologies, Facebook, RDF and graphs, Surveillance and privacy | 1 Comment |
The retention of everything
I’d like to reemphasize a point I’ve been making for a while about data retention: Read more
Categories: Archiving and information preservation, Surveillance and privacy, Web analytics | 3 Comments |
Liberty and privacy, once again
I’ve long argued three points:
- It is inevitable* that governments and other constituencies will obtain huge amounts of information, which can be used to drastically restrict everybody’s privacy and freedom.
- To protect against this grave threat, multiple layers of defense are needed, technical and legal/regulatory/social/political alike.
- One particular layer is getting insufficient attention, namely restrictions upon the use (as opposed to the acquisition or retention) of data.
*And indeed in many ways even desirable
I surprised people by leading with the liberty/privacy subject at my New England Database Summit keynote; considerable discussion ensued, largely supportive. I hope for a similar outcome when I keynote the Aster Big Data Summit in Washington, DC in May. And I expect to do even more to advance the liberty/privacy discussion as 2010 unfolds.
Fortunately, I’m not the only one thinking or talking about these liberty/privacy issues. Read more
Data-based snooping — a huge threat to liberty that we’re all helping make worse
Every year or two, I get back on my soapbox to say:
- Database and analytic technology, as they evolve, will pose tremendous danger to individual liberties.
- We in the industry who are creating this problem also have a duty to help fix it.
- Technological solutions alone won’t suffice. Legal changes are needed.
- The core of the needed legal changes is tight restrictions on governmental use of data, because relying on restrictions about data acquisition and retention clearly won’t suffice.
But this time I don’t plan to be so quick to shut up.
My best writing about the subject of liberty to date is probably in a November, 2008 blog post. My best public speaking about the subject was undoubtedly last Thursday, early in my New England Database Summit keynote address; I got a lot of favorable feedback on that part from the academics and technologists in attendance.
My emphasis is on data-based snooping rather than censorship, for several reasons:
- My work and audience are mainly in the database and analytics sectors. Censorship is more a concern for security, networking, and internet-technology folks.
- After censorship, I think data-based snooping is the second-worst technological threat to liberty.
- In the US and other fairly free countries, data-based snooping may well be the #1 threat.
Categories: Analytic technologies, Data warehousing, Presentations, Surveillance and privacy | 8 Comments |
When people don’t want accurate predictions made about them
In a recent article on governmental anti-terrorism data mining efforts — and the privacy risks associated with same — The Economist wrote (emphasis mine):
Abdul Bakier, a former official in Jordan’s General Intelligence Department, says that tips to foil data-mining systems are discussed at length on some extremist online forums. Tricks such as calling phone-sex hotlines can help make a profile less suspicious. “The new generation of al-Qaeda is practising all that,” he says.
Well, duh. Terrorists and fraudsters don’t want to be detected. Algorithms that rely on positive evidence of bad intent may work anyway. But if you rely on evidence that shows people are not bad actors, that’s likely to work about as well as Bayesian spam detectors.* Read more
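To spell out the analogy, here is a minimal, illustrative Python sketch (toy data, hand-rolled naive Bayes) of how a classifier that treats innocuous behavior as exculpatory evidence can be gamed by padding a profile with exactly that behavior, which is the same trick spammers use against Bayesian filters.

```python
# Minimal, illustrative sketch (toy data, invented): how a naive Bayes text
# classifier can be nudged toward "innocent" by padding a suspicious message
# with benign tokens, the weakness that lets spammers defeat Bayesian filters.

from collections import Counter
import math

# Toy training data: documents per class.
suspicious_docs = ["wire funds overseas account urgent",
                   "urgent transfer funds account"]
benign_docs = ["family dinner photos weekend",
               "phone sex hotline call tonight",   # "innocent-looking" behavior
               "weekend photos call family"]

def train(docs):
    counts = Counter()
    for d in docs:
        counts.update(d.split())
    return counts

def log_likelihood(message, counts, vocab):
    # Multinomial naive Bayes with add-one smoothing.
    total = sum(counts.values())
    return sum(math.log((counts[t] + 1) / (total + len(vocab)))
               for t in message.split())

sus, ben = train(suspicious_docs), train(benign_docs)
vocab = set(sus) | set(ben)

def suspicion_score(message):
    # Log-odds of "suspicious" vs. "benign" (equal priors assumed).
    return log_likelihood(message, sus, vocab) - log_likelihood(message, ben, vocab)

original = "wire funds overseas urgent"
padded = original + " phone sex hotline call family photos weekend"

print(suspicion_score(original))  # clearly positive: looks suspicious
print(suspicion_score(padded))    # padding with benign tokens drags the score down
```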