This article, about using cognitive mapping technologies to mine mass email archives with the goal of identifying "useful" information and "persons of interest," is both fascinating and terrifying. It's beyond doubt that this kind of technology is already being used by the three-letter agencies, and can there be any question that the Chinese government is using such technologies to monitor its own people?
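To make the idea concrete: one crude way such a system might flag "persons of interest" is to treat the archive as a contact graph and rank people by how central they are to the message flow. The snippet below is a minimal sketch of that approach, not the article's actual method; the toy `MESSAGES` data and the degree-centrality heuristic are my own illustrative assumptions.

```python
from collections import Counter

# Hypothetical toy archive: (sender, recipients) pairs standing in for
# parsed email headers. A real system would parse an mbox or Maildir
# and would use far richer signals than raw message counts.
MESSAGES = [
    ("alice", ["bob", "carol"]),
    ("bob", ["alice"]),
    ("carol", ["alice", "dave"]),
    ("dave", ["alice"]),
    ("alice", ["dave"]),
]

def contact_graph(messages):
    """Count directed sender -> recipient edges across the archive."""
    edges = Counter()
    for sender, recipients in messages:
        for recipient in recipients:
            edges[(sender, recipient)] += 1
    return edges

def persons_of_interest(messages, top=2):
    """Rank people by total message volume (sent + received) -- a crude
    degree-centrality proxy for 'who sits at the center of the network'."""
    degree = Counter()
    for (sender, recipient), count in contact_graph(messages).items():
        degree[sender] += count
        degree[recipient] += count
    return [name for name, _ in degree.most_common(top)]

print(persons_of_interest(MESSAGES, top=1))  # alice dominates the toy graph
```

Even this toy version shows why the technology cuts both ways: the same centrality ranking that helps an investigator untangle a fraud case works just as well for monitoring employees or dissidents.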
I would also be extremely surprised if this technology is not commercialized in short order. Certainly law enforcement agencies would love to have COTS software that could help them research complex cases. No doubt some companies will use this technology to improve marketing techniques. (Gmail, anyone? Are we so sure that Google will continue to "do no evil"?) But most sinister, and quite likely: a commercialized version of this technology may well find customers in enterprises seeking "to improve management and oversight" of knowledge workers.
If I didn't find the technology so essentially disturbing, I'd go start a company to do this stuff right now.
Like other massive technological shifts, the rise of the network society, with the Internet's ability to aggregate information that has hitherto been dispersed and unassembled, offers Janus-faced moral and political implications. On the one hand, this information-aggregation effect can (and will) be used to massively increase surveillance and centralized control. On the other hand, it can (and will) be used for much more democratic purposes: as a way to collect and understand individual views that might otherwise remain inchoate, or to find like-minded individuals on narrow topics with whom one can collaborate.
Almost all epochal technological breakthroughs have this moral ambiguity, or more accurately, double moral possibility. Such breakthroughs enable new worlds, and the question is, will those new worlds be more or less moral than the one we live in now? The answer to that depends on the moral characters of the people who master the technologies themselves.