Category: surveillance

To measure the policy positions of every single person

Earlier this month I wrote about the “Rule of Suspicion Algorithms”. Using computer expert systems to predict who is more or less likely to become a criminal or a political dissident is not so different from predicting people’s policy positions. Michael Laver, an authority on computer-aided quantitative content analysis in political science from New York University, is enthusiastic about the prospects that the large new data troves generated by users themselves hold for political science data analysis:

There is no reason, for example, why we should not set out to measure the policy positions of every single person who uses social media and, with appropriate modeling, to make inferences from these positions about people who do not use social media.
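To make this a little more concrete, here is a minimal sketch in the spirit of the Wordscores approach that Laver is associated with (Laver, Benoit and Garry 2003): reference texts with known positions are used to score words, and an unseen text, say a social-media post, is then placed on the same scale. Every text and position below is invented for illustration; a serious analysis would need large corpora, validation, and the “appropriate modeling” Laver mentions.

```python
# Illustrative Wordscores-style sketch; all texts and positions are invented.
from collections import Counter

reference_texts = {
    "left_manifesto":  "raise taxes fund public healthcare public schools",
    "right_manifesto": "cut taxes shrink government free markets free enterprise",
}
reference_positions = {"left_manifesto": -1.0, "right_manifesto": +1.0}

def word_frequencies(text):
    counts = Counter(text.split())
    total = sum(counts.values())
    return {word: n / total for word, n in counts.items()}

# Step 1: score each word by where it tends to occur among the reference texts.
ref_freqs = {name: word_frequencies(t) for name, t in reference_texts.items()}
vocabulary = {w for freqs in ref_freqs.values() for w in freqs}
word_scores = {}
for w in vocabulary:
    total = sum(freqs.get(w, 0.0) for freqs in ref_freqs.values())
    word_scores[w] = sum(
        reference_positions[name] * freqs.get(w, 0.0) / total
        for name, freqs in ref_freqs.items()
    )

# Step 2: place an unseen ("virgin") text on the same left-right scale as the
# frequency-weighted mean of the scores of the words it shares with the references.
def score_text(text):
    freqs = word_frequencies(text)
    shared = {w: f for w, f in freqs.items() if w in word_scores}
    if not shared:
        return 0.0
    total = sum(shared.values())
    return sum(word_scores[w] * f / total for w, f in shared.items())

print(score_text("we must cut taxes and trust free markets"))   # leans right
print(score_text("fund public healthcare and public schools"))  # leans left
```

The point is simply that, once suitable reference material exists, any stream of text can in principle be placed on such a scale.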

While this prospect is indeed exciting, from a normative perspective concerned with the quality of democracy I would add that it matters whether such information is generated by academics to inform academic debate and the wider public, or whether it will only inform a few, such as security services and corporations. If information about the many is accessible to the many, in aggregated form, societies may reach a higher degree of self-understanding, on the basis of a symmetric distribution of information. An asymmetric distribution, on the other hand, would diminish the quality of democracy by granting a limited part of the population privileged access to information that gives them opportunities to manipulate opinion and perception, from the macro to the micro scale.

How such kinds of information are used will likely become a defining feature of politics over the years to come.

 

The rule of “suspicion algorithms”

The decision-making criteria of computer expert systems are often so complex that they are beyond the comprehension of their individual users and creators. For example, computer systems equipped with artificial intelligence can be used for estimating the degree to which somebody is likely to default on her credit. These days, computer programs can also be used for the large-scale monitoring of populations and for attempts at predicting who is more or less likely to become a criminal or a political dissident.
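To illustrate the kind of opacity at stake, here is a deliberately toy sketch, not anybody’s actual system: a random-forest model trained on invented data to estimate a default risk. Its aggregate behaviour can be summarised, but there is no crisp, human-readable rule behind any individual score.

```python
# Toy illustration of an opaque statistical scoring system.
# All data is randomly generated; nothing resembles a real credit or
# threat-assessment system. Requires numpy and scikit-learn.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 8))          # eight invented applicant features
y = (X[:, 2] + 0.5 * X[:, 5] + rng.normal(scale=0.5, size=1000) > 0).astype(int)

model = RandomForestClassifier(n_estimators=300).fit(X, y)

applicant = rng.normal(size=(1, 8))     # one new, invented applicant
print("estimated default risk:", model.predict_proba(applicant)[0, 1])

# The model exposes aggregate feature importances, but no human-readable rule
# that explains why this particular applicant received this particular score.
print("feature importances:", model.feature_importances_.round(3))
```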

When computers start to decide who is likely to be a threat and who isn’t, and neither secret services, law enforcement nor the subjects of surveillance understand how a threat assessment comes about, the shared understanding of what constitutes suspicious behaviour gets lost. Writing in the Intercept, Dan Froomkin cites Phillip Rogaway, a professor of computer science at the University of California, Davis:

If the algorithms NSA computers use to identify threats are too complex for humans to understand, Rogaway wrote, “it will be impossible to understand the contours of the surveillance apparatus by which one is judged. All that people will be able to do is to try your best to behave just like everyone else.”

One can still find it reasonable to use such computer systems even if people no longer understand the criteria by which they are judged. Yet the “suspicion algorithms” themselves no longer express human reasoning. People become subject to governance by statistical probabilities instead of human value choices. The computers may not rule, as they do not yet possess true agency. Still, humans delegate their assessment of who is an insider and who an outsider, of who is a friend and who a potential foe, to systems whose calculations are beyond their comprehension. Reasoning about an essentially political decision is transferred to machines.

The data is there, the algorithms set in place. An ethics of the data age has yet to emerge.

How Apple keeps track of the programs you install on your Mac

It probably doesn’t come as news, but I got curious why a little program called gkoverride wanted to call someone through my firewall when I tried to install a patch for SPSS. I found a pretty good explanation on Zdziarski’s Blog of Things: Apple basically checks new program installations for security purposes. However, without asking, it also keeps track of the programs users install, in the name of security. Most users probably care more about security than about privacy, but I think this should be made clearer and more transparent, and it would be important to also provide alternative security mechanisms, as Zdziarski suggests.
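For readers who want to see what Gatekeeper itself thinks of a program, macOS ships a command-line front end to it, spctl, which runs the assessment locally. A minimal Python wrapper might look like the sketch below; the Safari path is only an example, and the exact output format varies between macOS versions.

```python
# Minimal sketch: ask Gatekeeper locally whether it would allow an app to run.
# Works only on macOS, where the spctl command-line tool is available.
import subprocess

def gatekeeper_assess(app_path):
    """Return (allowed, details) from Gatekeeper's assessment of app_path."""
    result = subprocess.run(
        ["spctl", "--assess", "--verbose", app_path],
        capture_output=True, text=True,
    )
    details = (result.stdout + result.stderr).strip()
    return result.returncode == 0, details

allowed, details = gatekeeper_assess("/Applications/Safari.app")
print("accepted" if allowed else "rejected", "-", details)
```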

A browser extension that exposes the role money plays in the US Congress

Which moneyed interests support a politician? It would clearly enhance the politics section of any newspaper if that type of contextual information could be presented alongside news articles that feature the words and voting behaviour of elected representatives. That’s probably exactly what a teenager in the USA was thinking when he developed a browser plug-in that, “when you mouse-over the name of a US lawmaker, will serve up a list of which parties have donated to their campaign funds, and the quantities”.

[Screenshot: the Greenhouse browser extension]

One could think of many interesting extensions or alternative applications. For example, one could adapt it to other polities by drawing on datasets from other countries, or one could switch the perspective from lawmakers to firms and present information on firms’ lobbying history via mouse-overs.
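The data side of such an adaptation is conceptually simple. As a rough sketch, one can load contribution records from a flat file and build a lookup from a name to the entries a mouse-over would display; the file name and columns below are invented placeholders, and real projects would have to draw on official disclosure data.

```python
# Rough sketch of the lookup an extension like this needs behind the scenes.
# "contributions.csv" and its column names are invented placeholders.
import csv
from collections import defaultdict

def load_contributions(path):
    """Build {lawmaker_name: [(donor, amount), ...]} from a flat CSV file."""
    table = defaultdict(list)
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            table[row["lawmaker"]].append((row["donor"], float(row["amount"])))
    return table

contributions = load_contributions("contributions.csv")

# What a mouse-over on a lawmaker's name would need to display, largest first:
for donor, amount in sorted(contributions.get("Jane Doe", []), key=lambda x: -x[1]):
    print(f"{donor}: {amount:,.0f}")
```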

In their book “Full Disclosure: The Perils and Promise of Transparency”, Fung, Graham, and Weil (2007) call this type of emergent transparency “collaborative transparency”. In the age of big data, ubiquitous information technology and smart kids, this is going to stay exciting for a long time to come.

WhatsFace? Still not liking it.

Today it was in the news: Facebook has snapped up WhatsApp. I have never really felt comfortable communicating privately via Facebook. I feel Facebook knows too much about us, while we know too little about what Facebook knows about us.


Now WhatsApp. When my old smartphone broke a few years ago, I didn’t really bother to replace it. I’m sort of glad that it spared me the choice of installing WhatsApp. I thought, well, surely the technology is great, but here comes yet another monopolising service that snaps up all our data and can pass it on to people who run network analyses on my friends and colleagues and content analyses on my messages.

Of course, Facebook has much of that data already. I never believed you could trust them. But well, nearly everybody uses it, and in a way we are all in this together, generation “friend” and “like” and “tag”. I just didn’t like the idea of once again succumbing to the seduction of a company that makes money by me connecting to it, by me pouring information about me and everyone around me into it. Now it’s basically one company. I still don’t like it.

There are other, more private and secure services as well. But when you load them it feels like an empty corridor. No one is there. You hear your own echo. Still, I don’t wanna give up on it.

You can set up free and encrypted chats on your mobile phone with apps for Android and iOS. You can also connect to them from Windows, Mac and Linux computers. It takes some more effort than WhatsApp and might not develop as fast. But you won’t be bugged so easily either. You will be free. At least a little bit.

Breathe.

NSA’s growing distrust of its own staff: Advice yes, but please only from a distance!

Keith Alexander, the boss of the National Security Agency (NSA), announced on Thursday, 9th August 2013, that the NSA intends to reduce the number of its system administrators by 90% in order to lower the risk of further leaks like those released by Edward Snowden. Apparently, the NSA not only distrusts the American population, it also doesn’t trust its own staff.

Those administrators who remain may be more carefully vetted. That vetting may also include more careful and constant “background screening”, which means that someone needs to monitor those who monitor the population at large. Then someone also needs to (secretly?) monitor those (secretly?) monitoring the population. Why do I put a question mark behind “secretly”? Because thanks to Snowden it is no longer secret that large-scale surveillance is in place, and it’s also quite obvious that the NSA itself feels a need to increase its staff surveillance. So it is out in the open that “secret” surveillance is being conducted.
What we are left with is a situation dripping with paranoia of a dual sort: the NSA doesn’t trust anybody, and nobody can trust that they will be left alone by the NSA and its corporate allies.

“Just because you’re paranoid doesn’t mean they aren’t after you”

Joseph Heller, Catch-22

Distrust, one hopes, is not the most solid of foundations. For all I can imagine about the role of system administrators in defending IT systems, I doubt that large-scale lay-offs are a sustainable form of fortification. There are hordes of smart hackers out there, and the NSA will certainly continue enlisting them. But it may only ask them for advice from a distance, like consulting an online doctor instead of going to a hospital.