The End of Privacy, Part 2: Scoring Pro-Social Behaviour

Despite what some philosophers will tell you, morality is clearly a work in progress. It changes over time. My father was born into a world in which “fornication” was considered immoral. Now, not only do most people not regard it as immoral, many have trouble even understanding how anyone could ever have regarded it as immoral. Such is the way things change.

There is a complex relationship between our moral code and the present state of technology. It is surely no accident that the seismic shifts in sexual mores occurred in the wake of the discovery of safe, effective birth control technology.

There is another technological change looming on the horizon, which it seems to me stands a good chance of changing everyday morality in fundamental ways. We can already see the technology at work in apps like Airbnb and Uber, which seek to eliminate the “trust” problem between contracting parties by allowing them to rate one another. Uber allows passengers to rate drivers, but just as importantly, allows drivers to rate passengers. Act like a boor in the back seat of an Uber driver’s car, and you may find that fewer drivers are willing to pick you up next time you try to summon a ride (actually, what happens is that the highly rated drivers will pass on you, leaving you to drivers whose own ratings are low). Similarly, if you make a mess of an apartment that you’ve rented through Airbnb, fewer people will be willing to rent to you in the future. This is sometimes called an online reputation system.
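To make the sorting mechanism concrete, here is a minimal sketch of how a two-sided rating system of this kind might work. The class, thresholds, and numbers below are invented for illustration; no actual platform discloses its matching rules.

```python
# A minimal, illustrative two-sided reputation system of the Uber/Airbnb kind.
# All names, thresholds, and numbers are invented; real platforms do not
# publish their matching rules.

from collections import defaultdict


class ReputationSystem:
    def __init__(self):
        # user id -> list of 1-5 star ratings received
        self.ratings = defaultdict(list)

    def rate(self, rater, ratee, stars):
        """Either side of a transaction can rate the other."""
        self.ratings[ratee].append(stars)

    def score(self, user):
        """Average rating; unrated users get the benefit of the doubt."""
        received = self.ratings[user]
        return sum(received) / len(received) if received else 5.0

    def will_accept(self, driver, passenger, driver_threshold=4.5):
        """Highly rated drivers can afford to pass on low-rated passengers,
        leaving them to drivers whose own scores are lower."""
        if self.score(driver) >= driver_threshold:
            return self.score(passenger) >= 4.0
        return True  # low-rated drivers take whoever they can get


rep = ReputationSystem()
rep.rate("driver_a", "passenger_x", 2)  # boorish behaviour in the back seat
rep.rate("driver_b", "passenger_x", 3)
print(rep.will_accept("top_driver", "passenger_x"))  # False: the high-rated driver passes
```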

Now imagine an online reputation system, not for cars and apartments, but for life generally, where you get a “score” based on any and all aspects of your behaviour. What is this guy like? Is he nice, or nasty? Is he cooperative, or is he a jerk? What’s he like on a date? Did he do well in school? Does he show up for work on time? (Substitute “she” throughout as well.) Imagine a system that tracks all of this, with certain inputs from teachers, employers, etc., and others from the public – like Yelp reviews. There is already a dating app that allows women to rate men (Lulu), but it still requires the man’s permission to create a profile (which limits its usefulness at filtering out rapists, one of the most obvious potential applications). But there’s no way to stop someone from creating a system that permits involuntary profiles, and it seems to me it’s just a matter of time before someone does.

Of course, it may not be private individuals that do it. The government of China has announced its intention to create a “social credit” system, with a rating for all individuals – like a credit score, blended with a secret police file, then generalized to include all sorts of information about pro-social behaviour (other than just repayment of loans). The pitch is that it can be used for dating, as well as employment (but also, of course, by police and authorities). Right now the government is letting private firms do the pilot projects, but the plan is ultimately to create a unified system under state control. Most Chinese, it should be noted, do not appear to be overly fussed about this (which I don’t find particularly surprising, although it has attracted some incredulous commentary in the Western media).

The question is, should we be fussed about this? Like most Westerners, my natural reaction is negative; I find the whole idea to be shockingly intrusive. On the other hand, it would obviously be useful, and it would eliminate an enormous number of collective action problems. As Uber and Airbnb have shown, the problem of trust-between-strangers created enormous transaction costs, which were preventing billions of dollars’ worth of valuable transactions from taking place. So many apartments sitting empty! So many cars sitting idle! Lots of people want places to stay, or need rides. The problem was that people couldn’t trust one another, and so the transactions didn’t take place. Solve the trust problem (through an online payment system, as well as the rating system), and suddenly an entire market appears where previously there was none.
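A back-of-the-envelope way to see the trust problem (the symbols are my own illustration, not anything the platforms publish): suppose a transaction would yield a surplus v, but with probability p the stranger cheats and imposes a loss L. The deal only goes through if the expected gain is positive:

\[
v - pL > 0 \quad\Longleftrightarrow\quad p < \frac{v}{L}.
\]

A reputation system attacks both sides of this inequality: ratings screen out bad actors (lowering p), and the prospect of a bad rating makes cheating costlier, so transactions that used to fall below the threshold suddenly clear it, and a market appears.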

Now just imagine how many other transactions aren’t taking place, right now, because people can’t trust one another… Think of how dramatically different your life would be, how many different activities you would engage in, if you had some way of knowing, right away, upon meeting a stranger, whether you could trust that person, and if you had the power to alert everyone else, should that person abuse your trust. Imagine that your phone simply displayed the “social credit” score of everyone in proximity to you… in the bar, on the subway, in the classroom. I don’t think it would be an exaggeration to say that social life would be completely transformed.

There is also, it should be noted, a traditional left-wing fantasy about a cashless economy that runs entirely on reciprocity, with something like a “social credit” score (or like reddit’s “karma”) to track each individual’s contributions. Many of these socialist schemes, particularly those that aim to reduce or eliminate “consumerism,” are also shockingly intrusive – consider, for instance, the “Parecon” proposal, which suggests that all of your consumption demands should be subject to deliberative approval by your neighbours. The charitable interpretation of these schemes is that their supporters have not really thought through very carefully what it would actually be like to live under them. In any case, I am curious what supporters of such schemes think of the incipient Chinese social credit system.

Incidentally, this sort of information-sharing about people’s character is a structural feature of small-scale societies, where it was accomplished largely through gossip. This went hand-in-hand with very intrusive forms of social control. In the small-scale societies in which humans evolved, everyone knew everything about your business, and your reputational “score” was common knowledge in all interactions. Gossip, however, as a social control mechanism, does not survive the transition to large-scale societies and urban living. It’s just too hard to keep track of what everyone is up to (this is the point of Robin Dunbar’s claims about the cognitive limits on group size). So you wind up interacting with all sorts of people that you don’t really know anything about.

In this way, modern technology is really just recreating the conditions of small-scale societies on a mass scale. In retrospect, it may turn out that we lived in an enchanted time, when humanity had figured out how to make the leap from small-scale to large-scale societies, but had not yet figured out how to reimpose the social control systems that operate on the small scale. This created something of a golden age of individual freedom – one in which you could act in anti-social ways without serious consequence. When I contemplate the possibility of a comprehensive social ranking system, I’m struck by the sentimental attachment I have to the ability to act immorally without anyone knowing.

Recall the passage in John Stuart Mill’s On Liberty, in which he deals with the issue of alcohol sales. He is arguing against a temperance “Alliance,” which asserted that the state could prohibit conduct whenever it violated the “social rights” of others:

A theory of “social rights,” the like of which probably never before found its way into distinct language: being nothing short of this—that it is the absolute social right of every individual, that every other individual shall act in every respect exactly as he ought; that whosoever fails thereof in the smallest particular, violates my social right, and entitles me to demand from the legislature the removal of the grievance. So monstrous a principle is far more dangerous than any single interference with liberty; there is no violation of liberty which it would not justify; it acknowledges no right to any freedom whatever, except perhaps to that of holding opinions in secret, without ever disclosing them: for, the moment an opinion which I consider noxious passes any one’s lips, it invades all the “social rights” attributed to me by the Alliance. The doctrine ascribes to all mankind a vested interest in each other’s moral, intellectual, and even physical perfection, to be defined by each claimant according to his own standard.

Mill’s argument here proceeds through a sort of reductio, but I wonder how much of that reductio is due to the undesirability of the outcome he describes, and how much is due rather to its infeasibility. In other words, I wonder how much of the “liberty” we have enjoyed has been a consequence of technological limitations, which did not so much limit the state as prevent people from effectively supervising and controlling each other.

Another possibility, of course, is that morality may simply become less strict. Right now what we have is a relatively strict moral code, with extremely lax enforcement. Perhaps it’s because the code is so hard to enforce that we make it fairly strict. When the probability of apprehension was very low, punishment was stricter and there were fewer excusing conditions. But as the probability of apprehension suddenly increases dramatically, it seems to me this puts a lot of pressure on the code to become laxer (just as our tolerance for “youthful folly” is under pressure to expand, now that we have a generation for whom a perfect record of the thoughts and opinions of their 17-year-old selves is being preserved for eternity).
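One crude way to formalize this trade-off (purely illustrative): if deterrence depends on the expected sanction, that is, the probability of apprehension p times the severity of punishment s, then holding deterrence roughly constant implies

\[
p \cdot s \approx \text{constant} \quad\Rightarrow\quad s \propto \frac{1}{p}.
\]

So when p is close to zero, s has to be harsh (and the code unforgiving) to do any work at all; as p climbs toward one, the same level of deterrence can be achieved with a much laxer code.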

Comments


    • Thanks for the comment about the Black Mirror episode. I kept thinking about it as I read this post.

      For those who haven’t seen it, the episode speaks at least in part to the possibility of “gaming” such a social scoring system.

  1. This is the gist of a book I always wanted to write called “Freedom as friction”. The friction in the transmission of information is at the root of a great deal of what we call freedom. A friction-free information society is a totalitarian state.

  2. “Modern technology is really just recreating the conditions of small scale societies on a mass scale” sounds quite like Marshall McLuhan’s “global village”. Is that what you were thinking of?

  3. Actually, people in small scale societies often did have an out if the members of their tribe got too intrusive. As Steve Sailer notes in his review of Sebastian Junger’s Tribe:

    Rather than living in solitude, according to Junger, tribe members benefit psychologically from community, unity, and equality. Yet both suggestions are useful…. [Yet], part of the appeal of American Indian life was that tribe members had the options of not just loyalty or voice, but also of exit. Tribes frequently fractured when different factions couldn’t get along. Rather than stay together and stab each other in the back, one group would up and leave.

    http://takimag.com/article/tribal_counsel_steve_sailer/print#ixzz4WkXHqE7V

    • Tribe is a really important book. A better guide to “what’s going on” right now than a hell of a lot of political science that’s being published.

  4. Thank you for this article, Heath. You help me see important issues through a new lens, and it is always enlightening.

    Anyway, the idea of how new tech companies are essentially filling a trust void is very interesting, because I think we have not yet seen that happen in two key areas of our society: politics and the news media. These two arenas have seen huge gaps form in trust. We are seeing fact-checkers rise in popularity, probably because of the trust void in the news media, but when it comes to our political leaders we do not have a good mechanism yet.

    In the past, the news media was the mechanism through which we were able to weigh evidence on whether someone was worthy of our trust to govern. But now that the news media industry is broken and we have an overabundance of information telling us how politicians will DO ANYTHING to get elected, it has become very difficult to tell if someone is truly trustworthy. This is why I think Trump was so appealing.

    I do not know if I am right here or not but I feel like I am. Especially when you start bringing in the ideas that Andrew Potter has been sharing on here as well as the amazing book Social Organism (definitely give it a read, I honestly think it is a game changer!).

  5. Erm . . . I’m pretty dashed sure that the rounds of deliberation in Parecon are between the AGGREGATED demands of consumers and the AGGREGATED supply proposals of suppliers. So I’m actually rather shocked at the drive-by this article pulls on it.

    But aside from that, quite interesting and thought-provoking. Not that I entirely buy it, but I don’t think that’s the intent anyway. There are a lot of cool thought-experiments here, definitely some stuff to chew on.

    • Regarding my Parecon drive-by: eventually it has to get aggregated, but it doesn’t start that way (i.e. in “round one”). From Albert’s Parecon book:

      Individuals present consumption requests to neighborhood councils, which collectively approve or disapprove the requests and organize them into a total council request for individual goods for all their members along with the neighborhood collective consumption request, to become the total neighborhood consumption proposal (p. 130).

      Strictly speaking, in the Parecon system, there is no way to obtain a consumption good without the approval of your neighbours. Of course, as we already know from the experience in Eastern Europe, a rigid planning system like Parecon (and it is incredibly rigid — think how hard it would be to move from one neighborhood to another) would immediately give rise to deep, comprehensive black markets (a topic that I can’t remember seeing addressed in the canonical literature), so you would still be able to get most things you wanted, just illegally.

  6. Another example of reputational feedback systems completely changing a market is the market for illegal drugs. Since darknet sites such as the now-defunct Silk Road allowed users to rate online vendors, people who sold drugs cut with other substances were very quickly weeded out, and consequently the quality of drugs purchased in this way is extremely high. There’s certainly an argument that this has improved safety for people who use such substances.

    However, reputational systems are also open to all sorts of gaming. Until recently, one of the top restaurants in London according to TripAdvisor was a small pasta takeaway kiosk in the O2 Arena. I dread to imagine what sort of corruption might happen to this system in China.