LawMeme (Yale Law School)
Features: The Morality of Software
Posted by James Grimmelmann on Sunday, July 18 @ 19:23:50 EDT | Governance
Lon Fuller's The Morality of Law is a fascinating book, by turns brilliant, elusive, provocative, and frustrating. As I read it, Fuller argues that law, by its very nature, must embody certain basic procedural aspects of morality. Such things as bans on secret laws and on retroactive laws are necessary components of anything that wishes to call itself "law." Without them, there is only the random exercise of power.

Now, I've been thinking a lot about the relationship between law and software lately. Larry Lessig's famous slogan that "code is law" draws an explicit comparison between the power of law and the power of computer software in shaping human conduct. So when I was reading Fuller's book last week, it occurred to me to try using Fuller's "moral" criteria to judge software: to what extent does using computer code instead of legal code make the exercise of power more or less acceptable?

In this little essay, I'll be taking Lessig and Fuller, dropping them in the blender together, and hitting "puree."

Lon Fuller in Seven Paragraphs

One of the major debates in the philosophy of law is between natural law and positivism. Natural law theorists see law as something people discover; positivists see law as something people create. The Morality of Law makes most sense against the backdrop of that debate.

Plato is the classic example of a natural law thinker: for him, the good and the just have a timeless and perfect existence. Man, through a process of reason, expresses a love for justice that leads him to seek it out. And when a good man becomes the lawgiver for a city, he sets in place those rules which are inherent in the very idea of justice. The coercive force of laws disappears into the background: the purpose of legal institutions is something more like teaching citizens how to be good. Aquinas and Kant are two more great examples of thinkers devoted to natural law: God and our reason tell us that we must not kill each other and that we must keep our promises to each other.

Legal positivism is, perhaps, the necessary corrective to the excessive ambitions of natural law. After all, it seems very hard to derive from first principles whether the law should require that people drive on the right side or the left side of the road. Further, given that people are people, and fallible, it seems somewhat sophistical to say that the acts of a legal system constitute "law" only when they act in perfect conformity with the abstract rules of natural law and that any deviation constitutes lawlessness. Legal positivists tend to see the workings of power in law: think of Hobbes with his emphasis on struggle. They also tend to see law as an instrument of human purposes: think of Bentham's utilitarianism.

The Morality of Law is a reaction against the strong legal positivism of the 20th century (often associated with the legal realist movement). Fuller doesn't want to turn the dial all the way back towards natural law: he thinks that morality provides almost no guidance on most substantive issues of what the law should be. Rather, he thinks that there's a kind of "procedural" morality to law. What natural law tells us is that law has to be made and enforced in certain ways for it to qualify as law. Thus secret or incomprehensible laws aren't really law, and neither are laws that are never enforced. They're just meaningless jabber and random exercises of despotic power. The laws don't provide a useful standard by which citizens can shape their conduct.

If this point were just a statement about the conditions laws must satisfy in order to be effective laws, legal positivists would have little to object to. Just as a smart utilitarian thinks about which behaviors should be prohibited because they are destructive, a smart utilitarian thinks about which exercises of lawmaking power would be counterproductive. But Fuller goes further: he says that these concerns about legal process are "moral."

To be more precise, Fuller sees law as a two-way exchange. In exchange for the citizen's promise to obey a rule of law, the lawmaker promises that the stated rule really is the rule by which the citizen's conduct will be judged. Offenses such as making law retroactively or making laws that are impossible to obey are violations of this reciprocity: any obedience on the part of the citizen becomes futile. Fuller sees the citizen's moral duty of obedience as being intimately connected to the lawmaker's moral duty to go by the book in enforcing the laws.

I'm not sure I agree that this exchange really is "moral." But the point is a powerful one. The power of law is legitimate because of the structure of law. It is only reasonable to ask people to obey the law because they know what the laws are, are capable of following the law, and know that they'll be held accountable only to the laws on the books and not to some secret additional laws. Violations of these conditions -- which lawyers collectively call "legality" -- undermine the justification for asking people to obey the law.

The Morality of Software

Now, this is where "code is law" comes into play. Software regulates conduct, just like law regulates conduct. DRM, TCP/IP, censorware, PayPal, DNS, whois, and spam-filtering mailers all shape what we do online. (If you're skeptical on this score, go read Code.) It becomes a fair question, therefore, whether or not software satisfies the conditions that make law legitimate. If software does better at respecting legality than law itself, then a transition to an environment regulated by software is a good thing. Conversely, in places where software doesn't respect legality, extra caution becomes necessary.

Fuller states eight conditions that laws must satisfy to be worthy of being called "law"; I'll follow that organization in comparing software and law. Following Lessig, I'll only be thinking about software in its ability to control behavior directly: I'm online, I try to do something, and the software either goes ahead and does it or doesn't, thereby "prohibiting" me from doing something I'd otherwise have chosen to do.

  • Generality: The first condition is that the rules involved must be generally applicable, not a bunch of ad hoc improvisation. This is the typical complaint used against one-off decisions like Bush v. Gore; it's also why some people go nuts over "nonprecedential" opinions. Decisions that aren't part of some general pattern of consistency don't usefully tell anyone what to do. There's just this guy in charge, and he does . . . stuff.

    Simple software systems do well when it comes to generality. An if-then statement tells you exactly when the "then" clause will take effect. But as a software system gets more complex, its decisions can get less and less general. A project in a late stage of kluging and ad hoc patching may have special case code for thousands of special cases. What are the "general" rules enforced by Windows XP, with its tens of millions of lines of source code? The sheer complexity of software can make a mockery of generality.

  • Publicity: The second condition is that rules be announced to those affected by them. It's somewhere between pointless and unjust to ask me to obey a law I have no way of knowing about. Lessig pegged this one perfectly: closed-source software is bad at telling users what's going on, while open-source software is, if not always good, at least not terrible. You can do horrible things in secret with software: every piece of censorware in existence depends in large part on keeping users from knowing what the rules of its censorship are. With conscientious effort, designers can publicize to users what the software will and won't let them do; unscrupulous ones can get away with virtual murder.

  • Prospectivity: The third condition is that rules must not be retroactive. Good law is prospective: it must be passed before being enforced. Indeed, the Constitution specifically forbids "ex post facto" laws. Here, software shines. Lessig refers to architectural restrictions as "present constraints" in that they block the forbidden conduct beforehand, rather than punishing it after the fact. It follows immediately that retroactive "present constraints" are a contradiction in terms: you have to program the software before it does anything.

  • Comprehensibility: The fourth condition is that rules must be comprehensible. If Congress passes a law that "squizmarunk blarfoo woogly ern-frezeeta quomf," and courts start punishing people for quomfing without appropriate blarfoo, that's not law. It's a cruel Kafkaesque joke. This condition is where Lessig's cautions about even open-source software come into play: sure, we can all read the source code. But that doesn't mean it's bug-free, or that we understand what's going on. To the average user, every software bug with visible effects is a glaring instance of incomprehensible software. Dammit, I have no idea why it's crashing.

  • Consistency: The fifth condition is that rules must not contradict each other. A law that dancing on Sunday is compulsory and a law that dancing on Sunday is forbidden are, as John Marshall said, a "plain repugnance." With law, this situation is possible because lawmakers can make mistakes or be deliberately malicious. But software can't contradict itself in this way. Bits are either one or zero; a given action either works or produces an error. The software does what it does--only on a quantum computer would it even make sense to talk about having "contradictory" rules.

  • Possibility: The sixth (and closely related) condition is that rules must not require the impossible. Again, it takes a mistaken or malicious lawmaker to set up laws against breathing and then fine the "lawbreakers" who criminally insist on breathing. Here again, software comes out more or less okay. If there's no way to do something in a software system, there's no way to do it. But since it's always possible not to use the software in the first place, there is, strictly speaking, no impossibility involved.

    True, that answer is something of a dodge. With sufficient outside pressure to log on, it may become infeasible not to use the software. In that case, it's reasonable to pin the "impossibility" on the other regulator, the one that stupidly forces you to use software that just doesn't work. The DMCA, in some of its sillier moments, wanders into this territory: our policy is that we are not being hacked. The software qua software may be pointless, but it doesn't -- it can't -- punish you for not doing the impossible.

  • Stability: The seventh condition is that the rules not change so frequently as to make reliance on them pointless. Law in a constant state of flux is confusing and unsettling. I make this condition a wash: programmers can change software and legislatures can change law. There's nothing intrinsic to the medium of law or of software that blocks or requires frequent change. As with law, it's good practice not to change horses midstream, but good practice is not required by the nature of software itself.

  • Reality: The eighth condition is that the rules as they are actually applied must conform to the rules as they are announced. Compliance with all the stated laws in the world does you no good if the cop beats you up and the courts don't care. If you take the position that the "announced" rules of software are whatever the source code says, then software does brilliantly: the only deviation from those announced rules occurs in cases of hardware failure. Too bad for you if you can't get at the source code, but there's no monkey business going on in the actual application of the rules. And indeed, this is one of the advantages of software: computers can't be bribed or swayed the way people can.

    On the other hand, it doesn't always make sense to treat the source code as the "announcement" of the rules. That just doesn't match what's going on in most uses of software. If you think of the "announcement" as being what the designer says the software does, then software is actually a lot less faithful to its stated purposes than law is. Bugs, looked at another way, are deviations from the announced rules of software -- deviations even the programmer doesn't expect. Further, because a good legal system announces what it is doing as it goes along (in the opinions of judges and the press conferences of prosecutors), software's silence in this regard is striking. It just tells you a lot less about what it's doing and why than law does.
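Two of these contrasts, generality and reality, are easy to sketch in code. Here is a toy Python illustration; every name and rule in it is invented for the example, not drawn from any real system:

```python
# Generality: a single rule, stated once, applied to everyone alike.
def allowed_general(action):
    return action in ("read", "comment")

# Ad hoc kluging: years of patches leave no statable general rule,
# only an enumeration of special cases.
def allowed_kluged(user, action):
    if user == "alice" and action == "post":
        return True               # special case, reason long forgotten
    if user.startswith("admin_"):
        return True               # blanket carve-out added later
    return action == "read"

# Reality: the docstring "announces" one rule; the code enforces another.
def upload_allowed(size_kb):
    """Uploads of up to and including 1024 KB are allowed."""
    return size_kb < 1024         # bug: rejects exactly-1024 KB uploads
```

The first function reads as a rule anyone could state in advance; the second can only be described by listing its cases; the third quietly deviates from its own announcement, which is precisely the "reality" failure, and not even its programmer may know it.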

Let's tally things up. Software is unambiguously better at legality than law itself on three counts (prospectivity, consistency, and possibility). It's strictly inferior to law on two (publicity and comprehensibility). One (stability) is a complete wash. The last two (generality and reality) depend very much on the kind of software we're talking about and how it's used.

Overall, then, there is no simple answer as to whether software is better than law or not when it comes to the conditions that Fuller would say make any system of authority worthy of obedience. It respects those values more in some ways, less in others. Whether or not any given software system is a good replacement for a legal alternative will depend on which values of legality are more important to you (a large part of The Morality of Law discusses the ways in which these values are necessarily in tension). Further, it will depend on how well the software system's designers handle the challenges of explaining accurately just what it is that their software does, and those explanations will be more or less persuasive for different kinds of software.

What a Fullerian analysis makes clearer, I think, is the concerns that any attempt to shape human behavior must take into account. In the case of software, the exercise points out both why software is appealing as a replacement for law and why it is scary. People focusing on different values of legality will see different pieces of the elephant that is software. Talking about these values all at once helps bring the whole of the animal into view.

 
The comments are owned by the poster. We aren't responsible for their content.

Code and Law: different beasts (Score: 0)
by Anonymous on Monday, July 19 @ 02:38:03 EDT
Maybe I just need a dose of Lessig, but I see this comparison as misguided in an important way. Law is a set of rules used for the governance of free-acting agents; code is a set of instructions by which an automaton is directed. When our actions are governed by software (as per so-called "rights management systems"), the attempt is to regulate what we *can actually do*, as opposed to law, which regulates what we are *allowed* to do. This is like the difference between a speed limit on a highway (law) and a speed-limiter in the engine of a car (code).

The above analysis lacks this distinction, and suffers for that fact, in my opinion. The question of whether and when code is a good substitute for law is another can of worms entirely, since it comes back to a question of personal freedom, among other things. Law-abiding citizens obey the law of their own volition; there's no such thing as a code-abiding citizen because code is code, and code must be changed for any other behaviour to be possible. Falling back on the speed-limit analogy, it makes no sense to ask whether a driver obeyed the speed-limiter in the engine of his car (although it makes sense to ask whether he attempted to *circumvent* it, thus anti-circumvention clauses in laws like the DMCA).
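The commenter's analogy is easy to put in code. A hypothetical Python sketch (the numbers and names are invented): law leaves the act physically possible and attaches a consequence after the fact, while code forecloses the act itself.

```python
SPEED_LIMIT = 65

# Law: the driver *can* speed; a penalty may follow after the fact.
def ticketed(observed_speed):
    return observed_speed > SPEED_LIMIT   # True means a fine is issued

# Code: the engine limiter clamps the speed; "disobeying" it is
# not possible, only circumventing it.
def limited_speed(requested_speed):
    return min(requested_speed, SPEED_LIMIT)
```

Asking whether a driver "obeyed" limited_speed is a category error; the only meaningful questions are whether the limiter works and whether it was tampered with.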

"Code is Law" is a useful aphorism, but don't take it literally.




Re: The Morality of Software (Score: 1)
by HowardGilbert on Monday, July 19 @ 17:17:38 EDT
http://www.yale.edu/pclt
The problem is that Lessig really doesn't understand software development. He is a lawyer, not a software engineer.
In software, the functional equivalent of law is the Standard. There are some standards (from ISO for example) that have the force of law in some countries. Standards specify the elements of programming languages, network protocols, and all the W3 standards on XML and the Web.
Based on the standard, a good software engineer creates a specification. The specification tells a potential consumer of some code what the code will do for him and how to use it. Programmers use the term "contract" for this, and it is a pretty good analogy to a legal term. A specification should conform to all applicable standards.
Code is then an implementation of the specification. Code can be private, because if it conforms to the standards and correctly implements the specification, the details of that implementation are unimportant. Software engineers have even argued that the code should not be exposed to potential users (breaking encapsulation) because that often leads to dependence on features that were not part of the specification, or it covers up omissions in the specification that should have been corrected properly.
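The standard-to-specification-to-code layering described above might be sketched like this (a toy illustration with invented names, not any real API):

```python
from abc import ABC, abstractmethod

class Tokenizer(ABC):
    """The specification, or "contract": what any conforming
    implementation promises its callers.

    tokenize(text) returns the whitespace-separated tokens of text,
    and raises ValueError on empty or blank input. Callers rely on
    this promise, never on any implementation's internals.
    """
    @abstractmethod
    def tokenize(self, text):
        ...

class SimpleTokenizer(Tokenizer):
    # One private implementation. Its details are unimportant so long
    # as it honors the specification above, and hiding them keeps
    # callers from depending on unspecified behavior.
    def tokenize(self, text):
        if not text.strip():
            raise ValueError("empty input")
        return text.split()
```

A consumer who codes against Tokenizer can swap in any conforming implementation; a consumer who reads SimpleTokenizer's source and depends on its quirks has, in the comment's terms, mistaken the code for the law.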
Lessig has listened to too many people who try to make a virtue of bad practice by skipping over the specification (and, for that matter, all documentation, comments, and test cases) and simply write code. They want to say that the code is the specification, and the documentation, and anyone who wants to know how things work should just "read the code."
Bad practice, no matter how widespread, doesn't create new definitions. If you want to know the "law" about XML, read the W3 standards on the matter. If you want to write Java code that uses XML, then read up on JAXP, the specification of the Java interface for XML processing. All real programmers know that if you are hunting around in the source for Xerces 2.6.2 to figure out how to handle a routine problem, then you are in the wrong place.
Standards conform to all of the rules about law. They are all

  • General - not specific to any vendor or system
  • Public - available to anyone to read
  • Prospective - IPv6 will happen eventually
  • Comprehensible - but only if you know how to read standards language (it is easier than reading a patent, about on par with reading your car insurance policy)
  • Consistent and Stable - with all other applicable standards and all previous versions
  • Possible - We could turn IPv6 on today
  • Real - (Well, I am not sure that the W3 is realistic about Ontology and the Semantic Web), but most standards can actually be implemented.

If you want examples of vendor-initiated standards, look at Sun Java and Microsoft .NET (both of which have implementing code, but the standards are not the code; the standards are the standards).
If something becomes generally important and it has no standard or specification, then the code may be the only source of information. This is not, however, the general principle that Lessig claims.
Most disputes occur when "common practice" differs from the actual text of the standard. Should a programmer do what most people do, even if not obligated to do so? There are also disputes about "intent" when someone conforms to a standard but uses it in a way that the original developer of the standard does not like. Sun is rather aggressive about trying to go back and change its "Java standard" to declare things illegal that someone else did and they didn't like. Here the line between figurative law and actual law can be crossed, since most of the Sun-Microsoft Java case was based on this trick. Once you make a law, you cannot control the intent of the people who follow it.
Code is not law, and never was law. That is sloppy thinking to justify lazy practice.




Leges humanae nascuntur, vivunt, moriuntur
Human laws are born, live, and die

Contributors retain copyright interests in all stories, comments and submissions.
The PHP-Nuke engine on which LawMeme runs is copyright by PHP-Nuke, and is freely available under the GNU GPL.
Everything else is copyright 2002-04 by the Information Society Project.

This material may be distributed only subject to the terms and conditions
set forth in the Open Publication License, v1.0 or later.
The latest version is currently available at http://www.opencontent.org/openpub/.
