Introducing Free Software Hackers
Introducing the Hacker Ethic
The word 'hacker' originated in the computer labs of the Massachusetts Institute of Technology (MIT) in the early 1960s amongst a group of programmers who believed that "all information should be free" and that "access to computers... should be unlimited and total" (Levy 1994, p.40). Hackers now define a hacker as "an expert or enthusiast of any kind. One might be an astronomy hacker, for example" (Raymond 2003, entry on hacker). One could work in a 'hackerish' way in any field of endeavour where universal access to, and sharing of, the tools of one's trade would be positive and viable. Decades later the media began to apply the term to criminals using computers, whom hackers instead call 'crackers' (Raymond 2003, entry on cracker). For the purposes of this essay I refer exclusively to the earlier meaning of 'hacker', and in particular to computer hackers, who remain the most successful example, though their attitude is now spreading into other fields of endeavour such as the arts and sciences.
Pekka Himanen wrote the first major study of the hackers' attitude from a philosophical perspective, establishing the idea of a 'Hacker Ethic', in which he identified seven key characteristics: passion, freedom, their work ethic, their money ethic, their network ethic, caring and creativity (Himanen 2001, p.141). Broadly speaking the Hacker Ethic suggests (a) the importance of a particular kind of work, namely the kind that hackers can be passionate about, that isn't motivated by money, and that is playful; (b) a particular approach to working, which allows an individual rhythm of life and yet also places the community and cooperation at the centre; and (c) a particular approach to building productive communities, involving equal and unfettered access to information and tools, facilitated by open sharing. Utopian though it sounds, it is important to recognise that the hackers who subscribe to this ethic have built much of the infrastructure of today's information society.
The Hacker Work Ethic
Parts (a) and (b) encapsulate a work ethic that is orientated towards work as being intrinsically worthwhile and motivating, rather than instrumental. Hacker work is, in the words of the hacker Linus Torvalds, "interesting, exciting, and joyous", "intrinsically interesting and challenging" (Himanen 2001, pp.xiii-xvii) and "goes beyond the realm of surviving or of economic life" (Capurro 2003). That these features are intrinsic to the work, rather than being a subjective attitude on the part of the individual, is demonstrated by a comment from an employee of Microsoft. The company competes with the work of hackers, often attacking them, and so charged an employee with the task of investigating the competitiveness of the hackers' work. Without any bias in favour of hackers, he wrote that when hacking on their software, "the feeling was exhilarating and addictive" (OSI 1998).
It is important to note that when hackers talk about intrinsic motivation they almost always use adjectives like "fun", "passionate", "joyous" and "entertaining". In contemporary society we maintain a distinction between work and leisure, and are acutely aware of when work erodes the time we usually dedicate to leisure. To hackers, the distinction is a red herring. Hacking on some challenging code is every bit as entertaining as playing a game of football or reading a book, albeit in a different way. Play is not necessarily "something wasteful [or] frivolous"; it can be "the experience of being an active, creative and fully autonomous person". To hack "is to dedicate yourself to realizing your full human potential; to take an essentially active, rather than passive stance towards your environment; and to be constantly guided in this by your sense of fulfillment (sic), meaning and satisfaction" (Kane 2000).
This guiding sense is apparent in hackers' approach to work management, and specifically in how they decide what to work on. The dominant factor, according to most theorists, is the desire to "scratch an itch", i.e. to satisfy a need (Raymond 1999, Lakhani and Wolf 2005). This need may be a functional one, where the hacker needs a particular bit of software, or it may be a personal one, where the hacker wants to try his hand at a particular technique. Most hacker work is entered into voluntarily because it is "intellectually stimulating", because it "improves skills" and because of the code's "work functionality" (Lakhani and Wolf 2005). If this is the case, then the adjectives that I related in the previous paragraph should be thought of not as the sole motivations for work, nor solely as pleasant byproducts, but rather as factors that affect how a hacker prefers to scratch an itch.
Of course the hackers' work ethic, insofar as it concentrates on how one should work and why it should be motivating, invites the charge of self-indulgence. It argues for an autonomy in work that facilitates personal fulfilment without accounting for social obligations that might reasonably abridge this autonomy. The Hacker Ethic does encapsulate social obligations, however, in part (c) mentioned above. These obligations can be most clearly studied in the free software movement, an applied example of the hackers' social ethic.
The free software movement arose out of the hacker subculture at MIT in the 1980s. It was started by Richard M. Stallman, who wanted to produce an entire operating system that would be developed and distributed according to the principles of the Hacker Ethic. This act would be "a way of bringing back the cooperative spirit" found in the hackers' social ethic (FSF 2003). This spirit was being taken away by an increasingly proprietary application of copyright to software, whereby the copyright owners abridge hackers' access to information.
The hackers' social ethic is based upon three axioms: first, the belief that sharing information, be it about the weather or a novel, is good; second, that hackers have an ethical duty to share the information with which they work; and third, that hackers should facilitate access to computers wherever possible (Raymond 2003, entry on hacker ethic). These principles are closely related to the hacker work ethic, since they facilitate it by removing artificial restrictions on the hacker's freedom to use the information with which they work. It is important to note that there is no obligation to create socially valuable products, only to remove restrictions, a liberal feature that I will attend to later.
Stallman applied these axioms by subverting copyright, a limited monopoly granted by the state to a creative person in return for their increased productivity. He wrote and applied a license to his copyrighted work that gave the community free access to the information. He dubbed this act "copyleft", and described the licensed work as "free software", where "free" refers to freedom rather than price. The license guarantees the following four freedoms:
- The freedom to run the program, for any purpose
- The freedom to study how the program works, and adapt it to your needs
- The freedom to redistribute copies so you can help your neighbor
- The freedom to improve the program, and release your improvements to the public, so that the whole community benefits (FSF 2004)
Stallman conceived of these freedoms as an ethical duty on the part of the hacker to society. Not sharing information in this way is "the wrong treatment of other people", "anti-social" and it "cuts the bonds of society" (Stallman 2004a, 2004b). These bonds are hinted at when he writes that not sharing with others is "divisive" because it reduces the emphasis on "helping one's neighbors" and on working "for the public good". To do this is an obligation, but not one so strong that we must always work for the public good (Stallman 1992, 2005). Even the suggestion that we ought to work for the public good on occasion seems to contradict Raymond's reluctance to mention working on socially valuable products.
In a seminal position paper, Stallman describes the harms that non-free software causes, which parallel the freedoms his licenses guarantee. In the first place, "fewer people use the program" (Stallman 1992). People might be unable to use a program because of 'natural restrictions', such as blindness, or because of 'artificial restrictions', such as copyright. In correspondence, Stallman confirmed to me that only harms caused by artificial restrictions need concern a hacker, suggesting that hackers have no obligation to ensure that, for example, a blind person can use their program as well as somebody with sight (Stallman 2005). The second harm he identifies is that "none of the users can adapt or fix the program", caused by an application of copyright that obstructs access to the program's source code. This also causes the final harm, which is that "other developers cannot learn from the program, or base new work on it" (Stallman 1992). Again, a person with no programming skills, or with blindness, would suffer these harms regardless of artificial restrictions.
The free software philosophy operates on the harm principle, suggesting that placing any artificial restriction on the sharing of information causes a social harm that is never justified and that must therefore be avoided. From this Stallman develops a consequentialist golden rule, that one must always share software freely under the terms described above because the good consequences, and the avoidance of the aforementioned harms, always outweigh the bad. He writes that "if anything deserves a reward, it is social contribution. Creativity can be a social contribution, but only in so far as society is free to use the results. If programmers deserve to be rewarded for creating innovative programs, by the same token they deserve to be punished if they restrict the use of these programs".
The free software social ethic also has an important positive component that goes beyond Raymond's weak or liberal emphasis on rights and access. Though these are an important part of the free software ethic -- Stallman maintains that the freedoms his licenses guarantee are a human right (Berry 2004, p.70) -- it also emphasises values such as cooperation and communication in productive communities. "The conception of the social good is strongly communitarian and privileges both a vision of a social order that assigns rights and obligations, and one that is fair and equitable" (Berry 2004, p.73). The rights and obligations that this position implies are taken as a Kantian categorical imperative and should be scrupulously followed by all hackers.
Stallman's position poses two problems. In the first place, one might reasonably ask why it is that we have any obligation to share information but not to produce it. The ethic is neutral towards an idle hacker who does no work but hostile to a busy hacker who refuses to share his work. It may simply be that social sanctions against idleness already exist, and so the Hacker Ethic concerns itself only with a sanction not already present: that against exclusive ownership of information. Raymond, for example, is an ardent supporter of the free market, and so presumably believes that we needn't worry about idleness because the need for money will compel a hacker to work. Stallman's more left-wing political stance, on the other hand, explains his reference to working for the public good. It is safe to say, then, that the Hacker Ethic does place value on individuals performing socially useful work, but that there is no consensus on where the responsibility for this lies, be it in the market or in social obligations.
The second problem is to do with Stallman's indifference to natural restrictions. By 'natural', Stallman doesn't just refer to biological restrictions but also to other restrictions that we would normally think of as outside the direct control of the hacker. So both blindness and poverty in the user are natural restrictions that the hacker cannot directly overcome, or at least that is the prevailing opinion in the society in which Stallman lives. But this indifference remains strange. Imagine that I were blind, or that I had no money. If the program doesn't work with accessibility software, or if I am unable to purchase a computer, then I cannot use free software for any purpose, be it to run it, to adapt it or to learn from it. Would I not be less free, according to Stallman's criteria, than a person who faces no natural restrictions but is nonetheless unable to study how the program works because of artificial restrictions? Surely the hacker breaks the bonds of society more strongly if he refuses to make his software usable for the majority of his fellow human beings out of a desire for other work that would be characterised as self-indulgent? Furthermore, buying a computer for the poorer person would seem to heed Raymond's call for universal access to computers, and Stallman's call to work for the public good.
An amended rule based upon his four freedoms might state: where the good consequences of a hacker overcoming restrictions outweigh the bad, the hacker has a duty to overcome those restrictions, be they natural or artificial. In the examples given above, adapting my software for blind people or buying a computer for a poorer person would have obvious good consequences, whilst abridging my autonomy and setting me back financially. Given that Stallman suggests we ought to accept a lower wage writing free software rather than attempt to "get rich" writing non-free software, the hacker will have to heed his social obligations in most cases.
In response to this claim, Stallman simply wrote to me that "to demand an impractical level of clarity in practical applications of ethics simply brings it to a standstill, since it sets the bar impossibly high" (Stallman 2005). In other words, his philosophy is based upon a utilitarian principle that, if taken to its logical extreme, becomes impractical or even undesirable but which, when applied in moderation, becomes desirable. Aside from the fact that this violates his desire for a categorical imperative, since it is impossible to apply the ethic in full to all members of society, it also suggests that he is wrong either in thinking that the bar is set too high, or somewhere in the construction of the social obligations that set it there.
In his defence, it would be absurd to suggest that a hacker must go out of his way to educate the whole of society to an advanced level of physics so they could use his physics program, even if society wanted to use it. This would involve the hacker volunteering a phenomenal amount of time and resources to educating society, with limited discernible public good. It is not so absurd, however, to suggest that a hacker should spend a small proportion of his time adapting a program essential to a group of people so that they can use it, even if that work isn't intrinsically interesting for the hacker.
Resolving self-indulgent and social obligations
The free software philosophy, as an example of the hackers' social ethic, seems to be a strongly socialistic counterweight to the self-indulgent work ethic. Stallman writes that "a user of software is no less important than an author... their interests and needs have equal weight, when we decide which course of action is best" (Stallman 1992). That is to say that the author of some software has obligations to himself and to society. The self-indulgent obligations are met by working in a joyous and passionate way on software that is intellectually challenging, that develops skills and that has significant use value; the social obligations are more complicated. Stallman posits a weaker social obligation that can be met by distributing any information produced under a free, copyleft license. I have advanced a stronger social obligation, more consistent with the calls for universal access, that can only be met by producing socially useful information, distributing it under a free, copyleft license and purchasing equipment for those whose poverty denies them access.
A hacker will automatically meet Stallman's social obligations, without prejudicing his self-indulgent obligations, simply by virtue of working according to the Hacker Ethic. By releasing his work under a free license, the hacker won't prejudice his ability to work freely, passionately, joyously and so on. In fact, as part of a community that also meets this obligation, the free distribution of information will facilitate his self-indulgent work practices. A hacker may, however, have to temper his self-indulgent obligations to meet the stronger social obligations. For example, making a piece of software usable for blind people may be unchallenging, uninteresting work, but it should nonetheless be undertaken for the sake of universal access to that software.
In practice, of course, the point of moderation between the obligations bestowed by the work ethic and the social ethic is decided not by an ethical principle but by personal circumstance; the hacker exercises his own judgement. But the question remains as to whether or not the Hacker Ethic has anything to say on this matter. For Torvalds and Himanen, once a hacker is self-sufficient, the Hacker Ethic can account for characteristically self-indulgent work practices. For Stallman, given self-sufficiency, the interest lies in social relations and obligations. Unlike in other cases, where such demands are clearly antagonistic, the mechanisms that hackers employ to meet their social obligations facilitate their self-indulgent work practices, and vice versa. Given the close connection between these two aspects of the ethic, it seems both possible and attractive that some common framework could account for both and help resolve conflicts between them.
 - The Creative Commons organisation has translated the orientation and techniques of computer hackers to the arts and sciences. They cite the Free Software Foundation's software license, the GNU GPL, as their inspiration. See http://creativecommons.org
 - Though governments and corporations undoubtedly had a role to play, most of the people working on technologies like TCP/IP and the World Wide Web were and remain self-identified hackers.
 - In computer programming, the information is the source code: the human-readable and modifiable instructions that programmers work with. The source code is usually compiled into a binary that the computer understands and can run; sharing binaries isn't useful to programmers, since they are essentially black boxes that reveal little but cryptic information.
R. Capurro (2003). Passions of the Internet and the Art of Living. Paper presented at a colloquium organized by The Graduate School of Library and Information Science, University of Illinois at Urbana-Champaign. http://www.capurro.de/illinois.htm, on file with author.
Free Software Foundation (FSF) (2003). Overview of the GNU Project. GNU Project Website, http://www.gnu.org/gnu/gnu-history.html, on file with author.
Free Software Foundation (FSF) (2004). The Free Software Definition. GNU Project Website, http://www.gnu.org/philosophy/free-sw.html, on file with author.
P. Himanen (2001). The Hacker Ethic and the Spirit of the Information Age. Secker & Warburg, London.
P. Kane (2000). The Play Ethic: why believe in work, when it doesn't believe in you?. The Observer, October 22nd.
K. Lakhani, R. Wolf (2005). Why Hackers Do What They Do: Understanding Motivation and Effort in Free/Open Source Software Projects, in Perspectives on Free and Open Source Software. MIT web site, http://opensource.mit.edu/papers/lakhaniwolf.pdf, on file with author.
S. Levy (1994). Hackers: Heroes of the Computer Revolution. Delta, New York.
Open Source Initiative (OSI) (1998). Halloween Document II, version 1.4. OSI's web site, http://www.opensource.org/halloween/halloween2.php, on file with author.
E. Raymond (1999). The Cathedral and the Bazaar: Musings On Linux and Open Source by an Accidental Revolutionary. O'Reilly.
E. Raymond (2003). The Jargon File, version 4.4.7. Eric Raymond's Web site, http://www.catb.org/~esr/jargon/, on file with author.
R. Samudrala (Date unknown). A primer on the ethics of "intellectual property". Ram Samudrala's Website, http://www.ram.org/ramblings/philosophy/fmp/copying_primer.html, on file with author.
R. Stallman (1992). Why Software Should Be Free. GNU Project Website, http://www.gnu.org/philosophy/shouldbefree.html, on file with author.
R. Stallman (1993). The GNU Manifesto. GNU Project Website, http://www.gnu.org/gnu/manifesto.html, on file with author.
R. Stallman (2004a). Free Software - Free Society!, interview with Richard Stallman. GNU Project Website, http://www.gnu.org/philosophy/audio/rms-interview-edinburgh-040527.txt, on file with author.
R. Stallman (2004b). The Free Software Community After 20 Years: With great but incomplete success, what now?. GNU Project Website, http://www.gnu.org/philosophy/use-free-software.html, on file with author.
R. Stallman (2005). Personal communication, on file with author.