New Criticals

Killing Time, Pt. 1 - Capture


Pleasure in the job puts perfection in the work. - Aristotle

We fill pre-existing forms and when we fill them we change them and are changed. - Frank Bidart

On screen appears a series of visually distorted letters and numbers, obscured through the manipulation of color, composition and space. The user is prompted to identify and record the characters before moving on to the next page or step in whatever process is underway. Complete the puzzle correctly and the user proves his or her humanity. More importantly, an implicit contract is signed - your intentions are benign. This affirmation rests on another, even more turbulent and nebulous, principle: the understanding of what is “proper” on the Internet; how one should behave and what one should be doing with these digital freedoms. In lieu of a monetary exchange, the user offers his or her personal information and pledges allegiance to a particular organization's terms of service: we will follow the rules.
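The contract described above is, mechanically, a challenge–response loop: the server generates a random string, renders it as a distorted image, and admits the user only if the transcription matches. A minimal sketch of that loop, with the image-distortion step omitted and all function names illustrative rather than taken from any real CAPTCHA library:

```python
import random
import string

def generate_challenge(length=6):
    """Pick a random alphanumeric string to distort and display."""
    alphabet = string.ascii_uppercase + string.digits
    return "".join(random.choice(alphabet) for _ in range(length))

def verify(expected, response):
    """Admit the user only if the transcription matches (case-insensitive)."""
    return response.strip().upper() == expected.upper()

challenge = generate_challenge()
# In a real deployment the challenge would be rendered as a warped,
# noisy image; here we only compare transcriptions.
print(verify(challenge, challenge.lower()))  # a correct transcription passes
```

The asymmetry the essay turns on lives entirely outside this snippet: generating the string is trivial for a machine, while reading it back off the distorted image is (was) hard for one.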

The technology behind these puzzles was invented at Carnegie Mellon University in 2000 by a group of computer programmers, graduate students and professors, at the request of Yahoo!, in response to the growing problem of unsolicited bulk email (UBE), better known as spam. [1] The preponderance of free services on the web is, ostensibly, perplexing, but at this point we don't buy into that delusion (benevolent tech corporations providing service for public good) and we are comfortable with the price (information). Discursively, free email is not just a privilege but a right, and no serious attempts to – for lack of a better word – "privatize" email have been made with any success. Our first red flag in any exploration into the intersection of capitalism, labor, and the Internet: whither the cost? The regeneration and recirculation of (digital) capital relies on organizations that want to reach you through corporations that have the keys. Free is a perspective.

The early Internet experience was often disjointed and fragmented – dial-up was slow, the lack of features commonplace today - centralization, hyper-linking, and indexing - made navigation arduous, and the number of pages was limited. Like any nascent medium, it had its shortcomings, both technological and social. The Internet today can suffer from similar pitfalls, due not to a lack of structure but to superabundance. It also exists as it exists. The Internet is oversaturated, overwhelmed (as for us? not so much). As free electronic mail services like Hotmail and Yahoo! became more popular features of the web (and, as a consequence, the proprietary companies more profitable), the market locked down. Corporations controlled the circulation of capital (via information) in a very simple way: e-mail addresses (registered and validated by users' biographical information) were valuable. There was little room for so-called unsanctioned business. Spammers carved out a necessary space.

[the first result for "spammers" in a Google image search]

The mail services required very little information to sign up – basic biographical and geographical information was enough to register for an account – and the necessary information was unverifiable. [2] Spammers developed (extremely simple) computer programs that could fill in that information to create thousands of fake email addresses over an extremely short period of time, then use those accounts to deliver massive amounts of UBE indiscriminately. For Yahoo!, this corrupted their users' experience. The spammers were rogues, criminals, infecting your safe space with what didn't belong. This couldn't work without the ideological manufacture of how you see and what you feel in front of your inbox.

CAPTCHAs were inspired by the Turing Test, which has become a cultural point of reference and has fascinated the public (outside the field of computer science) since the '80s and '90s. [3] In 1950, Alan Turing published his paper “Computing Machinery and Intelligence,” which opened with the provocation “Can machines think?” To approach the question rather than simply answer it, Turing describes the Imitation Game. The Imitation Game has three participants: a man, a woman, and an interrogator of either sex. The interrogator must determine the sex of the two participants via a yes-or-no line of questioning. Both participants will try to confuse the interrogator in an attempt to produce a misidentification. Adapted from that, the Turing Test for computers is in principle the same, except a machine replaces one of the contestants. The interrogator attempts to determine which respondent is the computer, while both the human (through intellect) and the computer (through programming) employ various strategies to confuse the guesser. Turing conjectured “that in about fifty years’ time it will be possible to programme computers, with a storage capacity of about 10⁹, to make them play the imitation game so well that an average interrogator will not have more than 70 per cent chance of making the right identification after five minutes of questioning.” A CAPTCHA reverses the Turing Test and alters it slightly, demanding that a user prove it is not a computer by solving a puzzle generated by a computer that the computer itself cannot solve.

Turing conjectured that by the time such a machine could pass the test, “one will be able to speak of machines thinking without expecting to be contradicted,” partially because the vocabulary and discourse would have expanded, but also because, if his prediction of escalating computer sophistication turned out to be correct (it did), then thinking about computers thinking would at least be of actual critical and philosophical (if not technical) relevance. [4] Events like the IBM computer Watson winning Jeopardy! against human contestants (a game known for puns and wordplay, among the most difficult things for a computer to process), a Singularity movement taken fairly seriously, and the generally rigorous pace of technological innovation make Turing’s paper especially interesting today. In 1950, Turing encouraged the public to “accept it as a fact that digital computers can be constructed, and indeed have been constructed, according to the principles we have described, and that they can in fact mimic the actions of a human computer very closely.” CAPTCHAs clock in double shifts - they increase security and advance the field of artificial intelligence – fueled by human computation. They are a win-win, “based on open problems” in the A.I. community, so that when they are in use on the Internet “either a CAPTCHA is not broken and there is a way to differentiate humans from computers, or the CAPTCHA is broken and an AI problem is solved.”

Unlike the Turing Test, the CAPTCHA involves no gamesmanship (but that doesn't mean it's not fun!). The computer is both interrogator and judge. CAPTCHAs first determined that the most important question was whether a user was human, then developed a quantifiable method to answer it, and took off quickly. By now they are ubiquitous in web geography, found in all manner of registration processes. They also buttress simple password-protected accounts; in certain cases too many incorrect entries will prompt a CAPTCHA. Naturally, as CAPTCHAs became more widespread and sophisticated, so did techniques to defeat them – new programs better at reading the text distortion but also, more importantly, entirely new approaches for generating and disseminating UBE. CAPTCHAs had to evolve with their “enemies” to provide security against comment spamming, scrapers, online poll manipulation, dictionary attacks, worms, spam and search engine bots. [5]

[a model Turing Machine]

CAPTCHAs represent “one of the rare moments when the invisible war being waged between spammers and programmers becomes visible to you, the prey.” The users, and their experience, are the victims, “the prey,” pitted against a so-called criminal element in the marketplace that seeks to corrupt the Internet experience for illegal profit. Spam is unsanctioned, not (yet) illegal, because it operates (as) outside (as possible) of an entrenched system of capitalism on the Internet. Spam is not how we do it here, Marissa Mayer might say. One noteworthy way to beat CAPTCHAs is the “brute force” method, which tasks a person or a group of people, not a computer program, with solving thousands of CAPTCHAs manually, rendering the test obsolete.

At the moment where labor appears most likely to slip from an active role, subordinate to faster and more cost-effective computers, it re-emerges as a valuable resource against an (economic) institution’s chosen method of control. What does that lead to, or where? Lev Grossman: “The faster the software evolves, and the harder it gets to distinguish between people and computers, the faster CAPTCHAs have to change. They might soon involve identifying animals or listening to a sound file – anything computers aren’t good at (what’s next? Tasting wine? Composing a sonnet?).” What's next indeed?

CAPTCHAs defend an idealized Internet-space by evoking familiar anxieties, “rogue computer programs masquerading as teenagers…creating havoc” on the web by stealing personal information, dumping UBE, and sending malware. Notions of perversion, invasion, corruption, and abuse saturate the discourse on UBE (not to mention hacker discourse writ large). A CAPTCHA should make us think about work, but the discourse instead focuses on what makes us human and how to define and maintain that difference against not just computers, but a class of computer criminals.

Contained within a CAPTCHA interaction is a litany of key themes that confront the economics and modes of labor in a dominant, post-industrial, late-capitalist society like the United States. Certain ideological boundaries are constructed around what counts as proper use of free services; as organizations lack the comprehensive legal or juridical support to make such activities explicitly criminal, they instead code them as undesirable, unlawful (in the court of public opinion) and dangerous. The user, then, is not just implicated in a war on spam, but drafted into service for the cause every time he or she solves a CAPTCHA. That service is not meaningless action, but productive activity.

CAPTCHAs are not just about spam; they are about knowledge, micro-labor, affect, digitization, informatization, and free time, all taking place in the relatively new economic sphere of the Internet. Protecting the service is no longer the responsibility of the company; the impetus rests upon the user to prove to the company that he or she will be of service to the service, that they deserve the service, that they will do no harm. Forget money – there is plenty of it around - the user exchanges something of true value: information. Free is a misnomer.

Turing’s digital computers of the future were theoretically capable of mimicking humans. Now, to subvert digital security measures, humans both mimic computers and design computer programs to work like humans. This would not have been lost on Turing, whose test complicated the bio/technical divide and played on and with (anxieties around) electro-organic boundaries. The computer has aggregated previously disparate relationships – tool, machine, luxury item and recreational object – into one. In essence, though, CAPTCHAs open us to thinking about the state of work and labor in a post-industrial capitalist country whose job sector is characterized by mass unemployment, by highly speculative, digitized and unstable financial markets, and by the dominance of the service and technology sectors, all founded on/funded by a total reliance on both abstract and concrete information. Labor has always been in a precarious position in Post-Fordist/global capitalism, but work online raises the bar.

On to Marx!


[1] Signed into law in 2003, the CAN-SPAM (Controlling the Assault of Non-Solicited Pornography and Marketing) Act attempted to craft definitions, regulations and punishments for spam. It applies to commercial email and requires a few things: the header must show sending source, destination, and routing information and cannot contain false or misleading information; subject lines must be non-deceptive; the message must be clearly identified as an advertisement; and it must include a return address, clear notice of and a protocol for stopping future emails, and a valid geographical postal address for the sender. The CAN-SPAM Act says just that – one can spam. Though failing to adhere to these principles is punishable under the law, the Act does not address the unsolicited, non-commercial bulk emails that are the more common manifestations of spam. CAN-SPAM is generally thought of as overreaching, difficult to enforce, and ineffective. Simply put, spam is not illegal.

[2] This illuminates an interesting aspect of the Internet and associated technology: an information paradox. The most unique information (name, address, phone number) is also the most unstable. No one verifies that information.

[3] Which just so happens to coincide with the spread of the Internet as we know it today and the rise of the affordable personal computer.

[4] He actually underestimated the amounts of storage and processing power that would be available, but overestimated the computer: no machine had achieved his predicted 30% success rate by 2000.

[5] Scrapers are computer programs that scan the web automatically for email addresses embedded as clear text. The program will read, process, store and then subsequently spam that address.
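The core of such a scraper is little more than a pattern match over fetched page text. A simplified sketch (the regular expression below is illustrative, not a full RFC-compliant address matcher, and real scrapers also catch obfuscations like "name at host dot com"):

```python
import re

# Simplified pattern for addresses embedded as clear text.
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+(?:\.[\w-]+)+")

def harvest(page_text):
    """Return the unique email addresses found in a page's text."""
    return sorted(set(EMAIL_RE.findall(page_text)))

html = '<p>Contact <a href="mailto:jane@example.com">jane@example.com</a> or bob@example.org</p>'
print(harvest(html))  # ['bob@example.org', 'jane@example.com']
```

This is why advice of the era urged users to write addresses as "name [at] domain [dot] com" or embed them in images: anything to break the cheap pattern match and push the cost of harvesting back onto a human.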

**Adapted from a thesis submitted to the Department of Media, Culture, and Communication, under advisement from Alex Galloway, in New York University's Steinhardt School of Culture, Education and Human Development.**