You have a choice, dear reader: spend three seconds scanning this blog post, or spend the full 1:11:28 listening to the interview John Brockman did with Kevin Kelly, technology philosopher and founding editor of Wired magazine.
The interview touches upon the nature of technology, big data, surveillance society, money as a medium, techno-literacy and the question whether the universe is analog or digital.
In the spring of 1986, Back to the Future, the Michael J. Fox blockbuster featuring a time-traveling DeLorean car, was less than a year old. The Apple Macintosh, launched via a single, iconic ad directed by Ridley “Blade Runner” Scott, was barely two years old. Ronald Reagan, immortalized by Gore Vidal as “the acting president”, was hailing the mujahideen in Afghanistan as “freedom fighters”.
The world was mired in Cyber Cold War mode; the talk was all about electronic counter-measures, with American C3s (command, control, communications) programmed to destroy Soviet C3s, and both the US and the USSR under MAD (mutually assured destruction) nuclear policies being able to destroy the earth 100 times over. Edward Snowden was not yet a three-year-old.
It was in this context that I set out to do a special report on Artificial Intelligence (AI) for a now defunct magazine, roving from the Computer Museum in Boston to Apple in Cupertino and Pixar in San Rafael, and then to the campuses of Stanford, Berkeley and MIT.
AI had been “inaugurated” in 1956 by John McCarthy, later of Stanford, and Marvin Minsky, later an MIT professor but then at Harvard. The basic idea, according to Minsky, was that any intelligence trait could be described so precisely that a machine could be created to simulate it.
My trip inevitably involved meeting a fabulous cast of characters. At MIT’s AI Lab, there were Minsky and an inveterate iconoclast, Joseph Weizenbaum, who had coined the term “artificial intelligentsia” and believed computers could never “think” just like a human being.
At Stanford, there was Edward Feigenbaum, absolutely paranoid about Japanese scientific progress; he believed that if the Japanese succeeded with their fifth-generation (5G) program, based on artificial intelligence, “the US will be able to bill itself as the first great post-industrial agrarian society”.
And at Berkeley, still under the flame of hippie utopian populism, there was Robert Wilensky – Brooklyn accent, Yale gloss, California overtones – and philosopher Hubert Dreyfus, a tireless enemy of AI who got his kicks delivering lectures such as “Conventional AI as a Paradigm of Degenerated Research”.
Meet Kim No-VAX
Soon I was deep into Minsky’s “frames” – a concept basic to organizing every subsequent AI program – and the “Chomsky paradigm”: the notion that language is at the root of knowledge, and that formal syntax is at the root of language. That was the Bible of cognitive science at MIT.
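A frame, in Minsky’s sense, is essentially a structured record: a bundle of named slots with default values, where a specific frame inherits and can override the defaults of a more general one. A minimal sketch in Python – the class and slot names here are illustrative, not drawn from any historical AI system:

```python
class Frame:
    """A Minsky-style frame: named slots with defaults,
    plus fallback to a more general parent frame."""

    def __init__(self, name, parent=None, **slots):
        self.name = name
        self.parent = parent
        self.slots = slots

    def get(self, slot):
        # Look up a slot locally, then fall back to ancestors' defaults.
        if slot in self.slots:
            return self.slots[slot]
        if self.parent is not None:
            return self.parent.get(slot)
        return None


# A generic "room" frame supplies defaults...
room = Frame("room", walls=4, has_door=True)
# ...which a more specific frame inherits and can override.
kitchen = Frame("kitchen", parent=room, has_stove=True)

print(kitchen.get("walls"))      # inherited default: 4
print(kitchen.get("has_stove"))  # local slot: True
```

The same slot-and-default pattern resurfaces later in object-oriented inheritance and in knowledge-representation languages, which is part of why frames organized so much subsequent AI work.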
. . . . . . .
Human beings don’t have the appropriate engineering for the society they developed. Over a million years of evolution, the instinct to band together in small, compact, belligerent communities served us well. But then, in the 20th century, Man ceased to adapt; technology overtook evolution. The brain of an ancestral creature, like a rat, which sees provocation in the face of every stranger, is the brain that now controls the earth’s destiny.
It’s as if Wilensky was describing the NSA as it would be 28 years later. Some questions still remain unanswered; for instance, if our species no longer fits the society it built, who’d guarantee that its machines are properly engineered? Who’d guarantee that intelligent machines act in our interest?
What was already clear by then was that “intelligent” computers would not end a global arms race. And it would take a long time – until the Snowden revelations in 2013 – for most of the planet to get a clearer idea of how the NSA orchestrates the Orwellian-Panopticon complex. As for my Back to the Future trip, in the end I did not manage to uncover the “secret” of AI. But I’ll always remain very fond of Kim No-VAX.
In the late eighteenth century the majority of people alive on earth were held in slavery or serfdom (three-quarters of the earth’s population, in fact, according to the Encyclopedia of Human Rights from Oxford University Press). The idea of abolishing something as pervasive and long-lasting as slavery was widely considered ridiculous. Slavery had always been with us and always would be. One couldn’t wish it away with naive sentiments or ignore the mandates of our human nature, unpleasant though they might be. Religion and science and history and economics all purported to prove slavery’s permanence, acceptability, and even desirability. Slavery’s existence in the Christian Bible justified it in the eyes of many. In Ephesians 6:5 St. Paul instructed slaves to obey their earthly masters as they obeyed Christ.
Slavery’s prevalence also allowed the argument that if one country didn’t do it another country would: “Some gentlemen may, indeed, object to the slave trade as inhuman and evil,” said a member of the British Parliament on May 23, 1777, “but let us consider that, if our colonies are to be cultivated, which can only be done by African negroes, it is surely better to supply ourselves with those labourers in British ships, than buy them from French, Dutch or Danish traders.” On April 18, 1791, Banastre Tarleton declared in Parliament—and, no doubt, some even believed him—that “the Africans themselves have no objection to the trade.”
By the end of the nineteenth century, slavery was outlawed nearly everywhere and rapidly on the decline. In part, this was because a handful of activists in England in the 1780s began a movement advocating for abolition, a story well told in Adam Hochschild’s Bury the Chains: Prophets and Rebels in the Fight to Free an Empire's Slaves. This was a movement that made ending the slave trade and slavery a moral cause, a cause to be sacrificed for on behalf of distant, unknown people very different from oneself. It was a movement of public pressure. It did not use violence and it did not use voting. Most people had no right to vote. Instead it used so-called naive sentiments and the active ignoring of the supposed mandates of our supposed human nature. It changed the culture, which is, of course, what regularly inflates and tries to preserve itself by calling itself “human nature.”
Copyright Monopoly – Zacqary Adam Green: So you’re an artist, author, or creative person, and you’ve heard the arguments against the copyright monopoly. That it locks away knowledge from the public. That it hurts free speech. That it’s declaring a monopoly on an idea. Okay, but what about your paycheck? As it turns out, the copyright monopoly is a raw deal that helps corporations steal your profits and barely helps you at all.
Occupy Wall Street has directed our attention to the extreme concentration of wealth resulting from decades of policy designed to trickle down prosperity. By using a single type of bank-debt currency, we allocate our labor and resources to benefit a global elite instead of our communities. Can we engage our local leaders and municipal governments to break this currency monoculture? Can global examples of currency ecology provide a map for improving educational experiences, enhancing the arts and building resilience to the fragility of central bank finance mechanisms?