I have the sense that when around an iPhone user, I'm already in the presence of someone with nascent superiority––one who has jumped the evolutionary bell curve via network and become exponentially more functional than me and my thrashed-out Razr. I see the owners recording a random snippet of salsa Muzak in a Mexican restaurant and identifying the track, Twittering their activities in medias res, locating a bakery in Manhattan (not only by its proximity, but by the customer-reviewed quality of its cupcakes), and otherwise doing things that would have seemed God-like ten years ago, and I get the polymer meat-sintering smell of, well, proto-transhumanism.
This hyperbole is, of course, fueled by resentment at my own inability to afford a search engine in my pocket, coupled with a late-night movie geek's dread of all futures dystopian. In a recent Atlantic Monthly article, Bruce Sterling described the iPhone as a Digital Age analogue to the Leatherman Multi-Tool. Like a Leatherman, it aspires to be an all-in-one item––an Ultimate Object––co-opting every possible application and device into its shiny, black carapace.
However, at the moment it is a sloppy beta-generation gizmo––an amalgam of widgetry so wonderful that, like a Leatherman, no one can really get their head around its complete usefulness. Like the Google search engine itself, the new GPhone's open-source potential gifts itself prematurely into the hands of hominids whose basal ganglia fused in the 20th century. I mean, does anyone really take advantage of the fullest possibilities of Google?
The new smartphones seem like an overture to the emerging promises of Mobile Augmented Reality Applications, wireless nomadism, and RFID-tagged geolocative "spimes," set to enfold complex aspects of our material world into an interconnected digital Pangea.
If science fiction is any barometer of the changing nature of tomorrow, there's been some fundamental shift in the concepts and shape of the formless, yet omniscient, digital realm.
In the 1990s, philosophers, writers, and theorists glommed onto the word "cyberspace" as a way of conceptualizing and giving shape to formless, disparate, yet encroaching developments in telecommunications and digital technologies. The word was coined in the early 1980s by William Gibson to describe the VR network connecting the world of his seminal novel Neuromancer.
"Cyberspace. A consensual hallucination experienced daily by billions of legitimate operators, in every nation, by children being taught mathematical concepts… A graphic representation of data abstracted from the banks of every computer in the human system. Unthinkable complexity. Lines of light ranged in the nonspace of the mind, clusters and constellations of data. Like city lights, receding."
The shimmering landscape of neon geodesics, informed visually by the wireframe graphics of early arcade games, struck fire with a certain type of stylistic ambiance in the 1980s à la Max Headroom and Tron. Cyberspace provided a Plato's Cave amidst a dense and hot future. It was something into which one could escape, as the hero of Neuromancer did, from the cold prison of flesh.
That was seemingly the promise of network saturation and Moore's Law of exponentially increasing computational power––artificial realities would eventually become more vivid and pertinent than the real thing. As a teenager in the 1990s, I would have banked on VR goggles and gloves becoming the consumer standard. These weren't merely one teenager's dreams in those belle-époque days of the '90s, however. Philosophers, critical theorists, and journalists attributed to "cyberspace" the same cosmic resonance LSD promised us in the 1960s––a gateway to a new reality. A glimpse into the Technology section of any used bookstore is likely to reveal shelves full of books with "cyber-" as their prefix, like Douglas Rushkoff's utopian Cyberia. Books heralding the evolution of "cyberspace" have moved to sad and ironic antiquity faster than most software manuals.
In 1992, the chrome patina of Gibson's cyberspace was given a colorfully ADHD, social-network mock-up in Neal Stephenson's Snow Crash. In this book, the Metaverse was a clustered city on a black sphere, limited and defined by an adherence to real-world physics and the organizing principles of urban space. As such, it suffered from the same problems of gentrification, zoning infrastructure, and violence plaguing a real city. Semantics were important to the function of the environment: each avatar represented a user, each building a piece of software, and so on. The closest approximation of the Metaverse exists at the moment in MMORPG environments such as World of Warcraft and Second Life. But why did we move away from the idea of cyberspace as a virtual world?
Virtual reality makes sense in science fiction, the same way robots do. Walking, talking C-3POs were a way to inject technology in the form of dramatis personae––talking, moving, reacting, and behaving generally like characters in the story. Similarly, cyberspace was a fantastically surreal background through which to enact adventure via the non-activity of sitting in front of a computer. In practical application, however, personable "robots" have proven for the most part annoying––WaMu ATMs that claim "Sorry! It's been a crazy day!" when they run out of paper, Ask Jeeves (a search engine disguised as a butler), and most notoriously, Microsoft Bob––father of Clippy. We won't accept the metaphor that a machine is a customer service rep, your friend, or anything but a machine.
The same logic ultimately applies to cyberspace. There's not enough function in rendering the abstracts of information and users into semantic constructs, unless it's so the hero can sword-fight a black ICE daemon over a file. The Internet in practice isn't about escaping reality; video games are. That's what inspired Gibson to create cyberspace in the first place. The 1990s Bay Area Aquarian thing was to literally run naked through fields of free information. The 21st century seems to be focused on the tagging, sorting, and defining of actual people, places, and things in meatspace. Second Life is a toy. Google Earth is an application. Real cyberspace is not its own parallel reality, but a digital palimpsest on the actual terrain.
To play with an old neuro-linguistic saw: Whereas once it seemed as though the map would become the territory, what we're seeing instead has been a geolocative, Google-map extroversion where cyberspace overlays itself upon the everyday and mundane like a glossy sheen––applying folksonomy tags and GPS metadata to every object, person, and square inch. The Internet should basically look like this.
More recently than Neuromancer or Snow Crash, SF author Charles Stross has attempted to depict a world of ubiquitous computation in novels such as Accelerando and stories like "Toast." In his near future, every object possesses intelligence. Every human who wishes to remain relevant as the world hurtles up the accelerating Singularity curve carries around an exocortex of dedicated computing power: an emergent swarm of peripheral intelligence and memory banks allowing them to process the daily gigabytes of information necessary to understand the moment-to-moment complex growth of reality.
"About ten billion humans are alive in the solar system, each mind surrounded by an exocortex of distributed agents, threads of personality spun right out of their heads to run on the clouds of utility fog — infinitely flexible computing resources as thin as aerogel — in which they live." ––Accelerando
The result is, well, strange. Strange beyond strange. It's the struggle of the mammalian brain trying to bend itself around a world so fast and complicated that a few hours offline could render one obsolete, "washed up on the evolutionary beach." Stross's characters exist in a state of perpetual future-shock as computing in the form of sentient clothing, luggage, and everything else pervades their lives.
The synergistic shrinking and strengthening of microchips and the permeation of wireless network capability herald a future of an interconnected, RFID-tagged Internet of Things, described in a positive light in this Wikipedia article:
"The idea is as simple as its application is difficult. If all cans, books, shoes or parts of cars are equipped with minuscule identifying devices, daily life on our planet will undergo a transformation. Things like running out of stock or wasted products will no longer exist as we will know exactly what is being consumed on the other side of the globe. Theft will be a thing of the past as we will know where a product is at all times. The same applies to parcels lost in the post."
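The mechanics behind that scenario can be reduced to a toy sketch. Everything below is hypothetical illustration, not any real RFID system or API: a central registry keyed by tag ID, updated every time a reader somewhere scans an object, so that any tagged thing is findable at all times.

```python
# Toy model of an RFID-tagged "Internet of Things": each physical
# object carries a unique tag ID, and every reader scan updates a
# central registry of last-known locations.

class ThingRegistry:
    def __init__(self):
        # tag_id -> {"name": str, "location": str}
        self._things = {}

    def register(self, tag_id, name, location):
        """Enroll a newly tagged object at its starting location."""
        self._things[tag_id] = {"name": name, "location": location}

    def scan(self, tag_id, reader_location):
        """A reader sees the tag; record the object's new location."""
        self._things[tag_id]["location"] = reader_location

    def locate(self, tag_id):
        """Look up where a tagged object was last seen."""
        return self._things[tag_id]["location"]

registry = ThingRegistry()
registry.register("E200-3412", "can of soup", "warehouse, Shenzhen")
registry.scan("E200-3412", "container ship, Pacific")
registry.scan("E200-3412", "grocery shelf, Manhattan")
print(registry.locate("E200-3412"))  # grocery shelf, Manhattan
```

In this cartoon version, "theft will be a thing of the past" simply because the last reader to see a tag pins the object to a place; the hard parts the article calls "difficult" (billions of tags, reader coverage, privacy) are exactly what the sketch leaves out.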
If the positive aspect of this scenario is that everything is findable and recycled cradle-to-cradle, the negative possibilities were recently made clear to me by a religious pamphlet, handed to me in the Times Square subway station, describing RFID as the Mark of the Beast.
"The mark will be a bar code, and the number will be 666."