Tales from the Grid

“To err is human, but to really foul things up you need a computer.” (Paul R. Ehrlich)

In the commercialized culture of the United States, which appears decidedly mired in equal-opportunity solipsism, it never ceases to amaze me how impulsively self-centered and short-sighted individuals can become. From the time sink of “social” networking to the world-made-to-order promises of online advertising, one can only wonder what kind of superficial, dehumanized, impersonal, commoditized, vanity-infused biosphere primates will be subsisting in if this virtual egoism continues at full throttle. Are the promising tools of technology and the pace of consumer materialism sculpting humanity faster than we can comprehend the ensuing interpersonal consequences? This reminds me of the tendentiously silly “could god make a rock so heavy that he couldn’t lift it” inquiry; but in our case, there could be something very real about “the control problem.”

In a recent NYT op-ed entitled Outing A.I.: Beyond the Turing Test, the author Benjamin Bratton (no, not Button) refers to the pre-Copernican propensity and anthropocentric hubris involved in how we engineer, project upon, and ultimately fear artificial intelligence (A.I.). Bratton makes a useful distinction between “hard A.I.” and “soft A.I.,” the former being human-level intelligence and the latter being things like smart vehicles and personal-assistant interfaces. The seemingly obvious benefits of soft A.I., despite their occasional frustrations and annoyances, have become so commonplace that most of us would feel as though we were living in the nineteenth century without them. However, as we slouch toward technotopia, what lurks beneath synthetic intelligence seems to be our innate fear of being replaced by deviant, mechanistic, digitally based, algorithmic replicas that do not share our most cherished values (think of Capgras syndrome with computer-based imposters). More importantly, what influence does the anthropomorphism of hi-tech machinery have on the reality of such future scenarios?

I’ve never been one to invest too much anxiety in the idea of a technological “singularity” (once described by the biologist Paul Zachary Myers as “rapture for the nerds”), or a doomsday adumbration of artificial intelligence that makes Stephen King’s Maximum Overdrive seem like the 1984 comedy Electric Dreams. After all, other salient topics appear more immediately disconcerting (e.g., antimicrobial resistance, pandemics, sectarian violence, ongoing racism, economic instability, Insane Clown Posse, and climate change). Besides, you can always unplug your electronic nemesis. However, it’s clear that advances in computer technology have resulted in a layer of cultural evolution that has decreased qualitative contact among humans (including proximity and authentic intimacy). Then again, maybe this is why I was never invited to any Y2K parties in Sunnyvale, CA.

If I did have a legitimate concern, it would be the potential for an unreasonable dependence on A.I. that undermines learning processes, usurps human occupations, outsources face-to-face contact, and ensures indolence when it comes to exercising cognition. To make matters worse, if carbon-based hardware no longer has the beguiling appeal of silicon-based hardware, we might begin to viscerally experience the embodiment of Theodore Twombly. My favorite example of social disconnectedness, while ironically being socially “connected,” involves two people sitting next to one another and communicating via text messages (lol). This kind of absurd interaction transforms what is intended as a supplemental communication device into a palm-sized distraction gadget with wireless partition features. One would hope that a rank-and-file smartphone does not represent something infinitely smarter than its user, but I digress. Navel gazing is slowly being replaced by screen gazing with a synchronized source code. If computer technology was meant to improve the human condition, our capacity for endless self-admiration through ingenuity could backfire if we try to breathe too much life into processors built primarily to improve computation.

In the final analysis, Marshall McLuhan’s debut in Woody Allen’s Annie Hall seems more apropos than ever, and it’s never safe to read Nick Bostrom’s work without a certified IT chaperone who specializes in algorithmic hermeneutics.

Related: https://www.youtube.com/watch?v=GYQrNfSmQ0M
