Neuroinflammation

http://www.pbs.org/wgbh/nova/next/body/brain-inflammation/


The Obsolescence of Senescence

What a drag it is getting old. — Mick Jagger

Attitudes about the aging process vary based on cultural values and demographics. In Greece, for example, it's an honor to be considered elderly, and respect for age is understood rather than hoped for. Likewise, Native American and Chinese cultures revere the elderly and associate advanced age with cultivated wisdom. The United States, however, harbors mixed feelings about the passage of time, while its media-driven marketplace seems unable to relinquish an obsession with reversing age.

From a global perspective, the population of senior citizens is projected to increase from 530.5 million in 2010 to 1.5 billion by 2050.¹ Although the United States is aging more slowly than most countries, the inevitability of having to manage a population whose median age will shift by a significant measure is as certain as the effects of ultraviolet rays on unprotected skin. We had better get used to the idea that the fountain of youth is likely to be connected to a much older water pump. So why is aging so often considered undesirable among purveyors of American popular culture?

Losing a sense of vitality or being considered to have outlived one's usefulness is a predominant concern when the initial pangs of a mid-life crisis eventually subside to unveil the unavoidable realism of advancing years. In addition, many people fear their sexual identity will be irrecoverably compromised with age and that they will no longer be considered alluring once their virility or fertility declines. Even worse, the elderly are sometimes infantilized by being portrayed as "sweet but helpless." As a result, people of all ages spend unquantifiable amounts of time and money on products and services attempting to preserve or reverse the biological calendar. Being older might be okay, but being too old implies inadequacy or irrelevancy and may lead to compulsive attempts to reclaim a semblance of our former selves.

Humans, for the most part, want to be taken seriously while demonstrating to the world that they matter. Criticism based on ageism denigrates the value of our ideas and pursuits as we get older. However, some people always think of themselves as young, regardless of chronology. In fact, thinking of yourself as "forever 21" is probably one reason why mind/body dualism remains so pervasive (the brain evolved protective mechanisms to psychologically detach from excessively contemplating its own demise—unless you're a true existentialist). Unfortunately, I never suffered from this mindset and ironically adopted the reverse attitude of seeing myself as "too old" even when I was obviously too young.

Adhering to conventional age norms never seems to help anything either—especially when it comes to actually being elderly. For example, if you've spent most of your life in a state of comparison with others, you'll eventually imagine or realize everything you missed out on once maudlin contemplation is given carte blanche. This reflective quagmire is what the psychologist Erik Erikson referred to as despair, the darker pole of his final psychosocial stage of integrity versus despair. Likewise, the echo chamber of forlornness only needs a few regrets and a wall of remorse to create a symphony of crestfallen reverberation. Yearning for an ideal self that was never truly manifested, but remains frozen in the mind's memory of some exuberant past, is a sure recipe for tribulation.

Many religious traditions believe in the restoration of youth during an afterlife. Most everyone has encountered a tawdry pamphlet depicting an elderly couple as their disembodied souls float into a refulgent paradise. Of course, those souls (curiously still in the form of bodies) just so happen to resemble Cary Grant and Audrey Hepburn in their prime. Nothing more is needed at this point to convince us where the preferred epoch of human flourishing resides.

Time, in a thermodynamic sense, is how we track the dissipation of energy, and humans inevitably move toward a state of higher entropy as they age. From an evolutionary perspective, cell division, accompanied by the death of cells, is a lifelong process, and accumulating errors in that process increase the risk of succumbing to various diseases. Simply put, the body is a vehicle that supports our reproductive cells long enough to propagate as we pass the "genetic torch" to future generations. If a human lifespan could be captured via time-lapse cinematography, we would see how closely existence resembles the life cycle of an annual plant (from germination to decay). Aging is a destination that makes the process of growing up possible.

The English actor Dirk Bogarde once said there was no impurity greater than age while he was relaxing on a beach in the film Death in Venice. Although I'm speculating on his intended meaning, I believe he was referring to life's accumulation of traumatic experiences combined with the "unsightly" changes in form as bodies move from youthful pulchritude to full maturity.

Stereotypes of the elderly may have an ingredient of legitimacy, but the real possibilities of self-actualization become accessible only when the impetuousness of inexperience is replaced with the sagacity of time. If we as a culture could replace phrases like “over the hill” with words like “venerable,” while acknowledging that form is ephemeral, maybe our perspectives on aging would also reach full maturity.

1. Pew Research Center data, 2014.

Tales from the Grid

To err is human, but to really foul things up you need a computer. — Paul R. Ehrlich

In the commercialized culture of the United States that appears decidedly mired in equal opportunity solipsism, it never ceases to amaze me how impulsively self-centered and short-sighted individuals can be or might eventually become. From the time sink of “social” networking to the world-made-to-order promises of online advertising, one can only wonder what kind of superficial, dehumanized, impersonal, commoditized, vanity-infused biosphere primates will be subsisting in if this virtual egoism continues full throttle. Are the promising tools of technology and the pace of consumer materialism sculpting humanity faster than we can comprehend the ensuing interpersonal consequences? This reminds me of the tendentiously silly “could god make a rock so heavy that he couldn’t lift it” inquiry; but in our case, there could be something very real about “the control problem.”

In a recent NYT op-ed entitled Outing A.I.: Beyond the Turing Test, the author Benjamin Bratton (no, not Button) refers to the pre-Copernican propensity and anthropocentric hubris involved with how we engineer, project upon, and ultimately fear artificial intelligence (A.I.). Bratton makes a useful distinction between “hard A.I.” and “soft A.I.,” with the former being human-level intelligence and the latter being things like smart vehicles and personal assistant interfaces. The seemingly obvious benefits of soft A.I., despite their occasional frustrations and annoyances, have become so commonplace that most of us would feel like we were living in the nineteenth century without them. However, as we slouch toward technotopia, what lurks beneath synthetic intelligence seems to be our innate fear of being replaced with deviant, mechanistic, digitally-based, algorithmic replicas that do not share our most cherished values (think of Capgras syndrome with computer-based imposters). More importantly, what influence does the anthropomorphism of hi-tech machinery have on the reality of such future scenarios?

I’ve never been one to invest too much anxiety in the idea of a technological “singularity” (once described by the biologist Paul Zachary Myers as “rapture for the nerds”), or a doomsday adumbration of artificial intelligence that makes Stephen King’s Maximum Overdrive seem like the 1984 comedy Electric Dreams. After all, other salient topics appear more immediately disconcerting (e.g., antimicrobial resistance, pandemics, sectarian violence, ongoing racism, economic instability, Insane Clown Posse, and climate change). Besides, you can always unplug your electronic nemesis. However, it’s clear that advances in computer technology have resulted in a layer of cultural evolution that has decreased qualitative contact among humans (including proximity and authentic intimacy). Then again, maybe this is why I was never invited to any Y2K parties in Sunnyvale, CA.

If I did have a legitimate concern, it would be the potential for an unreasonable dependence on A.I. that undermines learning processes, usurps human occupations, outsources face-to-face contact, and ensures indolence when it comes to exercising cognition. To make matters worse, if carbon-based hardware no longer has the beguiling appeal of silicon-based hardware, we might begin to viscerally experience the embodiment of Theodore Twombly. My favorite example of social disconnectedness, while ironically being socially "connected," involves two people sitting next to one another and communicating via text messages (lol). This type of absurd interaction transforms what should be a supplemental communication device into a palm-sized distraction gadget with wireless partition features. One would hope that a rank-and-file smartphone does not represent something infinitely smarter than the user, but I digress. Navel gazing is slowly being replaced by screen gazing with a synchronized source code. Even if computer technology was meant to improve the human condition, our capacity for endless self-admiration through ingenuity could backfire if we try to breathe too much life into processors built primarily for improving computation.

In the final analysis, Marshall McLuhan's cameo in Woody Allen's Annie Hall seems more apropos than ever, and it's never safe to read Nick Bostrom's work without a certified IT chaperone who specializes in algorithmic hermeneutics.

Related: https://www.youtube.com/watch?v=GYQrNfSmQ0M