Monday, November 18, 2019

Four Short Reviews

Spider-Man: Far From Home (2019, Columbia Pictures/Marvel Studios) 
This film is a transitional narrative that acts as an extended epilogue to the Infinity War storyline while pointing the direction the Marvel Cinematic Universe (MCU) will take over the next decade. The audience is shown a world irrevocably altered by the events of the last two Avengers films, where eulogizing the recently deceased Tony Stark has become a popular obsession. Within this milieu, a narrative unfolds: a parable about climate change and the nature of reality in a post-truth era. Elementals, raging behemoths that appear as giants composed of water and fire, are revealed to be special effects created by an illusionist, masking a swarm of destructive human-made drones. Even the public revelation of our hero’s secret identity in the final scenes can be read as a commentary on the click-bait culture of ‘gotcha’ moments and online shaming, although editorial bias is far, far older than the Internet.

   
The Boys (2019, Amazon)
A world without moral absolutes is a world of shifting power dynamics, where hierarchical structures jockey to maintain an unnatural asymmetry between haves and have-nots. In The Boys, we see these dynamics play out in a setting where super-humans are tools of corporations, marketed to the public like pop stars and sports idols. In so many ways, this series is the antithesis of the idea of the superhero presented in the MCU; these characters are emotionally stunted, ignorant and self-absorbed to the point of narcissism. The most celebrated collection of these so-called heroes – a group known as The Seven – are a law unto themselves, and yet they are beholden to their corporate masters, who are – in turn – held in thrall by the prospect of state-level influence (i.e., government defense contracts). Each episode can be considered an extended meditation on the misapplication and misuse of power in all arenas of human existence, from the biological to the political to the metaphysical.

  
Legion (2017-2019, FX Productions/Marvel Television)
There can be little doubt that this three-season series is the most aesthetically nuanced, surreal and challenging of any Marvel onscreen property to date. Viewers are initiated into the labyrinthine psyche of a gifted yet psychologically wounded telepath whose personal journey takes him from incarcerated mental patient to cult leader while transcending the bounds of time and space. This psychedelic hero’s journey is truly a trip down the rabbit hole, and probably as close to pixelated LSD as television is likely to get. Psychic conflict on the etheric plane envisioned as a rap battle or a dance-off would never have been considered by most viewers prior to seeing this series; now, popular representations of super-mental abilities – telekinesis, for example – seem archaic next to the consciousness-changing power of shaping reality itself, as imagined on this program. Visionary and provocative, Legion is the most demanding of Marvel Television productions, but it’s also the most rewarding.


Watchmen (2019, HBO)
Perhaps the most relevant to North America’s current political climate, Watchmen revisits many of the perennial themes that made the original graphic novel seem prescient, even though it was published in the mid-’80s. This updating of the story introduces a new cast of characters in a world like ours, but just a little bit different: Vietnam is a U.S. state, for example, there’s no Internet, tobacco is a controlled substance, and everybody drives electric cars. Viewers who have read the book (or seen Zack Snyder’s 2009 film adaptation) know why this is so; newcomers get to play catch-up as the murder of a police chief in Tulsa, Oklahoma takes center stage. The series is still unfolding at the time of this writing, but it holds great promise as a worthy successor to one of the 20th century’s great fictions.

                           

Friday, May 3, 2019

The Last Blockbuster


Blockbuster film sequels used to be liabilities. Now, they’re an essential part of any large motion picture studio’s long-term success. 

Over the last 20 years – since the turn of the century – Hollywood has garnered much of this success by creating hundred-million-dollar blockbuster franchises based on pop culture icons.

Yet the era of the blockbuster may have peaked with the release of Avengers: Endgame. It’s an audacious proposition, since the film is on track to become the highest grossing movie ever. But times are changing, and so is the entertainment industry.

Even if Endgame represents the peak achievement of its genre, popular films will still be made in the coming years and lucrative franchises will persist, but the reach of these productions is likely to be increasingly limited by economic and societal factors.

Growth in the video game industry (which matched the U.S. film industry for revenues in 2018) as well as the presence of online streaming services have permanently disrupted film production, distribution and marketing models of the past century.  

Starting with the motion picture adaptation of J.R.R. Tolkien’s Lord of the Rings (2001-2003), two decades of movie blockbusters have made unprecedented use of cultural and commercial content that wasn’t initially created for the big screen, but for novels, television shows, toys, and comic books.    

This long list need not be revisited at great length, but it includes Transformers, Blade Runner, The Twilight Saga, The Hunger Games, The Chronicles of Narnia, Star Trek, Superman, Batman, Wonder Woman, The Hobbit, et al.

These franchises, however, are not in the same league as the juggernaut known as the Marvel Cinematic Universe (MCU), which, since the advent of the first Iron Man in 2008, has grown to dominate pop culture like nothing before it: 22 movies comprising a consolidated, reasonably coherent epic mythology for the new century.

By one evaluative model – measuring total box office receipts, production and artistic innovation, and cultural resonance – we can identify three high points in the history of movie blockbusters: the Star Wars trilogy (1977-1983), The Lord of the Rings trilogy (2001-2003), and the two most recent Avengers films, Infinity War and Endgame (2018-19).

These franchises earned billions of dollars in worldwide receipts and produced the highest-grossing films of the years in which they were released – excepting The Fellowship of the Ring in 2001, and Endgame, which just arrived in theaters but had already surpassed $1 billion worldwide at the time of this writing (05/01).

In the case of production and artistic innovation, note the breakthrough special effects of the original Star Wars trilogy, the digital wizardry that brought Middle Earth to life (Gollum, my precious), and the fact that Infinity War and Endgame represent the first time that Hollywood feature films were shot entirely using IMAX digital cameras. There are many more examples, but for sake of brevity, these three will suffice.

Cultural resonance is a catch-all term used to get at not just a film’s popularity but how it speaks to audiences, contextually and historically. If box office receipts represent the span, or width, of a film’s influence on society, cultural resonance is the depth of that influence. 

Part of Star Wars’ success resulted from its appeal to myth. The idea that technology and magic could exist alongside one another was novel, and because the hero’s journey – a term borrowed from mythologist Joseph Campbell – was integral to the original film’s plot, it connoted ancient, pervasive and recurring cultural notions of identity and purpose.

As a result, this coming-of-age fairy tale fantasy about space travel and sci-fi samurai changed films and film-making forever, while making later blockbusters possible, including Lord of the Rings (LotR) and The Avengers.

The cultural resonance of LotR requires little (if any) qualification. Author J.R.R. Tolkien created a modern English myth that achieved global popularity and became a template for a unique form of genre fiction after it was published in the mid-20th century.

One can hardly imagine the millennial pop culture landscape without LotR. Middle Earth has a gravity all its own, and holds in its orbit a vast constellation of books, movies, music, games and art. 

The triumphant film adaptations of Tolkien’s books tapped a deep vein in Western culture’s collective consciousness. And as observed earlier, tapping the vein yielded an unprecedented proliferation of films based on numerous pop culture sources.

But nothing equaled LotR’s cultural resonance until the MCU got underway.

There are many reasons for the unique success of the MCU films, not the least of which is the financing and production resources of Disney, Marvel’s parent company. 

Another factor is the remarkable consistency and quality of casting, which gave the films recognizability and star-power in a crowded media landscape.  

But perhaps most important is the fact that the MCU films draw on a great legacy of story, a legacy created by generations of artists and writers working in an industry that, until recently, was considered second-tier to “serious” art. 

This legacy of story, which – in the MCU’s case - originated with the earliest Captain America comic books in the 1940s, is itself linked to an even older tradition of story exemplified in the ancient tale of Gilgamesh, the poetry of Homer and Hesiod, and the epic verse of John Milton and William Blake - all works which spoke of powerful beings having adventures and doing battle in a moral universe governed by gods.

This is the secret behind the success of these films and the stories they tell: We like them because we always have. Movie producers will try to repeat Marvel’s success; DC will focus its scattered efforts into some cinematic distillation, but the result is unlikely to have the coherence of the MCU. 

When the final chapter in the nine-part Star Wars Saga is released in theaters this December, it will conclude a story that has captivated a worldwide audience for several generations. In its place will remain a larger, denser, and arguably less accessible "expanded universe" of related films, books, television and games.

The MCU will go on as well, with a new canon of films brought from page to screen. And even as video games and streaming services vie for more and more of the public’s attention, it will be obvious to all concerned that movies are here to stay. 

Avengers: Endgame may be the last film of its kind, but it isn’t the end.

Monday, March 25, 2019

Time: Genealogy of a Concept

It surrounds us, yet we only observe its effects. It shapes every aspect of our lives, but we don’t interact with it directly. 

We make it, measure it, save it, waste it, compress it, bend it, extend it; we can’t even hold it in our hands. And when we’re asked to define it, every description seems ephemeral and contingent, as if it’s beyond words to explain. 

It is time.

What is it? How do we explicate time? By a moment, which – whether experienced subjectively or intersubjectively – seems to flow, like a river, from the past to the future? Perhaps time can be quantified by the metrics of cause and effect, and the laws of thermodynamics?

Will we know time by its absence, when we contemplate eternity? Can we even posit, with intellectual honesty, what time is when we’re always inside of it, three-dimensional beings embedded in the fourth dimension?   

To unpack these questions, it’s useful to think about time’s conceptual evolution like a genealogy, one that’s been mapped onto Western scientific and cultural paradigms of the past two-and-a-half millennia. 

There exists a limited interval in which to complete this genealogy, so some aspects of temporality (used here as a variant of the word time) – its practical relevance in relation to duration, measurement and technological invention, for example - will remain unaddressed so metaphysical issues can be investigated. 

Here, as in all things, time is against us…or is it? 

As we survey perspectives on temporality, a case can be made that there is an inherited prejudice in relation to time, or - to be precise – an inherited prejudice in relation to certain ideas about time. 

This shouldn’t come as a surprise since, next to knowledge of death, temporal awareness is the most persistent reminder that humanity – for all its civilized progress – will never be fully emancipated from nature’s dominion. 

From an evolutionary perspective, modern humans aren’t far from the savannas of Africa where our prehistoric ancestors are thought to have wandered for many millennia, unprotected from nature in the form of sickness and disease, changing weather conditions, and violent or accidental death.

When our ancestors began living in cities and developing written alphabets, ambivalence towards problematic aspects of nature – including time – found its way into language, which informed the way temporality was thought about and discussed. 

Like civilizations before and after them, the Greeks mythologized their relationship with nature, the psyche and society using a pantheon of gods. 

Multiple deities became associated with various concepts of time, but one of the most revered was the titan Chronos. He was also the most feared, believed to have eaten his own newborn children to prevent them from taking his place as king of the world.

By the earliest decades of the Common Era, ambivalence towards time had been sublimated into numerous philosophical and religious systems. 

Plato’s Theory of Forms and the Christian belief in an afterlife were just two of the Western ontologies that established transcendent orders privileging the absolute, the eternal and the everlasting over the transitory, the temporary and the temporal.

Mathematics and geometry, too, with their stress on fixed shapes and formulas, represented a transcendent reality untainted by the vagaries of time.

An example of the ongoing ambivalence towards all things temporal can be found in the beliefs of the ancient Gnostics. This early Christian cult taught that the initial descent from the divine state – the primeval fall – didn’t take place in the Garden of Eden, or even when Lucifer was cast from Heaven.

The first fall occurred when the universe was created, when time itself began. 

For a Gnostic initiate, matter, the material world, and all that proceeded from it – including time – was corrupt. Their goal was a return to a region of light called the Pleroma.

Elements of this doctrine have been articulated in our era, reformulated by physicist David Bohm, who once enigmatically suggested that all matter is light, frozen in time.

Of course, to a Gnostic living two millennia ago, the inverse of that viewpoint would have seemed the truer statement: that without time, everything is light.

It’s here, reasoning by negation – via negativa – that we intimate what remains in the absence of time: an opportunity to recognize, like Michelangelo seeing a statue in a block of marble, what pieces of stone must be removed to reveal the figure within.

Maybe Saint Augustine of Hippo used his own sort of negation when he famously uttered, “What, then, is time? If no one asks me, I know; if I want to explain it to a questioner, I do not know.” 

Had he been more forthcoming on the subject, Augustine might have said something like, it is the eternal in us that allows us to perceive time, the eternal part referring to the human soul. 

Like time, the concept of soul is larger than any single religion or philosophy; in the 21st century, scientists call the soul consciousness, and seek to know its laws.

Most of humanity believes soul to be their birthright. This eternal essence of a person, thought to consist of subtle numinous energy, is a wide, soaring arch in the metaphysical lattices of countless worldviews, both ancient and modern. 

Soul was a powerful idea to humanity’s prehistoric ancestors, offering existential certainty alongside a communally acknowledged mortality. 

The idea serves the same function today, even though the focus of the drama is now the soul’s redemption from suffering, death, and the ravages of time by adherence to religious tenets or – more recently – faith in immortality technologies.

The European Enlightenment of the 17th and 18th centuries traded the transcendent religious abstractions of time and eternity for the transcendent mathematical abstractions of science and the cosmic clock.

The cosmic clock was invisible and operated perfectly, in perpetuity. Time became an axis on a graph – one of the innovations that allowed the physics of the emerging scientific paradigm to operate with unprecedented predictive power over the natural world.

This predictive power led, incrementally, to the Industrial Revolution and ultimately made Western modernity possible.  

But the idea that time could be measured the same way – no matter where a person was in the universe – only lasted until 1905, when Albert Einstein married the spatial and the temporal in his Special Theory of Relativity and exploded the cosmic clock forever.

After that, it was impractical to think of time in the same absolute terms as people in previous eras had. Temporality – in applied and conceptual considerations - became an area of interest for many fields of inquiry.  

Questions about time, and its relationship to memory, sentience and aging; to society, technology and the physics of gravity, continue to shape our lives in the 21st century. 

Yet even in our high-tech civilization, the temporal prejudice persists, not least in the feeling that there never seems to be enough time.

Is there a way to mitigate this bias so that a renewed appreciation of time may be effected?

Human perception was shaped by pressures of natural selection, and senses that ensured better opportunities for reproduction were favored by evolution. We might conjecture this included spatial and temporal awareness, which can be thought to have developed alongside one another. 

The parallel yet differing progression of these two streams of awareness is envisioned by H.G. Wells in The Time Machine:

“There are really four dimensions, three which we call the three planes of Space, and a fourth, Time. There is, however, a tendency to draw an unreal distinction between the former three dimensions and the latter, because it happens that our consciousness moves intermittently in one direction along the latter from the beginning to the end of our lives.” 
   
Infants learn to navigate the spatial dimension from the moment of birth. For each of us, it is perhaps as close to innate knowledge as we might truly be able to claim; the inner pedagogy that taught our kind to walk upright, with heads raised. 

Perhaps temporal awareness is another sort of innate knowledge, one that develops similarly to the spatial, but more gradually - over a lifetime - so that elders appreciate the folds and creases of temporality in a way that isn’t experientially accessible to the young. 

Much like infants learning to walk, then, we embark on a sort of fourth-dimensional ambulation throughout our lives, and it’s only in maturity that an advanced understanding of time becomes possible. We might call this sort of temporal sense wisdom, if the word didn’t undo the nuance of the description just provided.

With this wisdom comes deeper knowledge of time as a generative force of nature, a force that makes manifest all possibilities - creatio ex tempore, tempore ex creatio – the creative essence from which the world emerges. 

This is the antidote to the prejudice observed throughout this temporal genealogy – the insistence that time devours all.

Against that insistence stands the notion of a prolific temporality, and the wisdom to know that when the universe speaks to us, it speaks in time.