Data-Driven Mystery

There’s a phrase…I’m certain that you’ve heard it…that says something to the effect that magic is simply science that we don’t yet understand. The underlying premise of this statement is that we can explain everything if we try hard enough, if we think logically enough. This is a premise that leaves no room for the unknown, that makes failure to understand something wrong, perhaps even difficult to forgive.

I’ve been really drawn to the fact that Marvel’s on-screen adventures, both large and small, have begun to explore paranormal characters of late, largely because these characters are in such stark (pun only slightly intended) contrast to the technology-driven and scientifically altered characters that have dominated the broader audience’s exposure to these heroes to date. Part of the reason for my affinity toward these paranormal adventurers is that they are a metaphor for something beyond the physical, a deeper part of our existence that is outside of what we can measure, touch and feel, something so far removed from my profession.

As Lewis told us, the physical part of our world is only a part of the whole, and so much less real in so many ways than the spiritual.

When I was young (read: I’m totally still this way), I used to love post-apocalyptic stories in which science and magic co-existed in the world that had emerged from the ruins (think the world of Thundarr the Barbarian, as an off-the-cuff example), because they symbolize the truth that the physical and the spiritual work together, complement one another. Without either, humanity doesn’t work. To abandon one, or to minimize one in favor of the other, is to set the stage for us to be less than intended. As much as I love my toys, I’m reaching the conclusion that technology ultimately leaves us empty, because it focuses exclusively on the realm of the physical. Technology is our own finite creation. We’ve built it; we can know everything about it. Technology leaves us in the role of God, but presupposes that we are gods over a tiny kingdom that appears to us so much larger than it actually is.

Working in technology is creative, don’t get me wrong…as creative as any of my other pursuits. I get to write code that builds some really cool things. Technology, however, takes a poor view of mystery, because mystery implies something that we do not understand. Software can’t (or at least shouldn’t) be released with things that we don’t understand, so not understanding is weakness. If mystery remains in a project, it is removed and replaced with an approach that contains none. Technology is physical, and not only can it be quantified and measured, but it must be. The spiritual cannot be. It must leave room for mystery.

Mystery, in technology, cannot be permitted to exist. Interestingly, we view technology as an extension of our lives, lives in which we thus have a perceived need to measure and quantify everything. So we come to permit no mystery anywhere else, either.

Yet mystery is beautiful, because it helps us to understand the limits of our own lives. The fact that our control is an illusion, that we are not, in fact, gods.

Because when we understand that, we begin to recognize that there is something so much bigger than us, something beyond our physical world, something that we cannot measure. What we don’t know is as beautiful as what we know, because what we don’t know leaves room for belief.

And belief leaves room for faith.

And faith leaves room for us all to be so much more compassionate, understanding, and…human…than we currently seem to be. I’m sure we can find data to support that.

 

Raising the Space Bar

A couple of years ago, I had a debate with a colleague about a comment that I made. The comment was that “our generation” had arguably seen the most significant technological change of any generation in history. He disagreed, feeling that the industrial age had brought more. Whichever side of the debate you might fall on, my rationale was that my grandmother, when she was still alive, seemed somehow to experience an arrest of her ability to grasp technology more advanced than a land-line telephone.

When I was young (and hold on, because I’m about to date myself), my family had a “party line.” That is, we shared a telephone line with my grandmother. If she was using the phone from her home miles away, and we picked it up to make a call, we could hear her conversation, and knew we had to wait until the line was free.

I had my first mobile phone when I was in college. It was one of those huge bag phones that sat in your console and connected to an antenna on the exterior of your vehicle. Sixty free minutes was a big deal then, and I’m still in the realm of ancient history for many of you. I remember my grandmother calling that number and being baffled by the concept of voicemail. I would have messages from her asking if I was there.

When I was very young, I typed DOS commands into a huge, clunky computer in my bedroom. Now, the phone that I carry in my pocket has more processing power than the computers that rendered effects for the original Star Wars films.

My point is that, more than an explosion of technology, people of my age have seen an exponential increase of information, and a fundamental change in how we access that information. We forget what it was like “back then.” The idea that we used to keep a hand-written address book for all of our contacts is foreign; the fact that I went through undergrad carrying notebooks to class for note-taking is bewildering.


Karen and I got rid of cable almost immediately after we married, because there were just too many other ways to watch what we wanted to watch. As such, our daughter has grown up her entire life with no idea of television being anything other than a streaming video service (she knows the difference between Netflix and Amazon). And, yes, I understand that her entire life has been five years, but this has still been her entire life. When we were setting up utilities for this new apartment, however, we got a good deal by agreeing to subscribe to cable as well (poor cable providers, struggling so hard to keep an ancient business model alive). So, for fun, I connected the box, mostly to remember what it was like to watch something on a network’s schedule again.

While we were watching something together a few weekends ago, my daughter and I decided to get a snack during a commercial break. As we got up to go into the kitchen, she pressed the space bar on the computer keyboard to pause the program. It didn’t pause. She pressed it again. It didn’t pause. She gave a confused look.

My attempt to explain the concept of “live TV” to her failed in almost every way, as there is no reference point for her, no scaffolding upon which she can build the idea in her head. It’s amazing to me to think of the lightning-fast pace at which our concept of “normal” accelerates, of how easily we forget…forget in a way that my grandmother, I think, did not, because we forget even the foundation upon which our current state of “normal” is built.

I wonder what our daughter will consider normal when she is my age. I wonder how antiquated the idea of streaming episodes of her favorite programs on Netflix will seem then.

I wonder if that memory will even exist outside of an entry in an external storage device.

I wonder what we will have lost with all of that progress.

Selfie Abandonment

Earlier this week, I updated the operating system on my phone. Karen always says that I should block off two hours for such an event, because I like playing with shiny new things. She’s not altogether wrong (although I think that two hours might be stretching it a bit). In browsing through some of the changes that this particular update brought about, I landed upon something unsettling.

There’s a new album in my photos. It’s called “Selfies.”

I feel a bit…ill.

Because the selfie isn’t something that I do, nor will I ever, it’s interesting to see what this “intelligent” album is including…candid photos of me in a funny hat, for example, that Karen took with my phone one day while we were shopping for something. I mean, if a photo includes only my own face, then it must meet the definition of a selfie, at least by the software’s calculations, right?

Except I’ve seen this thing in the wild, and this isn’t it.

A few weeks ago, we were visiting family in a different state and were out to dinner. It was Sunday afternoon, and the restaurant was a bit busy. Another family sat in the booth just to the left of us, grandparents and a teenage granddaughter if appearances were accurate. During a break in our conversation, my attention drifted over to them. Their conversation was still moving along, and I noticed the granddaughter slip her phone from her purse, smile into it, snap a photo of her face (it was a phone the size of a tablet, so it was difficult to miss what had happened), and begin typing whatever message was to accompany its posting to whichever medium she had chosen.

What disturbed me about this was the alienation, however momentary, of the people with whom she was sharing this moment, in order to, it would seem, propagate her likeness to people with whom she was not in “real” contact. I struggle with having to keep my back to television screens when I’m in a restaurant so that I can focus on those with whom I’m eating. I don’t need to distract myself further by considering what my Twitter followers might think of where I am, or whether or not I’m smiling while I eat my food, or whatever. I’m there with people, interacting interpersonally. Isn’t that a larger priority?

Fast forward a couple of weeks, when Karen and I were eating with our daughter during a Saturday afternoon shopping excursion, and I saw a group of teenage girls outside the restaurant window pausing at a set of steps in the mall to snap a quick selfie, taking time to compose the photo just right. That day was the first time I actually saw the “selfie stick” phenomenon in use.

I was equally disturbed.

Now, I know the literature and opinions that claim that the selfie is simply a controlled form of expression of one’s image. There’s actually some research claiming that this is a healthy expression, a way to take back control of one’s image within one’s social circles in a way that one chooses, that one controls, that is not objectifying and is thus empowering. I’m afraid I must disagree. The selfie is the height of narcissism, and it’s distasteful to me not because I love myself nearly that much (I don’t, and I don’t think anyone should, in the interest of good health), but because I believe that we should love those around us more.

I see the selfie, however, not as a surprising cultural event, but rather as the natural result of a market-driven society, a society that reduces everything and everyone to being a “brand.” Acceptable, perhaps, when marketing a product. Dehumanizing, however, when applied to our interpersonal…and intrapersonal…existences. The selfie is something flat, something one-dimensional, something lacking substance, because it focuses on image-management, presenting a crafted representation of how the individual wishes they were. I suspect that this gives the person the escapist ability to avoid considering their true condition, to dance around the existential questions that we all grow as human beings by asking.

My goal is not to sound curmudgeonly. I don’t wish everyone to go through life examining their every flaw with no joy or escapist outlets whatsoever. I’m human, and I’m as escapist as anyone else. I’m concerned, though, by the painfully inward, selfish focus that our culture not only permits, but rewards. The less we know…or care about…our neighbor, the further we sink as a people. The less concern with others that we permit ourselves, the less human we become.

I complained on the night that I wrote this that I didn’t get to buy something that I had wanted because we needed to spend the money on our daughter’s educational supplies. I groaned that I feel I don’t get what I want because others need what they need more.

The more inwardly focused I become, the more miserable I become.

I don’t really want to be miserable while masking it under the guise of a well-planned photo of myself in various surroundings.

I don’t want to be miserable at all.

I certainly don’t want to fake it, either.

Image attribution: Yasmeen under Creative Commons.

Artificial Intelligence

In Brian Michael Bendis’ story arc for Marvel’s 2013 graphic novel Age of Ultron, we are presented with an unexpected present that one would initially guess to be an alternate future. The artificial intelligence run amok known as Ultron has succeeded in destroying most of humanity. The handful of people who have survived in the world’s major cities have an even smaller handful of heroes among them, hiding underground and attempting to form a strategy to overcome Ultron. While Bendis deals with many themes in these pages, one of the most prominent is the need for what we view as progress. Bendis makes us privy to the internal dialogue of Hank Pym, the Avenger known as Ant-Man and the creator of Ultron, as he wrestles with the potential to benefit humanity that he sees in the concept of the Ultron artificial intelligence. The reader is left feeling…skeptical…of what Pym wants to achieve, understanding that his hopes are misplaced. However, his motivations are clearly pure. He wants to help.

When faced with this extinction of humanity, Wolverine makes a more difficult choice. He wrestles with the decision of whether or not to travel back in time and end Pym’s life before he can create Ultron. The reader is even more dubious of these intentions, but Wolverine sees no other real alternative. The choice between one life or millions of lives is clear to him in that moment.

In the 2015 cinematic version of Age of Ultron, Tony Stark encourages Bruce Banner to assist in exploring the artificial intelligence that will become Ultron. He presses Banner to accept that this is who they are, the “mad scientists,” and that they must do what they do.

That is the intelligence that I would find artificial.

C.S. Lewis points out a sound philosophical truth: just because we can do something doesn’t necessarily mean that we should. Yet the logical fallacy that “can” must necessarily lead to “do” drives much of what we view today as progress. Humans as a race are always pressing forward, always confronted with our own mortality, seeking to make life more palatable not only for ourselves, but for our successors, our children. Once we discover that we are capable of something that we perceive as good, we feel an overwhelming drive to do that thing, hang the consequences.

Part of the tragedy of the character of the mad genius is that s/he works in isolation much of the time, receiving no feedback from other people about the plan. No one can see all of the failings of their own plans…everyone needs another party to hear their ideas, to proofread their work, as it were.

I think that our decisions, sometimes very important decisions, are becoming rushed. A desire to help others is a noble thing, but not every wonderful idea to better mankind turns out to be such a wonderful idea. In short, innovation must be checked by wisdom, and that wisdom is in short supply when crowd mentalities rush to gather around what is popular, without giving careful thought to what it is that they might be supporting.

I’ve been accused of being a futurist, because I become excited about the potential of what new discoveries and technologies can offer us. I see the problems that they solve, and dream of how much simpler life might be with that problem solved. Then I have to pause, I have to step back and examine whether or not my excitement is equivalent to Pym’s excitement as he dreams of Ultron. Sometimes I continue to see minimal negatives, and sometimes I feel uneasy, a misgiving that gives pause, and is usually justified when I think the issue through carefully. I’m no inventor…I don’t build exciting new things. I’m certainly no entrepreneur…while I dream of new stories and worlds, I don’t formulate new strategies to change the world as we know it. So, granted, I’m not in a position to truly understand many of these things. I am, however, a critical thinker. I believe in examining things through a lens of close observation. I think of what great science fiction writers have written, warning us of the potential outcomes of some of our innovations, and I recall Tillich’s observation that artists are the prophets of our time, warning of dangers before the rest of us can see them.

And I wonder about the dangers, unseen in the excitement over the good.

I wonder.

Photo, by Pascal, is public domain.

Blogging Nostalgia

Perhaps it’s my age, but I’m prone to nostalgia lately. More, in fact, than I would care to admit over the past couple of years. It’s not just music, mind you, although I’ve pined my share over that. It’s not just old Saturday morning cartoons, or even old breakfast cereals, though I’ve certainly found myself drawn to those quite often of late. No, the chronology of my longings isn’t nearly so narrowly defined. In fact, other things, things from barely a decade ago, have piqued my reflective longings recently.

And yes, I do realize just how much I’ve dated myself in that last statement.

Is there a point to this? Yes. The point is this post from a blog that I began following years ago when I was writing prose more than code (and beginning the novel that I swear I’m going to finish at some point). As the comments poured in over the subsequent weeks, it became obvious that I wasn’t the only reader with whom Mr. Bransford’s thoughts had resonated. I’ve enjoyed reading those thoughts. I’ve always enjoyed reading others’ thoughts. That’s what was always so powerful about the blog.


I began writing a blog as an experiment back in 2005, and, although I rarely read that first post, when I do, it makes me pause to think about what’s changed about the writing and the writer over that decade. The purpose of this space changed as my focus and interests became more defined (“faith, art, and culture” came more than two years after I began blogging), an epiphany that happened in large part because of my writing here. I found my voice as a blogger…so different from that first post…along with that focus. Simply, I came to call myself a blogger, to take this seriously. Certainly, I’ve waxed and waned a bit in my frequency of posting over the years, but I’ve never left. I’ve waxed and waned in my reading of others’ blogs, as well, no longer finding the time to peruse my feeds every day, but more likely once weekly.

I initially found these blogs through a bit of a curated experience, of course. I began, as many bloggers did, with Blogger (I was writing there before its acquisition by Google), and, like many bloggers, I outgrew it. Like many bloggers, I used blogrolls to discover and be discovered. I was always looking for a new blog to add to my reading list, because the things that you discovered, the things that you learned, by reading the thoughts of people from all over the globe, were so amazingly enriching, so profoundly important.

I met friends through blogs. People passionate about blogging, and passionate about writing. People passionate about faith and theology, about the arts and so many of my other interests. Some faded away over the years, and I’ve lost touch. Others I’ve met in person and continue to communicate with to this day.

I commented on posts. I subscribed to comments. My posts received comments. We interacted, those other bloggers and I. We discussed, almost always civilly, and, in doing so, we learned things and grew.

This wasn’t just about entertainment. It never was for me. It’s more important than that. More profound.


So, nostalgia. Nostalgia because I miss what it was. I’m not saying that blogging no longer exists, or is no longer important, or that it’s only on the fringes and matters to only a few writers who refuse to accept change. There are those who say that, and I couldn’t disagree more. Blogging isn’t the only option now, and it isn’t the only way to discover other people and their thoughts. I don’t comment nearly as much as I used to, nor do my posts receive as many comments, even though the number of you reading these posts has only grown. That’s okay…it’s the evolution of the medium. I sort of miss it, though, because the discussion is what made this so special, so different from the streams of consciousness that are social networks, for better or worse.

What feels most absent is the discovery of others’ blogs. I miss going looking for new blogs. I miss not having the discovery process dominated by the algorithms of Facebook or Twitter. To be honest, I miss having the time to do this discovering.

Many of the blogs that populated my feed years ago are no longer active. They exist, but their most recent posts are two or three years old. Some no longer exist at all…they’ve been taken down, their domain names now belonging to others. I’ve no intention of doing that for some time to come, although I’m not nearly naive enough to believe that this medium will never be replaced by another, that this space will never cease to exist at some point in the evolution of technology. There are, however, a lot of very active blogs out there, and I don’t fall into the “it’s over and I’ll always miss it” sort of nostalgia of many of the commenters on Bransford’s post. There are fewer personal blogs, perhaps, as more have become focused on what we do for our livings, as the professional and the personal are tragically forced to meld beyond healthy boundaries. But there are still blogs, good blogs, waiting for readers with the time to engage the writers’ thoughts.

Not just their in-the-moment impulses. Their thoughts. The stuff that makes us grow, that expands who we are as people, that helps us to know each other better…and hopefully even, in an ideal circumstance, hurt each other less.

That’s why this is so important, and why I’m nostalgic for what it was, even while being fascinated by what it becomes.