Information Shelf Life

When I was a student, I developed certain habits and ways of thinking. Most students do, and I imagine that these habits are remarkably similar from one to another. When Karen and I encounter a new and unfamiliar problem, for example, we find a book (or several) on the topic. We know which books to pull from our shelves should we need to reference them for some obscure detail.

Because of our academic careers, we instinctively know an important tenet of research: the oldest source is almost always the most valid. Newer research is, by nature, regarded with some scholarly skepticism because it hasn’t yet been subjected to rigorous debate by the academy. In fact, depending on the discipline, the date of the research can be the most important factor when citing it in other works.

Fast forward to today. Now I make my living on the web, and I am forced to change this habitual way of thinking. Research and relevant information in my field come from blogs more than books, because books become outdated too quickly. I can bookmark the posts that contain essential information, but those bookmarks are fleeting, just like the content to which they point. Older research is treated with disdain – newer is always seen as more relevant, because the technology moves so fast. A year old is often seen as worthless.

This has been a difficult mindset to adapt to, because it seems to eschew the wisdom of what came before. Youthful enthusiasm and “disruption” are prized above experience in a way that academics…or many other professional disciplines…would not tolerate. I see the negative impact of this on my profession, as well: burnout, insane amounts of over-complication, pressure to learn and then leave behind. It’s interesting how we place so much emphasis, so much salvific hope, in our technology, while the exponential pace at which that very technology rushes ahead betrays us, leaves us behind, our perceptions now scattered…damaged goods, as it were.

New research is not a bad thing. We progress because of it. I would go so far as to say that we need it to thrive as a civilization. Forgetting its place, though…allowing it to push aside all that has come before it in the name of progress…counters all of the good that it might do. Critical thinking is important, and forgetting that when we’re caught up in the moment of what seems to be a revolutionary new perspective is imprudent.

And we are the worse for it.

Books as Hardware

I subscribe to the Atlantic. I have, off and on, over the years. Most recently, my subscription is digital. I receive the latest issue each month on my tablet from Barnes and Noble. I’ve wrestled with ebooks since my first experience with them, but magazines make much more sense to me digitally. They feel less permanent by nature. Recently, however, I went back to reference a great article that I had read in the Atlantic, only to discover that issues past a certain date were no longer available.

As it turns out, this is a deliberate choice on the part of the magazine, as all of their articles become available on their website after a period of time. I actually think that this is an excellent choice on their part, although I am frustrated that I can no longer access those issues when I want.

My discovery led to other discoveries as well, and these were much more disturbing. I can no longer download purchased ebooks to my local drive for backup or archival purposes. Barnes and Noble has intentionally removed the ability to do so, as has Amazon. What’s more, I can no longer open previously downloaded books. This is strikingly different from music and movie purchases from, for example, iTunes, which I can easily back up and archive. This decision on the part of the booksellers forces us to trust their clouds with our purchases instead of being able to have what we’ve purchased to read whenever we like. The opportunity for active censorship of what we have available to read in this scenario should make your hair stand on end.

Books aren’t software. What’s concerning about this trend is what it reveals. We hold books in lower regard than other media. We view them as fleeting, ephemeral–no more important than a blog post. Yet, it is in them that we preserve our cultural identity, in them that we experience other points of view and begin to wrestle with the most important aspects of our human condition. Our books contain such a vital piece of our humanity, because we’ve entrusted that to them. In devaluing them in this way, we’ve devalued our own humanness, as well. We’ve declared that it’s expendable, that it’s only data…that we are only data.

Can we be surprised, then, at the way our civility devolves around us? I don’t think that we can.


Divesting Facebook

"Facebook." Photo of a woman holding a plain blue book in front of her face. Used under Creative Commons.

I suppose that I was a relatively early adopter of social media. I remember when Twitter functioned primarily by text message, but my roots go back even further. While I never boasted a MySpace account, I joined Facebook during grad school, when it was only available to students and faculty. I’ll be honest…I joined because one of my colleagues told me that it was a great place to meet girls.

Turns out that she was right: I met Karen on Facebook. As it expanded and grew, I found, or was found by, more and more old friends from the past (oddly, though, never anyone from my undergrad days). I posted updates to those friends during our 24-hour labor experience when our first daughter was born. Facebook was a huge part of my life for a long time.

As I became more and more aware of how carelessly the network regarded my privacy, though, my use of it waned. My profile sat for four years with no use, save the occasional professional necessity. Facebook was obviously becoming a rough neighborhood, even before recent scandals, so, a little over two months ago, I finally followed through with what I had wanted to do years prior. I deleted Facebook.

I wasn’t careless. I exported my data, I confirmed that what I wanted to keep was present, I sorted photos to make certain everything was there. Karen wanted to preserve our chats from when we were dating and engaged, but those were sadly unavailable…apparently Facebook doesn’t keep messages beyond a certain point. Then, I clicked delete.

For those of you considering this, Facebook gives you 30 days to change your mind. All you have to do is log back in! And certainly I was tempted…so much of my life was invested there, recorded there. I held firm, though. I didn’t need the noise in my life.

I was then forced to return to what I suppose would be considered an older way of doing things. I still subscribe to the belief that you should never delete anyone from your address book, personal or professional. Perhaps this comes from the fact that I am old enough to remember keeping a hand-written address book. I intentionally reviewed many of those contacts to make certain that I had them…the groomsmen from our wedding, for example. I’m also still connected to several of these people on other networks…LinkedIn, or Twitter…but there are some that I realize now that I missed. I mourn that I may have lost connection with those people, one of them the person who recommended that I join Facebook in those early days, the person one could say was responsible for Karen and me meeting.

Even more do I mourn the fact that we have permitted a state of affairs in which losing contact with loved ones is as easy as leaving a social network. We’ve allowed someone else to hold that most valuable part of ourselves for their profit, certain to lose some or all of our connectedness unless we choose to remain complicit in their nefarious motives. I wish that we had kept this habit, were intentional about caring for one another deeply enough to make certain that we know how to keep in touch with each other…and then following that with the action of doing so. As revolutionary as social networking was, and as ubiquitous as it has become in our daily landscape, the effort of keeping addresses, and even of writing letters, meant that we truly stayed in touch.

I hope that I can find the space in my life for that intentionality once again.

Image attribution: Alatr0n under Creative Commons.

Data-Driven Mystery

There’s a phrase…I’m certain that you’ve heard it…that says something to the effect that magic is simply science that we don’t yet understand. The underlying premise of this statement is that we can explain everything if we try hard enough, if we think logically enough. This is a premise that leaves no room for the unknown, that makes failure to understand something wrong, perhaps even difficult to forgive.

I’ve been really drawn to the fact that Marvel’s on-screen adventures, both large and small, have begun to explore paranormal characters of late, largely because these characters are in such stark (pun only slightly intended) contrast to the technology-driven and scientifically altered characters that have dominated the broader audience’s exposure to these heroes to date. Part of the reason for my affinity toward these paranormal adventurers is that they are a metaphor for something beyond the physical, a deeper part of our existence that is outside of what we can measure, touch and feel, something so far removed from my profession.

As Lewis told us, the physical part of our world is only a part of the whole, and so much less real in so many ways than the spiritual.

When I was young (read: I’m totally still this way), I used to love post-apocalyptic stories in which science and magic co-existed in the world that had emerged from the ruins (think the world of Thundarr the Barbarian, as an off-the-cuff example), because they symbolize the truth that the physical and the spiritual work together, complement one another. Without either, humanity doesn’t work. To abandon one, or to minimize one in favor of the other, is to set the stage for us to be less than intended. As much as I love my toys, I’m reaching the conclusion that technology ultimately leaves us empty, because it focuses exclusively on the realm of the physical. Technology is our own finite creation. We’ve built it, we can know everything about it. Technology leaves us in the role of God, but pre-supposes that we are gods over a tiny kingdom that appears to us so much larger than it actually is.

Working in technology is creative, don’t get me wrong…as creative as any of my other pursuits. I get to write code that builds some really cool things. Technology, however, takes a poor view of mystery, because mystery implies something that we do not understand. Software can’t (or at least shouldn’t) be released with things that we don’t understand, so not understanding is weakness. If mystery remains in a project, then it is removed and replaced with a different approach that does not contain mystery. Technology is physical, and not only can it be quantified and measured, but must be. The spiritual cannot be. It must leave room for mystery.

Mystery, in technology, cannot be permitted to exist. Interestingly, we view technology as an extension of our lives, lives in which we thus have a perceived need to measure and quantify everything. We don’t want to permit mystery anywhere else, then, either.

Yet mystery is beautiful, because it helps us to understand the limits of our own lives: that our control is an illusion, that we are not, in fact, gods.

Because when we understand that, we begin to recognize that there is something so much bigger than us, something beyond our physical world, something that we cannot measure. What we don’t know is as beautiful as what we know, because what we don’t know leaves room for belief.

And belief leaves room for faith.

And faith leaves room for us all to be so much more compassionate, understanding, and…human…than we currently seem to be. I’m sure we can find data to support that.


Raising the Space Bar

A couple of years ago, I had a debate with a colleague about a comment that I made. The comment was that “our generation” had arguably seen the most significant technological change of any generation in history. He disagreed, feeling that the industrial age had brought more. Whichever side of the debate you might fall on, my rationale was that my grandmother, when she was still alive, seemed to somehow experience an arresting of her ability to grasp technology more advanced than a land-line telephone.

When I was young (and hold on, because I’m about to date myself), my family had a “party line.” That is, we shared a telephone line with my grandmother. If she was using the phone from her home miles away, and we picked it up to make a call, we could hear her conversation, and knew we had to wait until the line was free.

I had my first mobile phone when I was in college. It was one of those huge bag phones that went in your console and connected to an antenna on the exterior of your vehicle. 60 free minutes was a big deal then, and I’m still in the realm of ancient history for many of you. I remember my grandmother calling that number and being baffled by the concept of voicemail. I would have messages from her asking if I was there.

When I was very young, I typed DOS commands into a huge, clunky computer in my bedroom. Now, the phone that I carry in my pocket has more processing power than computers that rendered the original Star Wars films.

My point is that, more than an explosion of technology, people of my age have seen an exponential increase of information, and a fundamental change in how we access that information. We forget what it was like “back then.” The idea that we used to keep a hand-written address book for all of our contacts is foreign; the fact that I went through undergrad carrying notebooks to class for note-taking is bewildering.

Karen and I got rid of cable almost immediately after we married, because there were just too many other ways to watch what we wanted to watch. As such, our daughter has grown up her entire life with no idea of television being anything other than a streaming video service (she knows the difference between Netflix and Amazon). And, yes, I understand that her entire life has been five years, but this has still been her entire life. When we were setting up utilities for this new apartment, however, we got a good deal by agreeing to subscribe to cable also (poor cable providers, struggling so hard to keep an ancient business model alive). We agreed and, for fun, I connected the box, mostly to remember what it was like to watch something on a network’s schedule again.

While we were watching something together a few weekends ago, my daughter and I decided to get a snack during a commercial break. As we got up to go into the kitchen, she pressed the space bar on the computer keyboard to pause the program. It didn’t pause. She pressed it again. It didn’t pause. She gave a confused look.

My attempt to explain the concept of “live TV” to her failed in almost every way, as there is no reference point for her, no scaffolding upon which she can build the idea in her head. It’s amazing to me to think of the lightning-fast pace at which our concept of “normal” accelerates, of how easily we forget…forget in a way that my grandmother, I think, did not, because we forget even the foundation upon which is built our current state of “normal.”

I wonder what our daughter will consider normal when she is my age. I wonder how antiquated the idea of streaming episodes of her favorite programs on Netflix will seem then.

I wonder if that memory will even exist outside of an entry in an external storage device.

I wonder what we will have lost with all of that progress.