Speeding Up, Dumbing Down

The first time that I heard of the concept of “speed reading,” I think I was in elementary school. Being firmly in the realm of the imaginary at the time, I liked to pretend that I could do it (I also boasted of various super-powers at different times…ah, childhood). I wanted to be the superstar who could blow through the novel that everyone else was sluggishly complaining their way through for class.

In the real world of today, I’m a relatively slow reader. I read a lot in volume, but I still progress through books slowly, at least compared to my friends and to Karen, who devours a 300-page novel in a night. One of the reasons I progress so slowly through books is that I pause periodically to digest what I’ve read. I like to think about it, reflect on it. For that reason, I tend to move through only a few chapters at a time before putting a book down for the night. I would rather really know a few books than be loosely acquainted with many. I guess I read like an introvert.

Increasing my quota by hastening my reading time never really held any appeal for me. Whatever pace one reads at, I reasoned, pausing to engage with and think through what you’ve just read is important. That part of the process simply can’t be rushed.

The idea of speed-reading is useless to me in my “old age.” I think that’s why I cringe when I hear of a popular app like ReadQuick, which is built for the purpose of teaching us to read faster. If history shows us anything, it’s that speed and quality are almost always mutually exclusive. When multi-tasking is a prized activity and there’s always more to accomplish on any given day, sacrificing our engagement with the written word could carry drastic and far-reaching consequences.

Perhaps I should take comfort in the fact that trends tend to be cyclical. Maybe slowing down will one day be all the rage again.

Not-So Private Eyes

Last week I was having one of those random conversations with a colleague that occur when you both need a break from whatever task you’re attempting to accomplish. Specifically, we were talking about Google Glass, because only days before, at a professional networking dinner, I had experienced my second encounter with Glass “in the wild.” During that dinner, I had been disabused of the notion that Glass requires a verbal command from its user to do things like record video or take photos. I learned that, with a few taps on a connected tablet, images and video could be captured with no one else in the room any the wiser.

I still get a bit of a creep factor when thinking about Glass.

This led to a discussion of how often we are recorded each day, which in turn led to the absence of the expectation of privacy in public places, the principle by which traffic cameras and the like are both legally and ethically justified. Glass is different, though, because being with someone who is wearing it degrades our expectation of privacy in private spaces, something to which we have previously been entitled.

My colleague thinks that recording everything has positive implications, because video records are the ultimate historian. The camera, in theory, doesn’t lie (although it’s amazing what can happen to the truth with a few edits). His perspective is that history would be preserved more accurately if everything were recorded at all times.

Well, theoretically, it’s difficult to disagree with him there, although one has to wonder how history would account for the negative space (you could watch someone do something, but perhaps never piece together why they had done it).

Stepping beyond the theory into the realm of the pragmatic, however, I think there’s another issue at play. John Twelve Hawks toys with the idea of the panopticon in his novel, The Traveler. In society, just as in Bentham’s prison, people will always behave as though they’re being watched if they believe that they are, or could be, watched at any given time.

My concern is that we are already languishing in a culture that is driven by appearance, eschews depth whenever possible, and brands and markets everything, including people. With that level of shallowness already in place, what are the implications of feeling as though we are watched all the time? What would be our reaction? Could we ever be (to use an over-used expression) authentic with anyone again? The potential social damage of Glass goes beyond even more immersion in data from an augmented reality technology. It threatens to further erode what are already tenuous human relationships, relationships that occur only on the periphery of our screens.

Inwardly Linked, and a Downward Spiral

LinkedIn pen

I’ve never made a secret of the fact that I hate the monster that Facebook has become. I avoid that network at nearly all costs (as the digital sagebrush blowing across the face of my dusty profile will attest). I use several other social networks, though (you know, the non-monsters), and one of them is LinkedIn. This is because, in my new vocation especially, it’s how you get jobs. LinkedIn profiles are just what you do, and what you use, if you work in the world of technology.

I like LinkedIn, though, because it’s relatively isolated and specialized as a network, which is exactly what a professional network should be. I post professional items of interest there that generally wouldn’t go into my other networks, mostly because they just wouldn’t be of as much interest to people elsewhere. In that way, it remains a bit of a mystery to me, because (here’s a shocker) I really don’t get corporate culture. I don’t understand business-speak. I sometimes roll my eyes at the digital presences of those who are acting professional, as all of us do, in the workplace…that is to say, acting differently than they would anywhere else. That doesn’t take away from the fact, however, that LinkedIn has its uses, and very credible uses, at that.

If our culture has ever been guilty of anything, it’s the lie of the self-made man. “Work hard, play hard” has somehow morphed into “work until you drop if you know what’s good for you, and then play if you have any time and/or energy left.” America is, if anything, a culture of hard workers. Perhaps I sound cynical as I say that, but as it becomes easier, and thus more expected, to work from anywhere, it becomes more natural to work all the time. And, because that’s often the professional expectation in today’s world, it also filters down to the educational realm. The drive to succeed in school at earlier and earlier ages (read: pre-university) becomes more and more intense, robbing children of their childhoods way too early.

Don’t hear me say that I don’t value education…quite the contrary (as our daughter’s love of books and extensive vocabulary would prove, to say nothing of the bookshelves of old grad school books lining our walls). I think, though, that pressure to achieve doesn’t belong in our academic settings before the university level.

So, the story that ran a few days ago that LinkedIn will now allow children as young as thirteen years of age to have profiles caused my heart to sink. Because a professional presence isn’t something that a thirteen-year-old should ever, ever have to concern themselves with. They will reach an age when they have to do that all too quickly, when they spend increasingly long hours at their job just to make ends meet, at which point, no matter how much they might love what they do, something of their innocence is lost.

This is, at the end of the day, a profit-seeking move for LinkedIn, I’m sure. They have to, after all, compete with other networks (although that is something I don’t understand…they do one thing well, and I’m a firm believer in stopping there). The venture will likely prove profitable, but it will prove a socially costly mistake if it leads us to expect higher professional and academic achievement from our children at earlier and earlier ages.

Let them learn. Let them play. Let them hold onto those days to which many of us wish that we could return.

And let’s not make them rising corporate stars quite yet, okay?

Photo Attribution: Sheila Scarborough under Creative Commons  

Virtual Theatre

Concert lights, attributed to iurte under Creative Commons

Sometimes I feel as though I’m glued to a computer screen way too much.

Now that I’ve changed careers for my day job, I spend most of my day in front of a computer writing code or designing page layouts. It’s fun, don’t get me wrong. But I lament (as does my back after several stationary hours) the loss of the chances I used to have to be more physically active. I’m still involved in theatre, and this is a huge outlet for me to be physically active…something that I desperately need now. Between those two things and trying to keep some kind…any kind…of writing rhythm, I stay incredibly busy.

And I continue to be amazed at how much everything is alike in so many ways.

I listened to an interview with a web designer several months ago. She, too, had worked in theatre before beginning a career in the web, so there was immediate common ground for me there. She likened scenic design to web design, and I’m inclined to agree with her. There’s a creative component and technical component to each. The scene designer begins with sketches for what the setting of the fictional world should look like. Then, logistical issues are considered. Models are built. Working construction drawings are made, and then the lumber and drills and saws come into play as the sketches begin to take shape. Everything I know about power tools I learned in a college scene shop. In my experience in the theatre, there is almost always a technical director who oversees the construction of what the designer envisioned. He or she works off the drawings the designer provides and handles the technical details of the building.

The web designer also begins with drawings, but of a digital variety. Wireframes and prototypes are built in software like Photoshop and result in a visual model of what the website will look like. A developer then takes those prototypes and begins writing the code that will build them for your browser.

I began my theatre experiences on the technical side. I spent a lot of time doing the technical work of implementing others’ designs. I did my share of designing, as well, but a sketchbook and I were awkward companions, at best. I do the same thing for the web. I do some design work, but generally only page layouts, not real graphics work. I spend a lot of my time coding what others have designed.

Another similarity is that the web has its own sort of rehearsal process. As I write this, I’m getting ready to move a big project to a testing server for a dry run of how it will work. This is only for a select few people…the world won’t see the site until we’re satisfied that everything works the way that we want. It’s the Internet’s version of a dress rehearsal. A tiny audience will preview what the real event will look like before the curtain goes up on opening night.

There are a lot of other similarities, as well, too many for one post, I think. Isn’t it so fascinating, though, how different disciplines are so much alike the more one gets to know them?

Photo Attribution: iurte under Creative Commons

Number Five is Alive?

Robotics is one of those fields that has always sort of interested me, but only in a passing way. I thought, during my bachelor days, that it would be fun to have a little cybernetic dog that greeted me at the door when I arrived home in the evening. Vacuums that take care of the floor on their own have always seemed an ingenious idea to me. So, occasionally, when I stumble across news stories about advancements in the field, I read them with interest, but then go about my life.

This one, however, was an exception, because of how the testing of the robot was described. One specific reference was to attempts made to knock the robot over while it was walking or climbing. Perhaps it’s because I had only recently read of a robot guiding children on museum tours, but I found myself thinking for a moment, “That’s just mean to try to knock over the poor robot. He’s just doing the job that you told him to do.”

It’s interesting how we alter our perception of something that we’ve built when we build it in the shape of something organic. When my phone was doused in liquid a few months ago, I had no such reaction; I simply went to the Apple Store to get a replacement phone. When I noticed a scratch on my car a few weeks ago, I shrugged and went on with life. Those things are just tools. And so is a robot built in human form, but with one notable difference: the latter is made in its creator’s image.

I doubt that I’m the only person who has this sort of reaction, but I also think that those of us who do may well be in the minority. I suspect that there are many of a more engineering mindset who recognize robots as the machines that they are, regardless of whether or not they mimic the shape of a bipedal humanoid. I don’t think that makes them cold, calculating individuals. They’re just more realistic than I am.

Except I wonder if that realism becomes something more cold and calculating. That is, I wonder whether, when we have no issue with abusing our own creations in the (legitimate) name of testing, it becomes easier to have a more detached regard for fellow human beings. Certainly, we see a level of detachment manifest in individuals who practice certain professions (some types of medicine, for example, or law enforcement) simply as a coping skill, because they are faced with enormous amounts of tragedy so frequently. Perhaps knocking around the robot that you spent years of your life developing could have the same result.

Part of the launching point of a theology of technology is that we are creators, and that we will ultimately attempt to create in our own image. Robotics is currently, I think, the most obvious way in which this appears. If so, then the way in which we treat our creation says a great deal about us, especially as more sophisticated ways are explored to make these creations artificially intelligent. Where does the line lie between testing to diagnose a problem and outright calloused experimentation? Would we feel differently if the robot could think independently at any level? This is the philosophical foundation for good science fiction…the kind that often becomes fact.

Photo Attribution: epSos.de under Creative Commons