Everything in Moderation

Last week I listened to a great conversation over at FLOSS Weekly regarding social media, conversations, and moderation. If you don't read the news often…or social media…in which case this might not interest you, but I digress…there's been a bit of a stir around Twitter lately. The short version is that it's about to become a privately held company controlled by an eccentric person with a lot of money who isn't interested in curtailing anyone's free speech.

Go ahead, I’ll let you catch up…

So, yes, regardless of how you feel about this…and my feelings are mixed…I think we all can agree that Twitter is about to become a very different neighborhood.

Like most of you, I've used social media for a long time. I would even have called myself a power user at one point, although I've stepped back from a lot of platforms, including deleting Facebook. Twitter has been the one that I've generally held onto, although lately I've been staying with the sites that were mainstays back in the day…Reddit, Digg, and so forth…because Twitter has become a platform for people to scream at each other, one that makes really frustrating and isolating decisions about how it can be used.

On the podcast, there was discussion about what moderation is necessary and appropriate if Twitter is the public square for conversation in America. In short: is Musk's vision of reducing moderation a pipe dream? The panel talks about how Reddit is heavily moderated, and how, as a result, new users are often moderated out and leave. This raises the question: is that level of moderation a good thing?

Some level of governance is necessary for social discourse. However, the idea that the right kind of governance…taking the form of content moderation…can resolve the noisy echo chamber that Twitter has become is faulty at its premise, because it's trying to fix a cultural problem with technology. We can't moderate how people feel about each other, even if we can moderate how they interact with each other.

The problem with Twitter, or any other social network, isn't that it lacks the correct rules. The problem is that it gives everyone a platform to speak, but no one knows how to have civil discourse. On the contrary, it's become fashionable not to be civil. As the panelists point out, when moderation reaches its extreme and people are banned from a network, they just create parallel networks. These are just echo chambers.

The problem is cultural. The problem is that we view anyone who disagrees with our perspective as “other,” as a hostile. The problem is that no dissenting views are tolerated in our so-called public spheres. The problem is that America’s version of discourse is to scream louder than the other person so that no one can hear them.

Let me say again, a functioning community must have some rules. Classrooms, faith communities, neighborhood gatherings…all have some level of expectations of behavior, if nothing else. If Twitter is indeed our public square, then I also have to wonder if the scope of the rules is different. If so, however, then I think that it has to be public and democratic, not private. There need to be expectations of how to behave, but these will be useless if those engaging don't care about them.

Of all social networks, Twitter still doesn't know what it is. It has grown into something unintentional, and it can't facilitate the conversations of a culture uneducated in civility. We can try to fix this with moderation all we like, but those efforts will fail. The problem lies much, much deeper than the platform which gives it voice, and trying to use more technology to resolve it will not be effective.

This is a problem that our tools cannot fix.

Image attribution: Pete Simon under Creative Commons.

I’ll Never Let You Go – The Grief of Losing a Dog

Just before I was in high school, my family got a dog. He was a small dog, and I'm honestly not entirely sure of his breed except to say that there was Chihuahua in there somewhere. We got him as a puppy, and this was at a fairly formative time in my life…I was old enough to take on a lot of the responsibility for him. He grew up through my high school years, faithfully waiting for me every afternoon when I disembarked from my ridiculously long bus ride. I made up funny voices for him to try to verbalize the expressive faces that we came to know and love. I picked on him like a little brother. In college, I would come home on weekends and he was always there to greet me, faithful as ever.

When that dog died, it was a gut punch. If you’ve lost a pet, you know…there’s a grief process on par with losing a family member. I felt it for a while. Even though I didn’t live there any longer, it felt like a betrayal when my parents got a new dog. How could my old friend ever be replaced? It hurt that they tried.

This has come up a few times lately as our children are…passionately…expressing their desire for us to own a dog. I haven't owned one since we lost that beloved friend. I don't want to go through that loss again. The grief is not trivial.

Still, to go to the extremely expensive…and, I would argue, unethical…lengths of cloning a pet would be foreign to me. When I read this column about the industry that has grown up around this practice…yes, you read that correctly…I was more than a bit amazed. And quite troubled, as well. What disturbs me is not so much the cost of doing this business, but rather the underlying assumptions that creep in through the writer's descriptions.

If you read the column, you’ll notice that the writer feels the need to point out that cloning a pet is like resetting a phone…similar model, but new data. The comparison is to a cloned animal not having the memory or experiences of the original. I find it disturbing that our accepted cultural analogies to living things have become operating systems. I sort of get it…we are created as creators, and the lens through which we see our world is that which we have built…but there is inherent in this a disrespect for the living thing.

I’m not immune to this. Several years ago, we went through a weekend with no power after a nasty ice storm in North Carolina. When we left to stay with friends who still had power, our daughter’s betta fish didn’t survive the 40-degree nights. She was young at the time, too young, we decided, to have that conversation. So, as she hadn’t noticed when we returned, I made a late-night run to a pet store to insist to the mystified employee that I needed a betta that was a very exact color and appearance. They had one, and when my wife texted to check on my progress, I replied that I was inbound with the “Mark II.”

The source of this flippant disrespect for the living world around us can be found in abundance in the wording of the column. The process of a surrogate animal carrying the cloned pet is described not as a miraculous event of life continuing (even though it has been meddled with), but in purely scientific terminology. The new cat is an "embryo." The focus is on the DNA of cells from the original animal, as though the animal is nothing more.

In his analysis of C.S. Lewis' thought, Joe Rigney coins the expression "scientific reductionism." He uses it to encapsulate one of Lewis' central thoughts in The Abolition of Man. His definition is the audacity to believe that if we know all of the facts about a thing, we know the essence of the thing (my paraphrase). That's what I find at work here. Even though the subject of the column recognizes that her cloned cat is not the same as her first pet, there is a presumption that we have the right to artificially create a Frankenstein animal because of our grief process, because the animal has no substance other than its DNA. Essentially, in this view, the animal is no greater than the sum of its parts.

This reductionism is a fatally flawed premise. While mostly just gallows humor when we think about it in relation to pets, it becomes significantly more dystopian when framed in terms of humanity, because, at its core, it requires rejecting the recognition that humanity is more than just chemicals and electrons. In that view, there is no more value in life than that. When there is no more value in life, then war is acceptable. Murder of the unborn is acceptable. Mucking around with processes in our bodies that we don't understand is acceptable.

Despite all of the science fiction through the decades that has warned us of exactly this issue.

Sometimes, when I stop to remember, and especially when I visit my parents today, I still miss that dog. Naively, I sometimes wish that he could have lived forever. I would never presume, however, to have a hand in re-creating his life, because I didn’t create it to begin with.

We’re playing God. And we’re enormously under-qualified.

Image attribution: Shadowgate under Creative Commons.

Futurist Retrospective

There are a lot of ways in which I'm a futurist.

I think that this is much to Karen’s chagrin. I tend to not just adapt to, but seek change in many ways, especially around technology. We were created as creators, after all, and I see the digital sphere as a grand, if occasionally misguided, expression of our creativity. That’s not to say that I grab every new toy that becomes available. Even if our budget were to allow, I believe in a spiritual discipline of avoiding materialism. I also believe that every technology should solve a problem for you, and that, if it doesn’t, it’s likely excessive to have it in your life.

That said, as the technology world goes, I suppose I'm still a bit of a curmudgeon. I use some social media, but generally my perspective is that without it, we would have fewer problems. I read my news digitally, but I still prefer to read the paper every morning, even if it is in digital format. I use an RSS reader of sources that I know are reputable rather than allow someone else's algorithm to feed me information. When I was splitting a lunch bill with some colleagues once, I asked if they had a PayPal account that I could send the money to, and they looked at me blankly as though I were an illiterate luddite.

There are also areas of my life in which I’m anachronistic. I refuse to use modern technology to make my coffee. I grind it in a hand grinder, measure the water carefully, and use a press for my morning caffeination. While my to-do lists are digital because I see a legitimate need to be able to access them from anywhere, any important thoughts or notes that I have about life or inspiration or reflection go into a leather traveler’s notebook that Karen gifted me for Christmas several years ago. There’s something about the discipline of slowing down long enough to write something by hand that is deeply important.

Some of my family finds this amusing. My father-in-law jokingly says that he likes watching me make coffee because I’m a “mad scientist.” I’m fairly certain I’ve gotten some strange looks on flights while journaling my thoughts. It’s just not something that one sees often any longer.

So, while there are ways that I’m a futurist, I suppose that these aren’t among them.

I remember a conversation some years ago with an old friend during our weekly meeting at a local coffee shop. We were discussing how, in Victorian times, everyone kept a journal. Publishing the private journals and papers of influential thinkers, often posthumously, has long been a valued practice in the academic community. I recall making the point in that conversation that blogs were the modern equivalent of this practice, only with the added benefit of inviting conversation from others on the thoughts recorded. Today, I'm less certain of whether or not I was onto something there. Even if I was, the algorithms of social media have all but degraded blogs to the backs of our minds (who has time to read 200-word posts?). And, even where they haven't in some circles, the beauty of a blog is the conversation, and almost no one comments on posts these days. So, even if I was correct and we were onto something important there, I think we've mostly managed to lose it among the noise.

There's a theory out there that digital technology never actually makes anything easier for us (I'm specifying digital here, because I don't think most of us would argue against innovations like machines that do our laundry for us). As our work becomes more knowledge-based and less physical, we have developed the capacity to work from anywhere. While that's a luxury that affords us more time, it also consumes more of our time, because we can never switch it off. Sometimes I wonder if the Internet was a better place when it was a place we went to…when we intentionally sat down behind a computer and initiated a connection…rather than having it in our pockets, always on. We've rushed to achieve so much, and we have largely succeeded. To paraphrase Captain America, though, they didn't tell us what we've lost. There's a point of connection that we don't have if we see each other primarily on a screen.

I guess my point here is that everything becomes progressively more frenetic. I know that I've written about this before, but it's something that always seems to be on my mind of late, because everything keeps happening faster, and faster, and…it was too fast already when I began thinking about this topic.

I wish sometimes that we could go back. I think I've made it apparent here that I'm not against digital progress. It's that I think we hit a sweet spot some years ago, and things would have been really great if we had collectively pressed pause and broken free of the illusion that we can never appreciate this great thing that we've done, but rather have to immediately rush on to the next thing. And while that sweet spot would be defined slightly differently by different people, I really think that if we could just rewind a bit…back to before social media spiraled out of control, back to before the web was in our pockets and on our wrists at all times, back to when people read books more than screens…we would collectively be better for it.

Anyone who has ever tried to downgrade an operating system will tell you, though, that you can't go back. We can only make the best of what we have and move forward. Perhaps if we just decided to settle in, though, and work on making the best of it before rushing into what's next…

I guess that wouldn’t be progress, though. And I wouldn’t be much of a futurist if I recommended it.

Or would I?

When We Know Too Much

There's an old adage which claims that ignorance is bliss. There was a point in my life when this bothered me; I assumed it was an excuse for not wanting to educate oneself on a given topic. Anyone who has worked an unpleasant job, however…you know, the sort of part-time gig that pays the bills while you're in college?…has learned the truth that the more you know, the more that is expected of you, and has likely decided that you just didn't want to know.

This doesn’t stop when we enter the professional world, though. I discovered this the first time that I was in a leadership role. There was a heavy self-examination that took place before I would accept the responsibility. I recall my father coming home and discussing how he turned down, or had no interest in, more leadership responsibilities than he already had at work. He wanted to just do his job and come home. The extra burden was a weight that he chose to live without, and I will always have respect for his courage to make that decision.

Sometimes, I consider this when I think of how much information thrusts itself into my daily life. A few years ago, I used to have conversations, as I’m sure most of us did, around how “it’s never been easier to access the information that we need,” or words to that effect. Now, we have conversations about how information is always there, whether we want it to be or not, like the illegal off-switches predicted in Max Headroom. My phone includes a screen time monitor that, among other data, tells me how many notifications I receive each week. The first time that I saw the number, I was astounded at how many I receive on an average day. The number was huge.

And that number is another piece of information, another data point, which makes its way into my life.

I think that we forget that knowledge brings with it responsibility. Just like that old college job, a truth in life is that the more that we know, the more that is required of us, because that knowledge brings with it a burden as well as a benefit.

“Knowledge is a burden–once taken up, it can never be discarded.”

Stephen Lawhead, from The Paradise War

I thought of this a few days ago when I read about a new service offered by the U.S. Postal Service called Informed Delivery. While it’s a really interesting capability, and while I can imagine use cases for certain people and scenarios, especially surrounding the holiday rush that will impose itself upon our lives all too soon, my initial reaction was that I have no place in my life for this information. This would be yet another notification, yet another data point showing up on a device, something that I would be checking periodically, all for information that I can easily live without.

And that, I think, is the key. What information can we live without? I think that the answer is a greater amount than we think. In a way, the older that I get, the more my view of progress changes, the more that I consider the wisdom that, just because we can do something, doesn’t mean that we should. The rapid pace of our digital milieu seems to be based entirely on doing everything that we can, simply because we can.

I'm far from a luddite. I really like new toys. Lately, though, I've been working through the clutter, identifying what is too much, keeping what is necessary, and leaving behind what is not. I think that some people have a use for Informed Delivery, as well as for many other new technologies and tools that we hear about every day. I just caution that we don't all have a need for all of the things that are out there, and that, if you don't, perhaps…just perhaps…your life might be better off without them.

Just a thought.

A Review of “Digitized: Spiritual Implications of Technology”

This book intrigued me because I'm always fascinated by interdisciplinary explorations, especially when the thoughts surround theological implications of how we live our daily lives. As I've always been a bit of a geek, and now make my living in technology, thinking theologically about that technology and how it impacts not only what I do but how I live is an exercise that I do regularly in any case. Hearing someone else's thoughts on this is always welcome to me.

So, Bernard Bull's Digitized: Spiritual Implications of Technology popped out to me as a must-read. I had never heard of Bull prior to this book, nor read any of his other work, though he is published elsewhere. What I expected was a theological treatment of technology and daily life. What I got, to my disappointment, was a more religious recommendation of how to utilize technology in practice.

Bull's examinations are very surface-level. Spread widely through his book are definitions of basic concepts, such as social media and blogs. While establishing definitions early is important in any scholarly work, Bull dwells on these definitions at length, targeting readers who are not technically savvy at the expense of those who are. As a result, he manages to alienate readers such as myself (who are drawn to what the book appears to be about) in his earliest chapters. His recommendations on orthopraxy are low-level, extremely basic, and backed by views that smack of the very legalism that Bull insists he is trying to avoid.

That said, the book is not entirely without value. Bull spends time discussing the spiritual perils of a cultural obsession with efficiency, emphasizing that a Christian theological worldview insists that people are created in God's image, and thus are more than the numbers to which the business world attempts to reduce us. He also includes thought-provoking discussion on the concept of identity and how it is affected by our digital presentations of ourselves, the implication of which is a relative concept of our true selves…and that relativity is, by definition, untrue.

Continuing on this concept of relativity, Bull speaks a timely truth in regard to how digital expression impacts our perceptual filters of the world in which we live:

“We are inclined to believe that which is presented in the most persuasive manner rather than that which is true. We celebrate social and political commentary that appears in 140 characters…We grow disinterested in lengthier explanations. We turn to ad hominem attacks on those with whom we disagree instead of respectfully debating the issues. We value news as much for its entertainment value as for its accuracy and information. If we are not careful, such practices breed skepticism about truth.”

Bernard Bull, “Digitized: Spiritual Implications of Technology”, p. 152

While Bull attempts to give us practical applications at the conclusion of his book (most of which I forced myself through as they appeared to be targeting those of an unrealistic level of technological illiteracy), his best practical take-away, perhaps ironically, comes from someone else. He borrows from Neil Postman and his contribution to the field of media ecology. Bull encourages the reader to answer the following questions when adopting any new technology (taken from pp. 130ff):

  1. What is the problem to which this technology is a solution?
  2. Whose problem is it, actually?
  3. If there is a legitimate problem that is solved by this technology, what other problems will be caused by using this technology?
  4. Am I using this technology, or is it using me?

Personally, the answers with which I found myself after asking the final of these four questions were…troubling…in regard to some pieces of technology that have a place in my life. Despite the large percentage of the book that was disappointing to me, there was much value in this application, though I question whether it is more Postman's application than Bull's.

Altogether, this book is worth reading for the 10% that is thought-provoking, assuming the reader is willing to either skip the rest or force themselves through it. Digitized is far from what I expected, but not completely without value.