How to stop your innovation from inspiring a Black Mirror episode

Warning: I’m going to criticize and question something beloved by millions of people—stationary exercise bikes.

I’m also going to be…cynical.

Earlier this year, The New York Times published an article about Peloton and how the brand has transformed from a digital/physical hybrid fitness offering into a full-blown exercise-as-entertainment platform.

The reporter described how logging into a session “feels like syncing up with a human iPhone, always swiping toward some new distraction.”

The experience made her feel that the real product went beyond an exercise bike. She wrote, “The company’s more significant offering is this: The total curation of the mind.”

She was hooked.

After reading it, I felt that exercise bikes could be the basis for the plot of a future Black Mirror episode. In fact, I later found out, they already had been.

You don’t want your innovation on Black Mirror

Black Mirror is a science fiction series on Netflix that explores sinister and unintended consequences of humanity’s technological innovations. Some episodes may seem extreme and far off, but they’re rooted in the realities of today.

As an innovation consultant, I have to wonder: Are the design thinking questions sparked by desirability, viability, and feasibility enough to prevent a product from becoming the basis of a future episode of Black Mirror?

Clearly not. If they were, we might not have cyber terrorism, biased AI, or even light pollution from the proliferation of space satellites.

The DVF lenses fail to examine how technology might negatively impact not only user satisfaction but also our social and moral values, or even endanger humanity itself.

How can we innovate without inspiring a Black Mirror episode?

Well, for starters, watch some episodes of Black Mirror or the more recent Upload. Watch 2001: A Space Odyssey. This should help you imagine how technology may not help us the way we hoped.

Lest you think tech-gone-bad scenarios exist only in science fiction, you should also read thought leaders like Tristan Harris and Sam Harris. Listen to L.M. Sacasas go over 41 questions we should ask of the technologies and tools that shape our lives.

But it’s not only science fiction writers and philosophers who should consider what unintended consequences innovation may cause.

It’s our job, too.

See innovation through a humanity lens

In addition to the big DVF questions, we must also put technology through a Humanity lens:

Humanity – how will this align with our value systems?

  • How will this technology make humanity better?
  • How will this technology make humanity worse?
  • How does this technology add or subtract from a person’s agency?
  • How would humans 200 years in the past and 200 years in the future view this technology?
  • What values might be compromised with adoption (even small adoption) of this technology?

These questions will help identify the unintended consequences, some good but probably mostly bad, that a technology might create.

Consequences could range from something as small as people no longer remembering any phone number by heart, thanks to their smartphone’s contact list, to bigger questions about what knowledge humans should delegate to computers, privacy rights, what constitutes “consciousness,” and more.

These questions are daunting; they are the real “wicked problems” of our time, questions about the humanity of a technology and how human-centered it may or may not be.

While the answers to these Humanity questions are more subjective and speculative than those of Desirability, Viability, and Feasibility, the thought exercise is invaluable.

You are debating the solutions and technologies that will demarcate the next era of mankind, after all, so ask the tough questions.

Don’t let your next innovation exercise become inspiration for the next Black Mirror episode.
