CIO Advisory

What If We're Already Dead? Digital Ethics and Technology Governance

Pierre-Jean L'Hôte

Strategic CTO Advisory • Founder Etimtech

6 min read
ethics
governance
technology
society
leadership

The First Gesture of the Morning

This morning, like every morning, your hand reached for your smartphone before your brain was truly awake. Not a decision. A reflex. An automatism hardwired by years of silent conditioning.

Waze decides your route. Your news feed shapes your opinions. The algorithm suggests your desires. The cloud stores your memory. Screens mediate your relationships. Social media validates your existence.

You signed nothing. You consciously accepted nothing. And yet, every fragment of your daily life now passes through a technological layer you don't control.

Jacques Ellul wrote in 1954: "Modern man believes he uses Technology, but it is Technology that uses him." Seventy years later, this sentence is no longer a philosophical warning. It's a clinical observation.

Survivalists stockpile canned food for the collapse. But the collapse is already here. Not the collapse of the system. The collapse of our ability to exist outside of it. Gradual. Invisible. Consented.

The Digital Voluntary Servitude

This isn't romantic technophobia. Technology has produced tremendous advances in healthcare, education, and productivity. The problem isn't the tool. The problem is the inversion of the power dynamic between humans and machines.

Three silent tipping points have occurred without most leaders even noticing:

The cognitive tipping point. Recommendation algorithms no longer just respond to our queries. They anticipate them. They shape our preferences, steer our reading, filter our perception of reality. An MIT Media Lab study demonstrated that false information spreads six times faster than true information on social media. Not because people are gullible, but because algorithms optimize for engagement, not truth.

The memory tipping point. We've externalized our memory to the cloud. Phone numbers, dates, knowledge, documents: everything is "somewhere in the drive." The day the service goes down, access is cut off, or the terms of use change, what's left? Not a technical outage. A collective amnesia.

The decisional tipping point. In companies, AI-driven dashboards guide strategic trade-offs. Predictive models influence hiring, investments, product choices. Who's really deciding? The executive committee, or the algorithm that prepared the data on which they're basing their judgment?

The IT Leader's Paradox

Here's the question too few CIOs and CTOs are asking themselves: how do you govern a technology that simultaneously governs you?

The IT leader is caught in a paradoxical vise. On one side, they must accelerate digital transformation: AI, cloud, automation, autonomous agents. On the other, they are the first witness to the systemic dependency these technologies create.

In aviation, pilots talk about the "control loop." The pilot remains in command of the aircraft as long as they understand what the automated system is doing and can take back control at any moment. When that understanding is lost, disasters happen. The Boeing 737 MAX crashes are a tragic illustration.

In IT, we've collectively let go of the control loop. Not on a single system, but on our entire cognitive and operational infrastructure.

The real risk isn't the outage. It's the loss of decisional sovereignty.

An Ethical Governance Framework for IT Leaders

Facing this reality, the answer isn't technological rejection; that would be as naive as it is ineffective. The answer is building an ethical governance framework, in the same way we've built financial, security, and regulatory governance frameworks.

Here are the five pillars I propose to IT leaders who want to take back control:

1. The Existential Dependency Audit

Beyond the classic technical audit (vulnerabilities, obsolescence, compliance), conduct an existential dependency audit. Which business processes completely stop functioning if a cloud provider cuts access? Which strategic decisions rely entirely on models whose internal logic you don't understand? Which human competencies have you stopped developing because a tool replaces them?

This isn't a theoretical exercise. It's the map of your actual vulnerability.
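To make the mapping concrete, the audit can be kept as a structured inventory rather than a slide deck. The sketch below is a minimal illustration, not a tool recommendation; all class, field, and example names ("cloud ERP", "fraud-scoring model") are invented for the purpose of the example. It flags any process that fully stops if a dependency is cut off.

```python
from dataclasses import dataclass, field

@dataclass
class Dependency:
    """An external service or model a business process relies on."""
    name: str
    has_fallback: bool       # can the process keep running without it?
    logic_understood: bool   # do we understand how it reaches its outputs?

@dataclass
class BusinessProcess:
    name: str
    dependencies: list[Dependency] = field(default_factory=list)

    def existential_risks(self) -> list[str]:
        """Dependencies whose loss would fully stop this process."""
        return [d.name for d in self.dependencies if not d.has_fallback]

    def opaque_inputs(self) -> list[str]:
        """Dependencies whose internal logic nobody in-house can explain."""
        return [d.name for d in self.dependencies if not d.logic_understood]

# Hypothetical example: one process, two dependencies
billing = BusinessProcess("customer billing", [
    Dependency("cloud ERP", has_fallback=False, logic_understood=True),
    Dependency("fraud-scoring model", has_fallback=True, logic_understood=False),
])

print(billing.existential_risks())  # ['cloud ERP']
print(billing.opaque_inputs())      # ['fraud-scoring model']
```

Even this toy version forces the two uncomfortable questions from the audit: what stops entirely, and what runs on logic nobody can explain.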

2. The Principle of Systematic Reversibility

Every technology choice should be evaluated through the lens of its reversibility. Can you exit within six months without paralyzing the organization? If the answer is no, you don't have a vendor, you have an owner.

In Europe, the GDPR, Data Act, and DMA lay legal foundations. But technical reversibility is an architecture discipline, not a legal clause. It's designed upstream, not as a reaction.
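The vendor-versus-owner test above can be expressed as a simple decision rule. The thresholds below are assumptions for illustration (the six-month bar comes from the text; the twelve-month "at risk" band is invented), as are the criteria names; a real assessment would weigh many more factors.

```python
def reversibility_check(exit_months: float, data_exportable: bool,
                        open_standards: bool) -> str:
    """Classify a vendor relationship by exit feasibility.

    exit_months: estimated time to migrate off the vendor
    data_exportable: can all data leave in a usable format?
    open_standards: are interfaces and formats non-proprietary?
    """
    if exit_months <= 6 and data_exportable and open_standards:
        return "vendor"    # you can leave: a normal supplier relationship
    if exit_months <= 12 and data_exportable:
        return "at risk"   # exit is possible but costly; plan remediation
    return "owner"         # the vendor effectively owns the process

print(reversibility_check(18, data_exportable=False, open_standards=False))
# owner
```

The point of codifying the rule isn't precision; it's that every architecture review asks the question the same way, before signing.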

3. Algorithmic Transparency as an Executive Requirement

The EU AI Act mandates transparency levels for high-risk systems. But don't wait for regulation. Demand from your teams and vendors explainability for the models that influence your decisions. If an algorithm recommends cutting a product line, hiring one profile over another, or prioritizing a market, the leader must be able to understand the reasoning, challenge it, and overrule it.

A leader who doesn't understand the recommendations of their systems is no longer a leader. They're an executor.

4. The Intentional Preservation of Human Competencies

Automation creates a vicious cycle: the more you delegate to the machine, the more you lose the human competency needed to supervise the machine. In aviation, manual flight hours are mandatory to maintain pilot reflexes. In IT, nothing of the sort exists.

Integrate into your HR strategy programs to maintain critical competencies: negotiation without AI, analysis without dashboards, architecture without templates. This isn't conservatism. It's organizational resilience.

5. The Technology Ethics Committee

Not a token committee that meets twice a year to rubber-stamp hollow charters. An operational body, composed of technical, business, legal, and philosophical profiles, that challenges every major technology deployment on three simple questions:

  • What human autonomy does this system reduce?
  • What irreversible dependency does it create?
  • What world are we building if we generalize this choice?

The Uncomfortable Question

The words you're reading right now: are you thinking them, or consuming them because an algorithm judged them relevant to your profile?

The question isn't rhetorical. It's the ultimate test of your residual freedom.

Ellul was right: technology is never neutral. It carries within it a logic of expansion and autonomization that exceeds its creators' intentions. The role of the IT leader in 2026 is no longer just to deploy high-performing systems. It's to guarantee that humans remain at the center of the decision loop, not out of nostalgia, but out of strategic lucidity.

Freedom is the ability to say no. Governance is organizing the conditions so that "no" remains possible.

Organizations that integrate this ethical dimension into their IT governance won't just be more resilient. They'll be the only ones that deserve the trust of their employees, their customers, and their regulators in the decade ahead.
