
The Coming Age of Hyper-Mediation

  • avitalbalwit


Increasing complexity requires increasing delegation. As a company grows, it needs more layers of management to delegate tasks, process information, and make decisions between the most junior worker and the CEO. If you're driving farther and to more locations, you won't be able to memorize the routes anymore or keep track of roadwork in real time -- you'll need Google Maps. If there are more news sources publishing more frequently, you won't be able to read them all -- you will need to rely on some aggregator to surface what's important.


The more complex the world, the more mediation it requires. I experience this as a creeping shallowness in daily life. Where I once engaged deeply with a few things, I now skim across many. I defer to others' opinions more frequently, lacking the time or expertise to form my own views on every topic I encounter. 


There is greater satisfaction in unmediated experience. Consider building a piece of furniture versus hiring someone to build it for you. You have the same piece of furniture at the end, but I can guarantee one of those versions produces an entirely different feeling of ownership, pride, and satisfaction -- along with the corresponding loss of the many hours you spent laboring. Which is the whole issue. How much efficiency will we give up in exchange for a less mediated experience of the world? I frequently hear a similar complaint from individual contributors who became managers: directing those doing the work is less satisfying than doing it yourself, even if you have greater impact by influencing more total work in your new role. Your experience dilutes the farther upstream you go from the "real" work.


This isn't a new phenomenon. We, of the laptop and the factory, are already more distanced from our labor than the small plot farmer or the weaver on the hand loom. 


Marx recognized this in the Industrial Revolution: workers became alienated from their labor as production shifted from artisan workshops to assembly lines. "The alienation of the worker in his product means not only that his labor becomes an object, an external existence, but that it exists outside him, independently, as something alien to him, and that it becomes a power of its own confronting him," he wrote. "It means that the life which he has conferred on the object confronts him as something hostile and alien."^


But with powerful AI, we will become more mediated still.


If complexity is a side effect of the amount of information one must process, AI is about to introduce a lot more information. Powerful AI can be seen as introducing millions (billions?) more productive agents into the world. They're going to be generating information, taking actions, and operating at a faster speed. Meanwhile, we will still be moving and processing information at the same pace.


So we'll end up in a world that feels as if it's inhabited by more people, and these people will be operating more quickly. There's going to be more information to track, more projects happening. How will we operate in that world?


Well, it seems like we'll need to use tools and systems to help us process it. That introduces another layer of mediation between us and the world. We'll need to rely on an AI system to help us understand what's going on with the other AI systems.


Consider contracts. AI won't eliminate contracts, but by bringing the price of generating them way down, it will allow them to become thousands of pages long, comprehensible only to other AI systems. You'll need your AI to interpret and negotiate terms with another AI, removing you one step further from understanding the agreements governing your life.


This pattern will repeat across domains -- financial systems managed by algorithms too intricate for human comprehension, industrial processes that no human could meaningfully contribute to. The world will become too complex to navigate without the aid of AI. There will be little room left for our unaided choice, except the choice to be at a disadvantage by opting out.


This requires a lot of trust -- trust that the system is representing things accurately, acting in your best interests -- but it also risks deskilling. Outsourcing skills might be necessary, but we also have to reckon with the risk of giving away something essential. 

Marx warned that "the more the worker spends himself, the more powerful becomes the alien world of objects which he creates over and against himself, the poorer he himself -- his inner world -- becomes, the less belongs to him as his own."^


What will belong to us as our own? We must carefully consider our own inner world. For each capability we delegate, we should ask, "Am I enriched or diminished by giving this away?" Much outsourcing will enrich you! That's the point. But some will not.


Using tools does not inherently diminish us. I think about the ballet dancer with her pointe shoes. The musician with his instrument. Our minds are greater than our bodies. They can imagine things that our bodies, unaided, cannot do. Or in a different light, I think about being part of a team. A piece of something greater than one's self. Part of a company. A country. We need not always contain all the necessary capabilities within ourselves or perform each step for some endeavor to empower us and give us meaning. By choosing to use a more powerful tool or to join some greater effort, you become greater than what you could have been on your own.


We must find what, in the end, we wish to still own, what is uniquely ours, what we will always add. And it may not be smarts; it may be something else: it may be taste, it may be values, it may be experience in the sense of experiencing the world. As we choose what to delegate to AI, the measure shouldn't be efficiency alone but whether the mediation helps us become more fully what we aspire to be.






^ Both Marx quotes come from the Economic and Philosophic Manuscripts of 1844.

 
 
 

