I recently finished reading ‘Program or Be Programmed: Eleven Commands for the AI Future’ by Douglas Rushkoff. Its central claim is simple. The technologies we use are not neutral tools. They carry assumptions about time, identity, truth, relationships, and value. When we accept defaults without awareness, we end up living according to those assumptions. Most modern systems are optimized for efficiency, scale, engagement, and prediction. Those priorities are not inherently wrong, but they are not synonymous with human flourishing. If left unexamined, they quietly reshape our habits, our expectations, and even our sense of what it means to be present with one another.

Rushkoff’s eleven commands function less as rules and more as calibration points. They help us recognize the built-in biases of digital systems and reclaim agency in how we use them. I recommend reading the book, but I also wanted to share the eleven commands here. For each one, I’ve included the bias it addresses, the liability it creates, the opportunity it enables, and a tiny practice you can use to incorporate the command into your daily life.

Image generated with ChatGPT.

1) Time — Do Not Be Always On
Tech Bias: Platforms are engineered for continuous engagement. “Now” is the only time that matters. Notifications are gravity wells for attention.
Liability: You live in reactive mode and confuse urgency with importance. Sleep, focus, and deep work erode.
Opportunity: Treat your attention like a telescope. A telescope is powerful because it’s aimed. Constant scanning doesn’t reveal faint galaxies. Stillness does.
Tiny Practice:
2) Place — Live In Person
Tech Bias: Remote, scalable interaction is rewarded. Embodied local life is treated like inefficiency.
Liability: You get lots of contact and less connection. Context collapses. Everything becomes a comment thread.
Opportunity: In-person life is high-bandwidth. Libraries understand this instinctively. A room full of humans is a different internet: slower, warmer, and more accountable.
Tiny Practice:
3) Choice — You May Always Choose None of the Above
Tech Bias: Interfaces push binary choices: like/dislike, accept/decline, upvote/downvote, subscribe/leave, buy now/miss out.
Liability: You get shepherded into options that serve the platform’s goals, not yours.
Opportunity: “None of the above” is a superpower. It’s how you reclaim agency.
Tiny Practice: Before clicking anything important, ask:
4) Complexity — You Are Never Completely Right
Tech Bias: Algorithms reward certainty and confidence. Nuance performs poorly. Outrage and anger perform extremely well.
Liability: You get pulled toward overconfidence. You start arguing to win, not to learn.
Opportunity: Complexity is not a weakness. Reality is layered, contingent, and rarely just black and white.
Tiny Practice: Add one sentence to your hot takes:
5) Scale — One Size Does Not Fit All
Tech Bias: Digital systems love scale: uniform rules, one interface, one policy, one feed, one “community standard”.
Liability: Local needs get steamrolled. People become “users”. Edge cases become invisible.
Opportunity: Build small, adaptable systems where feedback can actually change the shape of the tool. Libraries are anti-scale by design. Even in a large system, each branch adapts its own way of doing things.
Tiny Practice:
6) Identity — Be Yourself
Tech Bias: Platforms encourage performative identity: branding, engagement metrics, persona maintenance. You become a product with a posting schedule.
Liability: You drift from authenticity into optimization. You start “being” for the algorithm.
Opportunity: Identity is not a static profile; it’s a living process. AI makes this tricky because it can mirror back a cleaner, more marketable version of yourself. Don’t confuse that with your actual self.
Tiny Practice:
7) Social — Do Not Sell Your Friends
Tech Bias: Social networks are monetized. Relationships become data. Sharing becomes extraction. Even the language shifts as friends become “connections”.
Liability: Social life becomes transactional, trackable, and subtly performative.
Opportunity: Rebuild a commons mentality. Relationships are not inventory. Communities should not be strip-mined for engagement.
Tiny Practice:
8) Fact — Tell The Truth
Tech Bias: Virality outruns verification. AI can generate plausible nonsense at industrial scale. Incentives reward the compelling, not the correct.
Liability: Epistemic collapse: you stop trying to know what’s real, or you pick a tribe (a “truth team”).
Opportunity: Truth-telling becomes a cultural skill again: cite sources, verify claims, contextualize, revise, and employ nuance.
Tiny Practice: Before sharing, pause and verify one key claim.
9) Openness — Share, Don’t Steal
Tech Bias: Copying is effortless. Ownership is muddy. AI training and scraping amplify this by treating creation as raw material.
Liability: Creators get hollowed out. People stop making original work because it feels pointless.
Opportunity: Practice ethical sharing: credit sources, ask permission when needed, and build reciprocity.
Tiny Practice:
10) Purpose — Program Or Be Programmed
Tech Bias: Tools shape behaviour. If you use default settings, you accept default goals. Many systems are optimized for revenue, engagement, surveillance, and lock-in.
Liability: You become a passenger in your own life: nudged, directed, puppeted.
Opportunity: Purpose is writing the requirements document for your tech. What is this tool for? What is it not for?
Tiny Practice: For any new app or workflow, complete the following sentences:
11) AI — Value The Human
Tech Bias: AI reduces the world to what can be measured, predicted, categorized, and optimized. It’s a powerful pattern engine.
Liability: You outsource judgment. Machine confidence replaces human wisdom. People get treated like inputs and outputs.
Opportunity: Use AI as a tool, not an authority.
Tiny Practice:
Stay Calibrated
Every tool has a bias: toward speed, scale, extraction, certainty. Mindfulness means noticing that bias. Curiosity means asking whether it aligns with your values. Agency means adjusting accordingly. Remain attentive to the technologies you use and the biases they carry. With curiosity and mindfulness, you can ensure your tools serve your purposes rather than quietly programming your life. Technology should serve you. Not the reverse.

Image generated with ChatGPT.