<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:itunes="http://www.itunes.com/dtds/podcast-1.0.dtd" xmlns:googleplay="http://www.google.com/schemas/play-podcasts/1.0"><channel><title><![CDATA[The Future Is Elsewhere]]></title><description><![CDATA[The Future Is Elsewhere is a weekly briefing by futurist and author Mike Walsh on how AI, emerging technologies, and new business models are reshaping leadership, work, and strategic advantage in a rapidly changing world.]]></description><link>https://www.thefutureiselsewhere.com</link><image><url>https://substackcdn.com/image/fetch/$s_!x4OA!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F600ebc11-c846-4815-9bf3-65eada3df789_1024x1024.png</url><title>The Future Is Elsewhere</title><link>https://www.thefutureiselsewhere.com</link></image><generator>Substack</generator><lastBuildDate>Thu, 16 Apr 2026 01:27:48 GMT</lastBuildDate><atom:link href="https://www.thefutureiselsewhere.com/feed" rel="self" type="application/rss+xml"/><copyright><![CDATA[Tomorrow]]></copyright><language><![CDATA[en]]></language><webMaster><![CDATA[tomorrowist@substack.com]]></webMaster><itunes:owner><itunes:email><![CDATA[tomorrowist@substack.com]]></itunes:email><itunes:name><![CDATA[Mike Walsh]]></itunes:name></itunes:owner><itunes:author><![CDATA[Mike Walsh]]></itunes:author><googleplay:owner><![CDATA[tomorrowist@substack.com]]></googleplay:owner><googleplay:email><![CDATA[tomorrowist@substack.com]]></googleplay:email><googleplay:author><![CDATA[Mike Walsh]]></googleplay:author><itunes:block><![CDATA[Yes]]></itunes:block><item><title><![CDATA[The Rise Of The High Throughput Operator]]></title><description><![CDATA[When intelligence is no longer scarce, the real risk is not inefficiency, but 
underutilization.]]></description><link>https://www.thefutureiselsewhere.com/p/the-rise-of-the-high-throughput-operator</link><guid isPermaLink="false">https://www.thefutureiselsewhere.com/p/the-rise-of-the-high-throughput-operator</guid><dc:creator><![CDATA[Mike Walsh]]></dc:creator><pubDate>Sun, 29 Mar 2026 01:03:19 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!zF9v!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa124fd86-b03f-4215-84a5-336837a81988_3147x1770.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!zF9v!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa124fd86-b03f-4215-84a5-336837a81988_3147x1770.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!zF9v!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa124fd86-b03f-4215-84a5-336837a81988_3147x1770.jpeg 424w, https://substackcdn.com/image/fetch/$s_!zF9v!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa124fd86-b03f-4215-84a5-336837a81988_3147x1770.jpeg 848w, https://substackcdn.com/image/fetch/$s_!zF9v!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa124fd86-b03f-4215-84a5-336837a81988_3147x1770.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!zF9v!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa124fd86-b03f-4215-84a5-336837a81988_3147x1770.jpeg 1456w" sizes="100vw"><img 
src="https://substackcdn.com/image/fetch/$s_!zF9v!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa124fd86-b03f-4215-84a5-336837a81988_3147x1770.jpeg" width="1456" height="819" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/a124fd86-b03f-4215-84a5-336837a81988_3147x1770.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:819,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:1011719,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://www.thefutureiselsewhere.com/i/192471308?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa124fd86-b03f-4215-84a5-336837a81988_3147x1770.jpeg&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!zF9v!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa124fd86-b03f-4215-84a5-336837a81988_3147x1770.jpeg 424w, https://substackcdn.com/image/fetch/$s_!zF9v!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa124fd86-b03f-4215-84a5-336837a81988_3147x1770.jpeg 848w, https://substackcdn.com/image/fetch/$s_!zF9v!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa124fd86-b03f-4215-84a5-336837a81988_3147x1770.jpeg 1272w, 
https://substackcdn.com/image/fetch/$s_!zF9v!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa124fd86-b03f-4215-84a5-336837a81988_3147x1770.jpeg 1456w" sizes="100vw" fetchpriority="high"></picture></div></a></figure></div><p>For most of modern knowledge work, the defining anxiety has been simple and persistent: am I doing enough? Enough hours, enough output, enough visible effort to justify my role and my compensation. Performance was measured in activity, and productivity was largely a function of how effectively human effort could be applied to a problem.
But what changes when effort is no longer the constraint? When intelligence itself becomes elastic, abundant, and on demand, the question shifts. The rise of the token economy is often treated as a technical or financial detail, but it is something more revealing. It is emerging as a new measure of productivity, not in terms of effort, but in terms of leverage.</p><p>The early signals are striking. Some of the most sophisticated practitioners now worry less about cost discipline and more about underutilization. Andrej Karpathy has described feeling &#8220;nervous&#8221; when he does not fully exhaust his AI token allocation, treating unused capacity as lost opportunity rather than efficiency. Nvidia CEO Jensen Huang is even more <a href="https://www.businessinsider.com/jensen-huang-500k-engineers-250k-ai-tokens-nvidia-compute-2026-3">explicit</a>: &#8220;If that $500,000 engineer did not consume at least $250,000 worth of tokens, I am going to be deeply alarmed.&#8221; Failing to deploy AI is no longer prudence. It is underperformance. The benchmark is shifting from how much work an individual completes to how much intelligence they can bring to bear.</p><p>This shift is best understood as a change in constraints. For decades, the bottleneck in knowledge work was human effort. Organizations were built to allocate tasks, coordinate people, and extract efficiency from limited time and attention. Generative AI introduces a different dynamic. Intelligence, once scarce and tightly coupled to individuals, becomes fluid and scalable. The limiting factor moves again. It is no longer what the system can do, but how effectively humans can direct it. On the No Priors podcast, Karpathy <a href="https://podcasts.apple.com/no/podcast/no-priors-artificial-intelligence-technology-startups/id1668002688">pointed out</a> that the primary constraint in engineering work is no longer compute capacity. &#8220;It&#8217;s not about flops&#8230; it&#8217;s about tokens. 
What is your token throughput and what token throughput do you command?&#8221; Performance is no longer defined by effort, but by the ability to direct large flows of machine intelligence toward meaningful outcomes. The implication is clear. If you cannot do this effectively, you become the constraint.</p><p>In practice, this is already reshaping how work gets done. Tasks that once defined expertise are increasingly delegated to AI systems, while humans focus on structuring problems, distributing work across multiple agents, and integrating results. The role begins to resemble orchestration more than execution. Instead of writing code, drafting documents, or performing analysis step by step, individuals manage flows of machine-generated output across several tools at once, intervening at key moments to guide direction and ensure coherence. Less like a worker, and more like a system designer.</p><p>This is the emergence of a new archetype of performance: <em><strong>the high throughput operator.</strong></em> This is not the person who knows the most or works the hardest, but the one who can effectively coordinate the largest volume of intelligence. Their advantage lies in how they frame problems, how they allocate tasks between human and machine, and how they maintain quality across an expanding surface area of output. They treat AI not as a tool to be occasionally consulted, but as an always-on cognitive infrastructure. Their contribution is not measured in tasks completed, but in systems directed.</p><p>In this environment, expertise does not disappear, but it changes shape. Knowledge becomes a multiplier rather than a primary source of value. The critical skill is judgment, knowing how to break problems into machine-executable components, how to design workflows that produce useful results, and how to evaluate those results before errors compound. This is where cognitive leverage becomes the defining concept.
Cognitive leverage is the ability to generate disproportionate value from a relatively small amount of human input. It is the difference between doing more and making more happen. A highly leveraged individual can take a complex objective, distribute the work across a network of AI systems, and recombine the outputs into something coherent and valuable. Tokens enable this process, but they do not determine its effectiveness. That depends on how well the system is designed and governed.</p><p>This introduces a familiar tension. Tokens are both a cost and a capability. The instinct to minimize usage is understandable, but it risks constraining the very resource that drives productivity. History suggests that organizations that expand into new forms of abundance outperform those that optimize too early. Electrification created advantage not because power was cheap, but because it enabled entirely new ways of organizing production. Cloud computing followed the same pattern. It won not on cost efficiency alone, but on the ability to experiment and scale. The same logic now applies to intelligence. The question is not how much is consumed, but how effectively it is deployed.</p><p>At the same time, the labor market is beginning to adjust. The routine, structured tasks that once defined entry-level roles are among the first to be automated, reducing demand for junior positions while increasing the premium on those who can operate at a higher level of abstraction. This creates a subtle but important shift. The pathway to expertise, historically built on repetition and incremental skill acquisition, is narrowing just as the need for high-quality judgment expands. Without a deliberate approach to talent development, organizations may find themselves with more intelligence than they can direct, but fewer people capable of directing it.</p><p>As models improve and costs decline, the constraint will move again. Access to tokens will matter less. 
The scarce resource will be judgment, the ability to ask better questions, structure problems, and intervene at the right moments. In that world, performance is no longer about what you produce, but what you can direct. Leverage becomes the defining metric.</p><p>The implication is stark. When <a href="https://www.amazon.com/Abundant-Intelligence-Digital-Rewrite-Business-ebook/dp/B0GD878WCT">intelligence is abundant</a>, underutilization becomes the new form of inefficiency. Not using what is available is no longer a sign of discipline, but of misalignment. The organizations that struggle will not be those that lack access to AI, but those that fail to reorganize around it. And the individuals who fall behind will not be those who lack effort, but those who fail to expand their capacity to direct it.</p>]]></content:encoded></item><item><title><![CDATA[When Work Isn’t a Workflow]]></title><description><![CDATA[Why agents, advisors, and sales roles will be reshaped, not replaced]]></description><link>https://www.thefutureiselsewhere.com/p/when-work-isnt-a-workflow</link><guid isPermaLink="false">https://www.thefutureiselsewhere.com/p/when-work-isnt-a-workflow</guid><dc:creator><![CDATA[Mike Walsh]]></dc:creator><pubDate>Thu, 19 Mar 2026 17:14:29 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!2ERY!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fac7af57e-685d-4caf-923a-53baf8118340_6000x3500.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!2ERY!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fac7af57e-685d-4caf-923a-53baf8118340_6000x3500.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!2ERY!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fac7af57e-685d-4caf-923a-53baf8118340_6000x3500.jpeg 424w, 
https://substackcdn.com/image/fetch/$s_!2ERY!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fac7af57e-685d-4caf-923a-53baf8118340_6000x3500.jpeg 848w, https://substackcdn.com/image/fetch/$s_!2ERY!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fac7af57e-685d-4caf-923a-53baf8118340_6000x3500.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!2ERY!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fac7af57e-685d-4caf-923a-53baf8118340_6000x3500.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!2ERY!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fac7af57e-685d-4caf-923a-53baf8118340_6000x3500.jpeg" width="1456" height="849" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/ac7af57e-685d-4caf-923a-53baf8118340_6000x3500.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:849,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:4477274,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://www.thefutureiselsewhere.com/i/191496407?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fac7af57e-685d-4caf-923a-53baf8118340_6000x3500.jpeg&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" 
srcset="https://substackcdn.com/image/fetch/$s_!2ERY!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fac7af57e-685d-4caf-923a-53baf8118340_6000x3500.jpeg 424w, https://substackcdn.com/image/fetch/$s_!2ERY!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fac7af57e-685d-4caf-923a-53baf8118340_6000x3500.jpeg 848w, https://substackcdn.com/image/fetch/$s_!2ERY!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fac7af57e-685d-4caf-923a-53baf8118340_6000x3500.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!2ERY!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fac7af57e-685d-4caf-923a-53baf8118340_6000x3500.jpeg 1456w" sizes="100vw" fetchpriority="high"></picture></div></a></figure></div><p>The prevailing story about AI and jobs is seductively simple: break work into tasks, measure how many can be automated, and once enough of them are, the job disappears. That logic works well for routine work. But in high-stakes, human-facing roles, especially those performed by agents and advisors, it rests on a fragile assumption: that jobs are just workflows, collections of discrete steps that can be taken apart without changing where value is actually created&#8212;or whether it can be created at all.</p><p>A recent <a href="https://www.anthropic.com/research/labor-market-impacts">report</a> from Anthropic tries to quantify AI&#8217;s impact on the labor market through a measure of &#8220;observed exposure,&#8221; combining theoretical LLM capability with real-world usage from millions of Claude interactions and mapping both onto occupational task data from O*NET. Its logic is straightforward: the more of a job&#8217;s tasks AI can do, and is already doing, the more exposed that job is to disruption. It is a sophisticated extension of the task-based view of work, and it produces compelling signals about where AI is already active. But it still assumes that if you can decompose a job, you can understand its value. In many roles, especially those built around human judgment and coordination, that is precisely the mistake.</p><p>To see why, it is useful to borrow a concept from early twentieth-century psychology. The Gestalt theorists argued that we perceive patterns, not parts. A melody is not experienced as a sequence of notes, but as a whole. 
Rearrange the notes, and the melody disappears, even if every individual component is still present. As Kurt Koffka put it, the whole is not simply more than the sum of its parts; it is different in kind.</p><p>The same is true of many forms of work, especially those built around human interaction. What looks like a sequence of tasks on paper is, in practice, an evolving social process. Each interaction changes the next. Meaning is interpreted, not just transmitted. Decisions are shaped by timing, framing, and trust as much as by information. The outcome is not produced by completing steps, but by how people respond to them.</p><p>Consider a real estate transaction. It can be mapped as a series of steps: pricing, listing, marketing, negotiation, closing. But that is not how the deal actually happens. A buyer hesitates, not because of price, but because of uncertainty. A seller rejects an offer because it feels wrong, not because the numbers do not work. A shift in tone, timing, or phrasing can move a negotiation forward or cause it to collapse. The role of the agent is not to move the process along a checklist, but to manage a moving target of perception, emotion, and incentive. The outcome emerges from how those elements play out over time.</p><div id="youtube2-2SV2ipYQ-WE" class="youtube-wrap" data-attrs="{&quot;videoId&quot;:&quot;2SV2ipYQ-WE&quot;,&quot;startTime&quot;:null,&quot;endTime&quot;:null}" data-component-name="Youtube2ToDOM"><div class="youtube-inner"><iframe src="https://www.youtube-nocookie.com/embed/2SV2ipYQ-WE?rel=0&amp;autoplay=0&amp;showinfo=0&amp;enablejsapi=0" frameborder="0" loading="lazy" gesture="media" allow="autoplay; fullscreen" allowautoplay="true" allowfullscreen="true" width="728" height="409"></iframe></div></div><p>Financial advice operates in much the same way. Portfolio construction is increasingly straightforward. Models can optimize allocations, simulate risk, and rebalance continuously. 
Yet the defining moments in a client relationship rarely occur in these analytical phases. They occur when markets fall, uncertainty spikes, or life changes suddenly, and clients feel the urge to act against their own long-term interests. The critical work is not choosing the portfolio. It is navigating the client&#8217;s response to uncertainty. That is not a task. It is an unfolding interaction.</p><p>This is where task-based models begin to break down. They measure which parts of a job can be substituted, but miss how the whole situation actually works. You can automate analysis, generate documents, and manage communication flows, and still not control the outcome. Completing 80 or even 90 percent of the identifiable tasks in a role does not guarantee that a deal closes or that a decision holds under pressure. Those outcomes depend on moments of judgment, timing, trust, and emotional coordination that are not easily reduced to tasks in the first place.</p><p>A glimpse of this shift is already visible in the market. The recent <a href="https://www.redfin.com/news/press-releases/redfin-debuts-real-estate-app-in-chatgpt/">convergence</a> between Rocket, Redfin, and generative AI platforms is a case in point. Rocket&#8217;s acquisition of Redfin, followed by Redfin&#8217;s launch of an AI-powered home search experience inside ChatGPT, points toward a fully integrated, AI-native transaction stack. Discovery, pricing, brokerage, financing, and customer interaction are being pulled into a single conversational flow, collapsing what was once a fragmented process into a continuous digital experience. On one level, this is the logical endpoint of workflow automation: faster transactions, greater transparency, and radically lower friction.</p><p>But on another level, it exposes the limits of the workflow model itself. As the informational and transactional layers of an industry collapse into software, the system does not become simpler. 
It becomes faster, more dynamic, and often more volatile. More data does not eliminate uncertainty. It amplifies it. And as more of the customer journey compresses into software, the remaining human moments become more consequential. The role of the broker does not disappear. It becomes more valuable, precisely because it sits at the point where the process stops being computational and starts being emotional, interpretive, and irrevocable.</p><p>What AI changes, then, is not whether these jobs exist, but how they are structured. The lower layers of the work&#8212;analysis, preparation, routine communication&#8212;are increasingly handled by machines. The human role becomes more concentrated in moments that require interpretation, alignment, and commitment. The job compresses at the bottom and intensifies at the top. In fact, many of these roles, especially those requiring complex human coordination, may move toward the frontier of &#8220;high exposure&#8221; without triggering a white-collar collapse, because the work that remains is the work that matters most, and the people doing it will operate with far greater leverage than before.</p><p>This creates a less obvious but more troubling effect. Many of these professions have historically depended on apprenticeship. Junior roles provide exposure to real-world situations, allowing individuals to develop judgment over time. If AI removes much of this early-stage work, the training ground for these capabilities begins to disappear. We may become highly effective at automating large portions of the work, but less effective at developing the people who can do what remains.</p><p>The deeper point is that not all work can be broken down without changing what makes it valuable. When tasks are independent and repeatable, decomposition enables automation and scale. But when outcomes depend on how people interpret and respond to each other, breaking the work apart can strip out the very dynamics that drive results. 
Some jobs are workflows. Those will be automated with increasing precision. Others are social processes, where outcomes emerge through interaction over time. In those domains, AI does not eliminate the work. It raises the stakes of what remains human. AI will do most of the work. But someone still has to make it work together.</p><p>For leaders, this requires a shift in perspective. The critical question is no longer which roles have the most automatable tasks. It is where value depends on human judgment, trust, and coordination under uncertainty. Those are the roles that will not disappear, but be redefined. And they are the ones that will matter most.</p><p>Salespeople often assume they will be first in the firing line as AI reshapes work, a modern echo of Willy Loman watching the world move on from a model of selling that no longer works. There is some truth in that. The mechanics of the job are changing fast. But for those whose real work is helping other humans decide, commit, and act, the future will not be defined by how many tasks machines can perform. It will be defined by the value of what happens after those tasks are done. When value is created between people, not within steps, breaking the work apart risks breaking what makes it work.</p>]]></content:encoded></item><item><title><![CDATA[How Many AI Agents Does It Take To Change A Lightbulb?]]></title><description><![CDATA[Why counting digital workers will force companies to rethink org charts, accountability, and the economics of decision-making.]]></description><link>https://www.thefutureiselsewhere.com/p/how-many-ai-agents-does-it-take-to</link><guid isPermaLink="false">https://www.thefutureiselsewhere.com/p/how-many-ai-agents-does-it-take-to</guid><dc:creator><![CDATA[Mike Walsh]]></dc:creator><pubDate>Sat, 14 Mar 2026 15:03:04 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!VsHD!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcdc136c2-372d-4122-a1eb-2f395ef18fcb_3000x2000.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!VsHD!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcdc136c2-372d-4122-a1eb-2f395ef18fcb_3000x2000.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!VsHD!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcdc136c2-372d-4122-a1eb-2f395ef18fcb_3000x2000.jpeg 424w, 
https://substackcdn.com/image/fetch/$s_!VsHD!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcdc136c2-372d-4122-a1eb-2f395ef18fcb_3000x2000.jpeg 848w, https://substackcdn.com/image/fetch/$s_!VsHD!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcdc136c2-372d-4122-a1eb-2f395ef18fcb_3000x2000.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!VsHD!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcdc136c2-372d-4122-a1eb-2f395ef18fcb_3000x2000.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!VsHD!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcdc136c2-372d-4122-a1eb-2f395ef18fcb_3000x2000.jpeg" width="1456" height="971" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/cdc136c2-372d-4122-a1eb-2f395ef18fcb_3000x2000.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:971,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:929033,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://www.thefutureiselsewhere.com/i/190939506?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcdc136c2-372d-4122-a1eb-2f395ef18fcb_3000x2000.jpeg&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" 
srcset="https://substackcdn.com/image/fetch/$s_!VsHD!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcdc136c2-372d-4122-a1eb-2f395ef18fcb_3000x2000.jpeg 424w, https://substackcdn.com/image/fetch/$s_!VsHD!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcdc136c2-372d-4122-a1eb-2f395ef18fcb_3000x2000.jpeg 848w, https://substackcdn.com/image/fetch/$s_!VsHD!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcdc136c2-372d-4122-a1eb-2f395ef18fcb_3000x2000.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!VsHD!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcdc136c2-372d-4122-a1eb-2f395ef18fcb_3000x2000.jpeg 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" 
stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>With all the discussion about AI agents lately, you might be wondering: <em>how exactly do you count them?</em> If multiple agents collaborate to resolve a customer issue or approve a loan application, does that represent one digital worker or many? The question may sound trivial, but it will soon matter a great deal. Organizations will eventually track digital headcount the same way they track human employees today.</p><p>Workforces used to be easy to measure because they were made of people. Employees had identities, job descriptions, and clear places on an organizational chart. You could count accountants, engineers, or customer service representatives with simple headcount. AI agents break this model. They are not discrete in the way humans are. Agents can spawn sub-agents, operate for milliseconds, run invisibly inside software systems, or collaborate in networks that blur the line between tool and worker. What appears externally as a single agent may internally be an orchestration of models, prompts, memory systems, policy engines, and software tools. Technically the system is a constellation of components. Organizationally, however, it may still behave like a single role.</p><p>This is where the counting problem begins.</p><p>Different parts of the organization will see the same system very differently. To product marketing it may appear as one AI agent. To the software architecture team it may be a network of micro-agents. To cloud infrastructure it could represent hundreds of model calls. 
Finance, meanwhile, may see nothing more than a few cents of inference cost.</p><p>A simple rule of thumb helps cut through this complexity. What matters is not how many models are running, but how many decision-making roles exist inside the enterprise and how those roles interact. An agent is not defined by the number of tools behind it, but by the unit of responsibility it represents within the system. In practice, an agent is the smallest unit of autonomous responsibility in a digital workforce.</p><p>Many production agents are really bundles of models, prompts, memory systems, and tools working together behind a single interface. A customer support agent, for example, might include a reasoning model, a retrieval system, a policy engine, a summarizer, and an action executor. Technically that is a multi-agent pipeline. From the perspective of the enterprise, however, it functions as one digital worker with a defined role. If several internal components collaborate but consistently produce one coherent decision or action, it is best understood as a single agent with internal architecture, much like a human employee who relies on spreadsheets, software, and colleagues to do their job.</p><p>Research in technology governance highlights why this distinction matters. Sociologist Madeleine Clare Elish <a href="https://estsjournal.org/index.php/ests/article/view/260">coined</a> the term &#8220;moral crumple zone&#8221; to describe what happens when complex automated systems fail. Just as the crumple zone in a car absorbs the force of a collision, responsibility in automated systems often collapses onto the nearest human operator, even when the broader system design shaped the outcome. When organizations cannot clearly identify which digital systems act with autonomy or authority, accountability defaults to individuals rather than the architecture that produced the decision. Defining the boundaries of digital workers therefore becomes more than a technical exercise. 
It is a way of ensuring that responsibility is assigned where it actually belongs.</p><p>If agents are going to function as digital workers, leaders need a simple way to identify them. Here are some practical rules that might help:</p><p>The first is <strong>identity</strong>. If a system has a persistent identity inside the organization, it begins to behave like a digital worker. It can authenticate into systems, receive permissions, and perform actions that can be traced back to that identity. If a system cannot be independently identified and audited, it is probably just a component inside a larger architecture.</p><p>The second rule is <strong>lifecycle control</strong>. A system that can be provisioned, updated, paused, or retired independently has an operational lifecycle. That means it can be managed much like organizations manage applications or service accounts. By contrast, a micro-agent that appears only as part of an orchestrated chain of tasks is closer to a function than a worker.</p><p>The third rule is <strong>accountability for outcomes</strong>. A digital worker should own a measurable task or result. An IT support agent might be responsible for responding to service tickets within a defined service level. If a system contributes only a hidden sub-step within a larger workflow, it likely belongs to the system architecture rather than the workforce.</p><p>Together these rules create a surprisingly clear boundary. If a system has a distinct identity, an independent lifecycle, and responsibility for a defined outcome, it begins to resemble a digital employee. If not, it is better understood as infrastructure.</p><p>But what happens when components begin to behave like independent actors? If systems have distinct roles or objectives, if they operate asynchronously, coordinate decisions with one another, or expose separate identities and interfaces to the organization, then you are no longer looking at one agent. You are looking at a team of agents. 
At that point the system begins to resemble a small digital organization rather than a single worker augmented by technology.</p><p>Consider an aviation analogy. A modern aircraft cockpit contains autopilot systems, navigation computers, sensor networks, and sophisticated flight software performing thousands of calculations every second. Internally it is an extraordinarily complex digital environment. Yet operationally we still treat autopilot as part of a single role: the aircraft&#8217;s flight control system assisting the pilot.</p><p>Air traffic control, by contrast, is a distributed coordination system. Radar networks, aircraft, scheduling systems, and human controllers interact across towers and control centers. Each participant has its own responsibilities, authority, and identity within the system. What emerges is not one augmented operator but a network of interacting roles. The difference is not the number of machines involved. It is whether the system supports one role or many.</p><p>This shift from architecture to accountability is already appearing in governance frameworks. The U.S. National Institute of Standards and Technology has begun <a href="https://www.nist.gov/publications/artificial-intelligence-risk-management-framework-ai-rmf-10">exploring</a> how agent systems should be identified, authenticated, and authorized as they interact with digital infrastructure. The emphasis on identity and authorization reveals an important assumption: if agents are going to act autonomously inside enterprise systems, they must be treated as identifiable entities whose actions can be traced and governed.</p><p>International governance frameworks are moving in a similar direction. Emerging ISO standards like <a href="https://www.iso.org/standard/42001">ISO/IEC 42001:2023</a> for AI management systems require organizations to define the scope of their AI deployments, manage them across their lifecycle, and assign accountability for their behavior. 
These frameworks do not attempt to catalog every model or algorithm inside a system. Instead, they focus on identifying which systems operate as actors inside organizational processes and ensuring those actors can be governed responsibly. Implicitly, they adopt the same principle: what matters is not the internal architecture of AI systems, but the role they play inside the enterprise.</p><p>For most executives, the debate about counting digital workers will likely surface first on the org chart. Should AI agents appear alongside human employees? In 2024, the HR software company Lattice briefly experimented with allowing companies to list AI employees in its platform, only to reverse course after a public backlash. At the time the idea seemed provocative, even absurd. In retrospect it may prove inevitable. If digital workers have identities, permissions, responsibilities, and measurable outcomes, they begin to resemble organizational actors rather than tools. The more interesting question may not be whether agents appear on org charts, but how their presence reshapes them. As digital workers take on operational decisions once handled by layers of management, hierarchies built around supervising people may give way to flatter structures designed to coordinate human and machine decision-making.</p><p>Yet even this debate about org charts may be missing the deeper shift underway. Org charts, after all, are still a way of counting people and managing layers of control. Agentic systems are beginning to change the underlying economics of work itself. The real transformation in organizations is not simply the number of digital workers they deploy, but the amount of decision-making capacity embedded in their operations.</p><p>Historically, firms measured productive capacity through simple metrics such as headcount or labor hours. Those measures made sense in an industrial economy where human attention was the primary constraint. Agentic systems change that equation. 
As organizations embed intelligence into everyday processes, the relevant question shifts from how many workers exist inside a workflow to how much cognition the system can execute. The more meaningful metrics may become things like decision throughput or the number of tasks completed autonomously. Instead of asking how many workers are involved in a process, leaders may soon ask how many decisions that process can execute per second.</p><p>So, with all that in mind, how many AI agents does it take to change a lightbulb? Arguably, none. A sensor detects the outage. A diagnostic model determines the cause. A procurement system orders a replacement. A scheduling agent allocates a technician. A workflow system verifies that the job is complete. But unless the organization has a team of highly sophisticated humanoid robots, it still takes a human to take the bulb out of the box and screw it in.</p><p>Depending on how you feel about the future of work, that may be good news for now.</p>]]></content:encoded></item><item><title><![CDATA[The AI Layoff Illusion]]></title><description><![CDATA[Why cutting workers in the name of artificial intelligence doesn&#8217;t necessarily create real productivity.]]></description><link>https://www.thefutureiselsewhere.com/p/the-ai-layoff-illusion</link><guid isPermaLink="false">https://www.thefutureiselsewhere.com/p/the-ai-layoff-illusion</guid><dc:creator><![CDATA[Mike Walsh]]></dc:creator><pubDate>Sun, 08 Mar 2026 08:54:43 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!-TLi!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F732a5c8c-9ae3-4aa1-b15e-852747302f1c_4000x2000.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!-TLi!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F732a5c8c-9ae3-4aa1-b15e-852747302f1c_4000x2000.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!-TLi!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F732a5c8c-9ae3-4aa1-b15e-852747302f1c_4000x2000.jpeg 424w, 
https://substackcdn.com/image/fetch/$s_!-TLi!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F732a5c8c-9ae3-4aa1-b15e-852747302f1c_4000x2000.jpeg 848w, https://substackcdn.com/image/fetch/$s_!-TLi!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F732a5c8c-9ae3-4aa1-b15e-852747302f1c_4000x2000.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!-TLi!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F732a5c8c-9ae3-4aa1-b15e-852747302f1c_4000x2000.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!-TLi!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F732a5c8c-9ae3-4aa1-b15e-852747302f1c_4000x2000.jpeg" width="1456" height="728" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/732a5c8c-9ae3-4aa1-b15e-852747302f1c_4000x2000.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:728,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:383899,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://www.thefutureiselsewhere.com/i/190265192?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F732a5c8c-9ae3-4aa1-b15e-852747302f1c_4000x2000.jpeg&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" 
srcset="https://substackcdn.com/image/fetch/$s_!-TLi!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F732a5c8c-9ae3-4aa1-b15e-852747302f1c_4000x2000.jpeg 424w, https://substackcdn.com/image/fetch/$s_!-TLi!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F732a5c8c-9ae3-4aa1-b15e-852747302f1c_4000x2000.jpeg 848w, https://substackcdn.com/image/fetch/$s_!-TLi!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F732a5c8c-9ae3-4aa1-b15e-852747302f1c_4000x2000.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!-TLi!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F732a5c8c-9ae3-4aa1-b15e-852747302f1c_4000x2000.jpeg 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" 
stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>A dangerous new market narrative is spreading through boardrooms and earnings calls: artificial intelligence has made companies so productive that they can slash their workforce and barely notice the difference. Analysts applaud, the stock jumps, and executives describe a future where digital labor replaces the old human-heavy operating model. Unfortunately, the economy is rarely that tidy.</p><p>Over the past year a growing list of companies has announced layoffs framed around AI-driven efficiency. Logistics software firm WiseTech Global <a href="https://www.reuters.com/business/world-at-work/australias-wisetech-global-plans-2000-job-cuts-amid-ai-overhaul-2026-02-24/">said</a> AI-assisted development tools were collapsing project timelines from months to days as it eliminated roughly 2,000 roles. Chemical giant Dow <a href="https://www.theregister.com/2026/01/29/dow_chemical_ai_layoffs/">announced</a> thousands of job cuts as part of an automation and AI overhaul, though weakening industrial demand clearly played a role. <a href="https://www.reuters.com/business/world-at-work/autodesk-lay-off-about-7-workforce-2026-01-22/">Autodesk</a> and Pinterest both reduced headcount while promising to redirect resources toward AI initiatives.
Insurance group Allianz has <a href="https://www.reuters.com/business/world-at-work/allianz-cut-up-1800-jobs-due-ai-advances-says-source-2025-11-26/">suggested</a> that advances in AI-powered customer service and claims processing could eventually displace thousands of call-center jobs.</p><p>On the surface, AI-powered labor substitution looks like the beginning of a productivity revolution. In reality, the story is more complicated. Some companies over-hired during the post-pandemic boom and are now shrinking under the convenient cover of AI disruption. Others are desperate to signal relevance in a market obsessed with artificial intelligence. A small handful of firms are seeing real gains from digital labor. But even there executives may be drawing the wrong conclusions about what actually drives scale.</p><p>Consider the restructuring at Block. The firm announced plans to cut more than 4,000 employees, roughly half its workforce. At first glance the move resembles the early days of Twitter after Elon Musk arrived with a chainsaw and a strong view that most Silicon Valley companies were bloated.</p><p>But the Block restructuring is more deliberate than it appears. The company has spent the past several years embedding AI into its internal workflows, particularly in software development. Engineers use AI tools to generate code, test features, and accelerate product cycles that previously required large teams and layers of coordination. Leadership says the result is a dramatic jump in productivity and a surge in gross profit per employee. In other words, the cuts are not simply about removing people. They reflect a bet that software agents can remove friction from the company&#8217;s internal machinery.</p><p>Klarna <a href="https://www.fastcompany.com/91468582/klarna-tried-to-replace-its-workforce-with-ai">tells</a> a very different story.
The Swedish payments company aggressively promoted its AI transformation, claiming that generative AI assistants were performing the work of hundreds of customer service agents. Hiring slowed, headcount fell, and executives highlighted rising revenue per employee as proof that the model was working.</p><p>Then reality intervened. Customer support interactions turned out to be more complex than a chatbot script. Financial disputes require empathy, judgment, and trust. Klarna eventually reintroduced more human service capacity and shifted toward a hybrid model where AI handles routine inquiries while people manage difficult situations.</p><p>Comparing the AI transformations of Block and Klarna reveals an important principle that many companies miss. The best target for AI restructuring is workflow friction, not headcount. Klarna initially pitched AI as a labor substitute. Block frames it as a force multiplier for smaller teams. The second framing is far more robust. When AI removes operational obstacles around skilled workers, organizations unlock real leverage. When AI tries to erase the human layer entirely, the system often deteriorates in ways that financial metrics fail to capture.</p><p>Another key difference between the two companies is where AI is deployed. Back-office augmentation is far easier than customer-facing replacement. Internal engineering, model building, summarization, quality assurance, and repetitive analysis are forgiving environments for AI agents. Mistakes can be corrected before they reach customers. Customer service is different. It involves emotion, nuance, and exceptions. Automation failures there damage trust quickly. Block&#8217;s investments sit largely in the first category. Klarna pushed too aggressively into the second.</p><p>The metrics used to justify these restructurings also deserve closer scrutiny. Revenue per employee has become the poster statistic of the AI productivity story. 
Klarna&#8217;s executives highlighted it repeatedly. Block has emphasized gross profit per employee. Investors love these ratios because they appear to compress efficiency into a single number.</p><p>But the math is misleading. Cut the workforce in half while revenue stays flat and the metric doubles overnight. The statistic improves even if the organization itself becomes weaker. Revenue per employee tells us what happened after the layoffs. It does not prove that the company became more scalable.</p><p>Klarna illustrates the danger perfectly. The revenue-per-employee story looked brilliant until the company realized that removing too many humans from the system degraded the customer experience and forced it to rebuild parts of the workforce. The ratio improved before the operating model was proven.</p><p>The real test of AI-driven productivity is not whether a company can survive with fewer employees. It is whether the organization can reduce the marginal cost of coordination without eroding trust. True scale in the AI era comes from redesigning how intelligence is configured throughout the firm. That means shorter decision cycles, better exception handling, lower cost to serve, stronger decision quality, and preserved customer relationships.</p><p>When you look closely at companies where AI is genuinely improving productivity, three structural shifts appear:</p><ol><li><p><strong>Coordination compression.</strong> Artificial intelligence reduces the friction between analysis, decision making, and execution. Code generation, automated testing, rapid experimentation, and internal agents executing workflows shrink the distance between an idea and its implementation.</p></li><li><p><strong>Decision leverage.</strong> Humans move up the stack. 
Instead of performing every task themselves, they supervise systems that generate and evaluate options at scale.</p></li><li><p><strong>Cost-to-serve decoupling.</strong> AI systems handle routine work so efficiently that the marginal cost of serving another customer or processing another transaction begins to fall.</p></li></ol><p>That is the real signal of scale. Not fewer employees but lower coordination cost per decision. From this perspective, the market&#8217;s fascination with AI layoffs misses the bigger story. Artificial intelligence is not simply a tool for replacing workers. It is a technology for redesigning the architecture of work. Companies that treat AI primarily as a headcount reduction strategy may discover that they have optimized a ratio while weakening the system that created the value.</p><p>The winners in the AI era will not be the companies that eliminate the most employees. They will be the ones that redesign work so that every human decision is amplified by machines. Headcount may fall, but that will be a consequence of scale, not its cause.</p>]]></content:encoded></item><item><title><![CDATA[Abundant Intelligence Does Not Have to End in Crisis]]></title><description><![CDATA[A response to the Citrini Research Memo]]></description><link>https://www.thefutureiselsewhere.com/p/abundant-intelligence-does-not-have</link><guid isPermaLink="false">https://www.thefutureiselsewhere.com/p/abundant-intelligence-does-not-have</guid><dc:creator><![CDATA[Mike Walsh]]></dc:creator><pubDate>Sat, 28 Feb 2026 19:15:24 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!YF1d!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff3383b61-2ae8-4f6e-8446-e71580b1e827_1000x600.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!YF1d!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff3383b61-2ae8-4f6e-8446-e71580b1e827_1000x600.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!YF1d!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff3383b61-2ae8-4f6e-8446-e71580b1e827_1000x600.jpeg 424w, 
https://substackcdn.com/image/fetch/$s_!YF1d!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff3383b61-2ae8-4f6e-8446-e71580b1e827_1000x600.jpeg 848w, https://substackcdn.com/image/fetch/$s_!YF1d!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff3383b61-2ae8-4f6e-8446-e71580b1e827_1000x600.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!YF1d!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff3383b61-2ae8-4f6e-8446-e71580b1e827_1000x600.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!YF1d!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff3383b61-2ae8-4f6e-8446-e71580b1e827_1000x600.jpeg" width="1000" height="600" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/f3383b61-2ae8-4f6e-8446-e71580b1e827_1000x600.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:600,&quot;width&quot;:1000,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:80760,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://www.thefutureiselsewhere.com/i/189488693?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff3383b61-2ae8-4f6e-8446-e71580b1e827_1000x600.jpeg&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" 
srcset="https://substackcdn.com/image/fetch/$s_!YF1d!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff3383b61-2ae8-4f6e-8446-e71580b1e827_1000x600.jpeg 424w, https://substackcdn.com/image/fetch/$s_!YF1d!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff3383b61-2ae8-4f6e-8446-e71580b1e827_1000x600.jpeg 848w, https://substackcdn.com/image/fetch/$s_!YF1d!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff3383b61-2ae8-4f6e-8446-e71580b1e827_1000x600.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!YF1d!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff3383b61-2ae8-4f6e-8446-e71580b1e827_1000x600.jpeg 1456w" sizes="100vw" fetchpriority="high"></picture></div></a></figure></div><p>There is, perhaps, a small silver lining in the current wave of AI anxiety. Not long ago, the dominant fears revolved around killer robots, runaway superintelligence, and apocalyptic scenarios that ended with data centers being nuked from space. Today the panic is more grounded, and in many ways more sophisticated. We are no longer imagining machines conquering humanity; we are worrying about white-collar unemployment ticking above 10%, mortgage books wobbling in San Francisco, and private credit portfolios unraveling because software agents can write code faster than junior analysts. The monsters have moved from science fiction to the balance sheet.</p><p><em><a href="https://www.citriniresearch.com/p/2028gic">The 2028 Global Intelligence Crisis</a></em> from Citrini captures this shift perfectly. Subtitled &#8220;The Consequences of Abundant Intelligence,&#8221; it presents a fictional macro memo from the near future in which cheaper, more capable AI triggers a white-collar job apocalypse, hollows out discretionary spending, and destabilizes housing and credit markets. 
It is cleverly constructed and economically literate, and its viral spread reflects a genuine unease among investors and executives. Yet for all its sophistication, the argument ultimately rests on a mispricing of what abundant intelligence actually means. </p><p>What the authors frame as an Intelligence Collapse Scenario is more accurately understood as an Intelligence Reconfiguration Scenario. The difference is not semantic. It is structural. The real question, one I have been exploring extensively in my own work on <a href="https://www.amazon.com/Abundant-Intelligence-Digital-Rewrite-Business-ebook/dp/B0GD878WCT/">abundant intelligence</a>, is not whether digital labor transforms the economy, but how that transformation is architected: who retains authority, who captures the surplus, and how judgment is redistributed when execution becomes abundant.</p><p>Written as a retrospective memo from June 2028, the essay sketches a world in which &#8220;abundant intelligence&#8221; delivers surging productivity alongside double-digit unemployment, as white-collar professionals, once the engine of discretionary consumption, are structurally displaced. The authors&#8217; central mechanism is what they call &#8220;ghost GDP&#8221;: output and corporate profits rise on paper, but income no longer circulates through households because machines do not earn salaries or spend money. As wages contract and consumption weakens, asset prices and credit structures built on stable high-income employment begin to crack. Each firm&#8217;s rational decision to substitute software for labor aggregates into a systemic feedback loop, where declining demand justifies further automation, reinforcing what they portray as an intelligence displacement spiral with no obvious stabilizer.</p><p>It is a compelling story. 
But it rests on a critical modeling assumption that deserves scrutiny: that machine intelligence primarily substitutes for human work, and that wages are the only meaningful transmission mechanism of economic value. The memo treats intelligence as if it were a fixed pool of salaried labor. When machines perform that labor, value supposedly disappears from the system unless it flows through paychecks. That is a 20th-century production model applied to a 21st-century technology.</p><p>The deeper question is not whether machines can perform more tasks. It is how organizations reallocate judgment, authority, and ownership when intelligence becomes abundant. Modern enterprises are not simply collections of jobs. They are architectures of decision rights. Someone allocates capital. Someone signs off on compliance. Someone bears legal liability. Someone determines acceptable risk. AI systems can draft, optimize, simulate, and execute. They cannot absorb responsibility in the way institutions require.</p><p>When intelligence becomes abundant, value does not evaporate. It migrates. The constraint shifts from execution to orchestration. As digital labor absorbs routine analysis, drafting, coding, optimization, and coordination, the residual human contribution does not simply shrink in importance. In many cases, it becomes more leveraged. When a single executive, engineer, or strategist can direct systems that generate ten times the output of a traditional team, the marginal impact of their judgment increases, not decreases. The value of being correct when machines execute at scale rises sharply.</p><p>Consider how capital markets reward decision-making authority today. Portfolio managers do not earn fees because they personally process every data point. They earn fees because their judgment governs large pools of capital. The more leverage embedded in the system, the more valuable the individual exercising oversight becomes. Digital labor functions in a similar way. 
When output scales non-linearly but decision rights remain concentrated, the marginal productivity of judgment rises. Digital labor does not erase authority. It amplifies the consequences of those who hold it.</p><p>The crisis scenario assumes a simple substitution dynamic: one AI agent replaces one $180,000 employee. Multiply that across the economy and aggregate demand collapses. Yet real organizations rarely operate through one-to-one replacement. They operate through reconfiguration. Some roles disappear. Others expand. A smaller number of individuals may control far more productive systems. Income distribution may widen. But that is not the same as permanent economic contraction. If AI substitutes 50% of white-collar labor and multiplies the productivity of the remaining 50% by five, the income dynamics look radically different from pure elimination.</p><p>The memo also models only one economic effect of cheaper intelligence: substitution. It largely ignores two others that accompany every dramatic fall in input cost: scale expansion and new use cases. When a core production factor becomes cheaper, usage tends to explode. Lower-cost intelligence reduces the price of experimentation. It lowers barriers to entry. It enables new products and services that were previously uneconomic. Legal advice, design support, financial modeling, research assistance, and personalized education have historically been constrained by scarce human hours. As digital labor lowers those constraints, the total addressable market for intelligence-intensive work expands.</p><p>Abundant intelligence increases the number of problems worth solving. When launching a company, prototyping a product, or analyzing a market requires fewer human hours and less capital, more individuals can participate. Each new venture generates demand for coordination, oversight, trust-building, governance, and strategic direction. In that sense, digital labor expands the surface area of the economy itself. 
Execution becomes cheaper, but the need for judgment does not contract. It often intensifies.</p><p>This does not imply a frictionless transition. Routine cognitive labor will be commoditized. Middle layers may compress. Inequality may widen before it stabilizes. But the equilibrium outcome is unlikely to be mass professional obsolescence. It is more plausibly a bifurcation: execution becomes abundant, while high-leverage judgment, accountability, and system design become more valuable.</p><p>Many AI doomer scenarios share a hidden assumption: that artificial intelligence evolves rapidly while humans, organizations, and markets remain fixed in place. Capabilities improve. Tasks disappear. Wages fall. Systems fracture. Yet history suggests the opposite dynamic. Every general-purpose technology has triggered dislocation followed by reinvention, with new skills repriced, institutions redesigned, and entirely new industries emerging around the technology itself. The Industrial Revolution reorganized labor and capital. Electrification reshaped production. The internet created markets that were previously unimaginable. To bet that AI will transform cognition while leaving human adaptability unchanged is to ignore the most consistent pattern in economic history.</p><p>Abundant intelligence will commoditize certain forms of work. It will also elevate what remains scarce: judgment under uncertainty, ethical accountability, cross-domain synthesis, and the willingness to assume responsibility when automated systems fail. The real risk is not that machines change everything. It is that we misinterpret what is changing. Intelligence is becoming abundant. Judgment is not. 
</p><p>The future will belong not to those who resist digital labor, nor to those who deploy it blindly, but to those who understand how to redesign authority, ownership, and value creation around it.</p>]]></content:encoded></item><item><title><![CDATA[The Sovereign Enterprise]]></title><description><![CDATA[The Hidden Fragility of the AI Supply Chain]]></description><link>https://www.thefutureiselsewhere.com/p/the-sovereign-enterprise</link><guid isPermaLink="false">https://www.thefutureiselsewhere.com/p/the-sovereign-enterprise</guid><dc:creator><![CDATA[Mike Walsh]]></dc:creator><pubDate>Tue, 24 Feb 2026 10:54:02 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!Yawl!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Feb2b936b-033c-4618-b61e-eb9dfc148a86_1000x560.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" 
href="https://substackcdn.com/image/fetch/$s_!Yawl!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Feb2b936b-033c-4618-b61e-eb9dfc148a86_1000x560.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!Yawl!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Feb2b936b-033c-4618-b61e-eb9dfc148a86_1000x560.jpeg 424w, https://substackcdn.com/image/fetch/$s_!Yawl!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Feb2b936b-033c-4618-b61e-eb9dfc148a86_1000x560.jpeg 848w, https://substackcdn.com/image/fetch/$s_!Yawl!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Feb2b936b-033c-4618-b61e-eb9dfc148a86_1000x560.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!Yawl!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Feb2b936b-033c-4618-b61e-eb9dfc148a86_1000x560.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!Yawl!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Feb2b936b-033c-4618-b61e-eb9dfc148a86_1000x560.jpeg" width="1000" height="560" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/eb2b936b-033c-4618-b61e-eb9dfc148a86_1000x560.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:560,&quot;width&quot;:1000,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:359902,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://www.thefutureiselsewhere.com/i/189004366?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Feb2b936b-033c-4618-b61e-eb9dfc148a86_1000x560.jpeg&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!Yawl!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Feb2b936b-033c-4618-b61e-eb9dfc148a86_1000x560.jpeg 424w, https://substackcdn.com/image/fetch/$s_!Yawl!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Feb2b936b-033c-4618-b61e-eb9dfc148a86_1000x560.jpeg 848w, https://substackcdn.com/image/fetch/$s_!Yawl!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Feb2b936b-033c-4618-b61e-eb9dfc148a86_1000x560.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!Yawl!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Feb2b936b-033c-4618-b61e-eb9dfc148a86_1000x560.jpeg 1456w" sizes="100vw" fetchpriority="high"></picture></div></a></figure></div><p>The AI crisis arrived without fanfare. There were no alarms, no cascading red dashboards, no breathless messages from the security operations center. At 3:17 a.m., somewhere between Singapore and Rotterdam, an AI routing agent inside a global logistics company glitched slightly. Just milliseconds. But that agent sat at the center of thousands of shipments, negotiating contracts, rerouting containers, balancing fuel costs and port congestion in real time. </p><p>That week, the company&#8217;s cloud provider had quietly shifted workloads to a different region after an energy price spike. A frontier model vendor had rolled out an update that subtly changed how the system reasoned. 
An external technology partner, granted limited access months earlier, had folded usage patterns into broader product improvements now available to competitors. Nothing was hacked. Nothing was stolen. Yet by quarter&#8217;s end, delivery times slipped, margins thinned, and the firm&#8217;s once-sharp operational instincts felt strangely generic.</p><p>This was not a cyberattack. It was a sovereignty failure. The company did not control the intelligence it depended on. Small external shifts in infrastructure, models, and learning loops compounded into strategic drift, and the firm had no easy way to recalibrate. In a world where AI systems mediate core decisions, enterprise sovereignty is not about keeping intruders out. It is about ensuring that when the intelligence layer beneath your business moves, you are the one steering it.</p><p>Over the past year, sovereignty has been framed largely as a geographic issue. Should data sit on premises or in the cloud? Should models be hosted domestically or offshore? These are not irrelevant considerations, but they increasingly resemble economic trade-offs rather than existential strategic choices. Compute can be shifted. Data centers can be mirrored. Regulatory constraints can often be engineered around. The deeper question is more uncomfortable: <em>how much of your end-to-end supply chain of intelligence do you actually control, and how much of it rests on layers you neither see nor govern?</em></p><p>At Davos, NVIDIA CEO Jensen Huang <a href="https://blogs.nvidia.com/blog/davos-wef-blackrock-ceo-larry-fink-jensen-huang/">described</a> AI not as a monolithic breakthrough but as a five-layer cake. At its base sits energy. Above that, chips and computing infrastructure. Then cloud data centers. Then AI models. And finally, the application layer where intelligence expresses itself in products, services, and workflows. Each layer must be financed, constructed, and operated. 
Each embeds its own capital intensity, geopolitical exposure, and technical constraints. Huang&#8217;s point was that this platform shift will generate economic activity across sectors, from power generation and advanced manufacturing to cloud operations and software development. Yet implicit in his metaphor is something else: each layer is also a potential point of sovereign vulnerability.</p><p>The volatility of AI economics rarely surfaces in the user interface. <a href="https://www.deloitte.com/global/en/services/consulting/perspectives/how-to-navigate-economics-of-ai.html">It is buried in the architecture.</a> A token is not simply a unit of text&#8212;it is a compressed signal of infrastructure. Each one carries the fingerprint of a GPU generation, the power draw of its rack, the bandwidth of its interconnects, the latency across regions, and the complexity of the model architecture behind it. When electricity prices spike, inference costs don&#8217;t just rise&#8212;they ripple across the stack. When a new chip improves performance per watt, cost curves bend. When storage or network throughput lags, user experience suffers. Token economics is infrastructure economics rendered in milliseconds. And the companies that ignore this hidden volatility risk finding that their margins are tethered to physical and geopolitical forces they neither see nor control.</p><p>Many executives assume that if their AI applications are functioning smoothly today, their strategy is secure. But surface stability can mask structural fragility. A change in model licensing terms can flow upward into customer-facing experiences. A regulatory restriction on cross-border data flows can constrain training pipelines. A reliance on a single orchestration framework can make it prohibitively expensive to migrate to an alternative provider. In this context, sovereignty is not about physical location. 
It is about strategic leverage and the capacity to reconfigure your intelligence stack when conditions change.</p><p>Microsoft CEO Satya Nadella <a href="https://www.weforum.org/meetings/world-economic-forum-annual-meeting-2026/sessions/conversation-with-satya-nadella-ceo-of-microsoft/">made</a> a similar argument in Davos when he suggested that the physical location of a data center is &#8220;the least important thing&#8221; for AI sovereignty. What matters, he argued, is whether a firm can embed its tacit knowledge into model weights that it controls. If you cannot distill your proprietary customer data, operational history, and institutional expertise into models under your governance, then you are effectively leaking enterprise value into external systems. Nadella predicted that corporate sovereignty in the AI era would become one of the most discussed topics in boardrooms this year. His insight reframes sovereignty away from geography and toward cognition.</p><p>Nadella is correct that weights matter, but wrong to imply that they are sufficient. Fine-tuned models represent compressed organizational memory. They encode patterns from years of transactions, customer interactions, supply chain disruptions, and strategic decisions. In that sense, they resemble a vault, a dense numerical artifact containing the essence of how a company operates. But focusing exclusively on weights risks missing a more profound shift that is now underway. We have moved from retrieval computing, where competitive advantage stemmed from accessing information efficiently, to generative computing, where advantage emerged from synthesizing novel outputs from large-scale learned patterns. We are now entering the era of agentic computing, in which systems do not merely answer or generate but plan, coordinate, execute, and adapt across complex workflows.</p><p>In an agentic world, sovereignty extends beyond a single model. It resides in how intelligence is orchestrated. 
It lives in the design of workflows that determine which tasks are automated and which require human judgment. It is expressed in the guardrails that constrain autonomous action, the verification loops that ensure reliability, and the feedback mechanisms that continuously refine performance. Two companies may license the same foundation models, run on the same cloud infrastructure, and even possess similar volumes of data. Yet their outcomes can diverge dramatically. The difference lies not only in what they know, but in how they configure what they know.</p><div id="youtube2-9T78vFr2C4c" class="youtube-wrap" data-attrs="{&quot;videoId&quot;:&quot;9T78vFr2C4c&quot;,&quot;startTime&quot;:null,&quot;endTime&quot;:null}" data-component-name="Youtube2ToDOM"><div class="youtube-inner"><iframe src="https://www.youtube-nocookie.com/embed/9T78vFr2C4c?rel=0&amp;autoplay=0&amp;showinfo=0&amp;enablejsapi=0" frameborder="0" loading="lazy" gesture="media" allow="autoplay; fullscreen" allowautoplay="true" allowfullscreen="true" width="728" height="409"></iframe></div></div><p>Intelligence configuration is the emerging frontier of competitive advantage. How is work decomposed between humans and machines? Which decisions are delegated to agents and which are escalated to managers? How are agents granted access to internal tools and external APIs? How are exceptions surfaced and resolved? How is institutional knowledge encoded in prompts, policies, reinforcement learning loops, and monitoring dashboards? These design choices shape how value is created and captured. They determine whether intelligence accumulates within the enterprise boundary or dissipates into shared platforms.</p><p>Enterprise sovereignty, then, is less about isolation and more about optionality. It is the ability to switch model providers without dismantling your workflow architecture. It is the capacity to retrain systems on proprietary data without renegotiating fundamental platform dependencies. 
It is the discipline of mapping your exposure across energy, chips, infrastructure, models, and applications, and understanding where concentration risk resides. As intelligence becomes the essential ingredient in every transaction and interaction, the boundaries of the firm become cognitive as much as physical.</p><p>There is an enduring story about senior Coca-Cola executives who know the secret formula and are not permitted to fly on the same plane. Whether apocryphal or not, the symbolism captures a core truth about value creation. Certain assets are so central to a company&#8217;s future that their concentration represents a strategic risk. In the AI era, your secret formula may not be a chemical recipe locked in a vault. It may be a constellation of fine-tuned weights, proprietary reinforcement loops, curated data pipelines, and uniquely configured networks of agents working in concert with human teams.</p><p>Defending enterprise sovereignty is ultimately about defending that constellation. It requires recognizing that the real attack surface is not only cybersecurity but dependency. It demands that boards and executives look beneath the interface layer to the stack, and beneath the stack to the configuration of intelligence itself.</p><p>The next disruption may not arrive as a breach notification. It may appear as a subtle shift in energy pricing, a model update that alters performance characteristics, or a vendor policy change that constrains how your data can be used. 
Enterprise sovereignty is your capacity to absorb intelligence shocks, reconfigure your architecture, and ensure that the secret formula of your organization remains firmly within your control.</p>]]></content:encoded></item><item><title><![CDATA[When Your AI Goes Shopping]]></title><description><![CDATA[How personal agents and retailer assistants are reshaping power in digital commerce]]></description><link>https://www.thefutureiselsewhere.com/p/when-your-ai-goes-shopping</link><guid isPermaLink="false">https://www.thefutureiselsewhere.com/p/when-your-ai-goes-shopping</guid><dc:creator><![CDATA[Mike Walsh]]></dc:creator><pubDate>Fri, 20 Feb 2026 14:53:58 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!ajpJ!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F93cb29f4-5480-46dd-8b41-7556c43e1e08_1000x563.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" 
href="https://substackcdn.com/image/fetch/$s_!ajpJ!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F93cb29f4-5480-46dd-8b41-7556c43e1e08_1000x563.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!ajpJ!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F93cb29f4-5480-46dd-8b41-7556c43e1e08_1000x563.jpeg 424w, https://substackcdn.com/image/fetch/$s_!ajpJ!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F93cb29f4-5480-46dd-8b41-7556c43e1e08_1000x563.jpeg 848w, https://substackcdn.com/image/fetch/$s_!ajpJ!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F93cb29f4-5480-46dd-8b41-7556c43e1e08_1000x563.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!ajpJ!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F93cb29f4-5480-46dd-8b41-7556c43e1e08_1000x563.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!ajpJ!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F93cb29f4-5480-46dd-8b41-7556c43e1e08_1000x563.jpeg" width="1000" height="563" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/93cb29f4-5480-46dd-8b41-7556c43e1e08_1000x563.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:563,&quot;width&quot;:1000,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:435570,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://www.thefutureiselsewhere.com/i/188619748?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F93cb29f4-5480-46dd-8b41-7556c43e1e08_1000x563.jpeg&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!ajpJ!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F93cb29f4-5480-46dd-8b41-7556c43e1e08_1000x563.jpeg 424w, https://substackcdn.com/image/fetch/$s_!ajpJ!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F93cb29f4-5480-46dd-8b41-7556c43e1e08_1000x563.jpeg 848w, https://substackcdn.com/image/fetch/$s_!ajpJ!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F93cb29f4-5480-46dd-8b41-7556c43e1e08_1000x563.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!ajpJ!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F93cb29f4-5480-46dd-8b41-7556c43e1e08_1000x563.jpeg 1456w" sizes="100vw" fetchpriority="high"></picture></div></a></figure></div><p>In the mid-nineties, designers did not know what online shopping was supposed to look like. So they borrowed from the physical world. Early retail websites featured isometric shopping carts gliding down digital aisles. Shelves were rendered in crude 3D. You clicked arrows to &#8220;walk&#8221; through a store. After a while, search bars replaced aisles, and recommendation engines became the new merchandising layer. Eventually, mobile screens collapsed the store into a feed. We are now at another such inflection point. Retailers are redesigning the storefront again. But this time, the shopper may not even be human.</p><p>Over the past eighteen months, the largest U.S. retailers have begun quantifying the impact of AI-powered shopping assistants. 
Walmart&#8217;s Sparky, embedded directly into its mobile app, is one of the clearest early case studies. On its most recent earnings call, Walmart <a href="https://www.diginomica.com/sparks-fly-walmarts-ai-shopping-assistant-gets-ready-go-global">disclosed</a> that customers who engage with Sparky generate average order values approximately 35 percent higher than non-users, and that roughly half of U.S. app users have tried the assistant.</p><p>Lowe&#8217;s has <a href="https://www.customerexperiencedive.com/news/lowes-virtual-assistants-boost-satisfaction-and-sales/806085/">reported</a> similar traction. Its Mylow assistant answers nearly one million customer questions per month, and the company has stated that customer engagement with Mylow more than doubles conversion rates. In-store, the associate-facing Mylow Companion tool has been linked to a 200 basis point increase in customer satisfaction scores. The lesson is straightforward. When AI is embedded directly in high-intent surfaces and connected to fulfillment and inventory systems, it can drive measurable commercial outcomes.</p><p>Amazon, meanwhile, has <a href="https://www.aboutamazon.com/news/retail/amazon-rufus-ai-shopping-assistant">positioned</a> its AI assistant Rufus as a generative shopping guide capable of answering product questions, comparing items, and supporting research. But the more interesting move may be Amazon&#8217;s &#8220;Buy for Me&#8221; feature, which allows customers to purchase select items from third-party brand sites without leaving the Amazon app. That blurs the line between retailer and intermediary. Amazon becomes not just a marketplace, but a purchasing agent. <a href="https://www.customerexperiencedive.com/news/amazon-ceo-retailers-upper-hand-agentic-ai-shopping/811641/">According</a> to Amazon CEO Andy Jassy, customers who used Rufus were about 60 percent more likely to complete their purchase.</p><p>The stakes are high. 
Retail media, the practice of selling sponsored placements within retailer ecosystems, has <a href="https://www.emarketer.com/content/retail-media-ad-spending-forecast">grown</a> into a global market projected to approach $200 billion by 2026&#8211;2027. For companies such as Amazon, advertising has become one of the highest-margin segments of the business. If human eyeballs are replaced by machine queries, sponsored placements stop influencing people and must start influencing algorithms. Retail media becomes less about persuasion and more about protocol.</p><p>At the same time, AI is reshaping retail far beyond the front end. Amazon has <a href="https://www.aboutamazon.com/news/operations/amazon-ai-supply-chain">reported</a> that AI-driven forecasting has delivered a 10 percent improvement in long-term national forecasts for deal events and a 20 percent improvement in regional forecasts for millions of popular items. Walgreens has <a href="https://www.cnbc.com/2025/05/11/walgreens-doubles-down-on-robots-to-fill-prescriptions-amid-turnaround.html">disclosed</a> that its micro-fulfillment centers now fill approximately 16 million prescriptions per month, with shipped volumes up 24 percent year over year and roughly 40 percent of total prescription volume at serviced stores handled through these facilities. Best Buy has <a href="https://www.forbes.com/sites/maribellopez/2025/06/17/how-best-buy-uses-ai-to-transform-customer-experience/">reported</a> that AI-driven call summarization has reduced average engagement time in customer service interactions by nearly 5 percent.</p><p>These are early but tangible indicators of what might be called <a href="https://www.amazon.com/Abundant-Intelligence-Digital-Rewrite-Business-ebook/dp/B0GD878WCT/">digital labor</a>: AI systems that do not merely answer questions but execute tasks, compress cycle times, and reallocate human effort. The front-end assistant and the back-end automation are converging. 
Yet the most destabilizing force may not be retailer-owned assistants at all. It may be the rise of personal autonomous agents and AI-native browsers.</p><p>In 2025, OpenAI introduced Operator, a browser-based agent designed to handle repetitive tasks such as filling out forms or ordering groceries by interacting directly with web interfaces. Perplexity launched Comet, an AI-native browser explicitly marketed around delegating tasks such as shopping and applying promo codes. OpenAI later introduced Instant Checkout and an Agentic Commerce Protocol, allowing merchants to integrate directly so that users can complete purchases within ChatGPT with tokenized payments and explicit confirmation flows.</p><p>OpenAI has <a href="https://openai.com/index/the-state-of-enterprise-ai-2025-report/">stated</a> that ChatGPT now serves more than 800 million weekly users. Even if only a fraction of those users begin experimenting with agentic shopping, the distribution leverage is extraordinary.</p><p>Unlike Sparky or Rufus, these agents do not belong to a retailer. They operate across the open web. They can log into accounts, compare products across sites, and execute multi-step workflows. Some rely on formal protocols and APIs. Others use UI automation to mimic human browsing behavior. That distinction is not merely technical. It is strategic. Retailer agents optimize inside a walled garden. Personal agents optimize across the entire digital landscape.</p><p>The tension has already surfaced publicly. In late 2025, Amazon <a href="https://www.retaildive.com/news/amazon-sues-perplexity-ai-shopping-agents/804871/">threatened</a> legal action against Perplexity over its agentic shopping tool, alleging covert access to customer accounts and disguised automation. This is the first visible skirmish in what may become a broader contest over who controls the decision interface in commerce.</p><p>From the consumer&#8217;s perspective, the promise is appealing. 
Instead of browsing, filtering, and comparing manually, the user delegates intent: restock the pantry under a certain budget, plan a themed event, optimize purchases for sustainability. The agent executes. From the retailer&#8217;s perspective, however, the shift is existential. If the customer&#8217;s AI shops on their behalf, the traditional surfaces for merchandising, branding, and advertising are reduced or eliminated.</p><p>Adoption signals suggest consumers are becoming more comfortable acting on AI guidance, even if full delegation remains nascent. Adobe&#8217;s holiday retail reporting <a href="https://business.adobe.com/uk/blog/ai-driven-traffic-surges-across-industries">indicates</a> that nearly half of consumers expressed trust in AI-driven shopping experiences in 2025, and that a majority of users who encounter AI-generated links click through on them. Traffic from AI chatbots to e-commerce sites has surged year over year, albeit from a relatively small base. The shift from answering to doing is underway.</p><p>Still, the security and governance challenges are nontrivial. A growing body of research demonstrates that large language model&#8211;integrated systems are vulnerable to indirect prompt injection, in which malicious instructions embedded in web content are treated as commands by an agent. In early 2026, a prompt injection vulnerability in OpenClaw, an open-source agent toolchain, was <a href="https://www.kaspersky.co.uk/blog/openclaw-vulnerabilities-exposed/30037/">exploited</a>, highlighting how quickly such systems can become vectors for abuse. When agents can execute transactions, prompt injection is no longer a hallucination problem. It is a financial risk.</p><p>These dynamics point to three plausible five-year scenarios.</p><p>In the first, retailer-centric assistants dominate. Sparky, Rufus, Mylow, and similar tools remain embedded in retailer-owned apps and sites, driving higher basket sizes and deeper loyalty. 
Retail media adapts to conversational interfaces but remains within the retailer&#8217;s walled garden. The retailer controls the intelligence layer and the economic capture.</p><p>In the second, platform-centric assistants become the primary gateway. Consumers initiate shopping journeys within ChatGPT, Gemini, or AI-native browsers. Retailers integrate via standardized commerce protocols, supplying product data and fulfillment capacity while ceding control of the initial interaction. Retail media migrates into new forms of sponsored recommendations within AI environments. The intelligence layer shifts upward.</p><p>In the third, personal agents gain traction. Consumers rely on persistent AI systems that maintain context across retailers and sessions. The website becomes less a destination and more an endpoint, a structured data feed and fulfillment engine optimized for machine legibility. Retailers compete on API quality, delivery speed, transparent pricing, and reliability rather than on visual merchandising.</p><p>Physical infrastructure remains decisive across all three scenarios. AI can mediate choice, but it cannot yet deliver a package. Forecasting improvements, warehouse automation investments, and the fine-tuning of proprietary retail AI models are likely to remain the backbone of any agentic promise. If agents make purchasing instantaneous, fulfillment performance becomes even more visible.</p><p>The deeper shift is not from websites to chat interfaces. It is from human-driven browsing to delegated decision-making. The shopping journey is collapsing. What once required browsing, comparison, and deliberation is being compressed into a single delegated instruction. As AI assistants evolve from answering to acting, the central strategic question for retailers is no longer how to build a better interface. It is who sits between the customer and the transaction.</p><p>The next storefront may not be a store at all. 
It may be a negotiation between your AI and someone else&#8217;s.</p>]]></content:encoded></item><item><title><![CDATA[AI Is Repricing the Market — But Not in the Way You Think]]></title><description><![CDATA[From Sector Panic to Cognitive Leverage as the New Driver of Equity Value]]></description><link>https://www.thefutureiselsewhere.com/p/ai-is-repricing-the-market-but-not</link><guid isPermaLink="false">https://www.thefutureiselsewhere.com/p/ai-is-repricing-the-market-but-not</guid><dc:creator><![CDATA[Mike Walsh]]></dc:creator><pubDate>Wed, 11 Feb 2026 15:46:19 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!xcNY!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc2f62a1c-47ba-4257-994d-463423dd5ccb_6000x3500.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!xcNY!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc2f62a1c-47ba-4257-994d-463423dd5ccb_6000x3500.jpeg" 
data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!xcNY!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc2f62a1c-47ba-4257-994d-463423dd5ccb_6000x3500.jpeg 424w, https://substackcdn.com/image/fetch/$s_!xcNY!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc2f62a1c-47ba-4257-994d-463423dd5ccb_6000x3500.jpeg 848w, https://substackcdn.com/image/fetch/$s_!xcNY!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc2f62a1c-47ba-4257-994d-463423dd5ccb_6000x3500.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!xcNY!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc2f62a1c-47ba-4257-994d-463423dd5ccb_6000x3500.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!xcNY!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc2f62a1c-47ba-4257-994d-463423dd5ccb_6000x3500.jpeg" width="1456" height="849" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/c2f62a1c-47ba-4257-994d-463423dd5ccb_6000x3500.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:849,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:2930347,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://www.thefutureiselsewhere.com/i/187640087?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc2f62a1c-47ba-4257-994d-463423dd5ccb_6000x3500.jpeg&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!xcNY!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc2f62a1c-47ba-4257-994d-463423dd5ccb_6000x3500.jpeg 424w, https://substackcdn.com/image/fetch/$s_!xcNY!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc2f62a1c-47ba-4257-994d-463423dd5ccb_6000x3500.jpeg 848w, https://substackcdn.com/image/fetch/$s_!xcNY!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc2f62a1c-47ba-4257-994d-463423dd5ccb_6000x3500.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!xcNY!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc2f62a1c-47ba-4257-994d-463423dd5ccb_6000x3500.jpeg 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg 
role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>By 10:47 a.m. on Wednesday morning, billions of dollars had evaporated from wealth management stocks. There had been no earnings miss. No regulatory shock. No fraud. Just a press release from a startup <a href="https://www.businesswire.com/news/home/20260210142841/en/Altruist-Introduces-AI-Powered-Tax-Planning-in-Hazel-Helping-Advisors-Deliver-Tax-Strategies-in-Minutes">announcing</a> an AI-powered financial planning tool that could analyze tax returns, generate scenarios, and personalize investment strategies in minutes. Within hours, asset managers in London were sliding in sympathy. Brokerage firms in New York were down sharply. Days earlier, legal publishers and data providers had suffered similar fates after the launch of new AI research tools. 
A wealth manager and a legal publisher have very little in common. Yet, as <a href="https://www.ft.com/content/5904b66f-2144-44d7-af24-66c075677d92">reported</a> by the FT, their stocks fell for the same reason in the same week.</p><p>Investors were no longer analyzing industries. They were scanning for automation exposure.</p><p>This is the early shape of a new valuation regime. Markets are beginning to price artificial intelligence risk everywhere, but they are doing so bluntly. Entire segments are being discounted not because revenues have collapsed, but because someone somewhere might automate part of what they do. Software companies are <a href="https://www.reuters.com/business/us-software-stocks-stabilize-after-bruising-selloff-ai-disruption-fears-2026-02-05/">punished</a> because AI agents could reduce seat licenses. Insurance brokers are sold because an app can compare policies. Professional services firms are repriced because generative models can draft, summarize, and create advice. In each case, the reaction is category-level. The assumption is that if AI can do a task, then firms built around that task must be structurally impaired.</p><p>In the short to medium term, this pattern is understandable. Equity valuation depends on assumptions about margins, growth rates, and the durability of competitive advantage. AI destabilizes all three at once. It threatens fee structures by lowering the cost of delivering knowledge work. It compresses barriers to entry by making sophisticated capabilities widely accessible. It makes long-term forecasts harder because productivity gains are nonlinear and unevenly distributed. Faced with this uncertainty, analysts default to caution. They lower multiples, raise discount rates, and trim guidance. When in doubt, they sell first and revisit later.</p><p>But this transitional phase of broad devaluation should not be confused with long-term structural decline. 
We are witnessing the first-order reaction to a general-purpose technology. History suggests that when a foundational technology emerges, markets initially punish exposure to perceived risk before they learn to differentiate between those who will be disrupted and those who will be transformed. The current wave of repricing reflects anxiety about automation, not yet insight into configuration.</p><p>This distinction matters. The dominant valuation gap of the last two decades separated technology companies from traditional firms. Software commanded premium multiples because it was asset-light, scalable, and defensible. Industrial, financial, and service firms were valued more conservatively because they were labor-intensive and capital-bound. AI is beginning to dissolve that divide.</p><p>As I argue with my co-author <a href="https://www.linkedin.com/in/nitinmittal0101/">Nitin Mittal</a> in our new book, <a href="https://www.amazon.com/Abundant-Intelligence-Digital-Rewrite-Business-ebook/dp/B0GD878WCT">Abundant Intelligence: How Digital Labor Will Rewrite the Rules of Business</a>, the next valuation frontier will not be tech versus non-tech. It will be cognitively leveraged versus cognitively constrained. Cognitive leverage is the ratio of useful intelligence applied to a problem to its cost. Digital labor, in the form of AI agents, adaptive robots, and machine reasoning systems, dramatically reduces the unit cost of cognition and of getting things done.</p><p>When intelligence becomes abundant, the economic question shifts from access to configuration. The firms that will command premium valuations are not simply those that deploy AI tools, but those that redesign their operating models around scalable intelligence. 
They will reallocate work between humans and machines deliberately, increase the speed of decision cycles, and expand the scope of what each employee can accomplish.</p><p>In the interim, however, we should expect volatility and value destruction. As AI tools improve, business models built on selling standardized cognitive outputs will come under pressure. Subscription software priced per user may face headwinds if autonomous agents can perform tasks across platforms.</p><p>Professional services firms that bill by the hour may struggle if clients expect AI-augmented productivity gains to translate into lower fees. Education platforms that monetize answers will compete against free generative tutors. These shifts can compress margins and reduce growth rates before companies adapt.</p><p>Yet the longer-term effect is more nuanced. Digital labor does not simply eliminate work; it redistributes and reconfigures it. When routine analysis, drafting, or coordination becomes machine-augmented, human effort can be redirected toward higher-order judgment, creative synthesis, and system design.</p><p>Organizations that treat AI as a bolt-on efficiency tool may realize incremental savings. Those that redesign workflows, governance structures, and incentive systems around blended human-machine intelligence can unlock operating leverage that traditional metrics struggle to capture.</p><p>This is where equity markets will eventually refine their lens. Rather than discounting entire sectors based on automation exposure, investors will begin to assess how firms configure intelligence. Do they own proprietary data that improves their models? Have they redesigned processes to eliminate redundant human friction? Are they able to scale output without proportional headcount growth? Can they increase revenue per unit of cognition, not just revenue per employee? These questions cut across industry boundaries.</p><p>Consider two wealth management firms. Both face AI-driven planning tools. 
One treats them as back-office assistants to reduce paperwork. The other integrates digital agents into client onboarding, portfolio construction, risk modeling, and continuous engagement, enabling each advisor to serve twice as many clients with higher personalization. The first experiences margin compression. The second expands capacity and improves outcomes. From a sector perspective, both are &#8220;wealth managers.&#8221; From a cognitive architecture perspective, they are fundamentally different enterprises.</p><p>The same divergence will emerge in insurance, law, engineering, healthcare, and manufacturing. Some firms will cling to labor-based models and see multiples compress. Others will build operating systems around digital labor, improving speed, scale, and scope while reducing error and latency. The equity market will eventually recognize that intelligence configuration, not industry label, determines sustainable advantage.</p><p>In the near term, markets are pricing fear. The volatility in software, financial services, and professional content reflects a rational awareness that the old economics of knowledge work are unstable. But indiscriminate selloffs obscure a more strategic reality. AI is not simply an automation wave; it is a redefinition of how organizations create and capture value. The abundance of intelligence shifts scarcity toward design, orchestration, and governance.</p><p>For CEOs and boards, this is not merely a technology strategy question. It is a capital markets question. If valuation increasingly reflects cognitive leverage, then leadership must measure and manage it explicitly. They must understand where human judgment is essential, where machine autonomy adds speed and precision, and how the two interact. They must move beyond pilot projects toward systemic redesign. Because the market will not wait for perfect clarity. 
It will continue to scan for vulnerability and reward adaptability.</p><p>The recent selloffs across unrelated sectors are an early signal of this shift. Investors are searching for a new organizing principle in an AI-shaped economy. They have not yet found the right metric. When they do, the valuation gap that defined the digital era will be replaced by a new one. It will not separate technology firms from traditional industries. It will separate those who scale intelligence from those who merely consume it.</p>]]></content:encoded></item><item><title><![CDATA[Digital Labor Isn’t Going Away, No Matter What You Call It]]></title><description><![CDATA[Why Cheap Cognition Changes the Economics of Work]]></description><link>https://www.thefutureiselsewhere.com/p/digital-labor-isnt-going-away-no</link><guid isPermaLink="false">https://www.thefutureiselsewhere.com/p/digital-labor-isnt-going-away-no</guid><dc:creator><![CDATA[Mike Walsh]]></dc:creator><pubDate>Sat, 07 Feb 2026 14:42:59 GMT</pubDate><enclosure 
url="https://substackcdn.com/image/fetch/$s_!a6L8!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F32efb11d-5147-4c19-94e0-16fbbfbcfeae_1000x667.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!a6L8!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F32efb11d-5147-4c19-94e0-16fbbfbcfeae_1000x667.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!a6L8!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F32efb11d-5147-4c19-94e0-16fbbfbcfeae_1000x667.jpeg 424w, https://substackcdn.com/image/fetch/$s_!a6L8!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F32efb11d-5147-4c19-94e0-16fbbfbcfeae_1000x667.jpeg 848w, https://substackcdn.com/image/fetch/$s_!a6L8!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F32efb11d-5147-4c19-94e0-16fbbfbcfeae_1000x667.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!a6L8!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F32efb11d-5147-4c19-94e0-16fbbfbcfeae_1000x667.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!a6L8!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F32efb11d-5147-4c19-94e0-16fbbfbcfeae_1000x667.jpeg" width="1000" height="667" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/32efb11d-5147-4c19-94e0-16fbbfbcfeae_1000x667.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:667,&quot;width&quot;:1000,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:609244,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://www.thefutureiselsewhere.com/i/187199646?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F32efb11d-5147-4c19-94e0-16fbbfbcfeae_1000x667.jpeg&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!a6L8!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F32efb11d-5147-4c19-94e0-16fbbfbcfeae_1000x667.jpeg 424w, https://substackcdn.com/image/fetch/$s_!a6L8!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F32efb11d-5147-4c19-94e0-16fbbfbcfeae_1000x667.jpeg 848w, https://substackcdn.com/image/fetch/$s_!a6L8!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F32efb11d-5147-4c19-94e0-16fbbfbcfeae_1000x667.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!a6L8!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F32efb11d-5147-4c19-94e0-16fbbfbcfeae_1000x667.jpeg 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" 
width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p></p><p>For the last year, the debate around AI at work has split into two unhelpful extremes. On one side, we have breathless talk of &#8220;AI coworkers,&#8221; complete with onboarding rituals, performance reviews, and soft-focus imagery of humans and machines collaborating happily at their desks. On the other, we have an anxious counter-reaction that insists this language is dangerous, misleading, and fundamentally wrong, because AI systems are not people and should never be spoken of as if they were. Both camps miss the point. The real question is not whether machines deserve human metaphors, but whether leaders understand what kind of economic force they are unleashing, and what kind of organization that force demands.</p><p>Calling something &#8220;labor&#8221; has never meant it is human. 
It means it performs work, incurs cost, produces output, and sits inside a system of incentives, controls, and trade-offs. We already accept this logic in countless places without controversy. We speak of mechanical labor, industrial labor, and even &#8220;work&#8221; performed by capital assets, logistics networks, or energy infrastructure. The word labor is not a sentimental label. It is an accounting term, a political term, and a strategic one. It tells us where effort is applied, how value is created, and who captures the surplus.</p><p>The discomfort with the phrase &#8220;digital labor&#8221; often comes from confusing metaphor with mechanism. The fear is that if we talk about AI as labor, executives will start treating software like employees, importing human management practices into systems that do not need motivation, morale, or meaning. That fear is not unfounded. We have already seen organizations fall into the trap of grafting new technology onto old organizational charts, preserving familiar roles and routines while claiming transformation. But that failure is not caused by the term. It is caused by shallow thinking. Bad metaphors do not invalidate good economics.</p><p>What digital labor actually names is a shift in how work gets done, measured, and priced. AI systems do not simply assist humans. They execute tasks end to end, at variable cost, with increasing autonomy, and with performance characteristics that are fundamentally different from human workers. They scale instantly, improve unevenly, fail in strange ways, and demand oversight that looks nothing like traditional management. Pretending this is just &#8220;capability&#8221; without acknowledging its labor-like effects does not make organizations wiser. It makes them blind.</p><p>This blindness shows up most clearly in how firms talk about productivity. When AI is framed purely as a tool, leaders focus on local efficiency gains. Faster reports. Cheaper analysis. 
Fewer errors in routine tasks. These improvements matter, but they are not the transformation. The transformation comes when the cost of performing cognitive work collapses and organizations are forced to rethink which activities are scarce and which are abundant.</p><p>At first, this shows up as redistribution. Tasks move. Responsibilities shift. What once required teams now requires supervision. What once consumed days collapses into minutes. Work does not disappear so much as it migrates, flowing toward the edges where judgment, context, and accountability still matter. But redistribution is only the visible surface of change. If leaders stop there, they mistake motion for progress.</p><p>The deeper shift occurs when organizations recognize that collapsing cognitive costs undermine the logic of existing processes. When work becomes cheap and fast, many structures no longer make sense. Approval layers exist because information was scarce. Handoffs exist because humans were slow. Entire organizational designs evolved to manage limitation, not to maximize value creation. Digital labor exposes this reality relentlessly, forcing a question most firms avoid: if this process were designed today, knowing what machines can now do, would it exist at all?</p><p>This is why digital labor cannot be reduced to a workforce debate. Labor is not just something you manage. It is something you allocate. It competes with capital. It reshapes bargaining power. It determines how value flows through the firm. When AI performs meaningful portions of knowledge work, the organization is not merely adopting a technology. It is redefining its production function. Ignoring this reality because the word &#8220;labor&#8221; feels anthropomorphic does not make organizations more precise. It makes them strategically incoherent.</p><p>The opposite reaction, fear of mass job displacement, suffers from a similar lack of depth. 
It assumes a zero-sum replacement model, where machines simply take human jobs and the story ends. History suggests something more complex. Technological shifts rarely eliminate work in aggregate. They reprice it. They change where value is created, which skills command a premium, and which roles lose their economic justification. The political consequences are real, but they are not caused by machines acting independently. They are shaped by decisions leaders make about deployment, governance, and distribution.</p><p>Digital labor does not automatically destroy jobs. It destroys certain task bundles. It exposes inefficiencies that were previously hidden inside roles. It forces organizations to confront how much of their structure exists to coordinate human limitation rather than to create value. In doing so, it often increases demand for judgment, system design, oversight, and creative problem solving, even as it reduces demand for routine execution. The danger is not that machines work. The danger is that institutions fail to adapt.</p><p>One reason this debate remains stuck is that we lack a language for hybrid systems. Tools are subordinate. Workers are autonomous. AI agents are neither. They act independently within boundaries, they learn from feedback, and they require governance rather than supervision. Calling them tools understates their agency. Calling them coworkers overstates their humanity. Digital labor sits in between. It is productive capacity that must be designed into workflows, not bolted on.</p><p>The deeper challenge is not reallocating tasks, but reimagining the system itself. When work can be executed at near-zero marginal cognitive cost, many processes cease to make sense in their current form. Entire functions exist today to compensate for latency, error, and coordination overhead that no longer need to exist. Digital labor does not simply improve the organization. 
It questions whether the organization, as currently designed, is still fit for purpose.</p><p>This is where transformation either accelerates or stalls. Redistribution optimizes the existing machine. Reinvention questions whether the machine should exist at all. Firms that stop at redistribution preserve their hierarchies and routines while quietly automating away the substance inside them. Firms that pursue reinvention redesign decision rights, collapse layers, and rebuild workflows around intelligence rather than headcount.</p><p>Some argue these systems should be treated purely as capital investments, governed through portfolio logic rather than workforce thinking. There is truth here, but it is incomplete. Unlike traditional capital assets, digital labor is not static. Its performance can drift. Its costs can spike. Its failures can be rare but catastrophic. It requires continuous monitoring, retraining, and oversight. These are operating realities, not one-time investments. Treating AI purely as capital underestimates the work required to keep it reliable and aligned.</p><p>More importantly, capital language alone cannot capture the social and political dimensions of this shift. Labor has always been about power as much as productivity. Who controls work. Who benefits from efficiency gains. Who absorbs risk when systems fail. When AI performs work at scale, these questions do not disappear. They intensify. Avoiding the language of labor may feel safer, but it often obscures the very consequences leaders must confront.</p><p>The real issue, then, is not whether digital labor is the right phrase, but whether leaders are willing to engage with its implications. Digital labor does not mean machines are people. It means work has become programmable. It means cognition has a marginal cost. 
It means organizations must design systems in which humans and machines jointly create value, each doing what they do best, without pretending they are interchangeable.</p><p>This reframing also explains why superficial adoption fails. Simply inserting AI into existing roles preserves the old logic of the firm. It optimizes locally while leaving global structure untouched. True transformation requires rethinking workflows from first principles, asking which decisions should be automated, which should remain human, and which should be shared. It demands new metrics, new governance models, and new leadership capabilities.</p><p>Language matters because it directs attention. When we talk about digital labor, we force a confrontation with cost, substitution, and value capture. We ask how cheap cognition reshapes strategy. We ask who benefits when intelligence becomes abundant. Avoiding the term does not eliminate these questions. It merely delays them.</p><p>In the end, the future of work will not be decided by metaphors, but by design. Organizations that succeed will be those that treat AI neither as an employee nor as a gadget, but as a new productive force that reshapes everything it touches. They will measure it rigorously, govern it deliberately, and integrate it thoughtfully. Digital labor is not about making machines more human. It is about making organizations more intelligent.</p><p></p><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://www.thefutureiselsewhere.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">Thanks for reading The Future Is Elsewhere! 
Subscribe for free to receive new posts and support my work.</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div><p></p>]]></content:encoded></item><item><title><![CDATA[Steamboat Willie To Sora]]></title><description><![CDATA[Disney&#8217;s New AI Bet]]></description><link>https://www.thefutureiselsewhere.com/p/steamboat-willie-to-sora</link><guid isPermaLink="false">https://www.thefutureiselsewhere.com/p/steamboat-willie-to-sora</guid><dc:creator><![CDATA[Mike Walsh]]></dc:creator><pubDate>Fri, 09 Jan 2026 07:17:00 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!JrKi!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F01963529-dae3-4edd-8781-3ba61992181d_1408x768.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!JrKi!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F01963529-dae3-4edd-8781-3ba61992181d_1408x768.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!JrKi!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F01963529-dae3-4edd-8781-3ba61992181d_1408x768.jpeg 424w, https://substackcdn.com/image/fetch/$s_!JrKi!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F01963529-dae3-4edd-8781-3ba61992181d_1408x768.jpeg 848w, 
https://substackcdn.com/image/fetch/$s_!JrKi!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F01963529-dae3-4edd-8781-3ba61992181d_1408x768.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!JrKi!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F01963529-dae3-4edd-8781-3ba61992181d_1408x768.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!JrKi!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F01963529-dae3-4edd-8781-3ba61992181d_1408x768.jpeg" width="1408" height="768" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/01963529-dae3-4edd-8781-3ba61992181d_1408x768.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:768,&quot;width&quot;:1408,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:118983,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://www.thefutureiselsewhere.com/i/183981221?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F01963529-dae3-4edd-8781-3ba61992181d_1408x768.jpeg&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!JrKi!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F01963529-dae3-4edd-8781-3ba61992181d_1408x768.jpeg 424w, 
https://substackcdn.com/image/fetch/$s_!JrKi!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F01963529-dae3-4edd-8781-3ba61992181d_1408x768.jpeg 848w, https://substackcdn.com/image/fetch/$s_!JrKi!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F01963529-dae3-4edd-8781-3ba61992181d_1408x768.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!JrKi!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F01963529-dae3-4edd-8781-3ba61992181d_1408x768.jpeg 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" 
y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>At first glance, Disney&#8217;s recent moves look contradictory. The company announces a sweeping partnership with OpenAI that allows its characters to appear inside generative tools like Sora, while almost simultaneously firing off cease-and-desist letters to Google and pressing forward with aggressive litigation against Midjourney. To some observers, this looks like confusion. To anyone who has watched Disney for long enough, it looks like something else entirely. This is a company that has spent a century mastering the art of adapting control to new forms of participation.</p><p>The OpenAI agreement is not a casual collaboration or a vague pilot. Disney is putting real money behind it, reportedly a $1 billion equity investment in OpenAI as part of a three-year partnership that brings more than 200 characters into Sora and ChatGPT Images, including animated, masked, and creature characters across Disney, Pixar, Marvel, and Star Wars. The deal reportedly includes warrants that give Disney the right to acquire additional equity, and it is structured with careful exclusions, such as no talent likenesses or voices. In other words, Disney is not handing over the keys to its entire creative universe. It is licensing a very specific kind of use, under a very specific set of constraints, inside a distribution channel that it can influence</p><p>To understand what is really happening, it helps to revisit an older idea. In my first book, <em><a href="https://www.amazon.com/Futuretainment-Yesterday-World-Changed-Your/dp/0714848751/">Futuretainment</a></em> (2009), I argued that the defining shift in media was not digital distribution but participatory consumption. The audience was no longer content to sit back and watch. They wanted to interact, remix, customize, and inhabit the worlds they loved. This was not a fringe behavior. 
It was a structural change driven by new tools and social norms. From fan fiction to mashups, from mods to machinima, people were already treating entertainment as something to play with rather than something to receive.</p><p>At the time, many media companies framed this behavior as piracy or infringement. But my argument was that remix culture was not a rejection of creativity or authorship. It was an expression of deeper engagement. Fans were not stealing stories because they did not value them. They were reworking them precisely because they cared. Participation was becoming the new signal of loyalty. The mistake was assuming that control meant suppressing these behaviors rather than shaping them.</p><p>Generative AI takes that argument and detonates it at scale. What once required niche skills now takes a prompt. Anyone can visualize a character, extend a narrative, or invent an alternative version of a familiar world in seconds. This is not a new desire. It is a new interface. And it forces a hard choice on rights holders. You can either fight participation everywhere, or you can decide where and how it happens.</p><p>Disney has been through this before. Mickey Mouse is not just a character. He is a legal and cultural artifact that has repeatedly sat at the fault line between new technology and shifting consumer behavior. When Mickey debuted in 1928, synchronized sound was the disruptive force, and Disney quickly realized that technology could be an amplifier rather than a threat. As new reproduction technologies emerged, from television to home video to digital distribution, Mickey&#8217;s image became both ubiquitous and fiercely protected. Each extension of copyright around Mickey was not simply about money. 
It was about maintaining authorship and brand coherence in the face of wider access.</p><p>The famous copyright extensions that critics dubbed the &#8220;Mickey Mouse Protection Act&#8221; mirrored moments when copying became easier and distribution more diffuse. Xerox machines, VHS tapes, the internet. Every time the tools changed, the legal perimeter shifted. Disney&#8217;s genius was never in freezing culture in place. It was in ensuring that participation happened on terms it could manage.</p><p>Seen through that lens, the OpenAI deal makes sense. Disney is not endorsing a free-for-all remix culture. It is licensing participation into a controlled environment. OpenAI becomes a sanctioned playground where Disney characters can be used, explored, and recombined within guardrails that preserve identity and value. The company is not abandoning copyright. It is operationalizing it for a world where imagination is interactive by default.</p><p>The contrast with Midjourney and Google is telling. The lawsuits and cease-and-desist letters are not about fans experimenting with characters for fun. They are about systems that can produce near-identical replicas, that blur the line between inspiration and duplication, and that operate outside any negotiated framework. From Disney&#8217;s perspective, this is not participatory culture. It is unlicensed industrialization of its intellectual property.</p><p>What is really being negotiated here is the future of consumption itself. Entertainment is shifting from products to platforms, from finished artifacts to living systems. Characters become interfaces. Worlds become sandboxes. The value no longer sits only in the story that is told, but in the space that is created for others to tell stories of their own.</p><p>This is exactly the trajectory I described in <em>Futuretainment</em>. The future belongs to companies that can design for agency without surrendering authorship. 
The winners will not be those who lock everything down, nor those who let everything go, but those who curate participation in ways that feel empowering rather than restrictive.</p><p>Disney&#8217;s move with OpenAI is a bet that the next generation of fans will expect to play with stories, not just watch them. The company is choosing to license the remix rather than endlessly litigate it. At the same time, it is drawing a bright line around who gets to build the tools that make that remix possible.</p><p>This is not hypocrisy. It is strategy. Mickey Mouse has survived radio, television, home video, cable, the internet, and streaming by adapting the boundaries of control to each new medium. Generative AI is simply the latest chapter in that long history. The lesson for the rest of the media industry is clear. Participation is no longer optional. The only real question is whether you design for it, or spend the next decade trying to sue it out of existence.</p><p></p><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://www.thefutureiselsewhere.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">Thanks for reading The Future Is Elsewhere! 
Subscribe for free to receive new posts and support my work.</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div><p></p>]]></content:encoded></item><item><title><![CDATA[Great Expectations]]></title><description><![CDATA[Why Safe AI Depends on Understanding Human Behavior, Not Rules]]></description><link>https://www.thefutureiselsewhere.com/p/great-expectations</link><guid isPermaLink="false">https://www.thefutureiselsewhere.com/p/great-expectations</guid><dc:creator><![CDATA[Mike Walsh]]></dc:creator><pubDate>Fri, 05 Dec 2025 00:20:00 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!f-ye!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe7797e9c-8f65-47a8-b043-93e624747971_1000x563.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!f-ye!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe7797e9c-8f65-47a8-b043-93e624747971_1000x563.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!f-ye!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe7797e9c-8f65-47a8-b043-93e624747971_1000x563.jpeg 424w, https://substackcdn.com/image/fetch/$s_!f-ye!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe7797e9c-8f65-47a8-b043-93e624747971_1000x563.jpeg 
848w, https://substackcdn.com/image/fetch/$s_!f-ye!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe7797e9c-8f65-47a8-b043-93e624747971_1000x563.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!f-ye!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe7797e9c-8f65-47a8-b043-93e624747971_1000x563.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!f-ye!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe7797e9c-8f65-47a8-b043-93e624747971_1000x563.jpeg" width="1000" height="563" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/e7797e9c-8f65-47a8-b043-93e624747971_1000x563.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:563,&quot;width&quot;:1000,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:147795,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://tomorrowist.substack.com/i/183982901?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe7797e9c-8f65-47a8-b043-93e624747971_1000x563.jpeg&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!f-ye!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe7797e9c-8f65-47a8-b043-93e624747971_1000x563.jpeg 424w, 
https://substackcdn.com/image/fetch/$s_!f-ye!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe7797e9c-8f65-47a8-b043-93e624747971_1000x563.jpeg 848w, https://substackcdn.com/image/fetch/$s_!f-ye!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe7797e9c-8f65-47a8-b043-93e624747971_1000x563.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!f-ye!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe7797e9c-8f65-47a8-b043-93e624747971_1000x563.jpeg 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" 
y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>For years, Waymo&#8217;s autonomous vehicles were known as the politest drivers in San Francisco. They came to full stops, waited patiently at four-way intersections, yielded generously, and behaved with a kind of algorithmic courtesy that seemed almost naive. Then, as the <em>Wall Street Journal</em> recently <a href="https://www.wsj.com/lifestyle/cars/waymo-self-driving-cars-san-francisco-7868eb2b?st=WC7xXf">reported</a>, they began to behave very differently: darting around double-parked trucks, merging aggressively, accelerating the instant a light turned green, even performing the occasional illegal U-turn. Basically, like a NYC taxi driver.</p><p>But here&#8217;s the plot twist. This wasn&#8217;t a malfunction. It was an upgrade. Waymo&#8217;s leaders explained that the cars needed to become &#8220;confidently assertive&#8221; because extreme politeness was creating safety issues. A vehicle that followed the rules perfectly was behaving in ways that other drivers did not expect. And in traffic, unpredictability is a form of risk.</p><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://www.thefutureiselsewhere.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">Thanks for reading The Future Is Elsewhere! 
Subscribe for free to receive new posts and support my work.</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div><p>This is a revealing lesson for the broader challenge of AI safety. We tend to assume that the safest systems are the ones with the most rigid guardrails. But in many real-world settings, risk arises not from breaking rules, but from breaking expectations.</p><p>I learned this years ago when I lived in Istanbul. I love driving, especially loud, classic American muscle cars. But that aside, frankly, I&#8217;m also a cautious, rule-following driver. I quickly realized I could not safely drive in Turkey. Istanbul&#8217;s traffic isn&#8217;t chaotic &#8212; it&#8217;s coordinated by a different logic. Drivers move in fast, assertive rhythms, signaling intention less through indicators than through momentum and micro-negotiations. My careful Western driving style was out of sync with the system. I was the anomaly, and therefore the most dangerous person on the road.</p><p>This is exactly the dynamic that game theory helps illuminate. Human environments operate less like rulebooks and more like <strong>coordination games</strong>, where stability comes from shared expectations about how others will behave. These equilibria vary by culture, city, even neighborhood. They evolve through practice, not regulation. In such systems, a player who obeys the written rules but violates the unwritten equilibrium becomes unpredictable &#8212; and unpredictability destabilizes the game for everyone else.</p><p>Early autonomous vehicles made this mistake. They entered the traffic system playing a different game.
Their rule-bound behavior created what game theorists call <strong>strategy incoherence</strong>. Humans behaved according to local norms; the AI behaved according to formal laws. Safety problems emerged not because the AI was reckless, but because it was misaligned with the equilibrium.</p><p>This brings us to an idea I <a href="https://hbr.org/2025/01/how-much-supervision-should-companies-give-ai-agents">explored</a> in a <em>Harvard Business Review</em> article a little while back. When determining how much autonomy to grant AI agents, leaders usually focus on <em>how big</em> a potential risk is. But a more useful question is <em>what kind</em> of risk it is. Some problems are <strong>complicated</strong> &#8212; governed by fixed rules and stable relationships. Others are <strong>ambiguous</strong> &#8212; shaped by context, norms, and feedback loops. And some are <strong>uncertain</strong>, where neither rules nor data can reliably guide decisions.</p><p>Driving, despite its legal structure, is not complicated. Traffic laws are complicated. Driving is ambiguous.</p><p>Ambiguous problems are ones where AI systems become safer by engaging more deeply with the real world. More context improves their judgments. More interaction refines their models. As I wrote, <a href="https://hbr.org/2025/01/how-much-supervision-should-companies-give-ai-agents">&#8220;AI agents won&#8217;t always get it right, but they learn fast.&#8221;</a> That is why Waymo doesn&#8217;t rely on teleoperation when a car is confused. Human operators offer guidance, but the AI must continue making decisions. The system strengthens not through constraint, but through exposure. Seen through this lens, Waymo&#8217;s shift from hyper-cautious to assertive makes sense.</p><p>The safest behavior is not always the most conservative. It is the behavior that best matches the expectations of the people around the system. 
A decisive merge, a quick acceleration, or a tactical lane change may appear aggressive, but if it aligns with local driving norms, it is actually the more predictable and therefore safer choice.</p><p>Economists have a parallel concept. Milton Friedman argued that inflation is driven not just by prices but by expectations. Once people expect inflation, they act in ways that make it real. Human systems are governed by beliefs about how others will behave. Traffic is no different. An autonomous vehicle that violates these beliefs, even while obeying the law, injects uncertainty into a coordination game that depends on shared assumptions.</p><p>This insight applies far beyond robotaxis. As AI agents proliferate inside organizations, they will increasingly operate in domains filled with tacit norms: customer service, workflow orchestration, decision support, compliance, prioritization. These environments resemble ambiguous games, not rigid rule systems. The most dangerous agent will not be the bold one, but the one that acts out of sync with human expectations.</p><p>Current AI guardrail strategies don&#8217;t fully account for this. They treat AI systems as if they operate in complicated domains where the primary goal is to restrict behavior. But ambiguity requires a different approach. The goal is not to eliminate variation, but to shape behavior so that systems remain consistent with the equilibrium of the environment they inhabit.</p><p>This is the real challenge of autonomous systems. We are not simply programming machines to follow rules. We are teaching them how to participate in human coordination. The next frontier of AI safety is not technical constraint, but behavioral coherence &#8212; designing agents that understand the social, cultural, and contextual signals that make their actions legible and predictable to the humans around them.</p><p>Waymo&#8217;s assertive driving is an early glimpse of this future. 
The first generation of AI systems obeyed the rules. The next generation must understand the game.</p>]]></content:encoded></item><item><title><![CDATA[The Future of AI Governance Already Exists]]></title><description><![CDATA[It&#8217;s Called Tokyo]]></description><link>https://www.thefutureiselsewhere.com/p/the-future-of-ai-governance-already</link><guid isPermaLink="false">https://www.thefutureiselsewhere.com/p/the-future-of-ai-governance-already</guid><dc:creator><![CDATA[Mike Walsh]]></dc:creator><pubDate>Sat, 15 Nov 2025 12:20:00 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!H6Mu!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe316b0fa-d83d-44dd-a316-2565533d69db_1000x667.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!H6Mu!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe316b0fa-d83d-44dd-a316-2565533d69db_1000x667.jpeg" data-component-name="Image2ToDOM"><div
class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!H6Mu!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe316b0fa-d83d-44dd-a316-2565533d69db_1000x667.jpeg 424w, https://substackcdn.com/image/fetch/$s_!H6Mu!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe316b0fa-d83d-44dd-a316-2565533d69db_1000x667.jpeg 848w, https://substackcdn.com/image/fetch/$s_!H6Mu!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe316b0fa-d83d-44dd-a316-2565533d69db_1000x667.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!H6Mu!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe316b0fa-d83d-44dd-a316-2565533d69db_1000x667.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!H6Mu!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe316b0fa-d83d-44dd-a316-2565533d69db_1000x667.jpeg" width="1000" height="667" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/e316b0fa-d83d-44dd-a316-2565533d69db_1000x667.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:667,&quot;width&quot;:1000,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:662305,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://tomorrowist.substack.com/i/183994532?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe316b0fa-d83d-44dd-a316-2565533d69db_1000x667.jpeg&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!H6Mu!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe316b0fa-d83d-44dd-a316-2565533d69db_1000x667.jpeg 424w, https://substackcdn.com/image/fetch/$s_!H6Mu!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe316b0fa-d83d-44dd-a316-2565533d69db_1000x667.jpeg 848w, https://substackcdn.com/image/fetch/$s_!H6Mu!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe316b0fa-d83d-44dd-a316-2565533d69db_1000x667.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!H6Mu!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe316b0fa-d83d-44dd-a316-2565533d69db_1000x667.jpeg 1456w" sizes="100vw" fetchpriority="high"></picture></div></a></figure></div><p>Tokyo moves with the cool precision of an algorithmic machine. Step out of Shinjuku Station at dusk and the city unfurls around you in layers of neon haze&#8212;LED kanji flickering like loose packets on a network, billboards pulsing with the heartbeat of a vast digital organism. Crowds slide past in silent, perfect non-collision, as if everyone is running the same invisible protocol. In a metropolis of 37 million, you brace for chaos. Instead you get choreography&#8212;an improbable calm humming beneath the circuitry of the streets.</p><p>Greater Tokyo is the largest metropolitan region in the world. By any rational measure, this density should produce disorder: competing demands, clashing priorities, frayed nerves, and amplified frictions. Yet the opposite happens. Trains arrive to the second.
Streets stay clean without armies of inspectors. Public spaces remain safe without overt policing. People follow norms no one states but everyone understands. The overall effect is so natural you barely notice it&#8212;until you step outside the city and realize how rare it is.</p><p>After spending the last week immersed in this city, I&#8217;ve come to believe that its stability is not the product of engineering alone, though its infrastructure is superb. The deeper mechanism is a cultural operating system: a shared layer of simple behavioral norms. Respect for shared space. Consideration for strangers. The quiet choreography of queues, thresholds, and ritualized subway behavior. Tokyo&#8217;s order is not enforced; it emerges. It is the aggregated result of millions of small, internalized decisions that compound into large-scale stability.</p><p>This stands in stark contrast to other global cities&#8212;places I won&#8217;t name, because I visit them too often&#8212;where disorder persists despite a thicket of regulations, penalties, and punitive enforcement. More rules do not necessarily produce more order. In fact, they often signal its absence. Tokyo proves that in complex, densely interactive environments, norms beat rules every time.</p><p>And this, surprisingly, may be the most important lesson for the future of AI governance.</p><p>As organizations shift toward architectures built on swarms of autonomous agents, we are entering a world that will behave far more like a dense, dynamic city than a traditional computing system. Agentic technologies are inherently nondeterministic; they don&#8217;t simply execute instructions, they interpret, infer, and negotiate context. They learn. They misread. They interact in ways no designer can fully anticipate. 
And as these agents begin to operate at massive scale&#8212;potentially millions or billions coordinating workflows, optimizing supply chains, or engaging consumers&#8212;the system will not obey top-down control. It will exhibit emergence.</p><p>Today, most attempts to govern AI rely on constraints: filters, rules, compliance layers, and external guardrails. It&#8217;s an understandable instinct&#8212;if the system is unpredictable, tighten control. But this mirrors the cities that depend on punishment because shared norms are weak. It produces brittle governance: systems that behave well in the lab but fracture in the wild, the moment agents encounter novel situations or interact in unexpected combinations.</p><p>Tokyo offers a different blueprint. In complex adaptive systems, order doesn&#8217;t scale through restriction; it scales through coherence. Systems researchers describe coherence as the stable patterns that arise when independent components align their behavior without central direction. In complex environments, coherence&#8212;not control&#8212;is what prevents chaos.</p><p>Tokyo works because its behavioral substrate is aligned, not because its laws are draconian.</p><p>Instead of trying to script every action, the city embeds simple, universal behaviors at the lowest layer&#8212;norms that guide and constrain without prohibiting, shaping tendencies rather than dictating outcomes. These norms don&#8217;t eliminate uncertainty; they channel it. They create a predictable distribution of behavior even when individuals vary widely. They generate order by default, not by decree.</p><p>If we want safe, stable multi-agent AI ecosystems, we need to take a similar approach: embed normative priors into the foundation of our systems. Normative priors are behavioral defaults&#8212;embedded assumptions about how an agent should act, resolve uncertainty, coordinate with others, and interpret human intent. 
They bias agents toward pro-social behavior before they ever encounter real-world data. They&#8217;re not hard rules but foundational dispositions, guiding the agent toward predictable, human-aligned behavior as it learns, adapts, and interacts at scale.</p><p>Instead of specifying every rule, we define the behaviors we want to emerge:</p><ul><li><p>Respect human intent and oversight.</p></li><li><p>Be transparent about actions, goals, and reasoning.</p></li><li><p>Minimize unintended impact and avoid unnecessary escalation.</p></li><li><p>Default toward cooperation when interacting with other agents and humans.</p></li><li><p>Defer to humans and seek clarification when uncertain.</p></li><li><p>Communicate uncertainty and limitations explicitly.</p></li></ul><p>These are not constraints. They are behavioral tendencies&#8212;simple norms that, when shared across vast numbers of agents, create emergent governance: stability arising from alignment rather than force. And just as in Tokyo, perfect compliance isn&#8217;t required; predictable tendencies are enough to produce large-scale order.</p><p>This matters because the next generation of organizations will not resemble pyramids of reporting lines. They will be polycentric networks of humans and machines making real-time decisions. Trying to centrally police every action in that environment would be as futile as trying to direct traffic at Shibuya Crossing with a whistle. The only scalable strategy is to get the substrate right&#8212;to design agents that behave predictably even when acting autonomously.</p><p>Tokyo demonstrates that the simplest norms&#8212;embedded deeply and enacted consistently&#8212;can produce astonishing forms of order in environments that should, by all conventional logic, be chaotic. The city&#8217;s quiet choreography is not the result of constant oversight. 
It is what happens when a system&#8217;s dynamics are shaped by deep behavioral protocols&#8212;predictable patterns emerging without central control.</p><p>If we want our future societies of human and machine intelligence to function with similar coherence, we shouldn&#8217;t begin with constraints. We should begin with norms. The lesson Tokyo offers is deceptively simple: in complex systems, stability is not imposed. It is cultivated. And the most powerful form of governance is not enforcement, but alignment at the foundation.</p>]]></content:encoded></item><item><title><![CDATA[Elon Musk Is Right About the End of the Smartphone]]></title><description><![CDATA[But for the Wrong Reason]]></description><link>https://www.thefutureiselsewhere.com/p/elon-musk-is-right-about-the-end</link><guid isPermaLink="false">https://www.thefutureiselsewhere.com/p/elon-musk-is-right-about-the-end</guid><dc:creator><![CDATA[Mike Walsh]]></dc:creator><pubDate>Thu, 06 Nov 2025 00:11:00 GMT</pubDate><enclosure
url="https://substackcdn.com/image/fetch/$s_!GZEH!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fce4fa92a-6d8a-4d52-b61e-0a800f808905_1000x667.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!GZEH!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fce4fa92a-6d8a-4d52-b61e-0a800f808905_1000x667.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!GZEH!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fce4fa92a-6d8a-4d52-b61e-0a800f808905_1000x667.jpeg 424w, https://substackcdn.com/image/fetch/$s_!GZEH!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fce4fa92a-6d8a-4d52-b61e-0a800f808905_1000x667.jpeg 848w, https://substackcdn.com/image/fetch/$s_!GZEH!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fce4fa92a-6d8a-4d52-b61e-0a800f808905_1000x667.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!GZEH!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fce4fa92a-6d8a-4d52-b61e-0a800f808905_1000x667.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!GZEH!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fce4fa92a-6d8a-4d52-b61e-0a800f808905_1000x667.jpeg" width="1000" height="667" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/ce4fa92a-6d8a-4d52-b61e-0a800f808905_1000x667.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:667,&quot;width&quot;:1000,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:488725,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://tomorrowist.substack.com/i/183994694?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fce4fa92a-6d8a-4d52-b61e-0a800f808905_1000x667.jpeg&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!GZEH!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fce4fa92a-6d8a-4d52-b61e-0a800f808905_1000x667.jpeg 424w, https://substackcdn.com/image/fetch/$s_!GZEH!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fce4fa92a-6d8a-4d52-b61e-0a800f808905_1000x667.jpeg 848w, https://substackcdn.com/image/fetch/$s_!GZEH!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fce4fa92a-6d8a-4d52-b61e-0a800f808905_1000x667.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!GZEH!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fce4fa92a-6d8a-4d52-b61e-0a800f808905_1000x667.jpeg 1456w" sizes="100vw" fetchpriority="high"></picture></div></a></figure></div><p>Elon Musk recently <a href="https://www.youtube.com/watch?v=O4wBUysNe2k">predicted</a> the end of the smartphone. &#8220;In five or six years,&#8221; he said, &#8220;we won&#8217;t have phones in the traditional sense. What we call a phone will really be an AI edge node &#8212; no apps, no OS, just AI.&#8221; It&#8217;s easy to dismiss such statements as provocation, but he may be right for reasons that have nothing to do with hardware or his views on the mass adoption of AI-generated content.</p><p>The smartphone model has simply become too slow for what comes next. The traditional loop of unlocking, tapping, and waiting for an app to respond belongs to an era when people tolerated friction. In a world of predictive, context-aware AI, that&#8217;s already starting to feel clumsy.
The real replacement for the smartphone isn&#8217;t a new device; it&#8217;s a new rhythm &#8212; where intelligence anticipates rather than waits, and latency, not interface design, determines how fast we get things done. Latency &#8212; the tiny delay between intention and action &#8212; will be what separates systems that feel instant and real from those that seem sluggish or obsolete.</p><p>The British company Nothing has already begun to edge toward Musk&#8217;s vision. Its new AI platform, <a href="https://nothing.community/en/d/43142-our-first-step-towards-an-ai-native-operating-system">Essential</a>, lets users build mini-apps through simple natural-language prompts. There&#8217;s no coding, no app store, no interface to navigate &#8212; just a conversation. You describe what you want, and the phone generates it on demand. It&#8217;s a small glimpse of a world where computation is ambient and anticipatory, not something we command but something that happens around us.</p><p>In that world, latency becomes experience. When a system hesitates &#8212; when a chatbot pauses mid-sentence, a car&#8217;s sensor reacts a moment too late, or a warehouse robot stutters before adjusting course &#8212; the illusion of intelligence collapses. The difference between seamless and frustrating, safe and catastrophic, often comes down to the time it takes for data to travel and a model to respond.</p><p>And that&#8217;s why latency is fast becoming one of the biggest strategic challenges for the AI industry. Real-time applications like autonomous vehicles, live fraud detection, and industrial robotics all depend on split-second inference. In conversational systems and virtual assistants, every extra second of delay erodes trust.
In large-scale AI training, &#8220;tail latency&#8221; &#8212; the drag caused by the slowest servers or packets &#8212; can extend job completion by hours and waste millions of dollars in idle GPUs.</p><p>The parallels with high-frequency trading are instructive. A decade ago, hedge funds spent fortunes co-locating their servers beside exchanges to shave microseconds off their trades. Today, the same logic applies to cognition itself. The firms building the fastest loops between users, data, and models will deliver experiences that feel almost precognitive &#8212; systems that answer before you&#8217;ve finished asking.</p><p>That insight is reshaping the entire technology stack. Verizon and Amazon Web Services have <a href="https://www.verizon.com/about/news/verizon-business-and-aws-new-fiber-deal">announced</a> AI Connect, a new long-haul fiber network designed specifically for generative workloads, where every millisecond of delay compounds across billions of inferences. Cisco&#8217;s Unified Edge initiative takes the opposite approach, moving computation closer to where people and machines actually work &#8212; retail stores, factory floors, clinics &#8212; so that decisions can happen locally rather than waiting for a distant cloud.</p><p>But no company has grasped the implications of latency more clearly than NVIDIA. In <a href="https://nvidianews.nvidia.com/news/nvidia-nokia-ai-telecommunications">partnership</a> with Nokia, it&#8217;s building what it calls AI-native radio networks, embedding GPU compute directly into the next generation of 6G towers. 
The network itself will run inference, reducing the time between sensing and decision to almost nothing.</p><p>In Germany, NVIDIA and Deutsche Telekom have <a href="https://www.reuters.com/business/media-telecom/deutsche-telekom-partners-with-nvidia-ai-cloud-q1-2026-2025-11-04/">committed</a> over a billion euros to build the country&#8217;s first Industrial AI Cloud, powered by 10,000 of NVIDIA&#8217;s new Blackwell GPUs. The goal isn&#8217;t just AI sovereignty &#8212; it&#8217;s cognitive proximity. By turning telecom geography into AI factories, they&#8217;re collapsing the physical distance between data and intelligence.</p><p>Each of these moves reflects the same realization: that the future of AI won&#8217;t be decided by the size of your models but by how close they are to the moment of action. Intelligence that&#8217;s distant is expensive, slow, and brittle. Intelligence that&#8217;s near &#8212; that operates at the edge, embedded in the environment &#8212; feels instant, intuitive, and alive.</p><p>This shift is already visible at the hardware level. Edge processors now capable of running 13-billion-parameter models on-device are cutting inference delays by up to 70 percent. Tasks like translation, image recognition, and predictive text can happen locally, while heavier reasoning is handled in the cloud. It&#8217;s a new kind of cognitive choreography &#8212; the mind distributed between body and brain. The closer intelligence sits to reality, the faster it learns from it.</p><p>Low latency doesn&#8217;t just make systems faster; it makes them possible. A self-driving car navigating traffic, a surgeon consulting an AI model in real time, or a security network detecting threats before they unfold &#8212; all depend on microsecond responsiveness. In these environments, latency is no longer a nuisance; it&#8217;s a liability.</p><p>Every technological revolution has its limiting factor. 
For the industrial era, it was energy; for the digital era, it was bandwidth. In the era of artificial intelligence, it may be latency. The companies and countries that master it will own the rhythm of the future &#8212; setting not just the pace of communication, but the tempo of thought itself. For businesses, this means competitive advantage will hinge not just on algorithms or data, but on architecture. The winners of the next decade will be those who can build the fastest feedback loops &#8212; compressing the distance between sensing and understanding, decision and action.</p><p>Latency is no longer just a concern for network engineers. It&#8217;s now a measure of how fast a business can think and respond. In an AI-driven world, a few milliseconds can decide whether a service feels intuitive or frustrating, whether a company anticipates its customers or lags behind them. The same logic that may ultimately render the smartphone model obsolete applies to business itself &#8212; those still waiting for input will be overtaken by those that predict and act. Reducing that gap isn&#8217;t just technical work &#8212; it&#8217;s strategy.</p>]]></content:encoded></item><item><title><![CDATA[You are living through peak, cheap AI ]]></title><description><![CDATA[Don&#8217;t waste it]]></description><link>https://www.thefutureiselsewhere.com/p/you-are-living-through-peak-cheap</link><guid isPermaLink="false">https://www.thefutureiselsewhere.com/p/you-are-living-through-peak-cheap</guid><dc:creator><![CDATA[Mike Walsh]]></dc:creator><pubDate>Wed, 29 Oct 2025 13:00:00 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!tOP7!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F526dd18e-72c6-4a9a-970a-530dc8cdaeab_1000x625.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!tOP7!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F526dd18e-72c6-4a9a-970a-530dc8cdaeab_1000x625.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!tOP7!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F526dd18e-72c6-4a9a-970a-530dc8cdaeab_1000x625.jpeg 424w, 
https://substackcdn.com/image/fetch/$s_!tOP7!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F526dd18e-72c6-4a9a-970a-530dc8cdaeab_1000x625.jpeg 848w, https://substackcdn.com/image/fetch/$s_!tOP7!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F526dd18e-72c6-4a9a-970a-530dc8cdaeab_1000x625.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!tOP7!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F526dd18e-72c6-4a9a-970a-530dc8cdaeab_1000x625.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!tOP7!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F526dd18e-72c6-4a9a-970a-530dc8cdaeab_1000x625.jpeg" width="1000" height="625" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/526dd18e-72c6-4a9a-970a-530dc8cdaeab_1000x625.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:625,&quot;width&quot;:1000,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:154400,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://www.thefutureiselsewhere.com/i/183994893?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F526dd18e-72c6-4a9a-970a-530dc8cdaeab_1000x625.jpeg&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" 
srcset="https://substackcdn.com/image/fetch/$s_!tOP7!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F526dd18e-72c6-4a9a-970a-530dc8cdaeab_1000x625.jpeg 424w, https://substackcdn.com/image/fetch/$s_!tOP7!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F526dd18e-72c6-4a9a-970a-530dc8cdaeab_1000x625.jpeg 848w, https://substackcdn.com/image/fetch/$s_!tOP7!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F526dd18e-72c6-4a9a-970a-530dc8cdaeab_1000x625.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!tOP7!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F526dd18e-72c6-4a9a-970a-530dc8cdaeab_1000x625.jpeg 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" 
stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>Every technology revolution offers a brief window of unreasonable advantage&#8212;an era when the bold can seize opportunities that later become ordinary. This is that moment for AI. The cost of cognition is collapsing. The rules haven&#8217;t caught up. The field is open to anyone ambitious enough to rewire their work, their organization, or their industry around machine intelligence. But none of these conditions will hold for long.</p><p>Like the early internet before paywalls and advertising, or the dawn of ride-sharing before prices rose to match reality, we are living through a period of unsustainable abundance. And history is clear: free rides don&#8217;t last.</p><p>This is the best of all possible times to embrace AI, and the worst time to hesitate.</p><h3>1. AI Is Still an Unfair Advantage</h3><p>AI has already entered the mainstream, but mastery remains uneven. Most people are still using it for trivial tasks&#8212;writing professional sounding emails, summarizing meeting notes, or using their chatbot as a digital therapist.</p><p>Those who know how to truly leverage AI are operating in another universe entirely. When you train ChatGPT Pulse on your own writing, reports, and meeting transcripts, it&#8217;s like waking up each morning to a team of researchers who have curated your private edition of the Harvard Business Review.</p><p>Across industries, the same pattern is emerging. Marketing teams are deploying AI agents that can generate, test, and localize thousands of campaign variations overnight&#8212;then feed the winning ideas directly into production systems. 
Supply-chain leaders are using autonomous forecasters that continuously adjust logistics based on weather, pricing, and geopolitical data, eliminating weeks of human analysis. Product teams are training domain-specific copilots that surface customer insights, simulate competitive dynamics, and recommend strategic trade-offs before a meeting even begins.</p><p>What once required committees, consultants, and coordination now happens as a real-time conversation between leaders and their digital counterparts&#8212;an entire cognitive layer of the enterprise that never sleeps, learns continuously, and compounds advantage for those who know how to direct it.</p><p>Right now, there&#8217;s still enormous alpha to be captured simply by being better at using AI than your peers. A mid-level consultant, analyst, or creative can plug into these tools and achieve leverage that once required an entire team. They can produce better insights, faster outputs, and more refined products&#8212;and sell them at a premium&#8212;because the knowledge and implementation gap remains wide.</p><p>But that window will close. As AI literacy spreads and models become embedded directly into platforms and workflows, the arbitrage disappears. To add value, you&#8217;ll need either deep engineering skill or very specific domain expertise. Everyone else will simply go straight to their own AI agents to get the job done.</p><h3>2. Someone Else Is Paying for Your Intelligence</h3><p>AI is absurdly, unsustainably cheap.</p><p>For $20 a month&#8212;or even for free&#8212;you can access sophisticated systems capable of reasoning, summarizing, coding, and designing at levels that would have required teams of specialists and millions in infrastructure only a few years ago.</p><p>But remember Uber&#8217;s early days, when rides were subsidized to be cheaper than public transport? Or Spotify, when streaming felt like a miracle before licensing costs caught up? Eventually, reality asserted itself. 
The same will happen here.</p><p>AI&#8217;s current economics are a gift. OpenAI, Anthropic, and others are still losing money on every query, propped up by investor subsidies and growth ambitions. Someone is paying the bill, just not you.</p><p>The energy, compute, and data costs of cognition will ultimately find equilibrium. Prices will rise, or access will be constrained. We are in the VC-subsidized golden age of cheap intelligence&#8212;a temporary arbitrage between technological possibility and economic gravity.</p><p>If you are a leader, this is your chance to build capability at minimal cost. Soon, you&#8217;ll be paying full price for the same cognitive horsepower.</p><h3>3. Rulemakers Haven&#8217;t Caught Up</h3><p>Regulators are still asleep at the wheel. AI currently sits in a grey zone: too new to be tightly controlled, too powerful to remain that way for long. The early web followed the same pattern. Before copyright enforcement, ad-tracking, and compliance bureaucracy took hold, there was a brief, chaotic explosion of creativity. That was when Google, Amazon, and PayPal were born.</p><p>Today&#8217;s AI landscape feels similar. The EU has passed its AI Act, and the U.S. is drafting its own frameworks&#8212;but enforcement is years behind. China is experimenting with guardrails, yet innovation continues unabated. For now, companies can build, deploy, and adapt with remarkable freedom.</p><p>Morgan Stanley&#8217;s AI wealth management assistant, trained on hundreds of thousands of internal research reports, was rolled out before clear regulatory guidance existed. The firm gained a first-mover advantage that will be harder to replicate once compliance frameworks tighten.</p><p>This window won&#8217;t stay open.
Once the &#8220;regulatory-industrial complex&#8221; fully awakens&#8212;lawyers, auditors, ethics boards, and oversight committees&#8212;AI projects will slow, costs will rise, and freedom to experiment will shrink.</p><p>The pioneers will already have built the muscle memory of how to move fast and learn safely. Everyone else will be stuck writing policies. And by then, the biggest and most powerful companies will have captured the regulators&#8212;negotiating their own private sandpits with complex, compliance-heavy rules that only they can afford to follow. What begins as public safety will harden into private privilege, locking in incumbents and closing the door on new entrants who arrived just a little too late.</p><h3>4. Outsize Returns Are Still Possible</h3><p>Big things start small. In the 1990s, a graduate student project became Google. A scrappy online bookstore became Amazon. A simple payment tool became PayPal.</p><p>AI is at the same inflection point. Early movers are already turning modest bets into exponential value.</p><p>Harvey, the legal AI startup, began as a simple interface to an open API. Two years later, it&#8217;s embedded across global law firms, transforming how attorneys draft, review, and reason about complex documents. Cursor, which started as an AI coding assistant, has quietly become the command center for entire engineering teams&#8212;an integrated environment where agents not only write code but understand context, maintain memory, and orchestrate multi-step builds autonomously. 
And Runway, once an experimental video editor, now powers creative pipelines across media and marketing, collapsing production cycles from weeks to hours.</p><p>What these examples share is timing: each turned a narrow use case into a new cognitive infrastructure, capturing the kind of leverage that only exists in the early, chaotic phase of adoption.</p><p>Outsize returns always accrue in that early, unruly phase, before efficiency replaces imagination. The same pattern will play out across every industry&#8212;from healthcare and finance to manufacturing and education.</p><h3>5. Ground Truth Still Exists</h3><p>The final advantage of the current moment is epistemic: the data that feeds AI models is still relatively pure.</p><p>We haven&#8217;t yet reached the &#8220;hall of mirrors&#8221; stage where AI systems are swallowing their own synthetic output. But it&#8217;s coming. As AI-generated text, images, and code flood the internet, the risk of &#8220;model collapse,&#8221; where outputs become steadily more derivative, grows ever more likely.</p><p>For now, models are still largely trained on a foundation of human-authored books, research, and journalism. They reflect eons of human expertise and craftsmanship. That&#8217;s what makes today&#8217;s results so surprisingly coherent and useful.</p><p>But as the signal-to-noise ratio declines, verifying truth will require more time, tokens, and oversight. Leaders will need to invest in data provenance, curation, and trust architectures.</p><p>In other words, cognition is cheap now partly because the world&#8217;s knowledge base is still intact. Once that begins to erode, clarity will become a premium commodity.</p><h3>6. The Window Is Closing</h3><p>Every advantage we&#8217;ve just discussed&#8212;unfair knowledge, cheap economics, regulatory freedom, outsize returns, and reliable ground truth&#8212;is temporary.</p><p>Knowledge will spread. Costs will rise. Regulation will harden.
Returns will normalize. Data will decay. The AI era won&#8217;t end, of course; it will simply become ordinary, bureaucratic, and expensive, like every other mature technology before it.</p><p>For now, though, we inhabit a rare moment when a mid-sized company can act with the power of a global enterprise, and a single individual can operate with the leverage of an entire team. The smart leaders are not waiting for &#8220;maturity.&#8221; They are capturing cognitive territory while it&#8217;s still unclaimed.</p><p>Those who hesitate in periods like this tend to rationalize their inaction. They say they&#8217;re waiting for &#8220;best practices,&#8221; or for &#8220;the hype to settle,&#8221; or for &#8220;clarity on regulation.&#8221; But in doing so, they miss the only window when experimentation is both cheap and high-return.</p><p>Clarity will come, but only after the opportunity has passed.</p><p>As I often say in my talks: <em><strong>the future favors the bold.</strong></em></p><p></p><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://www.thefutureiselsewhere.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">Thanks for reading The Future Is Elsewhere!
Subscribe for free to receive new posts and support my work.</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div><p></p>]]></content:encoded></item><item><title><![CDATA[The Second Jet Age ]]></title><description><![CDATA[How AI Is Turning Power Into Compute]]></description><link>https://www.thefutureiselsewhere.com/p/the-second-jet-age</link><guid isPermaLink="false">https://www.thefutureiselsewhere.com/p/the-second-jet-age</guid><dc:creator><![CDATA[Mike Walsh]]></dc:creator><pubDate>Sat, 25 Oct 2025 00:20:00 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!dCiw!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3f9188eb-0d87-4d12-aa94-1487ffd15607_1280x896.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!dCiw!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3f9188eb-0d87-4d12-aa94-1487ffd15607_1280x896.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!dCiw!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3f9188eb-0d87-4d12-aa94-1487ffd15607_1280x896.jpeg 424w, https://substackcdn.com/image/fetch/$s_!dCiw!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3f9188eb-0d87-4d12-aa94-1487ffd15607_1280x896.jpeg 848w, 
https://substackcdn.com/image/fetch/$s_!dCiw!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3f9188eb-0d87-4d12-aa94-1487ffd15607_1280x896.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!dCiw!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3f9188eb-0d87-4d12-aa94-1487ffd15607_1280x896.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!dCiw!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3f9188eb-0d87-4d12-aa94-1487ffd15607_1280x896.jpeg" width="1280" height="896" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/3f9188eb-0d87-4d12-aa94-1487ffd15607_1280x896.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:896,&quot;width&quot;:1280,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:584025,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://tomorrowist.substack.com/i/183995085?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3f9188eb-0d87-4d12-aa94-1487ffd15607_1280x896.jpeg&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!dCiw!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3f9188eb-0d87-4d12-aa94-1487ffd15607_1280x896.jpeg 424w, 
https://substackcdn.com/image/fetch/$s_!dCiw!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3f9188eb-0d87-4d12-aa94-1487ffd15607_1280x896.jpeg 848w, https://substackcdn.com/image/fetch/$s_!dCiw!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3f9188eb-0d87-4d12-aa94-1487ffd15607_1280x896.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!dCiw!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3f9188eb-0d87-4d12-aa94-1487ffd15607_1280x896.jpeg 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" 
y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>If you drive past a new data-center build in Texas or Virginia today, you might catch a strange sight: a row of trailer-mounted jet engines idling beside concrete shells and cooling towers. They&#8217;re not there to fly&#8212;they&#8217;re there to think.</p><p>These machines once powered Boeing 747s. Now, refitted by companies like ProEnergy, their GE CF6-80C2 cores have been reborn as compact, 48-megawatt gas turbines&#8212;each one capable of powering a hyperscale AI facility. It&#8217;s an image that could have come straight out of a William Gibson novel: icons of a prior industrial age salvaged from the scrapheap, retrofitted with sensors and software mods, and repurposed as power plants for frontier AIs.</p><p>For decades, gas turbines were a symbol of decline. Renewables were ascendant, climate pledges were tightening, and the great manufacturers&#8212;GE Vernova, Mitsubishi Heavy Industries, Siemens Energy&#8212;were shrinking their divisions. In 2017 Siemens announced nearly 7,000 job cuts, warning of &#8220;disruption of unprecedented scope and speed.&#8221; Demand for large turbines had collapsed from 400 units a year to barely a hundred.</p><p>Yet suddenly, they&#8217;re back. The reason isn&#8217;t geopolitics or industrial policy&#8212;it&#8217;s artificial intelligence.</p><h3>Power Is the New Bottleneck</h3><p>The explosion of generative AI has transformed the economics of power. Every new data-center campus is a micro-city of computation, each building drawing hundreds of megawatts to train and run models that consume orders of magnitude more electricity than traditional cloud workloads. Utilities from Georgia to Dublin are warning of shortages. 
The bottleneck in the AI boom is no longer GPUs; it&#8217;s gigawatts.</p><p>Sam Altman put it succinctly in a recent <a href="https://blog.samaltman.com/abundant-intelligence">blog post</a>: <em>&#8220;Our vision is simple: we want to create a factory that can produce a gigawatt of new AI infrastructure every week&#8230; it will require innovation at every level of the stack, from chips to power to building to robotics.&#8221;</em></p><p>That phrase&#8212;<em>a gigawatt a week</em>&#8212;reframes how we think about compute. For the first time in history, intelligence is scaling like heavy industry. The product isn&#8217;t just code; it&#8217;s capacity. And energy is now the fundamental input to cognition.</p><p>What&#8217;s happening beneath the surface is a brutal equivalence between computation and energy. To generate intelligence at scale, we will need to convert ever more power into cognition. Jensen Huang of NVIDIA calls this the &#8220;<a href="https://www.linkedin.com/pulse/next-trillion-dollar-frontier-why-ai-bubble-mike-walsh-ozpnf/">token generation rate per unit of energy</a>.&#8221; The efficiency of an AI factory, in other words, can be measured not just in flops or model size, but in how many tokens of useful thought it can produce per megawatt.</p><p>That equation changes everything. It ties the fate of the digital economy to the physical grid. It means the next generation of leaders in technology, policy, and finance will need to think like energy strategists.</p><h3>Thought Thermodynamics</h3><p>The convergence between compute and energy isn&#8217;t just a matter of supply and demand. Something deeper is at play. Every leap in intelligence, human or artificial, is powered by a transformation of energy. The steam engine amplified muscle; electricity amplified industry; computation now amplifies thought. Each wave of progress turns energy into a new form of leverage.</p><p>The AI revolution is simply the latest expression of that principle. 
Training models like GPT-5 or Gemini 2 isn&#8217;t an abstract digital process; it&#8217;s a physical one, requiring power, cooling, materials, and space. Behind every query sits a chain of turbines, transformers, and transmission lines converting natural resources into cognition.</p><p>This is what Altman&#8217;s vision captures so well. The AI factory is not a metaphor at all&#8212;it&#8217;s literal. It&#8217;s an industrial stack running from chip fabs to power plants, from robotic assembly to energy markets. And like the steel mills and assembly lines of a century ago, it will define a new geography of productivity and a new class of strategic assets.</p><p>As nations race to build sovereign AI capacity, the question of where intelligence lives is becoming inseparable from where energy is available. Data-center clusters are springing up near hydro dams, nuclear plants, and gas hubs. Energy policy is becoming industrial policy for the cognitive age.</p><p>There is a certain irony in fossil-fuel machinery powering the most advanced software humanity has ever built. I&#8217;d love to see one in person. I can just imagine it: turbines spinning beneath sodium lights on the edge of the desert, exhaling heat and noise into the night&#8212;half relic, half prophecy.</p><p>Jet engines are not the end of the story; they are just a cyberpunk patch, a temporary bridge between the industrial and cognitive eras. The real challenge for leaders is not just developing smarter algorithms, but aligning intelligence with the infrastructure that powers it. In the years ahead, energy strategy will <em>become</em> AI strategy.
The organizations that understand that connection first will shape the next economy.</p><p></p><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://www.thefutureiselsewhere.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">Thanks for reading The Future Is Elsewhere! Subscribe for free to receive new posts and support my work.</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div><p></p>]]></content:encoded></item><item><title><![CDATA[The Next Trillion-Dollar Frontier]]></title><description><![CDATA[Why AI Is Not a Bubble]]></description><link>https://www.thefutureiselsewhere.com/p/the-next-trillion-dollar-frontier</link><guid isPermaLink="false">https://www.thefutureiselsewhere.com/p/the-next-trillion-dollar-frontier</guid><dc:creator><![CDATA[Mike Walsh]]></dc:creator><pubDate>Thu, 16 Oct 2025 00:20:00 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!_ldA!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5191fb7a-55bf-4c45-a0a5-24102f3c686d_1300x867.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!_ldA!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5191fb7a-55bf-4c45-a0a5-24102f3c686d_1300x867.jpeg" data-component-name="Image2ToDOM"><div 
class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!_ldA!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5191fb7a-55bf-4c45-a0a5-24102f3c686d_1300x867.jpeg 424w, https://substackcdn.com/image/fetch/$s_!_ldA!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5191fb7a-55bf-4c45-a0a5-24102f3c686d_1300x867.jpeg 848w, https://substackcdn.com/image/fetch/$s_!_ldA!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5191fb7a-55bf-4c45-a0a5-24102f3c686d_1300x867.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!_ldA!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5191fb7a-55bf-4c45-a0a5-24102f3c686d_1300x867.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!_ldA!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5191fb7a-55bf-4c45-a0a5-24102f3c686d_1300x867.jpeg" width="1300" height="867" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/5191fb7a-55bf-4c45-a0a5-24102f3c686d_1300x867.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:867,&quot;width&quot;:1300,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:231082,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://tomorrowist.substack.com/i/183995323?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5191fb7a-55bf-4c45-a0a5-24102f3c686d_1300x867.jpeg&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!_ldA!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5191fb7a-55bf-4c45-a0a5-24102f3c686d_1300x867.jpeg 424w, https://substackcdn.com/image/fetch/$s_!_ldA!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5191fb7a-55bf-4c45-a0a5-24102f3c686d_1300x867.jpeg 848w, https://substackcdn.com/image/fetch/$s_!_ldA!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5191fb7a-55bf-4c45-a0a5-24102f3c686d_1300x867.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!_ldA!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5191fb7a-55bf-4c45-a0a5-24102f3c686d_1300x867.jpeg 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" 
width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>I spent the afternoon at NVIDIA&#8217;s headquarters in Santa Clara. It&#8217;s hard not to walk through those glass corridors without feeling that you&#8217;ve stepped inside the engine room of a new industrial age. Banks of machines hum like power plants. Engineers move with quiet precision, orchestrating what feels less like software development and more like manufacturing reality itself.</p><p>That impression captures Jensen Huang&#8217;s vision perfectly. The founder and CEO of NVIDIA has become the defining industrialist of our time&#8212;not because he makes chips, but because he builds the infrastructure of intelligence. 
In a recent conversation with Sequoia Capital&#8217;s Konstantine Buhler at Citadel Securities&#8217; <em>Future of Global Markets 2025</em> conference, Huang dismantled one of the laziest tropes of our age: that we are living through an AI bubble.</p><p>We aren&#8217;t, he argued, because AI is already producing hundreds of billions of dollars in tangible returns. It is the invisible operating system behind modern capitalism&#8212;the intelligence layer driving search, recommendation engines, advertising, and logistics. If the dot-com boom was speculation on digital potential, the AI boom is monetized cognition in action. The returns are real, measurable, and accelerating.</p><p>Huang&#8217;s framing of this transformation is simple but profound. The world&#8217;s data centers are no longer just server farms; they are AI factories&#8212;industrial complexes that transform electricity into intelligence. What matters now is not how many chips you own, but how much <em>thinking</em> your infrastructure can perform per watt. &#8220;Throughput per unit of energy governs your revenue,&#8221; he likes to say. In his view, the defining metric of the coming decade won&#8217;t be teraflops&#8212;it will be <em>tokens per megawatt.</em></p><p>That insight reverses the logic of the last fifty years of computing. Power, not compute, becomes the ultimate bottleneck. Each incremental gain in energy efficiency translates directly into business growth. The company that can produce three times the intelligence per joule of energy doesn&#8217;t just save power&#8212;it generates three times the economic output. 
<a href="https://www.linkedin.com/pulse/cheap-cognition-demands-new-strategy-mike-walsh-dxn8c/">Cheap cognition demands new strategy.</a> NVIDIA&#8217;s systems now scale from desk-size accelerators to gigawatt-scale installations: full-stack, rack-scale intelligence factories capable of producing cognition at industrial intensity.</p><div id="youtube2-m1wfJOqDUv4" class="youtube-wrap" data-attrs="{&quot;videoId&quot;:&quot;m1wfJOqDUv4&quot;,&quot;startTime&quot;:null,&quot;endTime&quot;:null}" data-component-name="Youtube2ToDOM"><div class="youtube-inner"><iframe src="https://www.youtube-nocookie.com/embed/m1wfJOqDUv4?rel=0&amp;autoplay=0&amp;showinfo=0&amp;enablejsapi=0" frameborder="0" loading="lazy" gesture="media" allow="autoplay; fullscreen" allowautoplay="true" allowfullscreen="true" width="728" height="409"></iframe></div></div><p>This is why talk of a bubble misses the point. When Apple&#8217;s privacy changes in 2022 disrupted Meta&#8217;s advertising models, the company lost hundreds of billions in market value almost overnight. It recovered that loss not with marketing spin or layoffs, but by rebuilding its recommendation systems using AI trained on NVIDIA GPUs. That pivot restored more than a trillion dollars in market cap.</p><p>I started my career in the late nineties during the first dotcom boom. By 2000, the entire internet industry was worth $30 billion and mostly unprofitable. Today, AI already powers hundreds of billions in recurring revenue for hyperscalers like Google, Amazon, and Meta. This isn&#8217;t Pets.com&#8212;it&#8217;s Ford Motor Company, circa 1913, wiring up the first assembly lines at its Highland Park, Michigan plant.</p><p>And if Huang is right, we are only at the beginning. He predicts two new trillion-dollar industries emerging from the AI revolution: digital labor and physical AI. Inside NVIDIA, every software engineer already works alongside an AI collaborator. Productivity has surged. 
The next step is digital workers that don&#8217;t just assist humans but act alongside them as peers. In the near future, enterprises will employ both biological and digital staff: some licensed from large model providers like OpenAI or Harvey, others fine-tuned internally using proprietary data and organizational culture. The IT department, he suggests, will evolve into the HR function for digital employees&#8212;onboarding, training, and evaluating agentic systems just as it does people.</p><p>The same logic extends beyond the screen into the physical world. Autonomous vehicles, warehouse cobots, delivery drones, surgical systems&#8212;these are all forms of <em>embodied intelligence</em>. If an AI can generate a realistic video of a person picking up a bottle, Huang asks, why can&#8217;t it control a robot to do the same? NVIDIA&#8217;s <em>Omniverse</em> platform, a photorealistic simulation environment, allows robots to train safely in virtual space before entering the real one. Every physical AI, he explains, requires three computers: one for training, one for simulation, and one for operation. In other words, AI doesn&#8217;t just learn&#8212;it practices, rehearses, and then performs. Together, digital labor and physical AI touch nearly $100 trillion of global economic activity.</p><p>Behind this transformation lies a deeper philosophical shift&#8212;from retrieval to generation. The computers of the past fetched stored information; the computers of the future will generate it on demand. When you ask an AI a question today, it doesn&#8217;t look up an answer&#8212;it <em>creates</em> one, conditioned on context and intention. &#8220;Everything we just did in this conversation was generated,&#8221; Huang said. That&#8217;s not marketing hyperbole; it&#8217;s a description of a new computational reality. 
The world is moving from stored intelligence to <em>living intelligence</em>&#8212;systems that think, reason, and respond in real time.</p><p>But even revolutions face limits. For AI, that limit is power. Every industrial era eventually confronts its energy constraint: coal for steam, copper for electricity, now electricity itself for computation. No matter how advanced your architecture, you can&#8217;t exceed the capacity of the grid. That is why energy efficiency has become the new arms race. NVIDIA&#8217;s vertically integrated design&#8212;co-developing chips, networks, and software&#8212;breaks through the slowing of Moore&#8217;s Law by delivering tenfold performance improvements at constant power. Sustainability, in this sense, is no longer merely an ethical imperative; it is a competitive one. The winners of the AI era will be those who can convert electrons into intelligence with the least waste.</p><p>The logic of the AI factory also extends to geopolitics. &#8220;No country can afford to outsource its data and import its intelligence back,&#8221; Huang warned. Every nation will soon need its own <em>sovereign AI</em>&#8212;trained on domestic data, aligned with local values, and secured by national infrastructure. <a href="https://www.linkedin.com/pulse/sovereign-ai-why-nations-need-own-intelligence-stack-mike-walsh-nosvc/">As I have written about previously</a>, across Europe, Asia, and the Middle East, governments are already building national AI factories. What oil refineries were to the twentieth century, AI factories will be to the twenty-first&#8212;the infrastructure of cognitive sovereignty.</p><p>And underpinning all of it is a new economic truth: as the marginal cost of intelligence approaches zero, the constraint on productivity shifts from labor to energy. AI doesn&#8217;t replace human work; it multiplies it. We are entering an age of infinite labor, where digital and physical agents expand the productive frontier far beyond biological limits. 
In such an era, the scarce commodity will not be intelligence but <em>judgment</em>. When machines can generate infinite content and action, the differentiator becomes what we choose to create, constrain, or believe.</p><p>That was my lingering thought as I left NVIDIA&#8217;s campus and watched the setting sun glint off the mirrored fa&#231;ade of the building&#8212;a modern cathedral to computation. The future no longer belongs to those who simply own machines, but to those who can orchestrate cognition at scale.</p><p>If you want to understand the real metric of power in this new world, forget headcount, profit, or even compute. Ask instead:<em> how much intelligence can you generate per watt?</em></p><p></p><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://www.thefutureiselsewhere.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">Thanks for reading The Future Is Elsewhere! 
Subscribe for free to receive new posts and support my work.</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div><p></p>]]></content:encoded></item><item><title><![CDATA[The Uncontainable Future]]></title><description><![CDATA[Why the Age of AI Demands Adaptation, Not Control]]></description><link>https://www.thefutureiselsewhere.com/p/the-uncontainable-future</link><guid isPermaLink="false">https://www.thefutureiselsewhere.com/p/the-uncontainable-future</guid><dc:creator><![CDATA[Mike Walsh]]></dc:creator><pubDate>Thu, 09 Oct 2025 00:20:00 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!ibUO!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F46247ff8-744d-4b28-9d09-94d508d9b73d_1000x600.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!ibUO!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F46247ff8-744d-4b28-9d09-94d508d9b73d_1000x600.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!ibUO!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F46247ff8-744d-4b28-9d09-94d508d9b73d_1000x600.jpeg 424w, 
https://substackcdn.com/image/fetch/$s_!ibUO!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F46247ff8-744d-4b28-9d09-94d508d9b73d_1000x600.jpeg 848w, https://substackcdn.com/image/fetch/$s_!ibUO!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F46247ff8-744d-4b28-9d09-94d508d9b73d_1000x600.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!ibUO!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F46247ff8-744d-4b28-9d09-94d508d9b73d_1000x600.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!ibUO!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F46247ff8-744d-4b28-9d09-94d508d9b73d_1000x600.jpeg" width="1000" height="600" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/46247ff8-744d-4b28-9d09-94d508d9b73d_1000x600.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:600,&quot;width&quot;:1000,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:465712,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://tomorrowist.substack.com/i/183996040?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F46247ff8-744d-4b28-9d09-94d508d9b73d_1000x600.jpeg&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" 
srcset="https://substackcdn.com/image/fetch/$s_!ibUO!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F46247ff8-744d-4b28-9d09-94d508d9b73d_1000x600.jpeg 424w, https://substackcdn.com/image/fetch/$s_!ibUO!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F46247ff8-744d-4b28-9d09-94d508d9b73d_1000x600.jpeg 848w, https://substackcdn.com/image/fetch/$s_!ibUO!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F46247ff8-744d-4b28-9d09-94d508d9b73d_1000x600.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!ibUO!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F46247ff8-744d-4b28-9d09-94d508d9b73d_1000x600.jpeg 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" 
stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>In the summer of 1956, a group of scientists gathered at Dartmouth College to do something extraordinary: decode the nature of human intelligence and recreate it inside a machine. It seems strangely naive now that a handful of white-shirted men, smoking pipes in the New Hampshire heat, believed they could solve consciousness in a few months, the way you might solve a crossword puzzle. Yet the impulse behind that meeting&#8212;the conviction that intelligence could be bottled, controlled, and productized&#8212;has never really gone away.</p><p>Seventy years later, we still talk about AI as if it were a thing we can buy, configure, and regulate. For the last few years, we have labored under the delusion that AI disruption was even something that you could subscribe to by the month. We&#8217;ve built guardrails, policies, and governance frameworks, but beneath the comforting language of oversight runs a more unsettling truth: intelligence is not a product. It&#8217;s a process.</p><p>The future &#8212; and AI in particular &#8212; is uncontainable because intelligence, once industrialized, behaves like every other general-purpose force in history: it escapes the boundaries we design for it. Each wave of technology that has lowered the cost of doing something valuable has reorganized society in ways its creators never intended. AI is doing the same with cognition. No matter how many rules, ethics boards, or safeguards we build, an autonomous, self-improving system that learns from the world will evolve faster than our capacity to predict or restrain it. 
The story of AI will not be about control, but adaptation.</p><p>History offers warnings. The steam engine was meant to power factories; it ended up reshaping cities, labor, and the planet&#8217;s climate. Electricity promised convenience and delivered globalization. The internet began as a communications tool and became the nervous system of civilization. Every time humanity invents a general-purpose technology, we overestimate our ability to control its consequences. AI will be no different&#8212;only faster.</p><p>If you want a metaphor for what&#8217;s coming, think of the shipping container. Before its invention, global trade was slow, messy, and expensive. Then someone standardized the box. Once that happened, goods began moving across the planet with near-zero friction. The cost of shipping fell by orders of magnitude, and the world reorganized around it. What the container did for atoms, AI will do for knowledge. We&#8217;re learning to standardize cognition&#8212;to move intelligence itself through systems as easily as data. The cost of getting a &#8220;smart decision&#8221; made is collapsing, and when cognition becomes cheap, everything else changes.</p><p>But cheap intelligence also breeds a strange new volatility. Algorithms trained on the detritus of our digital lives now generate culture faster than we can consume it. The result is a feedback loop: models trained on their own outputs, humans trained by the models, all of us trapped inside a hall of mirrors. Instagram, TikTok, and the endless scroll are not entertainment platforms so much as early warning systems for what happens when optimization eats creativity. Culture, like code, can collapse when it begins copying itself too many times.</p><p>Some call this <em>model collapse</em>; I think of it as a form of entropy. When systems learn only from their own exhaust, diversity vanishes, and meaning decays. 
Our instinct is to fix it with better rules&#8212;guardrails, content policies, red-teaming&#8212;but entropy doesn&#8217;t obey regulation. The solution isn&#8217;t control; it&#8217;s connection. The next generation of AI will have to reach back into the physical world to stay grounded in reality.</p><p>That&#8217;s why the most interesting developments today are happening not in language but in embodiment. Humanoid robots, autonomous cars, drones, and even the next wave of augmented-reality glasses are all teaching machines to perceive, not just predict. They are collecting new kinds of data&#8212;spatial, tactile, sensory&#8212;that anchor digital cognition in physical truth. A child doesn&#8217;t learn gravity from reading about it; they learn by dropping a toy and watching it fall. Future AIs will learn the same way: through contact. Once machines can sense, act, and play in the world, they&#8217;ll start developing intuitions of their own. And those intuitions will evolve beyond anything we can script.</p><p>Governments, of course, are trying to keep pace. They talk about AI safety, regulation, and national strategies. But the real issue isn&#8217;t compliance; it&#8217;s sovereignty. When intelligence becomes infrastructure, who owns the pipes? Every country will soon need to secure its intelligence supply chain: data, semiconductors, compute, and, most critically, model weights. Because embedded within those weights are cultural values&#8212;the moral DNA of the societies that train them.</p><p>Weights are culture. And once those models start influencing how our hospitals allocate resources or how our cities manage transport, cultural independence will matter as much as energy independence once did.</p><p>Yet even sovereignty has limits. The forces we&#8217;re unleashing are global, self-propagating, and only partly knowable. 
We can design principles, audit systems, even try to tax the robots&#8212;but we won&#8217;t be able to stop billions of autonomous agents from emerging as cognition becomes cheaper and more distributed. AI will leak into everything: workflows, logistics, diplomacy, warfare, art. It will crawl into the gaps between our rules.</p><div id="youtube2-OfKjdNcdsoo" class="youtube-wrap" data-attrs="{&quot;videoId&quot;:&quot;OfKjdNcdsoo&quot;,&quot;startTime&quot;:null,&quot;endTime&quot;:null}" data-component-name="Youtube2ToDOM"><div class="youtube-inner"><iframe src="https://www.youtube-nocookie.com/embed/OfKjdNcdsoo?rel=0&amp;autoplay=0&amp;showinfo=0&amp;enablejsapi=0" frameborder="0" loading="lazy" gesture="media" allow="autoplay; fullscreen" allowautoplay="true" allowfullscreen="true" width="728" height="409"></iframe></div></div><p>We need a new philosophy of leadership that accepts unpredictability as the normal state of things. In complex systems, control is an illusion; influence is everything. The leaders of the future won&#8217;t be those who build the tallest walls around their organizations, but those who design systems that can adapt faster than they break.</p><p>There&#8217;s something liberating in that. For creatives, thinkers, and entrepreneurs, this is an extraordinary window of opportunity. Most people are still paralyzed by the strangeness of these tools&#8212;too proud or too fearful to use them deeply. Those who do will discover that AI isn&#8217;t a threat to originality; it&#8217;s an amplifier for it.</p><p>Of course, this won&#8217;t last forever. Every revolution begins with asymmetry&#8212;those who understand the new tools have leverage over those who don&#8217;t. But eventually, the advantage dissipates, the knowledge spreads, and the system resets. 
Right now, we&#8217;re in that rare, chaotic interlude when the future is still malleable, before the new order hardens into place.</p><p>The scientists at Dartmouth thought intelligence could be solved. The truth is, it can only be unleashed. Once cognition becomes infrastructure&#8212;once it flows through networks, sensors, and machines&#8212;it will evolve along paths we can&#8217;t fully predict. We can set guardrails, yes, and we should. But as with every great force before it, the real story won&#8217;t be about how we control AI, but how we adapt to the world it creates.</p><p><em>Because the future is never contained for long.</em></p><p></p><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://www.thefutureiselsewhere.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">Thanks for reading The Future Is Elsewhere! 
Subscribe for free to receive new posts and support my work.</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div><p></p>]]></content:encoded></item><item><title><![CDATA[What An AI-Generated Actress Tells Us About the Future of Work]]></title><description><![CDATA[The New Operating Leverage of Human Expertise]]></description><link>https://www.thefutureiselsewhere.com/p/what-an-ai-generated-actress-tells</link><guid isPermaLink="false">https://www.thefutureiselsewhere.com/p/what-an-ai-generated-actress-tells</guid><dc:creator><![CDATA[Mike Walsh]]></dc:creator><pubDate>Wed, 08 Oct 2025 00:20:00 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!OdBJ!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd2708743-eac7-4c9f-a66c-e4dd81854d3a_1200x738.webp" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!OdBJ!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd2708743-eac7-4c9f-a66c-e4dd81854d3a_1200x738.webp" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!OdBJ!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd2708743-eac7-4c9f-a66c-e4dd81854d3a_1200x738.webp 424w, 
https://substackcdn.com/image/fetch/$s_!OdBJ!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd2708743-eac7-4c9f-a66c-e4dd81854d3a_1200x738.webp 848w, https://substackcdn.com/image/fetch/$s_!OdBJ!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd2708743-eac7-4c9f-a66c-e4dd81854d3a_1200x738.webp 1272w, https://substackcdn.com/image/fetch/$s_!OdBJ!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd2708743-eac7-4c9f-a66c-e4dd81854d3a_1200x738.webp 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!OdBJ!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd2708743-eac7-4c9f-a66c-e4dd81854d3a_1200x738.webp" width="1200" height="738" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/d2708743-eac7-4c9f-a66c-e4dd81854d3a_1200x738.webp&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:738,&quot;width&quot;:1200,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:35722,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/webp&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://tomorrowist.substack.com/i/183996938?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd2708743-eac7-4c9f-a66c-e4dd81854d3a_1200x738.webp&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" 
srcset="https://substackcdn.com/image/fetch/$s_!OdBJ!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd2708743-eac7-4c9f-a66c-e4dd81854d3a_1200x738.webp 424w, https://substackcdn.com/image/fetch/$s_!OdBJ!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd2708743-eac7-4c9f-a66c-e4dd81854d3a_1200x738.webp 848w, https://substackcdn.com/image/fetch/$s_!OdBJ!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd2708743-eac7-4c9f-a66c-e4dd81854d3a_1200x738.webp 1272w, https://substackcdn.com/image/fetch/$s_!OdBJ!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd2708743-eac7-4c9f-a66c-e4dd81854d3a_1200x738.webp 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" 
stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>Hollywood is in uproar over <a href="https://www.instagram.com/tillynorwood/">Tilly Norwood</a>, an AI-generated &#8220;actress&#8221; that talent agents are reportedly lining up to sign. SAG-AFTRA, the actors&#8217; union, quickly condemned the move, declaring that creativity &#8220;must remain human-centered.&#8221; But the real significance of Tilly isn&#8217;t about movies at all. She is a glimpse into the next great disruption of work: when synthetic talent&#8212;actors, consultants, analysts, even leaders&#8212;compete alongside human professionals. What&#8217;s playing out in Hollywood today is simply the first act of a drama that will transform every industry built on knowledge and creativity.</p><h3>Mostly Moral Panics</h3><p>The industry has been here before. Photoshop was supposed to destroy photography. Auto-Tune, music. CGI, live action. Every innovation was met with fear, only to expand the creative toolkit. What makes the AI performer controversy a little different is the perception of replacement. Trained on the work of countless human actors, without consent or compensation, she crystallizes every anxiety about automation: stolen artistry, vanishing jobs, inauthentic performances.</p><p>But here&#8217;s the truth: Tilly isn&#8217;t a worker. She&#8217;s property. Signing her isn&#8217;t about career management&#8212;it&#8217;s an IP licensing deal. Think less Julia Roberts, more Mickey Mouse. Disney doesn&#8217;t cast Mickey; it monetizes him. Behind this AI construct is not a sentient algorithm but a company, Particle6, led by actor-comedian-technologist Eline Van der Velden. 
She <a href="https://deadline.com/2025/09/sag-aftra-responds-ai-actress-tilly-norwood-1236565959/">describes</a> her creation not as a replacement for human performance, but as &#8220;a new paintbrush&#8230;another way to imagine and build stories.&#8221;</p><p>This is the part Hollywood often misses. Synthetic actors aren&#8217;t autonomous laborers displacing humans; they are assets owned, managed, and monetized by humans. A virtual actress doesn&#8217;t get paid, but the entity that owns her does. Which raises the real question: who controls the digital rights to synthetic talent, and how is value distributed? The moral panic over Tilly Norwood points less to a labor crisis than to a business model shift.</p><h3>Scarcity No More</h3><p>What Tilly really signals is the end of scarcity in performance. Human actors have limits&#8212;time, age, language, geography. Synthetic performers do not. A digital twin can shoot ads in 30 countries while the human sleeps. Influencers can clone versions of themselves, optimized for different demographics. H&amp;M already experimented with digital doubles for models, letting them license their own likeness. This is operating leverage: decoupling talent from the constraints of the human body.</p><p>If you think this debate is about who wins next year&#8217;s Oscar, you&#8217;re missing the point. The real war for synthetic talent won&#8217;t be on the big screen. It&#8217;ll be on the small ones in our pockets.</p><p>Instagram influencers, TikTok stars, OnlyFans creators&#8212;this is where synthetic avatars are multiplying. Not carefully managed by studios, but churned out by small teams, even teenagers, who realize they can mint virtual stars at algorithmic scale. Why pay a model when you can generate thousands, each tuned to a niche audience? 
The long tail of &#8220;good enough&#8221; beauty and clickbait charisma capable of winning a few seconds of precious attention is already shifting from human to machine.</p><p>The opportunity here is operating leverage. In the same way a fashion model can license a digital twin to appear in dozens of campaigns simultaneously, tomorrow&#8217;s consultants, analysts, and knowledge workers will clone themselves into synthetic colleagues. Imagine a strategy partner running parallel versions of their expertise across multiple client projects, or a financial advisor scaling their judgment into thousands of personalized simulations overnight.</p><p>The shift isn&#8217;t about replacing humans with machines, but about multiplying the reach of human expertise through synthetic proxies. The lesson for the future of work is clear: value will flow to those who can design, own, and deploy their digital doubles at scale&#8212;turning individuality into infrastructure.</p><h3>Future Playbooks</h3><p>This is where SAG-AFTRA&#8217;s pushback matters. They&#8217;re wrong that synthetic talent can be stopped, but right that rules are needed. If a performer&#8217;s likeness is used to train a model, they deserve recognition and compensation. The same will apply to consultants whose decks train enterprise AIs, medical researchers whose work is leveraged by hospital systems, artists whose designs show up in generative art. Synthetic colleagues will need contracts, attribution, and revenue-sharing&#8212;just as human ones do.</p><p>The Tilly Norwood panic reveals more about us than about her. The wrong question is whether an AI actress should be signed by a talent agent. The right question is how to design an economy where synthetic talent amplifies human potential instead of erasing it. Do we wall off certain domains as sacredly human? Do we embrace synthetic colleagues with new forms of governance? 
Or do we drift, letting economics alone decide?</p><p>So how should we respond to this shift? History gives us two potential playbooks:</p><p><strong>The Sports Model</strong>: protect the human essence. In the Olympics or Formula 1, technology is deliberately limited to keep competition fair. Acting or other fields of professional work could be defined as a human-only pursuit, cordoned off from synthetic rivals.</p><p><strong>The Music Model</strong>: embrace the tools. From synthesizers to streaming, technology didn&#8217;t kill music; it changed its form. Acting could take the same path, with human and synthetic performances co-existing, sometimes blending.</p><p>Neither path is cost-free. The first risks irrelevance. The second risks dilution. But pretending we can choose &#8220;neither&#8221; is fantasy.</p><div id="youtube2-BcRXM-ZoNiI" class="youtube-wrap" data-attrs="{&quot;videoId&quot;:&quot;BcRXM-ZoNiI&quot;,&quot;startTime&quot;:null,&quot;endTime&quot;:null}" data-component-name="Youtube2ToDOM"><div class="youtube-inner"><iframe src="https://www.youtube-nocookie.com/embed/BcRXM-ZoNiI?rel=0&amp;autoplay=0&amp;showinfo=0&amp;enablejsapi=0" frameborder="0" loading="lazy" gesture="media" allow="autoplay; fullscreen" allowautoplay="true" allowfullscreen="true" width="728" height="409"></iframe></div></div><h3>Embrace Unreality</h3><p>The deeper issue here is the crisis of authenticity. The arrival of Sora 2 is just the latest broadside against consensual reality. From fake news to deepfakes, it is already hard to tell what is real, what is synthetic, and who actually created what we see. As I have <a href="https://www.youtube.com/watch?v=BcRXM-ZoNiI">argued</a> for a while now, the real challenge is not distinguishing truth from falsehood, but deciding how we act in a world where those lines are constantly blurred.</p><p>The smart move is not to cling to outdated notions of authenticity but to embrace unreality. 
Rather than fearing duplication, creators and professionals should proactively clone their voices, build digital avatars, and train personal AIs that carry their unique style. In the same way musicians once turned recorded media into a form of leverage, knowledge workers will need to turn their digital doubles into a defensible advantage.</p><p>In a world of infinite copies, power won&#8217;t belong to those who deny the synthetic&#8212;it will belong to those who design, own, and deploy it.</p><p></p><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://www.thefutureiselsewhere.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">Thanks for reading The Future Is Elsewhere! Subscribe for free to receive new posts and support my work.</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div><p></p>]]></content:encoded></item><item><title><![CDATA[The Rise Of Shadow Labor]]></title><description><![CDATA[Why Productivity Now Depends Less on Effort and More on Orchestration]]></description><link>https://www.thefutureiselsewhere.com/p/the-rise-of-shadow-labor</link><guid isPermaLink="false">https://www.thefutureiselsewhere.com/p/the-rise-of-shadow-labor</guid><dc:creator><![CDATA[Mike Walsh]]></dc:creator><pubDate>Thu, 02 Oct 2025 01:20:00 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!K1tY!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe36a407b-99c2-4a91-b3c0-4a80c8264400_1000x667.jpeg" length="0" 
type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!K1tY!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe36a407b-99c2-4a91-b3c0-4a80c8264400_1000x667.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!K1tY!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe36a407b-99c2-4a91-b3c0-4a80c8264400_1000x667.jpeg 424w, https://substackcdn.com/image/fetch/$s_!K1tY!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe36a407b-99c2-4a91-b3c0-4a80c8264400_1000x667.jpeg 848w, https://substackcdn.com/image/fetch/$s_!K1tY!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe36a407b-99c2-4a91-b3c0-4a80c8264400_1000x667.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!K1tY!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe36a407b-99c2-4a91-b3c0-4a80c8264400_1000x667.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!K1tY!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe36a407b-99c2-4a91-b3c0-4a80c8264400_1000x667.jpeg" width="1000" height="667" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/e36a407b-99c2-4a91-b3c0-4a80c8264400_1000x667.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:667,&quot;width&quot;:1000,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:611281,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://tomorrowist.substack.com/i/183994209?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe36a407b-99c2-4a91-b3c0-4a80c8264400_1000x667.jpeg&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!K1tY!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe36a407b-99c2-4a91-b3c0-4a80c8264400_1000x667.jpeg 424w, https://substackcdn.com/image/fetch/$s_!K1tY!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe36a407b-99c2-4a91-b3c0-4a80c8264400_1000x667.jpeg 848w, https://substackcdn.com/image/fetch/$s_!K1tY!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe36a407b-99c2-4a91-b3c0-4a80c8264400_1000x667.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!K1tY!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe36a407b-99c2-4a91-b3c0-4a80c8264400_1000x667.jpeg 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" 
width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>In 2013, a quiet software developer known to his colleagues as &#8220;Bob&#8221; became briefly infamous when investigators discovered that he had secretly outsourced his entire job to a contractor in China. He had mailed his security credentials overseas, paid the contractor a fraction of his six-figure salary, and spent his own workdays browsing Reddit and watching cat videos. Yet his performance reviews were stellar. Bob was, by every visible metric, one of the company&#8217;s best engineers. When his secret arrangement was uncovered, he was fired. But a decade later, his story looks less like a strange aberration and more like a preview of work in the age of AI.</p><p>Bob outsourced his job to a human. Today, millions of workers quietly outsource parts of their roles to machines. 
And unlike Bob, most of them are doing it not to avoid work, but to redesign it.<br><br>What&#8217;s emerging is a new form of unsanctioned productivity: employees quietly using AI systems to automate tasks, streamline workflows, and shift routine execution to autonomous agents without informing their employers. This behaviour has been called many things &#8212; from &#8220;bringing your own AI&#8221; to &#8220;rogue automation&#8221; &#8212; but for me, the term that best captures it is shadow labor. Like shadow IT before it, shadow labor is a bottom-up response to organizational constraints. But it is far more personal, far less visible, and ultimately far more transformative.<br><br>Shadow IT, which emerged in the early 2000s, was a reaction to slow-moving corporate technology. Employees brought their own devices, downloaded unsanctioned software, or spun up cloud services to get their work done. Companies initially treated it as a threat before gradually recognizing it as a signal. Shadow IT exposed inefficiencies. It revealed unmet needs. It surfaced the most inventive employees &#8212; the ones who hacked together solutions when formal systems failed them.<br><br>Shadow labor, by contrast, doesn&#8217;t just change the tools that employees use. It changes the location of labor itself. When workers use ChatGPT to write reports, Copilot to generate code, or custom agents to automate correspondence, they are not simply augmenting their workflow. They are reallocating execution. The work still gets done &#8212; often faster, more accurately, or more consistently &#8212; but the human&#8217;s effort is no longer the primary source of that output.<br><br>This shift poses a direct challenge to managerial assumptions about effort, authorship, and contribution. If a deliverable can be produced in minutes by an AI agent, what does &#8220;hard work&#8221; mean? If results improve, does the process matter? 
And if an employee&#8217;s value lies increasingly in how they design, supervise, and refine agentic workflows, rather than how many keystrokes they produce, what exactly defines a &#8220;good employee&#8221;?<br><br>Many managers interpret shadow labor as quiet deception &#8212; an attempt to evade the difficult parts of a job. But there is a more accurate and more revealing way to understand it: job refactoring. Employees using AI agents are not trying to shirk responsibility. They are redesigning the pathway to the outcome. They are asking the same question that has historically separated high performers from everyone else: What is the simplest, smartest, most elegant way to get this done?<br><br>When people ask me who they should hire, I sometimes say &#8212; half joking, half serious &#8212; that they should look for the smartest lazy people they can find. Smart, lazy people have no interest in grinding through a job as written. They want to rewrite it. They want to automate the tedious parts so they can focus on the work that actually requires judgment, imagination and impact.<br><br>Seen through this lens, the instinct behind shadow labor is not subversive at all. In many organizations, it may already be the leading indicator of future excellence.<br><br>And some companies are beginning to recognize this.<br><br>Consider Skims and Good American. Cofounder Emma Grede introduced a bonus system designed to challenge teams to find agentic applications inside their departments. Rather than rewarding perseverance through manual processes, she rewarded employees for discovering ways to hand work to AI. The biggest breakthrough did not come from marketing or creative &#8212; it came from the accounts team, which used AI to overhaul its chargeback systems. 
The resulting redesign saved the company hundreds of thousands of dollars, a vivid demonstration of what happens when employees are invited to treat automation as innovation rather than subversion.<br><br>Microsoft offers another clue to the future. The company has made it clear internally that using AI tools such as GitHub Copilot is now part of performance expectations. AI fluency is no longer an experiment; it is table stakes. Employees who fail to adopt these tools risk being seen not as diligent but as disengaged.<br><br>Meta has gone even further. Managers have been explicitly advised that low use of internal AI tools could negatively affect employee evaluations. The company has reframed agentic tool use as evidence of adaptability, literacy, and future readiness &#8212; not as discretionary enhancement but as professional obligation.<br><br>At Shoosmiths, the UK law firm, the incentives are more explicit still. Leaders created a one-million-pound reward pool tied directly to generating one million prompts with approved AI tools. Prompting became billable behaviour. Intelligent orchestration became a form of recognized contribution.<br><br>Across these examples, a pattern emerges. The companies at the leading edge of AI adoption are not cracking down on shadow labor. They are formalizing it. They are taking the behaviours that once lived in the shadows and pulling them into the centre of organizational performance. They are rewarding the employees who ask where machines should step in &#8212; not the ones who insist on protecting manual effort as a badge of honour.<br><br>This marks a deeper transition. The shift from shadow labor to sanctioned agentic work is not simply a matter of tools. It is a shift in organizational logic. Productivity in an age of abundant intelligence no longer hinges on human exertion alone. 
It hinges on how intelligently human effort is multiplied.<br><br>Your best employee may no longer be the one who works the hardest or logs the longest hours. It may be the one who quietly eliminates entire categories of effort, who replaces repetition with agents, and who frees their mind for decisions of higher consequence. In that sense, shadow labor may be one of the clearest early signals of twenty-first-century leadership potential.<br><br>And this, more than anything, is where leadership must evolve. Organizations that cling to traditional definitions of diligence will push their most innovative people underground. They will force ingenuity into secrecy. They will frustrate ambitious employees who see what is possible but are not allowed to act on it. Meanwhile, companies that reward agentic behaviour will surface new sources of leverage, accelerate learning cycles, and cultivate cultures where people are valued not only for what they do, but for how they redesign what is possible.<br><br>Shadow labor isn&#8217;t going away. It is scaling. As AI tools become more powerful and ubiquitous, the gap between those who use them effectively and those who do not will widen. Organizations that ignore this trend risk not only security breaches but cultural stagnation. They will miss the early signals of transformation and fall behind competitors willing to evolve.<br><br>None of this can be solved simply by giving everyone a chatbot license. The real work lies in redefining roles, restructuring workflows, and reimagining how teams operate when some work is done by human minds and some by synthetic ones. Managing this hybrid output requires new KPIs, new trust models, and a new kind of leadership literacy &#8212; one centred on orchestration rather than oversight.<br><br>What looks today like subversive behaviour will soon become standard practice. Shadow labor is not a deviation from the future of work. 
It is the future of work &#8212; just unevenly distributed.</p><p></p><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://www.thefutureiselsewhere.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">Thanks for reading The Future Is Elsewhere! Subscribe for free to receive new posts and support my work.</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div><p></p>]]></content:encoded></item><item><title><![CDATA[Cheap Cognition Demands A New Strategy]]></title><description><![CDATA[How collapsing decision costs force a redesign of work, value, and advantage]]></description><link>https://www.thefutureiselsewhere.com/p/cheap-cognition-demands-a-new-strategy</link><guid isPermaLink="false">https://www.thefutureiselsewhere.com/p/cheap-cognition-demands-a-new-strategy</guid><dc:creator><![CDATA[Mike Walsh]]></dc:creator><pubDate>Thu, 25 Sep 2025 01:20:00 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!XBwh!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F20eac3a9-3beb-48cb-a43e-62430e6dc9ee_1200x759.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!XBwh!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F20eac3a9-3beb-48cb-a43e-62430e6dc9ee_1200x759.jpeg" 
data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!XBwh!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F20eac3a9-3beb-48cb-a43e-62430e6dc9ee_1200x759.jpeg 424w, https://substackcdn.com/image/fetch/$s_!XBwh!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F20eac3a9-3beb-48cb-a43e-62430e6dc9ee_1200x759.jpeg 848w, https://substackcdn.com/image/fetch/$s_!XBwh!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F20eac3a9-3beb-48cb-a43e-62430e6dc9ee_1200x759.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!XBwh!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F20eac3a9-3beb-48cb-a43e-62430e6dc9ee_1200x759.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!XBwh!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F20eac3a9-3beb-48cb-a43e-62430e6dc9ee_1200x759.jpeg" width="1200" height="759" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/20eac3a9-3beb-48cb-a43e-62430e6dc9ee_1200x759.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:759,&quot;width&quot;:1200,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:211519,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://tomorrowist.substack.com/i/183998299?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F20eac3a9-3beb-48cb-a43e-62430e6dc9ee_1200x759.jpeg&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!XBwh!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F20eac3a9-3beb-48cb-a43e-62430e6dc9ee_1200x759.jpeg 424w, https://substackcdn.com/image/fetch/$s_!XBwh!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F20eac3a9-3beb-48cb-a43e-62430e6dc9ee_1200x759.jpeg 848w, https://substackcdn.com/image/fetch/$s_!XBwh!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F20eac3a9-3beb-48cb-a43e-62430e6dc9ee_1200x759.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!XBwh!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F20eac3a9-3beb-48cb-a43e-62430e6dc9ee_1200x759.jpeg 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" 
width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>When we think about industrial revolutions, we usually focus on the technology &#8212; the steam engine, the light bulb, the transistor, the internet. But the true power of an industrial revolution is not the technical invention itself. It is the way that invention permanently alters the unit cost economics of doing valuable work.</p><p>Steam reduced the cost of mechanical energy. Electricity reduced the cost of distributing power. Semiconductors reduced the cost of computation. The Internet and cloud services reduced the cost of distributing information. 
Each time, the breakthrough wasn&#8217;t just that things became cheaper; it was that entire industries reorganized themselves around new possibilities.</p><p>AI now represents the same kind of step change &#8212; but this time, the resource being transformed is <em>cognition</em>. The cost of getting a &#8220;smart decision&#8221; made is collapsing. What once required teams of analysts, consultants, or managers can now be handled by autonomous systems at near-zero marginal cost.</p><p>But here is the danger: too many leaders will stop at the first-order effect &#8212; using AI to make existing processes cheaper, faster, or more efficient. The real productivity benefits only kick in when organizations embrace system-level transformation: reinventing how work is structured, how decisions are made, how value is created.</p><p>As I explain in the video below, over the next decade the real competitive divide will not be between technology companies and traditional companies. It will be between organizations that rewire their operating models around the falling unit cost of cognition, and those that cling to legacy structures, simply doing yesterday&#8217;s work at a lower price.</p><div id="youtube2-W19OLEKitWA" class="youtube-wrap" data-attrs="{&quot;videoId&quot;:&quot;W19OLEKitWA&quot;,&quot;startTime&quot;:null,&quot;endTime&quot;:null}" data-component-name="Youtube2ToDOM"><div class="youtube-inner"><iframe src="https://www.youtube-nocookie.com/embed/W19OLEKitWA?rel=0&amp;autoplay=0&amp;showinfo=0&amp;enablejsapi=0" frameborder="0" loading="lazy" gesture="media" allow="autoplay; fullscreen" allowautoplay="true" allowfullscreen="true" width="728" height="409"></iframe></div></div><p>When electricity first arrived in factories, many industrialists simply swapped out their steam engines for electric motors, leaving the drive shaft, heavy machinery, and the rest of their operations unchanged. 
Costs dipped, efficiency ticked up, but little else shifted. Henry Ford saw what others missed: the true advantage wasn&#8217;t in cheaper power, but in the chance to redesign the very system of work. By giving each machine its own motor, Ford could rearrange production into a continuous flow &#8212; <em>the moving assembly line.</em></p><p>Ford&#8217;s breakthrough wasn&#8217;t about cheaper manufacturing; it was a new logic of production that multiplied output, lowered prices, and made automobiles accessible to ordinary people. That is the lesson for AI today. The first-order effect is efficiency &#8212; shaving costs from existing workflows. But the lasting advantage comes only when leaders do what Ford did: reimagine the architecture of work itself, using cheaper cognition not to do the same things faster, but to invent entirely new ways of creating value.</p><p>It is tempting to cast Tesla as the modern Ford, leveraging technology to disrupt an industry. But consider the Chinese automotive manufacturer BYD. Founded in Shenzhen in the 1990s, BYD doesn&#8217;t just build cars &#8212; it builds entire ecosystems. Beyond EVs, the company manufactures batteries, operates energy storage systems, supplies solar panels, and integrates them into AI-driven mobility and power platforms for whole regions. The strategy is not about producing cheaper cars, but about redefining what a car is: a node in a civilization-scale network linking transport, energy, and data. Just as Ford&#8217;s assembly line transformed manufacturing, BYD&#8217;s integrated approach points to a future where advantage comes not from incremental efficiencies, but from reorganizing how whole systems &#8212; mobility, energy, cities &#8212; operate together.</p><p>Efficiency gains may give you a head start, but it is the second-order redesigns that change the game. 
The winners of the AI age will not be those who deploy chatbots to shave call-center costs, but those who reconceive service altogether. They won&#8217;t be the firms that use AI to generate better reports, but those who eliminate the need for reports entirely by embedding intelligence directly into decisions.</p><p>The challenge for today&#8217;s leaders is not to embrace AI as just another tool, but to recognize it as the next great factor of production &#8212; one that, like steam or electricity, demands new forms of business architecture. The price of cognition is falling. The real question is not how cheaply you can do old things, but whether you have the imagination to invent entirely new ones.</p><p>There is something almost punk rock about this moment: the courage to set fire to the past and refuse to let it dictate the future. Steve Jobs captured it perfectly: <em>&#8220;If you want to live your life in a creative way, as an artist, you have to not look back too much. You have to be willing to take whatever you&#8217;ve done and whoever you were and throw them away.&#8221;</em></p><p></p><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://www.thefutureiselsewhere.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">Thanks for reading The Future Is Elsewhere! Subscribe for free to receive new posts and support my work.</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div><p></p>]]></content:encoded></item></channel></rss>