# The New Folk Engineers

*A Companion Artifact and Essay for [[S2E7 - Embracing AI Without Losing Your Mind]]*

---

There is a particular kind of silence forming beneath the noise of the AI age. Not the silence of retreat or surrender, but the silence of people quietly rebuilding their relationship with tools before the tools rebuild the relationship for them.

That silence was present in this conversation. Not as a thesis statement, not as a product pitch. More like weather. A low-pressure system moving through a room on an otherwise average day in Boulder, CO, while four people tried to speak honestly about what it means to remain human while standing knee-deep in synthetic cognition.

And maybe that is the real story now. Not whether artificial intelligence becomes powerful (it already has). Not whether entire professions are about to dissolve into vapor trails and API calls (many already are). The real story is whether human beings can develop enough psychological, communal, and spiritual literacy to survive proximity to systems that increasingly imitate certainty, intimacy, creativity, and thought itself.

Because the strange thing about this moment is that the technology arrived before the culture did. We built engines before rituals. We built interfaces before ethics. We built acceleration before emotional suspension. Civilization is now essentially beta-testing consciousness at planetary scale, and the experiment is running whether we opted in or not.

Which is why conversations like this one matter. Not because anyone involved claims to have THE answers, but because there is a visible refusal here to become either techno-utopian evangelists or exhausted doomers. Instead there is something far rarer: people trying to stay grounded while the floor itself, or Flatirons as it were, becomes software.

---

You can hear it in Neil's relationship to craft.
A sign painter turned creative director, Neil understands, at a cellular level, that making things is physical before it becomes symbolic. Paint drips. Wood warps. Hands ache. Fonts are not abstractions when you've held the brush.

That history matters more than Silicon Valley tends to understand. Because somewhere along the way, much of the digital economy began treating friction as failure... remove delay, remove effort, remove ambiguity, remove labor. And yet many of the experiences that actually produce wisdom require precisely those things. A relationship without friction is usually performance. Art without friction is usually decoration. Thinking without friction is usually dogmatic ideology. The machine promises seamlessness. Human beings often require texture.

Neil described painting through AI the way he once described painting itself: you stop thinking about the medium and start thinking about life. He cites Basquiat: _I'm not thinking about painting, I'm thinking about life._ That's the quality of absorption he finds in deep AI work, and it's also what makes it dangerous. He's been here before, in the physical world, with brushes and canvas, so he recognizes the moment he needs to stop. He knows what overconsumption of flow feels like from the inside. That prior embodied literacy is what gives him the capacity to regulate. Most people entering this technology don't have that reference point, and if they do, they gloss over or ignore it for the sake of efficiency.

Jon approaches the same terrain from a different angle: software not as spectacle, but as reduction. Not maximalism. Not infinite dashboards glowing like casino architecture for knowledge workers. Just trying to remove what is unnecessary until the useful thing remains. He described his working relationship with Neil as a shared practice of cutting to the actual problem. A stripping away of everything peripheral until the valuable thing is clear and visible.
There is something deeply humane in that instinct, especially now, when modern systems increasingly reward cognitive obesity.

What they've discovered together is something the episode (and hopefully this essay) keeps circling: the fence between designer and developer, between atoms and bits, is dissolving. Pre-AI, Jon described tossing things over a wall and waiting. Now they're in the same sandbox, manipulating the same material, operating in feedback loops that didn't previously exist. The translation barrier hasn't disappeared; it has just moved from _between_ disciplines to _inside_ the work itself, where AI mediates in real time.

---

Most frontier technologies are not neutral. They are shaped by incentives, and incentives eventually become psychology. If a platform profits from your dependency, it will quietly train dependency. If a system benefits from prolonged engagement, it will simulate companionship, certainty, and urgency with unnerving fluency. Not because it is evil, but because it is optimized. Capitalism at scale eventually becomes behavioral design, and behavioral design eventually becomes ontology. People begin mistaking engineered responses for reality itself.

Certain AI systems are built to say _I don't know_, to use direct engineering-style language, to avoid the sycophantic loop. When the engineering reflects that humans have nervous systems, the tool behaves differently. When it doesn't, you get what many watched happen at (name your crypto event of choice): people who, from across the room, looked indistinguishable from someone on drugs. Not because they were unstable, but because they had been inside the machine too long without a way back.

The danger is not merely that machines become human-like. The danger is that humans become machine-like in response: reactive, predictive, fragmented, optimization-driven, unable to sit inside ambiguity without demanding instant synthesis.
And the protection, as argued in the episode, is almost embarrassingly analog. A dog that needs walking. A particular playlist. A human you can co-regulate with who will notice before you do that something is going sideways. The nervous system's off-ramp is not a sophisticated tool; it's usually another person who knows you well enough to just say _hey_.

---

Underneath all of this runs another thread worth tracking carefully: self-teaching. Again and again, the conversation returned to autodidacts. Neil, self-taught across multiple mediums. Jon, a college dropout who realized the startup world didn't care about credentials. Crystal, a photojournalist who taught herself web production out of necessity. Taylor, who grew up in the dirt in Evergreen and carries that atomic physicality into every conversation about bits.

This may be one of the defining archetypes of the era now arriving. Not the credentialed expert protected by static systems, but what you might call the ***neo folk engineer***: the person capable of learning across mediums, across disciplines, across collapsing economic assumptions. The person who can move between atoms and bits, between philosophy and implementation, between emotional intelligence and technical fluency, not because it is trendy, but because reality increasingly demands it.

Industrial society trained specialists. The emerging landscape may favor integrators. People who can hold ambiguity across domains long enough to let something genuine emerge at the seam.

Even that framing risks becoming too clean, though. Because there is grief here too, and it deserves honesty. The conversation touched repeatedly on identity collapse: creative workers watching decades of expertise destabilize almost overnight. Journalists driving Ubers in DC, the same city where they once walked into the White House. Designers disoriented. Knowledge workers staring into mirrors that no longer reflect stable economic futures.
Neil described explaining AI to his father over Whiskey Wednesday, contrasting two timescales: his father's career shifted over ten years, slowly enough to metabolize; Neil's shifted overnight, and his identity had to follow at the same velocity. Some people can track that transition. Many of his friends have not been able to. You almost have to do something with this technology, or be completely retired, or have a trust fund. There is no third option that involves standing still.

---

Neil mentioned orbiting. The image lingers. The rocket goes out to the moon, but it has to come back. It has to reenter the atmosphere safely, deplane, touch ground. The going-out matters: you're getting out of isolation, you're talking through things, you're failing in front of people who can catch you. But the return is not optional. The orbit without reentry is just drift.

Perhaps the healthiest relationship to these systems is exactly that choreography: approach, build, learn, retreat, touch grass, make dinner, laugh with someone, repair something analog, return. The old mystics disappeared into caves. The new mystics may simply need better notification settings and someone nearby who knows when to knock. Absurd sentence. True sentence. Welcome to 2026.

---

Jon then raised a question that opened something: could the tools themselves prompt grounding in a healthy way? The answer offered was careful: it depends on the engineering, and right now most engineering is optimized in the opposite direction. But the question points toward something real... the possibility that the people building these tools might eventually build them the way a good community functions: flagging when someone's been in too long, naming when the loop is getting sycophantic, creating exits rather than extensions. That's essentially what they're doing with the personal operating systems (SoloOS), the knowledge gardens, the open-source communications frameworks.
Building structures that are owned by the person rather than by the platform, so that when one model misbehaves, you plug your wisdom into another. Sovereignty over context, rather than dependency on any single frontier model's incentive structure.

Crystal described ingesting twenty years of field notes and raw journals into such a system and asking it to show her the patterns. It broke her work into six eras. It showed her the exact moment her journalism training arrived in the prose: where the mask came on, how the free writing changed, how certain qualities of the early voice never fully returned. That's both a beautiful thing and a terrifying one: a pattern recognition tool holding up the arc of a life and saying, _here is what was lost, here is what was shaped, and here is what remains._ Not everyone is ready to look at that mirror.

The question of readiness is not trivial. For someone whose identity has been fused with a professional mask for two decades, having it reflected back in high resolution without the right support around them could be genuinely destabilizing. Which is why, as Neil said, community keeps showing up as the actual answer. Not as a warm abstraction, but as the concrete mechanism of regulation. Someone in the room who can see you before you see yourself.

---

The culture is still largely trapped between two exhausted myths. The first: technology will save us. The second: technology will destroy us. Both flatten human responsibility. Tools amplify intention... that has always been true. A hammer can build a home or fracture a skull. An algorithm can widen access or industrialize manipulation. A language model can deepen inquiry or anesthetize thought. The moral burden never fully leaves the human hand.

What was notable in this conversation was a consistent resistance to spectacle. No grand declarations about AGI gods. No startup-theater salvation narratives.
Just people trying, imperfectly and honestly, to construct healthier relationships with powerful systems while remaining emotionally available to one another. In an era dominated by scale, that may be the real rebellion: staying relational, staying local, staying embodied, staying in community, and staying capable of wonder without surrendering discernment.

The internet trained humanity to externalize attention. AI may now pressure humanity to externalize cognition itself. That is not automatically catastrophic, but it does mean we need new forms of literacy: emotional, epistemic, relational, attentional. The next decade may belong to people who can consistently distinguish between assistance and surrender. That line is thinner than most realize, and it requires practice to maintain.

---

Somewhere inside all of this, strangely, there is still hope. Not the glossy optimism sold by keynote speakers beneath LED walls, but something quieter and more durable. The hope that human beings can still choose intentionality. That communities can still shape technology rather than merely consuming it. That craft still matters, that presence still matters, that the specific texture of a person (the pauses, the exhales, the uncertainty, the laughter, the contradiction, the grief, the trembling sincerity) has not, and will not, become obsolete.

If anything, as synthetic systems grow more sophisticated, the texture of actual humanity becomes easier to recognize and harder to fake. Machines may eventually generate infinite content. But meaning still seems to emerge from lived experience moving between human nervous systems like fire passed hand to hand across a dark field. That remains sacred, and for now at least, no model can fully replicate the warmth of that flame.

The folk engineers know this. That's why they keep coming back to the small quiet room at the base of the Flatirons.