January 2026
What This Is
This piece was developed through conversation between a human and an LLM. The human provided direction, local knowledge, and corrections. The LLM provided structure and prose. We offer it as an example of the practices it describes.
The Place
Thistlebridge is a house in Utah with a geodesic dome greenhouse. A marginal property — awkward parcel, freeway noise, not picturesque. The greenhouse produces plant starts for the neighborhood — tomatoes, peppers, herbs, flowers. In a back room, some servers hum.
The Questions We’re Sitting With
Two possible futures, neither certain:
In one, automation genuinely succeeds. Machines grow most of the food, manufacture most of the goods, handle most of the logistics. Many jobs disappear not because of economic failure but because machines really are better at them. The question becomes: what do people do? How do they find meaning, capability, connection?
In the other, systems fray. Supply chains break. The complexity we depend on stops working reliably. The question becomes: can communities take care of themselves? Do people know how to grow food, fix things, maintain their environment?
These sound like opposite scenarios, but they call for the same preparation: people who know how to do things with their hands, who can work with intelligent tools without becoming dependent on them, who have the skills to help their neighbors. In abundance, this is fulfilling. In scarcity, it’s survival. Either way, it’s better than passive consumption.
What We’re Testing
The specific bet: local AI infrastructure can support practical skill development without replacing human capability.
Most people experience AI as a cloud service — pay a subscription, send your data to a remote server, get results back. This creates dependency. If the service changes terms, raises prices, or shuts down, you lose access. Your data lives on someone else’s computer.
Local infrastructure inverts this. The servers in the back room run AI models without cloud connectivity, subscriptions, or data leaving the premises. Total hardware cost: around $2,000–3,000 for repurposed enterprise equipment. This runs language models, voice transcription, image generation, document processing — all locally.
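In practice, "local" means a request goes to a machine on the home network rather than to a cloud API. A minimal sketch of what that looks like, assuming an OpenAI-compatible server (such as the one llama.cpp ships) running on the LAN; the address, port, and model name below are placeholders, not our actual configuration:

```python
# Sketch: talking to a language model on the local network. No API key,
# no subscription -- nothing in this request leaves the premises.
# The endpoint is hypothetical; adjust to wherever your server listens.
import json
import urllib.request

LOCAL_ENDPOINT = "http://192.168.1.50:8080/v1/chat/completions"  # placeholder LAN address

def build_request(prompt: str, model: str = "local-model") -> urllib.request.Request:
    """Package a chat prompt for an OpenAI-compatible local server."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return urllib.request.Request(
        LOCAL_ENDPOINT,
        data=body,
        headers={"Content-Type": "application/json"},  # note: no Authorization header
    )

# To actually send it (requires a running local server):
# with urllib.request.urlopen(build_request("When do I start tomato seeds?")) as resp:
#     reply = json.load(resp)["choices"][0]["message"]["content"]
```

If that server disappears, nothing is lost but compute: the data and the request format stay on your side of the wall.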
The hardware and software exist. Integration is the bottleneck — making the pieces work together smoothly enough that you don’t need to be a systems administrator to use them.
We’re trying to solve that for a specific use case: documentation while working. You’re in the greenhouse, hands occupied, learning through doing. You want to capture insights without stopping to write. Instead of fumbling with voice memos, you talk to the system like you’d talk to a colleague. It transcribes, organizes, connects new observations to previous work.
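The organizing step can be sketched in a few lines. Everything below is hypothetical illustration rather than the actual system: assume transcription has already produced plain text, and that observations carry #tags which let a new note link back to earlier work.

```python
# Sketch of organizing transcribed voice notes: timestamp each observation,
# pull out its #tags, and index by tag so new notes connect to previous ones.
# Names and storage here are illustrative, not the real system's.
from collections import defaultdict
from dataclasses import dataclass, field
from datetime import date
import re

@dataclass
class Note:
    day: date
    text: str
    tags: list[str] = field(default_factory=list)

class Notebook:
    def __init__(self):
        self.notes: list[Note] = []
        self.by_tag: dict[str, list[Note]] = defaultdict(list)

    def capture(self, text: str, day: date) -> Note:
        """Store a transcribed observation, indexed by its #tags."""
        tags = re.findall(r"#(\w+)", text)
        note = Note(day, text, tags)
        self.notes.append(note)
        for tag in tags:
            self.by_tag[tag].append(note)
        return note

    def related(self, note: Note) -> list[Note]:
        """Other notes sharing any tag -- the 'connects to previous work' step."""
        seen: list[Note] = []
        for tag in note.tags:
            for other in self.by_tag[tag]:
                if other is not note and other not in seen:
                    seen.append(other)
        return seen
```

The real work is everything around this sketch: reliable transcription of speech over greenhouse noise, and tagging that happens without the speaker thinking about it.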
This is a narrow test. But if it works — if natural language can mediate between human intention and technological capability while you’re actually doing physical work — it suggests something larger might be possible.
What This Isn’t
This isn’t a tech startup. There’s no business model, no growth targets, no investor expectations. The house is paid off. Ongoing costs are minimal. We’re not trying to scale.
This isn’t a commune or intentional community. It’s a single-family home where ordinary domestic life happens. The experimental work fits around that, not the other way around.
This isn’t a manifesto. We’re not sure this approach works. We’re testing it. The documentation is honest about uncertainty because the point is to figure out what’s true, not to promote a predetermined conclusion.
The Replication Question
If the model works here, could it work elsewhere?
The goal isn’t to scale Thistlebridge itself, but to develop something transferable. Other homes, other communities, other contexts. The knowledge to set up and maintain local AI infrastructure is currently concentrated in a small population of technically sophisticated people. Most communities don’t have someone with that skillset.
We’re planning a nonprofit called Nymphaea — named after the water lily genus (there’s a Nymphaea caerulea in the greenhouse pond) — to work on this transfer problem. Training people at Thistlebridge who then help establish sites in their own communities. Barn-raising replication rather than franchise scaling.
But that’s for the future. For now, we’re focused on making one site work well enough to be worth replicating.
Following Along
These updates will document what we’re learning — what works, what doesn’t, what questions emerge. Published when there’s something worth saying, not on a schedule.
The approach is carrier-bag rather than hero narrative — gathering fragments, testing approaches, accumulating observations rather than conquering problems with solutions. Some will be useful; some won’t. The usefulness can’t be determined in advance.
If you’re interested in local AI infrastructure, practical skill development, or the intersection of the two, you might find something here. If not, that’s fine too. We’re not trying to build an audience. We’re trying to figure something out and sharing the process for whoever finds it useful.
The greenhouse is quiet in January. Seeds wait in packets. Spring will come.
Sources
- The “carrier bag” framing comes from Ursula K. Le Guin’s essay “The Carrier Bag Theory of Fiction” (1986), collected in Dancing at the Edge of the World (1989). She argues for narrative as container rather than weapon — gathering rather than conquering.
- The two-futures framing (automation success vs. system fragility) is our own synthesis, but draws on a long tradition of thinking about appropriate technology, including E.F. Schumacher’s Small Is Beautiful (1973) and Ivan Illich’s Tools for Conviviality (1973).