The Model
A working prototype for community-scale AI infrastructure — built on repurposed hardware, running in an ordinary home, invisible to anyone who isn't looking for it.
The Physical Layer
From the outside, Thistlebridge looks like a house with a nice greenhouse. That's intentional. The infrastructure is designed to integrate with residential life, not dominate it.
The home: A 1,500-square-foot single-family house, retrofitted with heat pumps (space heating, cooling, and hot water), solar panels, and modern insulation. Paid off. Ongoing costs minimal.
The greenhouse: A 26-foot Growing Spaces geodesic dome on an insulated concrete form (ICF) foundation, with a Ceres climate battery (a ground-to-air heat transfer system, or GAHT) for passive heating and cooling. The ground stores summer heat and releases it in winter. No supplemental heating required down to about -10°F.
The growing system: A 1,080-cup wicking system on shelves, producing plant starts for the neighborhood. Tomatoes, peppers, herbs, flowers. Low labor once established. Demonstrates that the infrastructure can generate modest revenue while serving the community.
The Compute Layer
A local server runs AI models with no cloud connectivity, no subscriptions, and no data leaving the premises. This is the core technical contribution: proving that capable AI infrastructure can run on repurposed enterprise hardware in a home environment.
The current setup includes:
- Dell PowerEdge server with dual NVIDIA Tesla P40 GPUs (repurposed datacenter hardware, purchased used)
- NVIDIA Jetson Xavier (edge AI development board)
- Raspberry Pi cluster for lightweight services
- Mesh network across the property
Total hardware cost: approximately $2,000–3,000 for a capable local AI setup. It runs large language models, image generation, voice transcription, and document processing, entirely on-premises.
What it enables:
- Assistance: Ask questions, think through problems, get help with planning — without internet dependency or usage fees
- Documentation capture: Voice notes and photos from the field get transcribed, organized, and structured automatically
- Knowledge accumulation: Over time, the system builds a searchable record of what was tried, what worked, what failed (a minimal capture-and-search sketch follows this list)
- Privacy: Sensitive information never leaves the property
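To make the documentation-capture and knowledge-accumulation pieces concrete, here is a minimal sketch in Python. It assumes openai-whisper for local transcription and SQLite's FTS5 full-text index for search; these tools, the file paths, and the model name are illustrative stand-ins, not necessarily what Thistlebridge runs.

```python
import sqlite3
import whisper  # openai-whisper; assumed here, any local speech-to-text model would do

# Transcribe a field voice note entirely on-device.
model = whisper.load_model("base")  # a small model that fits modest GPUs
result = model.transcribe("voice_notes/2024-06-01_tomatoes.wav")  # placeholder path

# Append the transcript to a searchable local knowledge base (SQLite FTS5).
db = sqlite3.connect("knowledge.db")
db.execute("CREATE VIRTUAL TABLE IF NOT EXISTS notes USING fts5(recorded, text)")
db.execute("INSERT INTO notes VALUES (?, ?)", ("2024-06-01", result["text"]))
db.commit()

# Later: full-text search over everything that was tried, worked, or failed.
query = "blossom rot"
for recorded, text in db.execute(
    "SELECT recorded, text FROM notes WHERE notes MATCH ?", (query,)
):
    print(recorded, text[:80])
```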
The stack is intentionally modular: Proxmox for virtualization, Ollama for model serving, standard open-source tools throughout. As AI capabilities change (and they change rapidly), the system adapts. Nothing is locked in.
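Because Ollama exposes a plain HTTP API on the local machine, anything on the property's mesh network can query the models without touching the internet. A minimal sketch, assuming an Ollama instance on its default port with an already-pulled model (the model name and prompt are illustrative):

```python
import json
import urllib.request

# Query the local Ollama server (default port 11434); nothing leaves the LAN.
payload = json.dumps({
    "model": "llama3",   # placeholder; any locally pulled model works
    "prompt": "Draft a spring planting schedule for tomato and pepper starts.",
    "stream": False,     # return one JSON object instead of a token stream
}).encode()

request = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=payload,
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(request) as response:
    print(json.loads(response.read())["response"])
```

Swapping the model behind that endpoint requires no changes to anything that calls it, which is part of what keeps the stack modular.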
Why This Matters
Most people currently experience AI as a cloud service: pay a subscription, send your data to a remote server, get results back. This creates dependency. If the service changes its terms, raises prices, or shuts down, you lose access. Your data lives on someone else's computer.
Local AI infrastructure inverts this relationship. You own the hardware. You control the models. Your data stays local. The capability persists even if the internet goes down.
This isn't about paranoia or prepping. It's about the same principle that makes home gardens valuable even when grocery stores work fine: self-reliance as a baseline, not just a backup plan.
The harder question is: how does someone who isn't a systems administrator get this running? The hardware exists. The software exists. But the knowledge to combine them, maintain them, and adapt them as things change — that's the bottleneck.
Nymphaea: Replication and Training
Thistlebridge is a proof of concept. The goal isn't to scale this single site but to develop a model that replicates — other homes, other communities, other contexts.
Nymphaea is the planned organization for this replication work. (Named after the water lily genus — there's a Nymphaea caerulea in the greenhouse pond.) The mission: help communities implement their own local AI infrastructure alongside practical skills education.
The approach:
- Train operators at Thistlebridge: People come here, work alongside existing operators, learn the systems through use. Apprenticeship, not curriculum.
- Document everything: The training process itself generates documentation. What confuses newcomers? What takes longest to learn? What can be simplified?
- Barn-raising replication: When someone is ready to establish a site in their community, trained operators travel to help set up infrastructure and train local people.
- Network, not hierarchy: Sites stay independent, connected by mutual aid rather than central control. Knowledge flows between them.
The first few replications will be deliberately experimental. What "sufficiently trained" means will emerge from doing the work, not from criteria defined in advance. Towpath, a second site on leased land nearby, is planned as the first test of whether the model transfers.
What Funding Enables
Thistlebridge currently operates at minimal cost because the home is paid off. This is sustainable but limited. Funding would enable:
Near-term: Hire the first Nymphaea staff — people to train at Thistlebridge who will eventually help establish other sites. The documentation of their onboarding becomes core methodology.
Medium-term: Establish Towpath as a second site, testing whether the model works with purpose-built (rather than retrofitted) infrastructure and off-the-shelf (rather than repurposed enterprise) hardware.
Longer-term: Support additional replications, refine training methodology, develop open documentation that others can use independently.
Nymphaea will be structured as a 501(c)(3) nonprofit once the model is proven. Until then, support flows through Thistlebridge directly.
What Success Looks Like
In three years, if this works:
- Thistlebridge operates sustainably, producing food and training people
- At least two other sites exist, run by people trained here
- Documentation is mature enough that new sites can be established faster
- A loose network of sites shares knowledge and supports each other
- The approach is written up and available for anyone to adapt
The demonstration isn't that this is the right way to live. It's that this way of living is possible — local AI infrastructure, practical skills, community connection, all running on modest resources in ordinary places.
Interested in supporting this work? See how to help, or reach out directly to discuss larger involvement.