At a visceral level, the world is becoming unfamiliar.
A new digital reality now overlays ordinary life—an unprecedented blend of interfaces and physical habitat. Cues we relied on for millennia can lose their reliability in days. Body language fades when people arrive as disembodied voices. Social rank, identity, and belonging—once signaled by stable cultural markers—have become as fluid as the avatars that carry them. Even micro-expressions can be filtered, edited, and optimized like marketing copy. And this won’t stop at sight alone. One by one, our senses are being routed through systems designed to persuade, predict, and profit.
This strangeness doesn’t live only on our screens. It is leaking into the built world. Self-checkout lanes, automated phone trees, and algorithmic “customer care” insert layers between people. GPS and frictionless delivery shrink the need to navigate, negotiate, or cooperate in shared space. Work, education, dating, shopping—almost every human need now has a digital proxy. We are building an environment that is increasingly human-made and human-directed—making it powerful, but never neutral.
Because “human-made” systems are built in partnership with the best and worst in us. We carry ancient tendencies: status-seeking, tribal sorting, fear responses, dominance games, shortcut thinking. These are not moral failures; they are survival features—firmware forged in a world of scarce resources and immediate threats.
FIELD NOTE: ATMOSPHERE
If the world feels suddenly louder, faster, and angrier, you are not imagining it.
The strange savanna is optimized for activation.
Treat outrage like weather: notice it, don’t worship it.
Over time, we also evolved counterweights such as empathy, reciprocity, reputation tracking, and repair after conflict. These became norms that held groups together. The human story is not simply “selfishness.” It is the ongoing struggle to balance competing drives inside a social animal. The problem today is not that we are flawed.
The problem is scale and speed.
Our technologies now act at planetary scale, at machine tempo, with feedback loops tuned to attention and incentives. Meanwhile our institutions—laws, norms, governance—move at human tempo. And our biology still reacts as if a rumor is a rustle in the grass.
E.O. Wilson captured the danger with brutal clarity: “The real problem of humanity is the following: We have Paleolithic emotions, medieval institutions and godlike technology.”
That mismatch is no longer philosophical. It is operational. It shows up in workplaces, elections, markets, classrooms, families. A single false signal can travel farther in an hour than truth can travel in a week. A coordinated manipulation can reach millions while we are still arguing about definitions. And the consequences of old instincts—fear, tribalism, domination—are amplified by tools whose default settings were never shaped for human flourishing.
This is what it means to walk a strange savanna.
Not a savanna of grass and predators—but of feeds, metrics, recommendation engines, automated decisions, synthetic media, and invisible incentives. The threats are often informational, social, and psychological. The predators don’t always look like predators; they can look like convenience, certainty, or belonging. And the cost of getting lost is not only personal. At scale, it becomes civilizational.
So the question is not merely: What is happening to technology? The deeper question is: How do we stay human inside the environments we are building?
Balance the Triangle Labs begins with a simple claim: to navigate this new terrain, we need shared language and practical safeguards that keep three forces in view at the same time:
Science & Technology—what we can build and deploy,
Human Behavior—how we actually perceive, decide, cooperate, and panic,
Ethics & Governance—what rules, norms, and accountability can keep power aligned with flourishing.
When one of these outruns the others, the savanna becomes dangerous—no matter how beautiful the tools look from a distance.
This book is a field guide for the mismatch. It is not a tech book; it is a survivability book.
The goal is to build moral infrastructure that holds under pressure: to understand why our instincts misfire in digital environments, how systems exploit those misfires, why institutions lag behind, and what it would look like to rebuild coherence in individuals, in organizations, and in the incentives that shape our shared world.
FIELD NOTE: DRIFT
When you feel stuck, ask: which corner outran the others?
Tech leap without governance?
Human panic without understanding?
Rules without capability?
Because we are not doomed by our wiring.
But we are responsible for the environments we create and for the safeguards we choose to carry as we walk.
Which brings us to the question:
How can such a conflicted species survive the modern digital age—and still become more fully human in the process?