What I Learned From a Decade in Policing, and Why I Started Remake The Rules
I ended up in policing mostly by accident. When I moved to Colorado after grad school, I was looking for a way to honor a value modeled for me by my parents: a commitment to public service. I wanted to help build public institutions that were just, kind, and functional; places that could right the wrongs of the past while making public resources fair, accessible, and democratic.
The Promise of the Early Internet
I was a child of the early internet. I grew up in an idealistic time when our newly connected world felt like a fresh, golden era of scientific reason, social progress, and technological innovation. Back then, "data-driven decision-making" wasn't just a marketing buzzword; it was a genuine hope for a better, more critically engaged world. Through a confluence of chance and that desire to do good, I ended up as a statistical researcher within a large municipal police department.
My role eventually grew into leadership and into building the complex models that represented the department's operations in real time. The purpose of these tools was supposed to be simple: distill critical information about an incredibly complex organization so that everyone, from officers to dispatchers to chiefs and city leaders, could make better judgments.
The Blur of Choicetech
But the deeper I got, the more I ran into a foundational problem. We had an impressive amount of information to build from, but we were losing the meaning behind it. We were asking massive questions—What is good policing? Is this department operating in a way that is acceptable and positive?—while using data tools that were steering the conversation, sometimes in problematic or unproductive directions.
The lines between the tool as an "information provider," a "primary decision-maker," and a "justification resource" became blurry. I watched as the power of these tools began to shift our focus, our judgment, and even the stories we told ourselves about what was happening. It became clear to me that these weren't just data tools. They were choicetech.
The Challenge of Choice Inflation
The infusion of data into the day-to-day operations of a police department is a strange combination of extraordinary and immature. And that same strange combination is the baseline for almost every aspect of our lives. We are living in a world that is technologically sophisticated but functionally unstable, where the tools we use to navigate our days are often far more powerful than our strategies for using them. What I saw was a widening gap: our information technology is far outpacing our capacity to integrate it well into our decision-making. And the impacts of that widening gap are often unseen and unmanaged.
We were not only trying to learn new tools and digest new data, we were doing it while suffering from default decay and choice inflation. The sheer power of our new choicetech was changing our information environment so quickly that we could no longer rely on old assumptions or ways of thinking. Instead, we were forced to wade through an ocean of new choices, new options, and so much data that it became difficult to orient at all.
And as the flood of new choicetech continues, often with problematic, profit-driven intent baked into its design, our capacity to discern which choicetech is helping us and which is harming us diminishes.
Choosing a Tech-Agnostic Path
This realization changed everything for me. It forced me to rethink what was actually needed to live and decide better with our now omnipresent, society-quaking choicetech. I became even more tech-agnostic; a real sin in a world where innovation means new tech, and new tech means unbridled success. I realized the goal isn't to build new patterns around every shiny new tool, but to be incredibly intentional about the few we do adopt. It requires a specific kind of choicetech literacy: the ability to look under the hood and understand how an environment is nudging your judgment.
Designing for High Reliability
In engineering, we use high-reliability design to keep rockets from blowing up. It’s a discipline built on the assumption that in complex systems, failure is the default. Therefore, you must design every routine and every tool to actively fight against those failures before they become catastrophes. I started to see that our social systems, especially our most powerful institutions, ethically require that same level of care.
And the choicetech within our institutions isn't just administrative; it's the architecture that shapes behavior and changes choices. It dictates what an officer sees and responds to, what a chief measures, and how an organization with the authority to use force understands what is actually happening. This is the structural opposite of "move fast and break things." It is the refusal to accept "glitches" when the cost is measured in public safety, human lives, and civil rights.
Systemic Integrity for the High-Stakes of Everyday Life
The problem of successfully collaborating with choicetech isn't contained to policing. High-stakes choices made with choicetech are everywhere—in the way we hire, the way we lead teams, the way we choose partners—and they all deserve a better level of care. These systems impact far more people, far more frequently, than a moon landing ever will. They deserve systemic integrity: where our processes, tools, and goals actually align, and where our choicetech is designed to support the person making the choice and the people impacted by that choice.
I started Remake The Rules because we are outpacing our ability to live and decide well inside the hybrid, human-choicetech systems we’ve built. Learning how to do this takes time, and it requires a commitment to the creative labor of developing new patterns of thought and behavior. I still believe in the promise of a critically engaged world. But that promise doesn’t fulfill itself. We have to remake the rules to ensure our tools are truly supporting us, our communities, and our greater goals.