
Giraffe’s Data Strategy: Why Ontology Matters for AI in Cities
When people hear “ontology” they think philosophy. But in Giraffe, ontology isn’t abstract—it’s the foundation of how we make cities computable.
What does that mean?
Ontology is the study of “being.” In our world we are subject to the rules of physics: two objects can’t occupy the same space, and gravity pulls on mass.
Giraffe works the same way.
- A building can’t overlap another building.
- A landscape has an area.
- A facade has a surface.
These rules can get complex: they’re the physics of the Giraffe universe. And they make measurement possible.
Why measurement?
Because measurement drives everything in the built environment.
- Babylon, the pyramids, the Empire State Building: all existed before computers, none existed before measurement.
- Volume, surface area, facade area, perimeter: once you can measure, you can design, engineer, finance, and govern, as the sketch below illustrates.
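To make that concrete, here’s a minimal sketch of what measurement looks like computationally. The types and functions are illustrative, not Giraffe’s actual schema: given a footprint polygon and a height, area, perimeter, volume, and facade area all follow mechanically from the geometry.

```typescript
// Illustrative sketch, not Giraffe's real data model.
type Point = { x: number; y: number };

// Footprint area via the shoelace formula (simple, non-self-intersecting polygon).
function footprintArea(footprint: Point[]): number {
  let twiceArea = 0;
  for (let i = 0; i < footprint.length; i++) {
    const a = footprint[i];
    const b = footprint[(i + 1) % footprint.length];
    twiceArea += a.x * b.y - b.x * a.y;
  }
  return Math.abs(twiceArea) / 2;
}

// Perimeter: sum of edge lengths around the footprint.
function perimeter(footprint: Point[]): number {
  let total = 0;
  for (let i = 0; i < footprint.length; i++) {
    const a = footprint[i];
    const b = footprint[(i + 1) % footprint.length];
    total += Math.hypot(b.x - a.x, b.y - a.y);
  }
  return total;
}

// Once area and perimeter exist, volume and facade area follow mechanically.
const volume = (footprint: Point[], height: number) => footprintArea(footprint) * height;
const facadeArea = (footprint: Point[], height: number) => perimeter(footprint) * height;

// A 20 m x 30 m box extruded to 10 m: every quantity is fully determined.
const box: Point[] = [
  { x: 0, y: 0 },
  { x: 20, y: 0 },
  { x: 20, y: 30 },
  { x: 0, y: 30 },
];
console.log(footprintArea(box));  // 600 m²
console.log(perimeter(box));      // 100 m
console.log(volume(box, 10));     // 6,000 m³
console.log(facadeArea(box, 10)); // 1,000 m²
```

Change the footprint or the height and every downstream number re-derives the same way; that is all “measurement as a foundation” means here.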

The payoff of a mechanical ontology
By creating a deterministic, rule-driven data model, Giraffe unlocks consistent, auditable answers:
- “Residential GFA” comes only from buildings.
- “Landscape” comes only from features of type landscape.
- Change the model → the numbers update instantly, according to the Giraffe 'physics', as sketched below.
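Here is one way that could look in code, using hypothetical feature types rather than Giraffe’s real schema: each metric is defined against exactly one type, so “residential GFA” can only ever be summed from residential buildings, “landscape” area only from landscape features, and editing the model mechanically changes the totals.

```typescript
// Hypothetical feature types, not Giraffe's actual schema.
type Building = { kind: "building"; use: "residential" | "commercial"; gfa: number };
type Landscape = { kind: "landscape"; area: number };
type Feature = Building | Landscape;

// "Residential GFA" can only come from residential buildings.
function residentialGFA(model: Feature[]): number {
  return model
    .filter((f): f is Building => f.kind === "building" && f.use === "residential")
    .reduce((sum, b) => sum + b.gfa, 0);
}

// "Landscape" area can only come from landscape features.
function landscapeArea(model: Feature[]): number {
  return model
    .filter((f): f is Landscape => f.kind === "landscape")
    .reduce((sum, l) => sum + l.area, 0);
}

const model: Feature[] = [
  { kind: "building", use: "residential", gfa: 12_000 },
  { kind: "building", use: "commercial", gfa: 3_000 },
  { kind: "landscape", area: 4_500 },
];

console.log(residentialGFA(model)); // 12000: only the residential building counts
console.log(landscapeArea(model));  // 4500

// Change the model and the numbers update according to the same rules.
model.push({ kind: "building", use: "residential", gfa: 8_000 });
console.log(residentialGFA(model)); // 20000
```

Because every number is derived from typed features by the same rules, the answer is auditable: a total can always be traced back to the features that produced it.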
And the ontology can be enriched. Via API, parcel ownership can be pulled from Regrid into the Giraffe model; price data from HelloData and zoning overlays plug into the same structured data framework without breaking it.
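A sketch of what that enrichment could look like. The endpoint and field names below are placeholders standing in for a Regrid- or HelloData-style lookup, not their real APIs: external attributes are attached alongside a feature without touching the measured, core schema.

```typescript
// Hypothetical parcel record: core measurements plus a namespaced bag for enrichment.
type Parcel = {
  id: string;
  areaSqm: number;                       // deterministic, measured by the ontology
  enrichment?: Record<string, unknown>;  // third-party attributes live here
};

// Placeholder provider call, standing in for a Regrid/HelloData-style API.
async function fetchOwnership(parcelId: string): Promise<{ owner: string }> {
  const res = await fetch(`https://provider.example.com/parcels/${parcelId}/ownership`);
  return res.json();
}

// Enrichment adds attributes next to the model; it never rewrites geometry or
// measured quantities, so the ontology's guarantees still hold.
async function enrichParcel(parcel: Parcel): Promise<Parcel> {
  const { owner } = await fetchOwnership(parcel.id);
  return { ...parcel, enrichment: { ...parcel.enrichment, "regrid:owner": owner } };
}
```

The design choice in this sketch is that enrichment is additive: a provider can fail, change, or be refreshed without ever invalidating the measured model.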

Where AI fits
AI models sit 'below' the ontology and are governed by it. An LLM on its own is subject to hallucination.
Think of it like this:
- In Giraffe’s ontology, computing 2 + 2 will always return 4. The rules of the system make that outcome inevitable, just like the laws of gravity in physics.
- In a pure LLM, asking for 2 + 2 will almost always return 4, but not because of a law. It’s because the model has learned that “4” is the most probable next token. Usually that works, but there’s no guarantee.
That’s the difference between deterministic measurement and probabilistic synthesis.
When LLMs sit on top of deterministic ontologies, they become powerful:
- Ask: “How many students would this development generate?”
- The LLM uses Giraffe’s measurement (e.g. 770 residents), adds assumptions (student ratio), and returns a coherent answer.
The ontology keeps it grounded. The AI makes it usable.
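As a sketch of that division of labor (the ratio below is an illustrative assumption, not a Giraffe output): the resident count comes from deterministic measurement, the student ratio is an assumption the LLM or an analyst supplies, and the arithmetic that combines them stays plain and checkable.

```typescript
// Illustrative only: the measurement would come from the model, the ratio is an assumption.

// Deterministic: the ontology measures residents from the model
// (e.g. dwellings x occupancy, rolled up by the same typed rules as GFA).
function measuredResidents(): number {
  return 770;
}

// Assumptive: a hypothetical planning ratio the LLM proposes and states explicitly.
const assumedStudentsPerResident = 0.18;

// The synthesis is grounded: the answer can always be traced back to the measurement.
const students = Math.round(measuredResidents() * assumedStudentsPerResident);
console.log(
  `~${students} students, from ${measuredResidents()} measured residents ` +
  `at an assumed ${assumedStudentsPerResident} students per resident`
);
```

If the assumption changes, only the assumption changes; the measured 770 stays exactly what the model says it is.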

Why this matters
Real estate is unforgiving. If a drainage calculation is wrong, you end up in court. Ontologies guarantee that 2 + 2 always equals 4. AI then becomes the friendly layer that turns raw measurement into actionable insight.
This is shaping up to be a really productive direction:
- Drawing tools on a mechanical ontology → deterministic, auditable measurement.
- Third-party data enrichment via API → parcel, zoning, market, pricing data.
- LLM interface → natural language queries and synthesis.
This stack seems extraordinarily powerful.