Your data model is your destiny
Importance of the core model
- Many commenters strongly agree that a product’s core abstractions (what the article calls the “data model”) deeply shape UX, feature evolution, and long‑term competitiveness.
- A recurring theme: when the core model is clear and coherent, everything else becomes “just implementation”; when it’s wrong or inconsistent, every new feature feels like fighting the system.
- Some extend the article's sales/marketing framing, arguing that operations and support are equally critical interfaces to "real" users and should share the same model.
Domain-Driven Design & shared language
- Several tie the article directly to Domain-Driven Design (DDD): ubiquitous language, early collaboration with domain experts, and modeling domain concepts, not just tables.
- Others use alternate labels like “primitives,” “lego pieces,” or “core conceptual model,” emphasizing that the real power is in inventing or refining domain primitives that reframe the problem space.
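One concrete reading of "domain primitives" is small, self-validating value types that make the core model explicit instead of passing raw floats and strings around. A minimal sketch, using a hypothetical `Money` type (the name and rules are illustrative, not from the thread):

```python
# A toy domain primitive: a self-validating value object that encodes
# domain rules (non-negative amounts, no mixed-currency arithmetic)
# directly in the type rather than scattering checks across callers.
from dataclasses import dataclass

@dataclass(frozen=True)
class Money:
    amount_cents: int   # integer cents avoids float rounding drift
    currency: str

    def __post_init__(self):
        if self.amount_cents < 0:
            raise ValueError("negative amounts need an explicit concept")

    def add(self, other: "Money") -> "Money":
        if other.currency != self.currency:
            raise ValueError("mixed currencies need an explicit concept")
        return Money(self.amount_cents + other.amount_cents, self.currency)

price = Money(995, "USD")
tip = Money(150, "USD")
assert price.add(tip) == Money(1145, "USD")
```

The point the commenters make is that a primitive like this reframes the problem: once it exists, "every amount has a currency" is no longer a convention to remember but a fact of the model.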
Changing the model: possible but expensive
- Multiple stories describe large, painful but successful overhauls of flawed early models, often taking a year+ of focused work.
- Advice: be greedy with subject-matter experts' time, plan migrations carefully (dual-write, log replay), and aim to do this kind of rewrite only once.
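The dual-write pattern mentioned above can be sketched as follows. This is a minimal illustration, not any commenter's actual migration: writes land in both the old and new models while a backfill replays historical records, after which reads can be cut over. All store and function names are invented for the example.

```python
# Minimal dual-write migration sketch. Legacy model: user -> "City, Country"
# string. New model: structured dict per user.
old_store = {}          # legacy store
new_store = {}          # new store being migrated to
write_log = []          # append-only log of writes, usable for replay

def to_new_model(location: str) -> dict:
    """Reshape the legacy string into the new structured model."""
    city, _, country = location.partition(", ")
    return {"city": city, "country": country}

def write(user: str, location: str) -> None:
    """Dual-write phase: every new write lands in both models and the log."""
    old_store[user] = location
    new_store[user] = to_new_model(location)
    write_log.append((user, location))

def backfill() -> None:
    """Replay pre-migration records that only exist in the old store."""
    for user, location in old_store.items():
        new_store.setdefault(user, to_new_model(location))

# Data written before the migration started:
old_store["ada"] = "London, UK"

write("grace", "Arlington, USA")    # dual-write for live traffic
backfill()                          # replay for historical data

assert new_store["ada"] == {"city": "London", "country": "UK"}
assert new_store["grace"]["country"] == "USA"
```

The expensive part the stories describe is not this mechanism but everything around it: keeping both models correct for months while every feature is built twice.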
Architecture and modeling mistakes
- Some lament over-engineered microservice/domain splits that should have been a single service, noting that “subdomains” should be business, not engineering, boundaries.
- There’s a debate on pushing business rules into the database (stored procedures) vs keeping them in application code; one side praises centralization, the other warns about tight coupling and organizational gridlock.
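The two camps in that debate can be shown side by side with the same rule enforced in both places. A compressed sketch using SQLite and an invented `orders` schema (a `CHECK` constraint stands in for richer in-database logic like stored procedures):

```python
# The same rule ("quantity must be positive") enforced twice:
# once in the database, once in application code.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE orders (
        id INTEGER PRIMARY KEY,
        quantity INTEGER NOT NULL CHECK (quantity > 0)  -- rule in the DB
    )
""")

def insert_order(quantity: int) -> None:
    """Rule in application code: validate before touching the DB."""
    if quantity <= 0:
        raise ValueError("quantity must be positive")
    conn.execute("INSERT INTO orders (quantity) VALUES (?)", (quantity,))

insert_order(3)

# The DB-side check also catches callers that bypass the application layer:
try:
    conn.execute("INSERT INTO orders (quantity) VALUES (0)")
except sqlite3.IntegrityError:
    pass  # rejected by the database regardless of which client wrote it
```

The centralization argument is visible here: the database check binds every client, while the application check only binds code that goes through `insert_order`. The coupling argument is the flip side: changing the DB-side rule now requires a schema migration coordinated across every team that touches the table.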
Data model vs domain model vs implementation
- Several argue the article really describes a domain/conceptual model, not a database schema, and that conflating these is a “near miss.”
- Others broaden “data model” to include the organization’s shared conceptual center, not just physical storage.
- A separate thread notes more traditional data‑model concerns (relational design, star vs snowflake, normalization/denormalization) as another foundational layer.
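Those traditional concerns can be made concrete with one small example: the same sales data held denormalized (one wide table) and normalized star-style (a fact table referencing a dimension table). Tables and values are made up for illustration.

```python
# Denormalized vs star-style layout of the same data, in SQLite.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- Denormalized: product attributes repeated on every row.
    CREATE TABLE sales_wide (product_name TEXT, category TEXT, amount REAL);
    INSERT INTO sales_wide VALUES ('widget', 'hardware', 9.5),
                                  ('widget', 'hardware', 4.0);

    -- Star-style: facts reference a single copy of each dimension row.
    CREATE TABLE dim_product (id INTEGER PRIMARY KEY,
                              name TEXT, category TEXT);
    CREATE TABLE fact_sales (product_id INTEGER REFERENCES dim_product(id),
                             amount REAL);
    INSERT INTO dim_product VALUES (1, 'widget', 'hardware');
    INSERT INTO fact_sales VALUES (1, 9.5), (1, 4.0);
""")

# Both layouts answer the same question; the star form pays a join
# but stores each product attribute exactly once.
wide = conn.execute("SELECT SUM(amount) FROM sales_wide").fetchone()[0]
star = conn.execute("""
    SELECT SUM(f.amount) FROM fact_sales f
    JOIN dim_product p ON p.id = f.product_id
    WHERE p.category = 'hardware'
""").fetchone()[0]
assert wide == star == 13.5
```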
AI, flexibility, and skepticism
- Some see AI as a tool to map old models to new ones or to support multiple “views” (facts vs perspectives).
- Others argue good graph/triple‑store thinking can avoid being locked into one view in the first place.
- A few suggest future “data-driven” ecosystems with open/shared schemas; others are wary of “data model” turning into a vague buzzword driven by management fashion.
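The graph/triple-store idea above can be sketched in a few lines: facts live as (subject, predicate, object) tuples, and any particular "view" is derived on demand rather than baked into the schema. A toy example with invented data:

```python
# A toy in-memory triple store: one set of facts, multiple derived views.
triples = {
    ("alice", "lives_in", "paris"),
    ("alice", "works_on", "billing"),
    ("bob", "lives_in", "paris"),
}

def view_of(subject: str) -> dict:
    """One perspective: collapse a subject's triples into a record."""
    record = {}
    for s, p, o in triples:
        if s == subject:
            record.setdefault(p, []).append(o)
    return record

def who_lives_in(city: str) -> set:
    """Another perspective over the same facts, no remodeling needed."""
    return {s for s, p, o in triples if p == "lives_in" and o == city}

assert view_of("alice")["lives_in"] == ["paris"]
assert who_lives_in("paris") == {"alice", "bob"}
```

This is the "facts vs perspectives" distinction in miniature: the triples are the facts, and each query function is one perspective that can be added or discarded without touching the stored data.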