(Final Manuscript, April 15, 2003) © Eric Evans, 2003 19
4. The model was distilled. Important concepts were added to the model as it became more complete,
but equally important, concepts were dropped when they didn’t prove useful or central. When an
unneeded concept was tied to one that was needed, a new model was found that distinguished the
essential concept so that the other could be dropped.
5. Knowledge crunching. The language, combined with sketches and a brainstorming attitude, turned
our discussions into laboratories of the model, in which hundreds of experimental variations could
be exercised, tried, and judged.
It is this last point, knowledge crunching, that makes it possible to find a knowledge-rich model and ways
to distill it. It requires the creativity of brainstorming and massive experimentation.
Financial analysts crunch numbers. They sift through reams of detailed figures, combining and
recombining them, looking for the underlying meaning and for a simple presentation that brings out
what is really important – an understanding that can be the basis of a financial decision.
Effective domain modelers are knowledge crunchers. They take a torrent of information and probe for the
relevant trickle. They try one organizing idea after another, searching for the simple view that makes sense
of the mass. Many models are tried and rejected or transformed. Success comes in an emerging set of
abstract concepts that make sense of all the detail. This distillation is a rigorous expression of the particular
knowledge that has been found most relevant.
Knowledge crunching is not a solitary activity. A team of developers and domain experts collaborates,
typically led by developers. Together they draw in information and crunch it into a useful form. The raw
material comes from the minds of domain experts, from users of existing systems, from the prior
experience of the technical team with a related legacy system or another project in the same domain. It
comes in the form of documents written for the project or used in the business, and lots and lots of talk.
Early versions or prototypes feed experience back into the team and change earlier interpretations.
In the old waterfall method, the business experts talked to the analysts, analysts digested and abstracted and
passed the result along to the programmers, who coded the software. This approach fails because it lacks
feedback. The analysts have full responsibility for creating the model based only on input from the business
experts. They have no opportunity to learn from the programmers or gain experience with early versions.
Knowledge trickles in one direction, but does not accumulate.
Other projects have iteration, but don’t build up knowledge because they don’t abstract. They get the
experts to describe a desired feature and they go build it. They show the experts the result and ask them
what to do next. If the programmers practice refactoring, they can keep the software clean enough to
continue extending it, but if the programmers are not interested in the domain, they only learn what the
application should do, not the principles behind it. Useful software can be built that way, but the project
will never gain the kind of leverage where powerful new features unfold as corollaries to older features.
Good programmers will naturally start to abstract and develop a model that can do more work. But when
this happens only in a technical setting, without collaboration with domain experts, the concepts are naïve.
That shallowness of knowledge produces software that does a basic job but lacks a deep connection to the
domain expert’s way of thinking.
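A small sketch may make the contrast concrete. Consider a hypothetical shipping application with an overbooking rule (all names here — Voyage, OverbookingPolicy, the 10% allowance — are illustrative, not drawn from any particular project). The naive version buries the rule in procedural code; the distilled version names it as a concept a domain expert would recognize.

```python
class Voyage:
    """A scheduled voyage with limited cargo capacity (illustrative name)."""
    def __init__(self, capacity):
        self.capacity = capacity  # cargo capacity in standard units
        self.booked = 0           # units already confirmed

# Naive version: the business rule hides as a magic number inside
# procedural code, invisible to anyone reading the model.
def naive_make_booking(voyage, size):
    if voyage.booked + size <= voyage.capacity * 110 // 100:
        voyage.booked += size
        return True
    return False

# Distilled version: the overbooking allowance becomes an explicit,
# named concept. Experts can point at it, argue about it, and ask for
# variations (per-route or seasonal allowances) that then unfold as
# corollaries of the existing model rather than as rewrites.
class OverbookingPolicy:
    def __init__(self, allowance_percent=10):
        self.allowance_percent = allowance_percent

    def is_allowed(self, voyage, size):
        limit = voyage.capacity * (100 + self.allowance_percent) // 100
        return voyage.booked + size <= limit

def make_booking(voyage, size, policy):
    if policy.is_allowed(voyage, size):
        voyage.booked += size
        return True
    return False
```

Both versions accept and reject the same bookings today; the difference is that the second version carries the domain expert’s way of thinking in its structure, so the next conversation about booking rules has a place to land.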
The interaction between team members changes as all members crunch the model together. The constant
refinement of the domain model forces the developers to learn the important principles of the business they
are assisting, rather than mechanically producing functions. The domain experts often refine their own
understanding by being forced to distill what they know to essentials, and they come to understand the
conceptual rigor software projects require.
All this makes the team more competent knowledge crunchers. They winnow out the extraneous.
They recast the model into an ever more useful form. Because analysts and programmers are feeding
into the model, it is cleanly organized and abstracted, and can provide leverage for the implementation.