So I had dev-pipeline in a position where it met every one of my current needs: I could fetch sources, I could build, and I could generate dependency graphs. I was happy and my projects ran well.
The problem is that supporting anything besides git and CMake means modifying the sources. Yes, it's really just one line (tools were looked up in Python dictionaries), but it's not great. If I want to make life easy for users, I'm implicitly stuck supporting any tooling they need forever, since it all has to ship together. The same is true if dev-pipeline needs new commands, new executors, new dependency-graph formats, and so on. On the other hand, I could refuse to accept patches for these tools and make users deal with a mishmash of their custom tooling. I'd still be happy, but anybody who didn't use my exact set of tooling would find dev-pipeline useless.
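The post doesn't show the code, but a lookup like the one it describes would have looked something like this sketch (the class and variable names are mine, not dev-pipeline's):

```python
class GitScm:
    """Stand-in for the real git handler; dev-pipeline's actual
    class names may differ."""
    def checkout(self, url):
        print("git clone {}".format(url))


# The extension point was a module-level dictionary like this one, so
# supporting a new tool meant editing the file and shipping a release.
_SCM_TOOLS = {
    "git": GitScm,
}


def get_scm_tool(name):
    # Anything besides git raises KeyError.
    return _SCM_TOOLS[name]()
```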
The trick is figuring out how to meet the needs of other users without being forced to maintain their work. The solution I chose was plugins: if git and CMake are supported through plugins, anybody can add tooling. Once the SCM and build tools were populated by plugins, it wasn't difficult to use plugins for everything; commands, executors, build-order methods, and dependency-resolution algorithms are all extendable by installing extra plugins. The core functionality I'm willing to support is still built in, but there's no special knowledge or hacks; everybody gets the same API, so all tooling is equal.
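The post doesn't name the plugin mechanism, but setuptools entry points are one conventional way to do this in Python; a minimal sketch might look like the following (the group name "devpipeline.scms" is my invention for illustration):

```python
# Discover SCM tools from installed plugins instead of a hard-coded
# dictionary. pkg_resources ships with setuptools.
import pkg_resources


def load_scm_tools():
    """Build the tool dictionary from whatever plugins are installed."""
    tools = {}
    for entry in pkg_resources.iter_entry_points("devpipeline.scms"):
        # entry.load() imports and returns whatever class the plugin
        # registered under this name.
        tools[entry.name] = entry.load()
    return tools
```

Under this scheme the built-in tools register through the same group, so the core gets no special treatment.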
Why does this matter when dev-pipeline has all of three users right now? For one, my day job has lots of repositories that still use Mercurial instead of git; writing a Mercurial plugin didn't take long, so now I can use dev-pipeline with those repositories too. I also contract on the side, and one requirement on that contract is that engineers don't get to build the production code (this prevents "tricks"); I realized that a new command (not sure of the name yet) that builds components and installs them in a distribution-aware way would be useful, and now it's easy to write. Another useful command I missed during initial development is "test" (especially funny, since wanting to build GTest with multiple compilers was the main catalyst for starting dev-pipeline in the first place).
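A Mercurial plugin would then just be an ordinary package that registers itself under the same entry-point group; a hypothetical setup.py could look like this (the package and class names are mine):

```python
# setup.py for a hypothetical "devpipeline-hg" plugin package.
# Installing it makes an "hg" tool visible to the loader sketched
# above without touching dev-pipeline's sources.
from setuptools import setup

setup(
    name="devpipeline-hg",
    version="0.1.0",
    py_modules=["devpipeline_hg"],
    entry_points={
        "devpipeline.scms": [
            "hg = devpipeline_hg:MercurialScm",
        ],
    },
)
```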
Making it easier for other users to leverage dev-pipeline sparked new ideas and use cases for me too. Even if nobody but me ever writes a plugin, the plugin system still makes writing my own easier, and building it pushed me toward a more modular design.
The really cool thing, though, is that I'm just late to the station; the actually talented developers who work with Unix figured this out years ago.
Make each program do one thing well. To do a new job, build afresh rather than complicate old programs by adding new features.
Doug McIlroy, quoted by Eric S. Raymond
So dev-pipeline uses plugins instead of programs, but same difference. In fact, a number of the rules Raymond lists apply really well in this case:
- Rule of Modularity: Write simple parts connected by clean interfaces.
- Rule of Composition: Design programs to be connected to other programs.
- Rule of Separation: Separate policy from mechanism; separate interfaces from engines.
- Rule of Extensibility: Design for the future, because it will be here sooner than you think.
These rules sum up exactly why Unix's design principles have held up for forty years. By applying the same principles in my work, I not only make my own life better but also make it easier for future users to leverage what I've done to meet their needs, and that means we all win.
Those who don’t understand Unix are condemned to reinvent it, poorly.
Henry Spencer