Jenkins is a tool that can help you do lots of crazy stuff, very fast. The question is whether you want it to.
Out of the box, these things are true:
- a workspace is shared between different runs of the same job, and is also easy to peek into from other jobs (often used to “integrate” jobs),
- the configuration of jobs is not version controlled,
- only one repository or source of changes is recorded for a run of a job.
That is, if a job has two sources of changes, you can’t really model that with Jenkins’ jobs: you have to make one source the master and record only its changeset. Re-running the job is then only reproducible as long as the other source doesn’t introduce changes, so that “latest” from that repo still means the same thing.
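To make that concrete, here’s a minimal sketch of the workaround as a declarative Jenkinsfile; the repository URLs and the pinned commit hash are hypothetical:

```groovy
pipeline {
    agent any
    stages {
        stage('Checkout sources') {
            steps {
                // Source A is the "master": Jenkins records its changeset.
                dir('app') {
                    git url: 'https://example.com/app.git', branch: 'main'
                }
                // Source B has to stay pinned by hand, or two runs of this
                // job won't necessarily see the same inputs.
                dir('config') {
                    checkout([$class: 'GitSCM',
                              userRemoteConfigs: [[url: 'https://example.com/config.git']],
                              branches: [[name: 'a1b2c3d4']]])
                }
            }
        }
    }
}
```

Note that the pinning is pure convention: nothing in Jenkins records which commit of the second repository a given run actually used.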
Are there other models?
Yes! The people over at GoCD thought this through, and their model takes a broader perspective than the rather simple one Jenkins provides.
A Pipeline (think of it as one segment of a long chain of pipelines) can have many Materials: Git, Mercurial, NuGet, Maven and Debian package repositories, but also other Pipelines. For every run of a Pipeline, GoCD remembers exactly which inputs were selected from each of these Materials, allowing you to easily schedule an exact re-run. This, of course, helps make your jobs deterministic.
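For illustration, here is a rough sketch of what that could look like with GoCD’s YAML config plugin; the pipeline names, repository URL and stage layout are made up:

```yaml
format_version: 10
pipelines:
  build-app:
    group: example
    materials:
      app:                       # a plain Git material
        git: https://example.com/app.git
        branch: main
      libs:                      # another Pipeline as a material
        pipeline: build-libs
        stage: package
    stages:
      - build:
          clean_workspace: true  # no leftovers from previous runs
          jobs:
            compile:
              tasks:
                - exec:
                    command: make
```

Every run records the exact Git revision of `app` and the exact run of `build-libs` it consumed, which is what makes an exact re-run possible.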
Nothing is perfect
Blue Ocean, a new UI on top of Jenkins, looks great, but it doesn’t change the underlying assumptions, and those aren’t sufficient for Continuous Delivery.
That, essentially, is the bottom line: if all you want is to continuously build your stuff, Jenkins is probably fine. But once your setup grows more complex, Jenkins simply isn’t built to handle that complexity.