
Best Practices for a multi environment deployment using Jenkins

Submitted by: @import:stackexchange-devops
Tags: jenkins, multi-environment, deployment, best-practices

Problem

I have 3 environments, each in its own Virtual Network with its own configuration. Do I need three separate instances of Jenkins to do Continuous Deployment to each of the environments? What are some best practices regarding deployment on a multi-environment architecture?

Solution

This is a very good question, as it is a common anti-pattern to couple CI/CD tools with environments.

Jenkins is a build factory. As such, it should be totally agnostic of the notion of environment, or even delivery.

The best practice is to have some sort of staging process and/or interface (if you can afford it: dedicated delivery software).
Staging process

You should create a staging job for each environment, in which you define that environment's configuration and bundle it with the final package.

Next you'll have delivery jobs for each environment.
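A staging job like the one described above can be sketched in plain shell. This is a minimal, illustrative sketch: all file names, the environment name, and the config layout are assumptions, and the first block only fabricates demo inputs that a real pipeline would get from the Jenkins build and the config repository.

```shell
#!/bin/sh
# Sketch of a per-environment staging job (all names illustrative).
# Input: the artifact built once by Jenkins, plus this environment's config.
# Output: a self-contained bundle that a delivery job can ship as-is.
set -eu

ENV_NAME="staging"

# Demo setup only: stand in for the Jenkins build output and the config repo.
mkdir -p "config/${ENV_NAME}"
echo "db_host=staging-db.example.internal" > "config/${ENV_NAME}/app.conf"
echo "fake binary" > app.bin
tar -czf app-build.tar.gz app.bin

# The actual staging step: bundle the environment config with the package.
STAGE_DIR="stage/${ENV_NAME}"
mkdir -p "${STAGE_DIR}"
tar -xzf app-build.tar.gz -C "${STAGE_DIR}"
cp "config/${ENV_NAME}/app.conf" "${STAGE_DIR}/app.conf"
tar -czf "bundle-${ENV_NAME}.tar.gz" -C "${STAGE_DIR}" .

echo "staged bundle-${ENV_NAME}.tar.gz"
```

The key point is that the bundle is self-contained: the delivery job for that environment no longer needs to know where the configuration came from.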
Delivery tools

To create the staging and delivery jobs, you may use very basic scripting tools, like shell, but there are better-suited tools, like Ansible, Puppet, Chef...

If you can afford the expense, you may consider investing in deployment software (I'll mention some that I worked with in the comments section).

Note that such software typically manages some kind of environment staging process of its own.

Obviously, it makes a lot of sense to combine deployment software and scripting.
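For the shell end of that spectrum, here is a sketch of what a basic delivery job could look like. It is an assumption-laden illustration (a real job would run over SSH or be an Ansible playbook): it unpacks a staged bundle into a timestamped release directory and flips a `current` symlink, so rollback is just re-pointing the link. The demo setup block fabricates the bundle a staging job would normally provide.

```shell
#!/bin/sh
# Sketch of a shell-based delivery job (names illustrative).
set -eu

BUNDLE="bundle-staging.tar.gz"
RELEASES="releases"

# Demo setup only: fabricate a staged bundle.
echo "fake app" > app.bin
tar -czf "${BUNDLE}" app.bin

# Delivery: unpack into a fresh versioned release directory...
RELEASE="${RELEASES}/$(date +%Y%m%d%H%M%S)"
mkdir -p "${RELEASE}"
tar -xzf "${BUNDLE}" -C "${RELEASE}"

# ...then atomically switch the "current" pointer to it.
ln -sfn "${RELEASE}" current

echo "delivered $(readlink current)"
```

A dedicated deployment tool gives you the same release/rollback mechanics, plus inventory, auditing, and orchestration across hosts.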
Beyond segregating tools and environment

Since you asked for best practices, it is worth mentioning another commonly found anti-pattern: coupling SCM tools and the delivery process.

It is a very good practice to store environment configuration (not passwords or confidential information, of course) and go-live scripts in SCM tools (like SVN or Git...). It is, however, a very bad practice to check out environment configuration and go-live scripts DURING the go-live: the SCM server might simply be unavailable when you need it.

This check-out phase should be part of the staging process that I mentioned before.
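Concretely, the staging job can snapshot the configuration out of SCM ahead of time, so the go-live never touches the SCM server. A minimal sketch with Git (the repository, file names, and config content are all illustrative; the first block builds a throwaway local repo standing in for the real one):

```shell
#!/bin/sh
# Sketch: do the SCM check-out at staging time, not at go-live.
set -eu

# Demo setup only: a local git repo standing in for the config repository.
git init -q config-repo
( cd config-repo \
  && echo "db_host=prod-db.example.internal" > app.conf \
  && git add app.conf \
  && git -c user.email=ci@example.com -c user.name=ci commit -qm "config" )

# Staging-time check-out: export a snapshot of the config into the bundle
# directory. From here on, go-live only reads from stage/ -- never from git.
mkdir -p stage
git -C config-repo archive HEAD | tar -x -C stage

echo "config frozen at revision $(git -C config-repo rev-parse --short HEAD)"
```

Because the snapshot is frozen into the bundle, an SCM outage at go-live time can no longer block or corrupt a delivery.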
Idempotence

Another best practice is that your scripts should be idempotent: you should be able to run them once to perform the staging and delivery, then run them again and again, and they would not change the state of your system unless something changed in the configuration.
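The idea can be sketched in shell with a single config-deployment step (paths and content are illustrative, and `deployed/` stands in for a real location like `/etc/myapp/`): the script compares the desired state against the installed state and only acts, and only signals a restart, when they differ.

```shell
#!/bin/sh
# Sketch of an idempotent deploy step: running it twice leaves the
# system unchanged the second time.
set -eu

TARGET="deployed/app.conf"   # stands in for /etc/myapp/app.conf
mkdir -p deployed

# Desired state, rendered from the staged bundle (illustrative content).
printf 'db_host=staging-db.example.internal\n' > app.conf.new

# Only touch the system (and trigger a restart) if something changed.
if [ -f "${TARGET}" ] && cmp -s app.conf.new "${TARGET}"; then
    rm app.conf.new
    echo "unchanged: nothing to do"
else
    mv app.conf.new "${TARGET}"
    echo "updated: service restart required"
fi
```

The first run prints the "updated" branch; every subsequent run with the same configuration takes the "unchanged" branch, which is exactly the property you want from staging and delivery scripts.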
Scaling

A final best practice I shall share from direct experience is scaling Jenkins: different teams have different needs and uses of Jenkins. When too many teams share a common Jenkins, there can be contention for resources. The worst case is when one team needs to restart the Jenkins master while another needs to deliver.

The ideal would be to have one Jenkins per team or per group of teams that share the same goal and delivery planning.

Context

StackExchange DevOps Q#4649, answer score: 10
