Best practice for installing new jenkins server
Problem
When creating a fresh new Jenkins server is it better to install it on a VM or Docker?
I know that in most situations, when you want just one piece of software on a server, the answer would be Docker. But because of all its plugins and their frequent updates, Jenkins is a rapidly changing piece of software, so it feels like it might be the exception.
If you were to use a Docker container, I know you can save all the jobs and configurations on external volumes. How would you go about keeping the updated plugins so you won't have to reinstall them if the container goes down? Is there a better way than Docker commits/images/saves? Those feel like they could quickly become clunky.
Solution
There is no single "best" practice. Use whatever approach you and your team feel comfortable with and fits well with the rest of your infrastructure.
If you want to do the Docker approach, you might find this person's writeup worthwhile.
To answer your questions:
- If you were to use a Docker container, I know you can save all the jobs and configurations on external volumes. How would you go about keeping the updated plugins so you won't have to reinstall them if the container goes down?
It depends. You may want to write a script or configuration management recipe that installs all your plugins for you and/or configures your jobs. You could also build a custom Docker image containing Jenkins, all of your plugins, and all of your job configuration, or combine the two approaches by running those scripts as part of an automated image build. Whatever you do, make sure that every file (e.g. Dockerfiles) and build step (e.g. scripts) involved in the process is stored in configuration management, so that deploys are repeatable.
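As a concrete sketch of the custom-image approach: the official jenkins/jenkins image supports pre-installing plugins at build time from a plugin list via jenkins-plugin-cli. The plugins.txt contents below are hypothetical placeholders; adjust to your own plugin set.

```dockerfile
# Sketch: bake the plugin set into the image at build time, so a
# recreated container comes back up with the same plugins installed.
FROM jenkins/jenkins:lts

# plugins.txt lists one plugin per line, e.g. "git" or "git:5.2.1"
COPY plugins.txt /usr/share/jenkins/ref/plugins.txt
RUN jenkins-plugin-cli --plugin-file /usr/share/jenkins/ref/plugins.txt
```

Jobs and runtime state still live on an external volume, so the image handles plugins while the volume handles data:

```shell
docker build -t my-jenkins .
docker run -d -p 8080:8080 -v jenkins_home:/var/jenkins_home my-jenkins
```

With this split, upgrading a plugin means rebuilding the image and recreating the container, rather than mutating a running container and committing it.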
- Is there a better way than Docker commits/images/saves? Those feel like they could quickly become clunky.
Again, it's really just a matter of what you're comfortable with; there's no such thing as "best". Instead, aim for "it does what I need it to do".
Context
StackExchange DevOps Q#6388, answer score: 3