We have everything ready to begin building our pipeline, starting with unit testing. Unit testing will be handled by Jenkins, a powerful tool that takes some time to configure.
Configuring Jenkins
In this section, we will create a Jenkins job that performs our automated testing and deploys our module to our Puppet master. This is the most complicated section, with many parts; it is also where we put all that continuous integration material into practice in the real world.
There are a couple of Jenkins plugins we will need in order to successfully test our module. Click on Manage Jenkins, then Manage Plugins. Click on the Available Tab and select the following plugins:
– Git Plugin
– SSH Agent Plugin
Jenkins is one of the most popular open-source tools. It must also be one of the most flexible.
You can set up Jenkins to monitor your GitLab project for changes. If Jenkins sees any new commits to the repository, it will execute a build script and run any customized scripts you define. Here is how to set this up.
We now need to give our Jenkins server access to our GitLab repository. To do this, create a user on GitLab named jenkins and give that user Reporter permissions on your repository.
On the Jenkins server, switch from root to the jenkins user:
# su - jenkins
Create an SSH keypair like we did for your user account.
$ ssh-keygen -t rsa
$ cat ~/.ssh/id_rsa.pub
Copy the contents of the public key and add it as an SSH key for the jenkins user on GitLab.
Our first task is to create our Jenkins job. Click on Create New Jobs to start the process.
Name the job the same as your full module name; for our example, that would be "cdbook-sample". Then select Freestyle project as the job type and click OK to create the job. Under Source Code Management, select Git and enter the Git URL of your repository.
Next to Credentials, click on Add and configure the settings like the following:
Click on Add to add the credentials. Jenkins automatically tests the connection and will display an error if there is an issue.
Under Build Triggers, select Poll SCM and leave the schedule empty. This allows our GitLab server to trigger the Jenkins job automatically when an update is pushed.
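With Poll SCM enabled, the Git plugin exposes a notifyCommit endpoint that a GitLab project webhook can call; Jenkins then polls the repository and builds only if something actually changed. A sketch of the webhook URL (the hostname, port, and repository URL are placeholders for your own):

```
http://jenkins.example.com:8080/git/notifyCommit?url=git@gitlab.example.com:devops/cdbook-sample.git
```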
Click on Build Now to start our first build job. Not much will happen at this point, but Jenkins will pull our module repository down to the Jenkins server. You can see this by clicking the Workspace link and looking at what Jenkins has pulled from git.
There is our module ready for testing. But before we can do our automated testing, we need to configure our Jenkins server to run Ruby for us just like we did on our test system.
Configuring Jenkins for Puppet Testing
Follow the same steps as before to install rbenv on our Jenkins server. Be sure to install the same Ruby version as on our development system. When you are done, you should be able to verify the version with ruby -v.
Install the bundler gem
$ gem install bundler
We are now prepared to automate some of our puppet testing using Jenkins.
Create our Build Script
We now need to create the build script that performs the actual testing. Eventually you will have many build jobs across many Git repositories, and if the build script were embedded in each job, updating it would mean editing every Jenkins job. The alternative is to keep the build script in its own repository and have Jenkins clone and run it from within each job. This lets us update the script in one place and affect all the Jenkins jobs for our modules.
Create a new project in GitLab called jenkins-build-script and give the jenkins user permissions on the project. Next, create a new directory on your development system for the script.
$ mkdir -p ~/Development/scripts/jenkins-build-script
Edit the build script, which for this tutorial will be a simple Bash script. You could write it in any language you want, as long as that language is installed on your Jenkins server.
$ cd ~/Development/scripts/jenkins-build-script
$ vim test-puppet.sh
Add the following lines to your test script:
#!/bin/bash
# install the gems to a local subdirectory
bundle install --path=.bundle/gems
Save the file and fix the permissions to make it executable:
$ chmod a+x test-puppet.sh
Create the local commit.
$ git init
$ git add .
$ git commit -m "initial commit"
Copy the git URL from GitLab and add the remote repository.
$ git remote add origin <your GitLab repository URL>
$ git push origin master
Our Jenkins build script is now available on our GitLab server. Edit your Jenkins job to perform the build. Go to your Jenkins job and click on Configure on the left. Under Build, click on Add Build Step.
Select Execute shell and fill in the following lines:
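The build step itself is short: fetch the shared build script and run it in the job workspace. A plausible sketch (the clone URL is a placeholder for your own jenkins-build-script repository); the three lines inside the heredoc are what goes in the Execute shell box, written here to a file so we can syntax-check them without a live GitLab server:

```shell
# Hypothetical "Execute shell" build step contents; the repository
# URL is a placeholder, not a real server.
cat > build-step.sh <<'EOF'
rm -rf build-script
git clone git@gitlab.example.com:devops/jenkins-build-script.git build-script
./build-script/test-puppet.sh
EOF
# sanity-check the step for shell syntax errors
bash -n build-step.sh && echo "build step syntax OK"
```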
Click on the Save button. We are now ready for our first iteration, which is to make sure that our gems install. Click on Build Now. After the build completes, click on the last build link on the left, then click on the Console Output link. You will see that Jenkins has cloned our test script and run it, and that our gems were installed.
Run automated testing
Automated testing is one of the key ways to ensure that your libraries and manifests are meeting your expectations.
Our next iteration adds some automated testing! Edit the local copy of your build script on your development system and add the following line to the end of the script.
bundle exec rake spec
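At this point the complete test-puppet.sh is only a few lines. Here it is recreated locally and syntax-checked (the shebang and set -e are additions of ours for robustness; the spec task assumes your module's Rakefile provides it, as puppetlabs_spec_helper does):

```shell
# Recreate the complete build script locally and sanity-check it.
cat > test-puppet.sh <<'EOF'
#!/bin/bash
set -e
# install the gems to a local subdirectory
bundle install --path=.bundle/gems
# run the puppet-rspec unit tests
bundle exec rake spec
EOF
chmod a+x test-puppet.sh
bash -n test-puppet.sh && echo "syntax OK"
```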
Save and push your update to git.
$ git add .
$ git commit -m "adding rspec testing"
$ git push origin master
And run the Jenkins job again by clicking on Build Now.
Our Puppet rspec tests now run automatically! This is the first step of our continuous deployment pipeline. After unit testing, the next step is to deploy our module to our non-production environment for further testing.
Updating our modules using R10K
If testing is successful, we need to do something with our module. According to our continuous deployment pipeline, the module should be deployed to our nonprod Puppet environment for acceptance testing. Our first step is to configure the MCollective client on our Jenkins server for the jenkins user.
Configure MCollective on the Jenkins Server
On the puppet master, as the peadmin user, create a tarball of the MCollective client configuration:
$ tar cvf mco.tgz .mcollective*
As root, move the tarball to your admin user's home directory:
# cp /var/lib/peadmin/mco.tgz /home/stackadmin/
# chown stackadmin:stackadmin /home/stackadmin/mco.tgz
On the Jenkins server as the Jenkins user:
$ scp stackadmin@puppet.billcloud.local:/home/stackadmin/mco.tgz .
$ tar -xvf mco.tgz
Edit the .mcollective file for the location of your MCollective configuration files:
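The keys to change depend on which MCollective security plugin your deployment uses; for the common SSL plugin the edit is typically just repointing credential paths from peadmin's home to the jenkins user's home. The paths below are assumptions, not exact values:

```
# in ~/.mcollective, update paths such as these to match where the
# tarball was unpacked in the jenkins user's home directory:
plugin.ssl_server_public = /var/lib/jenkins/.mcollective.d/server_public.pem
plugin.ssl_client_private = /var/lib/jenkins/.mcollective.d/jenkins_private.pem
plugin.ssl_client_public = /var/lib/jenkins/.mcollective.d/jenkins_public.pem
```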
Test our configuration by running the following command:
$ mco ping
You should see the following output or something very similar:
Now we need to add the R10K MCollective plugin, as outlined previously in the R10K Configuration section. Test this by running the following command from the Jenkins server as the jenkins user:
$ mco r10k synchronize
Add our module to the control repo
To add our sample module to the control repo, add the following lines to your Puppetfile:
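A typical Puppetfile entry pins the module to its Git repository; the module name and URL below are placeholders for your own sample module:

```
mod 'cdbook_sample',
  :git => 'git@gitlab.example.com:devops/cdbook-sample.git'
```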
Push your changes back to GitLab, so the next R10K run will try to copy our module to the puppet masters.
Make a change to your sample module by editing the metadata.json file and incrementing the point-release version by one. Push your change to Git. Then log in to the Jenkins server, find the build job for your module, and view the console output of the latest build, which should have been triggered by the commit we just pushed to the module's repository.
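As an illustration, the version bump can even be scripted. Here sed edits a throwaway copy of metadata.json; the module name and version numbers are placeholders, and in practice you would edit your module's real file:

```shell
# demo on a throwaway metadata.json; edit your module's real file
cat > metadata.json <<'EOF'
{
  "name": "devops-cdbook_sample",
  "version": "0.1.0"
}
EOF
# bump the point release: 0.1.0 -> 0.1.1
sed -i 's/"version": "0\.1\.0"/"version": "0.1.1"/' metadata.json
grep '"version"' metadata.json
```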
And we should see the update on our compile masters:
Ravindra Savaram is a Technical Lead at Mindmajix.com. His passion lies in writing articles on the most popular IT platforms including Machine learning, DevOps, Data Science, Artificial Intelligence, RPA, Deep Learning, and so on. You can stay up to date on all these technologies by following him on LinkedIn and Twitter.