ericsysmin's DevOps Blog

Ansible Collections: Testing only what’s changed

Previously

When testing roles before GitHub Actions, it was assumed that you'd have one repository for each role. With the addition of collections, that is no longer the case: a collection can contain multiple roles and modules, and you often don't need to test everything when only one role or one set of modules has changed.

Using GitHub Actions, there’s a way to do this.

Now with GitHub Actions

Using GitHub Actions workflows, we can configure which events trigger a test run. In my example, which I use on all of my collections, the tests are triggered only on push and pull_request events.

As you can see in the example below, I configured the tests to run on both push and pull_request. Unfortunately, GitHub Actions doesn't support YAML anchors yet, so the paths lists have to be duplicated.
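Here's a minimal sketch of the trigger section of my zabbix_agent workflow (the actual test job is omitted; the duplicated paths lists are exactly the anchor limitation mentioned above):

  # .github/workflows/zabbix_agent.yml (trigger section only)
  name: zabbix_agent
  on:
    push:
      paths:
        - 'roles/zabbix_agent/**'
        - 'molecule/zabbix_agent/**'
        - '.github/workflows/zabbix_agent.yml'
    pull_request:
      paths:
        - 'roles/zabbix_agent/**'
        - 'molecule/zabbix_agent/**'
        - '.github/workflows/zabbix_agent.yml'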

Why did I choose those paths?

  • 'roles/zabbix_agent/**'  – tells GitHub Actions to watch all of the files under the zabbix_agent role
  • 'molecule/zabbix_agent/**'  – watches all of the files that are part of the Molecule tests for zabbix_agent
  • '.github/workflows/zabbix_agent.yml'  – the workflow file itself, so changes to the test configuration also trigger a run

This configuration ensures the workflow runs only when a file used for testing or executing this role is modified, so you don't waste GitHub Actions time and tests for your other repositories can run sooner. You can find more trigger options in the workflow syntax reference: https://help.github.com/en/actions/reference/workflow-syntax-for-github-actions#on

Ansible Collections: Automating the Release Process to Galaxy

Since we are moving to Ansible Collections, some things are changing. When you create or update a collection, Ansible Galaxy no longer discovers it automatically via a webhook. For Galaxy to know about your collection, you now have to upload the tar.gz file that contains the result of the ansible-galaxy collection build command.
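For reference, the manual process looks roughly like this; the archive name is a placeholder built from your namespace, collection name, and version:

  # Run from the directory that contains galaxy.yml
  ansible-galaxy collection build
  # Upload the resulting archive, either through the Galaxy web UI
  # or with the publish subcommand and your Galaxy API key
  ansible-galaxy collection publish ./mynamespace-mycollection-1.0.0.tar.gz --api-key=<your-galaxy-api-key>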

However, many of us still want to automate that process, and with @geerlingguy‘s help I was able to fully automate the release, not just by tagging a release but by creating a release as we did before. So how does this work?

Creating the Build Directory

First, we need to create the build/  directory and include a couple of files.

Instead of having a galaxy.yml file in our root, we will need to generate the file when we execute the playbook.

This is the galaxy_deploy.yml  playbook.
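Here is a minimal sketch of what build/galaxy_deploy.yml can look like. The extra-var names (github_tag, api_key) and the namespace/collection placeholders are assumptions of mine; adjust them to match your collection:

  ---
  - hosts: localhost
    connection: local
    gather_facts: false
    vars:
      # github_tag arrives as refs/tags/<version>; keep only the last component.
      # Tags are assumed to be plain semver (for example 1.0.0).
      tag: "{{ github_tag.split('/')[-1] }}"
    tasks:
      - name: Template galaxy.yml into the collection root
        template:
          src: templates/galaxy.yml.j2
          dest: "{{ playbook_dir }}/../galaxy.yml"
          mode: '0644'

      - name: Build the collection
        command: ansible-galaxy collection build
        args:
          chdir: "{{ playbook_dir }}/.."

      - name: Publish the collection to Ansible Galaxy
        command: >
          ansible-galaxy collection publish
          {{ playbook_dir }}/../mynamespace-mycollection-{{ tag }}.tar.gz
          --api-key={{ api_key }}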

You’ll then need to create a build/templates folder, and create the galaxy.yml.j2  file within the templates folder.
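A minimal galaxy.yml.j2 might look like the following; everything except {{ tag }} is a placeholder:

  namespace: mynamespace
  name: mycollection
  version: "{{ tag }}"
  readme: README.md
  authors:
    - Your Name <you@example.com>
  description: A short description of your collection
  license:
    - GPL-3.0-or-later
  tags:
    - example
  repository: https://github.com/mynamespace/mycollection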

Edit the values to fit your Ansible collection; the only variable I use is {{ tag }}, which gets set later on.

OK, now that we've created the build components, we need to automate the process. I chose GitHub Actions again, as they are the recommended path for the repositories at https://github.com/ansible-collections.

Configuring the GitHub Action Workflow

In your .github/workflows/ folder you'll need to create a release workflow. I called mine release.yml, so it sits at .github/workflows/release.yml; the following is an example of what you can use.
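This sketch is a hedged reconstruction; the action versions and the extra-var names (github_tag, api_key) are my assumptions and need to match your galaxy_deploy.yml playbook:

  name: Release
  on:
    release:
      types: [created]

  jobs:
    deploy:
      runs-on: ubuntu-latest
      steps:
        - name: Check out the code
          uses: actions/checkout@v2

        - name: Set up Python 3.8
          uses: actions/setup-python@v1
          with:
            python-version: '3.8'

        - name: Install Ansible
          run: |
            python -m pip install --upgrade pip
            pip install ansible

        - name: Build and publish the collection
          run: >-
            ansible-playbook -i 'localhost,' build/galaxy_deploy.yml
            -e "github_tag=${{ github.ref }}"
            -e "api_key=${{ secrets.ANSIBLE_GALAXY_TOKEN }}"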

Using the on value, we set the workflow to execute only when a release is created in GitHub. This ensures we have a GitHub ref to pass to the playbook, and it keeps your Ansible Galaxy releases in sync with your GitHub releases.

Notice the ${{ secrets.ANSIBLE_GALAXY_TOKEN }} expression, which provides our Ansible Galaxy token. To use this token, we need to get it from Ansible Galaxy and add it to our repository secrets. You can find your key at https://galaxy.ansible.com/me/preferences under API Key.

Ansible Galaxy Preferences Page

Within the GitHub repo, go to Settings -> Secrets.
GitHub Settings Page

On that page, add a new secret and name it ANSIBLE_GALAXY_TOKEN.

GitHub Secrets Page

Now when the workflow runs, it will pull this secret from GitHub and use it to authenticate to Ansible Galaxy.

The on section above tells GitHub Actions to run this workflow only when a release is created. Creating a release is done in the GitHub UI, just as you did in the past to release a new version of a role.

The jobs section:

  • checks out the code
  • configures Python 3.8 on the runner
  • installs the latest version of pip
  • installs Ansible
  • runs the playbook, passing in the github.ref value from the GitHub release

Once this is done you will have the release version uploaded automatically to your Ansible Galaxy account.

Multi-distribution Ansible testing with Molecule on Travis-CI

In this post I will cover how to test Ansible roles against multiple distributions using Molecule on Travis-CI. First, of course, you'd need access to Travis-CI and a GitHub repo, but I am going to skip those details and assume you've already figured that part out. Hint: the configuration lives at /git-root/.travis.yml

I am going to cover two scenarios: an existing Ansible role, and a new Ansible role. Both have a few different steps.

Getting Started

What is Molecule?

Molecule is a testing tool that supports multiple instances, roles (including dependent roles), and deployment against multiple operating systems, distributions, and virtualization providers, including cloud platforms. It also lets you use test frameworks to verify end results, and it supports multiple test scenarios, such as different configurations of the same role.

First, you're going to need to install Molecule. For future-proofing's sake, I am not going to duplicate the how-to or the steps, as they can always change. Instead, I'll point you to the Molecule documentation.

https://molecule.readthedocs.io/en/stable/installation.html

That page covers how to install Molecule and its required dependencies. If you have issues, just find #ansible-molecule on Freenode IRC and people will help you.

You'll also need to install Docker. I'm not going to show you how to install Docker either; the process has changed a few times, and for the sake of not messing up your environment too much I'll direct you to Docker's official guide. You can find the steps here:

https://docs.docker.com/install/

Existing Ansible Role

When we are using an existing Ansible role, it's fairly safe to assume you have a "tests" directory. We can essentially toss that one out: delete it, move it, or convert it. It doesn't do much more than run the role locally, and who wants to test by editing their own host? Not me! We have better tools, like Docker, for that.

So, after Molecule and Docker are installed, we can go ahead and get Molecule set up on our existing role.

  1. Navigate to your role's root directory.
  2. In the role directory, initialize a new Molecule scenario (a command sketch follows this list).
  3. This will create a new folder named molecule in your role's root directory, containing the following files.

    These are the default files created by Molecule at the time of writing. You can find out what each of them does in the Molecule documentation: https://molecule.readthedocs.io/en/stable/ You can customize any of them to run specific Python Testinfra tests and other checks against your system.
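A sketch of step 2, using Molecule 2.x-era syntax (the exact flags differ between Molecule releases) and a hypothetical role name:

  # Run from the role's root directory
  molecule init scenario --scenario-name default --role-name my_role_name --driver-name docker

At the time of writing, the generated layout looks roughly like this:

  molecule/
  └── default/
      ├── Dockerfile.j2
      ├── INSTALL.rst
      ├── molecule.yml
      ├── playbook.yml
      └── tests/
          └── test_default.py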

Creating a New Role with Molecule Init

If you are starting from scratch and creating a new role, the process is a bit easier. You'll likely want to use these steps instead of the older ansible-galaxy init role_name process, which still creates the old tests that run locally and doesn't provide the lint tests and other checks we'd want to run to verify the quality of our role.

  1. To create the role we will run the molecule init role command (a command sketch follows this list).
  2. This will create the following directory tree for the role in your roles directory.

    These are the default files created by Molecule at the time of writing. You can find out what each of them does in the Molecule documentation: https://molecule.readthedocs.io/en/stable/ You can customize any of them to run specific Python Testinfra tests and other checks against your system.
  3. After this step, work on your role's tasks, vars, and the other items it needs. Once you're done creating the role, proceed to Configuring Molecule & Travis-CI for Multiple Distribution Testing.
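A sketch of step 1, again with Molecule 2.x-era syntax and a hypothetical role name:

  molecule init role --role-name my_role_name --driver-name docker

The generated tree looks roughly like this:

  my_role_name/
  ├── defaults/
  │   └── main.yml
  ├── handlers/
  │   └── main.yml
  ├── meta/
  │   └── main.yml
  ├── molecule/
  │   └── default/
  │       ├── Dockerfile.j2
  │       ├── INSTALL.rst
  │       ├── molecule.yml
  │       ├── playbook.yml
  │       └── tests/
  │           └── test_default.py
  ├── tasks/
  │   └── main.yml
  ├── vars/
  │   └── main.yml
  └── README.md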

Configuring Molecule & Travis-CI for Multiple Distribution Testing

  1. The first file we want to edit is the molecule.yml file, which provides settings and configuration to Molecule when it runs. Initially it contains the defaults generated by molecule init.

    By default, Molecule runs your tests against the CentOS 7 Docker image. You can add additional platforms here, and they would be tested one after another, but the more you list, the longer the test takes, because platforms are not run in parallel. We want faster tests, so we want to run them in parallel. We are going to change this file so that we can provide some values via Travis-CI; a sketch of the result follows this list.

    Adding ${MOLECULE_DISTRO:-centos:7} lets us pass Molecule a variable that selects which Docker image is used in the testing process. The :-centos:7 part sets centos:7 as the default image; without it, any run without a MOLECULE_DISTRO variable would return an error.
  2. We will sometimes also want to edit the playbook.yml file located in the molecule/default/ directory. By default it contains a plain role declaration.

    However, we may want to pass some parameters to the role, and we can do that as well; a sketch follows this list.

    You can also have different playbook.yml files within the same scenario, or even multiple scenarios, but we can save that for another day.
  3. Next, we need to create the Travis-CI configuration file. Since this file doesn't exist yet, you can use mine (a sketch follows this list); just make sure you replace my_role_name with the name of your role.

    On Travis-CI, this file creates eight different build jobs, each testing a different distribution and version using the corresponding Docker image, and runs them in parallel. Each job gets a copy of the role and executes the Molecule tests against it.
  4. Once your role is properly added and the repository is enabled on Travis-CI, you should see multiple build jobs for each build. Here's a screenshot from an existing role I've built before.
  5. That's it! You're now finished! You've successfully configured Molecule to test against multiple distributions.
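For reference, here are hedged sketches of the three files discussed in steps 1 through 3, assuming a Molecule 2.x-style layout and a hypothetical role named my_role_name.

molecule/default/molecule.yml, with the MOLECULE_DISTRO substitution in place:

  ---
  dependency:
    name: galaxy
  driver:
    name: docker
  lint:
    name: yamllint
  platforms:
    - name: instance
      # Use the image passed in from CI, or fall back to centos:7 locally.
      image: "${MOLECULE_DISTRO:-centos:7}"
  provisioner:
    name: ansible
    lint:
      name: ansible-lint
  verifier:
    name: testinfra
    lint:
      name: flake8

molecule/default/playbook.yml, passing a hypothetical parameter to the role:

  ---
  - name: Converge
    hosts: all
    roles:
      - role: my_role_name
        my_role_some_var: some_value

.travis.yml, with an example set of eight distributions (swap in the ones your role supports; if your repository name differs from the role name, you may also need a step that renames the checkout so Ansible can resolve the role):

  ---
  language: python
  services: docker

  env:
    - MOLECULE_DISTRO=centos:7
    - MOLECULE_DISTRO=centos:8
    - MOLECULE_DISTRO=ubuntu:16.04
    - MOLECULE_DISTRO=ubuntu:18.04
    - MOLECULE_DISTRO=debian:9
    - MOLECULE_DISTRO=debian:10
    - MOLECULE_DISTRO=fedora:29
    - MOLECULE_DISTRO=fedora:30

  install:
    # Molecule 2.x uses the docker Python SDK for its Docker driver;
    # newer releases install it via "molecule[docker]" instead.
    - pip install molecule docker

  script:
    - molecule test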

Deploying Modules as an Ansible Role

As Ansible grows as a solution for much of the DevOps community, many partners and supported modules are hitting the community. However, all of us are being hit by a serious problem: the release gap.

The release gap is the time between your modules being accepted into Ansible core and the moment you release a new feature or patch that you want your customers to use.

At Avi we ran into this problem, and luckily Ansible has a solution: roles. Using a role to deploy your modules helps you control your code and guarantees customers are using the latest tested modules you offer.

To do this, we need a role structure like the following.
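A sketch of what that structure can look like; the role and module names are placeholders that follow the avi_ prefix convention mentioned below:

  my_modules_role/
  ├── library/                  # your modules ship here
  │   ├── avi_example_module.py
  │   └── avi_another_module.py
  ├── meta/
  │   └── main.yml
  ├── tasks/
  │   └── main.yml              # can be a no-op; the role exists to deliver library/
  └── README.md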

Users will then need to include the role in their playbook so the modules become available.
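For example (role and module names are placeholders), including the role makes the modules in its library/ directory available to the rest of the play:

  ---
  - hosts: localhost
    connection: local
    roles:
      # Pulling in the role puts its library/ modules on the module search path.
      - role: mycompany.my_modules_role
    tasks:
      - name: Use a module shipped with the role
        avi_example_module:
          state: present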

However, there are some caveats. Your module names should never collide with modules that someone else, or Ansible core, uses. In our case we stuck with the avi_<module-name> prefix for our modules.

A good way to look at the pipeline is the following.

Using this pipeline you can still submit your modules to Ansible core, but you can also pre-release or release modules on your own schedule and versioning, independent of the Ansible release schedule. You can also allow users to upgrade their environments independently.

Users can then download your modules using the following command.
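It looks something like this; the namespace and role name are placeholders:

  ansible-galaxy install mycompany.my_modules_role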

You can also specify a version, so your end users can control which version of your modules they install.
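For example, pinning a hypothetical 1.2.3 release:

  ansible-galaxy install mycompany.my_modules_role,1.2.3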

The goal of this was to empower partners by letting customers download supported modules without being forced to also upgrade Ansible itself, which can affect other modules that are already working in their playbooks.
