ericsysmin's DevOps Blog

Converting Python Google.Cloud Objects to JSON Parseable Dictionaries


While writing some Python scripts to handle our infrastructure in GCP, I found that the objects returned by the Google Cloud Python SDK don't convert cleanly into plain Python using __dict__ and json.dumps(), so I had to do some digging. It took a bit of time, but I found that we can use the Python proto library to handle conversion of the Google Cloud objects to JSON. Here’s an example of listing GKE clusters.
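A minimal sketch of that example, assuming the google-cloud-container client (container_v1) and a placeholder project ID:

```python
# Sketch: list GKE clusters and convert the response to a JSON-parseable dict.
# "my-project" is a placeholder; adjust the parent path for your environment.
import json

import proto
from google.cloud import container_v1

client = container_v1.ClusterManagerClient()

# List clusters across all locations in the project.
response = client.list_clusters(parent="projects/my-project/locations/-")

# proto.Message.to_json() serializes the proto-plus object to a JSON string,
# which json.loads() turns into a plain Python dictionary.
clusters = json.loads(proto.Message.to_json(response))
print(json.dumps(clusters, indent=2))
```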

As you can see, proto.Message.to_json(object) gave me JSON-parseable data. I figured someone else could use this, and I wanted to keep a note of it since the solution wasn’t easy to find. Others have found it works for other GCP objects as well.

Other methods were also discussed here: https://github.com/googleapis/python-vision/issues/70

Continue reading...
Configuring Docker Desktop on WSL2


First, you’ll need to install and configure WSL2. To install WSL2 you can use the Microsoft Store or follow these instructions: https://learn.microsoft.com/en-us/windows/wsl/install

Then to install Docker to run on Windows and WSL2 you’ll need to follow these instructions: https://docs.docker.com/desktop/wsl/

During some testing, while trying to simplify my WSL2 environment, I stumbled upon an annoying issue that prevented me from running docker ps. Each time I attempted to run docker ps I’d receive the following error.
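It's the classic Docker socket permission error; the exact wording varies a bit by Docker version, but it looks something like this:

```
permission denied while trying to connect to the Docker daemon socket at unix:///var/run/docker.sock
```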

To get around this issue you’ll need to run the following commands:
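A sketch of those commands, based on the description below (ensure the docker group exists, add your user to it, and give that group access to the socket):

```bash
# Ensure the docker group exists (it may already).
sudo groupadd docker

# Add your current user to the docker group.
sudo usermod -aG docker $USER

# Let the docker group read/write the Docker socket.
sudo chown root:docker /var/run/docker.sock
sudo chmod 660 /var/run/docker.sock
```

You may need to open a new shell (or run newgrp docker) for the group membership to take effect.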

Once those have run, you should be able to run docker without hitting permission errors.

Those commands ensure that the docker group exists and add your existing user to the docker group. They then modify docker.sock so the docker group has access to the socket.

Continue reading...

How to Install Pyenv on macOS

Steps to install Pyenv on macOS

There are a few ways to install Python on macOS: you can install it via Brew, or by using Pyenv. Since I often need to switch between different versions of Python, I’ve decided to move to Pyenv. Prior to these steps I removed all versions of Python installed directly with Brew.

1. Update Brew and install prerequisites
We will need to update brew.
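That’s simply:

```bash
brew update
```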

In some cases, when installing Python >=3.12.1, we will need ncurses. If it’s missing, you can install it using:
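```bash
brew install ncurses
```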

2. Install Pyenv using brew

The recommended way to install pyenv on macOS is with brew.
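```bash
brew install pyenv
```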

3. Brew doctor fix

If you want to avoid the brew doctor warning about “config” scripts existing outside the system or Homebrew directories, include the following in your shell.
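The pyenv README suggests an alias along these lines, which strips pyenv’s shims directory out of PATH whenever brew runs (treat the exact form as an assumption for your shell setup):

```bash
alias brew='env PATH="${PATH//$(pyenv root)\/shims:/}" brew'
```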

4. Configure your Zsh profile.

If you wish to use Pyenv in non-interactive shells, add the following:
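A sketch of that configuration, following the pyenv README; the same lines go in ~/.zshrc for interactive shells, and in ~/.zprofile (or ~/.zlogin) if you also want login and non-interactive shells covered:

```zsh
# Put pyenv itself on PATH and enable its shims and shell integration.
export PYENV_ROOT="$HOME/.pyenv"
[[ -d $PYENV_ROOT/bin ]] && export PATH="$PYENV_ROOT/bin:$PATH"
eval "$(pyenv init -)"
```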

5. Restart shell
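For example:

```bash
exec "$SHELL"
```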

6. Install python 3.12

I am going to show how to install Python 3.12, but you can select any version you like.
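For example, picking a patch release from the versions pyenv can build:

```bash
pyenv install --list | grep '3\.12'
pyenv install 3.12.1
```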

7. Switch between your python versions

pyenv shell <version> – modifies python for the current shell session

pyenv local <version> – modifies the python used in the current directory (or subdirectories)

pyenv global <version> – modifies the python used for your user account
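For example, to make 3.12.1 the default for your user and confirm the switch:

```bash
pyenv global 3.12.1
pyenv versions    # the active version is marked with an asterisk
python --version
```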


Continue reading...

How to Install Pyenv on Ubuntu 22.04

Because distribution repositories are slow to pick up specific versions of Python (or never do), I’ve decided to move some of my environments over to Pyenv, which lets me dynamically install and configure Python for each environment. As it turns out, this also allows VS Code to let me choose the version of Python I’d like to use when testing. So, here’s a quick guide to installing Pyenv on Ubuntu 22.04.

Steps to install Pyenv on Ubuntu 22.04

1. Update and Install Dependencies

We need to ensure our package cache is updated, and then install the dependencies required to download and build Python with Pyenv.
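A sketch of those commands, using the build dependencies the pyenv wiki recommends for Ubuntu:

```bash
sudo apt update
sudo apt install -y build-essential libssl-dev zlib1g-dev libbz2-dev \
  libreadline-dev libsqlite3-dev curl git libncursesw5-dev xz-utils \
  tk-dev libxml2-dev libxmlsec1-dev libffi-dev liblzma-dev
```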

2. Install Pyenv using pyenv-installer
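pyenv-installer provides a one-line bootstrap script:

```bash
curl -fsSL https://pyenv.run | bash
```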

3. Configure user profile to use pyenv

Ensure the following is in your ~/.bash_profile (if it exists), ~/.profile (for login shells), ~/.bashrc (for interactive shells), or ~/.zshrc.
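The lines in question, per the pyenv README:

```bash
export PYENV_ROOT="$HOME/.pyenv"
[[ -d $PYENV_ROOT/bin ]] && export PATH="$PYENV_ROOT/bin:$PATH"
eval "$(pyenv init -)"
```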

Optionally enable pyenv-virtualenv
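If you use the pyenv-virtualenv plugin (pyenv-installer installs it for you), add one more line:

```bash
eval "$(pyenv virtualenv-init -)"
```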

4. Reload your profile
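For example:

```bash
exec "$SHELL"
# or, depending on which file you edited:
source ~/.bashrc
```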

5. Install python using pyenv
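For example, to install a 3.12 release (any version from pyenv install --list works):

```bash
pyenv install 3.12.1
```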

6. Set your python version

pyenv shell <version>  — select just for current shell session
pyenv local <version>  — automatically select whenever you are in the current directory (or its subdirectories)
pyenv global <version>  — select globally for your user account
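For example, to pin a (hypothetical) project directory to 3.12.1; pyenv local writes a .python-version file that pyenv then picks up automatically in that directory:

```bash
cd ~/projects/my-app    # hypothetical project path
pyenv local 3.12.1
cat .python-version
```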

7. Validate your installation of python
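Ask the selected interpreter for its version (the exact command in the original may differ):

```bash
python --version
```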

or
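ask pyenv which version is active and where that setting came from:

```bash
pyenv version
```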


Continue reading...
Ansible Collections: Automating the Release Process to Galaxy


Since we are moving to Ansible Collections, some things are changing. When you create or update your collection, Ansible Galaxy will no longer automatically discover it via a webhook. For Galaxy to know about your collection, you now have to upload a tar.gz file that contains the result of the ansible-galaxy collection build command.

However, many of us may still want to automate that process, and with @geerlingguy‘s help I was able to fully automate the release process, not just by tagging a release but by creating a release as we would before. So how does this work?

Creating the Build Directory

First, we need to create the build/  directory and include a couple of files.
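The resulting layout looks roughly like this:

```
build/
├── galaxy_deploy.yml
└── templates/
    └── galaxy.yml.j2
```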

Instead of having a galaxy.yml file in our root, we will need to generate the file when we execute the playbook.

This is the galaxy_deploy.yml  playbook.
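A sketch of what such a playbook can look like, given the description above; the namespace and collection names, the github_tag variable, and the environment-variable token lookup are assumptions rather than necessarily what the original uses:

```yaml
# build/galaxy_deploy.yml -- a sketch; task details are assumptions, not the original.
---
- hosts: localhost
  connection: local
  gather_facts: false
  vars:
    # github_tag is passed in from the workflow as the full ref, e.g. refs/tags/1.2.3
    tag: "{{ github_tag.split('/')[-1] }}"
  tasks:
    - name: Template galaxy.yml into the collection root
      template:
        src: templates/galaxy.yml.j2
        dest: "{{ playbook_dir }}/../galaxy.yml"

    - name: Build the collection artifact
      command: ansible-galaxy collection build --force
      args:
        chdir: "{{ playbook_dir }}/.."

    - name: Publish the collection to Ansible Galaxy
      command: >
        ansible-galaxy collection publish
        {{ playbook_dir }}/../mynamespace-mycollection-{{ tag }}.tar.gz
        --api-key={{ lookup('env', 'ANSIBLE_GALAXY_TOKEN') }}
```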

You’ll then need to create a build/templates folder, and create the galaxy.yml.j2  file within the templates folder.
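A sketch of the template, with placeholder values (only {{ tag }} is a real variable rendered by the playbook):

```yaml
# build/templates/galaxy.yml.j2 -- placeholder values; edit everything except {{ tag }}.
namespace: mynamespace
name: mycollection
version: "{{ tag }}"
readme: README.md
authors:
  - Your Name <you@example.com>
description: A short description of your collection
license:
  - GPL-3.0-or-later
tags: []
dependencies: {}
repository: https://github.com/mynamespace/mycollection
```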

Edit the values to fit your Ansible Collection; the only variable I use is {{ tag }}, which will be used later on.

OK, so now that we’ve created the build components, we need to do the automation part. I chose to use GitHub Actions again, as they are the recommended path for the repositories sitting at https://github.com/ansible-collections.

Configuring the GitHub Action Workflow

In your .github/workflows/ folder you’ll need to create a release workflow. To do this I used the following GitHub Actions workflow YAML. I called it release.yml, and it sits at .github/workflows/release.yml; this is an example of what you can use.
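A sketch of what that workflow can look like; the action versions, step names, and the way the token and ref are passed to the playbook are assumptions, but the overall shape matches what’s described below:

```yaml
# .github/workflows/release.yml -- a sketch; action and Python versions are illustrative.
name: Release collection to Ansible Galaxy

on:
  release:
    types: [created]

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - name: Check out the code
        uses: actions/checkout@v4

      - name: Set up Python 3.8
        uses: actions/setup-python@v5
        with:
          python-version: '3.8'

      - name: Install the latest pip and Ansible
        run: |
          python -m pip install --upgrade pip
          pip install ansible

      - name: Build and publish the collection
        env:
          ANSIBLE_GALAXY_TOKEN: ${{ secrets.ANSIBLE_GALAXY_TOKEN }}
        run: >
          ansible-playbook build/galaxy_deploy.yml
          -e "github_tag=${{ github.ref }}"
```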

Using the on value, we can set the workflow to execute only when a release is created in GitHub. This ensures we have a GitHub ref to use with the playbook, and it also keeps your Ansible Galaxy releases in sync with your GitHub releases.

You may have noticed a key here that provides our Ansible Galaxy token, ${{ secrets.ANSIBLE_GALAXY_TOKEN }}. For us to use this token, we need to get it from Ansible Galaxy and add it to our repository secrets. You can find your key at https://galaxy.ansible.com/me/preferences under API Key.

Ansible Galaxy Preferences Page

Within the GitHub repo, go to Settings -> Secrets.
GitHub Settings Page

Then, on that page, add a new secret and name it ANSIBLE_GALAXY_TOKEN.

GitHub Secrets Page

Now when the workflow runs, it will grab this secret from your repository and be able to authenticate to Ansible Galaxy.

The on section tells GitHub Actions to run this workflow only when a release is created. That is done in the GitHub UI, just like you did in the past to release a new version of a role.

The steps section:

  • checks out the code
  • configures python 3.8 on the host
  • installs the latest version of python pip
  • installs ansible
  • then runs the playbook with the github.ref  value from the GitHub Release action

Once this is done you will have the release version uploaded automatically to your Ansible Galaxy account.

Continue reading...