#personal #automation

6 Things I Learned in 2019

As I work from home, I don't get to spend much time in my car; when I do, though, it's always an excellent opportunity to catch up on podcasts. This morning, I listened to a Virtually Speaking podcast episode in which Frank Denneman talks about the art of learning, describing his experience of picking up PowerCLI to create his PCIe Device NUMA Node Locality script.

There were a lot of things I could relate to in his experience! While listening to the podcast, I wondered: what did I learn in 2019? I must admit it was a productive year for picking up new technologies and tech skills, and I've tried to summarize the six major ones below.


Packer

I had been maintaining my virtual machine templates manually since I started with VMware products (which was not optimal). It was Timo who introduced me to Packer from HashiCorp, and I must admit it makes a real difference. Packer automates the creation of any type of machine image for multiple platforms (vSphere, AWS, etc.) from a single source configuration. I'm only building an Ubuntu 18.04 LTS template with Packer for now (as it's the only one I require), but I plan to extend that process to Windows, CentOS, PhotonOS, and so on.
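To give you a feel for what a template looks like, here's a minimal sketch of a Packer JSON template for a vSphere build. Everything in it is a placeholder — the server name, credentials, ISO path, and VM name are assumptions for illustration, and the exact builder options vary by Packer and builder version.

```json
{
  "builders": [
    {
      "type": "vsphere-iso",
      "vcenter_server": "vcsa.lab.local",
      "username": "administrator@vsphere.local",
      "password": "{{user `vcenter_password`}}",
      "insecure_connection": "true",
      "vm_name": "tpl-ubuntu-1804",
      "guest_os_type": "ubuntu64Guest",
      "iso_paths": ["[datastore1] iso/ubuntu-18.04-server-amd64.iso"],
      "ssh_username": "packer",
      "ssh_password": "{{user `ssh_password`}}",
      "convert_to_template": true
    }
  ],
  "provisioners": [
    {
      "type": "shell",
      "inline": ["sudo apt-get update", "sudo apt-get -y upgrade"]
    }
  ]
}
```

A `packer build` against a file like this boots the ISO, provisions the guest over SSH, and leaves a template behind — the same source file can later grow additional builders for other platforms.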

If you're interested, you can check my Packer templates repository: https://github.com/cloudmaniac/packer-templates

Additional resource: Packer Curriculum


Terraform

Since 2016, Ansible had been my first choice for cloning virtual machines and for building and configuring infrastructure and systems: for example, I used a playbook to deploy multiple vSphere virtual machines from a template.

However, I switched to Terraform in 2019 for various reasons such as immutable infrastructure, state management, the possibility to quickly destroy resources, and much more.

HashiCorp Terraform allows infrastructure to be expressed as code in a simple, human-readable language called HCL (HashiCorp Configuration Language). Terraform uses this language to provide an execution plan of changes, which can be reviewed for safety and then applied to make changes.
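As a hedged illustration of what that looks like against vSphere (the inventory names below — datacenter, datastore, network, template — are placeholders, and the exact arguments depend on the provider version), cloning a template into a VM is roughly:

```hcl
# Hypothetical sketch: clone a vSphere template into a new VM.
data "vsphere_datacenter" "dc" {
  name = "Lab-DC"
}

data "vsphere_datastore" "ds" {
  name          = "datastore1"
  datacenter_id = data.vsphere_datacenter.dc.id
}

data "vsphere_resource_pool" "pool" {
  name          = "Cluster-01/Resources"
  datacenter_id = data.vsphere_datacenter.dc.id
}

data "vsphere_network" "net" {
  name          = "VM Network"
  datacenter_id = data.vsphere_datacenter.dc.id
}

data "vsphere_virtual_machine" "template" {
  name          = "tpl-ubuntu-1804"
  datacenter_id = data.vsphere_datacenter.dc.id
}

resource "vsphere_virtual_machine" "vm" {
  name             = "web-01"
  resource_pool_id = data.vsphere_resource_pool.pool.id
  datastore_id     = data.vsphere_datastore.ds.id
  num_cpus         = 2
  memory           = 4096
  guest_id         = data.vsphere_virtual_machine.template.guest_id

  network_interface {
    network_id = data.vsphere_network.net.id
  }

  disk {
    label = "disk0"
    size  = data.vsphere_virtual_machine.template.disks[0].size
  }

  clone {
    template_uuid = data.vsphere_virtual_machine.template.id
  }
}
```

Running `terraform plan` then shows exactly what would be created, changed, or destroyed before `terraform apply` touches anything — that review step is a big part of the appeal.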

I learned so much in a year! Terraform can seem very easy at first, when you're creating a few resources, but it quickly becomes more complicated.

If you're interested, you can check a repository I published recently, which provides an example of how to clone a vSphere template into one or multiple virtual machines using Terraform: https://github.com/cloudmaniac/terraform-deploy-vmware-vm

Additional resource: Terraform Curriculum.


Git

In late 2018, I wondered why I was not yet familiar with Git. ¯\_(ツ)_/¯ Of course, I was able to clone a repository, but that was about it. After a few evenings of learning and testing (the hard way) how Git works, I no longer need to Google how to write my .gitignore files, how to make commits, and so on.

Of course, I don't know by heart the advanced concepts I don't need daily (such as branching), but what's important is that I understand how Git works.
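Those everyday basics can be sketched in one short session (the identity and ignore patterns below are hypothetical placeholders; it assumes `git` is installed):

```shell
# Create a throwaway repository to practice the basic workflow.
repo=$(mktemp -d)
cd "$repo"
git init -q

# Identity for this repo only (placeholder values).
git config user.email "you@example.com"
git config user.name "Your Name"

# A .gitignore keeps generated files out of version control.
printf '%s\n' '*.log' 'public/' '.DS_Store' > .gitignore

# Stage and commit.
git add .gitignore
git commit -q -m "Add .gitignore"

# Review history.
git log --oneline
```

Nothing fancy — but doing this a few dozen times is what finally made the staging area and commit model stick for me.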

There are quite a few resources out there to help you familiarize yourself with Git concepts.


Hugo

I will come back to the reasons that made me switch my blog to Hugo in another post, but I obviously had to learn how to use it, how to import my old articles from WordPress to Markdown, and how new concepts such as archetypes, shortcodes, and templates work.

Hugo is an open-source static site generator written in Go; it builds pages when I create or update my content (unlike systems that dynamically build a page with each visitor request).
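As an example of the shortcode concept (this one is hypothetical, not one of Hugo's built-ins): a small template file dropped under `layouts/shortcodes/` becomes a tag you can call from Markdown.

```html
<!-- layouts/shortcodes/note.html — hypothetical custom shortcode -->
<div class="note">
  {{ .Inner | markdownify }}
</div>
```

In a post, `{{< note >}}Some **Markdown** content.{{< /note >}}` renders that content inside the div — at build time, not per request.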


CI/CD Pipelines

I started looking at CI/CD pipelines because I wanted a sustainable publication process for my blog. One thing led to another, and I ended up creating deployment pipelines for infrastructure as well (mainly for my lab).

I started with GitLab as it was the most open system at that time. I will not dwell on the subject, but I'm now able to use Terraform to build and scale GitLab runners automatically, and to write multi-stage CI/CD pipelines that make sense for my usage.
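To make "multi-stage pipeline" concrete, here is a hedged sketch of the kind of `.gitlab-ci.yml` I mean for a Hugo blog. The container images and the `$BUCKET` variable are assumptions for illustration, not my exact setup:

```yaml
# Hypothetical two-stage pipeline: build the site, then deploy it to S3.
stages:
  - build
  - deploy

build:
  stage: build
  image: klakegg/hugo:latest      # any image shipping the hugo binary would do
  script:
    - hugo --minify
  artifacts:
    paths:
      - public                    # hand the generated site to the next stage

deploy:
  stage: deploy
  image:
    name: amazon/aws-cli
    entrypoint: [""]              # override the image entrypoint for script mode
  script:
    - aws s3 sync public/ "s3://$BUCKET" --delete
  only:
    - master
```

The artifact hand-off between stages is what makes the split worthwhile: the deploy job never needs Hugo, and the build job never needs AWS credentials.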

I also had a look at GitHub's CI/CD offering: although the concept is the same, the implementation (GitHub Actions) is a little different.
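For comparison, a hypothetical GitHub Actions workflow doing the same job might look like this (the action versions, secret names, and the assumption that `hugo` is available on the runner are all illustrative):

```yaml
# Hypothetical .github/workflows/deploy.yml
name: deploy
on:
  push:
    branches: [master]

jobs:
  build-deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - name: Build site
        run: hugo --minify          # a setup action for Hugo would typically precede this
      - name: Sync to S3
        run: aws s3 sync public/ "s3://$BUCKET" --delete
        env:
          AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
          AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
```

Same idea, different vocabulary: jobs and steps instead of stages, and reusable "actions" instead of container images per job.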


AWS

I had used some AWS services here and there since 2008 (e.g., to store an ISO on S3 or to quickly spin up an EC2 instance to test something), but nothing more. I decided to learn more on that topic for two main reasons:

  • My new blog based on Hugo is hosted on AWS (a mix of Route 53, ACM, CloudFront, S3, and CloudWatch).
  • I required AWS services for a pet project I had last year.

Now, I'm comfortable with S3, EC2, VPC, Lambda, API Gateway, CloudFront, CloudWatch, IAM, and other secondary services; and of course, I build everything on AWS using Terraform. 😉
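To give a flavor of that, here's a hedged Terraform sketch of an S3 bucket configured for static website hosting, in the 2019-era AWS provider syntax. The bucket name is a placeholder, and the CloudFront, ACM, and Route 53 pieces are omitted for brevity:

```hcl
provider "aws" {
  region = "us-east-1"
}

# S3 bucket serving the generated Hugo site (placeholder name).
resource "aws_s3_bucket" "blog" {
  bucket = "blog.example.com"
  acl    = "public-read"

  website {
    index_document = "index.html"
    error_document = "404.html"
  }
}

output "website_endpoint" {
  value = aws_s3_bucket.blog.website_endpoint
}
```

In a real setup, the website endpoint above would become the origin of a CloudFront distribution, with the DNS record and TLS certificate managed in the same configuration.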

Yes, but how?

Here are some pieces of advice for successfully learning something new (some were also mentioned in the podcast):

  • Don't try to learn something just because: learning something new should be driven by a use case. Frank had a specific use case, and he had to learn PowerCLI to be successful in his research.
  • You can't learn by just reading blog articles or attending conferences; you have to experience it, to test it (and to fail multiple times before succeeding).
  • It's not possible to maintain sustained attention for hours at a time.

And you, what did you learn in 2019?

Photo by Helloquence on Unsplash



Staff Architect & Member of the CTO Ambassador Program at VMware, focusing on NSX and Cloud-Native Applications. He is a double VCDX (DCV and NV, #120), VCDX panelist, frequent VMUG/VMworld speaker and contributor to the community via this blog or social media (follow him on Twitter @woueb).