Create an Ansible Content Collection and an automation execution environment, and develop a playbook to test the automation execution environment from automation controller.
Outcomes
Create an Ansible Content Collection that includes roles.
Create a custom automation execution environment.
Upload an Ansible Content Collection and an automation execution environment to a private automation hub.
Configure automation controller to use a custom automation execution environment from a private automation hub.
Create automation controller resources such as credentials, projects, and inventories.
Create and launch a new job template using a specific automation execution environment.
As the student user on the workstation machine, use the lab command to prepare your system for this exercise.
This command creates the /home/student/review-cr3/ directory and a Git repository that includes a playbook to test your work.
Use Student@123 as the Git password when you push changes to this remote repository.
[student@workstation ~]$ lab start review-cr3
The /home/student/review-cr3/lab-resources.txt file lists all the resources that you configure during the exercise.
You can copy and paste values from that file as you configure those resources.
To log in to the web UI of private automation hub, use student as the username and redhat123 as the password.
To log in to the web UI of automation controller, use admin as the username and redhat as the password.
Use the following URLs to access those services:
Private automation hub: https://hub.lab.example.com
Automation controller: https://controller.lab.example.com
You can log in to each of the managed computer systems as the student user, with student as the password.
The password for the root user account on those machines is redhat.
Specifications
Create an Ansible Content Collection as follows:
Name the collection review.system.
Initialize a directory for the collection on workstation in the /home/student/review-cr3 directory.
The collection must require Ansible version >=2.9.10.
Add the iscsi_target_create role to the collection.
Copy the role from the /home/student/review-cr3/roles/ directory to the correct location in the collection's directory structure.
The iscsi_target_create role requires ansible.posix collection version 1.0.0 or later.
Add that dependency to the collection.
When you build the review.system collection, copy the resulting compressed tar archive to /home/student/review-cr3.
On your private automation hub, create a namespace for the collection, upload it, and approve it.
The namespace must specify Content Developers as the namespace owners.
Create a custom automation execution environment as follows:
Create the /home/student/review-cr3/ee-build directory and use this directory for the configuration files needed to build a custom automation execution environment.
Example files are in the /home/student/review-cr3/examples/ directory on the workstation machine.
The build environment must be configured to retrieve Ansible Content Collections from your private automation hub. Use the private automation hub web UI to generate an authentication token.
The automation execution environment must be tagged hub.lab.example.com/system/ee-review-rhel8:v1.0.
The automation execution environment must include the review.system, ansible.posix, and community.general Ansible Content Collections.
The automation execution environment must include your private automation hub's TLS CA certificate.
Copy the /home/student/review-cr3/Containerfile and /etc/pki/tls/certs/classroom-ca.pem files to the context/ directory of your build environment.
Use hub.lab.example.com/ee-minimal-rhel8:latest as the base container image and hub.lab.example.com/ansible-builder-rhel8:latest as the builder image.
Publish the execution environment's container image on your private automation hub.
Prepare an Ansible Playbook that tests your review.system Ansible Content Collection and custom automation execution environment, as follows:
The playbook must be named demo.yml.
A partially completed playbook is available in the Git repository at git@git.lab.example.com:student/iscsi.git.
Clone that repository to the /home/student/git-repos directory on workstation.
Create the review3 branch to store your modifications.
In your branch, edit the demo.yml playbook.
The play in that playbook must call the review.system.iscsi_target_create role and set the iscsi_target_create_disk variable to the value vdb.
These changes must be committed to your local repository and your new branch pushed to the remote repository.
Use Student@123 as the Git password when you push changes to the remote repository.
Configure your automation controller with the resources and a job template that you can use to run your playbook to test your Ansible Content Collection and custom automation execution environment, as follows:
In the automation controller web UI, create a machine credential resource that enables access to the managed nodes. Use the following settings:
| Field | Value |
|---|---|
| Name | DevOps |
| Organization | Default |
| Credential Type | Machine |
| Username | devops |
| Password | redhat |
| Privilege Escalation Method | sudo |
| Privilege Escalation Username | root |
Create a credential resource that enables access to the Git repository. Use the following settings:
| Field | Value |
|---|---|
| Name | GitLab |
| Organization | Default |
| Credential Type | Source Control |
| Username | student |
| SCM Private Key | Content of the /home/student/.ssh/gitlab_rsa file. |
Create an inventory resource named Review 3 that belongs to the Default organization.
| Field | Value |
|---|---|
| Name | Review 3 |
| Organization | Default |
Add the serverc.lab.example.com managed node to the Review 3 inventory.
| Field | Value |
|---|---|
| Name | serverc.lab.example.com |
Create a project resource. Use the following settings:
| Field | Value |
|---|---|
| Name | iSCSI |
| Organization | Default |
| Source Control Type | Git |
| Source Control URL | git@git.lab.example.com:student/iscsi.git |
| Source Control Credential | GitLab |
| Options | Allow Branch Override (selected) |
Create an automation execution environment resource for your custom automation execution environment. Use the following settings:
| Field | Value |
|---|---|
| Name | Review 3 |
| Image | hub.lab.example.com/system/ee-review-rhel8:v1.0 |
| Pull | Always pull the container before running. |
| Organization | Default |
| Registry Credential | Automation Hub Container Registry |
Create a job template resource that you can use to run your playbook on the serverc.lab.example.com managed node.
Use the following settings:
| Field | Value |
|---|---|
| Name | Configure iSCSI |
| Inventory | Review 3 |
| Project | iSCSI |
| Execution Environment | Review 3 |
| Source Control Branch | review3 |
| Playbook | demo.yml |
| Credentials | DevOps |
When everything is configured, start a job from the Configure iSCSI job template resource.
To confirm that the job is successful, you can verify that the /etc/target/saveconfig.json file is created on the serverc.lab.example.com machine.
On the workstation machine, in the /home/student/review-cr3/ directory, create an Ansible Content Collection named review.system.
Create the meta/runtime.yml file and set the requires_ansible parameter to >=2.9.10.
Change to the /home/student/review-cr3/ directory and then run the ansible-galaxy collection init command to create the collection:
[student@workstation ~]$ cd ~/review-cr3/
[student@workstation review-cr3]$ ansible-galaxy collection init review.system
- Collection review.system was created successfully
Create the ~/review-cr3/review/system/meta/ directory.
[student@workstation review-cr3]$ mkdir review/system/meta
Create the ~/review-cr3/review/system/meta/runtime.yml file with the following content:
---
requires_ansible: '>=2.9.10'
Copy the iscsi_target_create role to the collection's roles/ directory.
Declare the ansible.posix collection as a dependency.
Build the collection and then copy the resulting .tar.gz file to the /home/student/review-cr3/ directory.
Copy the role that is provided to you in the ~/review-cr3/roles/ directory to the ~/review-cr3/review/system/roles/ directory:
[student@workstation review-cr3]$ ls roles/
iscsi_target_create
[student@workstation review-cr3]$ cp -r roles/iscsi_target_create \
> review/system/roles/
Edit the review/system/galaxy.yml file to declare the ansible.posix collection as a required dependency of your new collection:
...output omitted...
# Collections that this collection requires to be installed for it to be usable. The key of the dict is the
# collection label 'namespace.name'. The value is a version range
# L(specifiers,https://python-semanticversion.readthedocs.io/en/latest/#requirement-specification). Multiple version
# range specifiers can be set and are separated by ','
dependencies:
  ansible.posix: '>=1.0.0'
...output omitted...
In the ~/review-cr3/review/system/ directory, build the collection.
[student@workstation review-cr3]$ cd review/system/
[student@workstation system]$ ansible-galaxy collection build
Created collection for review.system at /home/student/review-cr3/review/system/review-system-1.0.0.tar.gz
Copy the resulting review-system-1.0.0.tar.gz file from the ~/review-cr3/review/system/ directory to the ~/review-cr3/ directory.
[student@workstation system]$ cp review-system-1.0.0.tar.gz ~/review-cr3/
Publish the review.system collection.
Use the private automation hub web UI to create the review namespace.
Upload the review.system collection to that namespace, and approve the collection after you upload it.
Open a web browser and navigate to https://hub.lab.example.com.
Log in with student as the username and redhat123 as the password.
To create the namespace, navigate to Collections → Namespaces and then click Create.
Create the namespace by using the following information and then click Create.
| Field | Value |
|---|---|
| Name | review |
| Namespace owners | Content Developers |
The Content Developers group must be a namespace owner for group members, such as the student user, to upload to the namespace.
Upload the collection from the /home/student/review-cr3/review-system-1.0.0.tar.gz file to the review namespace.
To do so, click Upload collection, select the /home/student/review-cr3/review-system-1.0.0.tar.gz file, and then click Upload.
Wait for the upload to complete.
To approve the collection, navigate to Collections → Approval and then click Approve.
To confirm that the collection is published, navigate to Collections → Collections, and then select Published in the Repository filter.
The system collection is displayed.
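If you prefer the command line, you can upload the collection archive with the ansible-galaxy collection publish command instead of the web UI. The following is a minimal sketch, not required by this exercise: it assumes that you already generated an API token in the private automation hub web UI and that the hub exposes the usual inbound repository URL for the review namespace (the exact API path can vary between versions). Replace YOUR_API_TOKEN with your token. The collection still requires approval in the web UI afterward.
[student@workstation review-cr3]$ ansible-galaxy collection publish \
> review-system-1.0.0.tar.gz \
> --server https://hub.lab.example.com/api/galaxy/content/inbound-review/ \
> --token YOUR_API_TOKEN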
Prepare the configuration files to build a custom automation execution environment.
Create the ~/review-cr3/ee-build/ directory to store the configuration files.
Create an ee-build/execution-environment.yml file to control the build process.
An example file is available in the ~/review-cr3/examples/ directory.
Set the hub.lab.example.com/ee-minimal-rhel8:latest container image as the base image.
Set the hub.lab.example.com/ansible-builder-rhel8:latest container image as the builder image.
Change to the ~/review-cr3 directory and create the ee-build/ directory.
[student@workstation system]$ cd ~/review-cr3/
[student@workstation review-cr3]$ mkdir ee-build
Copy the ~/review-cr3/examples/execution-environment.yml example file to the ee-build/ directory.
[student@workstation review-cr3]$ cp examples/execution-environment.yml ee-build/
Edit the ee-build/execution-environment.yml file.
Set the EE_BASE_IMAGE parameter to hub.lab.example.com/ee-minimal-rhel8:latest and the EE_BUILDER_IMAGE parameter to hub.lab.example.com/ansible-builder-rhel8:latest.
Remove the python and system parameters from the dependencies section.
The completed ee-build/execution-environment.yml file should consist of the following content:
---
version: 1
build_arg_defaults:
  EE_BASE_IMAGE: 'hub.lab.example.com/ee-minimal-rhel8:latest'
  EE_BUILDER_IMAGE: 'hub.lab.example.com/ansible-builder-rhel8:latest'
ansible_config: ansible.cfg
dependencies:
  galaxy: requirements.yml
Create the ee-build/requirements.yml file and configure it to install the review.system and community.general collections into the automation execution environment.
You do not need to declare the ansible.posix collection because the review.system collection already specifies it as a dependency.
In the ~/review-cr3/ee-build/ directory, create the requirements.yml file to list the collections that you want in the new automation execution environment.
The completed ee-build/requirements.yml file should consist of the following content:
---
collections:
  - name: review.system
  - name: community.general
Create an ee-build/ansible.cfg file to configure access to the private automation hub at https://hub.lab.example.com so that the build process can retrieve the collections.
An example file is available in the ~/review-cr3/examples/ directory.
Get a token from the private automation hub web UI and then edit the ansible.cfg file to update the token parameters.
Copy the ~/review-cr3/examples/ansible.cfg file to the ee-build/ directory.
[student@workstation review-cr3]$ cd ee-build
[student@workstation ee-build]$ cp ~/review-cr3/examples/ansible.cfg .
You update the token parameters in that file in the next step.
Retrieve the API token from the private automation hub web UI at https://hub.lab.example.com.
Navigate to Collections → API token management and then click Load token. Click the Copy to clipboard icon.
Edit the ~/review-cr3/ee-build/ansible.cfg file and then paste the token from the clipboard as the value for the token parameters.
Your token is probably different from the one shown in the following example.
[galaxy]
server_list = published_repo, rh-certified_repo, community_repo

[galaxy_server.published_repo]
url=https://hub.lab.example.com/api/galaxy/content/published/
token=c6aec560d9d0a8006dc6d8f258092e09a53fd7bd

[galaxy_server.rh-certified_repo]
url=https://hub.lab.example.com/api/galaxy/content/rh-certified/
token=c6aec560d9d0a8006dc6d8f258092e09a53fd7bd

[galaxy_server.community_repo]
url=https://hub.lab.example.com/api/galaxy/content/community/
token=c6aec560d9d0a8006dc6d8f258092e09a53fd7bd
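As an optional sanity check, which the exercise does not require, you can confirm that this configuration and token work by installing the collections from the requirements file into a scratch directory. The /tmp/ee-test path is an arbitrary choice for this check. Run the command from the ee-build/ directory so that ansible-galaxy picks up the local ansible.cfg file:
[student@workstation ee-build]$ ansible-galaxy collection install \
> -r requirements.yml -p /tmp/ee-test
If the command downloads review.system, community.general, and the ansible.posix dependency from the private automation hub, the token and repository URLs are correct. You can remove the /tmp/ee-test directory afterward.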
Install the ansible-builder package on workstation.
In the ~/review-cr3/ee-build/ directory, use the ansible-builder command to create the ee-build/context/ directory by performing the first stage of the build process.
Configure the container image to include your private automation hub's TLS CA certificate.
This build process can then retrieve collections from the lab environment's private automation hub.
Use the two provided files as follows:
Copy ~/review-cr3/Containerfile to ~/review-cr3/ee-build/context/Containerfile.
Copy /etc/pki/tls/certs/classroom-ca.pem to ~/review-cr3/ee-build/context/classroom-ca.pem.
Use the yum command to install the ansible-builder package.
[student@workstation ee-build]$ sudo yum install ansible-builder
[sudo] password for student: student
...output omitted...
Run the ansible-builder create command from the ee-build/ directory:
[student@workstation ee-build]$ ansible-builder create
Complete! The build context can be found at: /home/student/review-cr3/ee-build/context
Copy the /home/student/review-cr3/Containerfile and /etc/pki/tls/certs/classroom-ca.pem files into the context/ directory.
[student@workstation ee-build]$ cp ~/review-cr3/Containerfile context/
[student@workstation ee-build]$ cp /etc/pki/tls/certs/classroom-ca.pem context/
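Optionally, list the context/ directory to confirm that both files are now in place alongside the _build/ directory that the ansible-builder create command generated:
[student@workstation ee-build]$ ls context/
You should see the Containerfile and classroom-ca.pem files in the listing.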
Build the automation execution environment container image.
Set hub.lab.example.com/system/ee-review-rhel8:v1.0 as a tag for the container image.
After you build the container image for your automation execution environment, push the container image to the private automation hub.
Use the podman login command to log in to the private automation hub at hub.lab.example.com:
[student@workstation ee-build]$ podman login hub.lab.example.com
Username: student
Password: redhat123
Login Succeeded!
Run the podman build command from the ee-build/ directory.
Add the -t option to specify the hub.lab.example.com/system/ee-review-rhel8:v1.0 tag.
[student@workstation ee-build]$ podman build -f context/Containerfile \
> -t hub.lab.example.com/system/ee-review-rhel8:v1.0 context
Confirm that the new container image is available locally:
[student@workstation ee-build]$ podman images
REPOSITORY                                   TAG    IMAGE ID      CREATED        SIZE
hub.lab.example.com/system/ee-review-rhel8   v1.0   65d5f183c35   2 minutes ago  435 MB
...output omitted...
Push the container image to the private automation hub:
[student@workstation ee-build]$ podman push \
> hub.lab.example.com/system/ee-review-rhel8:v1.0
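If you want to verify the push without opening the web UI, one option is to query the registry with the skopeo command. This optional check assumes that the skopeo package is installed on workstation; skopeo reuses the registry login that the earlier podman login command created:
[student@workstation ee-build]$ skopeo inspect \
> docker://hub.lab.example.com/system/ee-review-rhel8:v1.0
The command prints the image metadata, including the v1.0 tag, when the image is available in the registry.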
Clone the https://git.lab.example.com/student/iscsi.git Git repository into the /home/student/git-repos directory and then create the review3 branch.
From a terminal, create the /home/student/git-repos directory if it does not exist, and then change into it.
[student@workstation ee-build]$ mkdir -p ~/git-repos/
[student@workstation ee-build]$ cd ~/git-repos/
Clone the https://git.lab.example.com/student/iscsi.git repository and then change into the cloned repository:
[student@workstation git-repos]$ git clone \
> https://git.lab.example.com/student/iscsi.git
Cloning into 'iscsi'...
...output omitted...
[student@workstation git-repos]$ cd iscsi/
Create the review3 branch and switch to it.
[student@workstation iscsi]$ git checkout -b review3
Switched to a new branch 'review3'
In the branch, update the demo.yml playbook.
The playbook must call the review.system.iscsi_target_create role.
Set the iscsi_target_create_disk role variable to vdb.
When done, commit and push your changes.
Edit the demo.yml playbook.
The completed file should consist of the following content:
---
- name: Testing the iscsi_target_create role in the review.system collection
  hosts: all
  become: true
  tasks:
    - name: Ensure the iSCSI target is configured
      ansible.builtin.include_role:
        name: review.system.iscsi_target_create
      vars:
        iscsi_target_create_disk: vdb
Commit and then push your changes.
If prompted, use Student@123 as the Git password.
[student@workstation iscsi]$ git add demo.yml
[student@workstation iscsi]$ git commit -m "Updating the demo playbook"
[review3 6657707] Updating the demo playbook
 1 file changed, 2 insertions(+), 2 deletions(-)
[student@workstation iscsi]$ git push -u origin review3
Password for 'https://student@git.lab.example.com': Student@123
...output omitted...
To git.lab.example.com:student/iscsi.git
 * [new branch]      review3 -> review3
Branch 'review3' set up to track remote branch 'review3' from 'origin'.
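As an optional check, you can confirm that the review3 branch now exists on the remote repository. The git ls-remote command lists matching branches on the remote; enter the Git password again if prompted:
[student@workstation iscsi]$ git ls-remote --heads origin review3
The output should list the refs/heads/review3 reference with the commit ID that you just pushed.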
Access the automation controller web UI at https://controller.lab.example.com to create the resources.
Use admin as the username and redhat as the password.
To create the machine credential resource, navigate to Resources → Credentials, click Add, complete the form with the following information, and then click Save.
| Field | Value |
|---|---|
| Name | DevOps |
| Organization | Default |
| Credential Type | Machine |
| Username | devops |
| Password | redhat |
| Privilege Escalation Method | sudo |
| Privilege Escalation Username | root |
To create the source control credential resource, navigate to Resources → Credentials, click Add, complete the form with the following information, and then click Save.
| Field | Value |
|---|---|
| Name | GitLab |
| Organization | Default |
| Credential Type | Source Control |
| Username | student |
| SCM Private Key | Content of the /home/student/.ssh/gitlab_rsa file. |
If you choose to browse for the file, right-click anywhere in the file selection dialog and select Show Hidden Files.
With this option enabled, you can see the .ssh directory in the /home/student directory.
To create the Review 3 inventory resource, navigate to Resources → Inventories, click Add → Add inventory, specify the name of the inventory, and then click Save.
| Field | Value |
|---|---|
| Name | Review 3 |
| Organization | Default |
Within the Review 3 inventory, click the Hosts tab and then click Add.
Create the host with the serverc.lab.example.com name and then click Save.
| Field | Value |
|---|---|
| Name | serverc.lab.example.com |
To create the iSCSI project resource, navigate to Resources → Projects, click Add, complete the form with the following information, and then click Save.
| Field | Value |
|---|---|
| Name | iSCSI |
| Organization | Default |
| Source Control Type | Git |
| Source Control URL | git@git.lab.example.com:student/iscsi.git |
| Source Control Credential | GitLab |
| Options | Allow Branch Override (selected) |
To create the Review 3 automation execution environment resource, navigate to Administration → Execution Environments, click Add, complete the form with the following information, and then click Save.
| Field | Value |
|---|---|
| Name | Review 3 |
| Image | hub.lab.example.com/system/ee-review-rhel8:v1.0 |
| Pull | Always pull the container before running. |
| Organization | Default |
| Registry Credential | Automation Hub Container Registry |
To create the Configure iSCSI job template resource, navigate to Resources → Templates, click Add → Add job template, complete the form with the following information, and then click Save.
| Field | Value |
|---|---|
| Name | Configure iSCSI |
| Inventory | Review 3 |
| Project | iSCSI |
| Execution Environment | Review 3 |
| Source Control Branch | review3 |
| Playbook | demo.yml |
| Credentials | DevOps |
In the automation controller web UI, start a job from the Configure iSCSI job template.
Verify that the demo.yml playbook correctly configures the serverc.lab.example.com managed node.
Navigate to Resources → Templates and then click the launch icon for the Configure iSCSI job template.
Wait for the job to complete.
Confirm the successful execution of the job by verifying that the role created the /etc/target/saveconfig.json file on the managed node.
[student@workstation iscsi]$ ssh serverc ls /etc/target/saveconfig.json
/etc/target/saveconfig.json