Use the Kubernetes command-line interface to deploy a batch application that a job resource manages and a database server that a deployment resource manages.
Outcomes
In this exercise, you deploy a database server and a batch application that are both managed by workload resources.
Create deployments.
Update environment variables on a pod template.
Create and run job resources.
Retrieve the logs and termination status of a job.
View the pod template of a job resource.
As the student user on the workstation machine, use the lab command to prepare your system for this exercise.
This command ensures that resources are available for the exercise.
[student@workstation ~]$ lab start deploy-workloads
Instructions
As the developer user, create a MySQL deployment in a new project.
Log in as the developer user with the developer password.
[student@workstation ~]$ oc login -u developer -p developer \
https://api.ocp4.example.com:6443
...output omitted...

Create a project named deploy-workloads.
[student@workstation ~]$ oc new-project deploy-workloads
Now using project "deploy-workloads" on server "https://api.ocp4.example.com:6443".
...output omitted...

Create a deployment that runs an ephemeral MySQL server.
[student@workstation ~]$ oc create deployment my-db \
--image registry.ocp4.example.com:8443/rhel9/mysql-80:1
Warning: would violate PodSecurity "restricted:v1.24"
...output omitted...
deployment.apps/my-db created

It is safe to ignore pod security warnings for exercises in this course. OpenShift uses the Security Context Constraints controller to provide safe defaults for pod security.
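If you are curious which SCC admitted a pod, the controller records it in the openshift.io/scc annotation on the pod. For example (the annotation value depends on your cluster defaults; restricted-v2 is the common default on recent OpenShift versions):

[student@workstation ~]$ oc get pods -l app=my-db -o yaml | grep openshift.io/scc
      openshift.io/scc: restricted-v2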
Retrieve the status of the deployment.
[student@workstation ~]$ oc get deployments
NAME    READY   UP-TO-DATE   AVAILABLE   AGE
my-db   0/1     1            0           67s
The deployment never has a ready instance.
Retrieve the status of the created pod. Your pod name might differ from the output.
[student@workstation ~]$ oc get pods
NAME                     READY   STATUS             RESTARTS      AGE
my-db-8567b478dd-d28f7   0/1     CrashLoopBackOff   4 (60s ago)   2m35s
The pod fails to start and repeatedly crashes.
Review the logs for the pod to determine why it fails to start.
[student@workstation ~]$ oc logs deploy/my-db
...output omitted...
You must either specify the following environment variables:
  MYSQL_USER (regex: '^[a-zA-Z0-9_]+$')
  MYSQL_PASSWORD (regex: '^[a-zA-Z0-9_~!@#$%^&*()-=<>,.?;:|]+$')
  MYSQL_DATABASE (regex: '^[a-zA-Z0-9_]+$')
Or the following environment variable:
  MYSQL_ROOT_PASSWORD (regex: '^[a-zA-Z0-9_~!@#$%^&*()-=<>,.?;:|]+$')
...output omitted...
Note that the container fails to start due to missing environment variables.
Fix the database deployment and verify that the server is running.
Set the MYSQL_USER, MYSQL_PASSWORD, and MYSQL_DATABASE environment variables.
[student@workstation ~]$ oc set env deployment/my-db \
MYSQL_USER=developer \
MYSQL_PASSWORD=developer \
MYSQL_DATABASE=sampledb
deployment.apps/my-db updated
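Optionally, confirm that the variables are now defined on the pod template by listing them with the --list option of oc set env:

[student@workstation ~]$ oc set env deployment/my-db --list
...output omitted...
MYSQL_USER=developer
MYSQL_PASSWORD=developer
MYSQL_DATABASE=sampledb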
Retrieve the list of deployments and observe that the my-db deployment has a running pod.

[student@workstation ~]$ oc get deployments
NAME    READY   UP-TO-DATE   AVAILABLE   AGE
my-db   1/1     1            1           4m50s
Retrieve the internal IP address of the MySQL pod within the list of all pods.
[student@workstation ~]$ oc get pods -o wide
NAME                     READY   STATUS    RESTARTS   AGE   IP          ...
my-db-748c97d478-g8xc9   1/1     Running   0          64s   10.8.0.91   ...

The -o wide option enables additional output, such as IP addresses.
Your IP address value might differ from the previous output.
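If you prefer to capture the address without reading the table, a JSONPath query also works. This optional sketch assumes that only one pod carries the app=my-db label:

[student@workstation ~]$ oc get pods -l app=my-db -o jsonpath='{.items[0].status.podIP}'
10.8.0.91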
Verify that the database server is running by executing a query. Replace the IP address with the one that you retrieved in the preceding step.
[student@workstation ~]$ oc run -it db-test --restart=Never \
--image registry.ocp4.example.com:8443/rhel9/mysql-80:1 \
-- mysql sampledb -h 10.8.0.91 -u developer --password=developer \
-e "select 1;"
...output omitted...
+---+
| 1 |
+---+
| 1 |
+---+

Delete the database server pod and observe that the deployment causes the pod to be re-created.
Delete the existing MySQL pod by using the label that is associated with the deployment.
[student@workstation ~]$ oc delete pod -l app=my-db
pod "my-db-84c8995d5-2sssl" deletedRetrieve the information for the MySQL pod and observe that it is newly created. Your pod name might differ in your output.
[student@workstation ~]$ oc get pod -l app=my-db
NAME                    READY   STATUS    RESTARTS   AGE
my-db-fbccb9447-p99jd   1/1     Running   0          6s
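The re-creation is the work of the replica set that the deployment owns: the deployment manages the pod template and delegates pod replacement to the replica set. You can list it with the same label; the hash suffix and age in this illustrative output will differ on your system:

[student@workstation ~]$ oc get replicaset -l app=my-db
NAME              DESIRED   CURRENT   READY   AGE
my-db-fbccb9447   1         1         1       ...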
Create and apply a job resource that prints the time and date repeatedly.
Create a job resource called date-loop that runs a script.
Ignore the warning.
[student@workstation ~]$ oc create job date-loop \
--image registry.ocp4.example.com:8443/ubi9/ubi \
-- /bin/bash -c "for i in {1..30}; do date; done"
job.batch/date-loop created

Retrieve the job resource to review the pod specification.
[student@workstation ~]$ oc get job date-loop -o yaml
...output omitted...
  spec:
    containers:
    - command:
      - /bin/bash
      - -c
      - for i in {1..30}; do date; done
      image: registry.ocp4.example.com:8443/ubi9/ubi
      imagePullPolicy: Always
      name: date-loop
      resources: {}
      terminationMessagePath: /dev/termination-log
      terminationMessagePolicy: File
    dnsPolicy: ClusterFirst
    restartPolicy: Never
    schedulerName: default-scheduler
    securityContext: {}
    terminationGracePeriodSeconds: 30
...output omitted...
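Because the pod template sets restartPolicy: Never, failed pods are not restarted in place; the job controller instead creates replacement pods, up to the job's spec.backoffLimit, which defaults to 6. You can confirm the persisted default on this job:

[student@workstation ~]$ oc get job date-loop -o jsonpath='{.spec.backoffLimit}'
6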
List the jobs to see that the date-loop job completed successfully.
[student@workstation ~]$ oc get jobs
NAME        COMPLETIONS   DURATION   AGE
date-loop   1/1           7s         8s
You might need to wait for the script to finish and run the command again.
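Alternatively, the oc wait command blocks until the job reports completion, which avoids polling:

[student@workstation ~]$ oc wait --for=condition=complete job/date-loop --timeout=60s
job.batch/date-loop condition met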
Retrieve the logs for the associated pod. The log values might differ in your output.
[student@workstation ~]$ oc logs job/date-loop
Fri Nov 18 14:50:56 UTC 2022
Fri Nov 18 14:50:59 UTC 2022
...output omitted...

Delete the pod for the date-loop job and observe that the pod is not created again.
Delete the associated pod.
[student@workstation ~]$ oc delete pod -l job-name=date-loop
pod "date-loop-wvn2q" deletedView the list of pods and observe that the pod is not re-created for the job.
[student@workstation ~]$ oc get pod -l job-name=date-loop
No resources found in deploy-workloads namespace.

Verify that the job status is still listed as successfully completed.
[student@workstation ~]$ oc get job -l job-name=date-loop
NAME        COMPLETIONS   DURATION   AGE
date-loop   1/1           7s         7m36s
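The completion record lives on the job object itself, which is why deleting the pod does not affect it. You can read the success count directly from the status of the batch/v1 job:

[student@workstation ~]$ oc get job date-loop -o jsonpath='{.status.succeeded}'
1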