This repo is the home of the AlmaLinux Certification Suite. We largely rely on open source utilities, tests, and benchmarks to ensure various types of workloads are stable on a given hardware configuration.
The suggested way to run the certification suite is a combined LTS/SUT setup, meaning you run this Ansible playbook on the same host that is being tested. This avoids any network-related issues between the LTS and the SUT causing a failure. The expected runtime of the playbook is around 48 hours, so the chance of a network blip causing a failure between publicly networked hosts is high.
We recommend running from a local console or inside a terminal multiplexer such as screen or tmux. We will use tmux in this example.
# install dependencies
dnf install git-core tmux python3.12 -y
# clone the repository
git clone https://github.com/AlmaLinux/Hardware-Certification-Suite.git
# create venv
python3.12 -m venv venv
# activate venv
source venv/bin/activate
# install ansible
pip install ansible
# move into ansible playbook dir
cd Hardware-Certification-Suite
# start tmux session
tmux new-session -s almalinux-certification-tests
# run playbook
ansible-playbook -c local -i 127.0.0.1, automated.yml --tags=phoronix
To run the suite from a separate LTS against a remote SUT, using a Python virtual environment for Ansible:
# install dependencies
dnf -y install git-core python3.12 tmux
# clone the repository
git clone https://github.com/AlmaLinux/Hardware-Certification-Suite.git
# create venv
python3.12 -m venv venv-almalinux-certification-suite
# activate venv
source venv-almalinux-certification-suite/bin/activate
# install ansible
pip install ansible
# move into ansible playbook dir
cd Hardware-Certification-Suite
# start tmux session
tmux new-session -s almalinux-certification-tests
# run playbook
ansible-playbook -i <SUT IP>, automated.yml --tags phoronix
Alternatively, using the distribution's Ansible package instead of a venv:
# install ansible, git, and tmux
dnf -y install ansible git-core tmux
# clone the repository
git clone https://github.com/AlmaLinux/Hardware-Certification-Suite.git
# move into ansible playbook dir
cd Hardware-Certification-Suite
# start tmux session
tmux new-session -s almalinux-certification-tests
# run playbook
ansible-playbook -i <SUT IP>, automated.yml --tags phoronix
The suite is built and maintained by the AlmaLinux Certification SIG. Contributions to this suite are welcome, and we invite contributors to become active in the SIG itself.
The certification suite is intentionally built to be modular, and we would love to expand it as our community's needs grow, including creating hardware- or software-vendor-specific tests and running them on request.
For example, for the MariaDB database we could include a mariadb-test runner with parameters defined by the MariaDB Foundation team, so the suite could detect functional regressions (potentially even including performance tests).
Below describes how to run the suite itself. Once the suite has been run, results should be submitted to the [Certifications repo](https://github.com/AlmaLinux/certifications).
=== Definitions:
LTS: local test server, the host from which the Ansible playbooks are run.
SUT: system under test, the hardware configuration being certified.
Example SUT IP address: 192.168.244.7
Install Ansible
yum --enablerepo=epel install ansible
Generate an SSH key
ssh-keygen -t rsa
Add a key to SUT, a comma after the IP address is required
ansible all -i 192.168.244.7, -m authorized_key -a "user=root key='{{ lookup('file', '/root/.ssh/id_rsa.pub') }}' path=/root/.ssh/authorized_keys manage_dir=no" --ask-pass
Check connection with SUT, comma after IP address is required
ansible all -i 192.168.244.7, -m ping -u root
Clone repository
cd ~ && git clone "https://github.com/AlmaLinux/Hardware-Certification-Suite.git"
Create your test directory in the ~/Hardware-Certification-Suite/tests folder, for example example.
Test directory structure for automated tests
|- tests/example
|-- roles
|--- main.yml - Ansible tasks
|-- README.md - instructions for working with the test when manually launched
|-- run_test.sh - script to run the test
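As an illustration of run_test.sh, a minimal wrapper could look like the sketch below; the stress-ng invocation is a hypothetical placeholder, not a test shipped with the suite:
#!/bin/bash
# fail fast so the caller sees a non-zero exit code when the test fails
set -euo pipefail
# invoke the actual test utility (placeholder command)
stress-ng --cpu 4 --timeout 60s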
Test directory structure for interactive tests
|- tests/example
|-- step1.yml - sub playbook with interactive prompts
|-- step2.yml - sub playbook with interactive prompts
|-- stepx.yml - sub playbook with interactive prompts
|-- README.md - instructions for working with the test when manually launched
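For illustration, a stepN.yml sub-playbook with an interactive prompt might look like this sketch (the host group, variable name, and prompt text are assumptions, not taken from the suite):
- hosts: all
  vars_prompt:
    - name: operator_ready
      prompt: "Attach the test peripheral and type yes"
      private: false
  tasks:
    - name: Record the operator's answer
      ansible.builtin.debug:
        msg: "Operator answered: {{ operator_ready }}"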
Each automated test should store test results and utility output in a file named <test name>.log in the ~/Hardware-Certification-Suite/logs/ directory in the root of the repository. You can get the folder path from the {{ lts_logs_dir }} variable.
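For example, a task in a test's main.yml could store its output like this minimal sketch (the test name example and its run_test.sh invocation are assumptions based on the structure above):
- name: Run the example test and capture its output
  ansible.builtin.shell: ./run_test.sh > "{{ lts_logs_dir }}/example.log" 2>&1
  args:
    chdir: "{{ playbook_dir }}/tests/example"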
Add your automated tasks that perform the test to the ~/Hardware-Certification-Suite/automated.yml file and your interactive playbook to the ~/Hardware-Certification-Suite/interactive.yml file, both located in the root of the repository.
Each test must be marked with a tag, for example tags: test_example
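A hedged sketch of how a tagged test entry in automated.yml's task list might look (the include path follows the example structure above; the exact wiring in the shipped playbook may differ):
- name: Run the example test
  ansible.builtin.include_tasks: tests/example/roles/main.yml
  tags: test_example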
Add your test settings to the ~/Hardware-Certification-Suite/vars.yml file if required.
To run all automated tests:
Run tests on the LTS, comma after IP address is required
ansible-playbook -i 192.168.244.7, automated.yml
Run Phoronix tests on the LTS
ansible-playbook -i 192.168.244.7, automated.yml --tags phoronix
To run all interactive tests:
Run tests on the LTS, comma after IP address is required
ansible-playbook -i 192.168.244.7, interactive.yml
To run the automated tests locally on a combined LTS/SUT, run:
ansible-playbook -c local -i 127.0.0.1, automated.yml
Tests can be configured via the ~/Hardware-Certification-Suite/vars.yml file.
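As a sketch of the kind of setting vars.yml holds, here is the test_phoronix['folder'] option referenced later in this README (the surrounding keys and the default shown are assumptions):
test_phoronix:
  # directory where the Phoronix tests work; /root per the note below
  folder: /root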
You can run automated tests by tag.
For example:
ansible-playbook -i 192.168.244.7, automated.yml --tags cpu
Available tags correspond to the test directories in the ~/Hardware-Certification-Suite/tests folder (which is copied from the LTS to the SUT). Interactive tests can't be run separately.
The ansible output will display information about each test. If there are errors, the tests will be colored red.
Summary information will display the test result. Notice the ignored value: if it is > 0, at least one test has failed.
The failed value is always 0, because failing tests are ignored so that the remaining tests can continue to run sequentially.
Example: 127.0.0.1 : ok=9 changed=7 unreachable=0 failed=0 skipped=0 rescued=0 ignored=1
When writing interactive tests, use local_action to run a command on the LTS, and regular modules (command, shell, etc.) to run a command on the SUT.
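A minimal sketch contrasting the two (the specific commands are hypothetical):
- name: Record a timestamp on the LTS
  local_action:
    module: ansible.builtin.command
    cmd: date
- name: Run a command on the SUT
  ansible.builtin.command: lsusb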
Alternatively to tmux, the suite can also be run inside a logged screen session:
screen -L -S hctest
By default, the Phoronix tests use the /root folder; to change this location, change the test_phoronix['folder'] value in the vars.yml file.
This repo is managed by the AlmaLinux Certification SIG.