There's already the ability to have a custom message on a failing check to let admins know how to become compliant; could we have the same kind of functionality for passing checks, to give more context as to why the check has passed? I'll try to give an example.
A program is configured securely by default, with secure options hard-coded as defaults, but these can be changed to other options by a configuration file.
So a check for this program could be: if the config file does not exist, it's configured securely. But a validator will just see a passing check for a missing config file and not understand. So I'm thinking something like this small snippet:
describe file('/path/to/config.cnf') do
  it { should_not exist, "The absence of a configuration file means the program is using default secure settings" }
end
This way we could have this message show up in the results field of a report, just like failing checks, with more context as to why this is compliant and within spec.
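To make the intent concrete, here is an added plain-Ruby illustration of the scenario (not InSpec's API and not code from the original issue): a check that passes with a rationale when the config file is absent, and otherwise compares the file's settings against hypothetical hard-coded secure defaults, so the result carries context whether it passes or fails.

```ruby
# Added example (not part of the original issue, not InSpec's API).
# Models the scenario above: secure defaults are hard-coded in the program,
# a config file may override them, and the check explains *why* it passed
# or failed in either case.
require 'tempfile'

# Hypothetical secure defaults for the program under test.
SECURE_DEFAULTS = { 'tls_min_version' => '1.2', 'allow_anonymous' => 'false' }.freeze

# Parse a minimal "key = value" config, ignoring blank lines and # comments.
def parse_config(text)
  text.each_line.with_object({}) do |line, cfg|
    line = line.strip
    next if line.empty? || line.start_with?('#')
    key, value = line.split('=', 2).map(&:strip)
    cfg[key] = value if key && value
  end
end

# Returns { passed: true/false, message: rationale }, so a passing result
# carries as much context as a failing one.
def check_config(path)
  unless File.exist?(path)
    return { passed: true,
             message: "#{path} is absent; the program is using its hard-coded secure defaults" }
  end
  # Any key whose value differs from the default (or is unknown) is an override.
  overrides = parse_config(File.read(path)).reject { |k, v| SECURE_DEFAULTS[k] == v }
  if overrides.empty?
    { passed: true, message: "#{path} exists but every setting matches the secure defaults" }
  else
    detail = overrides.map { |k, v| "#{k}=#{v} (expected #{SECURE_DEFAULTS[k] || 'no such key'})" }
    { passed: false, message: "#{path} weakens defaults: #{detail.join(', ')}" }
  end
end

# Run the check against a constructed config file so the example works offline.
def demo(contents)
  Tempfile.create('config.cnf') do |f|
    f.write(contents)
    f.flush
    check_config(f.path)
  end
end
```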