
helm unittests #462

Open · wants to merge 9 commits into base: main
Conversation

MichaelSp
Contributor

@MichaelSp MichaelSp commented Nov 5, 2023

Pull Request

Description of the change

Run unit tests for the Helm chart during PR checks.

Benefits

Better chart quality through automated checks of the rendered templates.

Possible drawbacks

??

Applicable issues

no

Additional information

Checklist

Signed-off-by: Michael Sprauer <[email protected]>
@MichaelSp MichaelSp marked this pull request as ready for review November 5, 2023 15:17
@provokateurin
Member

Please explain what this does exactly. I'm not familiar with helm-unittest and don't even know what the tool is supposed to do.

@MichaelSp
Contributor Author

MichaelSp commented Nov 5, 2023

https://github.com/helm-unittest/helm-unittest is a tool to help write unit tests for Helm charts. It lets you write unit tests that help ensure software quality at the unit level.

Here is a very good article on what unit-tests actually are: https://martinfowler.com/bliki/UnitTest.html

btw: I also added a check that runs during PR actions

@provokateurin
Member

I know what unit tests are, but what is tested in the context of Helm Charts?

@MichaelSp
Contributor Author

In this test file https://github.com/nextcloud/helm/blob/2c871d9bdfbddc748cee7a0aa575fbc32a04b0f2/charts/nextcloud/tests/defaults_test.yaml I test the output of the Helm chart for a default values file. I only provide values to

This is necessary to generate stable output (it should not change between test runs even if the appVersion changes). The output is compared against the snapshot to make sure future changes to the templates or values don't change the output in an unexpected way.

If a future change results in a desired change of the output, the snapshot can be updated with `helm unittest charts/nextcloud -u`.

This is only a very basic test. You can increase coverage by defining more tests and test suites with different combinations of values to cover the logic in the templates, and make sure use cases like nginx, cron, mail, etc. result in the correct configuration.

Using helm unittest is a best-practice in helm-charts.
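For illustration, a minimal helm-unittest suite might look like this sketch (the file name, pinned tag, and asserted image path are hypothetical and would need to match the chart's actual templates and values):

```yaml
# charts/nextcloud/tests/deployment_test.yaml (hypothetical example)
suite: deployment tests
templates:
  - templates/deployment.yaml
tests:
  - it: renders a stable default deployment
    set:
      # pin the tag so the snapshot does not drift when appVersion changes
      image.tag: "27.0.0"
    asserts:
      - matchSnapshot: {}
  - it: sets the expected container image
    set:
      image.repository: nextcloud
      image.tag: "27.0.0"
    asserts:
      - equal:
          path: spec.template.spec.containers[0].image
          value: nextcloud:27.0.0
```

Running `helm unittest charts/nextcloud` would then render the templates and check each assertion without needing a cluster.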

@provokateurin
Member

Sounds like a good idea. @jessebot @tvories, what do you think? I think before merging this it would be good to have a bit more coverage of features.

@jessebot
Collaborator

jessebot commented Nov 6, 2023

Using helm unittest is a best-practice in helm-charts.

I agree tests are good, but I look at quite a few helm charts for a living and I haven't actually seen this tool in use before now.

I feel like a test that just installs on k3s/k3d makes more sense and is easier to understand, since we still don't actually have that for any k8s distro other than kind. It would also be another check that would need to pass before we can merge things, and we already have some trouble doing that due to our other requirements such as DCO, linting, chart version bumps, and the base installation on kind.

If a future change results in a desired change of the output, the snapshot can be updated: `helm unittest charts/nextcloud -u`.

The other issue is that we have to keep it up to date, though we could add this to the contributing docs?

Still OK to merge this, because I agree, more feature test coverage would be nice, but wanted to note some very minor downsides. I'd have to spend some time with it before I could help add anything else.

If the helm unit tests are common enough, I guess we could put out some GitHub issues asking other community members to help us add specific feature tests, but if we do that, there's also the question of how we run only the necessary unit tests after each push. Otherwise, every push will take like 5+ minutes as we add more feature unit tests, which will also make the back and forth on PRs take a bit longer.

I approved it for now, so that I am not blocking, but will wait till others chime in before merging.

Update: after thinking about it more, it does make sense to have more tests. I think I was just worried about how long it can sometimes take to get things merged.

@jessebot jessebot added the enhancement New feature or request label Nov 6, 2023
@jessebot jessebot self-requested a review November 6, 2023 14:15
@MichaelSp
Contributor Author

there's also the question of how we run only the necessary unit tests after each push. Otherwise, every push will take like 5+ minutes as we add more feature unit tests, which will also make the back and forth on PRs take a bit longer.

Even for larger test suites, the unit tests are pretty fast (<5 sec). I bet they will always be faster than starting a kind cluster and deploying to it.

@provokateurin
Member

I assume the unit tests just run `helm template` with a given values file and compare the output with the fixture? Then that would be very fast and scalable indeed.
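If that assumption holds, the mechanism is conceptually close to this sketch (requires the `helm` CLI; the chart name, values file, and snapshot path are made up for illustration):

```shell
# Render the chart locally with a fixed values file -- no cluster needed,
# which is why this is fast compared to deploying on kind.
helm template nextcloud charts/nextcloud \
  -f charts/nextcloud/tests/values.yaml > /tmp/rendered.yaml

# Fail (non-zero exit) if the rendered output drifts from the stored snapshot.
diff -u charts/nextcloud/tests/__snapshots__/defaults.yaml /tmp/rendered.yaml
```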

@jessebot
Collaborator

jessebot commented Nov 8, 2023

Awesome, then that really solves the bulk of my issues :)

@jessebot
Collaborator

if/when this is merged, I've created #469 for a contributing doc to be updated as we need :)

@provokateurin
Member

@jessebot @MichaelSp what is the status? I think it would be good to start adding tests soon, even if there are just a few in the beginning.

@jessebot
Collaborator

I haven't tested this locally, but let me try!

@provokateurin
Member

I'd say the `charts/__snapshots__` dir needs to be added to the filter so that CI runs when the snapshots change, to ensure everything is still correct.
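A sketch of what that filter change might look like in the workflow trigger (the existing path entries are assumptions about this repo's layout, not the actual file contents):

```yaml
# .github/workflows/lint-test.yaml (sketch)
on:
  pull_request:
    paths:
      - 'charts/nextcloud/templates/**'
      - 'charts/nextcloud/values.yaml'
      - 'charts/nextcloud/tests/**'
      # re-run CI when a snapshot is regenerated, so drift is caught
      - 'charts/nextcloud/tests/__snapshots__/**'
```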

generated via `helm unittest -u nextcloud` from `charts` dir

Signed-off-by: JesseBot <[email protected]>
Collaborator

@jessebot jessebot left a comment


I regenerated the snapshot for today and rebased off of main, but I have one last comment below to take care of before we can merge this.

.github/workflows/lint-test.yaml
@jessebot
Collaborator

We also probably need to add a note about these tests to the CONTRIBUTING doc. It should include at least how to test locally, and how to regenerate the snapshots.

@jessebot
Collaborator

jessebot commented Jul 26, 2024

@provokateurin do we still want to move this forward? 🤔 I know we have other ci tests, so not sure if we still want this. Up to you!

@provokateurin
Member

I think this might still be a good idea to prevent any accidental issues with the templating that otherwise might go undetected.

@jessebot
Collaborator

jessebot commented Jul 26, 2024

Okie dokie, we'll need the conflicts resolved, and then we can try it by doing a temporary test commit here like we did on the last PR?

Update: Fixed conflicts and now it should run. I also still think we should have a section in CONTRIBUTING.md on how to update the tests if they get out of date. @MichaelSp, if you could do that, I can get this merged; otherwise, I'll circle back to this PR and do it when I have a free moment 🙏

@jessebot
Collaborator

jessebot commented Jul 26, 2024

Oh, another thing before we merge, the 3 test scenarios we want to check were:

  • no changes (should show as working) - we could test this one by just moving the unit test before the ct lint step

  • a valid Helm template change (should show as working) - could test this one by maybe adding a new value to values.yaml and rendering it in `_helpers.tpl`?

  • an invalid Helm template change (should show as failing) - could test this one with an invalid image tag in values.yaml?

Labels: enhancement (New feature or request) · 3 participants