Allow running the acceptance tests in pulumi/examples as part of the test suite #823
I'm not particularly good at GH Actions, so it would be really helpful to see a demo run of this change.
You can use the Makefile to copy this change over by modifying the ci-mgmt target. Just change @master to the SHA of the commit you want to test.
It looks like we want to take this change on PRs (run-acceptance-tests) but not master, release or cron. Is that the plan?
provider-ci/internal/pkg/templates/bridged-provider/.github/workflows/run-acceptance-tests.yml
The p-azure PR I linked, pulumi/pulumi-azure#1717, is the demo! Specifically, you can see the new test job [test (nodejs, pulumiExamples)](https://github.com/pulumi/pulumi-azure/actions/runs/7913493113/job/21601801140?pr=1717) running the p/examples tests for the azure-classic examples. It also demonstrates part of the value - some examples are failing because the examples are outdated. It could also go the other way round: a provider change breaks a previously working example.
I hope we can run these tests for each PR, but we don't need to decide just yet. Running them on releases could also be valuable. Either way, we won't make their success required just yet, as the p-azure example shows.
Great. That looks good.
I'm trying to get at the purpose of the new tests here. If we run them only in PRs (run-acceptance-tests), then they will be ignored. Engineers have no way to verify whether they broke the tests or whether the tests were already broken on master. I generally believe that 90% of the value of tests here is gating, and if tests are allowed to be broken then they lose that ability. I'd rather that they run on PRs and master, and that they do block PRs. If the tests are sufficiently flaky that it is unsafe to block on them, I'm skeptical they provide enough signal to be worth running.
I'm not sure I agree; this sounds to me like an argument that says "we can't make the gating workflow ideal so we'll continue willfully shipping changes that break examples." Having examples evergreen (or almost) does not sound like something we should give up on already? If examples are evergreen, then a PR failing this check is an indication the PR has a problem, which is what we want.
If we have to enforce, I would argue it'd be much more customer-centric to open P1s for when examples are broken than to open P1s for when CI workflows are broken, as is our current practice.
Actually, to take a step back, I'd like to ask if we can link Thomas' design doc and epic/unit of work here? I was under the impression that running these tests was what was agreed on in the multiple-round reviewed design doc. Am I perhaps misremembering, or is there some new information to reconsider?
Great points, @iwahbe and @t0yv0. The design doc is Test Coverage and Maintenance of Examples, but it's Pulumi-internal. We had indeed agreed to run these tests in order to increase test coverage, but there are still variables as to when and how we run them. Ideally, as Ian said, we'd like to have an additional gate, at PR or release time, that prevents us from shipping regressions. Before adding the pulumi/examples tests in this way, however, we need to consider that this repo belongs to another team and gets new commits from all over Pulumi. If one of these is faulty - although theoretically that should be prevented by pulumi/examples' own test suites - a provider gating its PRs on these examples is blocked from merging. Some ways around this:
I thought this was the agreement and I still support this option; it's the only viable option IMO.
@thomas11 Let's get the tests in as advisory. We can always increase the impact of the tests once they are in place, and I don't want this work to bitrot while we discuss. |
LGTM
This change adds a new optional flag `testPulumiExamples` that can be set in .ci-mgmt.yaml. When set, the acceptance tests in pulumi/examples will be run as part of the test suite. The goal is to increase test coverage in provider PRs or releases: pulumi/examples has a rich set of realistic programs that can be used for this via ProgramTest. Providers don't necessarily need to gate PRs on these tests; they can be informational only.
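As a sketch, opting in via a provider's .ci-mgmt.yaml might look like the following. Only `testPulumiExamples` is the flag introduced by this change; the surrounding keys are illustrative assumptions, not the actual schema:

```yaml
# .ci-mgmt.yaml (sketch; keys other than testPulumiExamples are illustrative)
provider: azure            # hypothetical provider name
testPulumiExamples: true   # opt in to running the pulumi/examples acceptance tests
```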
This requires that the provider's test suite in `examples/` is enabled for this; see pulumi/pulumi-azure#1717 for an example. It's not enough to just run the test suite in pulumi/examples, because we want to use or inject the locally built provider and SDKs of the current PR.

I chose the approach using the test job's matrix because the new pulumi/examples tests should run concurrently with the regular ones. Running them in a separate job would have required copying the lengthy setup steps of the test job, or extracting the setup into a composite action, which I thought could be avoided.
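Based on the job name visible in the demo run, `test (nodejs, pulumiExamples)`, the matrix approach might look roughly like this in the generated workflow. This is a sketch only; the real template lives in provider-ci, and the key names and values here are assumptions:

```yaml
# Sketch of the test job's matrix (names and values are illustrative)
test:
  strategy:
    fail-fast: false
    matrix:
      language: [nodejs, python, go, dotnet]
      testTarget: [local]
      include:
        # Extra entry: run the pulumi/examples suite against the locally
        # built provider, for one language only, concurrently with the
        # regular test matrix.
        - language: nodejs
          testTarget: pulumiExamples
```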