fix: show helpful message when /retest has nothing to retest #2399

chmouel merged 2 commits into tektoncd:main from
Conversation
Summary of Changes: Hello @mathur07, I'm Gemini Code Assist! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed.
Code Review
This pull request introduces a useful fix to allow /retest to re-run all pipelines even if they have all succeeded. The implementation also extends this behavior to /ok-to-test. My main feedback is to add a unit test to cover this new logic and to update the pull request description to reflect the full scope of the change, including /ok-to-test.
/gemini review
Code Review
This pull request correctly implements the desired behavior for /retest and /ok-to-test commands. When all pipelines have already succeeded, these commands will now trigger a re-run of all pipelines instead of doing nothing. The logic is sound, and the addition of an integration test (TestRetestRerunsAllWhenAllSucceeded) is a great way to ensure this new functionality is well-tested. I have one suggestion to make the new test even more robust by asserting on the log output.
@mathur07 The comment that we're getting when doing /retest (after every check has succeeded) is misleading. I think we just need to change the comment to something like "all PipelineRuns for this commit have already succeeded. Use /retest <pipeline-name> to re-run a specific pipeline or /test to re-run all".
/gemini review
Code Review
This pull request aims to change the behavior of the /retest command when all pipelines have already succeeded. I've found a critical discrepancy between the described behavior in the pull request and the actual implementation in the code. My review includes a comment with a suggested fix to align the code with the PR's stated goal.
0aa799e to 07d7cc4 (compare)
```go
if matchedPRs, err = matcher.MatchPipelinerunByAnnotation(ctx, p.logger, pipelineRuns, p.run, p.event, p.vcx, p.eventEmitter, repo, true); err != nil {
	// Check if all pipelines have already succeeded - just log, don't create neutral status
	if errors.Is(err, matcher.ErrAllPipelinesSucceeded) {
		p.eventEmitter.EmitMessage(nil, zap.InfoLevel, "RepositoryAllPipelinesSucceeded", err.Error())
```
IMO this should create a comment on the pull request. Since we're not creating a neutral check, if we don't post a comment, from the user's perspective PaC just didn't do anything.
Suggested change:

```go
p.eventEmitter.EmitMessage(nil, zap.InfoLevel, "RepositoryAllPipelinesSucceeded", err.Error())
if err := p.vcx.CreateComment(ctx, p.event, err.Error(), ""); err != nil {
	return nil, fmt.Errorf("Error adding no pipelineruns to rerun comment: %v", err)
}
```
@aThorp96 if we create a comment, it would just sit there as the last comment on the MR/PR even after that PR has PipelineRuns to run. We also have settings for comments, covering whether the user wants no comments at all or wants comments updated.
@mathur07 we have the func createNeutralStatus in the same file (match.go) for this. You can create a neutral status on the PR saying that there is no PipelineRun to trigger; it also checks the comment-update and other settings.
Thanks for the review @zakisk
On comment strategy: The Repository CR docs say that comment_strategy (update / disable_all) applies only to comments about a PipelineRun’s status (e.g. “started,” “succeeded”). It also says: “Comments may still appear if there are errors validating PipelineRuns in the .tekton directory.” The message we post for “all pipelines already succeeded” is in that same category: it’s informational/guidance (like validation errors), not a PipelineRun status update, so it’s intended to be outside the scope of comment_strategy and can still be posted when the user would otherwise have no comment or only updated status comments.
On using createNeutralStatus instead of a comment: A neutral status makes sense when we’re reporting that no PipelineRun was triggered (e.g. “no match for this event”). Here we’re not reporting a check result; we’re explaining that there’s nothing to retest and how to use /retest <pipeline-name> or /test. That’s user guidance, not a CI “check,” so a PR comment is a better fit than a neutral status. Keeping it as a comment also keeps the behavior consistent with other informational messages (e.g. validation errors) that are not gated by the comment strategy.
Let me know what you think.
If you see here, a neutral check is used to indicate that no matching PipelineRun was found. Posting a comment for this would create confusion: on the next iteration (e.g. a new commit push, or a /test comment), that comment wouldn't be deleted and would stay on the PR forever, whereas a check-run disappears on the next commit push or /test command. So IMO I still want the neutral check there.
@zakisk I agree more with your previous comment asking to remove the neutral check. Consider the case where all PipelineRuns have succeeded and there is a comment reflecting that status. If I run /retest and no tests are run, it should not update the successful-status comment. This message does not reflect any PipelineRun status and is a direct response to a comment; I would argue it is more confusing for users for there to be a neutral status.
For a similar UX, consider Tide. When a PR's approval status changes, a comment is updated; but when someone interacts with Tide incorrectly, a separate, permanent comment informs the user that their comment has no effect, and no change is made to the status comment or check-run status.
@zakisk I agree more with your #2399 (comment) asking to remove the neutral check
I changed my perspective after fully understanding the issue 😄

> but when someone interacts with tide incorrectly
but I don't think that making a /retest comment on a PR which doesn't have any failed PipelineRun would be an "incorrect interaction"? We create a neutral status check when there is no matching PipelineRun in the .tekton directory as well, so IMO when there is no failed PipelineRun on a /retest comment it should be fine to create a neutral status check, since in this PR's changes it only logs a message (I am not stubborn on this 😄)
cc: @chmouel
Please add unit tests and ideally E2E tests as well. A screenshot and/or link to an example of the new behavior would also be helpful. Thanks!
33b45b0 to b8d5730 (compare)
aThorp96 left a comment
Looks generally good to me. Two requests regarding the tests. One suggestion regarding the message itself.
I notice in the screenshot the comment reads "All PipelineRuns for this commit have already succeeded. Use /retest to re-run a specific pipeline or /test to re-run all pipelines." — it doesn't include the `<pipelinerun-name>` string. Was this screenshot taken before that was added to the string, or is it possibly being treated as an unknown HTML tag and hidden? If the latter, I think including backticks should ensure it's rendered correctly.
b8d5730 to 52922c4 (compare)
/retest seems broken on gitea/forgejo and we need to test this there as well: https://issues.redhat.com/browse/SRVKP-10913
When using /retest and all pipelines have already succeeded, show: "all PipelineRuns for this commit have already succeeded. Use /retest <pipeline-name> to re-run a specific pipeline or /test to re-run all" instead of the misleading "could not find any PipelineRun" message.

Fixes: SRVKP-10373

Signed-off-by: Shubham Mathur <shumathu@redhat.com>
@chmouel I tested the changes on gitea/forgejo, and it looks good:
thanks, this will be useful
[screenshots omitted]
📝 Description of the Change
When using /retest or /ok-to-test and all pipelines have already succeeded, the command now shows a helpful message instead of the misleading "could not find any PipelineRun in your .tekton/ directory" error.

New message: "all PipelineRuns for this commit have already succeeded. Use /retest <pipeline-name> to re-run a specific pipeline or /test to re-run all"

This guides users to:

- /retest <pipeline-name> to re-run a specific pipeline
- /test to re-run all pipelines

👨🏻 Linked Jira
Fixes: SRVKP-10373
🔗 Linked GitHub Issue
Fixes #
🚀 Type of Change
- [x] Bug fix (fix:)
- [ ] New feature (feat:)
- [ ] Breaking change (feat!:, fix!:)
- [ ] Documentation (docs:)
- [ ] Chore (chore:)
- [ ] Refactor (refactor:)
- [ ] Enhancement (enhance:)
- [ ] Dependency update (deps:)

🧪 Testing Strategy
🤖 AI Assistance
If you have used AI assistance, please provide the following details:
Which LLM was used?
Extent of AI Assistance:
Important
If the majority of the code in this PR was generated by an AI, please add a Co-authored-by trailer to your commit message. For example:
Co-authored-by: Gemini gemini@google.com
Co-authored-by: ChatGPT noreply@chatgpt.com
Co-authored-by: Claude noreply@anthropic.com
Co-authored-by: Cursor noreply@cursor.com
Co-authored-by: Copilot Copilot@users.noreply.github.com
💡 You can use the script ./hack/add-llm-coauthor.sh to automatically add these co-author trailers to your commits.
✅ Submitter Checklist
- [x] My commit message type (fix:, feat:) matches the "Type of Change" I selected above.
- [x] I have run make test and make lint locally to check for and fix any issues. For an efficient workflow, I have considered installing pre-commit and running pre-commit install to automate these checks.