webhooks

Subscribe to all “webhooks” posts via RSS or follow GitHub Changelog on Twitter to stay updated on everything we ship.


The secret_scanning_alert webhook is sent for activity related to secret scanning alerts. Secret scanning webhooks now support validity checks, so you can keep track of changes to validity status.

Changes to the secret_scanning_alert webhook:

  • A new validity property that is either active, inactive, or unknown depending on the most recent validity check.
  • A new action type, validated, which is triggered when a secret’s validity status changes.

Note: you must enable validity checks at the repository or organization level in order to opt in to the feature. This can be done from your secret scanning settings on the Code security and analysis settings page by selecting the option to “automatically verify if a secret is valid by sending it to the relevant partner.”
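
As an illustration of the two changes above, an abridged secret_scanning_alert payload for the new action might look roughly like the following. The values are hypothetical and fields such as repository and sender are omitted; see the webhook documentation for the full schema.

{
  "action": "validated",
  "alert": {
    "number": 42,
    "secret_type": "github_personal_access_token",
    "state": "open",
    "validity": "active"
  }
}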

Learn more about which secret types are supported and about the secret scanning webhook.


Starting today, Dependabot will be able to auto-dismiss npm alerts that have limited impact (e.g. long-running tests) or are unlikely to be exploitable. With this ship, Dependabot will cut false positives and reduce alert fatigue substantially.

On by default for public repositories and opt-in for private repositories, this feature will result in 15% of low impact npm alerts being auto-dismissed moving forward – so you can focus on the alerts that matter, without worrying about the ones that don’t.

What’s changing?

When the feature is enabled, Dependabot will auto-dismiss certain types of vulnerabilities that are found in npm dependencies used in development (npm devDependency alerts with scope:development). This feature will help you proactively filter out false positives on development-scoped (non-production, non-runtime) alerts without compromising on high-risk devDependency alerts.
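
For context, “development scope” refers to packages declared under devDependencies in package.json rather than under dependencies. In the hypothetical manifest below, an alert on jest would be development-scoped, while an alert on express would not.

{
  "dependencies": {
    "express": "^4.18.0"
  },
  "devDependencies": {
    "jest": "^29.0.0"
  }
}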

[Image: Dependabot alerts auto-dismissal list view]

Frequently asked questions

Why is GitHub making this change?

At GitHub, we’ve been thinking deeply about how to responsibly address long-running issues around alert fatigue and false positives. Rather than over-indexing on one criterion like reachability or dependency scope, we believe that a responsibly-designed solution should be able to detect and reason on a rich set of complex, contextual alert metadata.

That’s why, moving forward, we’re releasing a series of ships powered by an underlying, all-new, flexible and powerful alert rules engine. Today’s ship, our first application, leverages GitHub-curated vulnerability patterns to help proactively filter out false positive alerts.

Why auto-dismissal, rather than purely suppressing these alerts?

Auto-dismissing ensures any ignored alerts are 1) able to be reintroduced if alert metadata changes, 2) caught by existing reporting systems and workflows, and 3) extensible as a whole to future rules-based actions, where Dependabot can make decisions on subsets of alerts and do things like reopen for patch, open a Dependabot pull request, or even auto-merge if very risky.

How does GitHub identify and detect low impact alerts?

Auto-dismissed alerts match GitHub-curated vulnerability patterns. These patterns take into account contextual information about how you’re using the dependency and the level of risk an alert may pose to your repository. To learn more, see our documentation on covered classes of vulnerabilities.

How will this activity be reported?

Auto-dismissal activity is supported across webhooks, REST, GraphQL, and the audit log for Dependabot alerts. In addition, you can review your closed alert list with the resolution:auto-dismissed filter.
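
As a rough sketch of how this surfaces in the REST API (assuming auto-dismissal is reported through the alert’s state; the values below are hypothetical and most fields are omitted), an auto-dismissed alert might look like this:

{
  "number": 17,
  "state": "auto_dismissed",
  "dependency": {
    "package": { "ecosystem": "npm", "name": "example-package" },
    "scope": "development"
  }
}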

How will this experience look and feel?

Alerts identified as false positives will be automatically dismissed without a notification or new pull request, and will appear as a special timeline event. As these alerts are closed, you’ll still be able to review any auto-dismissed alerts with the resolution:auto-dismissed filter.

How do I reopen an automatically dismissed alert?

Like any manually dismissed alert, you can reopen an auto-dismissed alert from the alert list view or details page. This specific alert won’t be auto-dismissed again.

What happens if alert metadata changes or advisory information is withdrawn?

Dependabot recognizes and immediately responds to any changes to metadata that invalidate the auto-dismissal logic. For example, if you change the dependency scope and the alert no longer meets the criteria to be auto-dismissed, the alert will automatically reopen.

How can I enable or disable the feature?

This feature is on by default for public repositories and opt-in for private repositories. Repository admins can opt in or out from the Dependabot alerts settings on the Code security page.

Is this feature available for enterprise?

Yes! In addition to all free repositories, this feature will ship immediately to GHEC and to GHES in version 3.10.

What’s next?

Next, we’ll expose our underlying engine – which enables Dependabot to perform actions based on a rich set of contextual alert metadata – so you can write your own custom rules to better manage your alerts, too.

How do I learn more?

To learn more, check out the Dependabot alerts documentation.

How do I provide feedback?

Let us know what you think by providing feedback — we’re listening!


Following a successful beta with lots of great customer feedback, webhook forwarding in the GitHub CLI is now available to everyone.

Webhook forwarding makes it easy to test your webhooks integration in your local environment without having to worry about port forwarding.

All it takes to start receiving webhooks locally is one simple command:

gh webhook forward --repo monalisa/hello-world --events issues,pull_request --url http://localhost:4000/webhooks

To learn more, head over to "Receiving webhooks with the GitHub CLI" in the docs.


A GitHub Actions workflow run is made up of one or more jobs and each job is associated with a check run. The workflow_job webhook is sent during state transitions of a workflow job. The job state is included in the webhook payload as the action property, which currently takes the values of queued, in_progress, or completed.

With this change, the workflow_job webhook will now support a new waiting state whenever a job is waiting on an environment protection rule, aligning with the waiting state of the corresponding check run. This enables better insight into the progress of a job when using environment protection rules.

In addition, when a job refers to an environment key in its YAML definition, the resulting workflow_job webhook payload will also include a new property, deployment, with the metadata about the deployment created by the check run.
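
For illustration, an abridged workflow_job payload for a job waiting on a protection rule for a hypothetical production environment might look roughly like this (the values are made up and most fields are omitted; see the webhook documentation for the full schema):

{
  "action": "waiting",
  "workflow_job": {
    "name": "deploy",
    "status": "waiting",
    "run_id": 987654321
  },
  "deployment": {
    "environment": "production"
  }
}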

Learn more about using environments for deployment and about jobs in a workflow.

For questions, visit the GitHub Actions community.

To see what's next for Actions, visit our public roadmap.


Recently, GitHub added webhooks to our OpenAPI schema. Now, "Webhook events and payloads" in the GitHub documentation is built from the OpenAPI schema. The schema-generated documentation is more accurate and comprehensive and includes the payload structure for each event and action type.

Currently, the new webhook docs are available for the Free/Pro/Team and GitHub Enterprise Cloud plans. GitHub Enterprise Server and GitHub AE will get the new docs with the next version release.

Do you have ideas for improvement? Open a documentation issue to let us know.


We've launched a limited public beta of a new feature in the GitHub CLI: webhook forwarding.

Webhook forwarding makes it easy to test your webhooks integration in your local environment without having to worry about port forwarding.

All it takes to start receiving webhooks locally is one simple command:

gh webhook forward --repo monalisa/hello-world --events issues,pull_request --url http://localhost:4000/webhooks

With webhook forwarding, you can iterate quickly on your integration without having to deploy your code to a test environment.

To request access to the beta program, post in our GitHub Community discussion. We add new beta users on a regular basis. Once you've been added, you will receive an email at the address registered on your GitHub account.

For more details on this new feature, head over to the docs – see "Receiving webhooks with the GitHub CLI".


This feature is available to repositories enrolled in the Pull Request Merge Queue beta.

A new webhook event and GitHub Actions workflow trigger (merge_group) makes it easier to run required status checks on merge groups created by merge queue. A merge group includes the changes from one or more pull requests and must pass the status checks required by the target branch.

A merge_group webhook event, which currently has one supported action (checks_requested), is sent after a merge group is created and informs receivers, including GitHub Actions, when status checks are needed on the merge group. The event payload includes head_sha, the commit SHA that should be validated and have status reported on using check runs or commit statuses. For GitHub Actions, status is reported automatically at the conclusion of jobs in the triggered workflow.
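
As a rough illustration, an abridged merge_group payload might look like the following (the values are hypothetical and most fields are omitted; see the webhook documentation for the full schema):

{
  "action": "checks_requested",
  "merge_group": {
    "head_sha": "ec26c3e57ca3a959ca5aad62de7213c562f8c821",
    "base_ref": "refs/heads/main"
  }
}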

To trigger a GitHub Actions workflow for a merge group, the merge_group trigger should be used. The following example triggers on individual pull requests and merge groups targeting the main branch:

# Trigger this workflow on individual pull requests and merge groups that target the `main` branch
on:
  pull_request:
    branches: [ main ]
  merge_group:
    branches: [ main ]

A push event is still sent when a merge group branch is created, and will trigger a GitHub Actions workflow. However, unlike a merge_group event, a push event does not include the target branch of the merge group.

Learn more about using merge queue.

Learn more about the new GitHub Actions merge_group workflow trigger and the merge_group webhook event.


When a new tag is created, the push webhook payload will now always include a head_commit object that contains the data of the commit that the new tag points to. In other words, the head_commit object will always contain the commit data of the payload's after commit.

Previously, during tag creation, there were certain circumstances where the head_commit would contain the data of a different commit.
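
For example, an abridged push payload for a newly created tag might look like the following (hypothetical values); note that head_commit.id matches the after field:

{
  "ref": "refs/tags/v1.2.0",
  "before": "0000000000000000000000000000000000000000",
  "after": "59b2c3d4e5f60718293a4b5c6d7e8f9012345678",
  "created": true,
  "head_commit": {
    "id": "59b2c3d4e5f60718293a4b5c6d7e8f9012345678",
    "message": "Release v1.2.0"
  }
}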
