
Alert routing to multiple Slack channels based on alert label is not working #193

Open
ankitdh7 opened this issue Oct 24, 2024 · 2 comments

Comments

@ankitdh7
Contributor

Problem

Based on the Alertmanager configuration and the PrometheusRule CRD, I expected that alerts could be routed to different Slack channels based on the slack_channel label in the alert. While alerts are indeed being sent to Slack, they always go to the same channel specified by the Slack URL in the secret configuration.

If I create a new Slack outbound webhook with a different Slack URL pointing to another channel, the alerts are routed to that new channel. However, with hundreds of Slack channels needing to receive alerts, this approach is neither scalable nor secure. Manually creating individual webhooks for each Slack channel and configuring them in the Coralogix platform every time is not practical.

Steps to reproduce

git clone https://github.com/coralogix/coralogix-operator.git && cd coralogix-operator
  • Update the Slack URL (base64-encoded) in this Kubernetes secret, then apply it:
echo -n "<slack_url>" | base64   # -n avoids encoding a trailing newline
kubectl apply -f config/samples/alertmanager/slack-url-secret.yaml -n observability
  • Install the operator
helm install coralogix-operator charts/coralogix-operator -n observability --set secret.data.apiKey="<coralogix_api_key>" --set coralogixOperator.image.tag="0.2.3" --set coralogixOperator.region="EUROPE2"
  • Apply this alertmanager config
kubectl apply -f config/samples/alertmanager/example-alertmanager.yaml -n observability
  • Apply the below test alerts to validate alerting and routing
kubectl apply -n observability -f - <<EOF
apiVersion: monitoring.coreos.com/v1
kind: PrometheusRule
metadata:
  name: observability-test-alerts
  labels:
    role: alert-rules
    app.coralogix.com/track-recording-rules: "true"
    app.coralogix.com/track-alerting-rules: "true"
    app.coralogix.com/managed-by-alertmanger-config: "true"
spec:
  groups:
  - name: example
    rules:
    - alert: exampleAlert
      expr: vector(1)
      for: 1m
      labels:
        priority: P5
        slack_channel: "#tm_coralogix_alert_test"
      annotations:
        summary: "Example Alert Triggered"
        description: "This is an example alert that triggers when the expression vector(1) is true for 1 minute."
  - name: example2
    rules:
    - alert: exampleAlert2
      expr: vector(1)
      for: 1m
      labels:
        priority: P5
        opsgenie_team: "demo_team"
      annotations:
        summary: "Example Alert Triggered"
        description: "This is an example alert that triggers when the expression vector(1) is true for 1 minute."
        cxMinNonNullValuesPercentage: "20"
  - name: example3
    rules:
    - alert: exampleAlert3
      expr: vector(1)
      for: 1m
      labels:
        priority: P5
        slack_channel: "#tm_coralogix_alert_test2"
        opsgenie_team: "demo_team"
      annotations:
        summary: "Third Example Alert Triggered"
      description: "This is the third example alert that triggers when the expression vector(1) is true for 1 minute."
  - name: recording-rules
    rules:
    - record: example:recording:rule
      expr: vector(1)
EOF

Actual Output:

  • The alert exampleAlert3, which has the label slack_channel set to tm_coralogix_alert_test2, is being routed to the Slack channel tm_coralogix_alert_test. This happens because the slack-webhook-secret contains the URL for the webhook associated with the tm_coralogix_alert_test channel.

Expected output

  • Alerts should be routed to different Slack channels based on the slack_channel label, using a single Slack outbound webhook. If there is a Coralogix Slack app that can already handle this type of dynamic routing, that would solve the issue. Alternatively, any recommended method to achieve routing based on the slack_channel label would also be great.
@assafad1
Contributor

@ankitdh7 Hi, thanks for reporting this issue.
Assuming you used example-alertmanager.yaml as is, the issue appears to stem from that specific example configuration. You have two alerts, each with a different value for the slack_channel label, but the alertmanagerconfig example you used routes all alerts carrying this label (regardless of its value) to the same receiver, so the resulting alerts all point to the same outbound webhook.
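In standard Alertmanager configuration terms, that behaviour looks roughly like the sketch below (receiver and channel names are illustrative, not taken from the sample file):

```yaml
route:
  routes:
    # Matches ANY non-empty slack_channel value, so every alert carrying
    # the label is sent to the same receiver regardless of the value.
    - matchers:
        - slack_channel =~ ".+"
      receiver: slack-general
receivers:
  - name: slack-general
    slack_configs:
      - channel: "#tm_coralogix_alert_test"
```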

@assafad1
Contributor

assafad1 commented Nov 13, 2024

If you want a new outbound webhook to be created for another channel, with multiple alerts connected to it, you can add a new receiver for that channel to the alertmanagerconfig, plus a route that matches those alerts and sends them to the new receiver.
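A minimal sketch of such a configuration (receiver names are hypothetical; the exact-match route for the second channel is listed before the catch-all route so that it takes precedence):

```yaml
route:
  routes:
    # More specific route first: alerts labelled with the second channel.
    - matchers:
        - slack_channel = "#tm_coralogix_alert_test2"
      receiver: slack-test2
    # Fallback for any other slack_channel value.
    - matchers:
        - slack_channel =~ ".+"
      receiver: slack-general
receivers:
  - name: slack-general
    slack_configs:
      - channel: "#tm_coralogix_alert_test"
  - name: slack-test2
    slack_configs:
      - channel: "#tm_coralogix_alert_test2"
```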
