PromQL: query whether an alert is silenced


I have successfully silenced an alert for a node that's currently down (and will be for a while before we have time to replace it physically).

While I assume the silence will stop the alert from re-surfacing in the Slack channel, I'd also like to get rid of it on the Grafana dashboard we run on top of Prometheus. Here's the query for the respective tile in Grafana.

sum(ALERTS{alertname="NodeDown", alertstate="firing"})

My question is whether there's a keyword I can replace "firing" with ("not silenced" doesn't work, and neither does "silenced " ;}) that will only show me machines whose alerts aren't silenced.


There are 3 answers

brian-brazil:

Silences exist entirely in the Alertmanager; Prometheus doesn't know anything about them. Thus there's no metric inside Prometheus that will tell you whether an alert is silenced.
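
If you need per-alert silenced status, you have to ask the Alertmanager itself. As a minimal sketch (the hostname is a placeholder; 9093 is the Alertmanager's default port), its v2 API can return only the alerts that are active and not silenced:

    # fetch firing alerts, excluding any that are currently silenced
    curl -s 'http://alertmanager.example.com:9093/api/v2/alerts?active=true&silenced=false'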

arnittocrab:

My use case was to get a metric for how many silenced alerts we have. While searching the web, Rafa's answer helped me realize how to do it: I have to scrape the Alertmanager target itself to get some metrics out of it.

I have added a Prometheus job to scrape the Alertmanager's metrics, and you get this gauge metric, alertmanager_silences{}:

# HELP alertmanager_silences How many silences by state.
# TYPE alertmanager_silences gauge

alertmanager_silences{state="active"} 0
alertmanager_silences{state="expired"} 0
alertmanager_silences{state="pending"} 0

The corresponding scrape job in prometheus.yml:

    - job_name: 'alertmanager'
      static_configs:
        - targets: ['prometheus1.foo.bar:9093']
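
With that in place, a Grafana tile can count the active silences with a simple PromQL query, for example:

    sum(alertmanager_silences{state="active"})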
Rafa:

You can add a Prometheus job to scrape the Alertmanager. Then you get metrics like alertmanager_silences:

# HELP alertmanager_silences How many silences by state.
alertmanager_silences{state="active"} 0
alertmanager_silences{state="expired"} 0
alertmanager_silences{state="pending"} 0