Effective Slack alerts

Connor Mendenhall

June 25, 2019

"I think if you maintain a force in the world that comes into people's sleep, you are exercising a meaningful power."

— Don DeLillo, Underworld

Over the last few years, Slack quietly took over my world. As 8th Light grew from one office in Chicago to five cities on two continents, group chat became an important tool for collaborating across multiple time zones and diverse interests. On client teams, Slack became an always-on part of our development workflow, aggregating information from many disconnected systems in one place (right next to the cat gifs). In open source, all but the most stubborn holdouts left the old world of IRC to settle an archipelago of undiscovered Slacks.

Some days, Slack lives up to its multibillion dollar promise to "put collaboration at your fingertips." On other days, it's "an all-day meeting with random participants and no agenda." Either way, it's usually the first app I open in the morning, and the last one I close. Even then, it follows me home on my phone and seeps into my email inbox, one of those tools I can never really quit.

Slack took off in tech in part by making it easy for teams to aggregate notifications in one place and respond in real time. On many software teams, bots and integrations provide a constant stream of updates from project management tools, bug trackers, version control, and continuous integration systems, at every step of the development and deployment pipeline. Unlike email, where billions of unread automated messages have become permaclutter in inboxes and servers around the world, the real-time nature of these alerts is a good match for a real-time tool like chat. More important, Slack and its many integrations have made monitoring and alerting easier and more accessible for small teams.

Sending alerts to Slack can have huge benefits by providing fast feedback in one central location, but it's easy for notifications to become noisy and overwhelming. More than one of my teams has resorted to creating dedicated "no robots" channels where humans can talk to each other free of automated interference. When notifications become noise, teams lose the information-sharing benefits that made Slack so useful in the first place, and critical signals can quickly drown in a sea of irrelevant detail.

Fortunately, the same principles that apply to monitoring production systems apply to Slack notifications. Good alerts of any kind should be actionable, focused, unique, real, and urgent.


Actionable

If a human doesn't need to do something in response to an alert, it's probably noise. Does your team need to see every new Git commit on every branch, or just a few important ones? Does anyone need to react when the CI build passes or test coverage increases? Team chat makes it easy to tell whether notifications are actionable: just pay attention to whether anyone discusses them or does something when they appear. If an automated message doesn't provoke a conversation or communicate something useful, think hard about whether it really needs to exist.
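One way to keep alerts actionable is to filter events before they ever reach Slack. The sketch below illustrates the idea with a CI webhook handler: passing builds and feature-branch chatter are dropped, and only failures on the main branch are posted. The event fields, the `is_actionable` rule, and the webhook URL are all hypothetical; adapt them to whatever your CI system actually sends.

```python
import json
import urllib.request

# Hypothetical Slack incoming-webhook URL -- substitute your own.
WEBHOOK_URL = "https://hooks.slack.com/services/T000/B000/XXXX"

def is_actionable(event):
    """A failure on the main branch needs a human response;
    passing builds and feature-branch failures are noise here."""
    return event["status"] == "failed" and event["branch"] == "main"

def post_alert(event):
    """Post an actionable CI event to Slack; silently drop the rest."""
    if not is_actionable(event):
        return False
    payload = {"text": f":x: Build {event['build_id']} failed on main: {event['url']}"}
    request = urllib.request.Request(
        WEBHOOK_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(request)
    return True
```

The filter is deliberately a separate, pure function: it's easy to unit test, and easy to tighten later when a once-useful alert turns into noise.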


Focused

One person's signal is another's noise. Focus notifications on the smallest set of people who need to know about a particular alert rather than notifying everyone of everything all the time (a sure way to drive everyone bonkers). Create user groups to notify subsets of the full team. Use separate channels with distinct purposes instead of sending all your alerts to one place. And consider dedicated channels for important alerts that might otherwise disrupt human conversations.
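In practice, focus often comes down to a routing table: each kind of alert maps to exactly one channel and, when appropriate, one user group, and everything else is dropped. A minimal sketch, with made-up alert kinds, channel names, and user groups:

```python
# Hypothetical routing table: each alert kind goes to the one channel
# (and optional user group) that actually needs it. `None` means the
# alert isn't actionable and should be dropped entirely.
ROUTES = {
    "deploy_failed": {"channel": "#ops-alerts", "mention": "@oncall-ops"},
    "payment_error": {"channel": "#payments", "mention": "@payments-team"},
    "build_passed": None,
}

def route(alert):
    """Return a (channel, message) pair for an alert, or None to drop it."""
    destination = ROUTES.get(alert["kind"])
    if destination is None:
        return None
    mention = destination.get("mention")
    prefix = f"{mention} " if mention else ""
    return destination["channel"], prefix + alert["text"]
```

Keeping the routing in one table makes the team's alerting policy visible at a glance, and makes "who actually needs this?" a question you answer once per alert kind instead of once per message.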


Unique

Integrations sometimes overlap. If two tools are posting similar messages at the same time, one of them is probably noise. But the perils of duplication run deeper. "Don't repeat yourself" is about knowledge, both in code and on Slack. If the same notification appears in many channels, the information it conveys and the conversation it triggers will end up fragmented between different people in different places.


Real

At their best, false positives are distracting annoyances. At their worst, they create chronic background noise that will eventually cause real alerts to go unnoticed. If an alert is a false alarm, fix or remove it right away. This is obvious in principle but surprisingly hard in practice. Software changes frequently and alerts tend to lag behind the pace of code changes. It's easy for alerts that used to be useful to turn into noise over time as the circumstances of your application change. Be ruthless about pruning false positives.


Urgent

Real-time tools are about the present. If your alert does not need attention in the present moment, don't send it to chat. (Ever notice that Slack's own weekly summary messages are emails? Now imagine the horrifying nightmare world where Slackbot sends these messages instead.) Of course, if your alert really, really needs attention in the present moment, don't rely only on Slack. Send it over multiple channels and consider using a tool designed more explicitly for incident response and escalation. "Hope someone on the team hasn't discovered Do Not Disturb" is not an effective alerting strategy.
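The "send it over multiple channels" advice can be sketched as a simple fan-out: hand the alert to every configured sender (Slack webhook, SMS gateway, pager), and make sure a failure in one channel never blocks the others. The sender functions here are stand-ins for whatever delivery mechanisms your team actually uses.

```python
def notify(alert, senders):
    """Fan an urgent alert out over every configured channel.

    `senders` is a list of callables (e.g. a Slack webhook poster,
    an SMS gateway, a pager API client). A failure in one channel
    must not prevent delivery over the remaining ones.
    Returns the names of the senders that succeeded.
    """
    delivered = []
    for send in senders:
        try:
            send(alert)
            delivered.append(send.__name__)
        except Exception:
            continue  # keep trying the remaining channels
    return delivered
```

Dedicated incident-response tools handle the harder parts (acknowledgement, escalation policies, on-call schedules), but the core principle is the same: urgent alerts should never depend on a single delivery path.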

Slack has squarely conquered the tech world. As it turns towards worlds yet to conquer, automated alerts may well become part of workplaces everywhere. Whether your team monitors a fleet of servers, a folder of spreadsheets, or occasional dentist appointments, following the principles of effective monitoring will make your alerts more useful and your team happier.

Connor Mendenhall

Former Technical Director

Connor Mendenhall is an experienced technical leader who has served as a software developer, architect, site reliability engineer, project manager, and engineering manager.