[Rails] Crawlers you don’t want to trigger an error notification

(Everyday Code – instead of keeping our knowledge in a README.md, let’s share it with the internet)

We are using the exception_notification gem. It is great because it sends us an email directly whenever there is an error on the platform, so we don't have to dig through the logs; the notification lands straight in our inbox.

But some of the crawlers (read: practically all of them) cause errors, mostly cross-origin ones. The JavaScript we deliver is not allowed to run for their domain, and this generates an error. One way to stop these notifications is to ignore errors caused by crawlers in the exception_notification configuration:

Rails.application.config.middleware.use ExceptionNotification::Rack,
    ignore_crawlers: %w{Googlebot bingbot YandexBot MJ12bot facebookexternalhit SEMrushBot AhrefsBot},
    email: {
      email_prefix: ...,
      sender_address: ...,
      exception_recipients: ...
    }
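
As far as we can tell, ignore_crawlers is just a list of tokens matched against the request's User-Agent header, and the notification is skipped when the header contains one of them. Here is a minimal sketch of that kind of matching, assuming a plain substring check (the constant and method names are ours for illustration, not the gem's API):

IGNORED_CRAWLERS = %w{Googlebot bingbot YandexBot MJ12bot facebookexternalhit SEMrushBot AhrefsBot}

# Returns true when the User-Agent string contains one of the ignored crawler tokens
def crawler_request?(user_agent)
  return false if user_agent.nil?
  IGNORED_CRAWLERS.any? { |crawler| user_agent.include?(crawler) }
end

# Quick checks with realistic User-Agent strings
puts crawler_request?("Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)") # true
puts crawler_request?("Mozilla/5.0 (Windows NT 10.0; Win64; x64) Chrome/120.0 Safari/537.36")     # false

If another crawler that is not in the list starts flooding us with notifications, we simply add its User-Agent token to the array.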