Sunday 10 a.m.–1 p.m.
This poster presents [Affirm](https://www.affirm.com/)'s approach to instrumenting Python. We place great importance on our metrics infrastructure because metrics help engineers find and fix bugs quickly, iterate on features confidently, and make data-driven decisions. We start by discussing how we emit metrics in our Python code by extending the Python logger, and how we keep this process reliable without impacting application performance. We then present our metrics pipeline: how we use open source software for metrics collection ([Riemann](http://riemann.io/)), storage ([Elasticsearch](https://www.elastic.co/products/elasticsearch)), visualization ([Grafana](https://grafana.com/)), and alerting ([Cabot](https://github.com/Affirm/cabot)). We explain why we believe the tools we chose are reliable and scalable, and describe the checks we have built to ensure our pipeline is working. Finally, we present the metrics we collect for [celery](http://www.celeryproject.org/) tasks and the insights these metrics have given us.
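The abstract mentions emitting metrics by extending the Python logger. As a rough illustration of what that can look like (this is a minimal sketch, not Affirm's actual API; the `METRIC` level, `MetricLogger` class, and `metric` method are all hypothetical names), one approach is a `logging.Logger` subclass that adds a custom level and emits structured JSON records for a downstream collector such as Riemann to parse:

```python
import json
import logging
import time

# Hypothetical sketch: a custom log level between INFO (20) and
# WARNING (30) reserved for metric records.
METRIC_LEVEL = 25
logging.addLevelName(METRIC_LEVEL, "METRIC")

class MetricLogger(logging.Logger):
    """Logger subclass with a `metric` method emitting structured JSON."""

    def metric(self, name, value, **tags):
        if self.isEnabledFor(METRIC_LEVEL):
            payload = {
                "metric": name,
                "value": value,
                "time": time.time(),
                "tags": tags,
            }
            # Emit one JSON record per data point; a handler can ship
            # these to a collector without touching application code.
            self._log(METRIC_LEVEL, json.dumps(payload), ())

logging.setLoggerClass(MetricLogger)
log = logging.getLogger("app.checkout")
log.metric("checkout.latency_ms", 42.0, endpoint="/charge")
```

Routing metrics through the existing logging machinery means handlers, filters, and levels all apply unchanged, so metric emission can be throttled or disabled like any other log output.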
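For the celery task metrics mentioned above, the typical measurements are per-task runtime and success/failure counts. The sketch below shows those measurements with a plain decorator and the stdlib logger so it stays self-contained; an actual deployment would more likely hook celery's signal machinery, and `timed_task` and `send_receipt` are illustrative names, not Affirm's code:

```python
import functools
import logging
import time

log = logging.getLogger("tasks.metrics")

def timed_task(fn):
    """Wrap a task, logging its runtime and outcome as a metric line."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        start = time.monotonic()
        try:
            result = fn(*args, **kwargs)
        except Exception:
            # Failures are counted too: error-rate spikes are often the
            # first visible symptom of a bug.
            log.info("task=%s status=failure runtime_ms=%.1f",
                     fn.__name__, (time.monotonic() - start) * 1000)
            raise
        log.info("task=%s status=success runtime_ms=%.1f",
                 fn.__name__, (time.monotonic() - start) * 1000)
        return result
    return wrapper

@timed_task
def send_receipt(order_id):
    # Stand-in for a real celery task body.
    return f"receipt for {order_id}"
```

Plotting runtime percentiles and failure counts per task name is the kind of view that makes queue backlogs and slow regressions easy to spot.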