PyCon 2019 in Cleveland, Ohio

Sponsor Tutorials

Sponsor Tutorial Registration is now open! Register at /2019/registration/register/. See the schedule in grid form at /2019/schedule/sponsor-tutorials.

42 PyCharm Tips and Tricks

Paul Everitt
Thursday 3:30 p.m.–5 p.m. in Room 13

PyCharm brings a boatload of IDE features to professional Python development. Want to “level up” and learn productivity boosters? This hands-on, fast-paced workshop, run by the PyCharm team, covers tips across all the major product features.

An Introduction to Building Scalable Web Apps with Visual Studio Code and Azure

Nina Zakharenko
Thursday 1:30 p.m.–3 p.m. in Room 23

Part 1: Setting Up Your Environment and Creating a Local Application
- Python environment best practices
- Setting up Visual Studio Code for development with the Python extension
- Anatomy of a Django web app; connecting our application to a database; introduction to Django migrations
- Visual Studio Code productivity tips: using snippets to speed up development by reducing common boilerplate code into reusable templates
- Running a Django web app in our local environment with a local database
- Debugging server code and templates
- Together, we'll build a small API that allows us to accept and interact with user input

Part 2: Web Apps in the Cloud
- Cloud-based databases: creating and connecting to a PostgreSQL database within Azure, as well as running remote database migrations
- Configuration: how to configure your local application to access cloud-based resources
- Deployment: how to deploy our application to Azure Web Apps on Linux in a few clicks from Visual Studio Code
- Looking at logs: seeing what's going on under the hood

Conclusion: wrap up, cover what was learned, and go over next steps and additional resources.

Dog Classification using TensorFlow, Azure, and Visual Studio Code

Rong Lu
Thursday 11 a.m.–12:30 p.m. in Room 13

In this workshop, we will use transfer learning to retrain a MobileNet model in TensorFlow to recognize dog and cat breeds from the Oxford-IIIT Pet Dataset. Next, we'll optimize that model using Azure Machine Learning Service HyperDrive to improve its accuracy. Putting on our developer hats, we'll then refactor the notebooks into Python modules. Finally, we will deploy the model as a web service and send it a few pictures to test! Along the way, you'll use Azure Notebooks, Visual Studio Code, and Azure Machine Learning.

Learning objectives:
- Train models using Jupyter Notebooks and TensorFlow
- Turn notebooks into Python modules
- Deploy models as web services and test them
- Learn how to use Azure and Visual Studio Code

Friendly/Modern AsyncIO

John Reese
Thursday 11 a.m.–12:30 p.m. in Room 14

AsyncIO is a powerful mechanism for building high-performance services, but it has long been seen as "too complicated". It's easier than you might think, and with Python 3.7 it's easier than ever to get started using just the standard library! We'll cover the basics of AsyncIO, take a look at the latest features in 3.7, build friendly and expressive APIs, analyze best practices for third-party libraries, and discuss integrating async code with synchronous libraries and codebases.
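As a small taste of the "just the standard library" point above, here is a minimal sketch using Python 3.7's `asyncio.run` entry point (the coroutine names and delays are invented for illustration):

```python
import asyncio

async def fetch(name: str, delay: float) -> str:
    # Simulate a slow I/O-bound call, e.g. a network request.
    await asyncio.sleep(delay)
    return f"{name} done"

async def main() -> list:
    # Run both "requests" concurrently; results come back in call order.
    return await asyncio.gather(fetch("a", 0.01), fetch("b", 0.02))

# asyncio.run() is the high-level entry point added in Python 3.7.
results = asyncio.run(main())
print(results)
```

Before 3.7 this required manually creating and closing an event loop; `asyncio.run` handles that housekeeping for you.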

From Project to Productionized on Heroku

Casey Faist
Thursday 9 a.m.–10:30 a.m. in Room 23

In this workshop, we will start with a bare-bones Django application. We'll talk about 12-factor Django apps and why you'd want one, then configure our application for deployment on Heroku. We'll add a custom domain name and configure CI and GitHub integration for our project. Next, we'll add some human complexity and talk about how to collaborate with others and how to control and share access on Heroku. Then we'll walk through some more complex code examples and how to get them running on Heroku.

Come prepared with:
- Python 3.7 installed
- Git installed
- Django installed via pip
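The heart of the 12-factor idea mentioned above is reading configuration from the environment rather than hard-coding it. A minimal sketch in plain Python (the variable names and defaults are illustrative, not Heroku- or workshop-specific):

```python
import os

def load_config(env=None):
    """Factor III of the twelve factors: store config in the environment,
    falling back to safe local-development defaults."""
    if env is None:
        env = os.environ
    return {
        "DEBUG": env.get("DJANGO_DEBUG", "false").lower() == "true",
        # Heroku injects DATABASE_URL for provisioned databases;
        # locally the SQLite default applies.
        "DATABASE_URL": env.get("DATABASE_URL", "sqlite:///local.db"),
    }

print(load_config({}))  # local defaults
print(load_config({"DJANGO_DEBUG": "True"}))  # environment override
```

The same code then runs unchanged in development, CI, and production, with only the environment differing.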

Getting Rid of Devs' Big Pain Points - from automating workflows to managing complex dependency trees

Pete Garcin
Thursday 9 a.m.–10:30 a.m. in Room 13

Getting to the Point with Geospatial

Jayson Delancey
Thursday 11 a.m.–12:30 p.m. in Room 23

Audience Level: Beginner to Intermediate

Even if you are a seasoned Python wizard, it takes some effort to get your bearings when working with geographical data. In this workshop, we'll start with some beginner projects to build geospatial applications:
- How to identify the location of an image file with PIL
- How to find places on a map with NLTK
- How to read/write geospatial file types with tools like geopandas
- How to visualize point clouds from GPS traces or LiDAR data with PPTK
- How to store and query vector tiles with XYZ

By the end of our time together, you will have gone from Hello World to having a roadmap for where to learn more about geospatial tools and techniques.
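Before reaching for the libraries above, it's worth seeing how little code basic point-to-point geospatial math takes. A sketch of the classic haversine great-circle distance in pure standard-library Python (the example coordinates, Cleveland and Pittsburgh, are our own illustration):

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points in kilometres."""
    R = 6371.0  # mean Earth radius in km
    dlat = radians(lat2 - lat1)
    dlon = radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * R * asin(sqrt(a))

# Cleveland to Pittsburgh: roughly 185 km as the crow flies.
print(round(haversine_km(41.4993, -81.6944, 40.4406, -79.9959)))
```

Libraries like geopandas wrap this kind of math in vectorized, projection-aware form, but the underlying idea is this small.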

Google Cloud Platform for Pythonistas

Dustin Ingram
Thursday 3:30 p.m.–5 p.m. in Room 23

Support for Python on Google Cloud Platform has never been better. Join us for a tour of Python runtimes, services and client libraries, including App Engine, Cloud Functions, and more. We'll also discuss tools for monitoring and debugging your Python application, and best practices for using Python on GCP. This is explicitly a sponsored product tour - we promise to focus on solving real problems with these tools and to include lots of interactive hands-on demos and example code.

How to Create a GraphQL endpoint on top of a RESTful API

Richard Moot
Thursday 9 a.m.–10:30 a.m. in Room 25C

Audience Level: Beginning to Intermediate

Not all APIs offer a GraphQL endpoint, but it is a wonderful way to consume and explore an API. We will cover creating a GraphQL endpoint on top of a RESTful API using Flask, Graphene, and Square APIs. Through this workshop you'll learn about GraphQL:
- How to create schemas using Graphene
- When to use queries vs. mutations
- What a resolver is and how to construct one
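To make the "what a resolver is" bullet concrete: a resolver is just a function that fetches one field of the requested shape, often by calling the underlying REST API. A toy sketch in plain Python (no Graphene; `fetch_order` and the data stand in for a real REST call and are entirely invented):

```python
# In Graphene, each resolve_* method plays the role sketched here:
# mapping one schema field to a data-fetching function.

FAKE_REST_DB = {"order-1": {"id": "order-1", "total": 1250, "currency": "USD"}}

def fetch_order(order_id):
    """Stand-in for a REST call like GET /orders/{id}."""
    return FAKE_REST_DB[order_id]

RESOLVERS = {
    "id": lambda order: order["id"],
    "total": lambda order: order["total"],
    "currency": lambda order: order["currency"],
}

def execute_query(order_id, fields):
    """Resolve only the fields the client asked for -- the core GraphQL idea."""
    order = fetch_order(order_id)
    return {field: RESOLVERS[field](order) for field in fields}

print(execute_query("order-1", ["total"]))
```

The payoff is that clients request exactly the fields they need, and the resolver layer hides whether the data came from one REST call or several.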

Production-scale PyTorch: TorchScript and the PyTorch JIT

Michael Suo, James Reed
Thursday 9 a.m.–10:30 a.m. in Room 14

In this workshop, we'll give a technical introduction to the various techniques that PyTorch uses to scale a Pythonic machine learning workflow from research to production. We'll talk about how we use Python's dynamism and reflection capabilities to build a high-performance compiler infrastructure for machine learning. After, we'll demo how you can use these tools to ship a machine translation model to production today!

How I Learned to Stop Worrying and Love Python at Google | Tech Talk Series

Wesley J Chun
Thursday 3:30 p.m.–5 p.m. in Room 25C

Abstract: Continuing with tradition, our sponsor workshop session features three awesome half-hour Python tech talks as they relate to Google. Hear about Python tools we've built in-house, how to use Python with Google developer tools to build your web and mobile applications, and best practices for how we use Python internally at Google. You're invited to stop by and hear from a cadre of world-class Google engineers from around the world! :-) Stay tuned for more specific tech talk descriptions … updates are coming here soon!

Python Performance for Poets

Aron Ahmadia, Gil Forsyth, Steven Lott
Thursday 1:30 p.m.–3 p.m. in Room 25C

We live in a world of ever-increasing data, complexity, and computer intelligence. We are constantly making decisions on how to make software processes faster, more efficient, or cheaper to operate. There is a lot of misguided wisdom, folklore, and superstition around software performance that can be as much help as hindrance. In our tutorial, we'll present some fundamental rules of software performance as a set of physical, intuitive "laws", with illustrative examples. Our goal is for every audience member to leave with enough of a grounding in these underlying laws to make rational performance decisions when selecting their programming tools, algorithms, and hardware.

Audience: This tutorial is intended for audiences without a computer science background, or those with a computer science background who are interested in discussing and explaining performance concepts to those without it. Some programming experience in Python will be helpful, as will access to the tutorial notebooks and the ability to run them live. Mathematics will be discussed, but will not be required beyond a high school level to understand and engage.
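One example of such a performance "law" (our illustration; the tutorial's own set of laws may differ) is Amdahl's law: if only a fraction p of a program's runtime can be sped up across n workers, the overall speedup is capped at 1 / ((1 - p) + p / n). A quick sketch:

```python
def amdahl_speedup(p, n):
    """Upper bound on total speedup when a fraction p of the work
    is spread across n workers and the rest stays serial."""
    return 1.0 / ((1.0 - p) + p / n)

# Even with effectively infinite workers, a 90%-parallel program
# tops out at a 10x speedup -- the serial 10% dominates.
for n in (2, 8, 1_000_000):
    print(n, round(amdahl_speedup(0.9, n), 2))
```

This is the kind of intuition that keeps "just throw more cores at it" from becoming superstition.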

Python @Scale: An Instagram Story

Anirudh Padmarao, Benjamin Woodruff, Carl Meyer
Thursday 3:30 p.m.–5 p.m. in Room 14

Instagram employs Python in one of the world's largest settings, using it to serve 1 billion monthly active users. We chose Python because of its reputation for simplicity and practicality, which aligns well with our philosophy of “do the simple thing first.” Come learn about the challenges, learnings, and techniques used at Instagram to maintain and advance our million-LOC, thousand endpoint Django application. We will discuss how we empower engineers to build quickly while managing complexity and maintaining a high-quality codebase. Topics may include AST-based lint frameworks, tools for large-scale codebase refactors, adopting Python type annotations, types for Django HTTP APIs, safe hot-reloading of Python code, and more. This will be an informal setting; if you've ever wondered how Python works at scale, this is your chance to ask!
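The "AST-based lint frameworks" mentioned above can be demystified with the standard library alone. A toy rule that flags bare `print(...)` calls, as our own illustration of the technique, not Instagram's actual tooling:

```python
import ast

def find_print_calls(source: str):
    """Return line numbers of print(...) calls -- the skeleton of an
    AST-based lint rule: parse once, walk the tree, match node shapes."""
    tree = ast.parse(source)
    return [
        node.lineno
        for node in ast.walk(tree)
        if isinstance(node, ast.Call)
        and isinstance(node.func, ast.Name)
        and node.func.id == "print"
    ]

code = "x = 1\nprint(x)\nlog = print\n"
print(find_print_calls(code))  # flags the call, not the bare reference
```

Because the check operates on syntax trees rather than regexes, it distinguishes an actual call on line 2 from the mere name reference on line 3.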

Slack <3 Python: How we develop SDKs and build bots in Python

Rodney Urquhart, Jason Roche
Thursday 1:30 p.m.–3 p.m. in Room 13

In this workshop, we discuss how we develop and maintain our open-source Python tools at Slack. We’ll also teach you how to build your own useful Slack bot and leave plenty of time for Q&A. Beginners are welcome! Developers with Slack experience will learn about recently released features and tools and have face time with the Dev Rel team.

Spatial Analysis Meets Data Science

Alberto Nieto, Shairoz Sohail, Shannon Kalisky
Thursday 1:30 p.m.–3 p.m. in Room 14

Location matters in every problem you are trying to solve, and a geographic approach is often crucial to extracting meaning from data. In this workshop, learn about ArcGIS Notebooks, a built-in Python scripting environment in web GIS (geographic information systems) that puts the power of spatial algorithms and analysis tools into the hands of problem solvers. With ArcGIS Notebooks, you can find and prepare data, develop iterative analysis workflows, train and optimize models, build information products, share results, and collaborate with the rest of your organization. You'll learn how to:
- Bring a spatial perspective into your data science notebooks
- Augment your analysis with ready-to-use datasets
- Visualize your analysis in beautiful maps
- Train and run inference with a deep learning model on geospatial data

Using Machine Learning to Create Proxy Labels for Transaction Data

Tobi Bosede
Thursday 11 a.m.–12:30 p.m. in Room 25C

In banking, we often find ourselves wanting to predict an outcome or variable for which no labels exist to indicate ground truth. Does this mean we cannot apply supervised machine learning techniques? No! In this talk, you will learn how we created proxy labels for recurring transactions using Python and Apache Spark. Some examples of recurring transactions are Netflix and Spotify subscriptions. The best part about our approach is that it did not require paying humans to manually create labels, saving both time and money while increasing accuracy, since it bypasses human error. Because of the robustness of the proxy labels created, we were able to improve upon a previously rule-based process for determining recurring transactions. This ensures that customers do not experience interruption when a card is replaced and that they are notified when subscription costs increase significantly, among many other benefits.

Audience: The target audience is machine learning engineers, data scientists, and finance/banking folks. However, anyone with an interest in machine learning or banking will benefit from the talk.

Objectives: The audience will come away with an understanding of an approach to creating labels for unlabeled data and get exposure to a situation in which Python and Apache Spark provide business value.

Notes: The project has evolved greatly since this article was written, but it still provides useful context for understanding the value proposition of predicting recurring transactions.
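As a simplified illustration of the proxy-label idea (not the talk's actual Spark pipeline, and with entirely invented data): label a (merchant, amount) pair as recurring when the gaps between its charges cluster around a monthly interval.

```python
from collections import defaultdict
from datetime import date

def proxy_label_recurring(transactions, tolerance_days=3):
    """Heuristically label (merchant, amount) pairs whose charges arrive
    roughly every ~30 days. The resulting proxy labels can then serve as
    ground truth for training a supervised model."""
    groups = defaultdict(list)
    for merchant, amount, day in transactions:
        groups[(merchant, amount)].append(day)
    labels = {}
    for key, days in groups.items():
        days.sort()
        gaps = [(b - a).days for a, b in zip(days, days[1:])]
        labels[key] = bool(gaps) and all(abs(g - 30) <= tolerance_days for g in gaps)
    return labels

txns = [
    ("NETFLIX", 13.99, date(2019, 1, 5)),
    ("NETFLIX", 13.99, date(2019, 2, 4)),
    ("NETFLIX", 13.99, date(2019, 3, 6)),
    ("COFFEE SHOP", 4.50, date(2019, 1, 5)),
    ("COFFEE SHOP", 4.50, date(2019, 1, 9)),
]
print(proxy_label_recurring(txns))
```

In a production setting this grouping and gap analysis would run as Spark aggregations over billions of rows, but the labeling logic is the same shape.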