Building a Robot that Can Play Angry Birds on a Smartphone (or, Robots Are the Future of Testing)

Type:
Talk
Audience level:
Novice
Category:
Testing
March 10th 4:15 p.m. – 4:55 p.m.

Description

Can your robot play Angry Birds? On an iPhone? Mine can. I call it "BitbeamBot". It started as an art project, but it has a much more serious practical application: mobile web testing. To trust that your mobile app truly works, you need an end-to-end test on the actual device. BitbeamBot is an Arduino-powered open-source hardware CNC robot that can test any application on any mobile device.

Abstract

BitbeamBot

(Watch the video of BitbeamBot playing Angry Birds.)

For the confidence that your mobile app truly works, you need an end-to-end test on an actual device. This means testing the full combination of device manufacturer, operating system, data network, and application. And since mobile devices are meant to be handled with the human hand, you need something like a real hand to do real end-to-end testing. At some point, after lots of repetitive manual testing, the inevitable question is asked: "Can we, or should we, automate the testing of the old features, so I can focus the manual testing effort on the new features?"

That's where BitbeamBot comes in. BitbeamBot is an Arduino-powered, open-source hardware CNC robot that can test any application on any mobile device -- touching the screen just like a user would. It uses Python and the Selenium automation API to work its magic. In the future your testing will be automated... with robots.
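
To make that concrete, here is a minimal sketch of what driving the robot from Python could look like. It assumes the Arduino firmware accepts simple newline-terminated commands over a serial line; the command grammar, port name, and BitbeamBot class here are made-up placeholders, not the project's actual protocol:

```python
import serial  # pyserial


class BitbeamBot(object):
    """Toy driver for the robot; the command grammar is hypothetical."""

    def __init__(self, port="/dev/ttyUSB0", baud=9600):
        self.conn = serial.Serial(port, baud, timeout=1)

    def send(self, command):
        # Commands are newline-terminated ASCII, e.g. "G1 X20 Y35".
        self.conn.write((command + "\n").encode("ascii"))

    def tap(self, x_mm, y_mm):
        self.send("G1 X%d Y%d" % (x_mm, y_mm))  # move the pin over the point
        self.send("PEN DOWN")                   # touch the screen
        self.send("PEN UP")                     # lift off


if __name__ == "__main__":
    bot = BitbeamBot()
    bot.tap(25, 40)  # tap 25 mm right and 40 mm down from the gantry origin
```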

At the moment, BitbeamBot is just a prototype, but it can play games with simple mechanics, like Angry Birds. However, it's not very smart; it can't yet "see" where objects are on the screen. From my computer, I send electrical signals to two motors that move the pin over any point on the iPhone screen, and a third motor pushes the pin down to the screen surface to tap or drag objects. This open-loop, flying-blind approach is how automated software testing was done years ago, and it's the worst way to automate. Without any sense of what's actually visible, the script fails whenever what's on the screen differs from what you expected when you wrote it.
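
Concretely, the open-loop version is nothing more than a fixed mapping from screen coordinates to motor positions. The calibration constants below are invented for illustration; the point is that nothing ever checks the screen:

```python
# Open-loop coordinate mapping: screen pixels -> motor step counts.
# All calibration constants are hypothetical example values.
STEPS_PER_MM = 10.0              # motor steps per millimeter of travel
PX_PER_MM = 6.3                  # screen pixels per millimeter (assumed)
ORIGIN_OFFSET_MM = (12.0, 18.0)  # gantry origin to screen corner


def screen_px_to_steps(px, py):
    """Convert a screen pixel coordinate to (x, y) motor step counts."""
    x_mm = ORIGIN_OFFSET_MM[0] + px / PX_PER_MM
    y_mm = ORIGIN_OFFSET_MM[1] + py / PX_PER_MM
    return (int(round(x_mm * STEPS_PER_MM)), int(round(y_mm * STEPS_PER_MM)))


# Flying blind: we simply trust that the button is at pixel (160, 240).
print(screen_px_to_steps(160, 240))
```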

A better approach to testing with BitbeamBot will involve closing the feedback loop: having software determine where to click based on what is actually on screen. There are two styles I'll experiment with: black box and grey box. Black box testing is easier to get started with, but grey box testing is more stable in the long term.

Black box testing requires no internal knowledge of the application; it treats the application as a metaphorical "black box", so if something is not visible via the user interface, it doesn't exist. To get this approach working with BitbeamBot, I'll place a camera over the phone, send the image to my computer, and use an image-based testing tool like Sikuli. Sikuli works by taking a screenshot and then using the OpenCV image-recognition library to find elements like text or buttons in the screenshot.
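
A rough Python approximation of what that pipeline does, using OpenCV's template matching directly on a frame from the overhead camera; the template file name and confidence threshold are placeholder assumptions:

```python
import cv2

# Grab one frame from the overhead camera (device 0) pointed at the phone.
camera = cv2.VideoCapture(0)
ok, frame = camera.read()
camera.release()
if not ok:
    raise RuntimeError("could not read a frame from the camera")

# Template image of the UI element we want to tap (placeholder file).
template = cv2.imread("play_button.png")
h, w = template.shape[:2]

# Slide the template over the frame and score every position.
scores = cv2.matchTemplate(frame, template, cv2.TM_CCOEFF_NORMED)
_, best_score, _, best_xy = cv2.minMaxLoc(scores)

if best_score > 0.8:  # confidence threshold, tuned by hand
    center = (best_xy[0] + w // 2, best_xy[1] + h // 2)
    print("tap at", center)  # feed this to the motor-control layer
else:
    print("element not found on screen")
```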

The other style is grey box testing. It's a more precise approach, but it requires access to, and internal knowledge of, the application. I'll implement it by extending the Selenium library and tethering the phone to my computer with a USB cable. With the USB debugging interface turned on, I can ask the application precisely which elements are on screen and where they are, before moving the BitbeamBot pin to the right location.
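
A sketch of that grey-box flow, assuming a WebDriver-compatible server is reachable for the tethered phone; the URL, capabilities, and element id below are placeholder assumptions, not the project's actual setup:

```python
from selenium import webdriver

# Connect to a WebDriver server exposed for the tethered device.
# The URL stands in for whatever the USB bridge actually provides.
driver = webdriver.Remote(
    command_executor="http://localhost:4444/wd/hub",
    desired_capabilities={"browserName": "iphone"},
)

driver.get("http://example.com/app-under-test")

# Ask the app where the element really is, instead of guessing.
button = driver.find_element_by_id("play")  # placeholder element id
loc, size = button.location, button.size
center_px = (loc["x"] + size["width"] // 2,
             loc["y"] + size["height"] // 2)

# Hand the on-screen coordinate to the robot's motor-control layer.
print("move pin to", center_px)
driver.quit()
```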

BitbeamBot's home is bitbeam.org. The hardware, software, and mechanical designs are open source and available on GitHub.