We describe the architecture of an application designed to enhance social interactions at conferences by integrating data from online social networks with data from wearable proximity sensors. The pipeline that processes the stream of sensor data and provides proximity-based services is implemented in Python using the Twisted framework, and has been demonstrated to scale to thousands of simultaneous users.
Social interactions are a key factor in the success of conferences and similar community gatherings. This poster describes the architecture of Live Social Semantics, an application that integrates online information sources from the social Web with the face-to-face proximity of attendees, sensed by wearable active RFID devices embedded in conference badges. The application was successfully deployed at several international academic conferences.

Personal profiles of the participants were generated using data from online social networks (Facebook, Twitter, etc.) and semantic information from academic data sources. The user profiles and their online friendships were integrated in real time with the time-varying, face-to-face proximity graph produced by the wearable sensors. Integrating these heterogeneous data layers makes it possible to enhance the social experience of attendees by visualizing their online and offline networks, and by allowing them to browse and explore their online and offline social neighborhoods to find unexpected connections.

The poster describes the architecture of the application, with a special focus on the real-time pipeline that processes the incoming UDP stream from the wearable proximity sensors, fuses online information with face-to-face proximity relations, and exposes Web services for the user-facing part of the application. The data processing pipeline is a multi-process system implemented entirely in Python, Twisted, and NumPy, and has been demonstrated to handle incoming sensor data from thousands of users simultaneously present at the conference venue.