Link to the video demonstration

Point and Share: From Paper to Whiteboard

Collaboration around a traditional or digitally mediated whiteboard often fails to elicit participation from those in the room who prefer working with traditional media. We present Point and Share, a system that encourages participation in collocated meetings by allowing users who employ traditional media (pen and paper) to author content on a shared whiteboard.

The inspiration for Point and Share stems from our own experience during group meetings and brainstorming sessions. We observed that while some of us participated and collaborated using the whiteboard as a shared medium, others scribbled in their notebooks and could therefore share ideas only with those sitting in physical proximity to them. By closely examining traditional input tools and the affordances they provide, we designed a system that lets users write with a pen in their notebooks and still share their ideas with everyone in the meeting room.

Using the whiteboard as a shared medium through a personal notepad

Point and Share's interaction and infrastructure

Point and Share photo


Our input device is a digital pen that captures handwriting on paper and converts it into digital data [11]. We extended the pen design to include gestural pointing and zoom in/out functions by attaching a custom-designed pen cap. The hardware attachment consists of a linear potentiometer, an IR LED, and a switch, connected to an Arduino board. By sliding the potentiometer up and down the length of the pen cap, the user can increase or decrease the size of their canvas area on the whiteboard inside which their text is displayed. A laptop acts as the central server: it collects the digitized content from each user's pen and projects it on the whiteboard at the spot selected by the user. Our software displays the text written on paper on the whiteboard, handles the interaction mechanisms, and takes a snapshot of the whiteboard every ten seconds to archive the content. The current system is built in Processing; for Wiimote IR tracking we use Johnny Chung Lee's open source code.
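As a rough illustration of the pen-cap behavior, the Arduino firmware would read the linear potentiometer as a 10-bit ADC value and map it linearly to the user's canvas size on the whiteboard. The pin, value ranges, and pixel limits below are illustrative assumptions, not the actual firmware:

```cpp
#include <cassert>

// Sketch of the potentiometer-to-canvas mapping (an assumption for
// illustration, not the actual Point and Share firmware). On an
// Arduino, analogRead() returns a 10-bit value in 0..1023; we clamp
// it and map it linearly to a canvas width in projector pixels.
const int ADC_MAX = 1023;       // Arduino 10-bit analogRead() range
const int CANVAS_MIN_PX = 120;  // hypothetical smallest canvas width
const int CANVAS_MAX_PX = 640;  // hypothetical largest canvas width

// Map a raw potentiometer reading to a canvas width in pixels.
int canvasWidthPx(int adcValue) {
    if (adcValue < 0) adcValue = 0;          // clamp below range
    if (adcValue > ADC_MAX) adcValue = ADC_MAX;  // clamp above range
    long span = CANVAS_MAX_PX - CANVAS_MIN_PX;
    return CANVAS_MIN_PX + static_cast<int>(span * adcValue / ADC_MAX);
}
```

In the firmware's loop, the result of this mapping would be sent over serial to the laptop server, which resizes that user's projected canvas accordingly.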

Collaborative Project: Austin S. Lee, Misha Sra, Gonglue Jiang and Shen-Ying Pao
Fall 2011, MAS.834: Tangible Interfaces, MIT Media Lab, Advisor: Hiroshi Ishii
25th ACM UIST Symposium, Cambridge, MA, USA (UIST 2012 Demo project)

Misha Sra, Austin Lee, Sheng-Ying Pao, Gonglue Jiang, and Hiroshi Ishii. 2012. Point and Share: from paper to whiteboard. In Adjunct Proceedings of the 25th Annual ACM Symposium on User Interface Software and Technology (UIST Adjunct Proceedings '12). ACM, New York, NY, USA, 23-24. DOI=10.1145/2380296.2380309

Download Paper