Multi-user Telepresence Systems for Remote Collaboration across Wall-Sized Displays — Master-level internship 2019 at ExSitu (Inria Saclay)

Research team: ExSitu – Inria, Saclay.

Lab: Laboratoire de Recherche en Informatique, Bât. 650, campus Université Paris-Sud.

Advisors: Cédric Fleury (cfleury@lri.fr), Michel Beaudouin-Lafon (mbl@lri.fr).

Keywords: telepresence, computer-mediated communication, wall-sized displays.

Context:

With the increase in computing power, large and complex datasets are becoming common in science, industry, business and society. Wall-sized displays [3] are powerful tools for analyzing such data and performing complex manipulation tasks. They can display large amounts of data at high resolution and support multiple users collaborating in front of the display [4]. However, remote collaboration in such environments remains a challenge. This internship takes place in the context of the DIGISCOPE project (http://www.digiscope.fr/), which is creating a network of ten interconnected platforms on the Paris-Saclay campus, including virtual reality systems, large 3D displays and wall-sized displays.

Description of the internship:

To explore remote collaboration across two wall-sized displays, we designed CamRay [2] (Fig. 1), a system that uses an array of cameras embedded in each display to capture live video of the users as they move in front of the display. On the remote side, CamRay overlays the video feed on top of the shared content. We implemented two methods for positioning this video: Follow-Me and Follow-You. With Follow-Me, the video window follows the horizontal position of the local user, providing constant visual contact with the remote person. With Follow-You, the video window follows the horizontal position of the remote user, conveying his or her position in front of the display. We ran several experiments and observed that each method has advantages: Follow-You works well for conveying pointing gestures [1], while Follow-Me creates a virtual face-to-face that is beneficial for one-to-one communication.
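The two placement policies can be summarized as choosing which user's horizontal position drives the video window. The sketch below illustrates that choice; all names are hypothetical and this is not CamRay's actual implementation, which is described in [2].

```python
# Minimal sketch of the two video-window placement policies described
# above. Positions are horizontal coordinates along the wall display;
# all function and parameter names are illustrative assumptions.

def place_video_window(policy: str, local_x: float, remote_x: float) -> float:
    """Return the horizontal position of the remote-video window.

    policy   -- "follow-me" or "follow-you"
    local_x  -- local user's position along the display
    remote_x -- remote user's position along the display
    """
    if policy == "follow-me":
        # Window stays in front of the local user: constant visual contact.
        return local_x
    if policy == "follow-you":
        # Window mirrors the remote user's position: conveys where they
        # stand, so pointing gestures line up with the shared content.
        return remote_x
    raise ValueError(f"unknown policy: {policy}")

# Example: local user at 1.5 m, remote user at 4.0 m along the wall.
print(place_video_window("follow-me", 1.5, 4.0))   # window at 1.5
print(place_video_window("follow-you", 1.5, 4.0))  # window at 4.0
```

In practice the window position would be updated continuously from user-tracking data; the sketch only captures which position is used under each policy.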

The goal of this internship is to continue this work by studying in which collaborative situations video is useful and in which situations other representations of the remote partners would be more beneficial. In particular, we want to explore which non-verbal cues need to be transmitted between remote co-workers depending on the current task, and how users could transition between different kinds of representations. We are considering solutions based on simplified graphical representations or 3D audio. The proposed solutions should scale to more than two users and support collaboration among multiple sites (i.e. more than two locations and several users per location). In addition, we want to investigate solutions that let users engage in a one-to-one conversation and leave it easily, as they would in co-located collaboration.

If successful, the work will be submitted to a top conference in Human-Computer Interaction (ACM CHI, ACM UIST, etc.). This internship can also lead to a Ph.D. in our research group.

Requirements:

We are looking for students who are interested in research in Human-Computer Interaction, especially in computer-mediated communication and remote collaboration. Solid programming skills are required, as the student will work with the rendering clusters that drive our wall-sized displays. Experience with video streaming or cameras is appreciated.

References:

  1. Avellino, I., Fleury, C., and Beaudouin-Lafon, M. Accuracy of Deictic Gestures to Support Telepresence on Wall-sized Displays. In Proceedings of the Conference on Human Factors in Computing Systems (CHI ’15), 2015.
  2. Avellino, I., Fleury, C., Mackay, W., and Beaudouin-Lafon, M. CamRay: Camera Arrays Support Remote Collaboration on Wall-Sized Displays. In Proceedings of the Conference on Human Factors in Computing Systems (CHI ’17), 2017.
  3. Beaudouin-Lafon, M., Huot, S., Nancel, M., Mackay, W., Pietriga, E., Primet, R., Wagner, J., Chapuis, O., Pillias, C., Eagan, J., Gjerlufsen, T., and Klokmose, C. Multisurface Interaction in the WILD Room. IEEE Computer 45, 4 (2012), 48–56.
  4. Liu, C., Chapuis, O., Beaudouin-Lafon, M., and Lecolinet, E. Shared Interaction on a Wall-Sized Display in a Data Manipulation Task. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems (CHI ’16), 2016.