Research team: ExSitu – Inria, Saclay (Paris metropolitan area)
Location: Laboratory of Computer Science (LRI), Bât. 650 Ada Lovelace, University of Paris-Saclay
Duration: 4 – 6 months
Advisors: Theophanis Tsandilas (firstname.lastname@example.org) and Cédric Fleury (email@example.com)
Digital fabrication has revolutionized the way professional designers and engineers work. We have visited large architectural firms in Paris and observed that laser cutters and 3D printers have been replacing their traditional modeling techniques. This change has sped up production times, but it has also enabled architects and modelers to experiment with new materials and improve the quality of their models. Nevertheless, current digital fabrication tools still rely on traditional CAD software, where a great part of the creation process takes place in front of a computer screen. Unfortunately, this process does not capture the way design teams work on highly creative projects, where concepts evolve through parallel representations, including sketches on paper and low-cost physical prototypes (Bousseau et al., 2016), before they reach their final detailed form. We have recently developed material-based sensing technologies (Wessely et al., 2018) that allow synchronizing the physical and digital representations of such models.
However, such technologies are currently hard to deploy, and digital editing is still performed in front of a computer screen.
Internship Goals: The proposed internship will investigate how to augment a collaborative physical modeling process with an augmented-reality (AR) system. We are especially interested in collaboration scenarios where each participant (e.g., the architect or the modeler) potentially has a different design role and task. This requires virtual views adapted to the expertise and design perspective of each collaborator. In our previous work, we have studied remote collaboration in immersive environments (Fleury et al., 2012) and on large wall-sized displays (Avellino et al., 2017). Our challenge is to extend this work to co-located settings, where collaborators interact with the same objects but have access to different virtual views. Recent work (Xia et al., 2018) has studied a range of techniques for sharing views in virtual-reality environments.
However, these techniques cannot be directly applied to AR environments, where collaborators share a common physical space and leveraging natural body cues such as gaze and gestures is important.
The implementation will be based on Microsoft HoloLens and/or Meta 2 technologies, but we may also consider other alternatives.
We are looking for students who are enthusiastic about AR technology and interested in research in Human-Computer Interaction. The intern is expected to have solid programming skills and, ideally, previous experience with C# or related programming languages (Java or C++). A background in computer graphics and 3D modeling will be a plus.
The internship could lead to a Ph.D. thesis.
References:

Bousseau, A., Tsandilas, T., Oehlberg, L., and Mackay, W. (2016). How Novices Sketch and Prototype Hand-Fabricated Objects. ACM Conference on Human Factors in Computing Systems (CHI '16), pp. 397-408.

Wessely, M., Tsandilas, T., and Mackay, W. (2018). Shape-Aware Material: Interactive Fabrication with ShapeMe. ACM Symposium on User Interface Software and Technology (UIST '18), pp. 127-139.

Fleury, C., Duval, T., Gouranton, V., and Steed, A. (2012). Evaluation of Remote Collaborative Manipulation for Scientific Data Analysis. ACM Symposium on Virtual Reality Software and Technology (VRST '12), pp. 129-136.

Avellino, I., Fleury, C., and Beaudouin-Lafon, M. (2017). CamRay: Camera Arrays Support Remote Collaboration on Wall-Sized Displays. ACM Conference on Human Factors in Computing Systems (CHI '17), pp. 2393-2396.

Xia, H., Herscher, S., Perlin, K., and Wigdor, D. (2018). Spacetime: Enabling Fluid Individual and Collaborative Editing in Virtual Reality. ACM Symposium on User Interface Software and Technology (UIST '18), pp. 853-866.