Deep Learning for Geometric Shape Understanding Workshop in conjunction with CVPR 2019
June 17, 2019
Long Beach, CA http://ubee.enseeiht.fr/skelneton/
Computer vision approaches have made tremendous progress toward understanding shape from various data formats, especially since entering the deep learning era. Although accurate results have been obtained in detection, recognition, and segmentation, comparatively little attention has been paid to extracting topological and geometric information from shapes. Such geometric representations provide compact and intuitive abstractions for modeling, synthesis, compression, matching, and analysis. Extracting them differs significantly from segmentation and recognition tasks, as these representations encode both local and global information about the shape.
This workshop aims to bring together researchers from computer vision, computer graphics, and mathematics to advance the state of the art in topological and geometric shape analysis using deep learning.
The SkelNetOn Challenge is structured around shape understanding in three domains. We provide shape datasets, complementary resources (e.g., pre/post-processing, sampling, and data augmentation scripts), and the testing platform. The winner of each track will receive a Titan RTX GPU.
Submissions to the challenge will perform one of the following tasks:
- Shape pixels to skeleton pixels: Extract skeleton pixels from a binary shape image. This is a binary classification problem where image pixels are labeled as on or off the skeleton.
- Shape points to skeleton points: Extract skeleton points from a shape point cloud. This may be treated as a binary classification problem where points are labeled as on or off the skeleton, though other formulations (e.g., transformer networks) are also acceptable.
- Shape pixels to parametric curves: Extract a parametric representation of a network of curves in the skeleton and their radii, modeled as a degree-5 Bézier curve in three dimensions (two spatial coordinates and the radius). This may be thought of as a regression problem.
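To illustrate the curve model used in the third track, the sketch below evaluates a degree-5 Bézier curve whose control points carry two spatial coordinates plus a radius. This is a hypothetical illustration of the representation, not the challenge's official evaluation code; the function name and example control points are our own.

```python
import math

def bezier5(control_points, t):
    """Evaluate a degree-5 Bezier curve at parameter t in [0, 1].

    control_points: six (x, y, r) tuples -- two spatial coordinates
    plus the skeleton radius, as in the challenge description.
    """
    n = 5
    point = [0.0, 0.0, 0.0]
    for i, cp in enumerate(control_points):
        # Bernstein basis polynomial B_{i,5}(t)
        b = math.comb(n, i) * (t ** i) * ((1 - t) ** (n - i))
        for d in range(3):
            point[d] += b * cp[d]
    return tuple(point)

# Example: a straight skeleton branch whose radius tapers from 1.0 to 0.0
ctrl = [(i / 5.0, 0.0, 1.0 - i / 5.0) for i in range(6)]
print(bezier5(ctrl, 0.5))  # midpoint: (0.5, 0.0, 0.5)
```

A network of skeleton curves would then be a set of such control polygons, one per branch, which is what makes the task naturally a regression problem over control-point coordinates.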
Call for Papers:
We will have an open submission format: i) participants in the competition are required to submit a paper, and ii) researchers may share their novel unpublished research in deep learning for geometric shape understanding. The top submissions in each category will be invited to give presentations during the workshop and will be published in the workshop proceedings.
Although we encourage all submissions to benchmark their results on the evaluation platform, there are other relevant research areas that our datasets do not address. For those areas, the scope of the submissions may include but is not limited to the following general topics:
- Boundary extraction from 2D/3D shapes
- Geometric deep learning on 3D and higher dimensions
- Generative methods for parametric representations
- Novel shape descriptors and embeddings for geometric deep learning
- Deep learning on non-Euclidean geometries
- Transformation invariant shape abstractions
- Shape abstraction in different domains
- Synthetic data generation for data augmentation in geometric deep learning
- Comparison of shape representations for efficient deep learning
- Applications of geometric deep learning in different domains
The CMT site for paper submissions is https://cmt3.research.microsoft.com/SKELNETON2019/ . Each submitted paper must be no longer than 4 pages excluding references. Please refer to the CVPR author submission guidelines at http://cvpr2019.thecvf.com/submission/main_conference/author_guidelines for formatting instructions. The review process will be double-blind, but papers will be linked to any associated challenge submissions. Selected papers will be published in the IEEE CVPRW proceedings, visible in IEEE Xplore and on the CVF website.
Feb 15: Call for Challenge/Call for Papers
Mar 25: Submissions close
Apr 5: Notification to authors
Apr 10: Camera-ready paper
Jun 17: Workshop
Organizing Committee and Contact:
Ilke Demir, DeepScale, firstname.lastname@example.org
Kathryn Leonard, Occidental College, email@example.com
Géraldine Morin, Univ. of Toulouse, firstname.lastname@example.org
Camila Hahn, Bergische Universität Wuppertal, email@example.com