TY - JOUR
TI - From photos to 3D design: a product shape family design and modeling framework with image-based reconstruction and 3D convolutional neural networks
DO - https://doi.org/doi:10.7282/t3-peb1-4e81
PY - 2019
AB - The development of new products and product variations is largely driven by increasingly sophisticated and demanding customers, and product variety plays a crucial role in satisfying a wider range of customers. Our research focuses on the shape variety of products, which is inherently linked to both functional and aesthetic variety. The challenge we address is low conceptual design efficiency, caused by cross-professional, back-and-forth communication and by the limited design visualization tools available when converting demands into ideas and then into 3D models. We develop a learning-based product shape family design and modeling framework that integrates image-based reconstruction and 3D shape learning to simplify the modeling process while providing ample flexibility for 3D shape variation, thereby improving conceptual design efficiency. With the reconstruction system and the learning system, generating a raw model can be as simple as taking photos around a design or redesign target and selecting from predicted models. Two subsystems handle the reconstruction process: a Structure from Motion system that recovers camera motion and a sparse structure, and a Multi-View Stereo system that generates denser matches and thus denser point clouds. An incremental reconstruction strategy is adopted, and an initial pair selection strategy and an extrinsic matrix correction measure are derived and applied to provide better initial camera motion estimates for subsequent global adjustment optimization. Our experiments show improved accuracy and robustness of the reconstruction system. To interpret the reconstructed point cloud, a 3D convolutional neural network model is constructed to identify 3D shapes from point cloud data. The model is trained on labeled, pre-processed point cloud data obtained from both reconstruction and CAD file sampling. The preprocessing includes normalization, down-sampling, manual labeling, and voxelization. The voxel representation makes it possible to use convolution-style learning methods, and a two-layer Convolution-ReLU-Max Pooling configuration proved to perform well in classifying 3D shapes. A new concept, the Product Shape Family, is defined, and a hierarchically structured library of product shape family trees is proposed so that classification can be performed at different levels with a smaller number of candidate classes. The chain rule is used to calculate the probability of a given shape family, so the user can be presented with multiple best guesses and select among them at different family generations. A modularized model is defined within every product shape family, comprising internal modules that form the product shape platform and optional external modules. Editing the selected shape family model simply involves selecting the desired module and editing its shape with push and pull operations on predefined control points. The new design can form a new family branch in the library, or it can be sampled and preprocessed in the same manner as the training data and fed back into the training process along with newly scanned point clouds.
Two examples are presented to demonstrate the performance of the framework, from reconstructing 3D point clouds from photos of common design objects, to classifying their shape families, to designing with modularized 3D models. Our design and modeling framework reconciles the dilemma between the functionality and the user-friendliness of 3D modeling tools. In this way, new shape ideas or variants can be easily modeled and visualized in real time and in 3D by non-CAD users, which improves communication efficiency and hence conceptual design efficiency.
KW - Mechanical and Aerospace Engineering
KW - 3D-CNN
KW - Three-dimensional imaging
LA - English
ER -