MeshMonk: Open-source large-scale intensive 3D phenotyping

Julie D. White, Alejandra Ortega-Castrillón, Harold Matthews, Arslan A. Zaidi, Omid Ekrami, Jonatan Snyders, Yi Fan, Tony Penington, Stefan Van Dongen, Mark D. Shriver, Peter Claes

Research output: Contribution to journal › Article › peer-review

22 Scopus citations


Dense surface registration, commonly used in computer science, could aid the biological sciences in accurate and comprehensive quantification of biological phenotypes. However, few toolboxes exist that are openly available, non-expert friendly, and validated in a way relevant to biologists. Here, we report a customizable toolbox for reproducible high-throughput dense phenotyping of 3D images, specifically geared towards biological use. Given a target image, a template is first oriented, repositioned, and scaled to the target during a scaled rigid registration step, then transformed further to fit the specific shape of the target using a non-rigid transformation. As validation, we use n = 41 3D facial images to demonstrate that the MeshMonk registration is accurate, with an average error of 1.26 mm across 19 landmarks between placements by manual observers and those produced by the MeshMonk toolbox. We also find no variation in landmark position or centroid size significantly attributable to the landmarking method used. Though validated using 19 landmarks, the MeshMonk toolbox produces a dense mesh of vertices across the entire surface, thus facilitating more comprehensive investigations of 3D shape variation. This expansion opens up exciting avenues of study in assessing biological shapes to better understand their phenotypic variation, genetic and developmental underpinnings, and evolutionary history.
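The first stage of the pipeline described above, fitting a template to a target by rotation, translation, and isotropic scaling, can be illustrated with a minimal least-squares similarity alignment (Umeyama-style) over corresponding points. This is an illustrative sketch in NumPy, not MeshMonk's actual API, and it assumes point correspondences are already known; MeshMonk's registration also estimates correspondences and follows with a non-rigid deformation step not shown here.

```python
import numpy as np

def similarity_align(template, target):
    """Least-squares scaled rigid (similarity) alignment of two
    corresponding (n, 3) point sets; returns the transformed template.
    Illustrative only -- not MeshMonk's implementation."""
    mu_s, mu_t = template.mean(0), target.mean(0)
    A, B = template - mu_s, target - mu_t   # centered point sets
    U, S, Vt = np.linalg.svd(B.T @ A)       # cross-covariance SVD
    d = np.sign(np.linalg.det(U @ Vt))      # guard against reflections
    D = np.diag([1.0, 1.0, d])
    R = U @ D @ Vt                          # optimal rotation
    s = (S * np.diag(D)).sum() / (A ** 2).sum()  # isotropic scale
    t = mu_t - s * (R @ mu_s)               # translation
    return s * (template @ R.T) + t
```

Given exact correspondences, this recovers the similarity transform that best superimposes the template on the target; in practice the rigid stage only coarsely initializes the template before the non-rigid stage fits the target's specific shape.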

Original language: English (US)
Article number: 6085
Journal: Scientific Reports
Issue number: 1
State: Published - Dec 1 2019

All Science Journal Classification (ASJC) codes

  • General
