A novel, automated, and real-time method for the analysis of non-human primate behavioral patterns using a depth image sensor
- Title
- A novel, automated, and real-time method for the analysis of non-human primate behavioral patterns using a depth image sensor
- Author(s)
- S K Han; Keonwoo Kim; Y Rim; M Han; Youngjeon Lee; Sung Hyun Park; Won Seok Choi; K J Chun; D S Lee
- Bibliographic Citation
- Applied Sciences-Basel, vol. 12, no. 1, pp. 471-471
- Publication Year
- 2022
- Abstract
- By virtue of their upright locomotion, similar to that of humans, motion analysis of non-human primates has been widely used to better understand problems in musculoskeletal biomechanics and neuroscience. Given the difficulty of applying a marker-based infrared optical tracking system to the behavioral analysis of primates, two-dimensional (2-D) video analysis has been used instead. In contrast to a conventional marker-based optical tracking system, a depth image sensor system provides 3-D information on movement without any skin markers. The specific aim of this study was to develop a novel algorithm to analyze the behavioral patterns of non-human primates in a home cage using a depth image sensor. The behavioral patterns of nine monkeys in their home cages, including sitting, standing, and pacing, were captured using a depth image sensor. These recordings were then analyzed both by observers’ manual assessment and by the newly written automated program. We confirmed that the measurement results from the observers’ manual assessments and from the automated program with depth image analysis were statistically identical.
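- The abstract describes classifying home-cage behaviors (sitting, standing, pacing) from markerless depth-sensor data. As a minimal sketch of how such a classifier might work, the following Python snippet labels each frame from two hypothetical per-frame features: the subject's vertical extent and its centroid position. The feature names and threshold values are illustrative assumptions, not taken from the paper's algorithm.

```python
import numpy as np

# Hypothetical thresholds (NOT from the paper): a frame is "standing" if the
# subject's vertical extent exceeds STAND_HEIGHT, otherwise "sitting";
# "pacing" overrides both when the centroid moves faster than PACE_SPEED.
STAND_HEIGHT = 0.60   # vertical body extent, in metres
PACE_SPEED = 0.25     # centroid displacement per frame, in metres

def classify_frames(heights, centroids):
    """Label each depth frame as 'sitting', 'standing', or 'pacing'.

    heights   -- per-frame vertical extent of the segmented subject (m)
    centroids -- per-frame (x, z) floor-plane centroid of the subject (m)
    """
    labels = []
    prev = centroids[0]
    for h, c in zip(heights, centroids):
        # Per-frame centroid speed, used to detect locomotion (pacing).
        speed = float(np.linalg.norm(np.asarray(c) - np.asarray(prev)))
        if speed > PACE_SPEED:
            labels.append("pacing")
        elif h > STAND_HEIGHT:
            labels.append("standing")
        else:
            labels.append("sitting")
        prev = c
    return labels

# Toy trajectory: low and still, then tall and still, then moving quickly.
heights = [0.4, 0.4, 0.7, 0.7, 0.7]
centroids = [(0.0, 0.0), (0.02, 0.0), (0.02, 0.0), (0.5, 0.0), (1.0, 0.0)]
print(classify_frames(heights, centroids))
# → ['sitting', 'sitting', 'standing', 'pacing', 'pacing']
```

- In practice the per-frame heights and centroids would be extracted from the depth sensor's point cloud after background subtraction, and the per-frame labels would be aggregated into time budgets for comparison against manual observer scoring.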
- Keyword
- Depth image sensor; Behavioral pattern analysis; Computer-based analysis; Non-human primate study
- ISSN
- 2076-3417
- Publisher
- MDPI
- DOI
- http://dx.doi.org/10.3390/app12010471
- Type
- Article
- Appears in Collections:
- Ochang Branch Institute > Division of National Bio-Infrastructure > National Primate Research Center > 1. Journal Articles
Items in OpenAccess@KRIBB are protected by copyright, with all rights reserved, unless otherwise indicated.