MouseVUER: Home-Cage Monitoring Using a Deep Learning-based Open-Source System

In an article published in the journal Scientific Reports, researchers from the USA developed an open-source deep learning-based system called mouse video utility for experiment recordings (MouseVUER) for video monitoring of laboratory mice in their home cages. Their device uses a depth camera to capture the behavior of mice without the need for specialized test setups or markers.

Study: MouseVUER: Home-Cage Monitoring Using a Deep Learning-based Open-Source System. Image credit: unoL/Shutterstock

The tool also features a custom-designed cage with a food and water hopper, a mezzanine for enrichment, and a camera mounting enclosure. Moreover, it can be integrated with a ventilated rack or a custom non-ventilated rack for efficient video acquisition, compression, and streaming.

Background

Home-cage monitoring of mice is a valuable method for studying their natural behavior, including nocturnal activity, social interactions, and health status, without the disruptions caused by human intervention or unfamiliar environments. It can also reveal changes in behavior patterns over long time frames, which are useful for assessing the effects of interventions, diseases, or environmental factors.

Several commercial and lab-specific systems are available for home-cage monitoring, but they have limitations, such as high cost and proprietary algorithms, limited scalability and reproducibility, and compatibility issues with different racks and cages. Moreover, most commercial systems rely on conventional two-dimensional (2D) video, which can be affected by lighting conditions, occlusions, and background noise and may not capture the full complexity and variability of mouse behavior. Therefore, there is a need for an open-source system that can provide three-dimensional (3D) video monitoring of mice in their home cages, with easy fabrication, installation, and operation.

About the Research

MouseVUER comprised a modified next-generation (NexGen) lid, a custom food and water hopper, a camera mounting enclosure, and a mezzanine. The modified NexGen lid featured a cutout for an unobstructed camera view and a slot to secure the hopper.

The hopper included compartments for food pellets and a water bottle, both protected by metal grates. The camera mounting enclosure housed an Intel RealSense D435 depth camera capable of streaming RGB, depth, and near-infrared video simultaneously. The enclosure could be hung on a rack crossbar for easy access and included a handle for convenient carrying. The metallic mezzanine structure gave the mice an incline, a flat surface, and a tunnel for exploration.
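To illustrate how simultaneous multi-stream capture from a D435 is typically configured, the minimal Python sketch below uses Intel's pyrealsense2 bindings to enable the depth, color, and infrared streams. The resolutions, frame rate, and alignment choice shown here are illustrative assumptions and are not taken from the authors' software.

```python
# Minimal sketch: simultaneous depth, color, and infrared capture from an
# Intel RealSense D435 via pyrealsense2. Stream settings are illustrative
# assumptions, not the configuration used by MouseVUER.
import pyrealsense2 as rs

pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.depth, 848, 480, rs.format.z16, 30)       # 16-bit depth
config.enable_stream(rs.stream.color, 640, 480, rs.format.bgr8, 30)      # RGB
config.enable_stream(rs.stream.infrared, 1, 848, 480, rs.format.y8, 30)  # left IR imager

pipeline.start(config)
align = rs.align(rs.stream.color)  # align depth to the color frame

try:
    for _ in range(300):  # roughly 10 seconds at 30 fps
        frames = align.process(pipeline.wait_for_frames())
        depth = frames.get_depth_frame()
        color = frames.get_color_frame()
        ir = frames.get_infrared_frame(1)
        if depth and color and ir:
            # hand the raw frames to encoding / streaming code here
            pass
finally:
    pipeline.stop()
```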

The system operated in two modes: integrated with an Allentown NexGen ventilated rack or housed in a custom non-ventilated rack. The ventilated rack enabled continuous and automated monitoring without disrupting husbandry, while the non-ventilated option accommodated three MouseVUER units and a power and control module (PCM). The PCM supplied power and data communication and included auxiliary lines for external illumination.

The device also included various options for video acquisition, compression, and streaming. Additionally, the authors introduced custom software based on the Intel software development kit (SDK). This software enabled scheduling recordings, specifying experiment metadata, and compressing depth video by splitting each frame into two 8-bit frames. Furthermore, it facilitated simultaneous streaming of RGB, aligned depth, and infrared video from multiple computers to a central server for storage.
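The paper's description of splitting each 16-bit depth frame into two 8-bit frames, so that it can pass through standard 8-bit video encoders, can be illustrated with the NumPy sketch below. The exact bit partitioning and encoding pipeline used by the authors' software are not detailed here, so this is a generic illustration of the idea rather than their implementation.

```python
import numpy as np

def split_depth_frame(depth16: np.ndarray) -> tuple[np.ndarray, np.ndarray]:
    """Split a 16-bit depth frame into two 8-bit planes (high and low bytes)
    so each plane can be fed to a standard 8-bit video encoder."""
    high = (depth16 >> 8).astype(np.uint8)    # most significant byte
    low = (depth16 & 0xFF).astype(np.uint8)   # least significant byte
    return high, low

def merge_depth_frame(high: np.ndarray, low: np.ndarray) -> np.ndarray:
    """Reassemble the original 16-bit depth frame from the two 8-bit planes."""
    return (high.astype(np.uint16) << 8) | low.astype(np.uint16)

# Round-trip check on a synthetic frame
frame = np.random.randint(0, 2**16, size=(480, 848), dtype=np.uint16)
hi, lo = split_depth_frame(frame)
assert np.array_equal(merge_depth_frame(hi, lo), frame)
```

Note that in practice the two planes would need lossless or carefully tuned encoding, since lossy compression artifacts in either plane would corrupt the reconstructed depth values.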

The system was tested with mice of different strains, ages, coat colors, and sexes. Several software options were employed for video acquisition, including Intel's RealSense Viewer, Aivero's web-based streaming software, and the authors' custom software built on Intel's SDK. The open-source deep learning tool DeepLabCut was used for keypoint detection on the depth images, and detection accuracy was evaluated using the object keypoint similarity (OKS) metric.
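For reference, OKS scores how closely predicted keypoints match ground-truth annotations, normalized by object scale and a per-keypoint tolerance. The sketch below follows the COCO-style definition; the per-keypoint constants, object area, and exact evaluation protocol used in the study are not reported here, so the values shown are placeholders.

```python
import numpy as np

def object_keypoint_similarity(pred, gt, visible, area, kappa):
    """COCO-style OKS: pred and gt are (K, 2) keypoint arrays in pixels,
    visible is a boolean (K,) mask, area is the object's pixel area (scale
    squared), and kappa is a (K,) array of per-keypoint falloff constants."""
    d2 = np.sum((pred - gt) ** 2, axis=1)                   # squared pixel distances
    oks_per_kp = np.exp(-d2 / (2.0 * area * kappa ** 2 + 1e-12))
    return oks_per_kp[visible].mean() if visible.any() else 0.0

# Toy example with three keypoints (all values are illustrative only)
gt = np.array([[100.0, 120.0], [140.0, 150.0], [180.0, 200.0]])
pred = gt + np.array([[1.0, -2.0], [0.5, 0.5], [-3.0, 2.0]])
visible = np.array([True, True, True])
kappa = np.full(3, 0.05)
print(object_keypoint_similarity(pred, gt, visible, area=4000.0, kappa=kappa))
```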

Research Findings

The results showed that the system captured mouse behavior in the home cage with high image quality and low data volume, and it proved compatible with various software options for video acquisition and analysis. The authors reported compression ratios of 9:1 for depth video and 21:1 for RGB video using their custom software, and an average keypoint detection accuracy of 0.964 with DeepLabCut on the depth images. The system was robust to variations in mouse coat color, posture, and position.

The newly developed device holds promise for various applications in mouse behavior research, including studying the impacts of genetic, pharmacological, or environmental manipulations, monitoring mice's well-being and health status, and exploring their natural and social behaviors within their home cages. It can be used for teaching and training, as it provides an accessible and affordable way to conduct home cage monitoring experiments. Furthermore, it can be customized or expanded to accommodate different research requirements or preferences, such as integrating additional cameras, sensors, or stimuli.

Conclusion

In summary, the novel MouseVUER tool is effective and efficient for video monitoring of laboratory mice in their home cages. It is open-source, low-cost, scalable, robust, and compatible with both ventilated and non-ventilated racks. Moreover, it offers efficient video acquisition, compression, and streaming options while allowing accurate and reliable keypoint detection on depth images.

The researchers highlighted that the system could provide valuable insights into the natural behavior of mice and can be utilized for various research and animal care purposes. They acknowledged remaining limitations and challenges and suggested that the system could be further improved by adding features such as external illumination, infrared heating, sound recording, and automated behavior analysis.

