LIVE ELDERLY MONITORING SYSTEM USING DEEP LEARNING AND COMPUTER VISION
DOI: https://doi.org/10.63300/

Keywords: Elderly monitoring, real-time surveillance, fall detection, YOLOv8, deep learning, computer vision, OpenCV, Pushbullet API, non-intrusive monitoring, caregiver alerts, emergency response

Abstract
As the elderly population grows, ensuring the safety of older adults, especially those living alone, has become a critical concern. This paper presents a real-time elderly monitoring system that leverages computer vision and deep learning to detect falls and hazardous objects such as knives and guns. The system integrates YOLOv8 for fall detection and object recognition, OpenCV for real-time video frame processing, and the Pushbullet API for instant caregiver notifications. A Flask-based user interface allows caregivers to monitor live annotated video feeds remotely, offering a non-intrusive alternative to wearable sensors that require user compliance.

The proposed system achieves 95% accuracy in fall detection and 98% accuracy in hazardous object recognition, with an average alert delay of under 3 seconds. This approach enables rapid emergency response, enhances elderly safety, and reduces caregiver burden. Despite challenges such as privacy concerns and reliance on internet connectivity, the system provides an effective, scalable, and adaptable solution for homes, care facilities, and hospitals. Future enhancements include multi-camera integration, AI-based anomaly detection, and wearable device support for health monitoring to further improve elderly care and safety.
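As a rough illustration of the pipeline the abstract describes (not the authors' code), the sketch below wires YOLOv8 inference, OpenCV frame capture, and a Pushbullet push notification into one loop. The weights file name, class names, confidence threshold, cooldown, and access token are hypothetical placeholders assumed for the example; the Flask video feed is omitted.

```python
"""Minimal sketch of a YOLOv8 + OpenCV + Pushbullet monitoring loop.

Assumes a custom-trained YOLOv8 weights file ("elderly_monitor.pt") whose
class names include "fall", "knife", and "gun" -- the actual weights,
classes, and thresholds used by the authors are not given on this page.
"""

import time

import cv2                    # real-time frame capture and display
import requests               # Pushbullet REST calls
from ultralytics import YOLO  # YOLOv8 inference

PUSHBULLET_TOKEN = "YOUR_ACCESS_TOKEN"    # placeholder credential
ALERT_CLASSES = {"fall", "knife", "gun"}  # assumed class names
CONF_THRESHOLD = 0.5                      # assumed confidence cut-off
ALERT_COOLDOWN_S = 30                     # avoid flooding the caregiver


def send_pushbullet_alert(title: str, body: str) -> None:
    """Send an instant note to the caregiver via the Pushbullet v2 API."""
    requests.post(
        "https://api.pushbullet.com/v2/pushes",
        headers={"Access-Token": PUSHBULLET_TOKEN},
        json={"type": "note", "title": title, "body": body},
        timeout=5,
    )


def main() -> None:
    model = YOLO("elderly_monitor.pt")  # hypothetical custom YOLOv8 weights
    cap = cv2.VideoCapture(0)           # webcam or IP camera stream
    last_alert = 0.0

    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break

        results = model(frame, verbose=False)[0]  # YOLOv8 inference on one frame

        # Collect detected class names above the confidence threshold.
        detected = {
            model.names[int(box.cls)]
            for box in results.boxes
            if float(box.conf) >= CONF_THRESHOLD
        }

        hazards = detected & ALERT_CLASSES
        if hazards and time.time() - last_alert > ALERT_COOLDOWN_S:
            send_pushbullet_alert(
                "Elderly monitoring alert",
                f"Detected: {', '.join(sorted(hazards))}",
            )
            last_alert = time.time()

        # Annotated frame; in the full system this feed is served via Flask.
        cv2.imshow("Live monitor", results.plot())
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break

    cap.release()
    cv2.destroyAllWindows()


if __name__ == "__main__":
    main()
```

In a deployment closer to the described system, the annotated frames would be streamed through a Flask route (for example as an MJPEG response) rather than shown in a local window, and the cooldown logic would likely be tuned per alert type.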
License

This work is licensed under a Creative Commons Attribution 4.0 International (CC BY 4.0) License (https://creativecommons.org/licenses/by/4.0/), which permits use, reuse, distribution, and reproduction of the original work with proper citation.