Development of a computer vision module for autonomous vehicles

Authors

Z. Yessenbayev, Z. Kozhirbayev, A. Shintemirov

DOI:

https://doi.org/10.26577/JMMCS.2022.v116.i4.06

Keywords:

Computer vision, autonomous vehicle, vehicle trajectory planning, real-time trajectory planning, unmanned solution.

Abstract

The favorable geopolitical position and large transit potential of the Republic of Kazakhstan in land freight traffic between China and Europe make the transport logistics industry one of the most promising areas for the development of the country's economy. In this context, the deployment of unmanned cargo vehicles to minimize fuel costs and the use of human labor in labor-intensive and routine logistics operations, both inside warehouses and during freight transportation on public roads, appears both natural and efficient.

This paper describes the results of research work on the development of a computer vision module for an autonomous truck prototype. The completed project stages include the installation of the necessary equipment, the training of computer vision models, and the development of a mapping between the cameras and a LIDAR sensor for object classification and localization.
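
As a rough illustration of the camera-LIDAR mapping stage mentioned above, the minimal NumPy sketch below projects LIDAR points into a camera image with a pinhole model. The intrinsic matrix K, the LIDAR-to-camera rotation R, and the translation t are placeholder assumptions for illustration, not the calibration values obtained in the project.

import numpy as np

# Placeholder calibration values (assumptions, for illustration only): in practice
# the intrinsics come from checkerboard camera calibration and the extrinsics
# from a camera-LIDAR calibration procedure.
K = np.array([[700.0,   0.0, 640.0],
              [  0.0, 700.0, 360.0],
              [  0.0,   0.0,   1.0]])          # pinhole intrinsic matrix

# Rigid LIDAR-to-camera transform: axis swap (LIDAR x forward, y left, z up ->
# camera x right, y down, z forward) plus a small mounting offset in meters.
R = np.array([[ 0.0, -1.0,  0.0],
              [ 0.0,  0.0, -1.0],
              [ 1.0,  0.0,  0.0]])
t = np.array([0.0, -0.3, -1.2])

def project_lidar_to_image(points_lidar):
    """Project an Nx3 array of LIDAR points (meters) to Nx2 pixel coordinates.

    Points behind the camera are returned as NaN.
    """
    points_cam = points_lidar @ R.T + t          # into the camera frame
    pixels = np.full((points_lidar.shape[0], 2), np.nan)
    in_front = points_cam[:, 2] > 0.0            # keep positive depth only
    proj = (K @ points_cam[in_front].T).T        # homogeneous image coordinates
    pixels[in_front] = proj[:, :2] / proj[:, 2:3]
    return pixels

if __name__ == "__main__":
    cloud = np.array([[10.0,  0.5, 0.0],    # obstacle ahead of the truck
                      [ 8.0, -2.0, 0.5],    # object to the right
                      [-5.0,  1.0, 0.0]])   # behind the sensor -> NaN
    print(project_lidar_to_image(cloud))

With such a projection, each LIDAR return can be assigned to the bounding box of a detected object in the camera image, which is one common way to combine the classification output of the vision models with the range measurements needed for localization.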

How to Cite

Yessenbayev, Z., Kozhirbayev, Z., & Shintemirov, A. (2022). Development of a computer vision module for autonomous vehicles. Journal of Mathematics, Mechanics and Computer Science, 116(4). https://doi.org/10.26577/JMMCS.2022.v116.i4.06