Doklady of the National Academy of Sciences of Belarus

Object tracking algorithm by moving video camera

https://doi.org/10.29235/1561-8323-2020-64-2-144-149

Abstract

The ACT (Adaptive Color Tracker) algorithm for tracking objects with a moving video camera is presented. One of its features is the adaptation of the tracked object's feature set to the background of the current frame. At each step, the algorithm selects from the object features those that are most specific to the object and, at the same time, least specific to the current frame background, since the remaining object features not only fail to separate the tracked object from the background but also impede its correct detection. The object and background features are built from color representations of the scene and can be computed in two ways. The first is to take the 3D color vectors of the object and background images clustered by a fast version of the well-known k-means algorithm. The second, simpler and faster, way is to partition the RGB color space into 3D parallelepipeds and then replace the color of each pixel with the mean of all colors falling into the same parallelepiped as that pixel's color. Another feature of the algorithm is its simplicity, which allows it to run on small mobile computers such as the NVIDIA Jetson TX1 or TX2.
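The second feature-computation scheme described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name and the `bins_per_channel` parameter are assumptions for the example, and the paper does not specify how the RGB space is partitioned beyond "3D parallelepipeds", so uniform axis-aligned bins are used here.

```python
import numpy as np

def parallelepiped_quantize(image, bins_per_channel=8):
    """Partition RGB space into bins_per_channel**3 axis-aligned
    parallelepipeds and replace every pixel color with the mean color
    of all pixels whose colors fall into the same parallelepiped.

    `bins_per_channel` is an illustrative choice, not from the paper.
    """
    h, w, _ = image.shape
    pixels = image.reshape(-1, 3).astype(np.float64)

    # Bin index of each pixel along each color axis: 0 .. bins-1
    edge = 256.0 / bins_per_channel
    idx = np.clip((pixels // edge).astype(np.int64), 0, bins_per_channel - 1)

    # Flatten the 3D cell index into a single key per pixel
    key = (idx[:, 0] * bins_per_channel + idx[:, 1]) * bins_per_channel + idx[:, 2]

    # Accumulate per-cell sums and counts, then take the mean color
    n_cells = bins_per_channel ** 3
    counts = np.bincount(key, minlength=n_cells).astype(np.float64)
    sums = np.empty((n_cells, 3))
    for c in range(3):
        sums[:, c] = np.bincount(key, weights=pixels[:, c], minlength=n_cells)
    means = np.divide(sums, counts[:, None],
                      out=np.zeros_like(sums), where=counts[:, None] > 0)

    # Map every pixel to the mean color of its cell
    return means[key].reshape(h, w, 3).astype(np.uint8)
```

Because each cell is a convex box, the mean color of a cell always lies inside that cell, so quantization never moves a pixel's color into a different parallelepiped; the distinct mean colors then serve as the compact color features of the object or background region.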

The algorithm was tested on video sequences captured by various camcorders, as well as on the well-known TV77 data set containing 77 different annotated video sequences. The tests confirmed the efficiency of the algorithm: on the test images, its accuracy and speed surpass those of the trackers implemented in the computer vision library OpenCV 4.1.

About the Author

B. A. Zalesky
United Institute of Informatics Problems of the National Academy of Sciences of Belarus
Belarus

Zalesky Boris A. – D. Sc. (Physics and Mathematics), Head of the Laboratory.

6, Surganov Str., 220012, Minsk





This work is licensed under a Creative Commons Attribution 4.0 License.


ISSN 1561-8323 (Print)
ISSN 2524-2431 (Online)