
An Improved SSD Model for Small Size Work-pieces Recognition in Automatic Production Line

Xiaoning Bo, Zhiyuan Zhang, Yipeng Wang


To address the slow recognition speed and low recognition accuracy of machine vision systems when identifying arbitrarily placed work-pieces in traditional automated production lines, a work-piece recognition algorithm based on an improved SSD is proposed. First, an improved DarkNet53 replaces the backbone network of the original SSD framework, and network enhancement within the backbone remedies the missed detection of small targets. Second, a channel attention module and a deep semantic feature fusion module are added to improve the recognition ability and detection accuracy for small-target features. Finally, the loss function is optimized: the weight distribution of positive and negative samples is adjusted to mitigate the problem caused by sample imbalance. For the experiments, image datasets of typical bolts, nuts, and connecting plates were constructed for network training. The results show that, compared with YOLOv4 and the original SSD on the work-piece recognition task, the proposed method improves both recognition accuracy and speed and meets the requirements of automatic work-piece detection in actual production.
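The abstract's loss reweighting of positive and negative samples is commonly realized with a focal-loss-style term; the paper does not state its exact formulation, so the following is a minimal NumPy sketch under that assumption, with hypothetical balancing parameters `alpha` and `gamma`:

```python
import numpy as np

def focal_loss(p, y, alpha=0.25, gamma=2.0):
    """Focal-loss-style reweighting of positive/negative samples.

    p: predicted probabilities for the positive class, shape (N,)
    y: ground-truth labels in {0, 1}, shape (N,)
    alpha balances positives against negatives; gamma down-weights
    easy, well-classified examples so hard ones dominate the loss.
    """
    p = np.clip(p, 1e-7, 1 - 1e-7)                   # numerical safety
    pos = -alpha * (1 - p) ** gamma * np.log(p)      # term used when y == 1
    neg = -(1 - alpha) * p ** gamma * np.log(1 - p)  # term used when y == 0
    return np.where(y == 1, pos, neg)

# A hard positive (p = 0.1) incurs a much larger loss than an easy one (p = 0.9)
losses = focal_loss(np.array([0.1, 0.9]), np.array([1, 1]))
```

The `(1 - p) ** gamma` factor is what shifts weight away from the abundant easy negatives toward the rare hard samples, which is the imbalance issue the abstract describes.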


Deep learning, Automatic production line, Work-piece recognition, SSD, Feature fusion

Citation Format:
Xiaoning Bo, Zhiyuan Zhang, Yipeng Wang, "An Improved SSD Model for Small Size Work-pieces Recognition in Automatic Production Line," Journal of Internet Technology, vol. 25, no. 2, pp. 215-222, Mar. 2024.


Published by Executive Committee, Taiwan Academic Network, Ministry of Education, Taipei, Taiwan, R.O.C
JIT Editorial Office, Office of Library and Information Services, National Dong Hwa University
No. 1, Sec. 2, Da Hsueh Rd., Shoufeng, Hualien 974301, Taiwan, R.O.C.
Tel: +886-3-931-7314