Regular paper

Journal of Information and Communication Convergence Engineering 2023; 21(3): 252-260

Published online September 30, 2023

https://doi.org/10.56977/jicce.2023.21.3.252

© Korea Institute of Information and Communication Engineering

The Bullet Launcher with A Pneumatic System to Detect Objects by Unique Markers

Jasmine Aulia1, Zahrah Radila1, Zaenal Afif Azhary1, Aulia M. T. Nasution1, Detak Yan Pratama1, Katherin Indriawati1*, Iyon Titok Sugiarto2, and Wildan Panji Tresna2*

1Department of Physics Engineering, Sepuluh Nopember Institute of Technology, 60111, Indonesia
2Research Center for Photonics, National Research and Innovation Agency, South Tangerang, Banten, 15314, Indonesia

Correspondence to: Katherin Indriawati1 (E-mail: katherin@ep.its.ac.id), Wildan Panji Tresna2 (E-mail: wild004@brin.go.id)
1Department of Physics Engineering, Sepuluh Nopember Institute of Technology, 60111, Indonesia
2Research Center for Photonics, National Research and Innovation Agency, South Tangerang, Banten, 15314, Indonesia

Received: March 10, 2023; Revised: June 12, 2023; Accepted: June 15, 2023

This is an Open Access article distributed under the terms of the Creative Commons Attribution Non-Commercial License (http://creativecommons.org/licenses/by-nc/3.0/) which permits unrestricted non-commercial use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

A bullet launcher can be developed as a smart instrument, especially for military use, that can track, identify, detect, mark, lock, and shoot a target by implementing an image-processing system. In this research, an object recognition system, laser encoding as a unique marker, two-dimensional movement, and a pneumatic shooter were studied intensively. The results show that the object recognition system can detect various colors, patterns, sizes, and laser blinking. The average error of the object distance measured by the camera is ±4%, ±5%, and ±6% for the circle, square, and triangle forms, respectively. Meanwhile, the average shot accuracy is 95.24% in indoor conditions and 85.71% in outdoor conditions, and the average prototype response time is 1.11 s. Moreover, the highest shooting accuracy, 98.32%, was obtained at a distance of 50 cm.

Keywords: Unique marker laser, Object recognition system, Pneumatic system, Bullet launcher

I. INTRODUCTION

Now and in the future, compact optoelectronic devices are increasingly affordable and multifunctional [1]. One of their applications is the target marker and the gun [2,3]. Generally, cameras and computer vision (CV) can provide information that describes an object based on color, pattern, and size. One application of CV, vision marker recognition, can be developed into a unique marker recognition system. The system detects unique markers to meet the needs of target-shooting automation. The marker is a laser beam with a unique color, shape, size, and blinking pattern [4], so that only a suitably programmed computer recognizes it. Object recognition uses color, dimension, and shape contour detection methods [5-7]. The object marked by the marker is then detected and tracked as a shooting target. When the object moves, the camera follows the object's movement, assisted by a dual-axis servo motor [8].

Furthermore, the camera serves as a sensor to determine the distance between the object and the launcher [9,10]. Several basic methods exist for estimating the distance between an object and a monocular camera, such as comparing virtual images in mirrors [4]. When the camera detects an object, it determines the object's size from the area detected by the computer. This value is then compared with the object's actual size to find the distance between the camera and the object [11-13]. The estimated distance, in turn, determines the force needed to launch a bullet that hits the target precisely [14].

This research aims to build and develop a unique object recognition system using a blinking laser so that the launcher can shoot marked objects with precision and accuracy [15].

II. SYSTEM MODEL AND METHODS

A. Object Recognition

A marker is an artificial sign that identifies objects easily [16]. A camera with image-processing software is used to recognize and identify objects [6,16,17]. The camera has a resolution of 640×480 pixels, a frame rate of 30 fps, and a capturing angle of 54°. The image captured by the camera is processed by a computer using the OpenCV library, which contains image-processing routines for recognizing and identifying objects [18-20]. In this research, the feature developed with OpenCV for recognizing and identifying objects is unique marker recognition [17].

Object detection is carried out hierarchically based on predefined colors, patterns, and sizes. The coordinates of the target midpoint are then detected in the image and used as feedback for the servo motors [19-21]. Moreover, when the object moves, the camera and its propulsion system follow the object's movement, as shown in Fig. 1.

Fig. 1. Integration of object detection, unique marker, and bullet launcher mechanism.
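
As an illustration of this detection hierarchy, the following Python/OpenCV sketch filters a predefined color, classifies the contour pattern, and extracts the midpoint coordinates that would be fed back to the servo motors. It is a minimal sketch rather than the authors' code; the HSV bounds and the webcam index are assumptions.

```python
# Illustrative sketch (not the authors' code): hierarchical detection of a
# colored target by HSV masking, contour shape classification, and centroid
# extraction with OpenCV. The HSV bounds and camera index are assumptions.
import cv2
import numpy as np

LOWER_HSV = np.array([100, 120, 80])   # assumed thresholds for a blue target
UPPER_HSV = np.array([130, 255, 255])

def detect_target(frame_bgr):
    """Return (shape_name, (cx, cy), area) of the largest matching contour, or None."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, LOWER_HSV, UPPER_HSV)            # 1) color filter
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    c = max(contours, key=cv2.contourArea)                    # 2) largest blob
    approx = cv2.approxPolyDP(c, 0.03 * cv2.arcLength(c, True), True)
    shape = {3: "triangle", 4: "rectangle"}.get(len(approx), "circle")  # 3) pattern
    m = cv2.moments(c)
    if m["m00"] == 0:
        return None
    cx, cy = int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])          # 4) midpoint
    return shape, (cx, cy), cv2.contourArea(c)

cap = cv2.VideoCapture(0)     # 640x480, 30 fps webcam as described in the paper
ok, frame = cap.read()
if ok:
    print(detect_target(frame))
cap.release()
```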

The camera's biconvex lens determines the relationship between the object and its image, which satisfies the triangle-similarity principle. Under this condition, the ratio of the focal length to the image width equals the ratio of the real object distance from the camera to the object's real width [6]. Thus, as the distance between the real object and the camera increases, the size and width of the object captured by the camera become smaller. Conversely, when the real size of the object and the focal length of the camera lens are known, the real distance of the object from the camera can be calculated. The distance obtained from this calculation becomes the launcher's input for shooting the target with precision and accuracy.
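
A minimal worked sketch of this similar-triangles calculation is given below, assuming a single calibration measurement at a known distance; all numeric values are illustrative and are not taken from the paper.

```python
# Minimal sketch of the triangle-similarity range estimate described above.
# The focal length in pixels is obtained by calibrating with one object at a
# known distance; the numbers below are assumptions for illustration.
def focal_length_px(known_distance_cm, real_width_cm, width_in_image_px):
    """Calibration step: f = (w_px * d) / W."""
    return (width_in_image_px * known_distance_cm) / real_width_cm

def distance_cm(real_width_cm, focal_px, width_in_image_px):
    """Similar triangles: d = (W * f) / w_px."""
    return (real_width_cm * focal_px) / width_in_image_px

# Example: a 15 cm object seen 78 px wide at a calibration distance of 120 cm
f = focal_length_px(known_distance_cm=120, real_width_cm=15, width_in_image_px=78)
print(distance_cm(15, f, 12))   # smaller apparent width -> larger estimated distance
```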

B. Encoding Laser

The encoding laser is built from a laser driven with pulse-width modulation (PWM). The PWM signal consists of a special, time-controlled pattern directed toward the target, and the camera, acting as the receiver, observes the pulses reflected from the target. Target marking is then implemented by controlling the laser light output [20] with an Arduino Uno-based PWM dimmer [19,22]. In this case, the dimmer circuit uses the principle of voltage control to produce a PWM signal, as shown in Fig. 2.

Fig. 2. (a) The PWM of the laser blinking, consisting of t_on/t_bright, t_off, and t_dim. (b) Real-time observation by the camera and the masking view on the computer.

A unique marker is built with a laser coding system, a combination of t_on and t_off, with unique variants, as shown in Fig. 2(a). The marker is a laser beam with a unique size, color, and pattern that only the user can recognize. One of the essential purposes of this research is the application of unique markers, which are developed using a cue code in the form of a blinking laser.
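
One possible way to recognize such a blink code on the camera side is sketched below: the per-frame presence of the laser mask is accumulated into an on/off stream, and its duty cycle is compared with the expected t_on/(t_on + t_off) ratio. This is an assumed decoding scheme for illustration, not the authors' implementation; the frame rate, window length, and tolerance are hypothetical.

```python
# Hedged sketch of one way to recognize the unique blink code from camera frames.
# Each frame contributes one on/off observation; the observed duty cycle over a
# sliding window is compared with the expected code.
from collections import deque

FPS = 30
WINDOW = deque(maxlen=FPS)      # last ~1 s of on/off observations

def update_and_check(laser_visible: bool, expected_duty: float, tol: float = 0.1) -> bool:
    """Append one frame observation and report whether the observed duty cycle
    over the window matches the expected t_on / (t_on + t_off) ratio."""
    WINDOW.append(1 if laser_visible else 0)
    if len(WINDOW) < WINDOW.maxlen:
        return False                      # not enough history yet
    duty = sum(WINDOW) / len(WINDOW)
    return abs(duty - expected_duty) <= tol

# Example: a marker driven at a 40% duty cycle (simulated observations)
for frame_index in range(60):
    seen = (frame_index % 10) < 4         # simulated 40% on-time
    if update_and_check(seen, expected_duty=0.4):
        print("unique marker recognized at frame", frame_index)
        break
```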

C. Trajectory System

The proposed launcher control, based on a pneumatic system capable of shooting at a unique automatic marker, is a closed-loop control [23]. Several calculation steps are needed to adjust the air pressure in the system. When analyzing the air pressure of the launcher, data such as the object distance, bullet size, bullet mass, and target size become important [21]. The bullet trajectory is parabolic, which follows from motion with constant acceleration. When the camera detects a target, the computer performs calculations to determine the required pressure. Once the pressure is known, the force needed to eject a bullet can be determined. When the force has been determined, the valve is opened and a bullet is ejected toward the target.
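
The sketch below illustrates the kind of calculation involved: the launch speed follows from the projectile range formula, and an ideal storage pressure follows from equating the work of the air force over the barrel with the bullet's kinetic energy. The bullet mass, bore diameter, barrel length, and launch angle are assumptions and all losses are neglected, so this is not the authors' firing table.

```python
# Illustrative back-of-the-envelope calculation, not the authors' firing table:
# projectile range for launch speed v at elevation angle theta (level ground),
# then the storage pressure needed to reach v, assuming the air force p*A acts
# on the bullet over the barrel length L with losses neglected.
import math

G = 9.81                      # m/s^2

def required_speed(range_m: float, theta_deg: float) -> float:
    """Invert R = v^2 sin(2*theta) / g for the launch speed."""
    return math.sqrt(range_m * G / math.sin(math.radians(2 * theta_deg)))

def required_pressure(range_m, theta_deg, bullet_mass_kg, bore_diameter_m, barrel_len_m):
    """Energy balance p*A*L = 0.5*m*v^2  ->  p = m*v^2 / (2*A*L) (gauge, ideal)."""
    v = required_speed(range_m, theta_deg)
    area = math.pi * (bore_diameter_m / 2) ** 2
    return bullet_mass_kg * v ** 2 / (2 * area * barrel_len_m)

# Example with assumed values: 3 m target, 45 deg, 3 g bullet, 10 mm bore, 30 cm barrel
p = required_pressure(3.0, 45.0, 0.003, 0.010, 0.30)
print(f"ideal (lossless) pressure ~ {p / 1000:.1f} kPa")
```

In practice the set point would be far higher than this ideal value because friction, leakage, and valve dynamics are ignored here.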

The block diagram of the bullet launcher with a unique automatic marker is shown in Fig. 3. Here, the block diagram of the bullet launcher consists of several blocks, such as process, actuator, and feedback. The primary variable designed for this system is the air pressure inside the compressor.

Fig. 3. Block diagram of bullet launcher with a unique automatic marker.

The feedback system is designed using a pressure sensor to provide the response signal. The microcontroller processes the detected pressure as feedback, and this value is compared with the set point to affect the process output. The actuator used in the automatic unique marker launcher is a solenoid valve assembly consisting of a drain valve and a firing valve.

The pneumatic process of this bullet launcher proceeds as follows. The compressor provides air pressure when the inlet solenoid valve is open, and the regulator measures the pressure. The storage-filling process is completed according to the set point, where the set-point value is obtained by converting the object's distance from the launcher. The solenoid valve connected to the drain opens and is connected to the pressure sensor. When the measured pressure reaches the set point, the output solenoid valve connected to the plant opens. The entire process is finalized by the user's command to launch the bullet. If air remains in the storage tank after firing, the solenoid valve connected to the drain opens to release the remaining pressure.
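
A minimal sketch of this fill, fire, and drain sequence is given below. The read_pressure, set_valve, and wait_for_user interfaces are hypothetical stand-ins that illustrate the sequencing only; this is not the authors' firmware.

```python
# Minimal sketch of the fill -> fire -> drain sequence described above.
# read_pressure(), set_valve(name, state), and wait_for_user() are hypothetical
# hardware/UI hooks supplied by the caller.
import time

def fire_sequence(setpoint_kpa, read_pressure, set_valve, wait_for_user):
    set_valve("inlet", True)                   # compressor fills the storage tank
    while read_pressure() < setpoint_kpa:      # pressure-sensor feedback vs. set point
        time.sleep(0.01)
    set_valve("inlet", False)
    wait_for_user()                            # sequence is finalized by the launch command
    set_valve("fire", True)                    # firing valve releases the stored air
    time.sleep(0.1)
    set_valve("fire", False)
    if read_pressure() > 0:                    # vent any residual pressure
        set_valve("drain", True)
        time.sleep(0.5)
        set_valve("drain", False)
```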

D. Dual Axis of Bullet Launcher

The turret and gun form an integrated system for the bullet launcher. The assembly has two axes: the turret provides the horizontal movement, while the gun provides the up-and-down movement. The servo motors are driven to the target coordinates by an Arduino microcontroller. The coordinates obtained by the computer are sent serially to the Arduino and used as the input to drive the servo motors. The system signals the motors to stop when the coordinates match the input data. The output generated by this control is the angle of the motor movement [8].

The servo motors are rotated using the PWM (pulse-width modulation) method in a closed-loop system. When a servo motor rotates, the camera reads each rotation angle so that the coordinate difference between the camera and the object is known. The Arduino microcontroller processes this information and adjusts the angle according to the reference [24-26]. The reference value is the output of a potentiometer with a resolution of 10 bits (0-1023). The output generated by the servo motor control system is the angle corresponding to the position input in the software [27].
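
The computer-side half of this loop can be sketched as follows: the pixel offset of the target midpoint from the image center is converted into pan and tilt corrections and sent to the Arduino over serial. The serial port, baud rate, message format, and the use of the 54°/640 px scale factor are assumptions for illustration.

```python
# Hedged sketch of the computer-side aiming step: pixel error of the target
# midpoint -> pan/tilt angle corrections -> serial command to the Arduino.
import serial  # pyserial

FRAME_W, FRAME_H = 640, 480
DEG_PER_PX = 54.0 / FRAME_W        # assumed: horizontal field of view spread over the frame

def angle_correction(cx, cy):
    """Pixel error of the target midpoint -> (pan, tilt) corrections in degrees."""
    dx = cx - FRAME_W / 2
    dy = cy - FRAME_H / 2
    return dx * DEG_PER_PX, dy * DEG_PER_PX

ser = serial.Serial("/dev/ttyUSB0", 9600, timeout=1)     # assumed port and baud rate
pan, tilt = angle_correction(cx=400, cy=200)
ser.write(f"{pan:.1f},{tilt:.1f}\n".encode())             # Arduino parses and moves the servos
ser.close()
```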

III. RESULTS AND DISCUSSION

A. Color Detection

Color detection under indoor conditions was carried out with several color variations: red, green, and blue. The purpose of color detection is to verify that the computer vision system can recognize object colors under various conditions. Color detection is performed using the hue-saturation-value (HSV) parameters. The filtering mechanism based on object color is shown in Fig. 4.

Fig. 4. Color filter mechanism as a detector in indoor conditions.

Indoor conditions with stable lighting also give stable threshold parameters for the CV system when detecting the color of an object. In outdoor conditions, the threshold parameters must be adjusted, because outdoor lighting depends heavily on sunlight, which is affected by the weather, clouds, and time of measurement.
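
One practical way to retune the HSV thresholds on site, for example outdoors, is with OpenCV trackbars; the trackbar configuration mentioned later in this section suggests a similar approach. The window name, slider names, and initial values below are assumptions for illustration.

```python
# Illustrative HSV threshold tuning with OpenCV trackbars: adjust the sliders
# until only the target remains white in the mask window; press Esc to quit.
import cv2
import numpy as np

cv2.namedWindow("tune")
for name, init, vmax in [("H min", 100, 179), ("H max", 130, 179),
                         ("S min", 120, 255), ("S max", 255, 255),
                         ("V min", 80, 255),  ("V max", 255, 255)]:
    cv2.createTrackbar(name, "tune", init, vmax, lambda _: None)

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    lo = np.array([cv2.getTrackbarPos(n, "tune") for n in ("H min", "S min", "V min")])
    hi = np.array([cv2.getTrackbarPos(n, "tune") for n in ("H max", "S max", "V max")])
    cv2.imshow("tune", cv2.inRange(hsv, lo, hi))
    if cv2.waitKey(1) & 0xFF == 27:
        break
cap.release()
cv2.destroyAllWindows()
```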

B. Pattern and Size Detection

Pattern detection is carried out with several variations of shape, such as circles, squares, and triangles. One result of object pattern detection for a certain color is shown in Fig. 5.

Fig. 5. Object with blue color in pattern detection

Meanwhile, object size detection is performed to determine the real distance between the camera and the object. This detection is carried out with several shapes (circles, triangles, and squares) in sizes of 5, 10, and 15 cm. Table 1 shows the detection results for pattern and size.

Table 1. Detection results of pattern, size, and approximate distance

Real object side size (cm) | Detection distance Min (cm) | Detection distance Max (cm) | Pattern | Detected object size Min (pixel) | Detected object size Max (pixel)
5  | 120 | 240 | Circle    | 26x23     | 11x10
10 | 120 | 400 | Circle    | 52x49     | 12x11
15 | 120 | 680 | Circle    | 78x75     | 12x11
5  | 120 | 240 | Rectangle | 26x25     | 12x11
10 | 120 | 400 | Rectangle | 53x53     | 14x13
15 | 120 | 640 | Rectangle | 80x83     | 13x13
5  | 120 | 120 | Triangle  | 22.4x18.6 | 22.4x18.6
10 | 120 | 200 | Triangle  | 45x45     | 25x25
15 | 120 | 360 | Triangle  | 75x73.1   | 22x23


The measurement results show that the camera can detect distances well for several object patterns; for example, a circle with a diameter of 15 cm was detected at a distance of up to 680 cm. For the other objects, the distance is measured from the edge of the pattern.

Meanwhile, the camera’s ability to distinguish the size of objects in the same shape and color is illustrated in Fig. 6.

Fig. 6. Determination of the size of objects of the same shape and color.

C. Detection of Approximate Distance

The approximate distance measurement by the camera is carried out on an object with a diameter of 15 cm. Here, the real distance is 20 to 240 cm, with measurements taken every 20 cm.

The results of measuring the circle (see Fig. 7) show that the standard deviation of the slope is 0.002715 and 0.002548 for the up and down measurements, respectively. The average camera measurement error as a distance sensor is ±4% for the circle.

Fig. 7. Distance measurement by the camera on a circle

Meanwhile, Fig. 8 shows that the distance approximation of the square achieves its best result, 100% accuracy and precision, at a distance of 240 cm.

Fig. 8. Distance measurement by the camera on a square

Then, the results of measuring the square show that the standard deviation value for the slope is 0.002536 and 0.000324 for up and down measurements, respectively.

The average camera measurement error as a distance sensor is ±5% for the square. Measurements of the triangle show that the slope's standard deviation is 0.003330 and 0.004332 for the up and down measurements, respectively, and Fig. 9 shows that the average camera measurement error as a distance sensor is ±6% in this case.

Fig. 9. Distance measurement by the camera on a triangle.

D. Tracking Objects Measurement

Object tracking is carried out at various distances from 50 to 300 cm. Here, the object recognition by the camera is synchronized with the servo-motor drive system through the microcontroller, as shown in Fig. 10.

Fig. 10. Tracking objects mechanism (a) instrument position and objects, and (b) target on the computer view.

This measurement was repeated seven times to check the consistency of the results. The camera's ability to track a moving object is then reflected in the accuracy.

E. Characteristics of Encoding Laser

Laser blinking is made by controlling the PWM signal.

Fig. 11 shows the results of measuring variations in the duty cycle for the red, green, and blue diode lasers. Here, the combination of t_on and t_off represents the duty cycle.

Fig. 11. The effect of increasing the duty cycle on the output voltage

Fig. 11 shows that as the set duty-cycle percentage increases, the identified output voltage also increases.
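
This trend is consistent with the standard averaging relation of a PWM drive, reproduced here as a reminder rather than as a value measured in this work:

```latex
V_{\mathrm{avg}} = D\,V_{\mathrm{supply}}, \qquad
D = \frac{t_{\mathrm{on}}}{t_{\mathrm{on}} + t_{\mathrm{off}}}
```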

Furthermore, the results of outdoor measurements are presented in Table 2. Here, the intensity measured by the lux meter is 91.6 lux in the middle of the day and 0 lux at night. Measurements in outdoor conditions were carried out at 50 cm and 300 cm with the duty cycle varied from 20 to 60%. When the sunlight becomes more intense, it is harder for the camera to recognize the laser's color, so the laser intensity must also be increased for the camera to detect it properly. In addition, the greater the intensity of the laser light, the farther the range at which objects are recognized, and vice versa. It can therefore be concluded that the same HSV setting holds for different object intensities and distances, which indicates that the trackbar configuration of the system has been properly calibrated.

Table 2. Conditions of laser measurement as a target marker; this measurement was carried out outdoors



Measurements were made to determine the effect of ambient light on the camera in recognizing the laser as a target marker. The effect of ambient light is differentiated in indoor and outdoor measurements. Figs. 12 and 13 show the results of the effect of the duty cycle on the light intensity of the blue laser.

Fig. 12. The effect of increasing the duty cycle on intensity. These measurements were taken under indoor conditions.
Fig. 13. The effect of increasing the duty cycle on intensity. These measurements were taken under outdoor conditions.

Figs. 12 and 13 show that the measured laser intensity differs significantly between indoor and outdoor conditions. The ambient light intensity therefore affects the detection system: the darker the ambient light, the better the camera performs.

F. Measurement of Trajectory

The measurement of the accuracy and precision of the bullet launcher is divided into three categories: response time, the accuracy of the bullet launcher against the target, and the precision of the shot. The delay-time measurement is needed to determine how fast the compressor can fill the air storage, how fast the servo motors execute commands, and how fast the camera responds to the movement of the target marker. In this study, the delay time was measured using air-pressure variations from 100 to 500 kPa in steps of 50 kPa. Table 3 shows the measured air-filling delay times of the storage.

Table 3. Measurement of the time to produce air pressure and store it in the storage tank

Pressure (kPa) | Compressor charging response time (s)
100 | 0.6
150 | 0.8
200 | 1.01
250 | 2.05
300 | 2.56
350 | 2.83
400 | 3.86
450 | 4.39
500 | 5.24


Table 3 shows that the delay time increases linearly with increasing air pressure requirements. To produce a pressure of 100 kPa, the compressor takes 0.6 seconds to fill the air storage, while for a pressure of 500 kPa, the compressor takes 5.24 seconds to fill the air storage.
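
As a quick illustration of this trend, a least-squares line can be fitted to the Table 3 values; this is purely illustrative post-processing, not part of the launcher firmware.

```python
# Least-squares line fitted to the Table 3 data: charging delay vs. set pressure.
import numpy as np

pressure_kpa = np.array([100, 150, 200, 250, 300, 350, 400, 450, 500])
fill_time_s  = np.array([0.6, 0.8, 1.01, 2.05, 2.56, 2.83, 3.86, 4.39, 5.24])

slope, intercept = np.polyfit(pressure_kpa, fill_time_s, 1)
print(f"fill time ~ {slope * 1000:.1f} ms per kPa, offset {intercept:.2f} s")
# Such a fit could be used to predict the charging delay for an arbitrary set point.
```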

Furthermore, launcher accuracy measurements were carried out indoors and outdoors at several distances. This measurement is carried out after the microcontroller synchronizes the object recognition system, the servo-motor movement system, and the launcher system. The purpose of this measurement is to find out whether the synchronization process works well: the camera must recognize the target and send commands to the servo motors so that the launcher is aimed at the target's midpoint. Measurements were made from 50 to 300 cm in steps of 50 cm. The measurement results are shown in Tables 4 and 5.

Table 4. The measurement of the accuracy of the bullet launcher with several repetitions. This measurement was taken in indoor conditions

No | Object distance from system | Repeat number (1 = success, 0 = fail): 1 2 3 4 5 6 7 | Accuracy of the automatic weapon aiming system (%)
1 | 50 cm  | 1 0 1 1 1 1 1 | 85.71
2 | 100 cm | 1 1 1 1 1 1 1 | 100.00
3 | 150 cm | 1 1 1 1 1 1 1 | 100.00
4 | 200 cm | 1 1 1 1 0 1 1 | 85.71
5 | 250 cm | 1 1 1 1 1 1 1 | 100.00
6 | 300 cm | 1 1 1 1 1 1 1 | 100.00
Average accuracy test: 95.24%

Table 5. The measurement of the accuracy of the bullet launcher with several repetitions. This measurement was taken in outdoor conditions

No | Object distance from system | Repeat number (1 = success, 0 = fail): 1 2 3 4 5 6 7 | Accuracy of the automatic weapon aiming system (%)
1 | 50 cm  | 1 1 1 1 1 1 1 | 100.00
2 | 100 cm | 1 1 1 1 1 1 1 | 100.00
3 | 150 cm | 1 1 1 1 0 1 1 | 85.71
4 | 200 cm | 1 1 1 1 0 1 1 | 85.71
5 | 250 cm | 1 1 1 1 0 0 1 | 71.42
6 | 300 cm | 1 1 1 0 1 0 1 | 71.42
Average accuracy test: 85.71%


Based on measurements with seven repetitions at each distance, as shown in Tables 4 and 5, the launcher throwing bullets at the object area illuminated by the unique marker, with the servo-motor drive directing the automatic system, has an average accuracy of 95.24% indoors and 85.71% outdoors. The precision of the shots that hit the target can then be calculated. The accuracy and precision of the shot were measured on a circle with a diameter of 15 cm at a bullet-launcher angle setting of (90, 85). To obtain the accuracy and precision of the shot, the measurement is carried out with seven repetitions, and the average is taken for each target position. The measurement results are shown in Table 6.

Table 6. Calculation results of approximated error, accuracy, and precision of the shot

Distance of the bullet from the center of the target (cm), launcher angle (90, 85)
Test number | 50 cm | 100 cm | 150 cm | 200 cm
1 | 0.5 | 1.5 | 2.8 | 5
2 | 0.8 | 1.5 | 2.8 | 5.2
3 | 1   | 1.6 | 3   | 5.3
4 | 1.4 | 2   | 3.2 | 5.4
5 | 1.5 | 2.2 | 3.4 | 5.4
6 | 1.5 | 2.2 | 4   | 5.8
7 | 2   | 2.6 | 4.2 | 6.6
RMSE (cm) | 0.84 | 2.06 | 3.38 | 5.55
Accuracy  | 98.32% | 97.94% | 97.75% | 97.23%
Precision | 74.06% | 81.93% | 86.57% | 93.06%


Using the root mean square error (RMSE) method, estimates of the average measurement error, accuracy, and precision are obtained. Table 6 shows that the estimated shooting error is 0.84 cm at 50 cm, 2.06 cm at 100 cm, 3.38 cm at 150 cm, and 5.55 cm at 200 cm. From a simple calculation, the shot accuracy is 98.32% at 50 cm, 97.94% at 100 cm, 97.75% at 150 cm, and 97.23% at 200 cm, while the precision is 74.06% at 50 cm, 81.93% at 100 cm, 86.57% at 150 cm, and 93.06% at 200 cm.
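
The sketch below shows these error statistics in code form. RMSE is computed from the radial miss distances of the seven shots, and the accuracy column of Table 6 is consistent with accuracy = (1 − RMSE/target distance) × 100%, which is the assumed definition here; the precision column appears to use a separate spread-based statistic whose exact definition is not spelled out in the text.

```python
# RMSE of the radial miss distances and the assumed accuracy definition
# (1 - RMSE / target distance) * 100%, illustrated with the 200 cm column of Table 6.
import math

def rmse(miss_distances_cm):
    return math.sqrt(sum(d * d for d in miss_distances_cm) / len(miss_distances_cm))

def accuracy_percent(rmse_cm, target_distance_cm):
    return (1.0 - rmse_cm / target_distance_cm) * 100.0

shots_200 = [5.0, 5.2, 5.3, 5.4, 5.4, 5.8, 6.6]
e = rmse(shots_200)
print(f"RMSE = {e:.2f} cm, accuracy = {accuracy_percent(e, 200):.2f}%")
```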

From the measurements of shot accuracy and precision at different object distances, the highest accuracy, 98.32%, was obtained at 50 cm, while the highest precision, 93.06%, was obtained at 200 cm. From these results, it can be concluded that the closer the target, the higher the accuracy but the lower the precision, because the bullet impact points spread around the target's midpoint, and vice versa.

G. Target Position and Response Time Tracking Measurement

This measurement is carried out to observe the tracking response time to the target and the position of the system as it moves from right to left, from top to bottom, and vice versa. This movement is performed when aiming the gun. Here, the bullet launcher responds when the target is within range of the camera. The results of this measurement are shown in Table 7.

Table 7. Measurement of target position and response time tracking to the target

Position | Servo degree (horizontal) | Servo degree (vertical) | Response time (s) | Accuracy of the system aiming the weapon when the laser is on
Right (axis x) | 30 | 50 | 1.1 | Accurate
Left (axis x) | 15 | 50 | 0.65 | Accurate
Up (axis y) | 90 | 60 | 0.87 | Accurate
Down (axis y) | 90 | 45 | 0.43 | Accurate
Right (axis x) + Up (axis y) | 115 | 60 | 1.57 | Accurate
Right (axis x) + Down (axis y) | 115 | 45 | 1.36 | Accurate
Left (axis x) + Up (axis y) | 75 | 60 | 1.56 | Accurate
Left (axis x) + Down (axis y) | 75 | 45 | 1.34 | Accurate
Average response time tracking target: 1.11 s (Accurate)


These measurements show that the bullet launcher system can shoot accurately based on the target position, with an average response time of 1.11 s. The bullet launcher system also directs the gun as soon as the target marker lights up at the set position.

IV. CONCLUSIONS

The automatic unique-marker shooter system is designed to track, detect, lock, mark, and shoot objects that have been marked with a unique marker. The system consists of object recognition, a drive, and a bullet launcher. The automatic unique-marker shooter is designed with a closed-loop control system, pneumatics is used as the power source for launching the bullet, and the feedback signal of the control system comes from a pressure sensor. The pressure sensor used in the system has a measurement error of ±3%. The system is able to shoot precisely, with an accuracy of 98.32% at a target distance of 50 cm and a precision of 93.06% at 200 cm.

ACKNOWLEDGMENTS

The authors are grateful to the Institut Teknologi Sepuluh Nopember for its financial support under the publication writing and IPR incentive program (PPHKI) project scheme.
REFERENCES

1. B. G. Disha and G. Indumathi, "Automated test system for laser designator and laser range finder," International Journal of Engineering Research & Technology (IJERT), vol. 4, no. 5, pp. 126-130, May 2015. DOI: 10.17577/IJERTV4IS050204.
2. H. Kaushal and G. Kaddoum, "Applications of lasers for tactical military operations," IEEE Access, vol. 5, pp. 20736-20753, 2017. DOI: 10.1109/ACCESS.2017.2755678.
3. M. L. W. B. Anderberg, "Laser Weapons: The Dawn of a New Military Age," New York, USA: Springer, 2013.
4. N. Isoyama, T. Terada, J. Akita, and M. Tsukamoto, "A method to control LED blinking for position detection of devices on conductive clothes," in 9th International Conference on Advances in Mobile Computing and Multimedia, Ho Chi Minh City, Vietnam, pp. 123-130, 2011. DOI: 10.1145/2095697.2095721.
5. H. M. Zangana, "A new algorithm for shape detection," IOSR Journal of Computer Engineering (IOSR-JCE), vol. 19, no. 3, pp. 71-76, May 2017. DOI: 10.9790/0661-1903017176.
6. J. Liu and R. Zhang, "Vehicle detection and ranging using two different focal length cameras," Journal of Sensors, vol. 2020, Article ID 4372847, pp. 1-14, Mar. 2020. DOI: 10.1155/2020/4372847.
7. Y. Fang, H. Zhao, H. Zha, X. Zhao, and W. Yao, "Camera and LiDAR fusion for on-road vehicle tracking with reinforcement learning," in 2019 IEEE Intelligent Vehicles Symposium (IV), Paris, France, pp. 1723-1730, 2019. DOI: 10.1109/IVS.2019.8814028.
8. H. M. Saputra, T. A. Prambudi, and D. G. Subagjo, "Rancang bangun umpan balik eksternal untuk kendali sudut motor servo berbasis Arduino," Jurnal Teknologi Bahan dan Barang Teknik, vol. 6, no. 2, pp. 43-48, Dec. 2016. DOI: 10.37209/jtbbt.v6i2.68.
9. L.-W. Tsai, J.-W. Hsieh, and K.-C. Fan, "Vehicle detection using normalized color and edge map," IEEE Transactions on Image Processing, vol. 16, no. 3, pp. 850-864, Mar. 2007. DOI: 10.1109/TIP.2007.891147.
10. L.-C. Liu, C.-Y. Fang, and S.-W. Chen, "A novel distance estimation method leading a forward collision avoidance assist system for vehicles on highways," IEEE Transactions on Intelligent Transportation Systems, vol. 18, no. 4, pp. 937-949, Apr. 2017. DOI: 10.1109/TITS.2016.2597299.
11. J. Han, O. Heo, M. Park, S. Kee, and M. Sunwoo, "Vehicle distance estimation using a mono-camera for FCW/AEB systems," International Journal of Automotive Technology, vol. 17, no. 3, pp. 483-491, Jun. 2016. DOI: 10.1007/s12239-016-0050.
12. A. C. Sankaranarayanan, A. Veeraraghavan, and R. Chellappa, "Object detection, tracking and recognition for multiple smart cameras," Proceedings of the IEEE, vol. 96, no. 10, pp. 1606-1624, Oct. 2008. DOI: 10.1109/JPROC.2008.928758.
13. V. V. Priya, V. Priya, K., and S. Roy, "Camera based FOD detection using image processing technique," International Journal of Engineering Science Invention (IJESI), vol. 7, no. 6, pp. 30-33, Jun. 2018.
14. C. J. Jermak, M. Rucki, and M. Jakubowicz, "Accuracy of the pneumatic follower for the wooden surface quality assessment," European Journal of Wood and Wood Products, vol. 78, pp. 1149-1159, Aug. 2020. DOI: 10.1007/s00107-020-01593-y.
15. Z. Liu and W. Chen, "Research and analysis on firing accuracy of naval gun," Journal of Physics: Conference Series, vol. 1948, no. 1, 012081, pp. 1-7, Jun. 2021. DOI: 10.1088/1742-6596/1948/1/012081.
16. W. P. Tresna, U. A. Ahmad, Isnaeni, R. R. Septiawan, I. T. Sugiarto, and A. L. Suherman, "Encoding LED for unique markers on object recognition system," International Journal of Advanced Computer Science and Applications (IJACSA), vol. 12, no. 12, pp. 678-683, Dec. 2021. DOI: 10.14569/IJACSA.2021.0121284.
17. R. C. Gonzalez and R. E. Woods, "Digital Image Processing," 2nd ed., Prentice Hall, New Jersey, 2002.
18. K. Adi, S. Pujianto, R. Gernowo, A. Pamungkas, and A. B. Putranto, "Automatic vehicle counting using background subtraction method on gray scale images and morphology operation," Journal of Physics: Conference Series, vol. 9, no. 23, pp. 13917-13924, May 2018. DOI: 10.1088/1742-6596/1025/1/012025.
19. J. Biswas and M. Veloso, "Depth camera based indoor mobile robot localization and navigation," in 2012 IEEE International Conference on Robotics and Automation, Saint Paul, USA, pp. 1697-1702, 2012. DOI: 10.1109/ICRA.2012.6224766.
20. C.-H. Hsieh and J.-H. Chou, "Analysis and optimal control of pulse width modulation feedback systems," Proceedings of the Institution of Mechanical Engineers, Part I: Journal of Systems and Control Engineering, vol. 218, no. 4, pp. 277-286, Jun. 2004. DOI: 10.1177/095965180421800403.
21. Y. Zhao, Y. Le, Q. Wang, F. Meng, Y. Lin, S. Wang, B. Lin, S. Cao, J. Cao, Z. Fang, and E. Zang, "100-Hz linewidth diode laser with external optical feedback," IEEE Photonics Technology Letters, vol. 24, no. 20, pp. 1795-1798, Oct. 2012. DOI: 10.1109/LPT.2012.2214029.
22. C. L. Wan, K. W. Dickinson, and T. D. Binnie, "A cost-effective image sensor system for transport applications utilizing a miniature CMOS single chip camera," IFAC Transportation Systems, Tianjin, Proceedings, vol. 27, no. 12, pp. 151-156, Aug. 1994. DOI: 10.1016/S1474-6670(17)47460-X.
23. H. Adinanta, H. Kato, A. W. S. Putra, and T. Maruyama, "Enhancement of beam tracking response using color filtering method for optical wireless power transmission," AIP Conference Proceedings, vol. 2256, no. 1, 020009, 2020. DOI: 10.1063/5.0022440.
24. Z. Abidin, A. G. R. Mahendra, D. F. Mahendra, M. R. N. Imami, and W. M. H. Mehanny, "Performance analysis of LED driver for transmitter of visible light communication using pulse width modulation," in 10th Electrical Power, Electronics, Communications, Controls and Informatics Seminar (EECCIS), Malang, Indonesia, pp. 178-182, 2020. DOI: 10.1109/EECCIS49483.2020.9263424.
25. L. Zhou, J. Li, J. Sheng, J. Cao, and Z. Li, "Closed-loop identification for motion control system," in 8th World Congress on Intelligent Control and Automation (WCICA), Jinan, China, pp. 477-480, 2010. DOI: 10.1109/WCICA.2010.5553795.
26. A. Sarsenov, A. Yessenbayeva, A. Shintemirov, and A. Yazici, "Detection of objects and trajectories in real-time using deep learning by a controlled robot," in International Conference on Robotics, Computer Vision and Intelligent Systems (ROBOVIS 2020), Budapest, Hungary, pp. 131-140, 2020. DOI: 10.5220/0010215201310140.
27. B. Murtianta, "Pengaruh duty cycle dan frekuensi terhadap kecepatan putar motor DC," Techné Jurnal Ilmiah Elektroteknika, vol. 17, no. 1, pp. 13-26, Apr. 2018. DOI: 10.31358/techne.v17i01.167.

Jasmine Aulia

She was born on May 19th, 1999, in Central Jakarta, DKI Jakarta, Indonesia. She received an associate degree in electrical engineering from Diponegoro University, Semarang, Indonesia, in 2020, and a bachelor's degree in physics engineering from Institut Teknologi Sepuluh Nopember, Surabaya, Indonesia, in 2022. Her current research interests include sensors, renewable energy, and control systems.


Zahrah Radila

She was born on July 6th, 1999, in Central Jakarta, DKI Jakarta, Indonesia. She received an associate degree in electrical engineering from Jakarta State Polytechnic (PNJ), Depok, Indonesia, in 2020, and a bachelor's degree in physics engineering from Institut Teknologi Sepuluh Nopember, Surabaya, Indonesia, in 2023. Her current research interests include sensors, renewable energy, and control systems.


Zaenal Afif Azhary

He was born on September 6th, 1998, in East Jakarta, DKI Jakarta, Indonesia. He received an associate degree in electrical engineering from Politeknik Negeri Jakarta, Depok, Indonesia, in 2019, and a bachelor's degree in physics engineering from Institut Teknologi Sepuluh Nopember, Surabaya, Indonesia, in 2022. His current research interests include sensors, renewable energy, and control systems.


Aulia Nasution

Dr.rer.nat. Aulia Nasution was born in Surabaya on November 17th, 1967. He received his Engineer (Ir.) degree in Engineering Physics in 1993 from ITB Bandung and his M.Sc. in Medical Physics from the University of Science Malaysia (USM), Penang, Malaysia. He received his Dr.rer.nat. in Experimental Physics (Optical Diagnostics) in 2006 from the Georg August University of Göttingen, Germany. He currently serves as Assistant Professor at the Department of Engineering Physics, Institut Teknologi Sepuluh Nopember (ITS). His research interests are in Biomedical Photonics and Optical Engineering.


Detak Yan Pratama

He was born on January 1st, 1984, in Surabaya, Jawa Timur, Indonesia. He received a bachelor's degree in engineering physics from Institut Teknologi Sepuluh Nopember, Surabaya, Indonesia, in 2007. In 2008, he continued his master's degree at Universiti Kebangsaan Malaysia. He currently works as a lecturer in the Physics Engineering Department, Institut Teknologi Sepuluh Nopember, Indonesia. His current research interests include photonic materials utilizing natural materials and optical fiber applications.


Katherin Indriawati

She was born on May 23rd, 1976, in East Java, Indonesia. She received a B.Eng. degree in Engineering Physics from Institut Teknologi Sepuluh Nopember, Surabaya, Indonesia, in 1998, an M.Eng. degree in Instrumentation & Control from Institut Teknologi Bandung, Bandung, Indonesia, in 2005, and a Ph.D. degree from Institut Teknologi Sepuluh Nopember, Surabaya, Indonesia, in 2017. She currently works as a lecturer in the Physics Engineering Department, Institut Teknologi Sepuluh Nopember, Indonesia. Her current research interests include supervisory control, fault-tolerant control, fault detection, diagnosis, and decision-making schemes.


Iyon Titok Sugiarto

He received a BSc in Physics from Gadjah Mada University, Yogyakarta, Indonesia, in 2006, a master's degree in electrical engineering (Telecommunications Engineering option) from the School of Electrical Engineering and Informatics, Institut Teknologi Bandung, Indonesia, in 2013, and a PhD degree in Mechanical Science and Engineering, Natural Science and Technology, from Kanazawa University, Japan, in 2020. He currently works as a researcher at the National Research and Innovation Agency in Indonesia. His current research interests are optics, laser spectroscopy, and applied physics.


Wildan Panji Tresna

He was born on August 31st, 1985, in Brebes, Central Java, Indonesia. He received a BSc in Physics from Diponegoro University, Semarang, Indonesia, in 2007, a master's degree in electrical engineering (micro device option) from the University of Indonesia in 2014, and a PhD degree in Electrical Engineering and Computer Science from Kanazawa University, Japan, in 2020. He currently works as a researcher at the National Research and Innovation Agency in Indonesia. His current research interests include laser applications, optical waveguides, and sensors.


Article

Regular paper

Journal of information and communication convergence engineering 2023; 21(3): 252-260

Published online September 30, 2023 https://doi.org/10.56977/jicce.2023.21.3.252

Copyright © Korea Institute of Information and Communication Engineering.

The Bullet Launcher with A Pneumatic System to Detect Objects by Unique Markers

Jasmine Aulia1, Zahrah Radila1, Zaenal Afif Azhary1, Aulia M. T. Nasution 1, Detak Yan Pratama 1, Katherin Indriawati 1*, Iyon Titok Sugiarto 2, and Wildan Panji Tresna2*

1Department of Physics Engineering, Sepuluh Nopember Institute of Technology, 60111, Indonesia
2Research Center for Photonics, National Research and Innovation Agency, South Tangerang, Banten, 15314, Indonesia

Correspondence to:Katherin Indriawati1 (E-mail: katherin@ep.its.ac.id), Wildan Panji Tresna2 (E-mail: wild004@brin.go.id)
1Department of Physics Engineering, Sepuluh Nopember Institute of Technology, 60111, Indonesia
2Research Center for Photonics, National Research and Innovation Agency, South Tangerang, Banten, 15314, Indonesia

Received: March 10, 2023; Revised: June 12, 2023; Accepted: June 15, 2023

This is an Open Access article distributed under the terms of the Creative Commons Attribution Non-Commercial License (http://creativecommons.org/licenses/by-nc/3.0/) which permits unrestricted non-commercial use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

A bullet launcher can be developed as a smart instrument, especially for use in the military section, that can track, identify, detect, mark, lock, and shoot a target by implementing an image-processing system. In this research, the application of object recognition system, laser encoding as a unique marker, 2-dimensional movement, and pneumatic as a shooter has been studied intensively. The results showed that object recognition system could detect various colors, patterns, sizes, and laser blinking. Measuring the average error value of the object distance by using the camera is ±4, ±5, and ±6% for circle, square and triangle form respectively. Meanwhile, the average accuracy of shots on objects is 95.24% and 85.71% in indoor and outdoor conditions respectively. Here, the average prototype response time is 1.11 s. Moreover, the highest accuracy rate of shooting results at 50 cm was obtained 98.32%.

Keywords: Unique marker laser, Object recognition system, Pneumatic system, Bullet launcher

I. INTRODUCTION

Now and in the future, there are many compact optoelectronics devices with affordable and multifunctional [1]. One of the applications is the target marker and the gun [2,3]. Generally, cameras and Computer Vision (CV) can provide information that depicts an object based on color, pattern, and size. One of the applications of CV in vision marker recognition can be developed into a unique marker recognition system. The system detects unique markers to meet the needs of target shooting automation. The marker is a laser beam with a unique color, shape, size, and blinking [4] so that a programmed computer only recognizes it. Object recognition uses color, dimensions, and shape contour detection methods [5-7]. Meanwhile, the object marked with markers will be detected and tracked as a shot marker. When the object moves, the camera will follow the object movement assisted by a dual-axis servo motor [8].

Furthermore, the camera will become a sensor to determine the distance between the object and the launcher [9,10]. In general, several basic methods exist for estimating the distance between an object and the monocular camera, such as comparing virtual images in the mirrors [4]. When the camera detects the presence of an object, it will determine the size of the object based on the area detected by the computer. Then the data is compared with the object's actual size to find the accurate distance between the camera and the object [11-13]. Moreover, the estimated distance data determines the force to launch a bullet to hit the target precisely [14].

This research aims to build and develop a unique object recognition system using a blinking laser so that the launcher can shoot on objects that have been marked system with precision and accuracy [15].

II. SYSTEM MODEL AND METHODS

A. Object Recognition

A marker is an artificial sign that identifies objects easily [16]. Then, the camera with image-processing software functioned to recognize and identify objects [6,16,17]. The camera has a resolution of 640×480 pixels, a frame rate of 30 fps, and a capturing angle of 54°. The image captured by the camera is processed by a computer using the OpenCV library, which contains an image-processing program for recognizing and identifying objects [18-20]. In this research, the features developed by OpenCV for recognizing and identifying objects are the Unique Marker Recognizing [17].

The object is carried out hierarchically based on predefined colors, patterns, and sizes. Then, the coordinates of the target midpoint are detected in the image, and it will be used as feedback on the servo motor [19-21]. Moreover, when the object moves, the camera and its propulsion system will follow the object’s movement, as shown in Fig. 1.

Figure 1. Integration of object detection, unique marker, and bullet launcher mechanism.

The camera with a biconvex lens influences the relationship between the object and its shadow, satisfying the triangle similarity principle. Based on this condition, an estimated calculation regarding the focal length ratio divided by the shadow's width equals the real object distance to the camera divided by the lens width (images) [6]. Here, by increasing the distance between the real object and the camera, the size, and width of the object captured by the camera gets smaller. Moreover, when the size of the object and the focal length of the camera lens are known, the real distance of the object to the camera can be calculated. The distance based on these calculations becomes the input launcher to shoot the target with precision and accuracy.

B. Encoding Laser

Encoding laser is built by laser with the variant of pulsed wave modulation (PWM). The PWM signal consists of a special pattern controlled by time toward a target and measuring the time it takes for pulses to be reflected from the target to the receiver in the form of a camera. Then, such target marking was implemented with a laser lighting quantity [20], the control system against the amount of electricity (dimmer) by using the Arduino Uno-based pulse width modulation (PWM) method [19,22]. In this case, the dimmer circuit uses the principle of voltage control to produce a PWM signal, as shown in Fig. 2.

Figure 2. (a) The PWM of laser blinking consists of the ton/tbright, toff, and tdim. (b) Observation in real time by the camera, and masking side in computer.

A unique marker is built with a laser coding system, a combination of ton and toff, with unique variants, as shown in Fig. 2(a). The marker is a laser beam with a unique size, color, and pattern that the user can only recognize. One of the essential purposes of this research is the application of unique markers, which will be developed using a cue code in the form of a blinking laser.

C. Trajectory System

The launcher controls based on a pneumatic system capable of shooting a unique automatic marker proposed is a close loop control [23]. Furthermore, several calculation steps are needed to adjust the air pressure in the system. When analyzing the air pressure on the launcher, data such as object distance, bullet size, bullet mass, and target size become important [21]. The bullet trajectory analysis follows the parabolic form, and this concept obeys the straight-line motion with constant acceleration. When the camera detects a target, the computer will perform calculations to determine the required pressure. Once the amount of pressure is known, the force needed to eject a bullet can be determined. When the force has been determined, the valve will be controlled to open, and a bullet ejects toward the target.

The block diagram of the bullet launcher with a unique automatic marker is shown in Fig. 3. Here, the block diagram of the bullet launcher consists of several blocks, such as process, actuator, and feedback. The primary variable designed for this system is the air pressure inside the compressor.

Figure 3. Block diagram of bullet launcher with a unique automatic marker.

The feedback system is designed using a pressure sensor as a response. The microcontroller will process the amount of pressure detected as feedback, and this value will be compared with the set point and affect the process output. Here, an actuator used in an automatic unique marker launcher is a solenoid valve consisting of a drain valve and a firing valve.

The process of the pneumatic system on this bullet launcher is that the compressor provides the air pressure when the solenoid valve is open. The regulator measured the pressure. Then the storage filling process is completed according to the set point. Here, the set point value is the result of converting the object’s distance to the launcher. Another process is the solenoid valve connected to the drain will open and connect to the pressure sensor. When the measured pressure follows the set point, the output solenoid valve connected to the plant will open. This entire process is finalized at the user's command to launch the bullets. If there is still air in the storage tank after processing, the solenoid valve connected to the drain will open to remove the remaining pressure.

D. Dual Axis of Bullet Launcher

The turret and gun system are integrated systems for the bullet launcher system. The turret and gun are 2-axis; a turret is a horizontal movement while the gun is the up and down movement. Here, the servo motor movement is controlled by an Arduino microcontroller at the coordinates. The coordinates obtained by the computer are sent serially to Arduino, and it is used as input to drive the servo motor. Moreover, the system will signal to stop the motor when the coordinates match the input data. The output generated from this control is the angle of the motor movement [8].

The rotation of the servo motor uses the PWM (pulse width modulation) method with a closed-loop system. When the servo motor rotates, the camera reads each rotation angle so that the difference in coordinates between the camera and the object will be known. Here, the Arduino microcontroller processes the information and adjusts the angle according to the reference [24-26]. The reference value is the output from the potentiometer with a length of 10 bits (0-1023). The output generated from the servo motor control system is the angle corresponding to the position input in the software [27].

III. RESULT AND DISCUSSION

A. Color Detections

Color detection of indoor conditions is carried out with several color variations like red, green, and blue. Here, the purpose of color detection is to discover that Computer Vision can recognize object colors in various conditions. Color detection is carried out by taking the Hue Saturation Value (HSV) parameter. Moreover, the filtering mechanism based on object color is shown in Fig. 4.

Figure 4. Color filter mechanism as a detector in indoor conditions.

Indoor conditions with stable lighting give the stable threshold parameter as well on the CV when detecting the color of an object. Meanwhile, in outdoor conditions, the threshold parameters must be adjusted. Due to outdoor lighting conditions depending heavily on sunlight, this light is affected by weather, clouds, and measurement time.

B. Pattern and Size Detection

Pattern detection is carried out with several variations of shapes, such as circles, squares, and triangles. One of the results of object pattern detection at a certain color is shown in Fig. 5.

Figure 5. Object with blue color in pattern detection

Meanwhile, object size detection is performed to determine the real distance between the camera and the object. This detection is carried out with several variations in sizes and shapes, such as circles, triangles, and squares, with dimensions of 5, 10, and 15 cm, respectively. Table 1 shows the results of the detection of pattern and size.

Table 1 . Detection result of pattern, size, and approximate distance.

Real Object Side Size (cm)Detection DistancePatternDetected Object Size
Min (cm)Max (cm)Min (pixel)Max (pixel)
5120240Circle26x2311x10
1012040052x4912x11
1512068078x7512x11
5120240Rectangle26x2512x11
1012040053x5314x13
1512064080x8313x13
5120120Triangle22.4x18.622.4x18.6
1012020045x4525x25
1512036075x73.122x23


The measurement results show that the camera can detect distances well in several object patterns, such as a circle with a diameter of 15 cm detected at a distance of 680 cm. Meanwhile, for other objects, it is measured by the edge of the pattern.

Meanwhile, the camera’s ability to distinguish the size of objects in the same shape and color is illustrated in Fig. 6.

Figure 6. Determination of the size of the object at the same shape and color

C. Detection of Approximate Distance

The approximate distance measurement by the camera is carried out on an object with a diameter of 15 cm. here, the real distance is 20 to 240 cm, with measures every 20 cm.

The results of measuring the circle (see Fig.7) show that the standard deviation value for the slope is 0.002715 and 0.002548 for up and down measurements, respectively.

Figure 7. Distance measurement by the camera on a circle

Meanwhile, Fig. 8 shows that the results of the distance approximation of the square have the highest value for 100% accuracy and precision obtained at the distance of 240 cm.

Figure 8. Distance measurement by the camera on a square

Then, the results of measuring the square show that the standard deviation value for the slope is 0.002536 and 0.000324 for up and down measurements, respectively.

Then the calculation of the average value of the camera measurement error as a distance sensor is ±5% for square. Moreover, measuring the triangle results show that the slope’s standard deviation value is 0.003330 and 0.004332 for up and down measurements, respectively. And Fig. 9 shows the calculation of the average value of the camera measurement error as a distance sensor is ±6%.

Figure 9. Distance measurement by the camera on a triangle.

D. Tracking Objects Measurement

Object tracking is carried out with various distances from 50 to 300 cm. Here, synchronization between object recognition by the camera and the drive system on the servo motor by the microcontroller, as shown in Fig. 10.

Figure 10. Tracking objects mechanism (a) instrument position and objects, and (b) target on the computer view.

This measurement was carried out with seven times repetitions to see the consistency of the results. Then, the ability of the camera to track a moving object is shown in the accuracy.

E. Characteristics of Encoding Laser

Laser blinking is made by controlling the PWM signal.

Fig. 11 shows the results of measuring variations in the duty cycle on blue, red, green, and blue diode lasers. Here, the combination of ton and toff represents the duty cycle.

Figure 11. The effect of increasing the duty cycle on the output voltage

Fig. 11 shows that the increasing the percentage of duty cycle that is set, the voltage that is identified also increases.

Furthermore, the results of outdoor measurements are presented in Table 2. Here, the intensity measured by the lux meter is 91.6 lux in the middle of the day and 0 lux at night. Measurements in outdoor conditions were carried out at 50 cm and 300 cm with variations in duty cycle starting from 20 to 60%. When the intensity of sunlight gets brighter, it makes it difficult for the camera to recognize the laser’s color, so the laser’s intensity also increases, and the camera can detect it properly. In addition, the greater the intensity of the laser light, the farther the range in recognizing objects, and vice versa. So, it can be concluded that the HSV value set is the same for the intensity and distance of different objects. This indicates that the trackbar configuration on the system has been properly calibrated.

Table 2 . Condition of laser measurement as a target marker, this measurement is carried out outdoors.



Measurements were made to determine the effect of ambient light on the camera in recognizing the laser as a target marker. The effect of ambient light is differentiated in indoor and outdoor measurements. Figs. 12 and 13 show the results of the effect of the duty cycle on the light intensity of the blue laser.

Figure 12. The effect of increasing duty cycle on intensity. These measurements were taken condition indoors.
Figure 13. The effect of increasing duty cycle on intensity. These measurements were taken condition outdoors.

From Figs. 12 and 13 regarding the measurement of laser intensity in indoor and outdoor conditions, there are significant differences in results between indoor and outdoor conditions.

Moreover, the ambient light intensity affects the detection system. Moreover, the darker ambient light conditions will make the camera work better.

F. Measurement of Trajectory

Measurement of accuracy and precision of the bullet launcher is divided into three categories, such as response time, the accuracy of the bullet launcher against the target, and measurement of shot precision. The delay time measurement is needed to determine how fast the compressor response can fill the air storage, the servo motor response to executing the commands, and the camera response to the movement of the target marker. In this study, measurement of the delay time was carried out using variations in air pressure from 100 to 500 kPa with a stepping of 50 kPa. Table 3 shows the measurement data for air-filling delay time in storage.

Table 3 . Measurement of the time to produce air pressure and store in the storage.

Pressure (kPa)Compressor Charging Respond Time (s)
1000,6
1500,8
2001,01
2502,05
3002,56
3502,83
4003,86
4504,39
5005,24


Table 3 shows that the delay time increases linearly with increasing air pressure requirements. To produce a pressure of 100 kPa, the compressor takes 0.6 seconds to fill the air storage, while for a pressure of 500 kPa, the compressor takes 5.24 seconds to fill the air storage.

Furthermore, launcher accuracy measurements were carried out indoors and outdoors with several variations of distance. This measurement is carried out after the microcontroller synchronizes the object recognition system, servo motor movement system, and launcher system. The purpose of this measurement is to find out if the synchronization process is working well. The camera can recognize and send commands to the servo motor so that the prediction of the launcher can be at the target’s midpoint. Measurements were made at 50 cm to 300 cm with a measurement range of 50 cm. The measurement results are shown in Tables 4 and 5.

Table 4 . The measurement of the accuracy of the bullet launcher with several repetitions. This measurement was taken in indoor conditions.

NoObject Distance from SystemRepeat Number (1=success, 0=fail)The accuracy of the automatic weapon aiming system (%)
1234567
150 cm101111185.71%
2100 cm1111111100.00%
3150 cm1111111100.00%
4200 cm111101185.71%
5250 cm1111111100.00%
6300 cm1111111100.00%
Average Accuracy Test95.24%

Table 5 . The measurement of the accuracy of the bullet launcher with several repetitions. This measurement was taken in an outdoor condition.

NoObject Distance from SystemRepeat Number (1=success, 0=fail)The accuracy of the automatic weapon aiming system (%)
1234567
150 cm1111111100,00%
2100 cm1111111100.00%
3150 cm111101185,71%
4200 cm111101185,71%
5250 cm111100171,42%
6300 cm111010171,42%
Average Accuracy Test85,71%


Based on measurements with seven repetitions for distance variations, as shown in Tables 3 and 4, the accuracy of a launcher throwing bullets at the object area illuminated by a unique marker with the help of a drive servo motor in directing an automatic system has an average accuracy of 95.24 and 85.71% for indoor and outdoor conditions respectively. Then the precision value that hits the target can be calculated. The accuracy and precision of the shot were measured on a circle with a diameter of 15 cm at an angle of 90.85 of the bullet launchers. This measurement, to obtain the results of the accuracy and precision of the shot, is carried out with seven repetitions. Then the average data is taken for each target position. The measurement results are shown in Table 6.

Table 6 . Calculation results of approximated error, accuracy, and precision of the shot.

Target (90,85)Object Distance
Distance of the Bullet Against the Center of the Target (cm)Test Number50 cm100 cm150 cm200 cm
10.51.52.85
20.81.52.85.2
311.635.3
41.423.25.4
51.52.23.45.4
61.52.245.8
722.64.26.6
RMSE (cm)0.842.063.385.55
Accuracy98.32%97.94%97.75%97.23%
Precision74.06%81.93%86.57%93.06%


Using the Root Mean Square Error (RMSE) method, an estimate of the average measurement error, accuracy, and precision is obtained. Table 6 shows that the estimated error shooting data at 50 cm is 0.84 cm. At 100 cm, it is 2.06 cm. At 150 cm, it is 3.38 cm, and at 200 cm, it is 5.55 cm. With a simple calculation, it can be obtained that the accuracy of the shot at 50 cm is 98.32%, at 100 cm is 97.94%, at 150 cm is 97.75%, and at 200 cm is 97.23%. While the precision value obtained from the calculation results at 50 cm is 74.06%, at 100 cm is 81.93%, at 150 cm is 86.57%, and at 200 cm is 93.06%.

From these measurements with varying object distance, the highest shot accuracy, 98.32%, was obtained at 50 cm, while the highest shot precision, 93.06%, was obtained at 200 cm. It can therefore be concluded that the closer the target, the higher the accuracy but the lower the precision, because the bullet impact points spread around the target's midpoint; the opposite holds as the distance increases.

G. Target Position and Response-Time Tracking Measurement

This measurement examines the tracking response time and the system's movement from right to left, from top to bottom, and vice versa. The movement is performed while aiming the gun, and the bullet launcher responds only when the target is within range of the camera. The results of this measurement are shown in Table 7.

Table 7. Measurement of target position and response time when tracking the target.

Position | Servo Degree (Horizontal) | Servo Degree (Vertical) | Response Time (s) | Aiming accuracy when the laser is on
Right (x-axis) | 30 | 50 | 1.10 | Accurate
Left (x-axis) | 15 | 50 | 0.65 | Accurate
Up (y-axis) | 90 | 60 | 0.87 | Accurate
Down (y-axis) | 90 | 45 | 0.43 | Accurate
Right (x-axis) + Up (y-axis) | 115 | 60 | 1.57 | Accurate
Right (x-axis) + Down (y-axis) | 115 | 45 | 1.36 | Accurate
Left (x-axis) + Up (y-axis) | 75 | 60 | 1.56 | Accurate
Left (x-axis) + Down (y-axis) | 75 | 45 | 1.34 | Accurate
Average response time for tracking the target | 1.11


In this system, the bullet launcher shoots accurately based on the target position, with an average response time of 1.11 s. The bullet launcher also directs the gun as soon as the target marker lights up at the set position.
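
A minimal sketch of how such a response time could be logged is given below. Here detect_marker, move_servos, and aim_is_on_target are hypothetical placeholders standing in for the prototype's recognition and servo-control routines, and capture is assumed to behave like an OpenCV VideoCapture; this is an illustration under those assumptions, not the authors' implementation.

```python
import time

def measure_response_time(capture, detect_marker, move_servos, aim_is_on_target):
    """Time from the first marker detection until the launcher settles on target.

    capture          : camera object with a read() -> (ok, frame) method
    detect_marker    : hypothetical hook returning the target centroid or None
    move_servos      : hypothetical hook commanding the dual-axis servos
    aim_is_on_target : hypothetical hook, True once the aim point coincides
                       with the marked target
    """
    while True:
        ok, frame = capture.read()
        if not ok:
            continue
        target = detect_marker(frame)        # unique-marker recognition step
        if target is None:
            continue                         # keep scanning until the marker appears
        t0 = time.monotonic()                # marker seen: start the clock
        move_servos(target)                  # drive the pan/tilt servos toward it
        while not aim_is_on_target():
            time.sleep(0.01)                 # poll until the aim has settled
        return time.monotonic() - t0         # Table 7 reports about 1.1 s on average
```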

IV. CONCLUSIONS

The automatic unique-marker shooter system is designed to track, detect, lock, mark, and shoot objects that have been marked with a unique marker. The system consists of object recognition, a drive, and a bullet launcher, and it is designed with a closed-loop control system. Pneumatics is used as the power source for launching the bullet, and a pressure sensor provides the feedback signal for the control system; the pressure sensor used has a measurement error of ±3%. The system shoots precisely, with an accuracy of 98.32% at a target distance of 50 cm and a precision of 93.06% at 200 cm.

ACKNOWLEDGMENTS

The authors are grateful to the Institut Teknologi Sepuluh Nopember for its financial support under the publication writing and IPR incentive program (PPHKI) project scheme.

Figure 1. Integration of object detection, unique marker, and bullet launcher mechanism.
Figure 2. (a) The PWM of laser blinking, consisting of t_on/t_bright, t_off, and t_dim. (b) Real-time observation by the camera and the masked region on the computer.
Figure 3. Block diagram of the bullet launcher with a unique automatic marker.
Figure 4. Color-filter mechanism as a detector in indoor conditions.
Figure 5. Object with blue color in pattern detection.
Figure 6. Determination of the size of objects with the same shape and color.
Figure 7. Distance measurement by the camera on a circle.
Figure 8. Distance measurement by the camera on a square.
Figure 9. Distance measurement by the camera on a triangle.
Figure 10. Object-tracking mechanism: (a) position of the instrument and objects, and (b) the target in the computer view.
Figure 11. The effect of increasing the duty cycle on the output voltage.
Figure 12. The effect of increasing the duty cycle on intensity; these measurements were taken indoors.
Figure 13. The effect of increasing the duty cycle on intensity; these measurements were taken outdoors.

Table 1. Detection results of pattern, size, and approximate distance.

Real Object Side Size (cm) | Pattern | Detection Distance Min (cm) | Detection Distance Max (cm) | Detected Size at Min Distance (pixel) | Detected Size at Max Distance (pixel)
5  | Circle    | 120 | 240 | 26x23 | 11x10
10 | Circle    | 120 | 400 | 52x49 | 12x11
15 | Circle    | 120 | 680 | 78x75 | 12x11
5  | Rectangle | 120 | 240 | 26x25 | 12x11
10 | Rectangle | 120 | 400 | 53x53 | 14x13
15 | Rectangle | 120 | 640 | 80x83 | 13x13
5  | Triangle  | 120 | 120 | 22.4x18.6 | 22.4x18.6
10 | Triangle  | 120 | 200 | 45x45 | 25x25
15 | Triangle  | 120 | 360 | 75x73.1 | 22x23

Table 2. Conditions of laser measurement as a target marker; this measurement was carried out outdoors.


