Optica Publishing Group

Multiple instruments motion trajectory tracking in optical surgical navigation

Open Access

Abstract

Optical surgical navigation systems have become a hot research topic because of their high accuracy. This paper focuses on identifying and tracking multiple surgical instruments to meet the requirement of applying multiple surgical instruments in clinical medicine. The proposed algorithm combines instrument identification based on the markers' geometrical arrangement with instrument tracking based on the markers' motion vectors. Experiments on the identification and tracking of multiple instruments, on the instruments' stability, and on the space distance and rotation of a pair of instruments tracked at the same time were performed to verify the proposed algorithm. A stereoscopic camera is applied to capture images, and two 850 nm filters are added in front of the binocular camera. The tracking experiment shows that ten instruments can be fully and accurately identified and then quickly and accurately tracked at the same time. A pair of instruments, a typical surgical instrument (TSI) and a miniature surgical instrument (MSI), is measured simultaneously in the stability test. The standard deviations (SDs) of the stability test for the TSI range from 0.016 mm to 0.127 mm in the X-axis, from 0.011 mm to 0.090 mm in the Y-axis, and from 0.124 mm to 0.901 mm in the Z-axis. The SDs of the MSI's stability test range from 0.011 mm to 0.133 mm in the X-axis, from 0.010 mm to 0.106 mm in the Y-axis, and from 0.093 mm to 0.932 mm in the Z-axis. These sub-millimeter SDs show that the proposed algorithm has high stability. Moreover, a space distance test and a rotation test were performed while simultaneously tracking the TSI and MSI. All experimental results indicate that the proposed algorithm is able to meet clinical accuracy requirements.

© 2019 Optical Society of America under the terms of the OSA Open Access Publishing Agreement

1. Introduction

Surgical navigation systems have been widely accepted and used in recent years because of their high precision and minimal invasiveness. A surgical navigation system consists of a positioning system, an image processing workstation, and surgical instruments. The patient's image data are acquired preoperatively. During the operation, the positional relationship between the surgical instrument and the lesion is obtained by the positioning system, and the surgical instrument is registered with the CT or MRI images acquired before the operation. The pose of the surgical instrument can then be displayed on the screen in real time as a virtual surgical instrument to guide the doctor in performing the surgery [1]. This approach has two advantages. One is that the surgical navigation system has high accuracy, which improves the success ratio and precision of the surgery. The other is that operations can be performed in as small an area as possible, which reduces bleeding, relieves pain, and shortens recovery time. Initially, surgical navigation systems were mainly used in neurosurgery. Nowadays, the applications of surgical navigation are growing. Many studies based on surgical navigation systems have been conducted in a variety of surgeries, such as ear, nose and throat (ENT) surgery, knee arthroplasty, and maxillofacial surgery [2–7]. Surgical navigation systems are playing an important role in more and more clinical operations. The application of a surgical navigation system in cervical spine surgery is proposed in [8]; the positioning system used there is an optical positioning system, and the study was validated by cadaveric testing and clinical experiments. In the cadaveric testing, the axial and sagittal translational navigation errors (mean ± SD) were 1.66 ± 1.18 mm and 2.08 ± 2.21 mm, whereas the axial and sagittal angular errors were 4.11 ± 3.79 degrees and 6.96 ± 5.40 degrees, respectively.
In the clinical validation, the axial and sagittal translational errors were 1.92 ± 1.37 mm and 1.27 ± 0.97 mm, whereas the axial and sagittal angular errors were 3.68 ± 2.59 degrees and 3.47 ± 2.93 degrees, respectively. The application of a surgical navigation system in corrective spine surgery is studied in [9]. The translation error was 1.4 ± 1.2 mm, while the residual orientation error was 2.7 ± 2.6 degrees, 2.8 ± 2.4 degrees, and 2.5 ± 2.8 degrees in the coronal, sagittal, and axial planes, respectively. The application of a surgical navigation system in percutaneous needle placement is presented in [10]; a pre-clinical experiment showed that the maximum angle error was 0.94 degrees and the maximum position error of a target located 80 mm below the end-effector was 1.31 mm. According to the experimental results, the proposed system can fully meet these clinical accuracy requirements. Optical navigation systems [11] are increasingly used because of their sub-millimeter accuracy, but the cost of commercial optical navigation systems is usually high because multiple surgical instruments are tracked by complicated algorithms. An infrared navigation system usually tracks surgical instruments in one of two ways. One is active optical tracking, where the camera captures the light emitted [12,13] by infrared markers fixed on the surgical instruments. The other is passive optical tracking, where illuminators emit light and the camera captures the light reflected by retro-reflective markers attached to the instruments. However, the accuracy of a passive optical tracking system is lower than that of an active optical tracking system because the retro-reflective markers are vulnerable to wear with repeated use. The proposed algorithm is therefore based on an active optical tracking system.

A number of commercial optical tracking systems have now been developed, such as the Polaris optical measurement system of Northern Digital Inc. (NDI) [14], the StealthStation system of Medtronic Navigation Inc. [15], the NAV3i platform of Stryker [16], and the ART Surgical Navigation System of OMNIlife science Inc. [17]. Furthermore, many universities and research institutions are devoted to developing navigation systems and improving the accuracy of tracking instruments. At the same time, many Chinese companies are also committed to developing optical navigation systems. Shenzhen Anke [18] developed the ASA-610V navigation system used for neurosurgery and ENT surgery; its minimum positioning deviation is 1 mm. TINAVI [19] of Beijing developed the TiRobot for spine surgery with accuracy within 1 mm. Most studies are currently limited to tracking one surgical instrument. However, more than one instrument is needed in many operations [20,21]. For example, two surgical instruments need to be fixed on the femur and tibia in total knee arthroplasty [22]. Some existing methods of tracking multiple instruments are described below. The Polaris Vega optical system of NDI can simultaneously track passive tools equipped with reflective marker spheres and active wireless tools. The tracking method is based on measuring the lengths of line segments and the angles formed by their intersections [23]. In this scenario, at most six active wireless tools can be tracked. The disadvantage of the algorithm is that the lengths of the line segments cannot be equal; otherwise, the tools cannot be tracked. The Hybrid Polaris Spectra system can track nine wired active tools [14]. The wired active markers have to be activated individually to track the rigid bodies equipped with these markers [24]. However, this method is costly, and the number of trackable wired active tools is greatly limited by the hardware. Li Liu et al.
[25] applied the extended Kalman filter to track multiple targets or markers; tracking multiple instruments is then based on the different distances between all markers. The real-time tracking trajectory based on the Kalman filter is basically the same as the reconstructed trajectory calculated by stereo triangulation. Although the algorithm is able to solve the delay problem to a certain extent, the targets cannot be well tracked when they move fast, and the distances between all markers must not be equal. Ken Cai et al. [26] proposed that the line between a marker and the optical center of the left camera and the line between the same marker and the optical center of the right camera must intersect. To correctly identify and track multiple instruments, the markers were tagged. This method can correctly match the markers in the images, but the distances between the markers must be unequal. Additionally, it cannot track an instrument if the shape formed by the markers is an isosceles triangle. For these reasons, an algorithm is proposed that consists of instrument identification based on the markers' geometrical arrangement (IIMGA) and instrument tracking based on the markers' motion vector (ITMMV).

At present, different geometrical arrangements of markers have been proposed for the identification and tracking of multiple surgical instruments. Usually, the distances between adjacent markers and the angles of the geometry formed by the markers are used to distinguish multiple instruments. In the proposed method, the length ratio and perimeter of the triangle formed by the markers are used for instrument identification. In addition, the motion vector technique is usually used in video coding and computer vision. For the first time, the authors combine the motion vector technique with the simultaneous tracking of markers on multiple different surgical instruments. In this paper, the IIMGA method is applied to accurately identify multiple instruments; it uses the length ratio and perimeter formed by the markers of each instrument, and allows the distances between two adjacent markers of each instrument to be either equal or unequal. The ITMMV method is then used to track multiple identified instruments simultaneously; it tracks the markers based on their motion vectors in real time, which enables the instruments to be tracked quickly.

The rest of this paper is organized as follows. Section 2 describes the proposed algorithm in detail, including identifying and tracking multiple instruments. The framework of the proposed algorithm is introduced in Section 2.1. Section 2.2 proposes the IIMGA method, which performs identification quickly, and Section 2.3 proposes the ITMMV method, which implements trajectory tracking in real time. Next, the experiments on measurement accuracy are presented in Section 3, which has three parts: in Section 3.1, the stability of two instruments tracked at the same time is measured; in Section 3.2, the accuracy of the space distance test is verified; and in Section 3.3, the rotation test of the two instruments is performed. In Section 4, the experimental results are discussed and concluded.

2. Method

2.1 Framework

In previous work [27], an optical tracking system based on a binocular camera was designed, but only a single simulated surgical instrument was tracked in that scheme. However, multiple surgical instruments need to be tracked during surgery. Based on the previous work, an algorithm is proposed to identify and track multiple instruments at the same time. The proposed algorithm mainly includes two parts: one is the identification of instruments based on the unique geometries formed by the markers attached to each surgical instrument, and the other is the simultaneous tracking of each surgical instrument based on the motion vector. The framework of the proposed algorithm is illustrated in Fig. 1.


Fig. 1 Framework of the proposed optical tracking system.


The procedure of the proposed algorithm is as follows. First, a stereoscopic camera is used to capture images of multiple surgical instruments. Zhang's calibration [28] is applied to obtain the intrinsic and extrinsic parameters. A pair of filters is added in front of the camera to eliminate the effects of ambient light. Next, the pixel coordinates of the markers' centers are calculated. The IIMGA algorithm is applied to identify the different instruments, and the positions of the tips are calculated based on the instruments' calibrations. Then ITMMV is used to track the instruments' trajectories. If multiple instruments are identified in the (t-1)th frame, they can be rapidly tracked in the tth frame based on the motion vectors of the markers between adjacent frames. As shown in Fig. 1, MV is the motion vector of the markers between the tth and (t-1)th frames, where t > 1. Finally, the tip coordinates of the multiple instruments are calculated. The proposed algorithm can accurately identify and rapidly track multiple instruments.
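As a rough illustration only, the per-frame flow of Fig. 1 can be sketched as the following loop. The names `detect_centers`, `iimga_identify`, and `itmmv_track` are hypothetical placeholders for the steps detailed in Sections 2.2 and 2.3, not the authors' implementation.

```python
def run_tracking(frames, iimga_identify, itmmv_track, detect_centers):
    """Per-frame pipeline sketch: identify instruments once with IIMGA,
    then follow them with ITMMV on every subsequent frame.
    `frames` yields (left_image, right_image) pairs."""
    tracked = None
    trajectories = []
    for left_img, right_img in frames:
        # Pixel coordinates of all marker centers in both images (Eq. (1))
        centers = (detect_centers(left_img), detect_centers(right_img))
        if tracked is None:
            # First frame: full identification by geometrical arrangement
            tracked = iimga_identify(centers)
        else:
            # Later frames: fast matching by motion-vector amplitude
            tracked = itmmv_track(tracked, centers)
        trajectories.append(tracked)
    return trajectories
```

Stereo matching, 3D reconstruction, and tip calibration would follow each frame's marker assignment, as the text describes.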

2.2 Instrument identification based on markers’ geometrical arrangement

In the tracking system, each surgical instrument is equipped with three light-emitting diodes (LEDs). The principle of the optical tracking system is stereo vision. The actual distance between two adjacent markers of an instrument is predetermined, but the pixel distance between the two markers in the pixel coordinate system changes with the shooting angle and depth. The system cannot automatically identify the correspondence between the pixel coordinates in the captured image and the world coordinates used for measuring the shapes of the markers' geometrical arrangement. In addition, the order of the markers in the left and right recorded images can differ due to the placement of the cameras. Therefore, an algorithm based on the unique geometrical arrangement of each instrument's markers is proposed. The markers of each instrument form a triangle with a different length ratio and perimeter when all instruments are parallel to the cameras. The steps are as follows:

  • 1) Get the pixel coordinates of markers’ centers. Capture the images of multiple surgical instruments to be identified. Calculate the pixel coordinates of all markers’ centers in the left and right images using the region growing and gray centroid method [27]. The pixel coordinates of a marker’s center are denoted as (u,v), and they are calculated by
    $$u=\frac{\sum_{i=1}^{n}u_i f(u_i,v_i)}{\sum_{i=1}^{n}f(u_i,v_i)},\qquad v=\frac{\sum_{i=1}^{n}v_i f(u_i,v_i)}{\sum_{i=1}^{n}f(u_i,v_i)}\tag{1}$$

    where n is the number of pixels in the marker region, (ui,vi) are the coordinates of the ith pixel, and f(ui,vi) is the corresponding grayscale value of the ith pixel. Suppose there are L surgical instruments to be identified, and at least three markers {A,B,C} are attached to each instrument, denoted as {A1,B1,C1},{A2,B2,C2},...,{AL,BL,CL}, where l = 1, 2, …, L is the index of the instrument.

  • 2) Find two markers of the surgical instrument to be identified. Taking the left image as an example, sort all markers by their u values in the pixel coordinate system, and set the marker with the minimum u value as point Al(uAl,vAl). Then find the marker whose u value differs from that of point Al by no more than a predetermined value and which is nearest to point Al among all qualifying points. The marker selected is denoted as point Bl(uBl,vBl).
  • 3) Find the third marker in two steps. First, the length ratio is applied to identify the instruments. Conditions on the third marker are set according to the geometrical arrangement of the markers on the instrument to be identified; for example, the u value of the marker can lie between the u values of the other two markers while its v value is larger than theirs. The third marker is denoted as point Cl(uCl,vCl). Then the Euclidean distances between points Al and Bl and between points Al and Cl are calculated; they are denoted as DAlBl and DAlCl, respectively. The ratio of DAlBl to DAlCl, denoted as η, is calculated as Eq. (2):
    $$\eta=D_{A_lB_l}/D_{A_lC_l}\tag{2}$$
where DAlBl and DAlCl are calculated by

$$\begin{cases}D_{A_lB_l}=\left[(u_{A_l}-u_{B_l})^2+(v_{A_l}-v_{B_l})^2\right]^{1/2}\\D_{A_lC_l}=\left[(u_{A_l}-u_{C_l})^2+(v_{A_l}-v_{C_l})^2\right]^{1/2}\end{cases}\tag{3}$$

Suppose the predetermined ratio of the lth surgical instrument is δl; then the difference between δl and η is constrained as follows:

$$|\eta-\delta_l|<\varepsilon_1\tag{4}$$
where the empirical error ε1 accounts for the fact that the measured ratios of an instrument cannot be perfectly precise. If η and δl satisfy Eq. (4), the three markers belong to the lth instrument.

Instruments with different geometrical arrangements of markers can be identified by their length ratios. However, the length ratios of two different instruments may be so similar that they cannot be effectively distinguished, so the perimeters are also used to recognize multiple surgical instruments. If the perimeter of an instrument in the world coordinate system is a constant P, the pixel perimeter of the instrument in the image can be related to it by the principle of stereo vision. As shown in Fig. 2, there is a point in the camera coordinate system, denoted as A(XA,YA,ZA), and point a(xa,ya) is the projection of point A on the imaging plane. The camera is modeled as a pinhole, and the focal length is denoted by f.


Fig. 2 The transformation between the camera coordinate system and the image coordinate system


According to the similar-triangle relationship in Fig. 2, Eq. (5) can be derived:

$$x_a=\frac{f}{Z_A}X_A,\qquad y_a=\frac{f}{Z_A}Y_A\tag{5}$$
Suppose there are two points in the camera coordinate system, namely points A(XA,YA,ZA) and B(XB,YB,ZB). The projections of the two points onto the image plane are a(xa,ya) and b(xb,yb), respectively. The Euclidean distance between the two points is represented as follows:
$$E=\left[(X_A-X_B)^2+(Y_A-Y_B)^2+(Z_A-Z_B)^2\right]^{1/2}\tag{6}$$
Combining Eq. (5) with Eq. (6), the following formula is derived:
$$E=\left[\left(\frac{x_aZ_A}{f}-\frac{x_bZ_B}{f}\right)^2+\left(\frac{y_aZ_A}{f}-\frac{y_bZ_B}{f}\right)^2+(Z_A-Z_B)^2\right]^{1/2}\tag{7}$$
Suppose ZA=ZB=Z, then Eq. (7) can also be written as:
$$E=\frac{Z}{f}\left[(x_a-x_b)^2+(y_a-y_b)^2\right]^{1/2}\tag{8}$$
Furthermore, the relationship between the image coordinate system and the pixel coordinate system is expressed as,
$$u=\frac{x}{d_x}+u_0,\qquad v=\frac{y}{d_y}+v_0\tag{9}$$
where the coordinates of a point in the image coordinate system are (x,y), the physical dimensions of each pixel along the x- and y-axes of the image plane are dx and dy, and the origin of the image coordinate system is located at (u0,v0) in the pixel coordinate system. The coordinates (u,v) of the point (x,y) in the pixel coordinate system are calculated by Eq. (9).

Combining Eq. (9) with Eq. (8), the following formula is obtained:

$$E=\frac{Z}{f}\left[(u_a-u_b)^2d_x^2+(v_a-v_b)^2d_y^2\right]^{1/2}\tag{10}$$
where dx = dy for the camera used in this paper, and the value of f/dx can be obtained from the camera calibration. Then Eq. (11) is obtained:
$$E=\frac{d_xZ}{f}d\tag{11}$$
where d is the distance between points a and b in the pixel coordinate system. The coordinates of points a and b in the pixel coordinate system are (ua,va) and (ub,vb), respectively.

The distances in the world coordinate system and the camera coordinate system are the same because their origins coincide. If three markers are installed on an instrument, it is assumed that the distances from each marker to the camera are the same. With this assumption, the relationship between the actual perimeter P and the pixel perimeter p is given in Eq. (12).

$$P=\frac{d_xZ_w}{f}p\tag{12}$$

where P is the actual perimeter of the shape formed by the three markers in the world coordinate system, Zw is the actual distance from a marker to the camera in the world coordinate system, and p is the pixel perimeter of the three markers in the pixel coordinate system. Equation (12) shows that the pixel perimeter p decreases as Zw increases. Pl is defined as the perimeter of the lth instrument. If P and Pl satisfy the following formula, the three markers belong to the lth instrument:

$$|P-P_l|<\varepsilon_2\tag{13}$$
where ε2 is an error tolerance that accounts for the distances from each marker to the camera being slightly different.

  • 4) Delete the identified markers and repeat the above steps if markers remain. The same operations performed on the left image are then performed on the right image.

Through the above steps, the three markers of each instrument are accurately identified. Stereo matching and 3D reconstruction [27] are then performed for each instrument, and the coordinates of the markers are calculated. Finally, the tip coordinates of each instrument are derived based on its calibration.
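The identification steps above can be sketched as follows. This is a simplified, hypothetical illustration: `centroid` implements the gray-centroid formula of Eq. (1), and `identify_instrument` checks the length-ratio test of Eqs. (2)-(4) and the perimeter test of Eqs. (12)-(13). The `scale` argument stands in for the factor dx·Zw/f, and the tolerance values are arbitrary placeholders for ε1 and ε2.

```python
import math

def centroid(pixels):
    """Gray-weighted centroid of a marker blob, Eq. (1).
    `pixels` is a list of (u, v, gray) tuples from region growing."""
    total = sum(g for _, _, g in pixels)
    u = sum(u * g for u, _, g in pixels) / total
    v = sum(v * g for _, v, g in pixels) / total
    return (u, v)

def identify_instrument(markers, ratio_ref, perim_ref, scale,
                        eps_ratio=0.05, eps_perim=5.0):
    """Check whether three marker centers (Al, Bl, Cl) match one instrument's
    predetermined length ratio (Eqs. (2)-(4)) and actual perimeter
    (Eqs. (12)-(13)). `scale` plays the role of dx*Zw/f in Eq. (12)."""
    A, B, C = markers
    d = lambda p, q: math.hypot(p[0] - q[0], p[1] - q[1])
    ratio = d(A, B) / d(A, C)                          # Eq. (2)
    perimeter = scale * (d(A, B) + d(B, C) + d(C, A))  # Eq. (12)
    return abs(ratio - ratio_ref) < eps_ratio and abs(perimeter - perim_ref) < eps_perim
```

In practice ε1 and ε2 would be tuned to the camera and marker geometry, and the perimeter check would use the calibrated f/dx and the measured marker depth Zw.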

2.3 Instrument tracking based on the markers’ motion vector

The surgeon needs to adjust the surgical instrument's position and orientation according to the lesion's position in real time during clinical surgery. The markers' geometrical arrangement varies when the surgical instrument rotates, and the length ratio will exceed the predetermined range if the rotation angle is too large. Therefore, the ITMMV algorithm is proposed in this paper to track multiple surgical instruments in real time. Its steps are as follows:

  • 1) If multiple surgical instruments are accurately identified in the (t-1)th frame, then the coordinates of the markers on every instrument can be calculated to locate the tips of multiple instruments. The tth frame images are then captured to get the pixel coordinates of all markers’ centers.
  • 2) The markers are tracked based on the motion vector. Since the relative motion between adjacent frames is small, that is, the motion vector between adjacent frames is small, the amplitude of the motion vector can be defined as in Eq. (14):
    $$I_{V,W}(t)=|MV_{V,W}|=|P_W(t)-P_V(t-1)|\tag{14}$$

    In the above formula, IV,W(t) is the amplitude of the motion vector MVV,W from point V in the (t-1)th frame to point W in the tth frame. The parameters PV(t-1) and PW(t) are the position vectors of point V in the (t-1)th frame and point W in the tth frame, respectively, where PV(t-1)=(uV,vV) and PW(t)=(uW,vW). Suppose the three markers on the lth instrument identified in Section 2.2 are Al, Bl, and Cl. Next, the pixel coordinates of all markers' centers are calculated in the tth frame. The amplitudes of the motion vectors are used to determine which three markers belong to the lth instrument in the tth frame, as illustrated in Eq. (15).

    $$\begin{cases}M_{A_l}(t)=\min\{I_{A_lA_1}(t),I_{A_lB_1}(t),I_{A_lC_1}(t),\ldots,I_{A_lA_L}(t),I_{A_lB_L}(t),I_{A_lC_L}(t)\}\\M_{B_l}(t)=\min\{I_{B_lA_1}(t),I_{B_lB_1}(t),I_{B_lC_1}(t),\ldots,I_{B_lA_L}(t),I_{B_lB_L}(t),I_{B_lC_L}(t)\}\\M_{C_l}(t)=\min\{I_{C_lA_1}(t),I_{C_lB_1}(t),I_{C_lC_1}(t),\ldots,I_{C_lA_L}(t),I_{C_lB_L}(t),I_{C_lC_L}(t)\}\end{cases}\tag{15}$$

    In Eq. (15), MAl(t), MBl(t), and MCl(t) are the minimum amplitudes of the motion vectors from the markers Al, Bl, and Cl in the (t-1)th frame to all the markers in the tth frame, respectively. The markers with the minimum values belong to the lth instrument in the tth frame. Thus, each instrument can be tracked based on the amplitudes of the motion vectors of their markers.

  • 3) The markers’ coordinates of each instrument are calculated based on stereo matching and 3D reconstruction. Finally, the coordinates of multiple instruments’ tips are obtained.

Therefore, the ITMMV algorithm achieves the tracking of markers fixed on every instrument based on the motion vectors of the adjacent frames, and calculates the positions of the instruments’ tips.
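A minimal sketch of the ITMMV matching rule of Eqs. (14)-(15): each marker known in frame t-1 is assigned to the detection in frame t with the smallest motion-vector amplitude. Function and variable names are illustrative; a full implementation would also resolve conflicting assignments and run on both camera images before stereo reconstruction.

```python
import math

def track_markers(prev_markers, curr_points):
    """Assign each marker tracked in frame t-1 to its nearest detection in
    frame t, i.e. minimize the motion-vector amplitude of Eq. (14).
    `prev_markers`: dict name -> (u, v); `curr_points`: list of (u, v)."""
    assignment = {}
    for name, p_prev in prev_markers.items():
        # I_{V,W}(t) = |P_W(t) - P_V(t-1)| for every candidate W, Eq. (14)
        amps = [math.hypot(q[0] - p_prev[0], q[1] - p_prev[1])
                for q in curr_points]
        # Keep the detection with the minimum amplitude, Eq. (15)
        assignment[name] = curr_points[amps.index(min(amps))]
    return assignment
```

Because adjacent frames are close in time, the minimum-amplitude detection is assumed to be the same physical marker, which is what makes per-frame tracking cheap compared with re-running the full IIMGA identification.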

2.4 Experiment methods

To verify the validity and accuracy of the proposed algorithm, four experiments are carried out: an identification and real-time tracking test, a stability test, a space distance test, and a rotation test of multiple surgical instruments. In the identification and real-time tracking test, ten simulated surgical instruments are used to verify the effectiveness of the proposed algorithm. In the stability test, a Total Station is applied, and two simulated instruments are fixed on the Total Station at the same time; the stability of the two instruments is then measured simultaneously. In the space distance test, translational movements of the instruments along the X-axis and Z-axis are simulated to measure the translational accuracy of two instruments. The two instruments are rigidly fixed on the sliding head of a Grating Ruler and moved at the same time. The Grating Ruler is used to measure the actual distances that the two instruments moved, and the distances calculated by the proposed algorithm are compared with these actual distances. In the rotation test, the rotational accuracy of the two instruments is measured at the same time. A Total Station is used to measure the angles through which the two instruments rotate, and the angles displayed on the screen of the Total Station are regarded as the ground truth. The angles calculated by the proposed algorithm are then compared with the true angles to obtain the rotational accuracy. A schematic view of the experimental setup is shown in Fig. 3.


Fig. 3 The schematic view of the experimental setup.


3. Results

A Bumblebee2 camera [29] from FLIR Integrated Imaging Solutions Inc. is used to capture images. A pair of infrared filters with a passband from 800 nm to 1100 nm is added in front of the camera to eliminate interference from ambient light. Several surgical instruments were made by the authors, each with three LEDs fixed on it; ten of them are shown in Fig. 4, where every LED is marked by a red circle. The geometrical arrangements of the markers on these instruments are different: the arrangement on instrument VII is approximately a right triangle, that on instrument X is an obtuse triangle, and those on the other instruments are approximately isosceles triangles. In addition, the T-shaped fixed bases of instruments I to VII were made by the authors, and instruments VIII to X are biopsy needles. The wireless instruments IX and X are battery powered, and the other instruments are wired.


Fig. 4 Surgical instruments designed by authors with different markers’ geometrical arrangements.


The following measurement experiments are designed and completed to verify the effectiveness and feasibility of the proposed algorithm. In the simulation, two Bumblebee2 cameras were used: one named Hori-camera and the other named Vert-camera. The Hori-camera is applied in the real-time tracking, stability, horizontal distance, and rotation angle tests; the Vert-camera is used in the vertical distance test. All experiments were carried out on a Precision Optical Table (OTPOT20-10, Zolix Instruments Co., Ltd, Beijing, China), as shown in Fig. 5. The Grating Ruler (KA-300/5U/1020 mm, SION, Guangzhou, China) and the Total Station (R-202NE, Pentax, Tokyo, Japan) are used in all experiments except the real-time tracking experiment.


Fig. 5 Diagram of experimental measurements at different viewpoints. (a) Diagram of the Cameras, and the Grating Ruler; (b) Diagram of the Hori-camera, the Vert-camera and the Precision Optical Table, and (c) Diagram of the filters, the TSI and the MSI attached to the Total Station.


Calibration is performed before the measurement experiments. The object of camera calibration is to obtain the model parameters of a camera, including the intrinsic and extrinsic parameters. In order to determine the relationship between a point in three-dimensional space and its corresponding point in the two-dimensional image plane, it is necessary to establish a camera model. Following Zhang's calibration, a printed checkerboard is conveniently adopted as the calibration board. By continuously changing the pose of the checkerboard, images of the checkerboard were obtained. Then, the correspondences between the corner points in the images and on the checkerboard are calculated, and finally the intrinsic and extrinsic parameters are obtained. At least three images of the checkerboard in different poses are needed in Zhang's calibration to obtain the intrinsic and extrinsic parameters. To obtain more accurate parameters, 20 images of the checkerboard in different poses were captured by the Hori-camera in Fig. 6(a), and more images were captured by the Vert-camera in Fig. 6(b). The rectangular slices in Figs. 6(a) and 6(b) represent the checkerboard images in different poses, and the digit on each slice is the index of the checkerboard image captured during camera calibration. Due to the different distances between the camera and the checkerboard in the two calibrations, the positions of the checkerboard images are different. The results of the Hori-camera's calibration are shown in Fig. 6(a): the distances between the left and right cameras in the X-, Y-, and Z-axes are 119.8 mm, −0.5094 mm, and 0.1512 mm, respectively, which shows that the left and right cameras are approximately aligned. The Vert-camera's calibration results are shown in Fig. 6(b): the distances between the left and right cameras in the X-, Y-, and Z-axes are 120 mm, −5.292 mm, and −0.1863 mm, respectively. Although these distances show a certain deviation between the left and right cameras in the Y-axis, the X- and Z-axes are well aligned.


Fig. 6 Calibration results demonstrating two cameras’ alignment. (a) The Hori-camera’s calibration result; (b) The Vert-camera’s calibration result.


In the simulated experiment with multiple instruments, in order to show the differences between positioning multiple instruments simultaneously and positioning a single instrument in the translational and rotational tests, two instruments are fixed on the Grating Ruler or the Total Station at the same time.

3.1 Instrument identification and real-time tracking test

To verify whether the proposed algorithm can identify multiple instruments and track them simultaneously, the ten self-made instruments were used for simulation, each with three markers. The two plots in Fig. 7 show the tracking trajectories of the ten simulated surgical instruments moving in groups and independently, respectively, with the motion trajectories of different instruments represented by different colors. For example, the red lines in Figs. 7(a) and 7(b) both represent simulated surgical instrument VI. In Fig. 7(a), the brown, red, and purple lines represent the trajectories of instruments III, VI, and VIII moving as a group, whereas in Fig. 7(b) the same three colored lines are the trajectories of instruments III, VI, and VIII moving independently. The two plots show that the proposed algorithm can track multiple instruments whether they move in groups or independently.


Fig. 7 Tracked trajectories of the ten instruments using the proposed algorithm under different conditions. (a) The trajectories of the ten instruments moving within groups (see Visualization 1); (b) The trajectories of the ten instruments moving independently (see Visualization 2).


Figure 7 indicates that the proposed algorithm is able to display the positions and trajectories of the tips of multiple surgical instruments in real time. Although NDI's Polaris Vega can track six active wireless tools, the number of active instruments that can be simultaneously tracked is limited, and the active tools have to be wireless. Moreover, although the Hybrid Polaris Spectra is able to track nine wired active tools at the same time, the method of tracking wired active tools is complicated and costly. Both methods limit the number of tools that can be tracked. In the proposed algorithm, the tracking of multiple instruments is limited neither by the markers' arrangement pattern nor by the way the markers are activated. Moreover, the computational cost of the proposed system is low.

3.2 Stability test

In the stability test, the experiment was carried out by fixing multiple instruments on the Total Station. However, it is difficult to fix more instruments on the Total Station. In order to simulate stability with more than one instrument at the same time, two instruments were selected from the ten and are shown in Fig. 8(a). The bigger instrument is the TSI, and the other is the MSI; the TSI and MSI are instruments III and VII in Fig. 4, respectively. They are then fixed on the Total Station as shown in Fig. 8(b).


Fig. 8 Diagram of the TSI and MSI. (a) Markers’ geometrical arrangements for the TSI and MSI; (b) The TSI and MSI attach to the Total Station, and (c) The TSI and MSI attach to the Grating Ruler.


The TSI and MSI are measured simultaneously in the stability simulation. All knobs are tightened to keep the instruments rigidly attached to the Total Station, which is placed on a Precision Optical Table. Position points are selected at intervals of about 75 mm along the X-axis and about 50 mm along the Z-axis. Pairs of images of the simultaneously activated TSI and MSI are captured 100 times at each position, and the standard deviations (SD) along the X-, Y-, and Z-axes are calculated. The standard deviation measures the dispersion of the coordinates along the X-axis and is given by

$$\mathrm{SD}=\left[\frac{1}{N}\sum_{i=1}^{N}\left(s_{i}-\bar{s}\right)^{2}\right]^{\frac{1}{2}}\tag{16}$$
where $N$ is the number of images captured, $s_i$ is the $i$th X-coordinate value, and $\bar{s}$ is the mean of the X-coordinate values. Equation (16) is applied to the Y- and Z-axes in the same way. Stability is represented by the SD. The results are shown in Fig. 9.
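As a minimal sketch, the per-axis SD of Eq. (16) can be computed directly from the 100 repeated tip measurements at one position. The function name and the synthetic noise levels below are illustrative, not taken from the paper:

```python
import numpy as np

def per_axis_stability(tip_positions):
    """Standard deviation of repeated tip measurements along each axis.

    tip_positions: (N, 3) array of (x, y, z) tip coordinates in mm,
    captured with the instrument held still (N = 100 in the paper's test).
    Returns (SDX, SDY, SDZ) as in Eq. (16).
    """
    tips = np.asarray(tip_positions, dtype=float)
    # Population SD (divide by N), matching the 1/N factor in Eq. (16).
    return tuple(tips.std(axis=0, ddof=0))

# Synthetic example: a nominally fixed tip with small measurement noise,
# larger along Z (depth), mimicking the trend reported in the results.
rng = np.random.default_rng(0)
tips = np.array([10.0, -5.0, 600.0]) + rng.normal(0, [0.05, 0.04, 0.4], (100, 3))
sdx, sdy, sdz = per_axis_stability(tips)
```

With identical inputs the SD is exactly zero; with noisy inputs each returned value estimates the per-axis dispersion of the tip.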


Fig. 9 The SDs’ results for the TSI and MSI. (a) is the SDs for the TSI’s X-, Y-, and Z- axes, and (b) is the SDs for the MSI’s X-, Y-, and Z- axes.


The stability results along the X-, Y-, and Z-axes of the TSI, that is, the stability of the TSI’s tip in the x-, y-, and z-coordinates, are shown in Fig. 9(a) as red, blue, and black circles, respectively. The corresponding results for the MSI are shown in Fig. 9(b). In Fig. 9, the x-coordinate of the instrument tip is the abscissa and its z-coordinate is the ordinate. The SDs along the X-, Y-, and Z-axes are denoted SDX, SDY, and SDZ. For the TSI, the minima of SDX, SDY, and SDZ are 0.016 mm, 0.011 mm, and 0.124 mm, and the maxima are 0.127 mm, 0.090 mm, and 0.901 mm, respectively; the measurement range along the Z-axis is 477 mm to 1032 mm. For the MSI, the minimum and maximum SDs are 0.011 mm and 0.133 mm for the X-axis, 0.010 mm and 0.106 mm for the Y-axis, and 0.093 mm and 0.932 mm for the Z-axis; the measurement range along the Z-axis is 483 mm to 1036 mm.

Because the SD values are very small, they are magnified for display: the radii of the circles in Fig. 9 are SDX × 100, SDY × 100, and SDZ × 10 for both the TSI and MSI.

The SDs of the TSI and MSI along the X-, Y-, and Z-axes are all sub-millimeter, indicating that the proposed algorithm has high stability. The SDs of points at the edge of the measurement range are larger than those of points in the middle, and the SDs of points close to the camera are smaller than those of points far from it, suggesting a tendency for the SD to increase with depth [30].

3.3 Space distance test

To verify the accuracy of tracking multiple instruments in space, horizontal and vertical distance tests were performed. As shown in Fig. 5(a), the Hori-camera and the Vert-camera are used for the horizontal and vertical distance tests, respectively. A Grating Ruler with a resolution of 0.005 mm measures the actual distance that the instruments move simultaneously. The two instruments are fixed on the sensor slip head, and the Grating Ruler is parallel to the X-axis. The directions of the instruments are indicated by red dashed lines in Fig. 5(b).

The Hori-camera in Fig. 5 captures images for the horizontal distance test. The TSI and MSI are placed parallel to the camera so that it can capture their respective markers. The measurement steps are as follows. First, the two instruments are placed at the starting point and the Grating Ruler’s display is reset; at the same time, 100 pairs of images are captured and the average coordinates of the TSI and MSI are calculated. The sliding head is then moved a certain distance, chosen so that the markers remain within the stereo camera’s view at all times. Next, once the TSI and MSI are stable, another 100 frames are captured and the average coordinates of the two instruments are calculated again. The Grating Ruler reading is recorded as the ground-truth value. Finally, the ground-truth value is compared with the distance calculated by the proposed algorithm. The root mean square error (RMSE) measures the deviation between the observed values and the ground-truth values over N samples:

$$\mathrm{RMSE}=\left[\frac{1}{N}\sum_{i=1}^{N}\left(x_{\mathrm{obs},i}-x_{\mathrm{model},i}\right)^{2}\right]^{\frac{1}{2}}\tag{17}$$
where $N$ is the number of samples, $x_{\mathrm{obs},i}$ is the $i$th observed value, and $x_{\mathrm{model},i}$ is the $i$th ground-truth value. Ten measurement results for the TSI and MSI are shown in Table 1.
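The averaging, distance computation, and RMSE of Eq. (17) can be sketched as below. The numeric readings are hypothetical stand-ins, not the data of Table 1:

```python
import numpy as np

def mean_tip(frames):
    """Average tip coordinate over repeated frames ((N, 3) array, mm)."""
    return np.asarray(frames, dtype=float).mean(axis=0)

def measured_distance(start_frames, end_frames):
    """Euclidean distance between the averaged start and end tip positions."""
    return float(np.linalg.norm(mean_tip(end_frames) - mean_tip(start_frames)))

def rmse(observed, ground_truth):
    """Root mean square error between observed and ground-truth distances, Eq. (17)."""
    obs = np.asarray(observed, dtype=float)
    gt = np.asarray(ground_truth, dtype=float)
    return float(np.sqrt(np.mean((obs - gt) ** 2)))

# Hypothetical readings: the algorithm's distances vs. the Grating Ruler (mm).
observed = [10.05, 20.03, 29.98, 40.06]
truth = [10.00, 20.00, 30.00, 40.00]
err = rmse(observed, truth)
```

Averaging 100 frames before differencing, as in the protocol above, suppresses per-frame localization noise before the distance is compared with the ruler reading.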


Table 1. Horizontal distance results for the TSI and MSI

The z-distances of the TSI and MSI are about 363 mm and 364 mm, respectively. The average absolute errors for the TSI and MSI are 0.057 mm and 0.056 mm, and the RMSEs are 0.065 mm and 0.061 mm, respectively. The accuracy is 0.057 ± 0.027 mm for the TSI and 0.056 ± 0.015 mm for the MSI. These results verify the high precision of the proposed algorithm.

In addition, the Vert-camera in Fig. 5 is calibrated and used for the vertical distance test, which verifies the accuracy of tracking multiple instruments along the Z-axis. The TSI and MSI are again fixed on the Grating Ruler, and the measurement steps are similar to those of the horizontal distance test. Ten groups of results are shown in Table 2.


Table 2. Vertical distance results of TSI and MSI

As shown in Table 2, the average absolute error and RMSE for the TSI are 0.201 mm and 0.347 mm, and those for the MSI are 0.136 mm and 0.261 mm, respectively. The accuracy is 0.201 ± 0.183 mm for the TSI and 0.136 ± 0.145 mm for the MSI. The depth ranges in the vertical distance test are [650, 796] mm for the TSI and [580, 726] mm for the MSI.

3.4 Rotation test

In the rotation test, the Total Station is again used to measure the accuracy of the rotation angle. The TSI and MSI are attached to the two sides of the Total Station, as shown in Fig. 5(c); the rotation trajectory is marked by a red curved arrow. The knob on the Total Station is tightened to prevent free rotation during the test.

The measurement steps of the rotation test are as follows. First, with the Total Station and the two instruments at rest, 100 frames of the TSI and MSI are captured simultaneously and the average angle is calculated as the starting angle; the angle reading on the Total Station’s digital display is recorded. Next, the two instruments are rotated. Images are then captured once the digital reading stabilizes, and the average angle over 100 frames is calculated as the ending angle; the Total Station’s angle reading is again recorded. The rotation angle measured by the Total Station is the ending angle minus the starting angle. Finally, this rotation angle is compared with the angle calculated by the proposed algorithm. The results are shown in Table 3.
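Assuming the instrument’s orientation is represented by a unit vector along its marker axis (an illustrative simplification; the paper does not detail how the angle is derived from the markers), the rotation angle between the averaged start and end poses could be computed as:

```python
import numpy as np

def instrument_direction(marker_a, marker_b):
    """Unit vector along the instrument axis defined by two of its markers."""
    d = np.asarray(marker_b, dtype=float) - np.asarray(marker_a, dtype=float)
    return d / np.linalg.norm(d)

def rotation_angle_deg(dir_start, dir_end):
    """Angle (degrees) between the start and end direction vectors."""
    c = np.clip(np.dot(dir_start, dir_end), -1.0, 1.0)  # guard acos domain
    return float(np.degrees(np.arccos(c)))

# Hypothetical marker pair before and after a 30-degree rotation about Z.
d0 = instrument_direction([0, 0, 0], [100, 0, 0])
theta = np.radians(30)
d1 = instrument_direction([0, 0, 0], [100 * np.cos(theta), 100 * np.sin(theta), 0])
angle = rotation_angle_deg(d0, d1)
```

Averaging the directions over 100 frames before taking the angle, as in the protocol above, would reduce the effect of per-frame marker noise.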


Table 3. Rotation angle comparison for the TSI and MSI

Ten groups of angles from 5 degrees to 60 degrees were measured. The distance from the camera is about 610 mm for the TSI and about 622 mm for the MSI. The average absolute error and RMSE for the TSI are 0.059 degrees and 0.082 degrees, respectively; for the MSI they are 0.040 degrees and 0.087 degrees. The accuracy is 0.059 ± 0.043 degrees for the TSI and 0.040 ± 0.049 degrees for the MSI.

The TSI and MSI remain within the stereo camera’s view at all times during the rotation test, which simulates the rotation of surgical instruments in clinical practice. The experimental results show that the proposed algorithm not only identifies and tracks multiple instruments accurately but also achieves high rotational accuracy.

4. Discussion and conclusion

In this paper, an algorithm for accurately identifying and rapidly tracking surgical instruments is proposed. It has several advantages. First, it meets clinical accuracy requirements, and its accuracy is of the same order of magnitude as that of other commercial surgical tracking systems. Second, it can effectively track multiple instruments at the same time: whether the instruments move in groups or independently, the proposed algorithm accurately identifies them and tracks their motion trajectories in the three-dimensional world coordinate system. Finally, new identification and tracking algorithms are proposed. The IIMGA is applied for identification and is confined neither by the geometrical shape of the markers’ arrangement nor by the activation mode of active markers. Moreover, the ITMMV is proposed and implemented for continuous tracking of an identified instrument, adopting the motion-vector technique from computer vision.
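The motion-vector matching at the heart of the ITMMV can be illustrated with a minimal sketch: each marker tracked at frame t−1 is assigned the detection at frame t with the smallest displacement magnitude |P_W(t) − P_V(t−1)|. This is a simplified reconstruction under assumed names, not the authors’ implementation; in particular, the greedy one-to-one assignment below is an assumption:

```python
import numpy as np

def match_markers(prev_positions, curr_detections):
    """Greedy motion-vector matching sketch.

    prev_positions: (M, 3) marker positions tracked at frame t-1.
    curr_detections: (K, 3) marker detections at frame t.
    Each tracked marker is assigned the unclaimed detection minimizing
    the displacement magnitude |P_W(t) - P_V(t-1)|.
    Returns a dict mapping previous-marker index -> detection index.
    """
    prev = np.asarray(prev_positions, dtype=float)
    curr = np.asarray(curr_detections, dtype=float)
    matches, taken = {}, set()
    for v, p in enumerate(prev):
        dists = np.linalg.norm(curr - p, axis=1)  # displacement magnitudes
        for w in np.argsort(dists):               # nearest unclaimed detection
            if int(w) not in taken:
                matches[v] = int(w)
                taken.add(int(w))
                break
    return matches

# Two markers moving slightly between frames; detections arrive shuffled.
prev = [[0, 0, 500], [50, 0, 500]]
curr = [[50.4, 0.2, 500.1], [0.3, -0.1, 500.2]]
assignment = match_markers(prev, curr)
```

Because inter-frame marker displacements are small relative to the spacing between markers, the smallest-displacement detection is normally the same physical marker, which is what makes continuous tracking of an identified instrument possible.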

In the simulation experiments, the TSI and MSI are both fixed on the Total Station or the Grating Ruler, and 100 frames are captured at each position in all measurements. The SDs of the TSI and MSI in the stability test show that the instruments are tracked stably. In the horizontal distance test, the MSI and TSI show similarly high translational accuracy. The vertical distance test is less accurate than the horizontal one, possibly because of the different calibration coefficients of the two cameras, the different depths captured, or human factors. Even so, the experiments show high accuracy; therefore, the proposed algorithm can accurately identify and rapidly track multiple instruments.

However, some aspects of the algorithm’s application still need improvement. For the camera, the frame rate, resolution, and capture range are the main factors that directly affect the stability and positional accuracy of the instruments. For the instruments, the material and the fixation of the infrared luminescent markers are also important for identification and tracking. Nevertheless, the experimental results verify the high stability and accuracy of the proposed algorithm, which meets the needs of tracking multiple surgical instruments in clinical surgical navigation systems.

Funding

National Natural Science Foundation of China (Nos. 61672362, 61272255); Beijing Natural Science Foundation (No. 4172012); Scientific Research Common Program of Beijing Municipal Commission of Education (No. KM201710025011).

Acknowledgments

The authors would like to thank Mr. Hans Hermans for manuscript editing, and Mr. Zhentian Zhou and Mr. Boqi Jia for simulating the instruments.

Disclosures

The authors declare that there are no conflicts of interest related to this article.

References

1. G. Widmann, R. Widmann, E. Widmann, W. Jaschke, and R. Bale, “Use of a surgical navigation system for CT-guided template production,” Int. J. Oral Maxillofac. Implants 22(1), 72–78 (2007). [PubMed]  

2. S. Engelhardt, R. De Simone, S. Al-Maisary, S. Kolb, M. Karck, H. P. Meinzer, and I. Wolf, “Accuracy evaluation of a mitral valve surgery assistance system based on optical tracking,” Int. J. CARS 11(10), 1891–1904 (2016). [CrossRef]   [PubMed]  

3. R. Niehaus, D. Schilter, P. Fornaciari, C. Weinand, M. Boyd, M. Ziswiler, and S. Ehrendorfer, “Experience of total knee arthroplasty using a novel navigation system within the surgical field,” Knee 24(3), 518–524 (2017). [CrossRef]   [PubMed]  

4. K. M. Peters, E. Hutter, R. A. Siston, J. Bertran, and M. J. Allen, “Surgical navigation improves the precision and accuracy of tibial component alignment in canine total knee replacement,” Vet. Surg. 45(1), 52–59 (2016). [CrossRef]   [PubMed]  

5. G. Dagnino, I. Georgilas, S. Morad, P. Gibbons, P. Tarassoli, R. Atkins, and S. Dogramadzi, “Image-Guided surgical robotic system for percutaneous reduction of joint fractures,” Ann. Biomed. Eng. 45(11), 2648–2662 (2017). [CrossRef]   [PubMed]  

6. A. F. Mavrogenis, O. D. Savvidou, G. Mimidis, J. Papanastasiou, D. Koulalis, N. Demertzis, and P. J. Papagelopoulos, “Computer-assisted navigation in orthopedic surgery,” Orthopedics 36(8), 631–642 (2013). [CrossRef]   [PubMed]  

7. T. J. Liu, A. T. Ko, Y. B. Tang, H. S. Lai, H. F. Chien, and T. M. H. Hsieh, “Clinical application of different surgical navigation systems in complex craniomaxillofacial surgery: The use of multisurface 3-Dimensional images and a 2-Plane reference system,” Ann. Plast. Surg. 76(4), 411–419 (2016). [CrossRef]   [PubMed]  

8. D. Guha, R. Jakubovic, N. M. Alotaibi, R. Deorajh, S. Gupta, M. G. Fehlings, T. G. Mainprize, A. Yee, and V. X. D. Yang, “Optical topographic imaging for spinal intraoperative 3-dimensional navigation in the cervical spine: initial preclinical and clinical feasibility,” Clin. Spine Surg. (2019).

9. H. Jobidon-Lavergne, S. Kadoury, D. Knez, and C. E. Aubin, “Biomechanically driven intraoperative spine registration during navigated anterior vertebral body tethering,” Phys. Med. Biol. (to be published).

10. Z. Han, K. Yu, L. Hu, W. Li, H. Yang, M. Gan, N. Guo, B. Yang, H. Liu, and Y. Wang, “A targeting method for robot-assisted percutaneous needle placement under fluoroscopy guidance,” Comput. Assist. Surg. (Abingdon), 1–9 (2019).

11. S. J. Rhee, S. H. Park, H. M. Cho, and J. T. Suh, “Comparison of precision between optical and electromagnetic navigation systems in total knee arthroplasty,” Knee Surg. Relat. Res. 26(4), 214–221 (2014). [CrossRef]   [PubMed]  

12. K. Xu, L. Huang, Z. Zhang, J. Zhao, Z. Zhao, Z. Zhang, L. W. Snyman, and J. W. Swart, “Light emission from a poly-silicon device with carrier injection engineering,” Mater. Sci. Eng. B 231, 28–31 (2018).

13. W. Chen, B. Javidi, and X. Chen, “Advances in optical security systems,” Adv. Opt. Photonics 6(2), 120–155 (2014). [CrossRef]  

14. Northern Digital Inc, “NDI Products: Polaris Spectra and Vicra,” https://www.ndigital.com.

15. Medtronic, “Products: Surgical Navigation Systems,” http://www.medtronic.com/us-en/healthcare-professionals/products.html.

16. Stryker Corporation, “Product: Computer assisted surgery platforms,” http://www.stryker.com/cn/products/OREquipmentTelemedicine/SurgicalNavigation/SurgicalNavigationSystems/index.htm.

17. OMNIlife science, “Products: Robotics,” http://apexknee.com/products/navigation.

18. Anke, “Products & Solutions: Neurosurgery Series,” http://www.anke.com/ProductsMore.asp?ClassID=11.

19. TINAVI, “Products: TiRobot,” http://www.tinavi.com/index.php?m=content&c=index&a=lists&catid=25.

20. T. C. Clark and F. H. Schmidt, “Robot-Assisted navigation versus computer-assisted navigation in primary total knee arthroplasty: Efficiency and Accuracy,” ISRN Orthop. 2013, 794827 (2013). [CrossRef]   [PubMed]  

21. B. Li, L. Zhang, H. Sun, S. G. F. Shen, and X. Wang, “A new method of surgical navigation for orthognathic surgery: optical tracking guided free-hand repositioning of the maxillomandibular complex,” J. Craniofac. Surg. 25(2), 406–411 (2014). [CrossRef]   [PubMed]  

22. M. Bolognesi and A. Hofmann, “Computer navigation versus standard instrumentation for TKA: a single-surgeon experience,” Clin. Orthop. Relat. Res. 440(440), 162–169 (2005). [CrossRef]   [PubMed]  

23. S. E. Leis, “System for determining the spatial position and orientation of a body,” US6061644 (2000).

24. A. D. Wiles, D. G. Thompson, and D. D. Frantz, “Accuracy assessment and interpretation for optical tracking systems,” Proc. SPIE 5367, 421 (2004). [CrossRef]  

25. L. Liu, B. Sun, N. Wei, C. Hu, and M. Q. H. Meng, “A Novel marker tracking method based on extended kalman filter for multi-camera optical tracking systems,” in 2011 5th International Conference on Bioinformatics and Biomedical Engineering, (Academic, 2011), pp. 1–5. [CrossRef]  

26. K. Cai, R. Yang, Q. Lin, and Z. Wang, “Tracking multiple surgical instruments in a near-infrared optical system,” Comput. Assist. Surg. (Abingdon) 21(1), 46–55 (2016). [CrossRef]   [PubMed]  

27. Z. Zhou, B. Wu, J. Duan, X. Zhang, N. Zhang, and Z. Liang, “Optical surgical instrument tracking system based on the principle of stereo vision,” J. Biomed. Opt. 22(6), 65005 (2017). [CrossRef]   [PubMed]  

28. Z. Zhang, “A flexible new technique for camera calibration,” IEEE Trans. Pattern Anal. Mach. Intell. 22(11), 1330–1334 (2000). [CrossRef]  

29. FLIR Integrated Imaging Solutions Inc, “Stereo vision bumblebee2 1394A,” https://www.ptgrey.com/bumblebee2-firewire-stereo-vision-camera-systems.

30. R. Khadem, C. C. Yeh, M. Sadeghi-Tehrani, M. R. Bax, J. A. Johnson, J. N. Welch, E. P. Wilkinson, and R. Shahidi, “Comparative tracking error analysis of five different optical tracking systems,” Comput. Aided Surg. 5(2), 98–107 (2000). [CrossRef]   [PubMed]  

Supplementary Material (2)

NameDescription
Visualization 1       Ten simulated surgical instruments are accurately tracked when they are moved in groups.
Visualization 2       Ten simulated surgical instruments are accurately tracked when they are moved individually.
