Experience-based virtual training system for knee arthroscopic inspection

Abstract

Background

Arthroscopic surgical training is inherently difficult due to limited visibility, restricted freedom of motion and non-intuitive hand-eye coordination. Traditional training methods, as well as virtual reality (VR) approaches, lack the direct guidance of an experienced physician.

Methods

This paper presents an experience-based arthroscopic training simulator that integrates motion tracking with a haptic device to record and reproduce the complex trajectory of an arthroscopic inspection procedure. Proficient arthroscopic operation requires extensive practice because the knee-joint space is narrow and the anatomic structures are complex. The trajectory of the arthroscope operated by an experienced surgeon is captured during clinical treatment, and a haptic device is then used to guide trainees in the virtual environment to follow that trajectory.

Results

An experiment was conducted in which eight subjects performed the same arthroscopic inspection on the simulator, with and without force guidance. The results show that most subjects performed better after repeating the same inspection five times, and that most performed better with force guidance than without it. On average, the path error with force guidance was 33.01% lower, and the operation time 14.95% shorter, than without force guidance.

Conclusions

We developed a novel virtual knee arthroscopic training system with visual and haptic guidance. Compared to traditional VR training systems, which offer only a single play-script based on a virtual model, the proposed system can track and reproduce real-life arthroscopic procedures and build a useful training database. Our experiment indicates that force guidance can efficiently shorten the learning curve of novice trainees. Through such a system, novices can efficiently develop the required surgical skills under the visual and haptic guidance of an experienced surgeon.

Background

In recent years, arthroscopy has played a significant role in enabling orthopaedic surgeons to perform minimally invasive surgery (MIS) on human joints. Compared to traditional open surgery, MIS techniques provide the benefits of less trauma, reduced pain, and faster healing. However, training of arthroscopic procedures is inherently difficult because of the limited visibility, the reduced degrees of freedom of the instrument, and the non-intuitive hand-eye coordination. It is vital for novice arthroscopic trainees to receive extensive training before real surgeries to avoid surgical mistakes or unexpected injuries to patients. Conventional arthroscopic skill training relies on cadavers, animals, or physical models. However, cadavers and animals cannot be reused repeatedly, and physical models cannot provide realistic sensory feedback [1, 2].

To overcome the above-mentioned shortcomings, virtual reality (VR) simulation provides an alternative solution [3–8]. A virtual model can be created from medical imaging data such as computed tomography (CT) or magnetic resonance imaging (MRI), and training can be performed in an immersive VR environment. A VR simulator can be reused many times and avoids risking patients’ health. Megali et al. [3] added training exercises to their already-developed navigation system; their results showed that performance on these exercises increases with surgical experience. Mabrey et al. [4] devised a virtual reality arthroscopic knee simulator consisting of a video display, two force feedback devices (one for the tool, one for the arthroscope), and a surrogate leg model with built-in sensors. However, they noted that hardware costs were high, with initial software development costs even higher. Heng et al. [5] built a virtual reality training system for arthroscopic knee surgery. They also developed a dedicated two-hand haptic device, enclosed in a box, to present the user with force feedback in a purely virtual environment. Their system supports inspection training, allowing the user to explore the virtual knee joint. In a similar way, Bayonat et al. [6] developed a virtual shoulder arthroscopy training simulator with force feedback. Moody et al. [7] presented a training environment that incorporated a realistic, manipulable leg model to provide tactile augmentation to their virtual training environment; again, virtual and external views of the arthroscope were shown. Tuijthof et al. [8, 9] developed a physical environment for practicing arthroscopic surgical skills that simulates real-life operative treatment. Wang et al. [10] developed a surgical procedure simulation system, vKASS, for arthroscopic anterior cruciate ligament (ACL) reconstruction; compared to previous studies, this system simulates the entire procedure of arthroscopic ACL reconstruction. More recently, two commercial systems, Insight ArthroVR [11] and ARTHRO Mentor™ [12], allow trainees to practice pre-built scenarios with force feedback delivered through a SensAble Omni®. Furthermore, they provide a set of performance metrics for the objective assessment of skill development and the learning process at the end of each session.

Although a VR arthroscopic system can be reused many times and a systematic training program can be embedded into the VR environment to replace conventional training methods, most current systems still have the following drawbacks. First, the training program is mostly based on a single play-script and a virtual model, which means it is not integrated with real-life clinical treatments: real surgical trajectories cannot be captured and embedded into the system to match different clinical situations. Second, the rendering of a VR model from CT or MRI has traditionally been monochrome. As a result, experienced surgeons still find it difficult to pass on their surgical skills to residents or novices through a VR-based training system. Without good guidance and practice, medical students cannot develop mature skills for such delicate tasks. Therefore, most current VR-based training systems improve their rendering by texture mapping to strengthen the visual guidance. Studies have even shown that poor manipulation of the arthroscopic instrument can result in harmful collisions in real clinical situations [6]. Hence, force guidance is a good way to address these situations. Various studies [13–18] have shown that force guidance can effectively shorten learning curves in many fields.

Moreover, most systems are costly due to the expensive integration of complex hardware and software components. Last but not least, it is important to objectively evaluate the skill levels of trainees so that they can continuously improve their surgical skills. However, since real-life surgical cases are difficult to reproduce and incorporate into traditional VR-based systems, it is also hard to establish an objective evaluation standard for assessing arthroscopic surgical skills.

Currently, computer-aided surgical navigation systems are available to assist surgeons in different treatments [19–21]. These systems combine surgical planning, camera calibration, registration and motion tracking. Tracking technology continuously registers the positions of the patient and the surgical instruments using different methods [22]; magnetic, optical and vision-based tracking techniques are well known. Generally, cameras suffer from lens distortion, and the perspective projection is described by the intrinsic and extrinsic matrices, which can be obtained using well-established camera calibration techniques [23, 24].

During treatment, the trajectory of the surgical tool operated by a surgeon can be recorded by motion tracking. As a result, the clinical trajectory of an experienced surgeon can be integrated into traditional VR-based training. Through this integration, the training system provides different real-life surgical cases and helps novices follow the clinical trajectory of the experienced surgeon. This paper presents an experience-based haptic training system for knee arthroscopic inspection. The key element of the proposed system is the integration of a motion tracking module with a haptic device, such that the real-life surgical procedure and inspection trajectory can be captured and used to guide a novice trainee in repeating and practicing the same clinical routine. Compared to a traditional VR training system with a single play-script based on a virtual model, the proposed system can capture and store different real-life clinical arthroscopic procedures and create a useful training database. Experienced surgeons can record their arthroscopic procedures during clinical treatment and pass on their surgical skills and precious experience.

Methods

The overall system consists of a pre-processing module, a clinical module and a training module, as shown in Figure 1.

In the pre-processing module, a calibration is performed for vision-based tracking. A virtual knee-joint model is then built from a real patient’s medical images, and the compartments are segmented from the model. Instead of using monotone shading or artificial texture mapping, we map the original arthroscopic images onto the triangular surfaces reconstructed from CT, so that the training model is not just a single-script example. This is important because some damaged or inflamed areas cannot be displayed without color images. Hence, the 2D arthroscopic view from clinical treatment is mapped onto the 3D surfaces of the virtual knee-joint compartments.

In the clinical module, motion tracking is used to compute and record the arthroscopic procedure, or inspection trajectory, during a simulated surgical case performed by an experienced surgeon. One fiducial marker is fixed on the arthroscope, and a calibrated vision-based tracking system records the simulated trajectory of the arthroscope, including its position and orientation. After the pre-processing and clinical inspection, the arthroscope trajectory, the clinical arthroscopic view and its texture data are integrated with the virtual knee-joint model to create a real-world clinical database.

Finally, in the training module, trainees can select different scenarios from the database to practice their skills. To filter unwanted noise in the trajectory data, NURBS (non-uniform rational B-spline) interpolation is used to produce a smooth arthroscope trajectory. A registration procedure between the trajectory and the virtual knee-joint compartments is then performed manually to match their positions and orientations in space. The realistic knee-joint model is loaded from the database for visualization: trainees not only see the trajectory of the arthroscope in the realistic environment, but also operate a haptic device as the arthroscope. During the operation, trainees feel a guiding force whenever their operation strays far from the original trajectory. Finally, we evaluate the trainees’ performance against the original trajectory with and without force guidance.

Figure 1. The system architecture.

Pre-processing module

Camera calibration

For vision-based tracking, camera calibration is important to ensure good tracking accuracy. In this system, a two-step calibration method [25] is adopted to reduce distortion and compute the camera parameters. The first step, distortion calibration, corrects the uneven spacing between the dots of a calibration pattern in the camera image. The second step computes the camera intrinsic matrix, consisting of the focal length, image center and effective pixel size, which defines the perspective transformation of the camera.
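The paper’s two-step dot-pattern method [25] is not reproduced here; as a rough illustration of the same outcome (recovering the intrinsic matrix K_c and the distortion coefficients from views of a planar target), the following sketch uses OpenCV’s standard checkerboard calibration. The pattern size and the image set are our assumptions, not the authors’ setup.

```python
# Hedged sketch: planar-target calibration with OpenCV, analogous in effect
# to the two-step method of [25]. PATTERN and "calib/*.png" are assumptions.
import glob
import cv2
import numpy as np

PATTERN = (9, 6)  # inner-corner count of an assumed checkerboard target

# 3D points of the flat target in its own frame (z = 0), one copy per view
objp = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2)

obj_points, img_points, image_size = [], [], None
for fname in glob.glob("calib/*.png"):         # hypothetical calibration images
    gray = cv2.imread(fname, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, PATTERN)
    if found:
        obj_points.append(objp)
        img_points.append(corners)
        image_size = gray.shape[::-1]          # (width, height)

# K_c holds the focal length and image center; dist models lens distortion.
rms, K_c, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, image_size, None, None)
```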

Segmentation of medical images and texture mapping

To build the virtual knee-joint compartments, medical image segmentation software (MIMICS, Leuven, Belgium) is used to segment the meniscus, articular cartilage, plica, ligament, femur, tibia, patella and fibula from the medical images. After segmentation, mesh defects such as holes and self-intersections remain in the virtual model. Hence, reverse-engineering software (Geomagic Studio®, Morrisville, USA) is used to repair these defects and then carry out texture mapping. In the proposed system, the texture comes from the clinical arthroscopic view and is mapped onto the surfaces of the virtual knee-joint compartments. After texture mapping, finer surface details can be observed on these virtual compartments, as shown in Figure 2. Damaged or inflamed surface areas can also be revealed.

Figure 2. Texture mapping for the anterior cruciate ligament. (a) The clinical arthroscopic view; (b) the virtual ligament model; (c) the textured model.

Motion tracking

In the motion tracking, a transformation called the camera extrinsic matrix (T_cm), relating the marker coordinates to the camera coordinates, is computed and recorded whenever the marker is detected. In this procedure, the camera intrinsic matrix (K_c) is required to represent the relationship between the camera coordinates and the camera screen coordinates, because marker detection is performed in camera screen coordinates. The transformations among the camera, camera screen and marker coordinates are illustrated in Figure 3.

Figure 3. The relationship between the camera, camera screen and marker coordinates.

K_c can be computed by the two-step camera calibration method, and this only needs to be done once before the motion tracking. Once K_c is obtained, T_cm can be computed and recorded continuously. In Figure 3, a 3D point P is expressed in the marker and camera coordinates as P_m = [x_m, y_m, z_m]^T and P_c = [x_c, y_c, z_c]^T, respectively. The relationship is as follows:

P_c = \begin{bmatrix} x_c \\ y_c \\ z_c \\ 1 \end{bmatrix} = \begin{bmatrix} R_{3\times3} & t_{3\times1} \\ \mathbf{0}_{1\times3} & 1 \end{bmatrix}_{4\times4} \begin{bmatrix} x_m \\ y_m \\ z_m \\ 1 \end{bmatrix} = T_{cm} P_m
(1)

where R_{3×3} and t_{3×1} are the rotation matrix and translation vector of T_cm. In this paper, a square marker of known size serves as the base of the marker coordinates. The camera extrinsic and intrinsic matrices are then computed by Kato’s method [25]. During motion tracking, the first stage is marker detection in camera screen coordinates, which requires an image analysis step: binarizing the image and identifying the black marker frame and symbol. Finally, the position and orientation of the marker are obtained from the known K_c and the above information.
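To make Eq. (1) concrete, the following minimal sketch assembles T_cm from a rotation R and translation t, as a marker detector would report them, and maps a marker-frame point into camera coordinates. The numeric values are illustrative only.

```python
# Minimal sketch of Eq. (1): homogeneous transform from marker to camera frame.
import numpy as np

def make_T(R: np.ndarray, t: np.ndarray) -> np.ndarray:
    """Build the 4x4 transform [[R, t], [0 0 0, 1]] of Eq. (1)."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def marker_to_camera(T_cm: np.ndarray, P_m: np.ndarray) -> np.ndarray:
    """P_c = T_cm [P_m, 1]^T, returned as a plain 3-vector."""
    return (T_cm @ np.append(P_m, 1.0))[:3]

# Identity rotation, 100-unit offset along the camera z-axis (assumed units)
T_cm = make_T(np.eye(3), np.array([0.0, 0.0, 100.0]))
print(marker_to_camera(T_cm, np.array([5.0, 0.0, 0.0])))  # -> [5. 0. 100.]
```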

In the proposed system, two cameras are used to overcome the occlusion problem. The associated coordinate systems and transformations in the virtual world are illustrated in Figure 4. All coordinate systems used in this paper are right-handed. In the virtual world, the key references are the marker coordinates {X_m, Y_m, Z_m}, the left tracking camera coordinates {X_l, Y_l, Z_l}, the right tracking camera coordinates {X_r, Y_r, Z_r}, the left tracking camera screen coordinates {X_lu, Y_lv}, and the right tracking camera screen coordinates {X_ru, Y_rv}.

Figure 4. The virtual-world components and transformations for the two cameras.

Given a 2D projection p_l on the viewing plane of the left tracking camera screen coordinates, its 3D point P_l in the left tracking camera coordinates can be computed through K_l. Then P_m is obtained as P_m = T_lm^{-1} P_l. Once P_m is known, the corresponding 3D point P_r in the right tracking camera coordinates can be computed through T_rl. The relation between P_l and P_r is given by

P_m = T_{mr} P_r = T_{ml} P_l, \qquad P_r = T_{mr}^{-1} T_{ml} P_l = T_{rl} P_l, \qquad T_{rl} = T_{mr}^{-1} T_{ml}
(2)

The flowchart for computing the trajectory is shown in Figure 5. A marker of known size and two CCD (charge-coupled device) cameras are used in the proposed system, which records the left and right frames simultaneously at 30 frames per second. Marker detection is then performed on these frames, with three possible cases. First, if the marker is detected in the left CCD frame, the current trajectory sample is obtained directly. Second, if the marker is not detected in the left CCD frame, detection is attempted in the right CCD frame; if the marker is found there, the current trajectory sample is obtained through the transformation of Eq. (2). Third, if the marker cannot be detected in either frame, the current trajectory sample is replaced by the previous one. Finally, the trajectory samples of all frames are recorded; a code sketch of this loop follows Figure 5.

Figure 5. The flowchart of the motion tracking.

Filtering of tracking data by NURBS

There is inherent measurement noise in the tracking data of the arthroscope. This is undesirable, since a smooth trajectory is required for arthroscopic training. In this paper, we fit a NURBS curve to the tracking data to generate smooth trajectory curves. This also compresses the large amount of tracking data and provides efficient storage. A pth-degree NURBS curve [26] is defined by

C(u) = \frac{\sum_{i=0}^{n} N_{i,p}(u)\, w_i P_i}{\sum_{i=0}^{n} N_{i,p}(u)\, w_i} = \sum_{i=0}^{n} R_{i,p}(u)\, P_i, \qquad 0 \le u \le 1
(3)

where the P_i are the control points, the w_i the weights, and N_{i,p} the ith B-spline basis function of degree p defined on the non-periodic knot vector U.

U = \{ \underbrace{0, 0, \dots, 0}_{p+1},\; u_{p+1}, \dots, u_n,\; \underbrace{1, 1, \dots, 1}_{p+1} \}
(4)

N_{i,p} is defined by Eqs. (5)–(6), and R_{i,p}, the rational basis function, is defined by Eq. (7).

N_{i,0}(u) = \begin{cases} 1 & \text{if } u_i \le u < u_{i+1} \\ 0 & \text{otherwise} \end{cases}
(5)
N_{i,p}(u) = \frac{u - u_i}{u_{i+p} - u_i} N_{i,p-1}(u) + \frac{u_{i+p+1} - u}{u_{i+p+1} - u_{i+1}} N_{i+1,p-1}(u)
(6)
R_{i,p}(u) = \frac{N_{i,p}(u)\, w_i}{\sum_{j=0}^{n} N_{j,p}(u)\, w_j}
(7)

Given arthroscopic trajectory data with n discrete points {Q_i | i = 0, 1, …, n−1}, a NURBS curve with m control points {P_j | j = 0, 1, …, m−1} (m < n) can be constructed using Eq. (8) [27].

\begin{bmatrix} 1 & 0 & \cdots & 0 \\ R_{0,p}(\bar{u}_1) & R_{1,p}(\bar{u}_1) & \cdots & R_{m-1,p}(\bar{u}_1) \\ R_{0,p}(\bar{u}_2) & R_{1,p}(\bar{u}_2) & \cdots & R_{m-1,p}(\bar{u}_2) \\ \vdots & \vdots & & \vdots \\ R_{0,p}(\bar{u}_i) & R_{1,p}(\bar{u}_i) & \cdots & R_{m-1,p}(\bar{u}_i) \\ \vdots & \vdots & & \vdots \\ 0 & 0 & \cdots & 1 \end{bmatrix}_{n \times m} \begin{bmatrix} P_0 \\ P_1 \\ P_2 \\ \vdots \\ P_{m-1} \end{bmatrix}_{m \times 1} = \begin{bmatrix} Q_0 \\ Q_1 \\ Q_2 \\ \vdots \\ Q_{n-1} \end{bmatrix}_{n \times 1}, \qquad m < n
(8)

where the parameter \bar{u}_i corresponding to the tracking data point Q_i is calculated as

\bar{u}_i = \frac{t_i - t_0}{t_1 - t_0}
(9)

where t_0 is the tracking start time, t_1 the tracking end time, and t_i the time stamp of Q_i. An example is shown in Figure 6.

Figure 6. The NURBS curve.

After the trajectory of the arthroscopic inspection is captured, the tracking data are filtered by the NURBS fit. Finally, the trajectory is registered to the virtual model created from the CT data.
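As an illustration of the fitting step, the sketch below builds the n×m basis matrix of Eq. (8) at the time-based parameters of Eq. (9) and solves for the control points in a least-squares sense. It assumes uniform weights and a clamped uniform knot vector, and reuses rational_basis() from the previous sketch; these choices are ours, not necessarily the authors’.

```python
# Least-squares NURBS fit per Eqs. (8)-(9); Q are tracked points, t timestamps.
import numpy as np

def fit_nurbs(Q: np.ndarray, t: np.ndarray, m: int, p: int = 3):
    """Q: (n,3) tracked points; t: (n,) timestamps; returns (m,3) control points."""
    n = len(Q)
    u_bar = (t - t[0]) / (t[-1] - t[0])                 # Eq. (9)
    # Clamped uniform knot vector, as in Eq. (4)
    U = np.concatenate([np.zeros(p + 1),
                        np.linspace(0, 1, m - p + 1)[1:-1],
                        np.ones(p + 1)])
    w = np.ones(m)                                      # uniform weights (assumed)
    A = np.zeros((n, m))
    for r, u in enumerate(u_bar):
        u = min(u, 1.0 - 1e-9)                          # keep u inside the last span
        A[r] = [rational_basis(j, p, u, U, w) for j in range(m)]
    A[0, :], A[0, 0] = 0.0, 1.0                         # interpolate the endpoints,
    A[-1, :], A[-1, -1] = 0.0, 1.0                      # first/last rows of Eq. (8)
    P, *_ = np.linalg.lstsq(A, Q, rcond=None)
    return P
```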

Force guidance of inspection trajectory

After the trajectory of the arthroscope is obtained, trainees can follow the optimal inspection path of the experienced surgeon according to the clinically recorded trajectory. In this system, trainees not only see the trajectory in the virtual environment but are also guided by haptic force feedback. To provide this feedback, a haptic device (Phantom, SensAble Technologies, Wilmington, MA) is used to generate the guiding force. The guiding force indicates the guiding direction, prevents the virtual arthroscope from leaving the original trajectory, and assists the trainees in performing the surgical simulation. The proposed simulator provides three guiding forces: an attractive force, a static force and a time-dependent force. The overall flowchart of the guiding force is illustrated in Figure 7.

Figure 7. The flowchart of force guidance for the trainee’s movements.

Initialization with attractive force

A recorded trajectory can be represented by a NURBS curve C(t), in which t is the recorded time parameter.

t = t_0 + u (t_1 - t_0)
(10)

and u is the normalized NURBS parameter between 0 and 1. The trainee is guided by the haptic device to follow the trajectory curve C(t), which represents the nominal inspection path obtained from the experienced surgeon. A guiding force is needed for the trainee to follow or track C(t). Assuming the trainee’s trajectory D(t) deviates by some distance from C(t), an attractive force is provided in the initialization stage to draw D(t_0) toward C(t_0). Figure 8 illustrates the attractive force F_A(t) along a segment of the trajectory curve. When the distance between C(t_0) and D(t) is larger than d_A, the trainee is guided visually by the VR simulation and haptically by the force F_A(t) to move the arthroscope toward the target C(t_0) (see Figure 9):

F_A(t) = k_A d_A \frac{C(t_0) - D(t)}{\| C(t_0) - D(t) \|}
(11)
Figure 8. The attractive force guidance.

Figure 9. The static and time-dependent force guidance.

When the arthroscope probe head lies between the spheres of radii d_t and d_A, the attractive force is reduced linearly following Eq. (12):

F_A(t) = k_A \left( C(t_0) - D(t) \right)
(12)

When the arthroscope probe head comes within the sphere of radius d_t, the attractive force is zero.
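The three regimes of the attractive force can be summarized in a few lines; the gain and radii below are illustrative placeholders, not the simulator’s tuned parameters.

```python
# Sketch of the three-regime attractive force of Eqs. (11)-(12).
import numpy as np

def attractive_force(C_t0: np.ndarray, D_t: np.ndarray,
                     k_A: float = 0.5, d_A: float = 10.0, d_t: float = 1.0):
    err = C_t0 - D_t                       # vector from probe to target C(t0)
    dist = np.linalg.norm(err)
    if dist >= d_A:                        # far away: constant magnitude, Eq. (11)
        return k_A * d_A * err / dist
    if dist >= d_t:                        # between d_t and d_A: linear spring, Eq. (12)
        return k_A * err
    return np.zeros(3)                     # inside the dead zone: no force
```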

Static force guidance

After the initialization, the trainee is guided by force to follow the trajectory C(t). If the desired trajectory C(t) is treated as a static curve, without any time constraint on following and finishing the arthroscopic inspection, then only the contour error ε between the projected point C(t*) and the current position D(t) needs to be considered. The projection C(t*) becomes the current target point:

\varepsilon = C(t^*) - D(t)
(13)

Hence, the guiding force is defined as the static (normal) guiding force F_s:

F_s(t) = k_s \varepsilon = k_s \left( C(t^*) - D(t) \right)
(14)

For novice trainees, this means there is no time constraint or pressure to finish the inspection procedure; the trainee can proceed at a comfortable speed. The static (normal) force guides the arthroscope toward the recorded trajectory along the normal direction, while the trainee is still visually guided by the VR environment to move along the path.
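A minimal sketch of the static guidance follows, assuming the NURBS curve has been densely sampled into an (N, 3) array so that the projection C(t*) can be found by a nearest-point search.

```python
# Sketch of Eqs. (13)-(14): project the probe tip onto a sampled copy of C(t).
import numpy as np

def static_force(curve: np.ndarray, D_t: np.ndarray, k_s: float = 0.5):
    d2 = np.sum((curve - D_t) ** 2, axis=1)   # squared distance to each sample
    C_star = curve[np.argmin(d2)]             # closest point, i.e. C(t*)
    eps = C_star - D_t                        # contour error, Eq. (13)
    return k_s * eps                          # normal guiding force, Eq. (14)
```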

Time-dependent force guidance

In this study, the trajectory recorded by the experienced surgeon is assumed to be the standard inspection path to follow. This includes not only the geometric path but also the varying speed along the trajectory: an experienced surgeon knows where to move the probe head faster and where to slow down for better observation and collision avoidance. Therefore, for advanced and realistic force guidance, C(t) is treated as a time-dependent trajectory, as depicted in Figure 9. The time-dependent guiding force F_d can be considered a combination of normal and tangential guiding forces:

F_d(t) = k_d e = k_d [e_x, e_y, e_z]^T
(15)

where the tracking error e is the vector from the current point D(t_c) to the target point C(t_t). Furthermore, as the trainee gains experience through repeated practice, the force guidance can be reduced and eventually removed. This can be implemented by the following scheme:

F = (1 - \nu) k e, \qquad 0 \le \nu \le 1
(16)

where ν is the training strength parameter. When ν = 0, there is full force guidance; when ν = 1, the guidance is completely removed.
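Combining Eqs. (15) and (16), a sketch of the time-dependent guidance with fading might look as follows; eval_C is a hypothetical evaluator of the recorded curve C(t).

```python
# Sketch of Eqs. (15)-(16): the target advances with the recorded clock, so
# the error gains a tangential component that pushes the trainee along the
# path; nu fades the guidance away as skill builds.
import numpy as np

def time_dependent_force(eval_C, t_target: float, D_tc: np.ndarray,
                         k_d: float = 0.5, nu: float = 0.0):
    e = eval_C(t_target) - D_tc            # tracking error e = C(t_t) - D(t_c), Eq. (15)
    return (1.0 - nu) * k_d * e            # faded guidance, Eq. (16); nu=0 full, nu=1 off
```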

Results and discussion

The hardware of our system includes a haptic device, a personal computer and a display monitor. The Phantom haptic device offers 6-DOF manipulation and provides 3-DOF (x, y, z) force feedback in the guiding module. Our system runs on a Pentium 4 1.5 GHz PC equipped with an NVIDIA GeForce3 graphics card; the PC handles all computation, including the guiding model and visual rendering. The proposed system is implemented in C++ with OpenGL.

In the clinical module, the experienced surgeon operated an arthroscope in a model of a patient’s knee during the simulated inspection, as shown in Figure 10(a) and (b). One fiducial marker, made of acrylic, was disinfected before the inspection and fixed on the arthroscope, as depicted in Figure 10(c). Finally, the trajectory of the arthroscope was saved into the database.

Figure 10. The simulated inspection. (a) The real knee model; (b) the simulated inspection; (c) the fiducial marker fixed on the arthroscope.

Virtualization

In the proposed system, a virtual knee-joint model can be loaded from the database. The original surface of the virtual knee-joint model was monochrome; after texture mapping, the visualization is enriched with color detail, as indicated in Figure 11.

Figure 11. Visualization of real and virtual arthroscopic views. (a) Virtual inflamed articular cartilage; (b) real inflamed articular cartilage; (c) virtual inflamed medial meniscus; (d) real inflamed medial meniscus.

Filtering of tracking data by NURBS

After the virtual knee-joint model is loaded, the previously recorded trajectory of the arthroscope is also retrieved from the database. Filtering the original trajectory with the NURBS fit produced much smoother trajectory curves: the noise in the original arthroscope trajectory was effectively removed, and the amount of data was greatly compressed (see Figure 12).

Figure 12. Trajectory reconstruction with the NURBS curve. (a) Original trajectory; (b) modified trajectory.

User interface

The clinical scenario is displayed on an LCD panel containing two separate windows, as shown in Figure 13 (Additional file 1). The external viewpoint window displays the surgical scene as viewed from an external viewpoint, such as the surgeon’s view; this view can be adjusted to the surgeon’s preference by changing the camera position, orientation and magnification. The arthroscopic viewpoint window can display the recorded clinical view as well as the virtual view. Although the virtual view is created by simulation, the clinical arthroscopic view is mapped onto the surfaces of the virtual knee-joint compartments within it. Trainees can switch between these two views if the patient’s medical images were obtained beforehand; otherwise, only the virtual view is available.

Figure 13. The user interface: the left window is the external viewpoint window and the right window is the arthroscopic viewpoint window (the red points are the insertion points for the arthroscope).

Force guidance of inspection trajectory

Trainees can see the trajectory in the proposed system, as shown in Figure 14. They can then feel the force guidance through the haptic device whenever their virtual probe head deviates from the planned trajectory, as indicated in Figure 15.

Figure 14. Visualization of the trajectory.

Figure 15. Force guidance of the inspection trajectory.

Evaluation

In this section, we evaluate whether force guidance improves training for inspection of the knee. Eight participants performed an inspection of the knee whose trajectory was obtained from an experienced surgeon in the clinical module. All participants were right-handed, and each performed the same inspection with and without force guidance. To evaluate the participants’ performance, the normal path error and operating time were recorded. The normal path error compares the clinical trajectory of the experienced surgeon with the movements of the Phantom tip operated by each participant. The 3D motion of the ideal and experimental trajectories is illustrated in Figure 16(a) (Additional file 2); the normal path error is the shortest distance from the trainee’s trajectory to the experienced surgeon’s trajectory, as shown in Figure 16(b).

Figure 16. The training trajectory and error. (a) 3D motion; (b) the path error.

Additional file 2: Virtual Inspection with force guidance [http://youtu.be/hEW0eaKGZP8]. (AVI 8 MB)
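The sketch below shows one way the normal path error could be computed, assuming the expert trajectory has been densely sampled from its NURBS representation; this is our reading of the metric, not the authors’ exact code.

```python
# Normal path error: shortest distance from each trainee sample to the
# surgeon's recorded trajectory, approximated against a dense polyline.
import numpy as np

def normal_path_error(trainee: np.ndarray, reference: np.ndarray):
    """trainee: (n,3) probe positions; reference: (N,3) sampled expert curve.
    Returns per-sample errors plus their mean and standard deviation."""
    diffs = trainee[:, None, :] - reference[None, :, :]    # (n, N, 3)
    dists = np.linalg.norm(diffs, axis=2)                  # all pairwise distances
    err = dists.min(axis=1)                                # closest expert point each
    return err, err.mean(), err.std()
```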

To avoid performance differences caused by unfamiliarity with the instruments, the participants first used the haptic device to perform some basic operations without force guidance in the VR environment [28]. These basic operations were based on the manufacturer’s default settings and were not modified by the authors. After becoming familiar with the haptic device and the basic operations, the participants completed the first inspection without force guidance; their operating time and normal path error were recorded as the first data set. They then repeated the same inspection five times for practice, without recording. After that, they performed the inspection again, and their operating time and normal path error were recorded as the second data set. Comparing the first and second data sets reveals the learning effect of five continuous practice runs.

To avoid the effect of accumulated learning experience, at least 3 days were allowed to pass after completion of the inspection without force guidance before the new inspection session with force guidance was started. The participants performed the inspection with force guidance in the same manner as before. Data on completion time and error are presented in Tables 1 and 2.

Table 1 Without the force guidance
Table 2 With the force guidance

The experiment reveals that most participants performed better after repeating the same inspection five times in succession. The standard deviations also clearly indicate improved performance after repetition. Furthermore, most participants performed better with force guidance than without it, as shown in Figure 17. The performance measures include the mean normal path error, its standard deviation, and the operating time. In the experiment, the average error with force guidance was 33.01% lower than without it, and the operating time was 14.95% shorter. Even though participant B’s second mean normal path error was larger than the first in the haptics-enhanced inspection, his second standard deviation was similar to the first, indicating few tremors during his operation. Overall, the normal path error data show a trend toward a benefit from force guidance in the inspection.

Figure 17. Subjects’ performance of arthroscopic inspection on the same simulator, with and without force guidance.

Force guidance is important for shortening the learning time, and the magnitude of the force plays an important role in it. In this system, the force magnitude can be adjusted through the simulated spring coefficient. Figure 18 indicates that the higher the spring coefficient, the better the learning result; however, the force becomes unstable if the spring coefficient is set too high.

Figure 18. The effect of the force magnitude.

Conclusions

In this paper, we have discussed the motivations for developing a novel virtual arthroscopic training system focused on experience-based inspection of the anatomic knee structures. In this system, the trajectory of the arthroscope operated by an experienced surgeon is recorded by vision-based tracking and stored in a database. The noise in the trajectory is filtered by a NURBS fit, which also compresses the data. The resulting NURBS curve can be used to review the entire inspection procedure and to guide trainees in the virtual environment. Furthermore, the clinical arthroscopic view is mapped onto the surfaces of the virtual knee-joint compartments, so a more realistic environment can be visualized. The trainee then operates the haptic device along the NURBS trajectory under force guidance, feeling a guiding force whenever the virtual arthroscope probe head deviates from the planned trajectory. The experimental results suggest that force guidance produces a stronger educational effect for novice trainees. Currently, deformation of soft tissue has not been implemented in the system. In the future, we will develop a soft-tissue deformation model to strengthen the simulation effect, build a dedicated haptic device with more degrees of freedom to better simulate practical treatment, and integrate a multi-metric scoring system to objectively evaluate user performance.

Abbreviations

VR: Virtual reality
MIS: Minimally invasive surgery
CT: Computed tomography
MRI: Magnetic resonance imaging
CCD: Charge-coupled device
NURBS: Non-uniform rational B-spline
AVG: Average
SD: Standard deviation

References

  1. Grechenig W, Fellinger M, Fankhauser F, Weiglein AH: The Graz learning and training model for arthroscopic surgery. Surg Radiol Anat 1999, 21: 347–350. 10.1007/BF01631337

  2. Voto S, Clark RN, Zuelzer WA: Arthroscopic training using pig knee joints. Clin Orthop Relat Res 1988, 226: 134–137.

  3. Megali G, Tonet O, Mazzoni M, Dario P, Vascellari A, Marcacci M: A new tool for surgical training in knee arthroscopy. In Proceedings of Medical Image Computing and Computer-Assisted Intervention (MICCAI); 25–28 September 2002; Tokyo. Edited by: Dohi T, Kikinis R. Berlin: Springer; 2002:170–177. (Lecture Notes in Computer Science)

  4. Mabrey J, Gillogly SD, Kasser JR, Kasser J, Sweeney HJ, Zarins B, Mevis H, Garrett WE Jr, Poss R, Cannon WD: Virtual reality simulation of arthroscopy of the knee. Arthroscopy 2002, 18: e28. 10.1053/jars.2002.33790

  5. Heng P-A, Cheng C-Y, Wong T-T, Xu Y, Chui Y-P, Chan K-M, Tso S-K: A virtual-reality training system for knee arthroscopic surgery. IEEE Trans Inf Technol Biomed 2004, 8: 217–227. 10.1109/TITB.2004.826720

  6. Bayonat S, Garcia M, Mendoza C, Fernández JM: Shoulder arthroscopy training system with force feedback. In Proceedings of the International Conference on Medical Information Visualisation–BioMedical Visualisation; 05–07 July 2006. London; 2006.

  7. Moody L, Waterworth A, McCarthy A, Harley P, Smallwood R: The feasibility of a mixed reality surgical training environment. Virtual Reality 2008, 12: 77–86. 10.1007/s10055-007-0080-8

  8. Tuijthof GJ, van Sterkenburg MN, Sierevelt IN, van Oldenrijk J, Van Dijk CN, Kerkhoffs GM: First validation of the PASSPORT training environment for arthroscopic skills. Knee Surg Sports Traumatol Arthrosc 2010, 18: 218–224. 10.1007/s00167-009-0872-3

  9. Tuijthof GJ, Visser P, Sierevelt IN, Van Dijk CN, Kerkhoffs GM: Does perception of usefulness of arthroscopic simulators differ with levels of experience? Clin Orthop Relat Res 2011, 469: 1701–1708. 10.1007/s11999-011-1797-y

  10. Wang Y, Xiong Y, Xu K, Liu D: vKASS: a surgical procedure simulation system for arthroscopic anterior cruciate ligament reconstruction. Comput Animat Virtual Worlds 2012, 24: 25–41.

  11. InsightArthroVR®. [http://insightarthrovr.gmv.com/index_en.htm]

  12. ARTHRO Mentor™. [http://simbionix.com/simulators/arthro-mentor/]

  13. Feygin D, Keehner M, Tendick F: Haptic guidance: experimental evaluation of a haptic training method for a perceptual motor skill. In Proceedings of the 10th International Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems; 24–25 March 2002. Orlando; 2002:40–47.

  14. Bluteau J, Coquillart S, Payan Y, Gentaz E: Haptic guidance improves the visuo-manual tracking of trajectories. PLoS One 2008, 3: e1775. 10.1371/journal.pone.0001775

  15. Kim Y, Yang U, Jo D, Lee G, Choi J, Park J: Efficient multi-pass welding training with haptic guide. ACM; 2009.

  16. Morris D, Tan H, Barbagli F, Chang T, Salisbury K: Haptic Feedback Enhances Force Skill Learning. In Proceedings of the Second Joint EuroHaptics Conference and Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems; 22–24 March 2007. Tsukuba; 2007:21–26.

  17. Palluel-Germain R, Bara F, de Boisferon AH, Hennion B, Gouagour P, Gentaz E: A Visuo-Haptic Device - Telemaque - Increases Kindergarten Children's Handwriting Acquisition. In Proceedings of the Second Joint EuroHaptics Conference and Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems; 22–24 March 2007. Tsukuba; 2007:72–77.

  18. Xing-Dong Y, Bischof WF, Boulanger P: Validating the Performance of Haptic Motor Skill Training. In Proceedings of the Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems; 13–14 March 2008. Washington, DC; 2008:129–135.

  19. Jabero M, Sarment DP: Advanced surgical guidance technology: a review. Implant Dent 2006, 15: 135–142. 10.1097/01.id.0000217790.68814.1e

  20. Sutherland I: A head-mounted three dimensional display. In Proceedings of Fall Joint Computer Conference, part I: 09–11 December 1968. San Francisco; 1968:757–764.

  21. Shahidi R, Bax MR, Maurer CR, Johnson JA, Wilkinson EP, Wang B, West JB, Citardi MJ, Manwaring KH, Khadem R: Implementation, calibration and accuracy testing of an image-enhanced endoscopy system. IEEE Trans Med Imag 2002, 21: 1524–1535. 10.1109/TMI.2002.806597

  22. Maintz JBA, Viergever MA: A survey of medical image registration. Med Image Anal 1998, 2: 1–36.

  23. Tsai R: A versatile camera calibration technique for high accuracy 3D machine vision metrology using off-the-shelf TV cameras and lenses. IEEE J Robotics Automation 1987, 3: 323–344.

  24. Zhang Z: A flexible new technique for camera calibration. IEEE Trans Pattern Anal Mach Intell 2000, 22: 1330–1334. 10.1109/34.888718

  25. Kato H, Billinghurst M: Marker tracking and HMD calibration for a video-based augmented reality conferencing system. In Proceedings of the 2nd IEEE and ACM International Workshop on Augmented Reality; 20–21 October 1999. San Francisco; 1999:85–94.

  26. Piegl L, Tiller W: The NURBS Book. 2nd edition. New York: Springer-Verlag; 1997.

  27. Wang J-B, Yau H-T: Real-time NURBS interpolator: application to short linear segments. Int J Adv Manuf Technol 2009, 41: 1169–1185. 10.1007/s00170-008-1564-8

  28. Panait L, Akkary E, Bell RL, Roberts KE, Dudrick SJ, Duffy AJ: The role of haptic feedback in laparoscopic simulation training. J Surg Res 2009, 156: 312–316. 10.1016/j.jss.2009.04.018

Acknowledgements

This research is supported by the Tzu-Chi Dalin General Hospital Research Project: 98-1-7.

Author information

Corresponding author

Correspondence to Yen-Kun Lin.

Additional information

Competing interests

The authors declare that they have no competing interests.

Authors’ contributions

SRL proposed the conception and design of the study and drafted the manuscript. YKL participated in its design and coordination and helped to draft the manuscript. STH participated in the design of the study and performed the analysis. HTY conceived of the study, participated in coordination, and helped draft the manuscript. All authors read and approved the final manuscript.

Shaw-Ruey Lyu, Yen-Kun Lin, Shian-Tang Huang and Hong-Tzong Yau contributed equally to this work.

Electronic supplementary material

Additional file 1: Virtual inspection [http://youtu.be/lKplX2-lc34]. (AVI 9 MB)

Rights and permissions

Open Access This article is published under license to BioMed Central Ltd. This Open Access article is distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

About this article

Cite this article

Lyu, SR., Lin, YK., Huang, ST. et al. Experience-based virtual training system for knee arthroscopic inspection. BioMed Eng OnLine 12, 63 (2013). https://doi.org/10.1186/1475-925X-12-63
