Digital stereophotogrammetry based on circular markers and zooming cameras: evaluation of a method for 3D analysis of small motions in orthopaedic research
- Evgenij Bobrowitsch^{1},
- Christof Hurschler^{1},
- Gavin Olender^{1},
- Christian Plaass^{2},
- Hazibullah Waizy^{2},
- Heino Arnold^{3} and
- Christina Stukenborg-Colsman^{2}
https://doi.org/10.1186/1475-925X-10-12
© Bobrowitsch et al; licensee BioMed Central Ltd. 2011
Received: 19 November 2010
Accepted: 1 February 2011
Published: 1 February 2011
Abstract
Background
Orthopaedic research projects focusing on small displacements in a small measurement volume require a radiation-free, three-dimensional motion analysis system. A stereophotogrammetric motion analysis system can track wireless, small, lightweight markers attached to the objects, so that the disturbance of the measured objects by the marker tracking is kept to a minimum. The purpose of this study was to develop and evaluate a compact, non-position-fixed motion analysis system configured for a small measurement volume and able to zoom while tracking small, round, flat markers with respect to a fiducial marker used for camera pose estimation.
Methods
The system consisted of two web cameras and the fiducial marker placed in front of them. The markers to be tracked were black circles on a white background. The algorithm to detect the centre of the projected circle on the image plane was described and applied. In order to evaluate the accuracy (mean measurement error) and precision (standard deviation of the measurement error) of the optical measurement system, two experiments were performed: 1) inter-marker distance measurement and 2) marker displacement measurement.
Results
In the first experiment, measurement of 10 mm distances showed a total accuracy of 0.0086 mm and a precision of ± 0.1002 mm. In the second experiment, translations from 0.5 mm to 5 mm were measured with a total accuracy of 0.0038 mm and a precision of ± 0.0461 mm. Rotations of 2.25° were measured with a total accuracy of 0.058° and a precision of ± 0.172°.
Conclusions
The description of the non-proprietary measurement device with very good levels of accuracy and precision may provide opportunities for new, cost effective applications of stereophotogrammetrical analysis in musculoskeletal research projects, focusing on kinematics of small displacements in a small measurement volume.
Background
"Three-dimensional (3D) measurements play a vital role in a diversity of industries and disciplines, ranging from the manufacturing and process sectors to healthcare" [1]. Orthopaedic research often focuses on qualitative and quantitative measurements of an object's motion [2, 3]. An image-based 3D motion analysis system can track wireless, small, lightweight markers attached to the surface of an object, so that the strain on ligaments or spatial changes of small bones can be determined with high accuracy. Malicky et al. used stereoradiogrammetry to measure the strain on the glenohumeral capsule by means of spherical markers [4]. Video-based 3D motion analysis systems using black [5] or retroreflective [6] spherical markers were developed for tracking objects in a small (less than 1 m^{3}) measurement volume. In contrast to gait analysis tracking systems configured for a large measurement volume [7], "motion analysis configured for registration within small volumes allows measurement of minuscule displacements with great accuracy" [6].
A possibility to improve motion analysis accuracy was successfully tested by Mössner and co-workers [8]. They used the zoom, tilt and pan of their cameras to track athletes' movements during downhill skiing. The zoom was used to overcome the limited camera resolution, while the tilt and pan kept the tracked object in the field of view of the stationary camera during zooming. In order to perform this task, at least 6 control points [9] had to be visible in each camera frame, as required by the Direct Linear Transformation (DLT) method [10, 11] for determining the intrinsic and extrinsic parameters of a fully projective camera. Once the intrinsic parameters are known, the camera lens distortion can be minimized, which improves the object's spatial information derived from the captured image. In order to reconstruct the 3D positions of markers captured by two or more cameras, the extrinsic parameters of each camera have to be known. These extrinsic parameters describe the geometric relation between the camera and the captured calibration body. The calibration body is not required during subsequent motion tracking when all cameras remain in the same fixed position relative to one another [5, 6]. Otherwise, when the experimental conditions require flexible camera positioning and the camera's intrinsic parameters have been determined (the camera was pre-calibrated), the extrinsic parameters of each camera can be determined using at least 3 [12] or 4 [13] control points. This process is also called "camera pose estimation".
In comparisons of different pose estimation methods, Ansar and Daniilidis [13] and Lepetit et al. [14] showed very good accuracy and robustness of the orthogonal iterations algorithm developed by Lu et al. [15]. The orthogonal iterations algorithm searches for an optimal orthogonal projection of the control points presented as a perspective projection on the image plane. The orthogonality constraint is enforced by using singular value decomposition rather than a specific parameterization of rotations, e.g., Euler angles [15], typical for DLT methods.
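The orthogonality enforcement at the core of the orthogonal iterations algorithm is the SVD-based solution of the absolute orientation (orthogonal Procrustes) problem, applied repeatedly. The following sketch, assuming noise-free, centred correspondences and using Python/NumPy rather than the authors' MATLAB environment, illustrates this single SVD step:

```python
import numpy as np

def orthogonal_procrustes_rotation(P, Q):
    """Find the rotation R (via SVD) that best maps point set P onto Q.

    The orthogonality constraint is enforced through the SVD itself,
    mirroring the idea used by the orthogonal iterations algorithm.
    P, Q: (n, 3) arrays of corresponding 3D points (already centred).
    """
    H = P.T @ Q                       # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection: force det(R) = +1
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    return Vt.T @ D @ U.T

# Demo: recover a known rotation from noise-free correspondences
rng = np.random.default_rng(0)
angle = 0.3
Rz = np.array([[np.cos(angle), -np.sin(angle), 0.0],
               [np.sin(angle),  np.cos(angle), 0.0],
               [0.0, 0.0, 1.0]])
P = rng.standard_normal((6, 3))
P -= P.mean(axis=0)                   # centre the point cloud
Q = P @ Rz.T                          # rotated correspondences
R = orthogonal_procrustes_rotation(P, Q)
```

The `np.sign(...)` guard is what enforces a proper rotation (determinant +1) rather than a reflection, without any Euler angle parameterization.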
A big calibration body used for camera calibration in DLT methods can be replaced with a small fiducial marker when the cameras are pre-calibrated. Fiducial markers are artificial landmarks added to a scene to facilitate locating point correspondences between images, or between images and a known model [16]. In our study, the fiducial marker was defined as an aggregate of coplanar control points used for the camera pose estimation. Coplanar reference objects are especially easy to manufacture and measure [17]. In addition, control point detection based on the centre of a circular target is advantageous because the circle is centrosymmetric and the detection of the circle centre is not sensitive to the thresholding error [18]. Nevertheless, most 3D tracking systems are based on spherical markers because their circular image is almost independent of the viewing direction of the camera [19]. In contrast, the perspective projection of a circular marker on the image plane has an elliptical form, except when the image plane is parallel to the circle plane. The ellipse centre differs from the centre of the projected circle depending on the angle and displacement between the circle surface and the image plane. This effect is known as eccentricity [20]. In order to avoid the systematic geometric image measurement error caused by the eccentricity, its correction is required [18, 20, 21].
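The eccentricity effect can be reproduced numerically: project a tilted circle through a pinhole, fit a general conic to the projected points, and compare the fitted ellipse centre with the true projection of the circle centre. All geometry below (principal distance, tilt, circle pose) is assumed for illustration:

```python
import numpy as np

# Sketch of the eccentricity effect: the centre of the fitted ellipse is
# NOT the perspective projection of the circle centre once the circle
# plane is tilted against the image plane.
d = 1.0                               # principal distance (assumed)
r = 1.0                               # circle radius (assumed)
tilt = np.deg2rad(45)                 # tilt of the circle plane (assumed)
C = np.array([0.0, 0.0, 10.0])        # circle centre in the camera frame
u = np.array([1.0, 0.0, 0.0])         # in-plane basis vectors
v = np.array([0.0, np.cos(tilt), np.sin(tilt)])

t = np.linspace(0, 2 * np.pi, 400, endpoint=False)
P = C + r * (np.outer(np.cos(t), u) + np.outer(np.sin(t), v))
x, y = d * P[:, 0] / P[:, 2], d * P[:, 1] / P[:, 2]   # perspective projection

# Fit a general conic a x^2 + b xy + c y^2 + e x + f y + g = 0 (SVD null vector)
M = np.column_stack([x**2, x*y, y**2, x, y, np.ones_like(x)])
a, b, c, e, f, g = np.linalg.svd(M)[2][-1]

# Ellipse centre from the conic coefficients (gradient of the conic = 0)
xc, yc = np.linalg.solve([[2*a, b], [b, 2*c]], [-e, -f])

proj_centre = d * C[:2] / C[2]        # true projection of the 3D centre: (0, 0)
offset = np.hypot(xc - proj_centre[0], yc - proj_centre[1])
print(offset)
```

For this configuration the offset is small but clearly non-zero, which is exactly the systematic error the corrections in [18, 20, 21] are designed to remove.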
Combining the advantageous aspects of the different techniques mentioned above, the purpose of this study was to develop and evaluate a non-position-fixed motion analysis system configured for a small measurement volume. The system used zoom to track small, round, flat markers with respect to a fiducial marker consisting of four coplanar circles. Additionally, a unique solution for finding the centre of the circle projected onto the image had to be developed and applied. This 3D motion analysis system was specifically configured to measure spatial changes of small bones in the foot region.
Methods
Tracking Device and Image Acquisition
Both cameras were pre-calibrated with a fixed focus and a camera zoom factor of 2.2 using a freely available online tool [23]. A 6.5 × 6.5 cm calibration board was printed with a laser printer as a black-and-white checkerboard presenting a grid of 144 control points. The calibration board was captured within the field of view of each camera in 18 different positions, filling the volume in which the markers to be tracked and the fiducial marker had to be captured. Thus the intrinsic parameters were determined in order to pre-calibrate each camera [23]. Due to the 90° tilted cameras (Figure 1), the measurement volume (approximately 0.1 × 0.1 × 0.1 m) was behind and above the fiducial marker, which was placed in the middle of the lower part of each camera's field of view.
The image acquisition tool was programmed in MATLAB (The MathWorks Inc., Natick, MA, USA). This tool provided a real-time streaming view from both cameras. When the fiducial marker was placed near the tracked objects, two static images were simultaneously acquired from both cameras under the optimal light conditions (150 Watt, Ministudio 606, Multiblitz Dr. Ing. D. A. Mannesmann GmbH & Co KG, Köln, Germany) described in the manual of the cameras used.
Determination of the circle centre projected onto the image-plane
A black circle on a white background was used for the fiducial as well as the object-surface markers. Edge detection between the black and white areas was performed by means of a grey value threshold. An ellipse $\Pi^{i}(c^{i}, a^{i}, b^{i}, \alpha^{i})$ was fitted to this edge, where the superscript $i$ denotes "initial". The centre $c^{i}$ corresponded only approximately to the centre of the projected circle when the ellipse axis ratio $a^{i}/b^{i}$ was not equal to one.
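The thresholding and edge extraction step can be sketched on a synthetic image (the marker position, size, and grey values below are assumed; the authors' MATLAB implementation may differ in detail):

```python
import numpy as np

# Minimal sketch of the edge-detection step: a dark circular marker on a
# white background is segmented with a grey value threshold, and the
# black/white boundary pixels are collected.
H, W = 64, 64
yy, xx = np.mgrid[0:H, 0:W]
img = np.where((xx - 30.0)**2 + (yy - 33.0)**2 <= 10.0**2, 20, 240)  # dark disk

mask = img < 128                      # grey value threshold
# An edge pixel is a foreground pixel with at least one background neighbour
pad = np.pad(mask, 1, constant_values=False)
interior = (pad[:-2, 1:-1] & pad[2:, 1:-1] & pad[1:-1, :-2] & pad[1:-1, 2:])
edge = mask & ~interior

ys, xs = np.nonzero(edge)
centre = (xs.mean(), ys.mean())       # crude first estimate of the centre
print(centre)
```

The mean of the edge pixels gives only a crude first estimate; the paper fits an ellipse to these edge pixels to obtain the initial centre $c^{i}$, which is then refined as described above.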
where d was the distance from O to the image XY-plane. Rotating the minor axis $\overrightarrow{a}$ about the ellipse long axis $\overrightarrow{b}$ by ± γ yielded two vectors, which were used to find the two sections AB' and BA' whose midpoints lay on the rays passing through O and the centres of the real imaged circle $c_{r}$ and the imaginary circle $c_{i}$ (Figure 2). The reverse rotation with the transposed matrix ${R}^{t}(\overrightarrow{n},\beta )$ yielded the sought rays passing through O and the centres of the real imaged and the imaginary circles. The selection criterion for the fiducial marker was that the four real imaged circles were coplanar while their imaginary circles were not. Furthermore, during reconstruction of the 3D marker position from the left and right images, two normal vectors were found for each imaged circle (${\overrightarrow{n}}_{r}$ and ${\overrightarrow{n}}_{i}$, Figure 2, where the subscripts r and i denote "real" and "imaginary"). The real imaged circle normal vectors from the left (l) image, ${\overrightarrow{n}}_{rl}$, and from the right (r) image, ${\overrightarrow{n}}_{rr}$, coincided; the imaginary circle normal vectors ${\overrightarrow{n}}_{il}$ and ${\overrightarrow{n}}_{ir}$ did not.
Determination of the 3D position and scaling factor of the fiducial marker
where E is the principal point of the image, 1 ≤ i ≤ 4, $\overrightarrow{e}$ is the unit vector, and t and h are the lengths of the corresponding vectors. The four orthogonal projection points of the corners of the fiducial quadrangle ($p_{i}$) lay on the rays from E to $r_{i}$. The intersection point of the diagonals of the fiducial quadrangle divided them in known length ratios, which were used to calculate the $p_{i}$.
where $A=\left[\begin{array}{cccc}{x}_{01}& {y}_{01}& 0& 0\\ 0& 0& {x}_{01}& {y}_{01}\\ \vdots & & & \\ {x}_{04}& {y}_{04}& 0& 0\\ 0& 0& {x}_{04}& {y}_{04}\end{array}\right]$, $q=\left[\begin{array}{c}{x}_{11}-{x}_{t}\\ {y}_{11}-{y}_{t}\\ \vdots \\ {x}_{14}-{x}_{t}\\ {y}_{14}-{y}_{t}\end{array}\right]$ and "\" denotes matrix left division. If A were a square matrix, A\q would be roughly the same as A^{-1}q. In our case, however, A is an m-by-n (m = 8, n = 4) matrix with m ≠ n and q is a column vector with m components, so s = A\q is the solution in the least squares sense to the over-determined system of equations As = q.
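In NumPy, the same least squares solution that MATLAB's backslash operator provides for this 8 × 4 system can be obtained with `np.linalg.lstsq` (the numeric values below are assumed for illustration):

```python
import numpy as np

# MATLAB's "\" (matrix left division) for the over-determined system
# A s = q is the least squares solution; in NumPy: np.linalg.lstsq.
A = np.array([[1.0, 2.0, 0.0, 0.0],
              [0.0, 0.0, 1.0, 2.0],
              [3.0, 1.0, 0.0, 0.0],
              [0.0, 0.0, 3.0, 1.0],
              [2.0, 5.0, 0.0, 0.0],
              [0.0, 0.0, 2.0, 5.0],
              [4.0, 2.0, 0.0, 0.0],
              [0.0, 0.0, 4.0, 2.0]])      # same block pattern as the paper's A
s_true = np.array([0.5, -1.0, 2.0, 0.25])
q = A @ s_true                            # consistent right-hand side
s, residuals, rank, _ = np.linalg.lstsq(A, q, rcond=None)
print(s)
```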
In order to reconstruct the matrix R, the property of an orthonormal matrix was used: the sum of the squared elements of each column or row is equal to one. The signs in the third column and row of R were reconstructed by means of another property: the determinant of R is equal to one. This left only two right-handed orthogonal matrices, corresponding to the two possible 3D positions of the fiducial quadrangle. In order to choose the matrix R which described the real imaged position of the fiducial quadrangle, the property of the perspective projection to converge to the perspective projection centre O(0,0,0) was used: the lines passing through the four points of the fiducial marker $p_{i}(x_{pi}, y_{pi}, z_{pi})$ and their perspective projections $r_{i}(x_{ri}, y_{ri}, d)$ had to converge to O, where d was the distance from O to the image XY-plane and the coordinates of $p_{i}$ were calculated as in equation (4).
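The reconstruction can be sketched as follows: given the upper-left 2 × 2 block of a rotation matrix, complete the missing third-column entries from the unit-norm rows, filter the sign combinations by row orthogonality, and obtain the third row via the cross product, which forces det(R) = +1. As stated above, two valid candidates remain (the example angles below are assumed):

```python
import numpy as np

def complete_rotation(R2):
    """Complete a 3x3 rotation matrix from its upper-left 2x2 block.

    Uses the properties named in the text: each row/column of R has unit
    norm, and det(R) = 1. In general two valid completions remain; the
    perspective-convergence check described in the text picks the real one.
    """
    candidates = []
    r13 = np.sqrt(max(0.0, 1.0 - R2[0, 0]**2 - R2[0, 1]**2))
    r23 = np.sqrt(max(0.0, 1.0 - R2[1, 0]**2 - R2[1, 1]**2))
    for s1 in (1.0, -1.0):
        for s2 in (1.0, -1.0):
            row1 = np.array([R2[0, 0], R2[0, 1], s1 * r13])
            row2 = np.array([R2[1, 0], R2[1, 1], s2 * r23])
            if abs(row1 @ row2) > 1e-9:      # rows must be orthogonal
                continue
            row3 = np.cross(row1, row2)      # forces det(R) = +1
            candidates.append(np.vstack([row1, row2, row3]))
    return candidates

# Demo with a known rotation (assumed example angles)
a, b = 0.4, 0.7
Rx = np.array([[1, 0, 0], [0, np.cos(a), -np.sin(a)], [0, np.sin(a), np.cos(a)]])
Ry = np.array([[np.cos(b), 0, np.sin(b)], [0, 1, 0], [-np.sin(b), 0, np.cos(b)]])
R_true = Ry @ Rx
cands = complete_rotation(R_true[:2, :2])
print(len(cands))
```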
3D reconstruction of markers
The 3D position of the fiducial marker with respect to the camera coordinate system was determined by means of the rotation matrix R described above, the scaling factor k and the translation vector t. The two cameras were used to simultaneously acquire two images of the same scene. After rigid body transformation of ${O}_{j}^{c}$ and the perspective projections of the markers ${m}_{jn}^{c}$ into the fiducial marker coordinate system, the 3D positions of the markers with respect to the fiducial marker were reconstructed. The single 3D marker position ${m}_{n}^{f}$ was the estimated intersection point of the two rays from the perspective projection centres ${O}_{j}^{f}$ to the corresponding marker perspective projections ${m}_{jn}^{f}$ [25], where 1 ≤ j ≤ 2, n is an integer between 1 and the number of markers, and the superscripts c and f denote the camera and fiducial marker coordinate systems, respectively.
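For the ray intersection estimate, one common choice — shown here as an assumed sketch, since the exact estimator of [25] may differ — is the midpoint of the shortest segment connecting the two (generally skew) rays:

```python
import numpy as np

def triangulate_midpoint(O1, d1, O2, d2):
    """Estimate the intersection of two rays O_j + t_j * d_j as the midpoint
    of their closest-approach segment (rays are usually skew due to noise)."""
    d1 = d1 / np.linalg.norm(d1)
    d2 = d2 / np.linalg.norm(d2)
    w = O1 - O2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d_, e_ = d1 @ w, d2 @ w
    denom = a * c - b * b             # zero only for parallel rays
    t1 = (b * e_ - c * d_) / denom
    t2 = (a * e_ - b * d_) / denom
    return 0.5 * ((O1 + t1 * d1) + (O2 + t2 * d2))

# Demo: both rays pass exactly through the same point (values assumed)
P = np.array([1.0, 2.0, 3.0])
O1, O2 = np.array([0.0, 0.0, 0.0]), np.array([5.0, 0.0, 0.0])
p = triangulate_midpoint(O1, P - O1, O2, P - O2)
print(p)
```

With noisy projections the two rays no longer meet, and the midpoint of the common perpendicular is a simple symmetric estimate of the marker position.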
Evaluation experiments
Experiment 1: Inter-marker distance measurement
In order to evaluate the accuracy (agreement between the measured and reference values) and precision (closeness of measurement values to each other under similar experimental conditions [26]) of the presented measurement system, an 8 × 8 grid of black circles was printed on a white surface and adhered to a 10 × 10 cm plate. The test distance between adjacent circle centres in the horizontal and vertical directions amounted to 10 mm. Two grids with circles of 1 and 2 mm diameter were prepared to perform the following tests:
Test A - Translating Camera
Test B - Rotating Plate
The grid of circles was captured eight times after the plate was rotated about the X_{obj} axis. The rotation angles were approximately -60, -45, -30, -15, 15, 30, 45 and 60°. The test was repeated five times.
Test C - Rotating Cameras
As in test B, but here the cameras were rotated about the Y_{obj} axis. The rotation angles were approximately -60, -45, -30, -15, 15, 30, 45 and 60°. The test was repeated five times.
Precision test of image processing
where ${\overline{d}}_{ki}$ is the mean value of the i-th inter-marker distance on the grid of circles in the k-th position, measured 10 times ($d_{kij}$).
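One plausible reading of the Root Mean Squares error of equation (7) — pooling the deviations over positions k, inter-marker distances i and repeats j — can be sketched on synthetic data (the exact normalisation in the paper may differ slightly):

```python
import numpy as np

# Synthetic repeated distance measurements: K positions, I inter-marker
# distances, J repeated image-processing runs (all values assumed).
rng = np.random.default_rng(1)
K, I, J = 5, 112, 10
d = 10.0 + 0.005 * rng.standard_normal((K, I, J))   # measured distances in mm

d_mean = d.mean(axis=2, keepdims=True)              # \bar{d}_{ki}
rms = np.sqrt(np.mean((d - d_mean) ** 2))           # pooled RMS deviation
print(rms)
```

For this simulated 0.005 mm noise level the pooled RMS comes out near 0.005 mm, i.e. on the order of the image processing precision reported in the Results.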
System repeatability test
In this test, the influence of rebooting the system on the 112 measured distances between adjacent circle centres in the horizontal and vertical directions was examined. A single position of the grid of circles from test A (the initial position) was captured 10 times. After each capture, the measurement system was shut down and started again. Each time, the zoom and focus had to be adjusted to the default values used during the pre-calibration. The variation in distances was calculated, analogously to the image processing precision test, by means of the Root Mean Squares error (7).
Precision test on cadaveric specimen
Five different pictures of the specimen were acquired by means of the dual cameras and then each was processed 10 times. The variation in all possible distances between the nine markers ($m={C}_{9}^{2}=36$) was calculated similarly to the image processing precision test by means of the Root Mean Squares error (7).
Experiment 2: marker displacement measurement
In this experiment, two plastic plates were used to determine the translational and rotational accuracy and precision when one plate was moved with respect to the other, static plate. Each plate was equipped with four black circles (mounting accuracy 0.005 mm). The circles were located at the corners of a square whose side length amounted to 15 mm. Two plate types were manufactured, with circle sizes of 1 and 2 mm diameter.
Rotation test
In this test, the Z_{obj} axis of the stationary plate was perpendicular to the plate surface and parallel to the pin between the cameras. In the zero position, the non-stationary plate was coplanar with the stationary plate and the circles on the corresponding square sides were collinear. The non-stationary plate was rotated about the X_{obj}, Y_{obj} and Z_{obj} axes by means of a headpiece on a milling unit (Deckel Maho Pfronten GmbH, Germany) used for precise rotation with a step of 2.25 ± 0.025° through a range of 90° (± 45° from the zero position).
The magnitude of rotation between the stationary and non-stationary plates was calculated by detecting the rotation matrix describing the rotation between the plates [27]. From the rotation matrix, "the attitude vector" $\overline{a}=\overline{n}\alpha$ was calculated [28], where $\overline{n}$ is the unit vector about which the scalar rotation α occurs. The attitude vector was then orthogonally decomposed onto the X_{obj}, Y_{obj} and Z_{obj} axes [29] of the stationary plate.
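The attitude vector extraction can be sketched directly from the rotation matrix: the angle follows from the trace, the unit axis from the skew-symmetric part, and the decomposition onto the plate axes is simply the vector's components (the demo rotation below uses the milling unit's 2.25° step):

```python
import numpy as np

# Attitude (axis-angle) vector a = n * alpha from a rotation matrix R.
def attitude_vector(R):
    alpha = np.arccos(np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0))
    if np.isclose(alpha, 0.0):
        return np.zeros(3)
    # Rotation axis from the skew-symmetric part of R
    n = np.array([R[2, 1] - R[1, 2],
                  R[0, 2] - R[2, 0],
                  R[1, 0] - R[0, 1]]) / (2.0 * np.sin(alpha))
    return n * alpha

# Demo: a 2.25 degree rotation about the Z axis (the milling-unit step)
alpha = np.deg2rad(2.25)
Rz = np.array([[np.cos(alpha), -np.sin(alpha), 0.0],
               [np.sin(alpha),  np.cos(alpha), 0.0],
               [0.0, 0.0, 1.0]])
a = attitude_vector(Rz)
print(np.rad2deg(a))                   # components along X, Y, Z axes
```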
Translation test
In order to determine the accuracy and precision of translational measurements, the non-stationary plate was translated from the zero position along each of the X_{obj}, Y_{obj} and Z_{obj} axes by 0.5, 1 and 5 mm. The translations were performed by means of a translational manipulator (ThorLabs Inc. Europe, Karlsfeld, Germany). Its accuracy (0.005 mm) and precision (± 0.002 mm) were characterized using laser interferometry in a previous study [30]. The test was repeated five times.
Statistical analysis
The error value was calculated as the difference between the reference value and the measured value. All collected error values were examined with the Jarque-Bera test to verify whether the data were normally distributed. According to this test, many of the error value sets were significantly (p < 0.05) non-normally distributed. Therefore, non-parametric statistical methods were applied. The accuracy of the presented measurement system was represented by the mean and median error [31]. The precision was calculated as a standard deviation [31] or as the Root Mean Squares error (7).
The median error was presented because of the applied non-parametric methods. The comparison of medians was performed by means of the Kruskal-Wallis test. The variance of the error values was compared by means of Levene's test. All statistics were performed using MATLAB (The MathWorks Inc., Natick, MA, USA). The thresholds for significant difference and significant sameness [32] were set at p < 0.05 and p > 0.95, respectively.
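The Jarque-Bera statistic used for the normality check can be written out by hand from the sample skewness and kurtosis (synthetic data below; the study used MATLAB's built-in test). Under normality, JB approximately follows a chi-squared distribution with 2 degrees of freedom, so JB > 5.99 rejects normality at p < 0.05:

```python
import numpy as np

def jarque_bera(x):
    """Jarque-Bera statistic: JB = n/6 * (S^2 + (K - 3)^2 / 4)."""
    x = np.asarray(x, dtype=float)
    n = x.size
    z = x - x.mean()
    s2 = np.mean(z ** 2)
    skew = np.mean(z ** 3) / s2 ** 1.5       # sample skewness S
    kurt = np.mean(z ** 4) / s2 ** 2         # sample kurtosis K
    return n / 6.0 * (skew ** 2 + (kurt - 3.0) ** 2 / 4.0)

rng = np.random.default_rng(2)
jb_normal = jarque_bera(rng.standard_normal(2000))    # should be small
jb_skewed = jarque_bera(rng.exponential(size=2000))   # clearly non-normal
print(jb_normal, jb_skewed)
```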
Results
Experiment 1: detection of Inter-marker distance
Tests A, B and C - 10 mm distance detection
Detection of the 10 mm inter marker distance
| Marker | Test A error (Z_{obj}) | Test B error (X_{obj}) | Test C error (Y_{obj}) |
|---|---|---|---|
| ø1 mm | 0.011 (0.002) ± 0.094 | 0.009 (-0.007) ± 0.110 | 0.007 (-0.002) ± 0.096 |
| ø2 mm | 0.023 (0.013) ± 0.098 | 0.007 (-0.014) ± 0.103 | -0.006 (-0.02) ± 0.098 |

Values in mm: mean (median) ± standard deviation.
Image processing precision and repeatability tests
The image processing precision test revealed a Root Mean Squares error for the 1 and 2 mm markers at the level of 0.0044 and 0.0051 mm, respectively. Levene's multiple-sample test showed a significantly (p < 0.001) higher variance of the mean distance deviation for the 2 mm markers in comparison to the 1 mm markers.
The system repeatability test detected a Root Mean Squares error for both marker sizes at the level of 0.013 mm. Levene's test showed that the variance of the distance detection errors for the two marker sizes was significantly the same (p = 0.98).
The image processing precision test on the foot specimen showed the Root Mean Squares error at the level of 0.0035 mm.
Experiment 2: measurement of displacement
Rotation test
Rotation test
| Marker | Rotation error X_{obj} | Rotation error Y_{obj} | Rotation error Z_{obj} |
|---|---|---|---|
| ø1 mm | 0.08 (0.08) ± 0.159 | 0.082 (0.123) ± 0.281 | 0.001 (-0.006) ± 0.035 |
| ø2 mm | 0.105 (0.092) ± 0.160 | 0.080 (0.098) ± 0.192 | 0.000 (-0.008) ± 0.062 |

Values in degrees: mean (median) ± standard deviation.
Translation test
Translation test
| Marker | Transl. (mm) | Translation error X_{obj} | Translation error Y_{obj} | Translation error Z_{obj} |
|---|---|---|---|---|
| ø1 mm | 0.5 | -0.001 (-0.003) ± 0.014 | 0.024 (0.019) ± 0.031 | 0.009 (0.006) ± 0.058 |
| ø1 mm | 1 | -0.006 (-0.007) ± 0.017 | 0.033 (0.037) ± 0.030 | -0.011 (0.001) ± 0.062 |
| ø1 mm | 5 | -0.027 (-0.026) ± 0.014 | 0.032 (0.046) ± 0.091 | 0.021 (0.023) ± 0.065 |
| ø2 mm | 0.5 | -0.002 (-0.006) ± 0.016 | 0.014 (0.011) ± 0.021 | -0.019 (-0.026) ± 0.032 |
| ø2 mm | 1 | -0.003 (-0.004) ± 0.012 | 0.008 (0.01) ± 0.013 | 0.003 (-0.0003) ± 0.057 |
| ø2 mm | 5 | -0.045 (-0.045) ± 0.011 | -0.013 (-0.016) ± 0.034 | 0.052 (0.046) ± 0.041 |

Values in mm: mean (median) ± standard deviation.
Discussion
This study demonstrated that a 3D stereophotogrammetric system based on the tracking of flat round markers can accurately measure distances and movements within a small measurement volume. Algorithms to detect the centre of a projected circle and to estimate the camera pose using the fiducial marker were presented. The evaluation tests were designed to assess the measurement accuracy and precision for distances and movements expected during the tracking of markers fixed on small bones or ligaments. Although the presented system processes only static images, it costs a very small fraction of comparable systems that use proprietary vendor-specific hardware. Future goals include enabling the presented measurement system to extract the tracked markers from a video stream.
The developed algorithm for detecting the projected circle centre functioned with a single projected circle, while other algorithms require, for example, concentric circles [33] or coplanar circles [18, 20, 21] for the detection. On the other hand, the presented algorithm required the principal point and principal distance to be known from the pre-calibration. The quality of the pre-calibration could play a decisive role in the accuracy and precision [34] of the presented measurement system when the measurement conditions were optimal. This was especially important during the camera pose estimation using the fiducial marker, where the pose estimation was optimized by means of the direct transformation of the fiducial marker perspective projection into its orthogonal projection, the least squares algorithm used to solve equation 5, and the reconstruction of the rotation matrix using the orthogonal matrix constraints.
The remaining inaccuracies of the camera pre-calibration could explain the reduction in precision when larger distances were measured. This is corroborated by the smaller image processing imprecision (± 0.005 mm) in comparison to the precision values ranging from ± 0.011 to ± 0.11 mm when distances and movements were measured. The system repeatability (± 0.013 mm) also influenced the precision of the distance and movement measurements because of small discrepancies that occurred while adjusting the zoom and focus values. Lujan et al. also reported that "larger translations/rotations reduced kinematic accuracy" [5].
In the presented study, the precision was calculated as the standard deviation or as the Root Mean Squares error (equation 7) because of the similarity of their calculation. These two parameters are therefore more comparable with each other than with the mean standard deviation [6] or the two standard deviations [5] used in the related studies [5, 6].
The accuracy values were very close to zero, ranging from -0.045 to 0.052 mm for translations and from -0.008 to 0.105° for rotations. The presented system showed accuracy comparable to that for similar displacement measurements in the study of Lujan et al. (0.034 mm for translations and 0.132° for rotations) [5] and in the study of Everaert et al., where the accuracy ranged from 0 to 0.05 mm [6].
The measurement with the flat round markers was limited by the angle λ between the normal vector to the marker plane and the camera view axis (0° ≤ λ < 90°). When this angle came too close to 90°, it became difficult to fit the ellipse of the projected circle because the ellipse became too slim. Therefore, the axis ratio of the ellipse could serve as a criterion for the critical value of the angle λ. This axis ratio criterion was set at 0.2 (corresponding approximately to λ = 78.5°). Therefore, the angle range between Z_{obj} and the pin with the fiducial marker, which was mounted between the cameras, was set at ± 60° (Figures 1 and 3, tests B and C).
It has been stated that "The accuracy of the target location deteriorates if the number of edge pixels compared to central pixels increases, because of the uncertain grey values of the edge" [19]. This phenomenon occurred when the marker size became smaller [5] and, for the flat round markers, when the angle λ became bigger. Owing to the use of zoom and good camera resolution, the markers of 1 and 2 mm diameter showed significant agreement in precision in tests A, B and C. "Zoom lenses are used extensively in computer vision to overcome the limited resolution" [35]. Both cameras of the presented system were calibrated for fixed zoom and focus settings because Wiley and Wong noted in their study: "There were significant changes in the distortion characteristics with changes in the focal setting. However, the pattern of change for a given camera-lens combination was very systematic and stable over time" [35], which was confirmed by the good precision values (± 0.013 mm) of the repeatability test.
The black circle on a white background, a colour combination advantageous for edge detection, remained usable during the test on the cadaveric foot specimen. The colours of the connective tissues surrounding the markers therefore did not deteriorate the precision of the presented measurement system. Nevertheless, the use of flat round retroreflective markers may be more advantageous because of the possible marker size reduction.
The measurement of the displacements in the second experiment showed the typical distribution of measurement errors for this camera setup [5, 31]. Given this distribution, the presented measurement system performed translation measurements best in the XY_{obj}-plane and rotation measurements best when rotations occurred about the Z_{obj} axis.
Conclusions
The study demonstrated that the compact 3D stereophotogrammetric system, based on tracking flat round markers within a small measurement volume with respect to a fiducial marker, can accurately measure distances and movements. The evaluation experiments on the measurement of 10 mm distances showed a total accuracy of 0.0086 mm (mean error) and a precision of ± 0.1002 mm (standard deviation). Translations from 0.5 mm to 5 mm were measured with a total accuracy of 0.0038 mm and a precision of ± 0.0461 mm. Rotations of 2.25° were measured with a total accuracy of 0.058° and a precision of ± 0.172°. These levels of accuracy and precision may provide opportunities for new applications of stereophotogrammetric analysis in orthopaedic research projects focusing on small displacements in a small measurement volume.
Declarations
Acknowledgements
We are grateful to the German Research Foundation for the financial support (DFG-HU 873_2-1), and to Joerg Viering, Michael Breyvogel and their colleagues from the machine shop for technical support. We thank Christopher Müller, who helped with the design of the figures.
References
- Bogue R: Three-dimensional measurements: A review of technologies and applications. Sensor Review 2010, 30: 102–106. doi:10.1108/02602281011022670
- Malicky DM, Kuhn JE, Frisancho JC, Lindholm SR, Raz JA, Soslowsky LJ: Neer Award 2001: nonrecoverable strain fields of the anteroinferior glenohumeral capsule under subluxation. J Shoulder Elbow Surg 2002, 11: 529–540. doi:10.1067/mse.2002.127093
- Phatak NS, Sun Q, Kim SE, Parker DL, Kent Sanders R, Veress AI, Ellis BJ, Weiss JA: Noninvasive determination of ligament strain with deformable image registration. Annals of Biomedical Engineering 2007, 35: 1175–1187. doi:10.1007/s10439-007-9287-9
- Malicky DM, Soslowsky LJ, Kuhn JE, Bey MJ, Mouro CM, Raz JA, Liu CA: Total strain fields of the antero-inferior shoulder capsule under subluxation: a stereoradiogrammetric study. J Biomech Eng 2001, 123: 425–431. doi:10.1115/1.1394197
- Lujan TJ, Lake SP, Plaizier TA, Ellis BJ, Weiss JA: Simultaneous measurement of three-dimensional joint kinematics and ligament strains with optical methods. Journal of Biomechanical Engineering 2005, 127: 193–197. doi:10.1115/1.1835365
- Everaert DG, Spaepen AJ, Wouters MJ, Stappaerts KH, Oostendorp RA: Measuring small linear displacements with a three-dimensional video motion analysis system: determining its accuracy and precision. Arch Phys Med Rehabil 1999, 80: 1082–1089. doi:10.1016/S0003-9993(99)90065-5
- Ehara Y, Fujimoto H, Miyazaki S, Tanaka S, Yamamoto S: Comparison of the performance of 3D camera systems. Gait and Posture 1995, 3: 166–169. doi:10.1016/0966-6362(95)99067-U
- Mössner M, Kaps P, Nachbauer W: A method for obtaining 3-D data in alpine skiing using pan-and-tilt cameras with zoom lenses. ASTM Special Technical Publication 1996, 1266: 155–164.
- Chen L: An investigation on the accuracy of three-dimensional space reconstruction using the direct linear transformation technique. Journal of Biomechanics 1994, 27: 493–500. doi:10.1016/0021-9290(94)90024-8
- Abdel-Aziz YI, Karara HM: Direct Linear Transformation from Comparator Coordinates into Object Space Coordinates in Close-Range Photogrammetry. ASP Symposium on Close Range Photogrammetry 1971, 1–18.
- Hatze H: High-precision three-dimensional photogrammetric calibration and object space reconstruction using a modified DLT-approach. J Biomech 1988, 21: 533–538. doi:10.1016/0021-9290(88)90216-3
- Haralick BM, Lee CN, Ottenberg K, Nölle M: Review and analysis of solutions of the three point perspective pose estimation problem. International Journal of Computer Vision 1994, 13: 331–356. doi:10.1007/BF02028352
- Ansar A, Daniilidis K: Linear pose estimation from points or lines. IEEE Transactions on Pattern Analysis and Machine Intelligence 2003, 25: 578–589. doi:10.1109/TPAMI.2003.1195992
- Lepetit V, Moreno-Noguer F, Fua P: EPnP: An accurate O(n) solution to the PnP problem. International Journal of Computer Vision 2009, 81: 155–166. doi:10.1007/s11263-008-0152-6
- Lu CP, Hager GD, Mjolsness E: Fast and globally convergent pose estimation from video images. IEEE Transactions on Pattern Analysis and Machine Intelligence 2000, 22: 610–622. doi:10.1109/34.862199
- Fiala M: Designing highly reliable fiducial markers. IEEE Transactions on Pattern Analysis and Machine Intelligence 2010, 32: 1317–1324. doi:10.1109/TPAMI.2009.146
- Triggs B: Camera pose and calibration from 4 or 5 known 3D points. In 1999. Kerkyra, Greece; 278–284.
- Liu Q, Su H: Correction of the asymmetrical circular projection in DLT camera calibration. Sanya, Hainan; 2008: 344–348.
- Trinder JC, Jansa J, Huang Y: An assessment of the precision and accuracy of methods of digital target location. ISPRS Journal of Photogrammetry and Remote Sensing 1995, 50: 12–20. doi:10.1016/0924-2716(95)98211-H
- Ahn SJ, Warnecke HJ, Kotowski R: Systematic geometric image measurement errors of circular object targets: Mathematical formulation and correction. Photogrammetric Record 1999, 16: 485–502. doi:10.1111/0031-868X.00138
- Heikkilä J: Geometric camera calibration using circular control points. IEEE Transactions on Pattern Analysis and Machine Intelligence 2000, 22: 1066–1077.
- Bobrowitsch E, Imhauser C, Graichen H, Durselen L: Evaluation of a 3D object registration method for analysis of humeral kinematics. J Biomech 2007, 40: 511–518. doi:10.1016/j.jbiomech.2006.02.016
- Camera calibration toolbox for matlab [http://www.vision.caltech.edu/bouguetj/calib_doc/]
- Horn BKP, Hilden HM, Negahdaripour S: Closed-form solution of absolute orientation using orthonormal matrices. J Opt Soc Am A 1988, 5: 1127–1135. doi:10.1364/JOSAA.5.001127
- Köhler M: Vision Based Remote Control in Intelligent Home Environments. In 3D Image Analysis and Synthesis '96. Edited by: B Girod HN-PS. University of Erlangen-Nürnberg/Germany: Inx-Verlag; 1996: 147–154.
- Maletsky LP, Sun J, Morton NA: Accuracy of an optical active-marker system to track the relative motion of rigid bodies. J Biomech 2007, 40: 682–685. doi:10.1016/j.jbiomech.2006.01.017
- Soderkvist I, Wedin PA: Determining the movements of the skeleton using well-configured markers. J Biomech 1993, 26: 1473–1477. doi:10.1016/0021-9290(93)90098-Y
- Spoor CW, Veldpaus FE: Rigid body motion calculated from spatial co-ordinates of markers. J Biomech 1980, 13: 391–393. doi:10.1016/0021-9290(80)90020-2
- Woltring HJ: 3-D attitude representation of human joints: a standardization proposal. J Biomech 1994, 27: 1399–1414. doi:10.1016/0021-9290(94)90191-0
- Seehaus F, Emmerich J, Kaptein BL, Windhagen H, Hurschler C: Experimental analysis of Model-Based Roentgen Stereophotogrammetric Analysis (MBRSA) on four typical prosthesis components. J Biomech Eng 2009, 131: 041004. doi:10.1115/1.3072892
- Wiles A, Thompson D, Frantz D: Accuracy assessment and interpretation for optical tracking systems. In Medical Imaging: Visualization, Image-Guided Procedures, and Display. Proceedings of the SPIE. Edited by: Galloway, Robert L Jr. 2004, 5367: 421–432.
- Lindsay RM: Reconsidering the status of tests of significance: An alternative criterion of adequacy. Accounting, Organizations and Society 1995, 20: 35–53. doi:10.1016/0361-3682(93)E0004-Z
- Kim JS, Gurdjos P, Kweon IS: Geometric and algebraic constraints of projected concentric circles and their applications to camera calibration. IEEE Transactions on Pattern Analysis and Machine Intelligence 2005, 27: 637–642. doi:10.1109/TPAMI.2005.80
- Xiaopeng L, Faig W: Digital camera calibration: a summary of current methods. Geomatics Info Magazine 1997, 11: 40–41.
- Wiley AG, Wong KW: Geometric calibration of zoom lenses for computer vision metrology. Photogrammetric Engineering & Remote Sensing 1995, 61: 69–74.
Copyright
This article is published under license to BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.