EO-36 Article Accepted by Referee
10º Congreso Nacional de Ingeniería Electromecánica y de Sistemas, November 26-30, 2007; México, D.F.
Developing a dynamic optical 3D measurement system for measurement of respiratory patterns

M. Bandala¹, M. J. Joyce¹
¹Department of Engineering, Lancaster University, Lancaster LA1 4YR, UK
Tel: (44) 1524 593326; Fax: (44) 1524 5381707; E-mail: [email protected]
Abstract – In this paper we present a scanning system capable of acquiring the shape of 3D objects. A line of light is projected onto the physical target, and the image from a video camera is analysed to estimate the position of the surface profile, so that the shape and volume of the object can be derived. The system scans objects at reasonable speeds and creates good-resolution images of the scanned object. The technique of laser triangulation is implemented with a high-resolution camera, a laser diode, and electronics incorporated into a small sensor package that rotates from a fixed position; the resultant data are transmitted to a PC. This 3D system can be used as an auxiliary tool for radiotherapy gating methods, which are typically performed in one dimension.
Keywords – Surface scanning, optical measurement, laser triangulation.
I. INTRODUCTION

In order to optimize external-beam conformal radiotherapy, patient movement during treatment must be taken into account. For treatment on the upper torso, the target organs are known to move substantially due to patient respiration [1]. When chest motion is present during a radiotherapy procedure, physicians usually require a method to monitor the breathing patterns in order to deliver radiation more accurately to the moving targets. Many of the available techniques use surrogate breathing signals taken from patients by systems based on sensors such as thermocouples, thermistors or strain gauges [2]. Another common technique is the combination of infrared-sensitive cameras with reflective markers mounted on the abdomen of the patient, as well as audio or visual prompting methods that instruct the patient to breathe in and out at periodic intervals in order to deduce the patient's own breathing patterns [3]–[5]. Some of these methods can be complex, time consuming, and very expensive. Most importantly, some patients find them uncomfortable and difficult to tolerate. For this reason, we propose the development of a different system that could be used to track and predict organ motion using a non-invasive technique [6].

II. SCANNING PRINCIPLE

The laser beam is aimed at the object of interest, and a camera grabs an image to enable the software to analyse it and find the laser line, in order to estimate the 3D position of all the points illuminated by the laser. Some thresholding has to be done to eliminate any remaining noise in the picture. The enhanced image is used to triangulate the points, or pixels, in space. Finally, a computer program meshes the points and creates a 3D view of the object of interest. Figure 1 shows the basic configuration of the system: a line laser and an IEEE 1394 camera aimed at the person to be scanned, with an interface board linked to a PC over RS232.

Figure 1. Motion tracking system basic setup.

The main setup of the system is shown in figure 2, where the classic triangulation method is deployed. For simplicity, let us derive the equations for the tracking of a single point in space, where C is a matrix that represents the camera pixel direction vectors, d is the laser position (its distance with respect to the camera), L1 is laser beam vector 1 and L2 is laser beam vector 2. The magnitude of the vector C is in reality the 3D position of our point of interest, so

\mathbf{C} = \mathbf{d} + \mathbf{L}_1 + \mathbf{L}_2 \qquad (1)
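The threshold-and-search step described above (suppress noise, then locate the laser line in every image row) can be sketched as follows. This is a minimal illustrative sketch assuming a greyscale frame held in a NumPy array; the function name, threshold value and array layout are our assumptions, not the authors' implementation.

```python
import numpy as np

def find_laser_line(image, threshold=60):
    """Return, for every image row, the column of the laser stripe.

    `image` is a 2-D greyscale array (rows x columns). Pixels below
    `threshold` are treated as noise and discarded, mirroring the
    thresholding step described in the text. Rows where no pixel
    survives the threshold yield -1 (no laser found on that row).
    """
    cleaned = np.where(image >= threshold, image, 0)  # suppress noise
    cols = np.argmax(cleaned, axis=1)                 # brightest pixel per row
    # argmax returns 0 when a whole row is zero; mark those rows invalid
    cols[cleaned.max(axis=1) == 0] = -1
    return cols

# Tiny synthetic frame: a bright vertical stripe at column 3
frame = np.zeros((4, 6), dtype=np.uint8)
frame[:, 3] = 200
print(find_laser_line(frame))  # -> [3 3 3 3]
```

The per-row column indices produced here are exactly the (m, n) pixel positions that the triangulation equations below turn into 3D points.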
Figure 2. Motion tracking system basic setup.

Note that C is modelled in matrix form because it represents all the vectors associated with the camera pixels that make up the camera image; this is better explained by figures 3 and 4. Of course the triangulation can be solved by simple algebraic equations; however, all the components of the C vectors must be known, and these components depend on the position of the associated pixel.

Figure 3. Camera pixel vectors representation.

Figure 4. Position of the camera pixel vectors in relation to the X-Z plane.

III. OPTICAL CONSIDERATIONS

During this study we found that the camera's angle of view is crucial for finding the components of the C vectors. The horizontal and vertical angles of view Φ and φ, which are the angles formed by the two lines from the secondary principal point to the image sensor (figure 5), were obtained by applying the formulae given below:

\Phi = 2\arctan\left(\frac{D_1}{2f}\right) \qquad (2)

\varphi = 2\arctan\left(\frac{D_2}{2f}\right) \qquad (3)

where D1 and D2 are the actual horizontal and vertical sizes (in micrometres) of the image sensor and f is the focal length. In some cases D1 ≠ D2, and therefore Φ ≠ φ. This had to be done since the angles of interest Φ and φ are not a standard camera attribute provided by manufacturers.

Figure 5. Principle to obtain Φ and φ.

If (1) is expressed in vector form, it is still necessary to find the component angles of the vectors C, L1 and L2:

\mathbf{C} = \begin{bmatrix} x \\ y \\ z \end{bmatrix} = d \begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix} + \mathbf{L}_1 + \mathbf{L}_2 \qquad (4)

Figure 6 is the representation of the coordinate system associated with the laser. The laser vectors L1 and L2 can then be expressed in terms of the known angle θ:

\mathbf{L}_1 = L_1 \begin{bmatrix} 0 \\ 0 \\ 1 \end{bmatrix} \qquad (5)

\mathbf{L}_2 = L_2 \begin{bmatrix} -\cos\theta \\ \sin\theta \\ 0 \end{bmatrix} \qquad (6)

Figure 7 shows the relationship between the vector component angles α, β, γ and their projection angles θ1, θ2 and θ3.
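Equations (2) and (3) are easy to evaluate directly. The sketch below is illustrative; the sensor dimensions and focal length are hypothetical example values, not the camera actually used in the paper.

```python
import math

def angle_of_view(sensor_size_um, focal_length_um):
    """Widest angle of view from sensor size and focal length, as in (2)/(3)."""
    return 2.0 * math.atan(sensor_size_um / (2.0 * focal_length_um))

# Hypothetical example: a 1/3-inch sensor (4800 um x 3600 um) with a 6 mm lens
phi_h = math.degrees(angle_of_view(4800, 6000))  # horizontal angle, Phi
phi_v = math.degrees(angle_of_view(3600, 6000))  # vertical angle, phi
print(round(phi_h, 1), round(phi_v, 1))  # -> 43.6 33.4
```

As the text notes, D1 ≠ D2 for most sensors, so the horizontal and vertical angles must be computed separately.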
Figure 6. Laser coordinate system.

Figure 7. Coordinate system point P(x,y,z) and angles.

Finding the direction vectors from an image is done by scanning every horizontal pixel line in the image and searching for the pixel with the laser projection on it; for every image z-position, an associated x-position is sought. For a given vector Cm,n, the projection angles can be found from (7), (8) and (9):

\theta_1 = 90^\circ + \Phi\left(\frac{m-1}{T_c-1} - \frac{1}{2}\right) \qquad (7)

\theta_2 = 90^\circ + \varphi\left(\frac{n-1}{T_r-1} - \frac{1}{2}\right) \qquad (8)

\theta_3 = \arctan\left[\left(\frac{n-1}{T_r-1} - \frac{1}{2}\right) \bigg/ \left(\frac{m-1}{T_c-1} - \frac{1}{2}\right)\right] \qquad (9)

where
Φ – camera widest horizontal angle of view;
φ – camera widest vertical angle of view;
m – column associated to the x-axis;
Tc – total number of pixel columns in the screen;
n – row associated to the z-axis;
Tr – total number of pixel rows in the screen.

Once the projection angles are known, the component angles are found with (10), (11) and (12):

\tan\alpha = \csc\theta_3 \cot\theta_1 \qquad (10)

\tan\beta = \csc\theta_3 \cot\theta_2 \qquad (11)

\tan\gamma = \csc\theta_2 \cot\theta_1 \qquad (12)

Expression (1) can now be represented in a form that can be solved:

C \begin{bmatrix} \cos\alpha \\ \cos\beta \\ \cos\gamma \end{bmatrix} = d \begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix} + L_1 \begin{bmatrix} 0 \\ 0 \\ 1 \end{bmatrix} + L_2 \begin{bmatrix} -\cos\theta \\ \sin\theta \\ 0 \end{bmatrix} \qquad (13)

so

C = \frac{d\sin\theta}{\cos\alpha\sin\theta + \cos\beta\cos\theta} \qquad (14)

Note that θ here is not any of the projection angles θ1, θ2 or θ3 but the angle of the laser beam (see figure 6). The practical implementation of a prototype is shown in figure 8 and the final implementation in figure 9.

Figure 8. System prototype.
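The chain from a laser-lit pixel (m, n) to a 3D point can be sketched numerically. This is our own illustrative reading of the geometry, not the authors' code: we assume the camera sits at the origin looking along the y-axis, the laser sits a distance d away on the x-axis with its beam at angle θ to that axis (figures 2 and 6), and we compute the direction cosines of C directly from the pixel's horizontal and vertical angles rather than through θ1–θ3. The function name and all numeric values are assumptions.

```python
import math

def pixel_to_point(m, n, Tc, Tr, Phi, phi, d, theta):
    """Range a laser-lit pixel (column m, row n) into a 3-D point.

    Camera at the origin looking along +y; laser at distance d on the
    x-axis, beam at angle theta from that axis. Phi and phi are the
    horizontal and vertical angles of view in radians.
    """
    # Pixel angular offsets from the optical axis (a linear reading of (7)-(8))
    h = Phi * ((m - 1) / (Tc - 1) - 0.5)   # horizontal angle
    v = phi * ((n - 1) / (Tr - 1) - 0.5)   # vertical angle

    # Direction cosines of the camera ray C
    norm = math.sqrt(1.0 + math.tan(h) ** 2 + math.tan(v) ** 2)
    cos_a = math.tan(h) / norm   # with the x-axis (alpha)
    cos_b = 1.0 / norm           # with the y-axis (beta)
    cos_g = math.tan(v) / norm   # with the z-axis (gamma)

    # Magnitude of C from the triangulation closure, as in (14)
    C = d * math.sin(theta) / (cos_a * math.sin(theta) + cos_b * math.cos(theta))
    return (C * cos_a, C * cos_b, C * cos_g)

# Centre pixel of a 640x480 image, laser 300 mm away, beam at 60 degrees:
# the ray straight ahead meets the beam at y = 300*sin60/cos60 ~ 519.6 mm
x, y, z = pixel_to_point(320.5, 240.5, 640, 480, math.radians(43.6),
                         math.radians(33.4), 300.0, math.radians(60))
```

For the centre pixel the camera ray is the y-axis itself, and the intersection with the laser beam can be checked by hand, which is a useful sanity test of any triangulation geometry before scanning real objects.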
Figure 9. Lancaster University 3D Scanner.
IV. ALGORITHM

Finding the direction vectors is done by scanning every horizontal line in the image and looking for the laser projection. That is, for every line n, an associated illuminated position (or column number m) is required. The camera vector Cm,n is found by using (14). Since d, Φ, φ, Tc and Tr are known variables, for a position (m, n) the angles θ1, θ2 and θ3 are needed to obtain α, β and γ. The algorithm to fulfil this task is illustrated in figure 10.
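The row-by-row loop just described can be sketched as follows. This is an illustrative sketch, not the authors' LabVIEW implementation: `range_pixel` stands in for any pixel-to-point function built from equations (7)–(14), and the threshold value is an assumption.

```python
import numpy as np

def scan_frame(image, range_pixel, threshold=60):
    """One pass of the figure-10 loop: for every row n of the frame,
    find the laser-lit column m and range it into a 3-D point.

    `range_pixel(m, n)` is any pixel-to-point function; rows without
    a laser hit are simply skipped.
    """
    points = []
    for n, row in enumerate(image):
        bright = np.where(row >= threshold)[0]   # candidate laser pixels
        if bright.size == 0:
            continue                             # no laser on this row
        m = int(bright[np.argmax(row[bright])])  # brightest candidate
        points.append(range_pixel(m, n))
    return points

# Demo with a dummy ranger that just echoes the pixel position
frame = np.zeros((3, 8), dtype=np.uint8)
frame[0, 2] = frame[1, 3] = frame[2, 3] = 255
cloud = scan_frame(frame, lambda m, n: (m, n))
print(cloud)  # -> [(2, 0), (3, 1), (3, 2)]
```

Cycling this loop while stepping the laser angle θ yields the many single scanned lines that are later meshed into a complete 3D shape.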
Figure 10. Algorithm to obtain C.

V. RESULTS

Figure 11 shows how the laser line is projected over a dummy face. The camera acquires single images and the software finds the vector positions for the laser-illuminated pixels. A program based on LabVIEW™ generates the x, y and z values and traces them in a 3D graph. It is possible to acquire and trace an entire 3D object by changing the position of the laser beam. This process has to be cycled to mesh many single scanned lines and draw a complete 3D shape.

Figure 11. 3D scanning of a dummy face.

The calibration of the system was performed with the method proposed in [7]. Figure 12 shows the scan of a box marked with squares which are at a known distance from each other. The accuracy of the system was calibrated by comparing the actual square distances with the ones scanned by the system.

Figure 12. Top – Calibration box. Bottom – Scan of the square-marked box during the calibration process.
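The accuracy check described above (known square spacings versus scanned ones) reduces to a simple error statistic. The sketch below is illustrative; the spacing values are hypothetical, and RMS error is our choice of metric, since the paper does not state which discrepancy measure was used.

```python
import math

def calibration_error(known_mm, measured_mm):
    """RMS discrepancy between the known square spacings on the
    calibration box and the spacings recovered from the scan."""
    diffs = [k - m for k, m in zip(known_mm, measured_mm)]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

# Hypothetical numbers: squares printed 20 mm apart, scanner estimates
known = [20.0, 20.0, 20.0, 20.0]
measured = [19.8, 20.3, 20.1, 19.9]
print(round(calibration_error(known, measured), 3))  # -> 0.194
```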
Solid static bodies, such as a dummy head, are easily scanned with good accuracy; the scanning speed to obtain the image in figure 11 was approximately 4.5 seconds. A method for dynamic analysis of chest wall motion does not require the level of detail achieved here; therefore the resolution and the θ-angle stepping can be modified so that a section of interest of the human chest can be scanned much faster.
On this basis a statistical model can be constructed, incorporating predictive variables and derived constants that can explain the volume changes when scanning a breathing chest. It is worth mentioning that the faster the respiratory cycle, the more difficult the scanning process becomes, and the accuracy of predictions based on previous data will be reduced.
VI. CONCLUSION

A 3D method to monitor respiratory motion, as an alternative to the current 1D methods used in gated radiotherapy, has been presented. This optical system has considerable potential for rapid and accurate assessment of chest wall movements during gated radiotherapy, for example in the prediction of the position of internal organs based on external measurements. It allows complex movements to be followed on a within-breath basis, which could be related to muscle activity and respiratory pressures, and gives a more detailed view of events in the respiratory cycle.

VII. ACKNOWLEDGMENT

We acknowledge the support of the Mexican National Council for Scientific and Technological Development and the Faculty of Science and Technology at Lancaster University.
VIII. REFERENCES
[1] Murphy, M.J., "Tracking moving organs in real time," Seminars in Radiation Oncology, 2004, 14(1), p. 91.
[2] Kubo, H.D. and Hill, B.C., "Respiration gated radiotherapy treatment: a technical study," Physics in Medicine and Biology, 1996, 41, p. 83.
[3] Shimizu, S., et al., "Detection of lung tumor movement in real-time tumor-tracking radiotherapy," International Journal of Radiation Oncology*Biology*Physics, 2001, 51(2), p. 304.
[4] Serago, C.F., et al., "Initial experience with ultrasound localization for positioning prostate cancer patients for external beam radiotherapy," International Journal of Radiation Oncology*Biology*Physics, 2002, 53(5), p. 1130.
[5] Seiler, P.G., et al., "A novel tracking technique for the continuous precise measurement of tumour positions in conformal radiotherapy," Physics in Medicine and Biology, 2000, 45, p. 103.
[6] Berson, A.M., et al., "Clinical experience using respiratory gated radiation therapy: comparison of free-breathing and breath-hold techniques," International Journal of Radiation Oncology*Biology*Physics, 2004, 60(2), p. 419.
[7] Drummond, G.B. and Duffy, N.D., "A video-based optical system for rapid measurements of chest wall movement," Physiological Measurement, 2001, 22, pp. 489-503.
BIOGRAPHIES
Manuel Bandala received his B.Eng (Hons) in Electronics Engineering from the Instituto Tecnológico de Puebla in 2001. He is currently a PhD candidate at Lancaster University, supported by the Mexican National Council for Scientific and Technological Development. His research interests include 3D laser scanning, body signal monitoring, wireless inertial navigation systems, and microelectronics design.
Malcolm J. Joyce received his B.Sc (Hons) in Physics and his PhD in Nuclear Physics from the University of Liverpool, UK, in 1990 and 1993, respectively. He is currently Senior Lecturer in the Department of Engineering at Lancaster University, UK. His research interests include medical radiotherapy, neutron and gamma-ray spectrometry, and nuclear instrumentation. He is a Chartered Member of the Institute of Physics and of the Institution of Nuclear Engineers in the UK.