
APPARATUS AND DEMONSTRATION NOTES

The downloaded PDF for any Note in this section contains all the Notes in this section.

Frank L. H. Wolfs, Editor
Department of Physics and Astronomy, University of Rochester, Rochester, New York 14627

This department welcomes brief communications reporting new demonstrations, laboratory equipment, techniques, or materials of interest to teachers of physics. Notes on new applications of older apparatus, measurements supplementing data supplied by manufacturers, information which, while not new, is not generally known, procurement information, and news about apparatus under development may be suitable for publication in this section. Neither the American Journal of Physics nor the Editors assume responsibility for the correctness of the information presented.

Manuscripts should be submitted using the web-based system that can be accessed via the American Journal of Physics home page, http://ajp.dickinson.edu, and will be forwarded to the ADN editor for consideration.

Using the Xbox Kinect sensor for positional data acquisition

Jorge Ballester a)

Physics Department, Emporia State University, Emporia, Kansas 66801

Chuck Pheatt b)

Computer Science Department, Emporia State University, Emporia, Kansas 66801

(Received 4 May 2011; accepted 15 August 2012)

The Kinect sensor was introduced in November 2010 by Microsoft for the Xbox 360 video game system. It is designed to be positioned above or below a video display to track player body and hand movements in three dimensions (3D). The sensor contains a red, green, and blue (RGB) camera, a depth sensor, an infrared (IR) light source, a three-axis accelerometer, and a multi-array microphone, as well as the hardware required to transmit sensor information to an external receiver. In this article, we evaluate the capabilities of the Kinect sensor as a 3D data-acquisition platform for use in physics experiments. Data obtained for a simple pendulum, a spherical pendulum, projectile motion, and a bouncing basketball are presented. Overall, the Kinect sensor is found to be a useful data-acquisition tool for motion studies in the physics laboratory. © 2013 American Association of Physics Teachers. [http://dx.doi.org/10.1119/1.4748853]

I. INTRODUCTION

The use of imaging technology to capture motion data in the physics laboratory has a long history. It dates back to at least the use of strobes and moving objects with blinking lights, such as the widely used "blinky" [1]. The visual records of these experiments were captured with a Polaroid Land camera using long exposure times. As imaging technologies have become more affordable, they have been incorporated into the physics laboratory. The development of videocassette recorders (VCRs) and the possibility of advancing video recordings frame-by-frame was used to study topics such as the underdamped pendulum [2]. Data extraction required placing a transparent sheet on the screen and marking it as the recording was advanced frame-by-frame. Extensive pedagogical materials for the study of motion graphs, including prerecorded videos, were developed [3].

The development of computer video capabilities and software gradually simplified data capture (e.g., Ref. 4), especially with the introduction of point-and-click tools such as VideoPoint™ [5]. These video techniques provide a record of one- or two-dimensional motion in a plane perpendicular to the line of sight. Scaling image distances to real-world distances requires a reference object of known size to be visible in the image. Video analysis techniques were initially used to investigate elementary motion problems but were also gradually adapted to study intermediate concepts in classical mechanics [6]. One limitation of video analysis, which also applies to the Kinect, is the 30 frames per second (fps) video standard. This frame rate can make precise numerical differentiation to obtain velocities and accelerations difficult [2].

However, affordable high-speed cameras capable of up to 1000 fps, such as the Casio EX-FH20, have recently become available [7]. The higher temporal resolution is gained at the expense of spatial resolution. The pedagogical effectiveness of video analysis techniques in improving student understanding of particle motion has also been investigated [8].

Other motion-tracking technologies have been developed in parallel with video imaging technologies. For example, ultrasonic motion detectors have been used extensively in introductory physics laboratories. In some ways, motion detectors challenge video analysis in terms of pedagogical effectiveness [9]. Alternative techniques, providing high temporal resolution but lacking video images (e.g., light quadrature), have been available for some time [10].

As has been the case with many previous technological innovations, the Kinect sensor for the Xbox 360 video game system has potential applications in the physics laboratory. The Kinect sensor was introduced in November 2010 by Microsoft. It is designed to be positioned above or below a video display to track player body and hand movements in 3D. The sensor contains an RGB camera, a depth sensor, an IR light source, a three-axis accelerometer, and a multi-array microphone, as well as the hardware required to transmit sensor information to an external receiver. In this article, we evaluate the capabilities of the Kinect sensor as a 3D data-acquisition platform for motion studies in the physics laboratory. Several experiments demonstrating the sensor's capabilities in acquiring positional data are described.

II. SENSORS

The RGB and depth images from the Kinect sensor are of greatest interest for the purposes of this paper. The RGB and depth hardware used in the sensor were developed by PrimeSense [11]. Both the RGB and depth images have a standard resolution of 640 × 480 pixels. The unit generates an RGB 8-bit color graphics video stream. The unit's depth-image sensing hardware provides 11-bit depth data for each pixel. A PrimeSense patent (Ref. 12) notes that the "technology for acquiring the depth image is based on Light Coding™. Light Coding works by coding the scene volume with near-IR light." A near-IR light source and diffuser are used to project a speckle pattern onto the scene being assessed. An example image of the speckle pattern and a discussion of its properties can be found in Ref. 13. The image of the speckle pattern that is projected onto an object is then compared with reference images to identify the reference pattern that correlates most strongly with the speckle pattern on the object. This process provides an estimate of the location of the object within the sensor's range. Cross-correlation between the speckle pattern on the object and the identified reference pattern is then used to map the object's surface [12]. The RGB and depth sensors are offset from one another by approximately 2.5 cm, yielding offset viewpoints. A viewing transformation must be applied to allow the images generated to have the same point of view.

Estimates of the Kinect depth sensor's ranging limit vary from 0.8–3.5 m (Ref. 14) to 0.7–6.0 m (Ref. 15). The angular field of view for both the RGB and the depth sensors is approximately 57° horizontally and 43° vertically [15]. Both sensors acquire data at a rate of 30 fps. The Kinect uses a universal serial bus (USB) type A connector that may be attached to a personal computer (PC) with USB input capability.

Windows-based software that allows the device to be connected to a PC has been available since December 2010 [16,17]. The software suites allow data to be acquired by the sensor and manipulated independently of the Xbox gaming unit. The authors have utilized and modified the software used to process the sensor's output signals and acquire the data discussed in this paper.

III. DEPTH SENSOR EVALUATION

The depth sensor's performance was evaluated by collecting raw data using the Ajax software suite [16]. Raw depth data (D_raw) from the Kinect is provided in an 11-bit format, with depth values for each image pixel between 0 and 2047. A test range was set up on a floor using a 0.3 m by 0.3 m square grid pattern, from 0.6 m to 4.3 m in depth by 4 m in width. Seven 0.15 m² square targets were spaced at 0.3 m intervals on a 2.1 m high vertical stand, and the stand was placed at each grid point. Measurements collected from the sensor on the targets verify that the Kinect distance measurements are aligned in a rectangular grid with respect to the horizontal, vertical, and depth planes. Depth is defined as the perpendicular distance from the observed object to a reference plane that coincides with the front face of the Kinect. From these measurements, a regression equation relating raw depth (D_raw) and actual depth (D_meters) was obtained:

$$D_\text{meters} = \frac{b_1}{b_0 - D_\text{raw}}, \quad b_0 = 1090.7 \pm 0.2, \quad b_1 = 355.1 \pm 0.5. \tag{1}$$

The parameter limits shown in Eq. (1), and those for all subsequent regression analyses, correspond to the 95% regression confidence intervals for these parameters.
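As a minimal illustration (our own sketch, not the authors' software; the function name is ours), Eq. (1) converts a raw sensor reading to meters in one line:

    # Sketch: convert raw Kinect depth values to meters using Eq. (1).
    # B0 and B1 are the fitted regression constants reported above.
    B0 = 1090.7
    B1 = 355.1

    def raw_to_meters(d_raw: int) -> float:
        """Map an 11-bit raw depth value (0-2047) to depth in meters."""
        return B1 / (B0 - d_raw)

    # Example: a raw value of 750 corresponds to roughly 1.04 m.
    print(raw_to_meters(750))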

The observed depth data and Eq. (1) are shown in Fig. 1. The values of D_raw for depths between 0.6 m and 4.3 m cover a range between 500 and 1000, utilizing considerably less than one-half of the available range. The sensor documentation (Ref. 14) specifies a 1-cm depth resolution at 2 m. Our evaluation confirms this specification. At 4 m, the depth resolution is 2.7 cm. This change in resolution at greater depths is a direct consequence of the non-linear relationship between D_raw and D_meters.

[Fig. 1. Calibration curve showing raw depth sensor values (D_raw) versus distance (D_meters).]

[Fig. 2. Position of the Kinect sensor in our reference frame. The coordinates of the sensor are (x = 6 m, y = 4.5 m, z = 0 m). The figure illustrates that as the distance from the sensor increases, the resolution of the x and y measurements decreases.]

The sensor's performance was also evaluated with the OpenNI software suite, which uses software components from PrimeSense [17]. This software provides processed depth (D_processed) information, reporting depth values between 0 and 10,000. Using a technique similar to the one used to evaluate D_raw, it was found that D_processed has a somewhat poorer resolution than D_raw. As depth increases, the area covered by each of the 640 × 480 sensor pixels increases, reducing the resolution in the horizontal and vertical planes, as illustrated in Fig. 2. Any two objects appearing within a single sensor pixel are indistinguishable. Based on our D_processed measurements, the following relation between D_processed and D_meters was obtained:

$$D_\text{meters} = b_1 (b_0 + D_\text{processed}), \quad b_0 = 0.004 \pm 0.003, \quad b_1 = 0.001007 \pm 0.00001. \tag{2}$$

The depth values reported by the OpenNI suite essentially provide depth information in mm, and a user transformation to a linear depth scale is not required. The range of D_processed values between 0 and 10,000 corresponds to a depth range between 0 and 10 m. Figure 3 shows the position resolution in the horizontal (x), the vertical (y), and the depth (z) planes as a function of depth. The depth resolution at 2 m and 4 m was found to be 1.1 cm and 4.6 cm, respectively. The x and y resolutions vary between 0.35 cm and 0.70 cm at 2 m and 4 m, respectively.

Although the OpenNI suite exhibits poorer depth resolution than the Ajax suite, several features built into the OpenNI suite make it more desirable as a data-collection tool. These features include the individual image timestamp information and the built-in viewing transformation, allowing the depth and RGB images to have the same point of view. For these reasons, we have used the OpenNI software for the measurements described in Section VI.

To assess the precision of the depth sensor at various distances, a 0.3 m square target was placed at distances of 0.6 m and from 0.75 m to 4.0 m at 0.25 m intervals. The target was approximately perpendicular to the line of sight of the sensor. The target's depth was measured one hundred times at each location, and a plane was fit to each of the 100 depth images using linear regression, yielding error mean square (EMS) estimates. EMS is an estimate of the variance of the random error of the depth measurements. Point-cloud sample sizes for the targets ranged from an average of 88,882 pixels at 0.6 m to 2322 pixels at 4 m. The 100 EMS estimates were then pooled to form an overall variability estimate, denoted as pooled residuals, at each depth. Figure 4 shows the pooled residuals ($\sqrt{\text{EMS}}$) as a function of the distance between the target and the sensor. The variability increases from 0.2 cm at 0.6 m to 3.3 cm at 4 m. Comparing the pooled residuals to the depth resolution shown in Fig. 3, a large proportion of the depth variability can be attributed to depth resolution.
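The plane-fit step can be sketched as follows (our own illustration, assuming each depth image has already been converted to an (N, 3) array of points; numpy's least-squares routine stands in for whatever regression tool the authors actually used):

    import numpy as np

    def plane_fit_ems(points: np.ndarray) -> float:
        """Fit z = a*x + b*y + c to an (N, 3) point cloud by least squares
        and return the error mean square (residual variance estimate)."""
        x, y, z = points[:, 0], points[:, 1], points[:, 2]
        A = np.column_stack([x, y, np.ones_like(x)])
        coef, _, _, _ = np.linalg.lstsq(A, z, rcond=None)
        residuals = z - A @ coef
        dof = len(z) - 3              # N observations minus 3 fitted parameters
        return float(residuals @ residuals / dof)

    # Pooling over the 100 images at one distance: average the EMS values,
    # then take the square root to get the "pooled residuals" of Fig. 4.
    # pooled = np.sqrt(np.mean([plane_fit_ems(img) for img in images]))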

IV. MAPPING THE KINECT TO A 3D EXPERIMENTAL REGION

The Kinect output naturally lends itself to the use of a 3D rectangular coordinate system (see Fig. 2). Given the depth range of 0–10 m and a sensor field of view of 57° horizontally and 43° vertically, the sensor defines an experimental space of 12 m in the x direction, 9 m in the y direction, and 10 m in the negative z direction. The coordinates of the Kinect sensor in our reference frame are x = 6 m, y = 4.5 m, and z = 0 m. This places the Kinect at the center of one of the bounding faces of the experimental space. The following equations map the sensor output (depth and pixels) to our right-handed 3D coordinate system:

$$z_\text{meters} = -\frac{D_\text{processed}}{1000}, \tag{3}$$

$$x_\text{meters} = 6 - 2\left(\text{Pixel}_x - \frac{639}{2}\right)\tan\left(\frac{57^\circ}{2}\right)\frac{z_\text{meters}}{640}, \tag{4}$$

$$y_\text{meters} = 4.5 - 2\left(479 - \text{Pixel}_y - \frac{479}{2}\right)\tan\left(\frac{43^\circ}{2}\right)\frac{z_\text{meters}}{480}. \tag{5}$$

Pixel_x and Pixel_y represent a single pixel in the xy-plane associated with a depth measurement. By convention, the pixels are numbered horizontally from 0 to 639 and vertically from 0 to 479, with the origin positioned in the upper left-hand corner of the images. The 3D coordinates calculated in this way are in a single octant.

[Fig. 3. Kinect spatial resolution of the horizontal (x), the vertical (y), and the depth (z) measurements as a function of depth.]

[Fig. 4. Pooled residuals, used to estimate the precision of the depth measurements made with the Kinect, for a 0.3 m square target placed at multiple distances relative to the sensor.]
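A minimal Python sketch of the mapping in Eqs. (3)-(5), assuming the nominal 57° and 43° fields of view (the function and variable names are ours, not the authors'):

    import math

    H_FOV_DEG, V_FOV_DEG = 57.0, 43.0   # approximate sensor fields of view

    def kinect_to_xyz(pixel_x: int, pixel_y: int, d_processed: int):
        """Map a depth pixel to the right-handed coordinate system of
        Eqs. (3)-(5). Pixels are numbered from the upper left-hand corner
        of the 640 x 480 image."""
        z = -d_processed / 1000.0                                 # Eq. (3)
        x = 6 - (2 * (pixel_x - 639 / 2)
                 * math.tan(math.radians(H_FOV_DEG / 2)) * z / 640)   # Eq. (4)
        y = 4.5 - (2 * (479 - pixel_y - 479 / 2)
                   * math.tan(math.radians(V_FOV_DEG / 2)) * z / 480) # Eq. (5)
        return x, y, z

    # Example: the central pixel at 2 m depth maps to roughly (6, 4.5, -2).
    print(kinect_to_xyz(319, 239, 2000))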

The Kinect's 3D rectangular coordinate system translates and rotates rigidly with the orientation of the sensor. Alignment of all three axes was accomplished using a 1 m square target in the shape of a Greek cross with equal-sized arms mounted on a stand (see Fig. 2). The target is aligned plumb to the Earth's surface using a simple bubble level. The sensor is then positioned so that the target is level, centered, and at a constant depth. Although not necessary, the Kinect's xy- and yz-planes may be made to coincide with the walls of a rectangular experimental space by initially positioning the target squarely in the experimental space.

V. TIMING CONSIDERATIONS

Using a PC for real-time data acquisition can be problematic. Data-acquisition devices are typically implemented using dedicated real-time processors that have been designed to acquire data in a lossless fashion based on the desired rate of data capture. PC operating systems, on the other hand, are not designed to minimize the timing jitter introduced by competing tasks. From a typical PC user's perspective, variations in the timing of tasks by several milliseconds are of little concern. However, these timing variations in data-acquisition operations may have a significant effect on the interpretation of the data collected.

The Kinect sensor output is organized into frames consisting of RGB and depth images. The OpenNI software suite provides frame numbering and timestamp information, reported in units of 10⁻⁵ s. One thousand data frames were captured and the timestamp information was analyzed. The average frame-to-frame capture time was calculated to be 0.033338 s, compared with an expected value of 1/30 s = 0.0333... s based on a 30 fps acquisition rate. The maximum frame-to-frame jitter was found to be 10⁻⁵ s.
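A check of this kind is easy to reproduce. The sketch below is our own (the timestamp list is assumed to have already been read from the OpenNI frames); it computes the mean frame interval and the maximum jitter relative to the nominal 30 fps rate:

    import numpy as np

    def frame_timing_stats(timestamps):
        """Timestamps are in units of 1e-5 s, as reported by OpenNI.
        Returns the mean frame interval (s) and the maximum jitter (s)."""
        t = np.asarray(timestamps, dtype=float) * 1e-5   # convert to seconds
        dt = np.diff(t)                                  # frame-to-frame intervals
        expected = 1.0 / 30.0                            # nominal 30 fps spacing
        return dt.mean(), np.abs(dt - expected).max()

    # mean_dt, max_jitter = frame_timing_stats(stamps)
    # e.g., ~0.033338 s and ~1e-5 s for the run described above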

PC processor speed plays a role in the ability to acquire data in a lossless fashion. When a data frame is collected from the Kinect, it must be fully processed before the information from the next data frame can be acquired. Failure to keep pace with the Kinect's data-generation rate will result in lost or corrupted data. In our experiments, both RGB and depth images were captured and stored on disk for post-processing. We found that buffering the Kinect output was necessary in order to minimize data loss over time periods longer than several seconds. We note that a PC with multiple cores and a processor speed greater than 2.8 GHz ensured that data loss would not be an issue for most experiments.

VI. EXAMPLE EXPERIMENTS

The Kinect was used to digitize several different types of motion commonly studied in physics laboratories. The purpose of these experiments was to assess the real-world effectiveness of the Kinect in gathering motion data. The data obtained were evaluated with regard to their ability to produce qualitative motion patterns and quantitative results.

The first experiment involved a simple pendulum of length L = (2.30 ± 0.01) m, constrained to swing in a plane. The pendulum bob consisted of a metal ball with a radius of 2.6 cm and a mass of 0.5 kg. In the first trial, the pendulum moved in the xy-plane, perpendicular to the line of sight of the Kinect. The x, y, and z coordinates as a function of time are shown in Fig. 5.

[Fig. 5. Measured x, y, and z coordinates of the bob of a simple pendulum of length L = (2.30 ± 0.01) m swinging in the xy-plane.]

The motion in the x direction is what is commonly analyzed in video clips using, e.g., Logger Pro (Ref. 18). The data in the x direction clearly show the periodic oscillations of the pendulum with well-defined maxima and minima. The motion in the y direction shows the expected vertical oscillations of the bob, with two vertical oscillations for every horizontal oscillation. The asymmetry in the vertical oscillations, as well as the variability in the bob's z position, indicates some misalignment of the Kinect with respect to the experimental coordinate system. In a second trial, the pendulum moved in the yz-plane. In this case, the Kinect provided motion data along the z axis, data that are not readily obtainable with video capture. Fitting a sinusoidal curve to the data shows the period for motion in the xy-plane and in the yz-plane to be (3.0635 ± 0.0004) s and (3.0624 ± 0.0004) s, respectively. This compares favorably with a predicted period of (3.042 ± 0.007) s, calculated from the small-angle simple-pendulum formula

$$T = 2\pi\sqrt{\frac{L}{g}}. \tag{6}$$

The error for the predicted period was determined by propagating the uncertainty in L through Eq. (6).

The angular amplitude of the oscillations in both trials was (15 ± 1)°. The error was determined by calculating the average and standard deviation of the angular amplitude over several periods. Including finite-amplitude corrections (Ref. 19) in Eq. (6) results in a predicted period of (3.055 ± 0.008) s.
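The finite-amplitude correction is not spelled out in the text; assuming the standard leading-order expansion T ≈ T₀(1 + θ₀²/16) and g = 9.81 m/s², a short check reproduces the quoted values:

    import math

    # Sketch (our own check, not the authors' code): predicted period for
    # the L = 2.30 m pendulum, with the uncertainty in L propagated through
    # Eq. (6) and the leading finite-amplitude correction applied.
    L, dL = 2.30, 0.01            # pendulum length and its uncertainty (m)
    g = 9.81                      # assumed local value of g (m/s^2)
    theta0 = math.radians(15)     # measured angular amplitude

    T0 = 2 * math.pi * math.sqrt(L / g)   # Eq. (6), small-angle period
    dT = 0.5 * T0 * dL / L                # since dT/T = dL/(2L)
    T_corr = T0 * (1 + theta0 ** 2 / 16)  # leading finite-amplitude term

    print(f"T = ({T0:.3f} +/- {dT:.3f}) s; corrected T = {T_corr:.3f} s")
    # Reproduces the quoted (3.042 +/- 0.007) s and 3.055 s.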

The second experiment focused on a spherical pendulum. In this experiment, the bob is free to move anywhere on a spherical surface defined by the length of the string. For small amplitudes, the motion of the bob is predicted to consist of separate sinusoidal oscillations along the x and z directions with the same period as given by Eq. (6). The two oscillations generally have different amplitudes, resulting in elliptical orbits in the xz-plane. In practice, deviations from the idealized small-amplitude pendulum produce approximately elliptical orbits that do not quite close and that precess. With traditional video capture techniques, a camera must be mounted directly above or below the spherical pendulum. The Kinect sensor can provide motion data for this system with the bob placed anywhere in the Kinect's experimental space. A plot of the motion in the xz-plane is shown in Fig. 6. The pendulum bob orbits clockwise in this figure. Two complete orbits, separated by 4 min, are displayed. The larger ellipse represents the earlier orbit. The decreasing size of the orbit shows the decay of the amplitudes. The precession is in the clockwise direction in Fig. 6, with an average rate of (78 ± 2)° per minute. The precession rate was calculated from the change in the direction of the major axis of successive orbits of the spherical pendulum.

A third experiment that lends itself to qualitative and quantitative analysis is projectile motion. The projectile used was a wooden ball with a radius of 3.6 cm and a mass of 0.158 kg. It was tossed several times across a distance of approximately 3 m. The motion took place in the xy-plane to evaluate the Kinect's ability to analyze transverse motion. In a second trial, the motion was carried out in the yz-plane to evaluate the Kinect's unique ability to analyze motion with a normal component, i.e., along the sensor unit's line of sight.

Projectile motion can be described either by specifying the horizontal (x and/or z) and vertical (y) positions as a function of time, or in terms of its trajectory through space by determining the vertical position as a function of the horizontal position. In the first measurement, the motion was in the xy-plane and can be described by the standard equations:

$$x = x_0 + v_{x0}t, \tag{7}$$

and

$$y = y_0 + v_{y0}t - \frac{1}{2}gt^2. \tag{8}$$

[Fig. 6. Two orbits of the spherical pendulum separated by 4 min. The precession is in the clockwise direction with an average rate of (78 ± 2)° per minute.]

[Fig. 7. The vertical (y) and horizontal (x) positions of a wooden ball with a radius of 3.6 cm and a mass of 0.158 kg as a function of time.]

[Fig. 8. The vertical motion of a bouncing overinflated regulation basketball. The floor is at approximately y = 3.1 m.]

By eliminating time from Eqs. (7) and (8), the trajectory can be described by the following equation:

$$y = y_0 + \frac{v_{y0}}{v_{x0}}(x - x_0) - \frac{g}{2v_{x0}^2}(x - x_0)^2. \tag{9}$$

The time dependence of the horizontal and vertical motion is shown in Fig. 7. The x versus t plot displays the linear behavior predicted by Eq. (7), with a regression estimate of v_x0 = (4.02 ± 0.02) m/s. The y versus t plot displays the quadratic behavior predicted by Eq. (8); regression analysis yields g = (9.8 ± 0.3) m/s². The projectile motion in the yz-plane can also be described using Eqs. (7)-(9) by replacing x with z. Analyzing the z versus t linear relationship produces a regression estimate of v_z0 = (3.64 ± 0.03) m/s. Analyzing the y versus t quadratic relationship produces a regression estimate of g = (9.6 ± 0.2) m/s².
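As a sketch of this analysis (not the authors' DP code; the array names are ours), the regression estimates can be obtained with numpy's polyfit, which returns coefficients from highest degree down:

    import numpy as np

    def fit_projectile(t, x, y):
        """Fit Eqs. (7) and (8) to tracked positions. Returns the initial
        horizontal speed and the inferred gravitational acceleration."""
        vx0, x0 = np.polyfit(t, x, 1)       # linear fit: x = x0 + vx0*t
        a2, vy0, y0 = np.polyfit(t, y, 2)   # quadratic fit: y = y0 + vy0*t + a2*t^2
        return vx0, -2.0 * a2               # comparing with Eq. (8), g = -2*a2

    # vx0, g = fit_projectile(t, x, y)  # e.g., ~4.02 m/s and ~9.8 m/s^2 here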

The final experiment consisted of tracking a series of bounces, executed in the yz-plane, of an extremely overinflated basketball. The positional data are shown in Fig. 8. The motion between bounces is expected to follow a standard parabolic trajectory described by Eq. (8). Each individual trajectory between bounces was fit with a quadratic equation. The individual fits were used to determine the maximum height reached between consecutive bounces. The dotted curve shown in Fig. 8 is a fit connecting the successive maxima. Comparing successive maxima can be used to calculate the coefficient of restitution e for the basketball using the relation

$$e = \sqrt{\frac{h_{n+1}}{h_n}}, \tag{10}$$

where h_{n+1} and h_n are successive heights. Based on the data shown in Fig. 8, e = 0.977 ± 0.006. As expected, this is higher than the coefficient of restitution of 0.78 ± 0.03 specified for regulation basketballs [20].
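A minimal sketch of this step, assuming the bounce maxima from the quadratic fits (measured relative to the floor) have been collected in a list; averaging the pairwise estimates from Eq. (10) is one reasonable way to obtain a value and spread, though the authors do not state their exact procedure:

    import numpy as np

    def restitution(heights):
        """Estimate the coefficient of restitution from the maximum
        heights of successive bounces via Eq. (10)."""
        h = np.asarray(heights, dtype=float)
        e_vals = np.sqrt(h[1:] / h[:-1])      # one estimate per bounce pair
        return e_vals.mean(), e_vals.std(ddof=1)

    # e_mean, e_spread = restitution(maxima)  # ~0.977 for the ball in Fig. 8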

VII. SUPPORTING SOFTWARE

The software used to collect and analyze the data in the previously mentioned experiments is available from the authors. To utilize the Kinect, the following software components are required:

(1) The OpenNI framework [21], which provides an application programming interface (API) for writing applications utilizing natural interaction. This API covers communication with the Kinect, as well as high-level middleware.

(2) The NITE middleware [21], which provides an additional framework for acquiring and processing Kinect data.

(3) The Kinect device drivers from SensorKinect [22], which allow the Kinect to communicate with higher-level software.

(4) The Kinect data-acquisition program (DAQ), developed by the authors. This application program allows the user to acquire Kinect data (RGB and depth data) and save the data in a series of files for further processing.

(5) The data-processing program (DP), developed by the authors, to post-process the data acquired from the Kinect DAQ. The program allows the user to gather data from the RGB and depth data files. Once processed, data can be saved to a comma-separated value (CSV) file that may then be used as input to a spreadsheet or other program for model fitting or additional analysis.

Items 1, 2, and 3 are freely downloadable from the Internet [21,22]. Items 4 and 5 are available from the authors [23].

In conducting a Kinect-based experiment, the user collects data using the DAQ program and then post-processes the data using the DP program. The DP program allows the user to digitize images manually with a point-and-click interface, using one of several cursors, or to use the automated digitizing feature. In the automated approach, the user masks out background objects and digitizes the object of interest using blob detection [24]. Automated digitizing is preferable to, and much quicker than, hand digitizing, and can be used if the object being studied is not in close proximity to other objects. If the experimenter cannot meet the proximity requirements for automated digitization, hand digitizing must be used.
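As a hint of what downstream analysis of a DP output file might look like (the file name and the time/x/y/z column layout are assumptions on our part; the DP format is not specified here), the CSV can be read and numerically differentiated:

    import numpy as np

    # Sketch: load a hypothetical DP output file with columns t, x, y, z.
    data = np.genfromtxt("trajectory.csv", delimiter=",",
                         names=("t", "x", "y", "z"))

    # Central-difference velocities; at 30 fps the time step is ~1/30 s,
    # so differentiation noise is the main concern (see Sec. I).
    vx = np.gradient(data["x"], data["t"])
    vy = np.gradient(data["y"], data["t"])
    vz = np.gradient(data["z"], data["t"])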

VIII. CONCLUSIONS

The Kinect sensor provides an inexpensive option for acquiring 3D positional data with a time base. The experiments performed show that meaningful qualitative and quantitative data can be acquired. The device has limitations with respect to spatial and temporal resolution, and object size and speed need to be carefully considered when designing an experiment. Overall, the Kinect sensor has been found to be an effective device with the potential for rapidly acquiring data in diverse experiments.

ACKNOWLEDGMENTS

The authors would like to thank the reviewers for their insightful and thoughtful comments. The authors would also like to thank Andrew Wayman for his work on assessing the Kinect system.

a) Electronic mail: [email protected]
b) Electronic mail: [email protected]

1. Physical Science Study Committee, Physics, 2nd ed. (D. C. Heath and Company, Lexington, MA, 1965).
2. M. S. Greenwood, "Using videotapes to study underdamped motion of a pendulum: A laboratory project," Am. J. Phys. 55, 645-648 (1987).
3. D. A. Zollman and R. G. Fuller, "Teaching and learning physics with interactive video," Phys. Today 47(4), 41-47 (1994).
4. M. Marcuso and R. M. Webber, "Kinematical measurements using digital image capture," Am. J. Phys. 64, 1080-1083 (1996).
5. VideoPoint, <http://www.lsw.com/videopoint/>.
6. W. M. Wehrbein, "Using video analysis to investigate intermediate concepts in classical mechanics," Am. J. Phys. 69, 818-820 (2001).
7. Casio EX-FH20, <http://www.casio.com/products/archive/Cameras/High-Speed/EX-FH20/content/Technical_Specs/>.
8. R. J. Beichner, "The impact of video motion analysis on kinematics graph interpretation skills," Am. J. Phys. 64, 1272-1277 (1996).
9. R. J. Beichner, "The effect of simultaneous motion presentation and graph generation in a kinematics lab," J. Res. Sci. Teach. 70, 803-815 (1990).
10. P. A. DeYoung and B. Mulder, "Studying collisions in the general physics laboratory with quadrature light emitting diode sensors," Am. J. Phys. 70, 1226-1230 (2002).
11. PrimeSense, Tel-Aviv, Israel, <http://www.primesense.com>.
12. J. Garcia and Z. Zalevsky, U.S. patent 7,433,024 B2 (7 October 2008).
13. Technical description of Kinect calibration, <http://www.ros.org/wiki/kinect_calibration/technical>.
14. PrimeSensor Reference Design, <http://primesense.360.co.il/?p=514>.
15. Kinect, <http://www.businessinsider.com/blackboard/kinect>.
16. Ajax Software Suite: The software for acquiring unprocessed (raw) data from the Kinect consisted of the V16 OpenKinect for Windows driver (<http://ajaxorg.posterous.com/kinect-driver-for-windows-prototype>) and the libusb 1.0 driver (<http://www.libusb.org>).
17. OpenNI Software Suite: The software for acquiring processed data from the Kinect consisted of interaction software from OpenNI, OPENNI-Win32-1.0.0.23.exe (<http://openni.org/downloadfiles/2-openni-binaries>), PrimeSense NITE 1.3 middleware, NITE-Win32-1.3.0.17.exe (<http://openni.org/downloadfiles/12-openni-compliant-middleware>), and SensorKinect drivers, SensorKinect-Win32-5.0.0.exe (<https://github.com/avin2/SensorKinect/tree/master/Bin>).
18. Vernier Logger Pro 3 software, <http://www.vernier.com/soft/lp.html>.
19. J. B. Marion and S. T. Thornton, Classical Dynamics of Particles and Systems, 4th ed. (Saunders College Publishing, Philadelphia, PA, 1995).
20. International Basketball Federation (FIBA), Official Basketball Rules 2010: Basketball Equipment (San Juan, Puerto Rico, 2010).
21. <http://75.98.78.94/Downloads/OpenNIModules.aspx>.
22. <https://github.com/avin2/SensorKinect>.
23. See supplementary material at http://dx.doi.org/10.1119/1.4748853 for the Kinect data-acquisition program (DAQ) and the data-processing program (DP).
24. <http://www.aforgenet.com/framework/features/motion_detection_2.0.html>.
