
An Application of Stereo Image Reprojection from

Multi-Angle Images for Immersive Environment

Tatsuro MATSUBARA

ABSTRACT

In recent building design, visual simulations of interior and exterior designs, including conceptual content, are often used for prototype development, and immersive virtual reality has become a popular solution for house design. The difficulty is that common VR systems are not suitable as play-only systems because they are not designed with home use in mind: they have large-scale frameworks, may be expensive to operate, and require special configurations with purpose-built devices. We simplified the VR system by replacing real-time rendering with head tracking with a set of view images rendered in advance, and built our own VR system that gives an immersive experience on an inexpensive configuration. The system is separated into two processes: first, a number of cameras placed spherically are rendered to images; second, a view image is calculated and displayed as stereo images. The resulting stereo image reprojection system works well on laptop computers and on smartphones used with VR headsets. Our future challenges include optimizing the image size and the camera setup (how many cameras are suitable for stereo vision); image compression is also important for content distribution.

1. INTRODUCTION

Visual simulation is often convenient for industrial development because it reduces the cost and time of making a prototype, and large-scale developments such as building or landscape design demand evaluation with a VR system. The difficulty is that visual simulation is not widely prevalent as a play-only system, because immersive VR systems are often expensive and require a high-performance computer. We previously developed low-cost real-time rendering systems that provide virtual products with computing and visualization, and these have been explored and applied in large-scale commercial VR systems [1-2]. Recently, computer performance has rapidly increased while devices have been downsized, as in smartphones, which makes such devices suitable as portable VR players. In this paper, we consider a stereo image system built from multi-camera images, with an implementation that works well on laptop computers and on smartphones used with VR headsets.

2. INTERPUPILLARY ADJUSTMENT IN OMNIDIRECTIONAL CAMERA

Stereo panoramic imaging systems are used for binocular vision in VR [3-4]. The camera setting is illustrated in Figure 1: the IPD (interpupillary distance) between the left and right vision is assumed to have a fixed value, which also means the binocular disparity is locked to the IPD in the horizontal direction. The challenge of this stereo imaging system is that it cannot accommodate the variation of horizontal IPD that depends on head tilt, as shown in Figure 2.

Red arrow: right camera vision; blue arrow: left camera vision.

Figure 1. Camera set-up of stereo panoramic image

Figure 2. Mismatched IPD of head-tilt view (left: without head tilt; right: with head tilt)

To handle the variation of IPD, an alternative configuration of stereo panoramic imaging is needed. The proposed technique, based on omnidirectional vision, is defined as follows: let r be the radius of the alignment circle, n the number of cameras in the horizontal direction, and i the sequence number (1, 2, 3, …); the IPD length is then

IPD[i] = 2r sin(πi/n). (1)

If n = 8, the ratio IPD[i]/r takes the values 0.7653 (i = 1), 1.4142 (i = 2), 1.8477 (i = 3), and 2 (i = 4). The maximum sequence number is restricted by the following inequality (with the camera view angle defined as f):

1 ≤ i < fn/(2π). (2)

A larger n gives a greater variety of IPD values, and therefore more seamless visual quality.
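As a minimal sketch of Eqs. (1) and (2), the following Python functions are illustrative only; the helper names `ipd` and `max_index` are not from the paper, and `max_index` is one way to read the strict inequality in Eq. (2):

```python
import math

def ipd(r: float, i: int, n: int) -> float:
    """Eq. (1): IPD[i] = 2 r sin(pi i / n) for cameras i steps apart
    on an alignment circle of radius r holding n cameras."""
    return 2.0 * r * math.sin(math.pi * i / n)

def max_index(f: float, n: int) -> int:
    """Eq. (2): the largest valid i satisfying 1 <= i < f n / (2 pi),
    where f is the camera view angle in radians."""
    return math.ceil(f * n / (2.0 * math.pi)) - 1
```

With n = 8 and r = 1 this reproduces the ratios quoted above (0.7653, 1.4142, 1.8477, 2).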

3. REPROJECTION SYSTEM FROM MULTI-CAMERA VISION

Three coordinate systems are used, named view (eye) space, world (real) space, and sample (camera) space. View space represents an eye's vision, world space describes the relation between the screen space and the sample space, and sample space treats a camera's vision as a texture.

To display an eye's vision, the first process chooses a suitable camera for each eye direction, and the second process calculates the coordinate in camera space, as described below.

Select Camera from Eye Direction

Figure 3 shows the relative vectors and parameters used to calculate the world-space vector p* from a view-space coordinate (g, h). The three vectors Xv, Yv, and Zv give the orientation in view space, Rv and Lv relate to the IPD calculation, and fv is the view angle in degrees (120° or more is recommended):

‖Xv‖ = ‖Yv‖ = tan(fv/2)‖Zv‖. (3)

Figure 3. The relative parameters of view space

The vector p* intersects the view plane, and the view direction p is the normalized p*:

p* = Zv + gXv + hYv, (4)

p = p*/‖p*‖. (5)

To choose a suitable camera from p (Figure 4 shows the cameras and p), a procedure is required: the proximity factors s and t are calculated from the camera front directions (Zc(1), Zc(2), Zc(3), …) and the camera view angles (fc(1), fc(2), fc(3), …).
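A minimal Python/NumPy sketch of the view-ray construction in Eqs. (3)-(5); the concrete axis choices and the 120° view angle are assumptions for illustration, not taken from the paper:

```python
import math
import numpy as np

# Hypothetical view-space basis: Zv points forward with unit length,
# and Xv, Yv are scaled by tan(fv/2) as in Eq. (3).
fv = math.radians(120.0)                      # view angle, 120 degrees or more
Zv = np.array([0.0, 0.0, 1.0])
Xv = math.tan(fv / 2.0) * np.array([1.0, 0.0, 0.0])
Yv = math.tan(fv / 2.0) * np.array([0.0, 1.0, 0.0])

def view_direction(g: float, h: float) -> np.ndarray:
    """Eqs. (4)-(5): p* = Zv + g Xv + h Yv, normalized to the unit ray p."""
    p_star = Zv + g * Xv + h * Yv
    return p_star / np.linalg.norm(p_star)
```

With this basis, (g, h) = (0, 0) looks straight along Zv, and (g, h) = (±1, ±1) reaches the corners of the view plane.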

Figure 4. The relation between the vector p and the camera directions (Zc(1), Zc(2), Zc(3), …)

Let m be a camera index (1, 2, 3, …). The proximity factor s is defined as

s = p · Zc(m). (6)

If s < cos(fc(m)/2), p lies outside the camera's field of view, and the procedure skips to the next camera. For the right-eye vision, t is defined as

t = p · Rv, (7)

and for the left-eye vision,

t = p · Lv. (8)

The camera with the largest z value is chosen, where ω is a weighting factor (0.5 ≤ ω ≤ 1 is recommended) that adjusts the IPD:

z = ωs + t. (9)

Camera Sampling Coordinate

To get the coordinate (k, l) in sample space, the vector p is projected onto the camera view plane; Figure 5 shows the relation between the vectors and parameters. The three vectors Xc(m), Yc(m), and Zc(m) give the orientation in the sample space of the chosen camera:

‖Zc(m)‖ = 1, ‖Xc(m)‖ = ‖Yc(m)‖ = tan(fc(m)/2)‖Zc(m)‖. (10)

The intersection p′ of p with the camera view plane is calculated as

p′ = p/s. (11)

Figure 5. Illustration of the sample-space definition and parameters

The vector d, aligned on the camera view plane, is calculated as

d = p′ − Zc(m), (12)

and the components k and l are calculated by

k = d · Xc(m)/‖Xc(m)‖, (13)

l = d · Yc(m)/‖Yc(m)‖. (14)

4. IMPLEMENTATION TO VR SYSTEM AND CONCLUSION

Figure 6 shows simulated camera images of a VR landscape with 12 cameras aligned on a sphere, and Figure 7 shows the stereo vision reconstructed from those camera images. In conclusion, the reprojection works well on laptop computers and provides binocular vision for a VR system. The remaining challenge is to optimize the image size while minimizing the number of cameras: larger images are preferable, but the camera image size is currently limited to 960x720 by the computational performance available in this study, and higher resolution is demanded for an immersive VR system. In addition, we will try to apply this technique to real photography in the next study.

Figure 6. Simulated camera images

Figure 7. Reconstructed binocular vision

5. REFERENCES

[1] T. Matsubara, S. Ishihara, M. Nagamachi, and Y. Matsubara, "Kansei Analysis of the Japanese Residential Garden and Development of a Low-Cost Virtual Reality Kansei Engineering System for Gardens", Advances in Human-Computer Interaction, Volume 2011 (2011), Article ID 295074, 12 pages, doi:10.1155/2011/295074.

[2] T. Matsubara, Y. Matsubara, M. Nagamachi, and S. Ishihara, "Development of Low-cost Kansei Engineering System with Wide-view VR Display System for Japanese Residential Garden", Proceedings of International Conference on Humanized Systems 2013, 2013, pp. 1-6 (CD-ROM).

[3] E. L. L. Cabral, J. C. de Souza, and M. C. Hunold, "Omnidirectional stereo vision with a hyperbolic double lobed mirror", Proceedings of the 17th International Conference on Pattern Recognition, 2004, pp. 1-9.

[4] H. Lee, Y. Tateyama, T. Ogi, “Panoramic stereo representation for immersive projection display system”, Proceedings of the 9th ACM SIGGRAPH Conference on Virtual-Reality Continuum and its Applications in Industry, 2010, pp. 379-382.

