Vision Research at Bristol Professor David Bull
Head of Visual Information Laboratory
Director Bristol Vision Institute
25 February 2013
What is BVI?
• A grouping of around 35 permanent academic staff and some 70 other researchers at Bristol working on vision and imaging research and its applications
• Adopted by the University of Bristol as a strategic research “Theme” in 2007
• From fundamental psychology and physiology to applications
• Learning across disciplines and working across boundaries
• Collaborators include: BAe Systems, DSTL, BBC, STM, ARRI, Technicolor, Aardman, GDUK, RFEL, Seebyte, QinetiQ, Jaguar, Samsung, Thales, Toshiba, HHI Berlin, ENST Paris, MPI, UCSD, Purdue, UNSW, Cambridge, UWE.....
Scope and reach
• Anatomy • Archaeology and Anthropology • Biological Sciences • Biochemistry • BRL • Computer Science • Drama and Film
• Electrical and Electronic Engineering • Experimental Psychology • History of Art • Ophthalmology • Mathematics • Machine Vision Laboratory, UWE • Mechanical Engineering • Medicine
• Answering big questions requires collaboration across traditional academic divides. E.g.:
• To understand and exploit biological systems • To exploit perceptual processes to enhance visual tasks and experiences • To facilitate automation using vision-based methods
Research areas
• Human and Animal Visual Performance
• Visual Psychophysics
• Vision and Learning
• Visual System Modelling
• Eye Movements and Hand-Eye Coordination
• Locomotion and vision
• Foraging and mapping
• Robot Vision
• Clinical and Developmental Studies
• Brain Imaging (fMRI)
• Computational Modelling
• Computer Vision
• Multisensor Fusion
• Detection, Tracking and Classification of difficult targets
• Search and Retrieval – Finding and Hiding things
• Camouflage
• Animal biometrics
• Video Compression
• Quality assessment and immersion
• High Dynamic Range imaging
• Immersive video formats
The mantis shrimp
• We are trichromatic, with red, green and blue photoreceptors. The mantis shrimp uses TWELVE colour channels (a world record), PLUS polarisation.
• Perhaps coincidentally, it also has the fastest and most accurate strike in the animal kingdom....
Shared facilities: studio
• RED ONE Mysterium • RED EPIC • Sony HDCAM F900/F500 • Panasonic PTZ • Mikrotron EoSens (HFR) • Polarisation imager • 24-camera parallel capture • Tobii eye tracker • Final Cut Pro Mac suite • HD and HFR display • Dolby HDR display • Subjective evaluation
Shared facilities – locomotion
• HMD with integrated eye tracker • Projection on walls (or screens) and floor • Qualisys 10-camera motion capture • Force platforms • Mobile eye tracking • Treadmill
(1) Motion
The motion of a sparse set of features is used to extract 3D pose from only a single viewpoint. The appearance of the object is not modelled.
Visual SLAM
[Figure: mapped points and estimated camera poses (R_i, t_i)]
Simultaneously Localise a Camera And Map the environment - SLAM
Applications in Robotics, Wearable Computing, Augmented and Virtual Reality, and Personal Localisation systems – in real time
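The core idea, estimating camera motion and a map of points at the same time, can be sketched with a toy filter. The following is an illustrative 1-D EKF-SLAM example only (real visual SLAM estimates 6-DoF poses (R_i, t_i) and a 3-D point map); all names and parameters here are invented for the sketch.

```python
import numpy as np

# Toy 1-D EKF-SLAM sketch: state x = [robot position, landmark position].
# The robot moves by a known control u and measures the range
# z = landmark - robot; the filter refines both states simultaneously.

def ekf_slam_step(x, P, u, z, q=0.1, r=0.05):
    # Predict: only the robot moves; the landmark is static.
    x = x + np.array([u, 0.0])
    P = P + np.diag([q, 0.0])
    # Update with the range measurement.
    H = np.array([[-1.0, 1.0]])          # Jacobian of z = x[1] - x[0]
    y = z - (x[1] - x[0])                # innovation
    S = (H @ P @ H.T + r).item()
    K = P @ H.T / S                      # Kalman gain, shape (2, 1)
    x = x + K.flatten() * y
    P = (np.eye(2) - K @ H) @ P
    return x, P

rng = np.random.default_rng(0)
true_robot, true_lm = 0.0, 5.0
x = np.array([0.0, 3.0])                 # poor initial landmark estimate
P = np.diag([0.01, 4.0])
for _ in range(50):
    true_robot += 0.1
    z = (true_lm - true_robot) + rng.normal(0.0, 0.05)
    x, P = ekf_slam_step(x, P, 0.1, z)
# x[1] now lies close to the true landmark position (5.0)
```

The same predict/update structure underlies full visual SLAM; the difference is the dimensionality of the state and the camera projection model in place of the range measurement.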
Segmentation – on real images
[Figure: conventional level set vs. the proposed method on four example images]
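A minimal sketch of the level-set idea behind these comparisons, in the spirit of Chan–Vese (data term only; the curvature/smoothness term of the full method is omitted for brevity, and the image and seed are invented for illustration):

```python
import numpy as np

# phi > 0 marks the "inside" region; each step pushes phi up where the
# pixel better matches the inside mean c1, and down where it better
# matches the outside mean c2.

def chan_vese_step(img, phi, dt=1.0):
    inside = phi > 0
    c1 = img[inside].mean() if inside.any() else 0.0      # inside mean
    c2 = img[~inside].mean() if (~inside).any() else 0.0  # outside mean
    force = (img - c2) ** 2 - (img - c1) ** 2
    return phi + dt * force

# Synthetic test image: bright square on a dark background.
img = np.zeros((32, 32))
img[8:24, 8:24] = 1.0
yy, xx = np.mgrid[:32, :32]
phi = 5.0 - np.hypot(yy - 16.0, xx - 16.0)   # small circular seed
for _ in range(40):
    phi = chan_vese_step(img, phi)
seg = phi > 0   # recovers the bright square
```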
(3) Colours and patterns
Reaction-diffusion systems:
• Animals adapt visually to habitats and conspecifics
• Pattern evolution follows chaotic generation procedures (Turing systems, fractals, etc.)
• Resulting patterns have global properties specific to species
• The butterfly effect facilitates individually unique patterns
→ utilisation for camouflage and animal identification
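A Turing-style pattern generator of the kind referred to above can be sketched with the classic Gray–Scott reaction-diffusion model. The parameters below are standard "spot" values from the literature, not taken from the slides:

```python
import numpy as np

def laplacian(Z):
    # 5-point stencil with periodic boundaries
    return (np.roll(Z, 1, 0) + np.roll(Z, -1, 0) +
            np.roll(Z, 1, 1) + np.roll(Z, -1, 1) - 4.0 * Z)

def gray_scott(n=64, steps=2000, Du=0.16, Dv=0.08, F=0.035, k=0.065):
    U = np.ones((n, n))
    V = np.zeros((n, n))
    U[28:36, 28:36] = 0.5            # perturb the centre to seed a pattern
    V[28:36, 28:36] = 0.25
    for _ in range(steps):
        UVV = U * V * V
        U += Du * laplacian(U) - UVV + F * (1.0 - U)
        V += Dv * laplacian(V) + UVV - (F + k) * V
    return U, V

U, V = gray_scott()   # V develops a localised spot pattern
```

Small changes to the seed produce individually distinct patterns from the same rules, which is exactly the property exploited for animal identification.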
Con-textures
Graphcut techniques for texture synthesis
Synthesis of textures using a steerable pyramid
[Figure: original texture vs. synthesised texture]
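The patch-based side of this can be sketched with naive patch tiling. Real graphcut quilting additionally chooses overlapping patches and optimises the seam between them with a min-cut; the version below simply copies patches, which shows the data flow but leaves visible block boundaries (all names and sizes are invented for the sketch):

```python
import numpy as np

def synthesise(texture, out_size=96, patch=16, seed=0):
    rng = np.random.default_rng(seed)
    h, w = texture.shape
    out = np.zeros((out_size, out_size), dtype=texture.dtype)
    for y in range(0, out_size, patch):
        for x in range(0, out_size, patch):
            # copy a randomly chosen source patch into the output tile
            ty = int(rng.integers(0, h - patch + 1))
            tx = int(rng.integers(0, w - patch + 1))
            out[y:y + patch, x:x + patch] = texture[ty:ty + patch, tx:tx + patch]
    return out

tex = np.random.default_rng(1).random((64, 64))   # stand-in "original texture"
big = synthesise(tex)                             # "synthesised texture"
```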
Compression-texture synthesis
[Block diagram: parametric video coding with texture analysis and synthesis.
Encoder: video in → region classification and segmentation. Static texture regions go to texture warping (validated by VQA I); dynamic textures go to a dynamic texture analyser (validated by VQA II, with feedback); non-texture and synthesis-failure regions go to a standard H.264 encoder. Motion parameters, warping maps and synthesis maps are entropy coded as side information and sent with the H.264 bitstream over the channel.
Decoder: entropy decoding recovers the side information; the H.264 decoder, texture MC and the dynamic texture synthesiser reconstruct the non-texture, warped and synthesised regions respectively, which are combined into the output video.]
Video error resilience
Error resilience in a wireless environment is measured assuming: 2x2 STBC; a rate-½ convolutional code with constraint length K = 7; Viterbi hard-decision decoding; ideal channel estimation with uncorrelated spatial channels; BPSK modulation.
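The coding chain in that measurement setup can be sketched in miniature: a rate-½ convolutional code with hard-decision Viterbi decoding. Constraint length K = 3 (generators 7, 5 octal) is used here instead of the slide's K = 7, purely to keep the trellis small; STBC, BPSK and the channel model are abstracted into injected bit errors.

```python
G = [0b111, 0b101]            # generator polynomials (octal 7, 5)
K = 3
NSTATES = 1 << (K - 1)

def conv_encode(bits):
    state, out = 0, []
    for b in bits:
        reg = (b << (K - 1)) | state              # shift-register contents
        out += [bin(reg & g).count("1") & 1 for g in G]
        state = reg >> 1
    return out

def viterbi_decode(rx, nsteps):
    INF = float("inf")
    metric = [0.0] + [INF] * (NSTATES - 1)        # start in state 0
    paths = [[] for _ in range(NSTATES)]
    for i in range(nsteps):
        r = rx[2 * i:2 * i + 2]
        new_metric = [INF] * NSTATES
        new_paths = [None] * NSTATES
        for s in range(NSTATES):
            if metric[s] == INF:
                continue
            for b in (0, 1):
                reg = (b << (K - 1)) | s
                expected = [bin(reg & g).count("1") & 1 for g in G]
                m = metric[s] + sum(e != x for e, x in zip(expected, r))
                ns = reg >> 1
                if m < new_metric[ns]:            # keep the survivor path
                    new_metric[ns] = m
                    new_paths[ns] = paths[s] + [b]
        metric, paths = new_metric, new_paths
    best = min(range(NSTATES), key=lambda s: metric[s])
    return paths[best]

msg = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 0, 1]
coded = conv_encode(msg + [0, 0])                 # zero-tail termination
noisy = list(coded)
noisy[5] ^= 1
noisy[16] ^= 1                                    # two channel bit errors
decoded = viterbi_decode(noisy, len(msg) + 2)[:len(msg)]
# decoded == msg: both errors are corrected (free distance 5 for this code)
```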
(5) Immersive Technology
• The Bristol-BBC Immersive Technology Laboratory has been set up to explore the extended parameter space associated with future immersive video formats – ‘beyond 3D’
• Higher frame rates, higher dynamic range, higher resolution
• Enhanced colour gamut, exploiting peripheral vision
[Figure: frame rate (Hz, 50–400) vs. horizontal resolution, from 0.7K (SDTV) and 2K (HDTV) to 4K, 6K and 8K (SHV). Annotated curves:]
• Frame rate required to maintain the dynamic resolution of standard-definition 50 Hz TV without shuttering
• Frame rate required to maintain this resolution with a 50% shutter
• Current proposal for UHDTV2 (SHV) – 120 Hz
• Contour for systems with the same sample rate as SHV at 120 Hz
• Current proposal for UHDTV1 in ITU-R BT.2020 (“4K” Quad HD) – 25 Hz
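The pixel rates implied by these format points can be compared with a few lines of arithmetic. This is a rough sketch assuming a 16:9 aspect ratio throughout (the vertical resolutions follow from that assumption; SDTV in particular is approximated):

```python
def sample_rate(h_res, fps, aspect=16 / 9):
    """Pixels per second for a given horizontal resolution and frame rate."""
    return h_res * (h_res / aspect) * fps

formats = {
    "SDTV 0.7K @ 50 Hz": (720, 50),
    "HDTV 2K @ 50 Hz": (1920, 50),
    "UHDTV1 4K @ 25 Hz": (3840, 25),
    "SHV 8K @ 120 Hz": (7680, 120),
}
for name, (h, f) in formats.items():
    print(f"{name}: {sample_rate(h, f) / 1e6:.0f} Mpixel/s")
```

The jump from HDTV at 50 Hz to SHV at 120 Hz is well over an order of magnitude in raw sample rate, which is why the extended parameter space matters for compression and display alike.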