LIFE – Light Field Evaluation System

1 March 2016

This three-year project aims to build a light field communication test bed and to explore the consequences for processing, distribution, and presentation by using a large number of cameras of different types and presenting the captured data in different ways. The industrial partners, Ericsson and Axis Communications, will use the project results as input to video standardization work and to improve image quality in video surveillance.

Background

Conventional image and video capture has proved to be a powerful component in numerous applications over the years, providing information about how objects in a scene are related in space, time, and colour. Unfortunately, neither images nor video sequences tell the whole story of what is visible in a scene. In theory, all light reflected off a scene can be described by the seven-dimensional plenoptic function, which encompasses everything visible. A conventional digital camera can be viewed as sampling this plenoptic function, yielding a 2D image. Similarly, a video camera produces a sequence of images, i.e. a 3D dataset. This kind of dimensionality reduction inevitably causes information loss.
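To make the dimension-reduction argument concrete, one common parameterization of the plenoptic function (variable names differ between authors) records the radiance arriving at every point, from every direction, at every wavelength and time:

```latex
% Radiance at position (x, y, z), from viewing direction (theta, phi),
% at wavelength lambda and time t:
P = P(x, y, z, \theta, \phi, \lambda, t)

% A conventional camera fixes the viewpoint, integrates over wavelength
% (per colour channel) and exposure time, and keeps only a 2-D slice,
% the image I(u, v); a video camera adds back the time axis, I(u, v, t).
```

A light field camera array, by contrast, also samples the directional dimensions, which is what makes later view selection possible.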

In this project, researchers study immersive telepresence and video surveillance. For a virtual conference room to be a competitive alternative to a physical meeting, an immersive meeting experience close to that of a real-life meeting is required. Light field-based capture and presentation have the potential to provide this level of quality. In the context of video surveillance, the emphasis instead lies on extracting conventional 2D images and video with enhanced quality and improved identification capabilities.

Objectives

The project partners agree that a flexible test bed is useful for several applications, since different testing goals can be pursued with the same test bed. The explicit project objectives are:

  • To build a prototype light field camera array, including rectification procedure and synchronization between cameras
  • To present a subset of the light field data in the form of a free-view perspective on a 3D-capable device giving an immersive experience
  • To explore computational photography methods required for an immersive light field presentation and propose novel algorithms that imply improved quality
  • To implement the solutions required for a functional real-time test bed demonstrator (rectification, synchronization, distribution, presentation, and coordination both between cameras and between the sending and receiving sides of a link), to integrate these solutions into the test bed, and to use the test bed to demonstrate end-user perceived benefits of light field technology by, e.g., demonstrating free-view perspective selection
  • To perform initial tests to gain an understanding of the technical requirements using real data, and of the improvements in the applications of interest, i.e. the experience of immersiveness for telepresence and surveillance, and enhanced image processing for surveillance.
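As a rough illustration of the free-view idea mentioned in the objectives (not the project's actual algorithm), the sketch below synthesizes a virtual view at a fractional position in a camera grid by bilinearly blending the four nearest camera images. The array layout and function name are assumptions for the example only.

```python
import numpy as np

def interpolate_view(lf, u, v):
    """Synthesize a virtual view at fractional grid position (u, v).

    lf : ndarray of shape (U, V, H, W) -- a grayscale light field,
         indexed by camera grid position (u, v) and pixel (y, x).
    Blends the four camera views nearest to (u, v) with bilinear weights.
    """
    U, V = lf.shape[:2]
    u0, v0 = int(np.floor(u)), int(np.floor(v))
    u1, v1 = min(u0 + 1, U - 1), min(v0 + 1, V - 1)
    du, dv = u - u0, v - v0
    return ((1 - du) * (1 - dv) * lf[u0, v0]
            + (1 - du) * dv * lf[u0, v1]
            + du * (1 - dv) * lf[u1, v0]
            + du * dv * lf[u1, v1])

# Tiny synthetic light field: a 2x2 camera grid of 4x4 images.
lf = np.zeros((2, 2, 4, 4))
lf[1, 1] = 1.0
view = interpolate_view(lf, 0.5, 0.5)  # virtual view between all four cameras
```

In a real system the cameras would first be rectified and synchronized, as the objectives above state; only then does this kind of per-pixel blending across views become meaningful.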

Research group


Project leader
Mårten Sjöström
+46-14 88 36
+46 70-544 6811
marten.sjostrom@miun.se

Researchers
Roger Olsson

Project period

2015-2018