
  TRIP: a Vision Sensor System


Welcome to the TRIP system website. Here you can download the source code of the TRIP system and access all the documentation available about it. You can also put the TRIP system into practice by means of the MobileEye application, which turns a standard mobile phone into a ringcode reader. MobileSense, an evolution of MobileEye, is a Bluetooth application capable not only of processing TRIP ringcodes, but also of performing OCR, colour recognition and text-to-speech synthesis. TRIP was the result of my PhD at the Laboratory for Communications Engineering, University of Cambridge. My PhD was about providing the sensing and middleware infrastructure needed to facilitate the deployment of Ubiquitous Computing in our living environments. My dissertation is summarised in this presentation.

For more details on TRIP, MobileEye or MobileSense, click on the corresponding menu options on the left-hand side or on Publications. You can download the source code of these systems by clicking on Downloads.


The TRIP System

TRIP (Target Recognition using Image Processing) is a vision-based sensor system that uses a combination of 2-D circular barcode tags or ringcodes (see image below), and inexpensive CCD cameras (e.g. web-cams, CCTV cameras or even mobile phone cameras) to identify and locate tagged objects in the cameras’ field of view. Compared with untagged vision-based location systems, the processing demands of TRIP are low. Optimised image processing and computer vision algorithms are applied to obtain, in real-time, the identifier (TRIPcode) and pose (location and orientation) of a target with respect to the viewing camera.
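The source code available from Downloads contains the actual implementation. Purely as an illustration, the hypothetical Python/OpenCV sketch below shows the kind of first stage such a pipeline needs: thresholding a camera frame and fitting ellipses to find candidate ringcode bull's-eyes. The function name, thresholds and camera index are assumptions made for the example, not values taken from TRIP.

```python
# Hypothetical illustration of a TRIP-style detection front end (not TRIP's own code).
import cv2

def detect_ring_candidates(frame):
    """Return ellipses that could be the central bull's-eye of a ringcode."""
    grey = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Adaptive thresholding copes with the uneven lighting typical of cheap cameras.
    binary = cv2.adaptiveThreshold(grey, 255, cv2.ADAPTIVE_THRESH_MEAN_C,
                                   cv2.THRESH_BINARY_INV, 31, 10)
    contours, _ = cv2.findContours(binary, cv2.RETR_LIST, cv2.CHAIN_APPROX_NONE)
    candidates = []
    for contour in contours:
        if len(contour) < 20:                 # fitEllipse needs enough edge points
            continue
        (cx, cy), (w, h), angle = cv2.fitEllipse(contour)
        minor, major = sorted((w, h))
        if minor < 8:                         # discard tiny blobs
            continue
        if major / minor > 4.0:               # a projected circle stays roughly elliptical
            continue
        candidates.append(((cx, cy), (w, h), angle))
    return candidates

if __name__ == "__main__":
    capture = cv2.VideoCapture(0)             # any inexpensive webcam
    ok, frame = capture.read()
    if ok:
        for centre, axes, angle in detect_ring_candidates(frame):
            print("possible ringcode at", centre)
    capture.release()
```

In the real system the candidate ellipses are then sampled around the surrounding rings to recover the TRIPcode and to estimate the tag's pose relative to the camera; see Publications for the details.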

This work was sponsored by the Department of Education of the Basque Government and AT&T Laboratories Cambridge.

[Image: example TRIP ringcodes]


The MobileEye and MobileSense Systems

The latest mobile phones offer more multimedia features, better communication capabilities (both short and long range) and are far more easily programmable (extensible) than ever before. So far, the "killer apps" exploiting these new capabilities have taken the form of MMS (Multimedia Messaging), video conferencing and video-on-demand services. We propose two new promising application domains for the latest smart phones: their use as intermediaries between us and our surrounding environment, and as assistive devices that augment the sensory capabilities of people with disabilities (e.g. blind or deaf users). In this way, our mobiles will behave as personal butlers that assist us in our daily tasks, taking advantage of the computational services provided in our working and living environments. A key element for this to happen is to add senses to our mobiles: the ability to see (camera) and to hear or notice (Bluetooth) the objects and devices offering computing services within an environment. The MobileEye and MobileSense systems represent our solutions in this regard and attempt to prove that it is possible to add such sensing capabilities to today's mobile phones.

This work was sponsored by the Telefónica Móviles Chair at the University of Deusto.

 

Send mail to d i p i n a [at] e s i d e [dot] d e u s t o [dot] e s with questions or comments about this web site.
Last modified: 24-07-06