We can imagine that an interactive robot will need to display more information than fits on the small screens that may equip its body. On the other hand, a number of new small video projectors have reached the market, with low energy consumption, increasing projection capabilities and interesting durability. This creates a good opportunity to employ these devices as visualisation extensions for such robots: they can be used to display information on a nearby wall or on the floor for the user.
- You should develop your work continuously; I will not accept students who work only during the last month.
- You must demonstrate the evolution of your work every one or two weeks.
- You must work in the lab, integrated in the team, and not alone at home (or anywhere else).
- You are close to being a professional, so you must behave like one.
- Students are accepted on a first-come, first-served basis, also taking into account their obtained grades. So if you come too late, the subject has probably already been taken.
Note: this is a dynamic list and it may be updated frequently, especially just before the beginning of each semester.
Medical ultrasound is a diagnostic technique based on the application of ultrasound to obtain views of internal body structures or organs. Among its advantages are low cost, the absence of ionising radiation, and real-time imaging. The learning process involves developing the ability to interpret images that correspond to 2D scans of the inside of the body. Since the technique is based on the emission of mechanical waves and the detection of their reflections, these waves are modified by the elements they encounter along their travel paths. In this process it is normal for artefacts to result from reflections or other sources, but the experienced physician discards them through a careful choice of probe scan movements and varying probe orientations. In this project we intend to create a system that includes a haptic device (phantom or …) to simulate the positioning of the sonograph probe, and an HMD to visualise both the (virtual) patient and the sonograph display.
Smart interactive houses have long been the subject of conversation and commercial advertising. Several manufacturers have developed proprietary protocols for controlling devices, but since their offers are either limited in the variety of supported devices or too expensive, there has been no massive adoption. That is one side of the story; the other comes from the type of usage these systems permit, and from their more or less complicated and limited interfaces. We could say that these systems frequently enable one to do everything that one does not need. More recently, some of these systems have supported automatic profiles, which select a given preset configuration at particular times, or when a user arrives at or leaves home. The aim of this project is to develop an extensible platform that integrates signals from a network of sensors, such as cameras, PIR, temperature and noise sensors, among others. This will create the basis for analysing people's behaviour, in particular for elderly people, to detect abnormal and risky situations such as falls, strokes, excessive immobility, etc.
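One of the simplest behaviour analyses the platform could support is detecting excessive immobility from the timestamps of motion-sensor (e.g. PIR) events. As a minimal sketch, assuming a hypothetical event log and an illustrative two-hour threshold (not a clinically validated value):

```python
from datetime import datetime, timedelta

def immobility_alert(motion_timestamps, now, threshold=timedelta(hours=2)):
    """Raise an alert when no motion event has been observed within
    `threshold`. An empty log is treated as an alert as well.
    The two-hour default is a placeholder, not a validated figure."""
    if not motion_timestamps:
        return True
    return now - max(motion_timestamps) > threshold

# Example: last motion seen four hours ago -> alert
now = datetime(2015, 1, 1, 12, 0)
log = [datetime(2015, 1, 1, 8, 0)]
alert = immobility_alert(log, now)
```

A real deployment would of course fuse several sensor modalities and adapt the threshold to the resident's habits and the time of day; this fragment only illustrates the event-driven structure of such a rule.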
Manipulation of microscopic objects is gaining relevance as techniques evolve and bring the possibility of dealing with smaller and smaller objects. In biology, for example, the possibility of manipulating individual living cells, embryos and stem cells, and of performing genetic therapies, is of utmost importance. Performing an injection into a cell is a task that requires extensive practice, in particular because it is performed at a scale where the involved forces are very hard to measure, and only a 2D-like visualisation of the task is possible through a microscope. It is therefore quite common for cells to rupture due to imprecise manipulation, which in turn results from the lack of reliable feedback that would allow the operator to sense the task properly. In this work we intend to develop a system for manipulating cells or other microscopic objects, based on the generation of haptic forces derived from visual cues extracted from the manipulated objects' deformations.
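A minimal way to turn a visually measured deformation into a haptic force is a linear spring model: the larger the observed membrane indentation, the larger the force rendered to the operator, clipped to the device's maximum. The stiffness and force-limit values below are illustrative placeholders, not measured cell properties:

```python
def deformation_to_force(deformation_um, stiffness=0.02, max_force=3.0):
    """Map a visually measured deformation (micrometres) to a haptic
    feedback force (newtons) using a linear spring model F = k * d,
    saturated at the haptic device's maximum renderable force.
    Both `stiffness` and `max_force` are placeholder values."""
    force = stiffness * max(0.0, deformation_um)
    return min(force, max_force)
```

In the actual system the deformation would come from image processing (e.g. tracking the cell contour), and a nonlinear or calibrated model could replace the spring; this sketch only shows the vision-to-haptics mapping stage.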
In recent years, progressive exposure therapy has proven very effective in the treatment of psychological disorders. Following this principle, and given the danger that in vivo or in loco exposure sometimes poses to the patient, it is important to develop a Virtual or Augmented Reality framework that allows the same experiences. This proposal focuses on the manipulation of the perceived self, i.e. modifying the perception that users have of their own body. In either Virtual or Augmented Reality systems, users must be able to look at themselves (by looking down or into a mirror), see a different body, and feel it as their own. The idea is to exploit the “rubber hand illusion” and extend it to make users believe and feel that they own a new body, not just a different hand. The modified body perception will enable the development of serious games for the therapy of specific disorders, such as phobic disorders.
This project aims at developing a 3D teleconferencing system that virtually puts two persons in front of each other, despite their being in different physical spaces. For this we will explore the use of two Kinect devices to capture not only the scene image but also its depth information, from two different viewpoints. This information should be transmitted to a remote location, where the 3D scene will be shown on a TV screen (possibly 3D-enabled) from an observer-controlled viewpoint. Since the idea is to develop a face-to-face communication system, viewpoint changes should be equivalent to moving the observer's head in front of the viewed user, as happens in a physical meeting. Consequently, the allowed viewpoint movements are confined to those that keep the observed person in his/her (virtual) half-space, which simplifies the problem since no back views need to be generated. To let the user move freely in front of the display and retain the 3D perception of the remote user and scene, a tracker must continuously estimate the viewing point.
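The core geometric step behind this system is back-projecting each Kinect depth image into a 3D point cloud, which can then be re-rendered from the tracked observer's viewpoint. As a minimal sketch, using the standard pinhole camera model and intrinsics roughly in the range reported for the Kinect depth camera (the exact values here are assumptions, to be replaced by per-device calibration):

```python
import numpy as np

def depth_to_points(depth, fx, fy, cx, cy):
    """Back-project a depth image (metres) into a 3D point cloud using
    the pinhole model: X = (u - cx) * Z / fx, Y = (v - cy) * Z / fy."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))  # pixel coordinates
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return np.stack([x, y, depth], axis=-1).reshape(-1, 3)

# Illustrative example: a flat wall 2 m in front of the sensor
depth = np.full((480, 640), 2.0)
pts = depth_to_points(depth, fx=580.0, fy=580.0, cx=320.0, cy=240.0)
```

Once the two clouds are merged (after extrinsic calibration between the Kinects), rendering from the observer's tracked head position amounts to applying the corresponding view transform before projection; that half of the pipeline is not sketched here.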