Open Subjects

VisualChain: Visual Inspection in a production chain

Objective: To develop a computer vision-based quality control system to verify the physical parameters of the plates of certain bicycle chain references (models).

SRAMPORT is a company of the Chicago-based SRAM group and produces high-quality bicycle chains.
A chain is composed of outer plates, inner plates, rollers and shafts. The plates are produced in presses by a stamping process. Currently, the physical parameters of the produced plates are checked manually.

The aim of this work is to develop a system capable of measuring the physical characteristics of several plates spread over a base. After the measurement, the system will accept or reject each sample plate according to the model parameters and their tolerances.

In a second phase, the system may evolve towards continuous measurement and integration into the production process.
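
As a rough illustration of the intended measurement-and-validation step, the sketch below uses OpenCV to segment plates lying on a contrasting base, measure the length of each plate's oriented bounding box, and accept or reject it against a nominal value and tolerance. The calibration factor, nominal dimension and tolerance are placeholders, not SRAMPORT specifications.

# A rough sketch of the measurement-and-validation step, assuming plates lying on a
# contrasting, evenly lit base and a calibrated camera. The calibration factor,
# nominal dimension and tolerance below are placeholders, not SRAMPORT specifications.
import cv2

PIXELS_PER_MM = 20.0        # assumed camera calibration factor
NOMINAL_LENGTH_MM = 14.0    # hypothetical plate length
TOLERANCE_MM = 0.05         # hypothetical tolerance

def inspect_plates(image_path):
    img = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    _, mask = cv2.threshold(img, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    results = []
    for c in contours:
        if cv2.contourArea(c) < 100:              # discard noise blobs
            continue
        (_, _), (w, h), _ = cv2.minAreaRect(c)    # oriented bounding box of one plate
        length_mm = max(w, h) / PIXELS_PER_MM
        ok = abs(length_mm - NOMINAL_LENGTH_MM) <= TOLERANCE_MM
        results.append((length_mm, "accept" if ok else "reject"))
    return results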

Workplace: ISR-UC and SRAMPORT

Supervision: Prof. Paulo Menezes and Eng. Paulo Silva (SRAMPORT)

Spatial Augmented Reality for Serious Games

Most people have watched video-mapping shows, where pictures and movies are typically projected using building structures as a screen.
We know that pointing a projector at a wall requires some adjustment of its orientation or the correction of the "trapezoidal distortion" (a projective transformation, or homography) that appears when the optical axis is not perpendicular to the wall plane. When projecting over non-planar surfaces, the distortion effects render the projected image hard to understand, depending on the relative positions of the projector, the "screen" surface(s) and the viewer.
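
As an illustration of the kind of correction involved, the sketch below pre-warps a frame with a homography (via OpenCV) so that it appears undistorted once projected onto a slanted surface; the corner coordinates are purely illustrative and would in practice come from a Kinect-based calibration of the surface.

# A minimal sketch of keystone pre-correction, assuming the four corners where the
# image should land have already been located in projector coordinates (e.g. via a
# Kinect-based calibration of the surface); the example corner values are illustrative.
import cv2
import numpy as np

def prewarp(frame, dst_corners):
    """Warp `frame` so that, once projected, it appears undistorted on the surface."""
    h, w = frame.shape[:2]
    src = np.float32([[0, 0], [w, 0], [w, h], [0, h]])   # corners of the input frame
    dst = np.float32(dst_corners)                        # where those corners must land
    H = cv2.getPerspectiveTransform(src, dst)            # a homography, not an affine map
    return cv2.warpPerspective(frame, H, (w, h))

# Example with illustrative corner coordinates:
# corrected = prewarp(frame, [[40, 10], [600, 0], [620, 470], [20, 460]])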

The objective of this work is to develop an interactive system composed of a Kinect device and a video projector that will be used to project over a table, a slanted wall or even the floor near a robot. Using new low-consumption LED-based video projectors, it is possible to embed such a projector, together with a Kinect-like device, on a mobile robot, which may then use this system to display information on the most appropriate nearby surface.
There are indeed several interesting possibilities for the use of such a system, such as (1) projecting over a set of objects to modify their appearance or to create a scenario for a game based on the manipulation of these objects, or (2) industrial applications.

Although in this work we aim at developing a game-based rehabilitation tool for stroke patients or elderly people, it will be closely related to an industrial application in the context of an ongoing research project.

Remarks:

This work will benefit from an ongoing project and from a collaboration with LAAS-CNRS, a large research institution in Toulouse, France.
Depending on the pace of the project's evolution, the student may be invited to do an internship at LAAS-CNRS, co-advised by Prof. Frédéric Lerasle, where the working language can be either English or French, depending on the student's fluency in these languages.

Workplace: ISR-Coimbra / LAAS-CNRS in Toulouse, France

deepSTAIl: Style Transfer for Artificial Illustrations

Deep learning-based techniques are at the centre of attention of companies, researchers and even the general public. Applications in the recognition of people, places and objects have demonstrated the power of these techniques, showing unprecedented success rates.

One interesting application of neural networks is what is called "style transfer". Given a picture or illustration and an exemplar from a specific painter, the system produces a version of the input picture that closely resembles the works of that painter.
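
For illustration, the sketch below outlines one widely used optimisation-based formulation of style transfer, in which an image is optimised so that its deep (VGG-19) features match the content image while the Gram matrices of those features match the style exemplar; the layer choices and weights are common defaults shown only as an example, not a prescription for this work.

# A sketch of one widely used optimisation-based style-transfer formulation
# (VGG-19 features, content loss plus Gram-matrix style losses). Layer indices
# and weights are common illustrative choices, not a specification of this work.
import torch
import torch.nn.functional as F
from torchvision.models import vgg19

vgg = vgg19(weights="DEFAULT").features.eval()
for p in vgg.parameters():
    p.requires_grad_(False)

STYLE_LAYERS = {0, 5, 10, 19, 28}   # conv1_1 ... conv5_1 in vgg19.features
CONTENT_LAYER = 21                  # conv4_2

def features(x):
    content, styles = None, []
    for i, layer in enumerate(vgg):
        x = layer(x)
        if i in STYLE_LAYERS:
            styles.append(x)
        if i == CONTENT_LAYER:
            content = x
    return content, styles

def gram(f):
    b, c, h, w = f.shape
    f = f.view(b, c, h * w)
    return f @ f.transpose(1, 2) / (c * h * w)

def total_loss(img, content_img, style_img, style_weight=1e6):
    c_img, s_img = features(img)
    c_ref, _ = features(content_img)
    _, s_ref = features(style_img)
    loss = F.mse_loss(c_img, c_ref)
    for fa, fb in zip(s_img, s_ref):
        loss = loss + style_weight * F.mse_loss(gram(fa), gram(fb))
    return loss

# `img` would be initialised as a copy of the content image and optimised
# (e.g. with torch.optim.LBFGS) to minimise total_loss.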


Immersive Medical Sonography

Medical ultrasound is a diagnostic technique based on the application of ultrasound to obtain views of internal body structures or organs. Among its advantages are its low cost, the absence of ionising radiation, and real-time imaging. The learning process implies developing the capability to interpret images that correspond to 2D scans of the inside of the body. Being based on the emission of mechanical waves and the detection of their reflections, the technique is sensitive to how these waves are modified by the elements they encounter along their travel paths. In this process it is normal that artefacts result from reflections or other sources, but the experienced physician discards them through a careful choice of the probe's scanning movements and by varying its orientation. In this project we intend to create a system that includes a haptic device (Phantom or …) that will be used to simulate the positioning of the sonograph probe, and an HMD that will be used to visualise both the (virtual) patient and the sonograph display.
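
One possible way to generate the simulated sonograph image, sketched below under the assumption that a pre-acquired 3D volume of the virtual patient is available, is to resample that volume on the plane defined by the tracked probe pose; the pose convention, image size and function names are hypothetical.

# A sketch of one possible way to synthesise the simulated sonograph image,
# assuming a pre-acquired 3D volume is available: resample the volume on the
# plane defined by the tracked probe pose. Pose convention and image size are
# placeholders.
import numpy as np
from scipy.ndimage import map_coordinates

def probe_slice(volume, origin, x_axis, y_axis, size=(256, 256), spacing=1.0):
    """Sample a 2D image from `volume` on the plane spanned by x_axis/y_axis at `origin` (voxel units)."""
    h, w = size
    u = (np.arange(w) - w / 2) * spacing
    v = np.arange(h) * spacing                     # depth increases away from the probe face
    uu, vv = np.meshgrid(u, v)
    pts = origin[:, None, None] + x_axis[:, None, None] * uu + y_axis[:, None, None] * vv
    return map_coordinates(volume, pts, order=1, mode="constant")

# Example: a slice at voxel (64, 64, 0), scanning "down" the z axis:
# img = probe_slice(volume, np.array([64.0, 64.0, 0.0]),
#                   np.array([1.0, 0.0, 0.0]), np.array([0.0, 0.0, 1.0]))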


A Sensor Network for a Comfortable and Safe (IoT) Smart House

Smart interactive houses have long been the subject of conversations and commercial advertisements. In fact, several manufacturers have developed proprietary protocols for controlling devices, but because their offer is either limited in the variety of supported devices or too expensive, there has been no massive adherence to their use. This is one side of the story; the other comes from the type of usage that these systems permit and from their more or less complicated and limited interfaces. We can say that frequently these systems enable one to do everything that he or she does not need. More recently, some of these systems have added automatic profiles, which enable a given preset configuration to be selected at particular times, or when a user arrives at or leaves home.

The aim of this project is to develop an extensible platform that integrates signals from a network of sensors, such as cameras, PIR, temperature, humidity, noise, gas and carbon dioxide sensors, among others. The integration of these sensors will create the necessary basis for a smart home or office. Habits, preferences and behaviours are to be learnt and used to create the best comfort conditions while minimising energy waste. A second goal is to integrate the necessary intelligence to detect abnormal and risky situations, such as falls or excessive immobility, and to generate the appropriate alarms.
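
As a minimal illustration of the kind of integration layer envisaged here, the sketch below collects timestamped sensor readings and applies a simple rule to flag excessive immobility; the sensor names, thresholds and class layout are hypothetical.

# A minimal, self-contained sketch of the kind of integration layer envisaged here:
# timestamped sensor readings are ingested and a simple rule flags excessive
# immobility. Sensor names and thresholds are illustrative only.
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Reading:
    sensor: str          # e.g. "pir_living_room", "temp_kitchen" (hypothetical names)
    value: float
    timestamp: datetime

class SensorHub:
    def __init__(self, immobility_limit=timedelta(hours=6)):
        self.last_motion = None
        self.immobility_limit = immobility_limit

    def ingest(self, reading):
        # Any PIR activation counts as evidence of motion.
        if reading.sensor.startswith("pir_") and reading.value > 0:
            self.last_motion = reading.timestamp

    def alarms(self, now):
        if self.last_motion and now - self.last_motion > self.immobility_limit:
            yield "excessive immobility: no motion since %s" % self.last_motion

# Example:
# hub = SensorHub()
# hub.ingest(Reading("pir_living_room", 1.0, datetime(2017, 1, 1, 8, 0)))
# list(hub.alarms(datetime(2017, 1, 1, 20, 0)))   # -> one alarm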


Haptic Interaction for Simulated Micromanipulation

Manipulation of microscopic objects is gaining relevance as techniques evolve and bring the possibility of dealing with smaller and smaller objects. In biology, for example, the possibility of manipulating individual living cells, embryos and stem cells, and of supporting gene therapies, is of the utmost importance. Performing injections in cells, for instance, is a task that requires extensive practice, in particular as it is performed at a scale where the magnitude of the involved forces is very hard to measure, and only a 2D-like visualisation of the task is possible through a microscope. Given this, it is quite normal that in many cases cells burst due to imprecise manipulation, which in turn results from the lack of reliable feedback that would allow the operator to sense the task properly. In this work we intend to develop a system for manipulating cells or other microscopic objects, based on the generation of haptic forces derived from visual cues extracted from the manipulated objects' deformations.
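
As a minimal illustration of such a visual-to-haptic coupling, the sketch below maps a roughly estimated membrane indentation, obtained from the tracked cell contour, to a spring-like reaction force; the stiffness value and the deformation estimate are placeholders, not a validated model.

# A minimal sketch of a possible visual-to-haptic coupling (assumed model):
# a roughly estimated membrane indentation is mapped to a spring-like reaction
# force rendered on the haptic device. The stiffness value and the deformation
# estimate are placeholders.
import numpy as np

STIFFNESS_N_PER_UM = 1e-3   # hypothetical linear stiffness

def estimate_indentation_um(contour_rest, contour_now, pixels_per_um=10.0):
    """Very rough deformation cue: assumes point-wise corresponding contours."""
    return np.linalg.norm(contour_rest - contour_now, axis=1).max() / pixels_per_um

def haptic_force(indentation_um, direction):
    """Spring model F = -k x along the (unit) pushing direction."""
    return -STIFFNESS_N_PER_UM * indentation_um * np.asarray(direction)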


Serious games for therapy of psychological disorders

In recent years, progressive exposure therapy has proven to be very effective in the treatment of psychological disorders. Following this principle, and given the danger that in vivo or in loco exposure sometimes poses to the patient, it is important to develop a Virtual or Augmented Reality framework that allows the same kind of experiences. This proposal focuses on the manipulation of the perceived self, i.e. modifying the perception that users have of their own body. Whether applied to Virtual or Augmented Reality systems, the users must be able to look at themselves (by looking down or through a mirror), see a different body, and feel it as their own. The idea is to exploit the "rubber hand illusion" and extend it to make users believe and feel that they own a new body, not just a different hand. The modified body perception will enable the development of serious games for the therapy of specific disorders, such as phobic disorders.


3D Teleconferencing

This project aims at developing a 3D teleconferencing system that virtually puts two people in front of each other, despite their being in different physical spaces. For this we will explore the use of two Kinect devices to capture not only the scene image but also its depth information from two different viewpoints. This information should be transmitted to a remote site, which will show the 3D scene on a TV screen (possibly 3D-enabled) from an observer-controlled viewpoint. Since the idea is to develop a face-to-face communication system, the viewpoint changes should be equivalent to moving the observer's head in front of the viewed user, as happens in a physical meeting. Consequently, the allowed viewpoint movements are confined to those that keep the virtual camera in the observed user's front (virtual) half-space. This simplifies the problem, as no back views need to be generated. To enable the user to move freely in front of the display and have a 3D perception of the remote user and scene, a tracker must be used to continuously estimate the viewing point.
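
As an illustration of the core geometric step, the sketch below back-projects a Kinect depth image into a point cloud and reprojects it from a virtual, observer-controlled viewpoint; the intrinsic parameters are typical values shown only for illustration.

# A sketch of the core geometric step, assuming a pinhole camera model: back-project
# a Kinect depth image to a point cloud and reproject it from a virtual,
# observer-controlled viewpoint. The intrinsics are typical values used only here.
import numpy as np

FX, FY, CX, CY = 580.0, 580.0, 320.0, 240.0   # illustrative intrinsics (pixels)

def depth_to_points(depth_m):
    """Back-project an HxW depth map (metres) to an Nx3 point cloud in camera coordinates."""
    h, w = depth_m.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth_m
    x = (u - CX) * z / FX
    y = (v - CY) * z / FY
    pts = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return pts[pts[:, 2] > 0]                  # drop invalid (zero-depth) pixels

def reproject(points, R, t):
    """Project the cloud into a virtual camera with pose (R, t) and the same intrinsics."""
    p = points @ R.T + t
    u = FX * p[:, 0] / p[:, 2] + CX
    v = FY * p[:, 1] / p[:, 2] + CY
    return np.stack([u, v], axis=-1), p[:, 2]   # pixel coordinates and depths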
