Google Summer of Code (GSoC) 2015 Projects

Below is the list of the project proposals accepted for GSoC 2015. More information on each project can be found in the detailed descriptions below.

2015 Accepted Project Proposals

1- Utilization of the EUROPA Scheduler/Planner for the V-ERAS EVA Missions planning

2- Virtual Reality based Telerobotics

3- Enhancement of Kinect integration in V-ERAS

4- Development of a Monitoring/Alarm Front-End for the ERAS Station

1- Utilization of the EUROPA Scheduler/Planner for the V-ERAS EVA Missions planning

Shridhar Mishra

Proposal Abstract

  • The main aim of this project is to build an Astronaut’s Digital Assistant that takes into account all the constraints and rules that have been defined and produces a plan of action. It will also schedule all the tasks for the astronaut so that the astronaut’s job becomes easier.
  • It may also take the astronaut’s health data from the health database and determine whether the astronaut is fit enough to perform a certain job (such as a spacewalk or a repair) based on the already programmed constraints. It may also notify other crew members if a crew member’s situation is critical and help is needed.

Time Schedule

This week-by-week timeline provides a rough guideline of how the project will be carried out.

Community Bonding Period (April 27 – May 25)

  • Start coding in advance, due to exams in July.
  • Study the existing documentation on the PyEUROPA and Health Monitor projects done by previous years’ GSoC students.

  • Research and discuss different ways to implement the code.

  • Lay down rough designs on paper and discuss them with the mentor and the community.

2.5 Weeks (May 26 – June 5)
  • Refine schedule with the mentor.
  • Discuss the constraints to be added.
  • Discuss rules that govern the constraints.
2.5 Weeks (June 6 – June 23)
  • Implement the rules discussed in the previous week.
  • Implement rules regarding all the external factors like spares, fuel, and manpower.
Objective: Complete the work regarding all the external environment factors.
2 days (24-25 June): Prepare the code for the mid-term evaluation.
June 26: Midterm evaluation
3 Weeks (July 1 – July 22)
  • Discuss crew health-related automations.
  • Implement crew health-related automations. For example: if the heart rate is above 120 bpm (say) and the body temperature is above 99 °F, abort the operation (see the sketch after this list).
  • If time permits, develop a front end for the whole Astronaut’s Digital Assistant.
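As an illustration of the kind of health-based rule mentioned above, here is a minimal Python sketch; the thresholds, field names and notification mechanism are assumptions for illustration, not part of the existing health monitor or of EUROPA.

```python
# Illustrative rule check only: thresholds and data layout are assumptions.

HEART_RATE_LIMIT = 120   # beats per minute
BODY_TEMP_LIMIT = 99.0   # degrees Fahrenheit

def is_fit_for_task(vitals):
    """Return True if the crew member may continue the current task.

    `vitals` is assumed to be a dict such as {"heart_rate": 96, "body_temp": 98.4},
    e.g. as it could be read from the health database.
    """
    return (vitals.get("heart_rate", 0) <= HEART_RATE_LIMIT and
            vitals.get("body_temp", 0.0) <= BODY_TEMP_LIMIT)

def check_crew_member(vitals, notify):
    """Abort the operation and notify the crew when the limits are exceeded."""
    if not is_fit_for_task(vitals):
        notify("Vitals out of range: abort operation and alert the crew")
        return "ABORT"
    return "CONTINUE"

def console_notify(message):
    print(message)

# Example usage:
status = check_crew_member({"heart_rate": 135, "body_temp": 99.5}, console_notify)
```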
1 Week (July 23 – July 31)
  • Fix any code blocks that do not follow the specified coding style.

  • Write integration tests

  • Test the code more thoroughly and work on the documentation.

2 Weeks (Aug 1 – Aug 14) Buffer period for any delay in the project due to my exams or any other unforeseen reason.
Last Week (Aug 14 – Aug 20) Review and complete the final documentation. Look for any errors in the code. Check for coding-style and indentation errors and make the final commits.

Benefits to ERAS

The main benefit to ERAS will be extended support from PyEUROPA and better planning for future missions, since all missions will be evaluated by the computer and their feasibility assessed in advance.

Deliverables

On completion of the project we expect to have planning and scheduling based on external factors such as crew members’ health statistics and the availability of fuel and other resources such as spares.

2- Virtual Reality based Telerobotics

Siddhant Shrivastava

Proposal Abstract

Virtual European Mars Analog Station (V-ERAS) is based on immersive real-time environment simulations running on top of the Blender Game Engine (BGE). This V-ERAS project has two distinct components. First, it entails the teleoperative control of a Husky Robot rover’s motion via human body-tracking. Second, it involves streaming the 3-D camera video feed from the rover to BGE over the network and processing it into an Augmented Reality experience through a head-mounted Virtual Reality device. The goal of the project is thus to develop a software and hardware system that enhances the capabilities of the crew members preparing for Mars missions.

Project Timeline

List of terms

  • BGE – Blender Game Engine
  • C3 – Command, Control, Communication
  • HUD – Heads-Up Display – A display within the viewer’s line of sight which presents information. The head-mounted VR device acts as the HUD medium for this project
  • ROS – Robot Operating System – a collection of tools, libraries, and conventions that aim to simplify the task of creating complex and robust robot behavior across a wide variety of robotic platforms.
  • RTSP – Real-time Streaming Protocol
  • Telerobotics – control of semi-autonomous robots from a distance
  • VR – Virtual Reality
  • V-ERAS – Virtual European Mars Analog Station

Community Bonding Period (Apr 27 – May 25)

  • Continue to engage with the ERAS community (via IRC and Google Group)
  • Set up workstation according to project requirements
  • Explore Clearpath Robotics’ Husky rover for simulation in ROS and Blender
  • Understand certain components of the code-base (Vision, Avatar Control, Body Tracking, Communication, Virtual Reality, HUD modules) in further detail
  • Learn advanced Blender concepts
  • Start working with the devices required for the project (Stereo Camera, Oculus VR device).
  • Create the first version of the Software Architecture Document

12-week timeline (May 25 – Aug 24)

  • Week 1-2 (May 25 – June 8)
    Teleoperative control of simulated Husky rover model

    • Week 1 (25 May – 1 June)
      Create a basic interface for mapping Kinect body-tracking information to teleoperation commands
    • Week 2 (1 June – 8 June)
      Write a Tango device server to act as a Publisher/Subscriber ROS node in order to communicate with the Husky rover model.
      Unit Tests. First bi-weekly report.
  • Week 3-4 (8 June – 22 June)
    Drive a real Husky model with the Telerobotics module

    • Week 3 (8 June – 15 June)
      Employ the generic gesture-control interface, developed in parallel in another project, for the Telerobotics module
    • Week 4 (15 June – 22 June)
      Extend the Teleoperative control to a real Husky mobile robot. Second bi-weekly report
  • Week 5 (22 June – 29 June)
    Real-time Streaming from stereo camera to the V-ERAS application and Oculus

    • Field Tests for the developed modules
    • Integrate 3-D camera stream from the Minoru 3-D webcam with Blender and the Oculus VR Device
    • Configure high-performance ffmpeg servers to communicate video streams for different Quality of Service (QoS) requirements
  • Week 6 (29 June – 6 July)
    Buffer Week

    • Visualize the stereo camera streams in the V-ERAS application.
    • Field tests continued for the developed modules
    • Unit Tests for body-tracking Husky rover
    • Performance evaluation of Minoru 3-D camera, ROS, BGE, and Oculus working together in V-ERAS
    • Commit changes to V-ERAS
    • Third bi-weekly report + mid-term evaluation
  • Week 7 (6 July – 13 July)
    Oculus Integration with the stereo camera stream

    • Extend the existing Oculus-Blender interface to display and update the incoming stereo video stream
  • Week 8-10 (13 July – 3 August)
    Augmented Reality experience through a Heads-Up Display(HUD) for Oculus Rift using the Blender Game Engine

    • Week 8 (13 July – 20 July)
      Use the positional-tracking feature of the Oculus VR DK2 to set the rover camera angle. Complete any remaining parts of the teleoperative control of the Husky. Fourth bi-weekly report.
    • Week 9 (20 July – 27 July)
      Integrate Augmented Reality with the work done in weeks 1-6. Commit changes to V-ERAS/HUD
    • Week 10 (27 July – 3 August)
      Develop a generic HUD API for any future application to use. Fifth Bi-weekly report.
  • Week 11-12 (3 August – 17 August)
    Code cleaning, Testing, Documentation, Analysis, Commit, Polish existing functionalities

    • Week 11 (3 August – 10 August): Network Performance Analysis, PEP8 compliance
    • Week 12 (10 August – 17 August): Integration Tests, Documentation. Final commits and merging. Final report
  • Firm Pencils-down deadline:
    Submit code-samples to the Melange application.

Project Detailed Description

  • Visualize video stream from stereo camera mounted on the Rover to the Oculus VR device
    • Hardware Requirements
      • Raspberry Pi Model B
      • Minoru 3D webcam
      • Wireless Network connection
      • High-performance computing node to support the V-ERAS Blender Game Engine application
    • Current State of the Vision module
      • The CamManager module captures the incoming image stream from the left and right stereo channels and passes it to the Cameo Tango device server, which sends the images over the Tango bus to the WindowManager, ObjectTracking manager, ObjectRecognition manager, and DepthTracking manager. The inferences from this are passed to the scheduling and inference engine EUROPA
    • Proposed Software Architecture
      • The real-time video stream is fed from the input camera device and streamed using an FFmpeg server, separately from the existing Tango Controls setup for the image stream. An FFmpeg client node handles this through the Blender Game Engine (a sketch of this pipeline appears after this list).
      • The Image stream is extracted by the Raspberry Pi mounted on the Rover and communicated separately to the Tango bus for the Vision subsystem on V-ERAS.
      • For hard real-time support, protocols like RTSP will also be considered in addition to the existing HTTP support for streaming live data from the rover.
      • Network Performance analysis will be done using Wireshark.
  • Teleoperation of Husky Rover using Body-Tracking
    • Current Stage
      • Body Tracking – Kinect and its associated libraries (OpenNI, and the Microsoft Kinect SDK) are successful in extracting the vital information about a human body’s movements.
      • Avatar Control is specific to the animated human avatar
      • Teleoperation is essential for robot rovers because controlling the robot requires both human intervention and automation.
      • Currently, V-ERAS has basic Avatar Control support within BGE which needs to be extended for different mobile objects.
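The streaming pipeline proposed above could be prototyped with a thin Python wrapper around FFmpeg. A minimal sketch, assuming a Video4Linux2 camera on the Raspberry Pi; the device path, host and port are placeholders rather than the actual deployment values:

```python
# Sketch only: capture one camera channel on the Raspberry Pi and push it to
# the V-ERAS node as an MPEG-TS stream over UDP. Paths and addresses are
# placeholders.

import subprocess

CAMERA_DEVICE = "/dev/video0"            # one channel of the Minoru (assumed)
RECEIVER_URL = "udp://veras-node:5000"   # BGE-side FFmpeg client (assumed)

def start_stream():
    """Launch an ffmpeg process that streams the camera over the network."""
    command = [
        "ffmpeg",
        "-f", "v4l2",        # Video4Linux2 capture
        "-i", CAMERA_DEVICE,
        "-f", "mpegts",      # container suitable for low-latency streaming
        RECEIVER_URL,
    ]
    return subprocess.Popen(command)

if __name__ == "__main__":
    start_stream().wait()
```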

Real-Life Architecture Diagram

  • Proposed architecture
    • Write Tango Publisher/Subscriber nodes for communication with ROS components
    • Motion Authoring via body-tracking – create a well defined Blender model rig and supporting scripts that allows different objects to register motion information.
    • Motion playback via the Robotic Rover – create a system that allows the digital avatar to take commands from the Teleoperation engine and display various motions on the Mars surface via some possible channels (ROS, Tango). This playback happens in real-time.
    • Simulate rover in Blender via rViz, rospy, and the associated robot model.
    • Teleoperate the Husky rover using body-tracking information (see the sketch after the diagram below).
    • Design a Tango device server for body-tracking
    • A ROS-and-Blender based integration example which inspires my thought process can be found here.
  • The architecture that I have in mind for the simulation of body-tracking-based teleoperation is shown below:

Architecture diagram for the simulated system
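To make the proposed Tango-to-ROS bridge concrete, below is a minimal rospy sketch that turns body-tracking gestures into velocity commands for the Husky. The gesture labels, the Tango read function and the topic name are illustrative assumptions, not the final Telerobotics module.

```python
# Sketch only: map body-tracking gestures (assumed to be available from the
# Tango bus) to geometry_msgs/Twist commands for the Husky rover.

import rospy
from geometry_msgs.msg import Twist

def read_gesture_from_tango():
    """Placeholder: the real module would query the Tango body-tracking device."""
    return "lean_forward"

def gesture_to_twist(gesture):
    """Map a simple gesture label to a velocity command (assumed mapping)."""
    cmd = Twist()
    if gesture == "lean_forward":
        cmd.linear.x = 0.5        # forward at 0.5 m/s
    elif gesture == "lean_back":
        cmd.linear.x = -0.5
    elif gesture == "lean_left":
        cmd.angular.z = 0.5       # turn counter-clockwise
    elif gesture == "lean_right":
        cmd.angular.z = -0.5
    return cmd

def main():
    rospy.init_node("telerobotics_bridge")
    # Topic name assumed; the simulated or real Husky may expose a different one.
    pub = rospy.Publisher("/husky_velocity_controller/cmd_vel", Twist, queue_size=10)
    rate = rospy.Rate(10)  # publish commands at 10 Hz
    while not rospy.is_shutdown():
        pub.publish(gesture_to_twist(read_gesture_from_tango()))
        rate.sleep()

if __name__ == "__main__":
    main()
```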

Benefits to ERAS

This project would benefit ERAS on several important fronts.

  • The solution of creating an Augmented Reality and Teleoperation experience is a major step forward in the existing state of the project. It will make the V-ERAS system more intuitive by immersing the crew in the Martian environment via the Oculus VR device. A generic library for object-control with motion-tracking is an excellent fit for different object utilities (robot rovers, human avatars, manipulator arms).
  • It will also improve the efficiency of data communication and real-time feedback, which are primary requirements in such a life-critical application.

Deliverables

  • Real-time Streaming Video Server
  • Stereo camera stream aggregator
  • BGE support for 3-D camera stream visualization
  • Teleoperation module based on Body-Tracking
  • ROS Nodes for driving a Husky rover based on the telerobotics module
  • Generic HUD (Heads-Up Display) Augmented Reality support for all V-ERAS modules
  • Blender scenes for supporting the above modules

3- Enhancement of Kinect integration in V-ERAS

Vito Gentile

Summary

The available immersive virtual reality simulation of the ERAS Station (V-ERAS) allows users to interact with a simulated Martian environment using Aldebran VSS Motivity, the Oculus Rift and Microsoft Kinect. However, the integration of the latter technology is still not complete, and the goal of this project is to enhance it in order to:

  • increase the manageability of multiple Kinects: a GUI is available, but it is written in C# and should be translated into Python. Furthermore, visualization of users’ skeletal joints may be included in the GUI;
  • improve user navigation: Motivity is a passive omnidirectional treadmill, so users’ steps do not produce real displacement. As a consequence, the system needs to include a valid and robust algorithm for estimating users’ steps, to be reproduced in the virtual reality simulations;
  • reproduce users’ movements in real time: at the moment, the main reasons for the system’s non-real-time responses are not clear. Some of the issues might lie in the integration of Kinect with the system, and further investigation is needed. If this investigation reveals that the non-real-time responses are due to an improper Kinect integration, such problems should be solved;
  • reduce data transfer latency by enhancing Tango integration: data taken from Kinect are sent to the Tango bus in a non-straightforward way, because the tracker is written in C# while the libraries used to interact with Tango are in Python. One of the aims of this project is to test the reliability of PyKinect, a framework that allows the use of Python to interact with Kinect through the Microsoft API on Windows. If the adoption of PyKinect fails, the use of Boost.Python can help. As an alternative to the C# API, Microsoft also provides a C++ implementation of the same libraries, so Python may still be used to implement the tracker;
  • integrate touch-less gestural interaction support: touch-less interaction can be integrated in an actual analog station (e.g. to control some system parameters, similar to what has been done in many domotic houses with lighting or temperature levels), or it can be used only for simulation purposes (e.g. for simulating interactions with virtual objects). Teleoperation of robots and ATVs, as well as interaction with gesture-based GUIs, are some of the useful applications that could be included in ERAS. One of the aims of this project is to integrate a touch-less gesture recognition module; if the chosen algorithms depend only on skeletal joints, they can be implemented in Python and simply get data from the Tango bus (see the sketch below). However, more sophisticated gesture recognition algorithms may require some of the facilities provided with the Microsoft SDK, which might be difficult to translate into Python, so an investigation in this direction will be needed.
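As an example of the joint-only kind of gesture recognition that could run on the Python side of the Tango bus, here is a small sketch; the joint names and the coordinate convention are assumptions about the tracker output.

```python
# Sketch only: a "raised right hand" gesture computed from skeletal joints.

def hand_above_head(joints):
    """Return True when the right hand is above the head.

    `joints` is assumed to map joint names to (x, y, z) tuples, with y pointing
    upwards, as they could be published on the Tango bus by the skeletal tracker.
    """
    return joints["right_hand"][1] > joints["head"][1]

# Example with a fake frame of joint data:
frame = {"head": (0.0, 1.60, 2.0), "right_hand": (0.3, 1.75, 1.9)}
if hand_above_head(frame):
    print("Gesture recognized: e.g. raise the lighting level")
```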

System architecture

A scheme of how the system architecture should be at the end of the project is available at this link, and shown below:

V-ERAS System Architecture

A single main server will manage all four Kinects and send skeletal joints on the Tango bus. On the other hand, each ERAS station can read skeletal joints from the bus and use them in the Blender Game Engine.

Before V-ERAS 14, each station had its own Kinect directly plugged in, while now a single station can manage all the Kinects and send skeletal joints to the Tango bus from a single computer. In fact, construction of the aforementioned architecture was started during V-ERAS 14, and it particularly needs to be refined and tested in the data-transfer phase from the skeletal tracker (currently written in C#) to the Tango bus (which uses the Python API).

V-ERAS Tracker Behavior

The previous picture (also available at this link) shows how the skeletal joints are used by the system. The skeletal tracker gets them and saves them in a JSON file, which is then read by a simple Python script and sent to the Tango bus. One of the goals of this project is to simplify this by merging the skeletal tracker and the Python script, removing the need for an intermediate file; a possible shape of the merged script is sketched below.
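A minimal sketch of what the merged tracker could look like, assuming the joints are pushed to an attribute of a Tango device through PyTango; the device name, attribute name, joint format and the PyKinect read function are illustrative assumptions (the tracker could equally be implemented as a Tango device server).

```python
# Sketch only: send skeletal joints straight to the Tango bus, with no
# intermediate JSON file. Names and data layout are placeholders.

import time
import PyTango

TRACKER_DEVICE = "eras/kinect/tracker01"   # placeholder Tango device name
JOINTS_ATTRIBUTE = "skeleton_joints"       # placeholder spectrum attribute

def read_joints_from_kinect():
    """Placeholder: the real implementation would use PyKinect on Windows."""
    return [(0.0, 1.6, 2.0), (0.3, 1.2, 2.0)]

def publish_joints(proxy, joints):
    """Flatten the (x, y, z) joint tuples and write them to the Tango attribute."""
    flat = [coordinate for joint in joints for coordinate in joint]
    proxy.write_attribute(JOINTS_ATTRIBUTE, flat)

def main():
    proxy = PyTango.DeviceProxy(TRACKER_DEVICE)
    while True:
        publish_joints(proxy, read_joints_from_kinect())
        time.sleep(1.0 / 30)   # roughly the Kinect frame rate

if __name__ == "__main__":
    main()
```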

Project Timeline

  • 27 April – 24 May (before the official coding time)
    Familiarize myself with the code and the version control system, study the documentation guidelines, and further investigate the improvements to be made, together with the mentors.
  • 25 May – 10 June
    Translate the Kinect skeletal tracker from C# to Python, using PyKinect. Rewrite the GUI in Python and integrate a visualization of users’ skeletal joints in it. Verify the manageability of four simultaneously working Kinects.
  • 11 June – 25 June
    Determine the algorithm to be used to estimate users’ steps, and then implement a better solution to improve user navigation (a toy example of such an estimator is sketched after this timeline).
  • 26 June – 3 July (mid-term evaluation)
    Test the produced software, document it and prepare for the mid-term evaluation.
  • 4 July – 19 July
    Improve Tango integration, by merging the PyKinect-based tracker with the tracker.py script (the one used to send data to the Tango bus). Then, verify if this improvement affects avatars’ movements and allows real-time execution.
  • 20 July – 10 August
    Implement touch-less gestural interaction support, and evaluate if it can be written in Python and simply based on skeletal joints. If not, investigate how to integrate the Microsoft SDK facilities to simplify the recognition.
  • 11 August – 28 August
    Write and improve documentation, revise the code, fix bugs and prepare for the final submission.
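As a toy illustration of the step-estimation task scheduled for 11–25 June, the sketch below counts a step whenever the vertical separation of the feet crosses a threshold; the threshold and the joint convention are assumptions, and the actual algorithm will be chosen with the mentors.

```python
# Sketch only: naive step counter based on the vertical separation of the feet.

STEP_THRESHOLD = 0.08   # metres; assumed value, to be tuned on real data

class StepEstimator(object):
    def __init__(self):
        self.feet_apart = False
        self.steps = 0

    def update(self, left_foot_y, right_foot_y):
        """Feed one frame of foot heights; return the running step count."""
        apart = abs(left_foot_y - right_foot_y) > STEP_THRESHOLD
        if apart and not self.feet_apart:
            self.steps += 1          # rising edge: a new step has started
        self.feet_apart = apart
        return self.steps

# Example usage with two frames of (left, right) foot heights:
estimator = StepEstimator()
for left_y, right_y in [(0.10, 0.10), (0.22, 0.10)]:
    count = estimator.update(left_y, right_y)
```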

Benefits to ERAS

The preliminary part of ERAS is Virtual ERAS (or V-ERAS), and the first tests of this platform (carried out last December in Madonna di Campiglio, Italy) have shown that most of the issues were related to skeletal tracking and its integration with the whole system. Furthermore, user navigation is still not reliable, and there is too much lag between users’ actual movements and their representation in the virtual environment. My project primarily aims to solve these kinds of issues.

Once user navigation becomes reliable, the second part of the project will start: improving V-ERAS with support for touch-less gesture management. This may open several new perspectives, all related to the possibility of interacting via gestures with virtual objects, or of controlling system parameters of the station (e.g. lighting, temperature level, etc.).

Deliverables

The final output of my project will be composed of:

  • an improved module for skeletal tracking (an extension of the current body_tracker_ms and body_tracker packages), which will include integration with the Tango bus, a GUI to manage multiple Kinects and visualize their output, and the system to get Kinect data and estimate user navigation;
  • a new module (it has still to be decided whether it will be integrated into the previous one) to manage touch-less gestures using Kinect data;
  • documentation, to describe and comment the produced code.

4- Development of a Monitoring/Alarm Front-End for the ERAS Station

Ambar Mehrotra

Proposal Abstract

  • This project aims at developing a generic top-level monitoring/alarm interface for the ERAS Habitat that would be able to manage all the relevant information.
  • Once the skeleton of the top-level GUI is in place, the aim will shift towards designing a popup / sub-GUI for the health monitor, where different data from V-ERAS will be shown according to the user’s needs. This mainly means showing data on a preferential basis rather than showing all the data at once.
  • The aim is also to enable the GUI to properly interact and interface with the Tango server, and to leverage the functionality of the existing Tango Alarm Systems / PyAlarm for notifying the user about specific events.
  • A major portion of the project will aim at making the GUI highly customizable and easy to modify for additional data channels as and when there is a need to do so.

Proposal Detailed Description

Habitat Monitoring GUI Integration

  • The Habitat Monitoring GUI will fit inside the Habitat Monitoring Client, which is directly interfaced with the Tango System Bus. The GUI will be able to directly coordinate with the various servers interfaced with the Tango System Bus using the PANIC API, which will serve as the main point of control and coordination among the various components of the Tango Alarm Systems.
  • In some cases resources are restricted and a device cannot be allowed to generate an alarm itself, so Tango’s alarm system cannot be used directly. In such cases, collector devices will gather the data from those devices and generate the alarms on their behalf. The number of collector devices is still unknown. The user will be able to add the alarm sources via a configuration file.
  • The Tango Database contains sets of rules that are permanently checked by a central daemon, the Tango Alarm Server. This server logs alarm changes and triggers actions if and when required. The rules are combinations of boolean operators and Tango Attribute values.
  • The archiving database monitors thousands of parameters essential to the functioning of the entire system. It keeps track of these parameters so as to easily manage large volumes of technical data and take the required action whenever necessary. By regularly monitoring the data a fault may be detected even before an alarm is triggered and a lot of resources can be saved.
  • The Health Monitor GUI will be implemented as a sub-GUI in the Habitat Monitoring GUI and will be used for showing the health status and other statistics of various astronauts.
  • The Aouda.X communication server will keep track of the various parameters coming from the Aouda.X spacesuits. The Health Monitoring GUI will request the biometric data from the Aouda.X server for the various Aouda.X spacesuits and will show them to the user according to the user’s choice. The main functionality of the Health Monitor Server will remain the same with few changes, so the detailed description can be found in the ERAS project docs (Link to Health Monitor).
  • The basic aim of the project is to develop a top-level alarm/monitoring interface for the ERAS Habitat that will be able to manage all the relevant information.

  • The interface will provide the user the ability to interact with the remote instruments through integration with the Tango Alarm System / PyAlarm. The project will leverage the specific development already done particularly on Tango Alarm Systems.

  • The GUI will be able to integrate the various plots coming in from the various sensors and display them appropriately.
  • The main aim is to make the GUI highly customizable. The user should be able to add more data channels easily as and when required.

    The user will be presented with a list of channels that he may be allowed to add according to a configuration file or from a database.

    Another way of adding a data channel can be by asking the user to supply the address of the device and then asking the device itself for the rest of the information.

  • After the basic skeleton of the top-level monitoring system is ready, I will move on to integrating the Health Monitor into the GUI.
  • The Health Monitor GUI will be made as a popup or a sub-GUI inside the top-level monitoring system. It will be dedicated to providing information about the health of the astronauts performing EVA through biosensors (ECG, air-flow sensor, etc.). It can use the existing Health Monitor as a starting point and then make some adjustments.
  • Choice of suitable library:
    • When it comes to the choice of library, there are two main candidates. One is PyQt, the Python wrapper for the well-known C++ Qt library. The other is wxPython, a wrapper for the C++ wxWidgets library. Yet another option is PySide, which is essentially the same as PyQt but with more permissive licensing.
    • Both PyQt and wxPython are considered good for writing GUI apps in Python, but in my opinion PyQt and PySide have more community support than wxPython. They also offer a signal/slot mechanism, which should work well with PyAlarm since we want to catch a signal and react to it (see the sketch at the end of this list).
    • Both PyQt and PySide can use the well-known Qt Designer to design .ui files, import them and use them. wxPython also has a GUI designer, wxGlade, but going through the user reviews it turns out that Qt Designer is much better.
    • A plus for wxPython, though, is that its documentation and tutorials are better than PyQt’s. Also, the original health monitor is written using wxPython, so that would serve as a good starting point. But the syntax of the two libraries is not very different, so it should not be too difficult to port from one to the other.
    • A simple example of drag and drop in wxPython:
    • Drag Drop
    • The sample code for this can be found at this link on ZetCode. This kind of feature can be added to the GUI to make it more customizable according to the user’s needs.
    • Similar code can also be written using PyQt without much difficulty.
    • The final decision will be taken after discussion with my mentor.
    • A third library choice is Kivy, an open-source library for building cross-platform applications. It also provides support for touch-sensitive interfaces and has fairly good documentation and a good community.
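To illustrate the signal/slot mechanism mentioned above, here is a minimal PyQt sketch in which a stand-in alarm source emits a signal and the monitoring GUI reacts by updating a label; the class names and the alarm message are illustrative, and in the real front-end the notifications would come from PyAlarm / the Tango Alarm Systems.

```python
# Sketch only: PyQt signal/slot wiring between an alarm source and the GUI.

import sys
from PyQt4 import QtCore, QtGui

class AlarmSource(QtCore.QObject):
    """Stand-in for a PyAlarm notification source; emits the alarm text."""
    alarm_raised = QtCore.pyqtSignal(str)

class MonitorWindow(QtGui.QWidget):
    def __init__(self, source):
        super(MonitorWindow, self).__init__()
        self.label = QtGui.QLabel("No active alarms", self)
        layout = QtGui.QVBoxLayout(self)
        layout.addWidget(self.label)
        source.alarm_raised.connect(self.show_alarm)   # signal -> slot

    def show_alarm(self, message):
        self.label.setText("ALARM: " + message)

if __name__ == "__main__":
    app = QtGui.QApplication(sys.argv)
    source = AlarmSource()
    window = MonitorWindow(source)
    window.show()
    source.alarm_raised.emit("Habitat CO2 level above threshold")  # demo event
    sys.exit(app.exec_())
```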

Timeline

  • Building the skeleton for the GUI.
  • Since this is the main portion of the project and also the main need, I will be spending most of my time developing this, and will give the rest of the time to the Health Monitor portion. This will involve work on the following areas:
    • Allowing the GUI to add additional data channels easily (3 weeks)
      • Either of the two approaches mentioned in the details section, or a combination of both, will be used to add additional data channels (an example configuration is sketched after this timeline).
    • Integration with the Tango Alarm Systems / PyAlarm (3 weeks)
      • This is required to inform the GUI about specific updates and allow the remote system to connect to the monitoring system to relay some information. I will be going through several papers illustrating how to leverage the functionality of the Tango Alarm Systems / PyAlarm.
    • Integrate the monitoring system with the plots coming from the biometric-device vendor software, using a generic mechanism (2 weeks)
    • Development of the Health Monitoring module (2 weeks): The Health Monitoring System will be a part of the top-level monitoring interface and will be implemented as a sub-GUI. There is not much to be changed here and most of the functionality can be leveraged from the existing Health Monitor (Link to Health Monitor)
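As a sketch of the configuration-file approach for adding data channels (the first timeline item above), the GUI could read a list of allowed channels at start-up; the file format, keys and attribute addresses below are illustrative assumptions.

```python
# Sketch only: load the list of data channels the user may add to the GUI.

import json

EXAMPLE_CONFIG = """
[
    {"label": "Habitat temperature", "attribute": "eras/habitat/env/temperature", "unit": "C"},
    {"label": "Oxygen level",        "attribute": "eras/habitat/env/oxygen",      "unit": "%"}
]
"""

def load_channels(text):
    """Return the channel descriptions defined in the configuration text."""
    return json.loads(text)

for channel in load_channels(EXAMPLE_CONFIG):
    print("%(label)s -> %(attribute)s [%(unit)s]" % channel)
```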

Benefits to ERAS

  • After the project is complete there will be a top-level monitoring / alarm system that will be able to manage all the relevant information.
  • The GUI will be integrated with the Tango Alarm Systems for event driven updates. This will make the communication and monitoring of remote systems very easy.
  • The user will be able to add more data channels easily.
  • The user will also be able to customize the GUI as per the needs.
  • There will also be a separate Health Monitoring System which will be able to get health information from the astronauts on EVA and display it on screen. This will help in monitoring their health as well as adding safety measures.

Deliverables

  • Habitat Monitoring GUI
  • Health Monitoring GUI

The final output will be a top-level monitoring interface for the ERAS Habitat that will be integrated with the Tango Alarm Systems and will be able to manage all the relevant information via a highly customizable GUI. It will also include a sub-section specifically dedicated to the health monitor, which will take health-related data of the astronauts on EVA and present it in a manner specified by the user.