SAI HEMACHANDRA VEMPRALA

I am currently a PhD candidate at Texas A&M University and a research assistant in the Unmanned Systems Lab, headed by Dr. Srikanth Saripalli. My primary research deals with collaborative localization and navigation for groups of unmanned aerial vehicles using vision. Over the past five years of my PhD, I have also worked briefly on unmanned ground vehicles, humanoid robots, computer vision for intelligent systems, and deep learning. I spent part of my PhD program at Arizona State University as a research assistant in the Autonomous Systems Research Lab, also under Dr. Saripalli. During my PhD years, I spent the summer of 2016 as a robotics intern with Millennium Engineering at NASA Ames Research Center, where I helped develop localization and autonomous navigation pipelines for a quadrotor UAV navigating challenging subterranean, GPS-denied environments.

Prior to my PhD, I received my Master of Science in Electrical Engineering from Arizona State University in 2013. During these years, I was part of the Extreme Environment Robotics Lab (EERL) at ASU, headed by the late Dr. Alberto Behar. As part of the EERL team, I worked on developing software and low-level firmware for the Micro Subglacial Lake Exploration Device (MSLED), an underwater robot for exploring subglacial lakes in Antarctica that was deployed as part of the Whillans Ice Stream Subglacial Access Research Drilling (WISSARD) expedition. In parallel, I developed hardware and software for a remote volcano monitoring system named BENTO, a project for which I was also the team lead. During my master's, I also worked as a Firmware Engineering Intern at Intel Corporation in the summer of 2013, where I designed low-level firmware for Intel's BioSport headset.

My undergraduate degree was in Electrical and Electronics Engineering from JNTU Hyderabad, India, from which I graduated in 2011. During my undergraduate program, I worked on developing algorithms for computational analysis of electrical distribution systems.

Research and Work Experience

  • Research Assistant : Unmanned Systems Lab, Texas A&M University
    Jan 2017 - present

    - Primary PhD research: Collaborative localization and navigation for micro aerial vehicle swarms using vision.
    - Developing a computer vision based cancer tumor tracking system in collaboration with Mayo Clinic Arizona.
    - Winner of the 2018 TAMU Data Science contest: developed predictive models for taxi revenue using existing public domain data from the city of Chicago.

  • Research Assistant : Autonomous Systems Laboratory, ASU
    Jan 2013 - Dec 2016

    - Primary research focused on autonomous navigation of unmanned vehicles, GPS denied localization and planning for micro aerial vehicles.
    - Developed a 'natural motion' framework for a humanoid robot named Baxter.
    - Gained hands-on experience in sensor fusion, ROS based middleware development, embedded systems, and various commercial UAV platforms.

  • Graduate Intern : Millennium Engineering and Integration
    May 2016 - Aug 2016

    - Developed the software architecture for sensing and communication for a quadrotor UAV and developed a framework for estimation and navigation in subterranean GPS denied environments.
    - Worked on developing a 6DOF Simulink model for simulation of a fixed wing UAV.
    - Implemented attitude stabilization on open source autopilot hardware for an experimental unmanned aircraft.

  • Software Engineer Intern : Koolock Inc.
    Jun 2016 - Aug 2016

    Developed image processing software for analyzing multispectral satellite images.

  • Autonomous Systems Engineer : Indigo Drones, Costa Rica
    Apr 2015 - Dec 2015

    Assisted with mission planning, sensor fusion, and data analysis for UAVs carrying imaging sensors for plant health monitoring in large pineapple fields in Costa Rica.

  • Research Aide : Extreme Environment Robotics Laboratory, ASU
    May 2012 - Dec 2013

    - Developed a .NET based ground station software and low-level firmware for an underwater robot (MSLED).
    - Designed sensor fusion pipelines for gas, weather, and seismic sensors; developed embedded firmware and data transmission protocols for sensing and reporting volcanic and seismic activity through satellite communication.

  • Graduate Intern : Intel Corporation
    Jun 2013 - Aug 2013

    - Designed embedded firmware for the biometric headset BioSport.
    - Involved in hardware design, sensor fusion, and signal processing algorithms for BioSport.

Education

  • Texas A&M University (previously at Arizona State University), USA
    2014 - present Doctor of Philosophy

    PhD in Mechanical Engineering

  • Arizona State University, USA
    2012 - 2013 Master of Science

    MS in Electrical Engineering

  • Guru Nanak Engineering College, India
    2007 - 2011 Bachelor of Technology

    B.Tech. in Electrical and Electronics Engineering

Uncertainty-aware Planning for Micro Aerial Vehicle Swarms

In this project, which forms the second part of my PhD thesis, I investigate the idea of collaborative uncertainty-aware path planning for vision based micro aerial vehicles. For vehicles that are equipped with cameras and can localize collaboratively (see below), a heuristic based approach estimates the "quality" of localization achievable from various viewpoints. Evolutionary algorithms are integrated with an RRT based path planning framework to produce plans that allow the vehicles to navigate intelligently towards areas that improve their vision based localization accuracy, such as moving only through well-lit locations or observing texture-rich objects.
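
As a rough illustration of this direction (and not the actual thesis implementation), the Python sketch below biases a bare-bones RRT towards regions with better expected localization by folding a quality term into the parent-selection cost, in place of the evolutionary optimization described above. The world bounds, step size, weights, and the Gaussian "quality map" standing in for a well-lit, texture-rich region are all hypothetical placeholders.

    import math
    import random

    WORLD = (0.0, 100.0)   # hypothetical square world bounds
    STEP = 2.0             # tree extension step size

    def localization_quality(p):
        # Placeholder heuristic: quality is assumed to be highest near a
        # "texture-rich, well-lit" region centered at (70, 70). A real map
        # would be derived from the onboard imagery.
        return math.exp(-((p[0] - 70.0) ** 2 + (p[1] - 70.0) ** 2) / 800.0)

    def steer(a, b):
        # Move from node a towards sample b by at most STEP.
        d = math.hypot(b[0] - a[0], b[1] - a[1])
        t = min(1.0, STEP / d) if d > 0 else 0.0
        return (a[0] + t * (b[0] - a[0]), a[1] + t * (b[1] - a[1]))

    def plan(start, goal, iters=5000, w_quality=5.0):
        nodes = {start: None}  # node -> parent
        for _ in range(iters):
            sample = goal if random.random() < 0.1 else (
                random.uniform(*WORLD), random.uniform(*WORLD))
            # Parent selection: Euclidean distance penalized by poor expected
            # localization, so the tree prefers high-quality viewpoints.
            nearest = min(
                nodes,
                key=lambda n: math.hypot(n[0] - sample[0], n[1] - sample[1])
                + w_quality * (1.0 - localization_quality(n)))
            new = steer(nearest, sample)
            nodes[new] = nearest
            if math.hypot(new[0] - goal[0], new[1] - goal[1]) < STEP:
                path = [new]
                while nodes[path[-1]] is not None:
                    path.append(nodes[path[-1]])
                return list(reversed(path))
        return None

    print(plan((5.0, 5.0), (90.0, 90.0)))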


Collaborative Localization for Micro Aerial Vehicle Swarms

As the first part of my PhD thesis, I am developing a collaborative localization pipeline for a swarm of multirotor aerial vehicles in which each vehicle uses a monocular camera as its primary sensor. Feature detection and matching are performed between the individual views, allowing reconstruction of the surrounding environment, which is then used to localize the moving vehicles within the group. The vehicles are also capable of computing relative poses between each other and occasionally fusing them with their individual pose estimates for enhanced accuracy. GPU acceleration of the feature detection and matching algorithms allows for fast localization.
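
The sketch below shows the basic two-view building blocks mentioned above, written with standard OpenCV calls rather than the GPU-accelerated pipeline used in the project: features are detected and matched between two vehicles' views, and a relative pose (rotation and unit-scale translation) is recovered from the essential matrix. The image file names and the camera intrinsics K are hypothetical placeholders.

    import cv2
    import numpy as np

    # Assumed pinhole intrinsics for both cameras (placeholder values).
    K = np.array([[500.0, 0.0, 320.0],
                  [0.0, 500.0, 240.0],
                  [0.0, 0.0, 1.0]])

    img_a = cv2.imread("vehicle_a_frame.png", cv2.IMREAD_GRAYSCALE)
    img_b = cv2.imread("vehicle_b_frame.png", cv2.IMREAD_GRAYSCALE)

    # Detect and describe features in each vehicle's view.
    orb = cv2.ORB_create(2000)
    kp_a, des_a = orb.detectAndCompute(img_a, None)
    kp_b, des_b = orb.detectAndCompute(img_b, None)

    # Match descriptors between the two views.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des_a, des_b)
    pts_a = np.float32([kp_a[m.queryIdx].pt for m in matches])
    pts_b = np.float32([kp_b[m.trainIdx].pt for m in matches])

    # Estimate the essential matrix with RANSAC and recover the relative pose.
    E, mask = cv2.findEssentialMat(pts_a, pts_b, K, method=cv2.RANSAC, threshold=1.0)
    _, R, t, mask = cv2.recoverPose(E, pts_a, pts_b, K, mask=mask)
    print("Relative rotation:\n", R)
    print("Relative translation (up to scale):\n", t)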


Real Time Cancer Tumor Tracking for Proton Beam Therapy

In collaboration with Mayo Clinic Arizona, I am working on a fully real-time computer vision based tracking system for cancer tumors. The target application is to control a state-of-the-art proton beam targeting system according to tumor motion caused by the patient's breathing cycle and other natural organ motion. Computer vision techniques such as normalized cross correlation and image saliency maps are used to track tiny fiducial markers implanted in the tumors through X-ray fluoroscopy. The tracking method handles high levels of noise and various types of markers while remaining accurate and real time.
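
As a simplified illustration of the normalized cross correlation step only (not the clinical pipeline itself), the sketch below locates a fiducial marker in a single fluoroscopy frame using OpenCV template matching. The frame and template file names are hypothetical placeholders.

    import cv2

    frame = cv2.imread("fluoro_frame.png", cv2.IMREAD_GRAYSCALE)          # fluoroscopy frame
    template = cv2.imread("fiducial_template.png", cv2.IMREAD_GRAYSCALE)  # marker patch

    # Normalized cross-correlation score map; the peak gives the best match.
    scores = cv2.matchTemplate(frame, template, cv2.TM_CCORR_NORMED)
    _, max_val, _, max_loc = cv2.minMaxLoc(scores)

    h, w = template.shape
    center = (max_loc[0] + w // 2, max_loc[1] + h // 2)
    print(f"Fiducial center estimate: {center}, correlation score: {max_val:.3f}")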


Ars Robotica: Robots in Theater

Ars Robotica was a collaboration between artists, theater performers, and roboticists to understand the fluidity and expressiveness of human movement and the possibility of its reproduction on robotic platforms. Using the Rethink Robotics Baxter as a test platform, we worked on defining and achieving human-like movement on the robot. We obtained training data from human performers using sensors ranging from a Microsoft Kinect to a 12-camera OptiTrack system, which we then used to define a vocabulary of human motion primitives. We later expressed complex movements as temporal combinations of these primitives, helping create a framework for autonomous interpretation and expression of human-like motion through Baxter.
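
As a toy illustration of composing movement from a primitive vocabulary (the joint names and trajectories below are hypothetical, not the Ars Robotica data), the sketch strings short joint-angle primitives together in time to form a longer movement.

    import numpy as np

    # Each primitive: an (N x 2) array of [shoulder, elbow] angles over time (radians).
    reach = np.linspace([0.0, 0.0], [1.0, 0.5], 30)
    wave = np.concatenate([np.linspace([1.0, 0.5], [1.0, 1.2], 15),
                           np.linspace([1.0, 1.2], [1.0, 0.5], 15)])
    retract = np.linspace([1.0, 0.5], [0.0, 0.0], 30)

    # A "complex movement" is the primitives played back in sequence; a real
    # system would also time-warp and blend at the seams for smoothness.
    movement = np.concatenate([reach, wave, wave, retract])
    print(movement.shape)  # (120, 2) joint-angle waypoints for the robot arm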


Micro Subglacial Lake Exploration Device

As part of a team at the Extreme Environment Robotics Laboratory, I worked on the development of onboard firmware (Arduino) and ground station software (C#/.NET) for a subglacial lake and aquatic exploration robot called MSLED. MSLED consists of a submersible mothership/explorer combination and uses MEMS sensor and imaging technologies to investigate deep, remote, and chemically challenging aquatic environments. MSLED was deployed successfully twice in Lake McMurdo, Antarctica, as part of the WISSARD program, which played a crucial role in the discovery of subglacial life under the Antarctic ice.


BENTO Volcano Monitor

I led a team of graduate students on a project involving the hardware and software design of expendable "volcano monitor" capsules which monitor and transmit data about rapidly evolving volcanic conditions. The monitors are equipped with a number of sensors (seismic, gas, temperature, etc.) and use a minimal data packaging and transmission protocol over Iridium satellite modems, which allows for real-time compilation and dissemination of scientific data. Volcano monitors were deployed in Nicaragua, Italy, Iceland and Greenland.
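
As an illustration of this style of minimal data packaging (the field layout and values below are hypothetical, not the BENTO format), the sketch packs one sensor reading into a fixed 10-byte binary packet, the kind of compact payload that keeps satellite message sizes, and therefore transmission costs, small.

    import struct
    import time

    # Packet layout: uint32 timestamp, int16 seismic count, uint16 gas (ppm),
    # int16 temperature (0.1 degC units), big-endian = 10 bytes total.
    PACKET_FMT = ">IhHh"

    def pack_reading(seismic, gas_ppm, temp_c):
        return struct.pack(PACKET_FMT, int(time.time()), seismic, gas_ppm,
                           int(round(temp_c * 10)))

    def unpack_reading(payload):
        ts, seismic, gas_ppm, temp_raw = struct.unpack(PACKET_FMT, payload)
        return {"time": ts, "seismic": seismic, "gas_ppm": gas_ppm,
                "temp_c": temp_raw / 10.0}

    pkt = pack_reading(seismic=12, gas_ppm=450, temp_c=23.7)
    print(len(pkt), unpack_reading(pkt))  # 10-byte payload, round-trips cleanly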

Papers

  • Sai Vemprala, Srikanth Saripalli, "Monocular Vision based Collaborative Localization for Micro Aerial Vehicle Swarms", IEEE International Conference on Unmanned Aerial Systems (ICUAS) 2018 (accepted, pending publication).
  • Sai Vemprala, Srikanth Saripalli, "Uncertainty-aware Planning for Vision Based Multirotor Swarms", AHS International's 74th Annual Forum (accepted, pending publication).
  • Sai Vemprala, Srikanth Saripalli, "Vision based Collaborative Path Planning for Micro Aerial Vehicles", IEEE International Conference on Robotics and Automation 2018 (accepted, pending publication).
  • Sai Vemprala, Srikanth Saripalli. "Vision Based Collaborative Localization for MAV Swarms". AHS International's 73rd Annual Forum, 2017.
  • Sai Vemprala, Srikanth Saripalli. "Vision Based Collaborative Localization of Multirotor Vehicles". IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) 2016.
  • Alberto E. Behar, Daming D. Chen, [et al. including Sai H. Vemprala]. MSLED: The Micro Subglacial Lake Exploration Device. Underwater Technology, 33.1, 2015.
  • Sai Vemprala, Srikanth Saripalli. Autonomous exploration and navigation strategies for MAVs. AHS International Specialists' Meeting on Unmanned Rotorcraft Systems, 2015.
  • Ravi Babu P., Kumar M.P.V.V.R., Hemachandra V.S., Vanamali M.P.R. A Novel Power Flow Solution Methodology for Large Radial Distribution Systems. IEEE International Conference on Computational Technologies SIBIRCON 2010.
  • Ravi Babu P., Vanamali M.P.R., Kumar M.P.V.V.R., Hemachandra V.S. Distribution System Network Reconfiguration using L-E Method. Annual IEEE India Conference 2010.

Posters

  • Sai Vemprala, Ian Shelanskey, Matthew Ragan, Lance Gharavi, Srikanth Saripalli. Ars Robotica: A Movement Framework for Robots in Theater. Workshop on Artistically Skilled Robots. Daejeon, Korea, 2016.
  • Sai Vemprala, Srikanth Saripalli. Autonomous Exploration using UAVs. AAAI Conference on Artificial Intelligence 2016. Phoenix, AZ, February 2016.

Patents

  • US 20170034618 A1: System and method for data transmission and power supply capability over an audio jack for mobile devices; Indira Negi, Lakshman Krishnamurthy, Brian K. Vogel, Darren S. Crews, Sai H. Vemprala, Xiaochao Yang, Howard D. Millett, Alexander Essaian, Alanson P. Sample (Intel Corp.)
  • US 20160192039 A1: System and method for device action and configuration based on user context detection from sensors in peripheral devices; Indira Negi, Lakshman Krishnamurthy, Fuad Al-Amin, Xiaochao Yang, Brian K. Vogel, Jun Li, Alexander Essaian, Sai H. Vemprala, Donnie H. Kim, Lama Nachman, Haibin Liu (Intel Corp.)

Other presentations

  • Sai Vemprala. Sampling based Path Planning for Unmanned Aerial Vehicles. IROS 2017 Workshop on Complex Collaborative Systems. Vancouver, Canada, 2017.
  • Sai Vemprala, Srikanth Saripalli. Vision based MAV Swarms in a Photorealistic Simulation Framework. 1st International Symposium on Aerial Robotics. Philadelphia, USA, 2017.
  • Andres Mora, Sai Vemprala, Adrian Carrio, Srikanth Saripalli. Flight performance assessment of land surveying trajectories for multiple UAV platforms. Workshop on Research, Education and Development of Unmanned Aerial Systems, RED-UAS 2015. Cancun, Mexico, 2015.
  • Lance Gharavi, Srikanth Saripalli, Sai Vemprala, Matthew Ragan, Ian Shelanskey. Ars Robotica. Exemplar project at the a2ru National Conference, Virginia, USA, 2015.
  • Alberto Behar, Sai Vemprala. BENTO: Volcanic monitoring. Deep Carbon Observatory Sandpit Workshop on Gas Instrumentation. Sicily, Italy, 2013.

Office Address

ENPH Rm 421, 180 Spence St, College Station, TX - 77840

Connect on LinkedIn

GitHub

Email

svemprala (at) tamu [dot] edu

Copyright (C) Sai Vemprala