SAI HEMACHANDRA VEMPRALA

I am currently a researcher in the Autonomous Systems group at Microsoft, where my research focuses on autonomy for aerial robotics, localization/planning and learning algorithms.

In 2019, I received my PhD from Texas A&M University, where I was part of the Unmanned Systems Lab headed by Dr. Srikanth Saripalli. My PhD thesis was on developing algorithms for collaborative localization and navigation for groups of vision-based unmanned aerial vehicles. My primary research interests are aerial robotics, localization/planning, computer vision and machine learning. During my PhD years, I spent the summer of 2016 as a robotics intern with Millennium Engineering at the NASA Ames Research Center, where I helped develop localization and autonomous navigation pipelines for a quadrotor UAV operating in challenging subterranean, GPS-denied environments.

Prior to my PhD, I received my Master of Science in Electrical Engineering from Arizona State University in 2013. During this time, I was part of the Extreme Environment Robotics Lab (EERL) at ASU, headed by the late Dr. Alberto Behar. As part of the EERL team, I developed software and low-level firmware for the Micro Subglacial Lake Exploration Device (MSLED), an underwater robot for exploring subglacial lakes in Antarctica that was deployed as part of the Whillans Ice Stream Subglacial Access Research Drilling (WISSARD) expedition. I was also the team lead for the BENTO project, which involved developing hardware and software for a remote volcano monitoring system. During my master's, I also worked as a Firmware Engineering Intern at Intel Corporation in the summer of 2013, where I developed embedded firmware for the Intel BioSport headset.

My undergraduate degree was in Electrical and Electronics Engineering from JNTU Hyderabad, India, from which I graduated in 2011. During my undergraduate program, I worked on developing algorithms for computational analysis of electrical distribution systems.

Research and Work Experience

  • Researcher : Microsoft Corporation
    July 2019 - present

    - Researcher in the Autonomous Systems group of the AI & Research division.

  • Research Assistant : Unmanned Systems Lab, Texas A&M University
    Jan 2017 - May 2019

    - Primary PhD research: Collaborative localization and navigation for micro aerial vehicle swarms using vision.
    - Developed a computer vision-based cancer tumor tracking system in collaboration with Mayo Clinic Arizona.
    - Winner of the 2018 TAMU Data Science contest: developed predictive models for taxi revenue using existing public domain data from the city of Chicago.

  • Research Assistant : Autonomous Systems Laboratory, ASU
    Jan 2013 - Dec 2016

    - Primary research focused on autonomous navigation of unmanned vehicles, and GPS-denied localization and planning for micro aerial vehicles.
    - Developed a 'natural motion' framework for the Baxter humanoid robot.
    - Gained hands-on experience in sensor fusion, ROS-based middleware development, embedded systems, and various commercial UAV platforms.

  • Graduate Intern : Millennium Engineering and Integration
    May 2016 - Aug 2016

    - Developed the sensing and communication software architecture for a quadrotor UAV, along with a framework for estimation and navigation in subterranean, GPS-denied environments.
    - Developed a 6-DOF Simulink model for simulating a fixed-wing UAV.
    - Implemented attitude stabilization on open-source autopilot hardware for an experimental unmanned aircraft.
    - Developed image processing software for analyzing multispectral satellite images.

  • Research Aide : Extreme Environment Robotics Laboratory, ASU
    May 2012 - Dec 2013

    - Developed .NET-based ground station software and low-level firmware for an underwater robot (MSLED).
    - Designed sensor fusion pipelines for gas, weather and seismic sensors; developed embedded firmware and data transmission protocols for sensing and reporting volcanic and seismic activity through satellite communication.

  • Graduate Intern : Intel Corporation
    Jun 2013 - Aug 2013

    - Designed embedded firmware for the BioSport biometric headset.
    - Contributed to hardware design, sensor fusion and signal processing algorithms for BioSport.

Education

  • Texas A&M University (previously at Arizona State University), USA
    2014 - 2019 Doctor of Philosophy

    PhD in Mechanical Engineering

  • Arizona State University, USA
    2012 - 2013 Master of Science

    MS in Electrical Engineering

  • Guru Nanak Engineering College, India
    2007 - 2011 Bachelor of Technology

    B.Tech. in Electrical and Electronics Engineering

Uncertainty-aware Planning for Micro Aerial Vehicle Swarms

In this project, which forms the second part of my PhD thesis, I investigated collaborative uncertainty-aware path planning for vision-based micro aerial vehicles. For vehicles that are equipped with cameras and can localize collaboratively (see below), a heuristic-based approach estimates the "quality" of localization achievable from various viewpoints. Evolutionary algorithms were integrated with an RRT-based path planning framework to produce plans that let the vehicles navigate intelligently towards areas that improve their vision-based localization accuracy: moving through well-lit locations, observing texture-rich objects, building denser maps before navigating, and so on.
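
The exact formulation is in the thesis and the papers listed below; as a rough illustration of the core idea only, the sketch below shows how a localization-quality heuristic could bias the edge costs of an RRT-style planner. The `feature_map` lookup and the cost weights are hypothetical stand-ins, not the actual implementation.

```python
import math

# Hypothetical sketch: bias an RRT-style planner's edge costs with a
# localization-quality heuristic, so extensions toward feature-rich,
# well-localizable regions are preferred.

def localization_quality(viewpoint, feature_map):
    """Score a viewpoint by the density of visual features observable
    from it (higher = better expected localization)."""
    x, y = viewpoint
    return feature_map.get((round(x), round(y)), 0.0)

def edge_cost(p, q, feature_map, alpha=1.0, beta=2.0):
    """Travel distance plus a penalty for poorly-localizable endpoints."""
    dist = math.hypot(q[0] - p[0], q[1] - p[1])
    penalty = 1.0 / (1e-3 + localization_quality(q, feature_map))
    return alpha * dist + beta * penalty

# Toy usage: among candidate extensions, pick the lowest-cost one
# rather than simply the nearest.
feature_map = {(3, 3): 5.0, (5, 5): 0.5}
start = (0.0, 0.0)
candidates = [(3.0, 3.0), (5.0, 5.0)]
print(min(candidates, key=lambda q: edge_cost(start, q, feature_map)))
```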


Collaborative Localization for Micro Aerial Vehicle Swarms

As the first part of my PhD thesis, I developed a collaborative localization pipeline for a swarm of multirotor aerial vehicles, with each vehicle using a monocular camera as its primary sensor. Each vehicle continuously captures images, and feature detection and matching are performed between the individual views, allowing reconstruction of the surrounding environment. This sparse reconstruction is then used by the vehicles for individual localization in a decentralized fashion. The vehicles are also capable of computing relative poses between each other and occasionally fusing them with their individual pose estimates for enhanced accuracy. Even when cross-correlations between vehicles are not tracked, covariance intersection allows for robust pose estimation between vehicles.
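
For context, covariance intersection fuses two estimates whose cross-correlation is unknown, via P_fused^-1 = w * P_a^-1 + (1 - w) * P_b^-1. A minimal sketch follows; in practice the weight w would be optimized, e.g. to minimize the trace of the fused covariance.

```python
import numpy as np

def covariance_intersection(x_a, P_a, x_b, P_b, w=0.5):
    """Fuse two estimates with unknown cross-correlation:
    P^-1 = w * P_a^-1 + (1 - w) * P_b^-1."""
    Pa_inv = np.linalg.inv(P_a)
    Pb_inv = np.linalg.inv(P_b)
    P = np.linalg.inv(w * Pa_inv + (1.0 - w) * Pb_inv)
    x = P @ (w * Pa_inv @ x_a + (1.0 - w) * Pb_inv @ x_b)
    return x, P

# Example: fuse a vehicle's own pose estimate with one derived from a
# relative pose measured against a neighboring vehicle.
x_self = np.array([1.0, 2.0]); P_self = np.diag([0.5, 0.5])
x_rel  = np.array([1.2, 1.9]); P_rel  = np.diag([0.2, 0.8])
x_f, P_f = covariance_intersection(x_self, P_self, x_rel, P_rel, w=0.4)
print(x_f, np.diag(P_f))
```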


Real Time Cancer Tumor Tracking for Proton Beam Therapy

In collaboration with Mayo Clinic Arizona, I developed a real-time computer vision-based tracking system for markers implanted in cancer tumors. The target application is controlling a state-of-the-art proton beam targeting system according to tumor motion caused by the patient's breathing cycle and other natural organ motion. It is common practice to embed tiny fiducial markers in the tumors so that they are visible in the X-ray spectrum. Computer vision techniques such as normalized cross-correlation and image saliency maps are used in conjunction with kernelized correlation filters to track these tiny markers during X-ray fluoroscopy. The tracking method handles high levels of noise and various types of markers while achieving accurate, real-time tracking.
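
As an illustration, the normalized cross-correlation step maps directly onto OpenCV's template matching. The sketch below is a simplified stand-in for one stage of the tracker; the full system also uses saliency maps and kernelized correlation filters.

```python
import cv2
import numpy as np

def track_marker(frame, template):
    """Find the best match for a fiducial-marker template in a frame
    via normalized cross-correlation; returns the match center and score."""
    result = cv2.matchTemplate(frame, template, cv2.TM_CCOEFF_NORMED)
    _, max_val, _, max_loc = cv2.minMaxLoc(result)
    h, w = template.shape[:2]
    return (max_loc[0] + w // 2, max_loc[1] + h // 2), max_val

# Toy usage with synthetic data (a real input would be an X-ray frame).
frame = np.random.randint(0, 255, (256, 256), dtype=np.uint8)
template = frame[100:116, 120:136].copy()
print(track_marker(frame, template))
```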


Drone Detection through Depth Images

In collaboration with researchers from Universidad Politecnica de Madrid and MIT's ACL lab, I am working on a framework for detecting and localizing multirotor UAVs using depth images. A specific advantage of depth sensing versus other detection methods is that a depth map is able to provide 3D relative localization of the objects of interest, making it easier to develop strategies such as collision avoidance. In our work, a dataset of synthetic depth maps of drones has been first generated in the Microsoft AirSim UAV simulator and used to train a state-of-the-art deep learning-based drone detection model. The proposed detection technique, while trained only on simulation, has been validated in several real-life depth map sequences. It also generalizes well to multiple types of drones flying at up to 2 m/s, achieving an average precision of 98.7%, an average recall of 74.7% and a record detection range of 9.5 meters.
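
The snippet below sketches how a single synthetic depth frame can be grabbed from AirSim, assuming a recent version of its open-source Python client and a running simulator; the clipping range and preprocessing are illustrative, not the exact dataset-generation pipeline.

```python
import numpy as np
import airsim  # open-source AirSim Python client; requires a running simulator

client = airsim.MultirotorClient()
client.confirmConnection()

# Request a floating-point depth image from the front camera ("0").
responses = client.simGetImages([
    airsim.ImageRequest("0", airsim.ImageType.DepthPerspective,
                        pixels_as_float=True, compress=False)
])
resp = responses[0]
depth = np.array(resp.image_data_float, dtype=np.float32)
depth = depth.reshape(resp.height, resp.width)  # per-pixel depth in meters

# Clip and rescale to 8 bits, a common preprocessing step before
# feeding depth maps to a detection network.
depth_8u = np.uint8(255 * np.clip(depth, 0, 20) / 20.0)
```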


Ars Robotica: Robots in Theater

Ars Robotica was a collaboration between artists, theater performers and roboticists to understand the fluidity and expressiveness of human movement and the possibility of reproducing it on robotic platforms. Using the Rethink Robotics Baxter as a test platform, we worked on defining and achieving human-like movement on the robot. We obtained movement data from expert human performers through various sensors, ranging from a Microsoft Kinect to a high-precision 12-camera OptiTrack system, and used it as training data to construct "primitives", forming a vocabulary for motion. We were later able to express complex movements as temporal combinations of such primitives, helping create a framework for autonomous interpretation and expression of human-like motion through Baxter.
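
As an illustrative sketch of the primitive idea only (not the project's exact pipeline), one simple way to build such a vocabulary is to take the principal components of recorded joint trajectories and express a movement as a weighted combination of them:

```python
import numpy as np

# Illustrative sketch: learn motion "primitives" as principal components
# of recorded joint trajectories, then express a movement as a weighted
# combination of those primitives.
rng = np.random.default_rng(0)
demos = rng.standard_normal((50, 7 * 100))  # 50 demos, 7 joints x 100 steps

mean = demos.mean(axis=0)
_, _, Vt = np.linalg.svd(demos - mean, full_matrices=False)
primitives = Vt[:5]                # 5 dominant primitives = motion "vocabulary"

movement = demos[0] - mean
weights = primitives @ movement    # project a movement onto the vocabulary
reconstruction = mean + weights @ primitives
print("reconstruction error:", np.linalg.norm(reconstruction - demos[0]))
```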


Micro Subglacial Lake Exploration Device

As part of a team at the Extreme Environment Robotics Laboratory, I worked on the development of onboard firmware (Arduino) and ground station software (C#/.NET) for a subglacial lake and aquatic exploration robot called MSLED. MSLED consists of a submersible mothership/explorer combination and uses MEMS sensor and imaging technologies to investigate deep, remote and chemically challenging aquatic environments. Its sensor payload consists of a camera, an inertial measurement unit and a CTD sensor, data from which is transferred to the surface through a fiber optic cable. MSLED was successfully deployed twice in Lake McMurdo, Antarctica, as part of the WISSARD program, which played a crucial role in the discovery of subglacial life under the Antarctic ice.


BENTO Volcano Monitor

I led a team of graduate students on a project involving the hardware and software design of expendable "volcano monitor" capsules, which monitor and transmit data about rapidly evolving volcanic conditions. The monitors are equipped with a number of sensors (seismic, gas, temperature, etc.) and use a minimal data packaging and transmission protocol over Iridium satellite modems, allowing real-time compilation and dissemination of scientific data. Volcano monitors were deployed in Nicaragua, Italy, Iceland and Greenland.
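
To give a flavor of what "minimal data packaging" means on a bandwidth-limited satellite link, here is a hypothetical sketch of a fixed-width binary packet; the field layout is assumed for illustration and is not the actual BENTO protocol.

```python
import struct
import time

# Hypothetical packet layout (assumed, NOT the actual BENTO protocol):
#   uint32 unix_time, int16 seismic, uint16 gas_ppm, int16 temp (0.1 C)
PACKET_FMT = "<IhHh"  # little-endian, 10 bytes per reading

def pack_reading(seismic, gas_ppm, temp_c):
    return struct.pack(PACKET_FMT, int(time.time()),
                       seismic, gas_ppm, int(temp_c * 10))

def unpack_reading(payload):
    t, seismic, gas_ppm, temp = struct.unpack(PACKET_FMT, payload)
    return {"time": t, "seismic": seismic,
            "gas_ppm": gas_ppm, "temp_c": temp / 10.0}

pkt = pack_reading(seismic=-120, gas_ppm=350, temp_c=23.4)
print(len(pkt), unpack_reading(pkt))
```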

Journal Papers

  • Sai Vemprala, Srikanth Saripalli, "Collaborative Localization for Micro Aerial Vehicles", under review.
  • Alberto E. Behar, Daming D. Chen, et al. (including Sai H. Vemprala). "MSLED: The Micro Subglacial Lake Exploration Device", Underwater Technology, 33.1, 2015.

Conference Papers

  • Sai Vemprala, Srikanth Saripalli, Carlos Vargas, Martin Bues, Yanle Hu, Jiajian Shen. "Real-time Tumor Tracking for Pencil Beam Scanning Proton Therapy", 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2018), Madrid, Spain, 2018, pp. 4434-4440.
  • Adrian Carrio, Sai Vemprala, Andres Ripoll, Srikanth Saripalli, Pascual Campoy. "Drone Detection using Depth Maps", 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2018), Madrid, Spain, 2018, pp. 1034-1037.
  • Sai Vemprala, Srikanth Saripalli. "Monocular Vision based Collaborative Localization for Micro Aerial Vehicle Swarms", 18th IEEE International Conference on Unmanned Aerial Systems (ICUAS 2018), Dallas, USA, 2018, pp. 315-323.
  • Sai Vemprala, Srikanth Saripalli. "Uncertainty-aware Planning for Vision Based Multirotor Swarms", Proceedings of the AHS International 74th Annual Forum, Phoenix, USA, 2018, pp. 1774-1783.
  • Sai Vemprala, Srikanth Saripalli. "Vision based Collaborative Path Planning for Micro Aerial Vehicles", 2018 IEEE International Conference on Robotics and Automation (ICRA), Brisbane, Australia, 2018, pp. 1-7.
  • Sai Vemprala, Srikanth Saripalli. "Vision Based Collaborative Localization for Swarms of Aerial Vehicles". Proceedings of the AHS International 73rd Annual Forum, Dallas, USA, 2017, pp. 2980-2985.
  • Sai Vemprala, Srikanth Saripalli. "Vision Based Collaborative Localization of Multirotor Vehicles". 2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Daejeon, South Korea, 2016, pp. 1653-1658.
  • Ravi Babu P., Kumar M.P.V.V.R., Hemachandra V.S., Vanamali M.P.R. "A Novel Power Flow Solution Methodology for Large Radial Distribution Systems", IEEE International Conference on Computational Technologies (SIBIRCON), 2010.
  • Ravi Babu P., Vanamali M.P.R., Kumar M.P.V.V.R., Hemachandra V.S. "Distribution System Network Reconfiguration using L-E Method", Annual IEEE India Conference, 2010.

Posters

  • Sai Vemprala, Ian Shelanskey, Matthew Ragan, Lance Gharavi, Srikanth Saripalli. Ars Robotica: A Movement Framework for Robots in Theater. Workshop on Artistically Skilled Robots. Daejeon, Korea, 2016.
  • Sai Vemprala, Srikanth Saripalli. Autonomous Exploration using UAVs. AAAI Conference on Artificial Intelligence 2016. Phoenix, AZ, February 2016.

Patents

  • System and method for data transmission and power supply capability over an audio jack for mobile devices; Indira Negi, Lakshman Krishnamurthy, Brian K. Vogel, Darren S. Crews, Sai H. Vemprala, Xiaochao Yang, Howard D. Millett, Alexander Essaian, Alanson P. Sample: US Patent 10,165,355 (issued 12/25/2018)
  • System and method for device action and configuration based on user context detection from sensors in peripheral devices; Indira Negi, Lakshman Krishnamurthy, Fuad Al-Amin, Xiaochao Yang, Brian K. Vogel, Jun Li, Alexander Essaian, Sai H. Vemprala, Donnie H. Kim, Lama Nachman, Haibin Liu: US Patent 10,117,005 (issued 10/30/2018)

Other presentations

  • Sai Vemprala. Sampling based Path Planning for Unmanned Aerial Vehicles. IROS 2017 Workshop on Complex Collaborative Systems. Vancouver, Canada, 2017.
  • Sai Vemprala, Srikanth Saripalli. Vision based MAV Swarms in a Photorealistic Simulation Framework. 1st International Symposium on Aerial Robotics. Philadelphia, USA, 2017.
  • Andres Mora, Sai Vemprala, Adrian Carrio, Srikanth Saripalli. Flight performance assessment of land surveying trajectories for multiple UAV platforms. Workshop on Research, Education and Development of Unmanned Aerial Systems, RED-UAS 2015. Cancun, Mexico, 2015.
  • Lance Gharavi, Srikanth Saripalli, Sai Vemprala, Matthew Ragan, Ian Shelanskey. Ars Robotica. Exemplar project at the a2ru National Conference, Virginia, USA, 2015.
  • Sai Vemprala, Srikanth Saripalli. Autonomous exploration and navigation strategies for MAVs. AHS International Specialists' Meeting on Unmanned Rotorcraft Systems, 2015.
  • Alberto Behar, Sai Vemprala. BENTO: Volcanic monitoring. Deep Carbon Observatory Sandpit Workshop on Gas Instrumentation. Sicily, Italy, 2013.

TAMUHack 2019: Best IoT Hack winner

In 2019, I was part of the team that won the 'Best IoT Hack' category at the annual TAMUHack hackathon. Over 24 hours, we designed a system that performs facial recognition (verifying that the subject is a real person and not a picture by detecting eye blinks) along with voice verification for password-less authentication. We made use of Microsoft Azure's Cognitive Services APIs, allowing the system to run on low-power hardware, and deployed it on a Qualcomm DragonBoard SBC.
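
Blink-based liveness checks are commonly built on the eye-aspect-ratio (EAR) heuristic over facial landmarks; the sketch below illustrates that standard idea, not necessarily our exact implementation.

```python
import numpy as np

def eye_aspect_ratio(eye):
    """Eye aspect ratio over six eye landmarks p1..p6; the value drops
    sharply when the eye closes, which signals a blink."""
    eye = np.asarray(eye, dtype=float)
    a = np.linalg.norm(eye[1] - eye[5])  # ||p2 - p6||
    b = np.linalg.norm(eye[2] - eye[4])  # ||p3 - p5||
    c = np.linalg.norm(eye[0] - eye[3])  # ||p1 - p4||
    return (a + b) / (2.0 * c)

# A blink is registered when EAR stays below a threshold for a few frames.
open_eye   = [(0, 0), (1, -1), (2, -1), (3, 0), (2, 1), (1, 1)]
closed_eye = [(0, 0), (1, -0.1), (2, -0.1), (3, 0), (2, 0.1), (1, 0.1)]
print(eye_aspect_ratio(open_eye), eye_aspect_ratio(closed_eye))
```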

TAMU Data Science contest: 1st place

In 2018, I won Texas A&M University's Data Science contest. As part of the contest, I developed predictive models for estimating taxi revenue over time and location using public taxi ride data from the city of Chicago. I implemented ARIMA-based forecasting as well as an LSTM-based recurrent neural network to generate accurate predictions, comparing them with results from packages such as Facebook Prophet.
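
A minimal sketch of the ARIMA baseline using statsmodels follows; synthetic data stands in for the Chicago taxi feed, and the (p, d, q) order is chosen purely for illustration.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

# Fit an ARIMA model on a daily revenue series and forecast a week ahead.
rng = np.random.default_rng(1)
idx = pd.date_range("2017-01-01", periods=365, freq="D")
revenue = pd.Series(1000 + 50 * np.sin(np.arange(365) / 7)
                    + rng.normal(0, 20, 365), index=idx)

fit = ARIMA(revenue, order=(2, 1, 2)).fit()
print(fit.forecast(steps=7))  # 7-day-ahead revenue forecast
```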

DAC 2018 System Design Contest: Top 5 finish

In 2018, I led a team of graduate students in the 2018 DAC System Design Contest. The objective of the contest was to design a machine learning pipeline that can classify and detect objects in a multi-class custom dataset. The primary requirement was that the pipeline run at more than 20 FPS on an NVIDIA Jetson TX2, while maximizing intersection-over-union (IoU) and minimizing power consumption. We constructed an object detection/classification pipeline based on the Tiny YOLO v2 model, implemented on the TX2 along with several Jetson-specific optimizations, achieving a detection speed of 21 FPS and over 80% IoU in validation. Evaluated over all these metrics, our team achieved a top-5 finish out of 61 teams worldwide.
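
For reference, the IoU metric the contest scored is a short, standard computation over predicted and ground-truth boxes:

```python
def iou(box_a, box_b):
    """Intersection-over-union of two axis-aligned boxes (x1, y1, x2, y2),
    the accuracy metric scored alongside speed and power consumption."""
    x1 = max(box_a[0], box_b[0]); y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2]); y2 = min(box_a[3], box_b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / float(area_a + area_b - inter)

print(iou((0, 0, 10, 10), (5, 5, 15, 15)))  # ~0.143
```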

Office Address

ENPH Rm 421, 180 Spence St, College Station, TX 77840

Connect on LinkedIn

GitHub

Email

svemprala (at) tamu [dot] edu

Copyright (C) Sai Vemprala