Aamir Ahmad
Tenure-track Professor, University of Stuttgart; Research Group Leader, Max Planck Institute for Intelligent Systems
Max-Planck-Ring 4
72076 Tübingen
Germany
For the most up-to-date information, please visit my homepage at the University of Stuttgart or my personal homepage https://www.aamirahmad.de/.
In September 2020, I started the Flight Robotics Group (FRG), embedded within the Institute of Flight Mechanics and Control in the Department of Aerospace Engineering and Geodesy at the University of Stuttgart. I also continue to lead the Robot Perception Group (RPG) at the Perceiving Systems department of the Max Planck Institute for Intelligent Systems.
FRG falls under the umbrella of both IMPRS-IS and Cyber Valley. Through the International Max Planck Research School for Intelligent Systems (IMPRS-IS), we hire highly talented PhD students into our group. We currently have one open postdoc position (see the full advertisement here). To find out more about our group's past, present and future work, watch the following video.
Please find the latest version of my CV here.
For all videos of my research, please visit my YouTube channel here.
News
- [Info Oct '20] -- Prospective PhD students interested in joining my group should apply through IMPRS, select me as their advisor on the portal, and send me an email stating this. Please use the subject line 'FRG PhD position IMPRS-IS'. If you have recently sent me a general email requesting a PhD position, please follow the aforementioned procedure and email me again.
- [Sep '20] I have been appointed as a tenure-track professor of 'Flight Robotics (Flugrobotik)' at the University of Stuttgart. I will soon announce new open positions for postdocs and PhD students. Bachelor and Master thesis students: please contact me directly via my university email. (News links here, here and here.)
- [New!!! -- AirCapRL accepted in IEEE RA-L + IROS 2020] Details here: AirCapRL
- [New!!! -- Dataset and Code of our IEEE RA-L + IROS 2020 article] AirCapRL: Autonomous Aerial Human Motion Capture using Deep Reinforcement Learning -- Dataset and code here.
- [Dataset and Code of our ICCV 2019 paper] Markerless Outdoor Human Motion Capture Using Multiple Autonomous Micro Aerial Vehicles -- Dataset and code here.
- [IEEE/RSJ IROS 2019] Our talk in IROS Workshop on Aerial Swarms is now online!
- [IEEE/RSJ IROS 2019] Our submission to IROS Workshop on Aerial Swarms is accepted! To know all about the AirCap project please attend our talk on Nov 4.
- [Call for a PhD Position (Closed) - Deadline: 1st September] We have a new PhD student position open on the topic 'Reinforcement Learning for Aerial Robots'. See the call here, or contact me directly.
- [Paper accepted in ICCV 2019] Our latest work titled 'Markerless Outdoor Human Motion Capture Using Multiple Autonomous Micro Aerial Vehicles' has been accepted to ICCV (July 2019)
- [Paper accepted in IEEE RA-L 2019] Our latest work titled 'Active Perception based Formation Control for Multiple Aerial Vehicles' has been accepted as a journal paper in IEEE Robotics and Automation Letters (July, 2019)
- [Paper accepted in IEEE CASE 2019] Rahul's work titled 'Motion Planning for Multi-Mobile-Manipulator Payload Transport Systems' has been accepted as a conference paper in IEEE 15th International Conference on Automation Science and Engineering (CASE) (May, 2019)
- [EU H2020 Grant proposal DeepField accepted for funding] Our grant proposal, from a 5-member consortium of European universities, has been accepted for funding. The project begins in October 2019. Stay tuned for more info.
- [Code Release] Nodes and packages specific to our accepted RA-L paper "Active Target Perception based Formation Control for Multiple Micro Aerial Vehicles" have been added to our GitHub project page: AirCap GitHub Page
- [Talk] NVIDIA's "I AM AI" photo series featured my short interview with them at GTC 2018.
- [Paper accepted in IEEE/RSJ IROS 2018] Our RA-L + IROS 2018 submission from the AirCap project was also accepted as an IROS paper. Please see the publications tab for the RA-L version of the paper.
- [Paper accepted in IEEE SSRR 2018] Our work on decentralized MPC with integrated obstacle avoidance for multiple UAVs was accepted at SSRR 2018.
- [Paper accepted in IEEE RA-L 2018] Our latest work and results from the AirCap project were accepted as a journal paper in IEEE Robotics and Automation Letters (RA-L). Please check my publications tab.
- I gave a talk on outdoor motion capture using UAVs at the 2018 GPU Technology Conference (GTC), March 2018, San José, USA.
- I was a featured speaker in the autonomous machines session at the 2018 GPU Technology Conference (GTC).
Research Overview
Please see my group page here -- Robot Perception Group for my research overview.
New Robot Platforms -- To have extensive access to the hardware, we design and build most of our robotic platforms ourselves. Currently, our main flying platforms are octocopters. More details can be found here: https://ps.is.tue.mpg.de/pages/outdoor-aerial-motion-capture-system
There are currently two ongoing AirCap projects in our group: 3D Motion Capture and Perception-Based Control
PhD Students
- Eric Price (Co-supervised, Main Supervisor: Michael Black; Sep '16 -- present)
- Nitin Saini (Co-supervised, Main Supervisor: Michael Black; Apr '18 -- present)
- Elia Bonetto (Mar '20 -- present, Co-supervisor: Michael Black)
- Yu-Tang Liu (Aug'20 -- present, Co-supervisor: Michael Black)
Master/Bachelor Students
- Halil Acet (Jul '19 -- present, Research Assistant)
- Michael Pabst (Nov '19 -- present, Master Intern)
Previous PhD Students
- Rahul Tallamraju (Co-supervised, Main Supervisor: Professor Kamalakar Karlapalem; Mar '18 -- Aug '20)
Previous Master and Bachelor Students
- Yilin Ji (Dec '19 -- Jul '20)
- Guilherme Lawless (Sep '16 -- Sep '17, PhD Intern) Currently at The Nano Foundation.
- Roman Ludwig (Sep '17 -- Jul '19, Research Assistant) Currently a PhD student at ETH Zürich.
- Igor Martinović (with thesis, Sep '17 -- Sep '19, Research Assistant and Master Student) Currently at Vector Informatik GmbH.
- Nowfal Manakkaparambil Ali (Jul '19 -- Sep '19, Research Assistant) Currently at Fraunhofer Institute.
- Ivy Nuo Chen (Aug '19 -- Oct '19, Bachelor Intern) Currently at Temple University, USA.
- Eugen Ruff (Research Engineer, Currently at Bosch)
- Soumyadeep Mukherjee (Bachelor Intern, Currently at udaan.com)
- Raman Butta (Bachelor Intern, Currently at Indian Oil)
- David Sanz (PhD Intern)
Invited Talks at International Conferences
- Deep Neural Network-based Cooperative Visual Tracking through Multiple Flying Robots, 2018 GPU Technology Conference (GTC), March 2018, San José, USA.
- Human-Multirobot Interaction in Cooperative Perception-based Search and Rescue Missions, 2017 IEEE ICRA Workshop on Human Multi-Robot Systems Interaction, June 2, 2017, Singapore.
Invited Lectures at Summer Schools
- Cooperative Perception, "Lucia" PhD School on AI and Robotics, Sep 4-8, 2017, Lisbon, Portugal.
- Cooperative Robot Localization and Target Tracking based on Least Squares Minimization, LARSyS Summer School on Multi-agent Systems, July 8-11, 2014, Lisbon, Portugal.
Previous Projects
- TRaVERSE: TowaRds Very large scalE human-Robot SynErgy (funded by EU FP7 Marie Curie IEF) -- Hazardous work environments for humans, a growing need to increase worldwide agricultural production, and rapidly rising public healthcare expenditures due to an aging population are among the societal issues where robotics and automation are becoming progressively vital. Distributed multi-robot teams consisting of a large number of robots operating in close cooperation with humans are fundamental for such applications. How to achieve maximum synergy between teams of robots and humans, without jeopardizing human safety and comfort, within the constraints of resources (e.g., the computational capacity of the robots, sensor and actuator costs) is still an open question. This research project focuses on investigating and developing integrated methods for robot-team functionalities with human interaction that are scalable to a very large number of robots, thus enabling their successful real-world deployment. Using state-of-the-art, ecologically valid and immersive virtual environments and virtual reality equipment, the project will study and model human behavior, perception and cognitive responses when humans interact and/or cooperate with robots in a large-scale multi-robot scenario. A clear distinction will be made between human users and operators: the former are expected to benefit from directly using a robot that functions as part of a large robotic team, whereas human operators are in charge of cooperating with, or controlling, a team of robots to accomplish a collaborative task. Eventually, using the human behavior models, cooperative multi-robot functionalities, including localization, mapping and motion planning, will be optimally designed to be extremely scalable within the constraints of ease, safety, effectiveness and naturalness of human user/operator interaction with the robots.
- RoCKIn: Robot Competitions Kick Innovation in Cognitive Systems and Robotics (FP7-EU-601012) [Quoted from the project's website] “RoCKIn is an EU project that will be run over the next three years (2013-2016), consisting of robot competitions, symposiums, educational RoCKIn camps and technology transfer workshops. The mission is to act as a catalyst for smarter, more dependable robots. It is done by building upon the principles of challenge-driven innovation laid down by RoboCup, facilitating cognitive and networked robot systems' testing, and streamlining research and development through standardised testbeds and benchmarks. For this there are two challenges which will run concurrently in 2014 and 2015 with an introductory event to be held in June 2013. These challenges were selected due to their high relevance and impact on Europe's societal and industrial needs.” http://rockinrobotchallenge.eu/
- PCMMC: Perception-Driven Coordinated Multi-Robot Motion Control (FCT PTDC/EEA-CRO/100692/2008) [Quoted from the project's wiki] Several robotic tasks require or benefit from the cooperation of multiple robots: transportation of large-size objects, large area coverage (e.g., for cleaning) or surveillance (e.g., for fire detection), pollutant plume tracking, or target detection and tracking, to name but a few. In this project, a novel active approach to cooperative perception through coordinated vehicle motion control is proposed. The vehicle formation geometry will change dynamically so as to optimize the accuracy of cooperative perception of a static or dynamic target by the formation vehicles. To achieve this, innovative decentralized low-communication formation full state estimation methods, and dynamic-goal-driven formation control, for cooperative target localization and tracking by decentralized fusion of the data measured by all the formation vehicles have been introduced. http://mediawiki.isr.ist.utl.pt/wiki/PCMMC:_Perception-Driven_Coordinated_Multi-Robot_Motion_Control
- SocRob [Quoted from the project's website] SocRob is a project on Cooperative Robotics and Multi-Agent Systems carried out by the Intelligent Systems Laboratory at ISR/IST. The acronym of the project stands both for “Society of Robots” and “Soccer Robots”, the case study where we are testing our population of four robots. http://socrob.isr.ist.utl.pt/
- URUS: Ubiquitous Networking Robotics in Urban Settings [Quoted from the project's website] In this project the idea of incorporating a network of robots (robots, intelligent sensors, devices and communications) in order to improve life quality in urban areas was analyzed and tested. The URUS project was focused on designing a network of robots that interact in a cooperative way with human beings and the environment for tasks of assistance, transportation of goods, and surveillance in urban areas. Specifically, the objective was to design and develop a cognitive network robot architecture that integrates cooperating urban robots, intelligent sensors, intelligent devices and communications.
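The perception-driven formation control in PCMMC above rests on decentralized fusion of target measurements taken by all formation vehicles. As an illustrative sketch of the least-squares core of that idea (the function name, the 2D measurement model with isotropic variance, and all numbers are my assumptions for this example, not the project's actual code), fusing independent position estimates reduces to an inverse-variance weighted mean:

```python
def fuse_estimates(measurements):
    """Least-squares fusion of independent target-position measurements.

    Each measurement is ((x, y), var): a 2D target-position estimate
    from one robot, with an isotropic variance reflecting that robot's
    sensing accuracy. Minimizing the sum of variance-weighted squared
    residuals yields the inverse-variance weighted mean, whose fused
    variance is the reciprocal of the summed weights.
    """
    wx = wy = wsum = 0.0
    for (x, y), var in measurements:
        w = 1.0 / var            # more accurate robots get higher weight
        wx += w * x
        wy += w * y
        wsum += w
    return (wx / wsum, wy / wsum), 1.0 / wsum


# Hypothetical example: two robots observe the same target with equal
# accuracy; the fused estimate lies midway and its variance is halved.
fused, fused_var = fuse_estimates([((1.0, 0.0), 1.0), ((3.0, 0.0), 1.0)])
```

Note that the fused variance is always smaller than any individual one, which is exactly why the formation benefits from every vehicle contributing a measurement, however noisy.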
Much of the code and data I contribute to is open source and can be accessed through my GitHub page, linked on the left side of this page. Some of my favorites are described below.
Datasets:
- Omni-dataset: Time-stamped sensor data acquired from four omni-directional soccer robots of the SocRob team can be downloaded from here. The dataset also contains ground-truth information acquired via an external vision system.
- DSR dataset: Domestic service robot dataset. It consists of time-stamped sensor data acquired by one service robot moving around in a real home-like environment while a person randomly walks around in the same environment. The dataset can be downloaded from here. Ground truth information is also provided.
Software:
- ROS package 'read_omni_dataset' to systematically read sensor messages from the omni-dataset: Download the source here.
- ROS package 'evaluate_omni_dataset' for ground-truth (GT) evaluation and benchmarking with the omni-dataset; it is expected to be executed alongside 'read_omni_dataset'. Download the source here.
- ROS package 'read_dsr_dataset' to systematically read sensor messages from the DSR dataset: Download the source here.
- ROS package 'evaluate_dsr_dataset' for ground-truth (GT) evaluation and benchmarking with the DSR dataset; it is expected to be executed alongside 'read_dsr_dataset'. Download the source here.
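The GT-evaluation packages above follow a common pattern: associate each state estimate with the nearest-in-time ground-truth sample, then compute an aggregate error metric. As a minimal, ROS-free sketch of that pattern (the function names, the 2D state, and the 50 ms association window are assumptions for illustration, not the packages' actual API):

```python
import math
from bisect import bisect_left


def associate(est, gt, max_dt=0.05):
    """Pair each estimate (t, x, y) with the nearest-in-time ground-truth
    sample (t, x, y), discarding pairs further apart than max_dt seconds.
    Both lists are assumed sorted by timestamp."""
    gt_times = [g[0] for g in gt]
    pairs = []
    for t, x, y in est:
        i = bisect_left(gt_times, t)
        # Candidates: the GT samples immediately before and after t.
        cands = [j for j in (i - 1, i) if 0 <= j < len(gt)]
        if not cands:
            continue
        j = min(cands, key=lambda k: abs(gt_times[k] - t))
        if abs(gt_times[j] - t) <= max_dt:
            pairs.append(((x, y), gt[j][1:]))
    return pairs


def rmse(pairs):
    """Root-mean-square Euclidean position error over associated pairs."""
    if not pairs:
        return float('nan')
    sq_errs = [(ex - gx) ** 2 + (ey - gy) ** 2
               for (ex, ey), (gx, gy) in pairs]
    return math.sqrt(sum(sq_errs) / len(sq_errs))
```

The nearest-neighbor association step matters because the robot's estimator and the external vision system timestamp their outputs on independent clocks and at different rates, so indices never line up directly.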
AirCap Story
The story of our project AirCap: Aerial Outdoor Motion Capture. The future of motion capture is here!