Commercial drones, also known as unmanned aerial vehicles (UAVs), are rapidly becoming more prevalent and are used in many different applications, such as surveillance at sporting events, transportation of emergency equipment and goods, filming, and aerial photography. According to the Federal Aviation Administration (FAA), the number of drones in the United States is forecast to more than double, growing from an estimated 1.1 million units in 2017 to 2.4 million units by 2022 [1]. Because most drones can carry payloads, many drone manufacturers have equipped them with different types of sensors, the most basic of which is the camera. These developments have opened a new field of study called Human-Drone Interaction (HDI), and user interface researchers have begun studying the possible ways to interact with drones, ranging from traditional devices such as the radio controller (RC) to controlling drones with human body postures. This thesis presents research detailing the use of hand gestures as an HDI method for controlling drones. The work consists of three main modules: a Hand Detector, a Gesture Recognizer, and a Drone Controller. A deep learning method is incorporated in the first module to detect and track hands in real time, with high accuracy, from a single Red-Green-Blue (RGB) image. Image processing algorithms and techniques are introduced as a dynamic way to identify hand gestures and motions. Finally, the Drone Controller module is responsible for communicating with the drone: it sends and receives the messages exchanged between the proposed system and the connected drone.
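The abstract describes the system's architecture rather than its implementation. The sketch below is only a minimal illustration, in Python, of how the three modules could pass data to one another in a frame loop; the class names, method signatures, gesture rule, and gesture-to-command mapping are assumptions made for illustration and do not reproduce the thesis's actual code or drone interface.

```python
# Structural sketch of the three-module pipeline (Hand Detector ->
# Gesture Recognizer -> Drone Controller). All names and the toy
# gesture rule are illustrative assumptions, not the thesis's code.
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class HandBox:
    """Bounding box of a detected hand in image coordinates."""
    x: int
    y: int
    w: int
    h: int


class HandDetector:
    """Module 1: locates the hand in a single RGB frame.

    The thesis uses a deep learning detector; here it is stubbed so the
    pipeline stays runnable without a trained model.
    """

    def detect(self, frame_index: int) -> Optional[HandBox]:
        # Placeholder: a real detector would run a network on the frame.
        return HandBox(x=120 + 15 * frame_index, y=80, w=60, h=60)


class GestureRecognizer:
    """Module 2: turns a sequence of hand positions into a gesture label."""

    def __init__(self) -> None:
        self.history: List[HandBox] = []

    def update(self, box: HandBox) -> Optional[str]:
        self.history.append(box)
        if len(self.history) < 2:
            return None
        # Toy rule: horizontal hand motion maps to left/right swipes.
        dx = self.history[-1].x - self.history[-2].x
        if dx > 10:
            return "swipe_right"
        if dx < -10:
            return "swipe_left"
        return None


class DroneController:
    """Module 3: exchanges messages with the connected drone."""

    COMMANDS = {"swipe_left": "move_left", "swipe_right": "move_right"}

    def send(self, gesture: str) -> None:
        command = self.COMMANDS.get(gesture)
        if command:
            # A real controller would transmit this over the drone's
            # communication link; here we only log the outgoing command.
            print(f"sending command: {command}")


if __name__ == "__main__":
    detector, recognizer, controller = HandDetector(), GestureRecognizer(), DroneController()
    for frame_index in range(3):  # stand-in for a live camera stream
        box = detector.detect(frame_index)
        if box is None:
            continue
        gesture = recognizer.update(box)
        if gesture:
            controller.send(gesture)
```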