Please use this identifier to cite or link to this item: https://www.um.edu.mt/library/oar/handle/123456789/107928
Full metadata record
DC Field | Value | Language
dc.date.accessioned | 2023-03-29T08:17:50Z | -
dc.date.available | 2023-03-29T08:17:50Z | -
dc.date.issued | 2022 | -
dc.identifier.citation | Farrugia, S. (2022). Autonomous robot path planning and obstacle avoidance in a dynamic environment (Bachelor's dissertation). | en_GB
dc.identifier.uri | https://www.um.edu.mt/library/oar/handle/123456789/107928 | -
dc.description | B.Sc. IT (Hons)(Melit.) | en_GB
dc.description.abstract | An autonomous mobile robot can traverse an environment without human intervention. It uses path planning and obstacle avoidance techniques to find an optimal path from its start to its goal location without colliding with obstacles along the way. Classical techniques include Artificial Potential Field, Cellular Decomposition and Vector Field Histograms, while AI techniques include A*, Fuzzy Logic, Rapidly-exploring Random Trees, Ant Colony Optimisation, Particle Swarm Optimisation and Neural Networks. In this project, we investigated path planning algorithms that can be used (a) in the absence of obstacles, (b) in an environment with static obstacles and (c) in an environment with dynamic obstacles, meaning that some or all obstacles might be moving. We used the Elegoo Smart Robot Car Kit V4.0, an Arduino Uno as an Input/Output Subsystem, and a Raspberry Pi 3 B+ for the processing of the path planning and obstacle avoidance techniques. We also used an ultrasonic sensor to detect obstacles around the robot and a gyroscope to measure the robot’s orientation. We controlled the robot’s motion using differential driving techniques and Proportional Integral Derivative control. Firstly, we investigated a search-based algorithm, Hybrid A*. Hybrid A* first uses the same technique as the A* algorithm to find a path using its knowledge of the environment, and then adapts the path to the robot’s movement constraints. Secondly, we investigated a sampling-based algorithm, Rapidly-exploring Random Tree* (RRT*). This algorithm grows a tree by randomly generating nodes in the free space of the environment until it reaches the goal. We tested both algorithms in three different environments: (a) without obstacles, (b) with static obstacles and (c) with dynamic obstacles. In all tests, both algorithms successfully guided the robot around the static and dynamic obstacles without any collision. However, we observed that Hybrid A* always found the shorter path in all tests, took less computation time to find a path, and took less overall time to guide the robot from start to finish. | en_GB
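The search-based planner described in the abstract builds on standard A* search over a known map. As an illustration only (not the dissertation's implementation, which extends A* to the Hybrid A* variant with motion constraints), a minimal grid-based A* with a Manhattan-distance heuristic might look like this in Python:

```python
import heapq

def a_star(grid, start, goal):
    """Minimal A* on a 4-connected occupancy grid.
    grid: list of rows, 0 = free cell, 1 = obstacle.
    Returns a list of (row, col) cells from start to goal, or None."""
    rows, cols = len(grid), len(grid[0])

    def h(cell):
        # Manhattan distance: admissible heuristic on a 4-connected grid
        return abs(cell[0] - goal[0]) + abs(cell[1] - goal[1])

    open_heap = [(h(start), 0, start)]  # entries are (f = g + h, g, cell)
    came_from = {}
    best_g = {start: 0}
    while open_heap:
        f, g, cell = heapq.heappop(open_heap)
        if cell == goal:
            # Reconstruct the path by walking parent links back to start
            path = [cell]
            while cell in came_from:
                cell = came_from[cell]
                path.append(cell)
            return path[::-1]
        if g > best_g.get(cell, float("inf")):
            continue  # stale heap entry superseded by a cheaper one
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g + 1
                if ng < best_g.get((nr, nc), float("inf")):
                    best_g[(nr, nc)] = ng
                    came_from[(nr, nc)] = cell
                    heapq.heappush(open_heap, (ng + h((nr, nc)), ng, (nr, nc)))
    return None  # goal unreachable

# Example: route around a wall blocking the direct path
grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
path = a_star(grid, (0, 0), (2, 0))
```

Hybrid A* differs from this sketch by searching over continuous robot poses (position and heading) so the resulting path respects the vehicle's turning constraints.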
dc.language.iso | en | en_GB
dc.rights | info:eu-repo/semantics/restrictedAccess | en_GB
dc.subject | Mobile robots | en_GB
dc.subject | Algorithms | en_GB
dc.title | Autonomous robot path planning and obstacle avoidance in a dynamic environment | en_GB
dc.type | bachelorThesis | en_GB
dc.rights.holder | The copyright of this work belongs to the author(s)/publisher. The rights of this work are as defined by the appropriate Copyright Legislation or as modified by any successive legislation. Users may access this work and can make use of the information contained in accordance with the Copyright Legislation provided that the author must be properly acknowledged. Further distribution or reproduction in any format is prohibited without the prior permission of the copyright holder. | en_GB
dc.publisher.institution | University of Malta | en_GB
dc.publisher.department | Faculty of Information and Communication Technology. Department of Communications and Computer Engineering | en_GB
dc.description.reviewed | N/A | en_GB
dc.contributor.creator | Farrugia, Sean (2022) | -
Appears in Collections:Dissertations - FacICT - 2022
Dissertations - FacICTAI - 2022

Files in This Item:
File | Description | Size | Format
2208ICTICT390900013648_1.PDF | Restricted Access | 3.15 MB | Adobe PDF


Items in OAR@UM are protected by copyright, with all rights reserved, unless otherwise indicated.