dc.contributor.advisor          Hougen, Dean
dc.contributor.author           Aguilar Escamilla, Jose
dc.date.accessioned             2023-05-08T16:32:22Z
dc.date.available               2023-05-08T16:32:22Z
dc.date.issued                  2023-05-12
dc.identifier.uri               https://hdl.handle.net/11244/337592
dc.description.abstract         Combinatorial optimization problems have long been a computationally challenging family of problems with high importance within science. Although algorithms to solve such problems exist, these classical/exact algorithms tend to become computationally intractable as the problem size increases. Due to the relevance of such problems in real life, research has explored other algorithms that forego the optimality guarantee of classical algorithms. In doing so, heuristic algorithms have achieved fast inference times with performance that is not too far from the optimal performance of classical algorithms. A particular heuristic algorithm from artificial intelligence, the attention model (AM), has achieved state-of-the-art performance in prevalent combinatorial optimization problems such as the traveling salesman, vehicle routing, and orienteering problems, where the gap between classical and heuristic algorithms was diminished. This success is attributed to the ability of the AM to attend to the structure of the input through the use of graph attention (GAT) mechanisms, a type of graph neural network. While the mechanism by which these models extract structural information is known, it is unknown what kind of information is extracted. This thesis presents a novel variant of the attention model, the persistence attention model (AM-P), which explicitly uses a type of structural information, persistent homology. The model is tested on the traveling salesman problem with different problem node counts to compare the performance of the attention model with and without persistent homology information. It is hypothesized that persistent homology information will help the attention model better exploit the structure of the problem, achieving better performance on combinatorial problems. This hypothesis is shown to be false, as there are no statistically significant differences between the performance of the AM and the AM-P. This negative result raises the possibility that the attention model may already extract structural information similar to persistent homology through its GAT mechanisms.
dc.language                     en_US
dc.rights                       Attribution 4.0 International
dc.rights.uri                   https://creativecommons.org/licenses/by/4.0/
dc.subject                      Artificial Intelligence
dc.subject                      Graph Attention
dc.subject                      Topological Data Analysis
dc.subject                      Reinforcement Learning
dc.title                        Graph Attention and Persistence for Traveling Salesman Problem
dc.contributor.committeeMember  Lan, Chao
dc.contributor.committeeMember  Diochnos, Dimitris
dc.date.manuscript              2023-05-05
dc.thesis.degree                Master of Science
ou.group                        Gallogly College of Engineering::School of Computer Science
shareok.orcid                   0000-0003-4074-3178
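
The abstract above describes feeding persistent homology information into an attention model for the traveling salesman problem. As a rough illustration of what such structural features might look like, the Python sketch below computes Vietoris-Rips persistence for a random TSP instance and attaches simple per-dimension summaries to the node features an attention encoder would consume. This is not the thesis implementation: the use of the ripser package, the persistence_summary helper, and the choice of summary statistics are assumptions made purely for illustration.

    # Illustrative sketch only; not the AM-P code from the thesis.
    # Assumes the `ripser` package (scikit-tda) for Vietoris-Rips persistence.
    import numpy as np
    from ripser import ripser

    def persistence_summary(coords, maxdim=1):
        """Per-dimension persistence statistics (total, max, mean lifetime)."""
        dgms = ripser(coords, maxdim=maxdim)["dgms"]
        feats = []
        for dgm in dgms:
            lifetimes = dgm[:, 1] - dgm[:, 0]
            lifetimes = lifetimes[np.isfinite(lifetimes)]  # drop the infinite H0 bar
            if lifetimes.size == 0:
                feats.extend([0.0, 0.0, 0.0])
            else:
                feats.extend([lifetimes.sum(), lifetimes.max(), lifetimes.mean()])
        return np.array(feats, dtype=np.float32)

    # A random 20-node TSP instance in the unit square, as is standard in the AM literature.
    rng = np.random.default_rng(0)
    coords = rng.random((20, 2)).astype(np.float32)

    global_ph = persistence_summary(coords)                  # shape (6,): H0 and H1 summaries
    node_features = np.hstack([coords,                       # (20, 2) node coordinates
                               np.tile(global_ph, (20, 1))]) # broadcast PH features to every node
    print(node_features.shape)                               # (20, 8) -> input to the attention encoder

In an AM-style encoder these extra columns would typically be projected together with the coordinates into the initial node embeddings before the graph attention layers; how AM-P actually integrates persistence may differ.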

