Graph Attention and Persistence for Traveling Salesman Problem
Abstract
Combinatorial optimization problems have long been a computationally challenging family of problems of high importance across the sciences. Although exact algorithms to solve such problems exist, they tend to become computationally intractable as the problem size grows. Because such problems are highly relevant in practice, research has explored algorithms that forego the optimality guarantee of exact methods. Heuristic algorithms have thereby achieved fast inference times with performance close to the optimum found by exact algorithms. One heuristic from artificial intelligence, the attention model (AM), has achieved state-of-the-art performance on prevalent combinatorial optimization problems such as the traveling salesman, vehicle routing, and orienteering problems, narrowing the gap between exact and heuristic algorithms. This success is attributed to the AM's ability to attend to the structure of the input through graph attention (GAT) mechanisms, a type of graph neural network. While the mechanism by which these models extract structural information is understood, it is unknown what kind of information is extracted. This thesis presents a novel variant of the attention model, the persistence attention model (AM-P), which explicitly uses a type of structural information: persistent homology. The model is tested on the traveling salesman problem at several node counts to compare the performance of the attention model with and without persistent homology information. It is hypothesized that persistent homology information will help the attention model better exploit the structure of the problem, achieving better performance on combinatorial problems. This hypothesis is shown to be false: there is no statistically significant difference between the performance of the AM and the AM-P.
This negative result raises the possibility that the attention model may already extract structural information similar to persistent homology through its GAT mechanisms.
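To make the persistent-homology signal mentioned above concrete, here is a minimal, self-contained sketch (not taken from the thesis; the function name and point data are illustrative) of 0-dimensional persistent homology for a TSP instance's node coordinates. For a Vietoris-Rips filtration, the death times of H0 features equal the edge lengths of the minimum spanning tree, which Kruskal's algorithm with union-find computes directly.

```python
import math
from itertools import combinations

def h0_persistence(points):
    """Death times of H0 features (all born at scale 0) for a 2D point cloud.

    Equivalent to the sorted MST edge lengths: each time two connected
    components of the Rips filtration merge, one H0 class dies.
    """
    parent = list(range(len(points)))

    def find(i):
        # Union-find root lookup with path halving.
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    # All pairwise edges, processed in order of increasing length (Kruskal).
    edges = sorted(
        (math.dist(points[i], points[j]), i, j)
        for i, j in combinations(range(len(points)), 2)
    )
    deaths = []
    for d, i, j in edges:
        ri, rj = find(i), find(j)
        if ri != rj:
            parent[ri] = rj
            deaths.append(d)  # a component dies when two clusters merge at scale d
    return deaths  # n - 1 deaths; one component persists to infinity

# Illustrative TSP node coordinates: a tight cluster plus one distant city.
pts = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (5.0, 5.0)]
print(h0_persistence(pts))  # one long-lived feature reflects the outlier
```

The long-lived H0 feature (death at sqrt(41) here) encodes the cluster structure of the instance, which is exactly the kind of global geometric information the AM-P feeds to the attention model alongside node coordinates.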