
Beyond A*: Better Planning with Transformers

source link: https://arxiv.org/abs/2402.14083

Computer Science > Artificial Intelligence

[Submitted on 21 Feb 2024]

Beyond A*: Better Planning with Transformers via Search Dynamics Bootstrapping


While Transformers have enabled tremendous progress in various application settings, such architectures still lag behind traditional symbolic planners for solving complex decision-making tasks. In this work, we demonstrate how to train Transformers to solve complex planning tasks and present Searchformer, a Transformer model that optimally solves previously unseen Sokoban puzzles 93.7% of the time, while using up to 26.8% fewer search steps than standard A* search. Searchformer is an encoder-decoder Transformer model trained to predict the search dynamics of A*. This model is then fine-tuned via expert iterations to perform fewer search steps than A* search while still generating an optimal plan. In our training method, A*'s search dynamics are expressed as a token sequence outlining when task states are added to and removed from the search tree during symbolic planning. In our ablation studies on maze navigation, we find that Searchformer significantly outperforms baselines that predict the optimal plan directly, with a 5-10× smaller model size and a 10× smaller training dataset. We also demonstrate how Searchformer scales to larger and more complex decision-making tasks like Sokoban, with an improved percentage of solved tasks and shortened search dynamics.
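The core idea in the abstract is to serialize A*'s search dynamics, i.e. the order in which states enter and leave the frontier, into a token sequence that a Transformer can be trained on. The sketch below illustrates one plausible way to log such a trace during A* on a grid maze; the grid setup, the "create"/"close" token names, and the function `astar_with_trace` are illustrative assumptions, not the paper's exact vocabulary or implementation.

```python
# Minimal sketch: run A* on a 4-connected grid (0 = free, 1 = wall) and
# record a flat token trace of when nodes are added to ("create") or
# expanded from ("close") the frontier. Token names are hypothetical.
import heapq

def astar_with_trace(grid, start, goal):
    def h(p):  # Manhattan-distance heuristic
        return abs(p[0] - goal[0]) + abs(p[1] - goal[1])

    rows, cols = len(grid), len(grid[0])
    frontier = [(h(start), 0, start, None)]   # (f, g, node, parent)
    parents, g_cost = {}, {start: 0}
    trace = ["create", *map(str, start), str(0), str(h(start))]

    while frontier:
        f, g, node, parent = heapq.heappop(frontier)
        if node in parents:            # already closed via a cheaper path
            continue
        parents[node] = parent
        trace += ["close", *map(str, node), str(g), str(h(node))]
        if node == goal:
            break
        x, y = node
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            nxt = (nx, ny)
            if 0 <= nx < rows and 0 <= ny < cols and grid[nx][ny] == 0:
                ng = g + 1
                if ng < g_cost.get(nxt, float("inf")):
                    g_cost[nxt] = ng
                    trace += ["create", *map(str, nxt), str(ng), str(h(nxt))]
                    heapq.heappush(frontier, (ng + h(nxt), ng, nxt, node))

    # Reconstruct the optimal plan by walking parent pointers back from the goal.
    plan, node = [], goal
    while node is not None:
        plan.append(node)
        node = parents.get(node)
    return plan[::-1], trace

if __name__ == "__main__":
    maze = [[0, 0, 0],
            [1, 1, 0],
            [0, 0, 0]]
    plan, trace = astar_with_trace(maze, (0, 0), (2, 0))
    print("plan:", plan)
    print("trace tokens:", trace)
```

In a training setup along the lines the abstract describes, sequences like `trace` would serve as supervision for an encoder-decoder model, which could then be fine-tuned (e.g. via expert iteration) toward traces that are shorter than the original A* run while still ending in an optimal plan.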
Subjects: Artificial Intelligence (cs.AI)
Cite as: arXiv:2402.14083 [cs.AI]
  (or arXiv:2402.14083v1 [cs.AI] for this version)

Submission history

From: Lucas Lehnert [view email]
[v1] Wed, 21 Feb 2024 19:17:28 UTC (758 KB)
