
Complexity Scaling Laws - Model Evaluation for 2D and Higher-dimensional TSP

dataset
posted on 2025-06-27, 13:55 authored by Lowell Weissman

Datasets required to run model evaluations.

Recent work on neural scaling laws demonstrates that model performance scales predictably with compute budget, model size, and dataset size. In this work, we develop scaling laws based on problem complexity. We analyze two fundamental complexity measures: solution space size and representation space size. Using the Traveling Salesman Problem (TSP) as a case study, we show that combinatorial optimization promotes smooth cost trends, and therefore meaningful scaling laws can be obtained even in the absence of an interpretable loss. We then show that suboptimality grows predictably for fixed-size models when scaling the number of TSP nodes or spatial dimensions, independent of whether the model was trained with reinforcement learning or supervised fine-tuning on a static dataset. We conclude with an analogy to problem complexity scaling in local search, showing that a much simpler gradient descent of the cost landscape produces similar trends.
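The approximately optimal datasets described below were produced with 2-opt local search. As a rough illustration only (not the exact pipeline used to build the datasets), a minimal 2-opt sketch for Euclidean TSP might look like this, with `coords` an array of node coordinates and `tour` a permutation of node indices:

```python
import numpy as np

def tour_length(coords, tour):
    # Sum of edge lengths along the closed tour (wraps back to the start).
    pts = coords[tour]
    return np.linalg.norm(pts - np.roll(pts, -1, axis=0), axis=1).sum()

def two_opt(coords, tour):
    # Repeatedly reverse tour segments while any reversal shortens the tour,
    # i.e., a simple descent on the tour-length cost landscape.
    tour = np.asarray(tour)
    improved = True
    while improved:
        improved = False
        n = len(tour)
        for i in range(n - 1):
            for j in range(i + 2, n):
                candidate = np.concatenate(
                    [tour[:i + 1], tour[i + 1:j + 1][::-1], tour[j + 1:]]
                )
                if tour_length(coords, candidate) < tour_length(coords, tour):
                    tour, improved = candidate, True
    return tour
```

Because 2-opt only accepts improving moves, it converges to a local optimum of the tour-length landscape; the same sketch applies in higher spatial dimensions, since `tour_length` depends only on pairwise Euclidean distances.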

History

Publisher

University Libraries, Virginia Tech

Corresponding Author Name

Lowell Weissman

Files/Folders in Dataset and Description

sol_?n_1280000t_0.npy - Concorde optimal solutions for 2D TSP over node scaling (first of 10 chunks)
?d_2opt_100k_10n_128000t - 2-opt local search approximately optimal datasets over spatial dimension scaling (10 nodes)
?d_2opt_1000k_20n_64000t - 2-opt local search approximately optimal datasets over spatial dimension scaling (20 nodes)
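The solution chunks are standard NumPy `.npy` files and can be inspected with `np.load`. A minimal sketch, assuming each chunk stores one tour per row as a permutation of node indices (the dummy array and filename here are stand-ins, since exact shapes depend on the node count and chunk size):

```python
import numpy as np

# Stand-in for one solution chunk: 4 tours over 10 nodes,
# each row a permutation giving the visit order.
dummy = np.stack([np.random.permutation(10) for _ in range(4)])
np.save("sol_chunk_demo.npy", dummy)  # hypothetical filename

tours = np.load("sol_chunk_demo.npy")
print(tours.shape)  # (4, 10)
```

For the real files, substitute the chunk filename from the list above; the array's first axis indexes tours and the second indexes visit order.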

Categories

    Electrical and Computer Engineering
