"Quick Minimization of Tardy Processing Time on a Single Machine" by Baruch Schieber and Pranav Sitaraman
 

Quick Minimization of Tardy Processing Time on a Single Machine

Document Type

Conference Proceeding

Publication Date

1-1-2023

Abstract

We consider the problem of minimizing the total processing time of tardy jobs on a single machine. This is a classical scheduling problem, first considered by [Lawler and Moore 1969], that also generalizes the Subset Sum problem. Recently, it was shown that this problem can be solved efficiently by computing (max, min)-skewed convolutions. The running time of the resulting algorithm is the same, up to logarithmic factors, as the time it takes to compute a (max, min)-skewed convolution of two integer vectors whose total size is O(P), where P is the sum of the jobs’ processing times. We further improve the running time of the minimum tardy processing time computation by introducing a job “bundling” technique and achieve an Õ(P^{2 - 1/α}) running time, where Õ(P^α) is the running time of a (max, min)-skewed convolution of vectors of size P. This results in an Õ(P^{7/5}) time algorithm for tardy processing time minimization, an improvement over the previously known Õ(P^{5/3}) time algorithm.
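
For context, below is a minimal sketch of the classical pseudo-polynomial dynamic program in the spirit of [Lawler and Moore 1969] for this problem (1 || Σ p_j U_j), which runs in O(nP) time. It is only the baseline that the paper improves upon; it does not implement the convolution-based algorithm or the bundling technique from the abstract, and the job representation and function name are illustrative assumptions.

```python
# Sketch of the classical Lawler-Moore-style O(n * P) dynamic program for
# minimizing total tardy processing time on a single machine.
# NOT the paper's faster convolution-based algorithm; names are illustrative.

def min_tardy_processing_time(jobs):
    """jobs: list of (processing_time, due_date) pairs."""
    P = sum(p for p, _ in jobs)
    # reachable[t] is True iff some subset of the jobs considered so far can be
    # scheduled on time (in earliest-due-date order) with total processing time t.
    reachable = [False] * (P + 1)
    reachable[0] = True
    # Scheduling the on-time jobs in earliest-due-date order is w.l.o.g. optimal.
    for p, d in sorted(jobs, key=lambda job: job[1]):
        for t in range(P - p, -1, -1):  # iterate downward: each job is used at most once
            if reachable[t] and t + p <= d:
                reachable[t + p] = True
    best_on_time = max(t for t in range(P + 1) if reachable[t])
    return P - best_on_time  # tardy processing time = total minus on-time


if __name__ == "__main__":
    # Example: jobs (2,3), (3,5), (4,6); scheduling the first and third on time
    # leaves only the job of length 3 tardy, so the answer is 3.
    print(min_tardy_processing_time([(2, 3), (3, 5), (4, 6)]))
```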

Identifier

85172733877 (Scopus)

ISBN

978-3-031-38905-4

Publication Title

Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)

External Full Text Location

https://doi.org/10.1007/978-3-031-38906-1_42

e-ISSN

1611-3349

ISSN

0302-9743

First Page

637

Last Page

643

Volume

14079 LNCS

Grant

UNCE/SCI/004

Fund Ref

National Science Foundation
