How to dynamically add tasks to a PowerShell job queue

I have created a PowerShell script that copies any e-books I download to a predetermined directory, which is periodically scanned by my e-book manager and added to my library. The script is run immediately after every download.

The problem is that when several books are downloaded at or around the same time, the e-book manager locks up and becomes unresponsive.

Therefore, I would like to queue the copying using PowerShell jobs, but I do not know how to create a single queue (single concurrency) in which each subsequent job waits for every older job to complete.

That is, I would like the script to create a job (call it a "Book Job") that periodically checks the queue of running Book Jobs and waits until all older Book Jobs have finished before it runs. When it completes, a Book Job should signal that it has finished in some way that younger Book Jobs can detect.

Does anyone know how I can do this? I found a similar question that I am looking at: Powershell background tasks. However, in my case the script is run multiple times (after every new download).

Mavaddat Javid

Posted 2018-03-26T23:39:55.110

Reputation: 43

I don't want to do that (have the script sleep and check again after an interval), because the PowerShell script is run after each download completes (with an input parameter for the specific subdirectory that contains the newly downloaded files). This means the queueing logic cannot rely on the script itself persisting in the background. I would like the script to create a job that persists in the background and checks whether it is the oldest job of its kind before executing the copying logic. – Mavaddat Javid – 2018-03-27T02:11:37.333

Does the answer below look like a reasonable solution for your needs? I'm curious whether you want the queue wait to apply to each file individually, so that each execution of the script handles a single file, or only to whatever files each execution finds when it starts, so that it processes just those files and not anything added to the monitored download folder after the job began. Maybe adding an extra staging folder in the loop, and clearing its files out before copying others, would help with this process. – Pimp Juice IT – 2018-03-28T13:08:17.437

Answers

My thought is to establish a queue by creating one lock file per instance of your script. When the script runs, it checks a directory dedicated to tracking the queue for existing instances. If there are none, the script adds itself to the front of the queue, performs its action (runs your code), then cleans up its lock. If locks already exist, the script adds a new one to the end of the queue, and that instance checks repeatedly until it reaches the front.

This allows you to run the same script multiple times, with each instance managing itself by checking the externally visible queue.

Each lock file name is structured as index, delimiter ("_"), process ID, e.g. 0_19831.lck.

Clear-Host

$queue = "C:\locks\"

function New-Lock ([int] $index) {
    # lock file name: <index>_<pid>.lck
    $newLock = "{0}_{1}.lck" -f $index, $pid
    New-Item (Join-Path $queue $newLock) | Out-Null
}

# find the end of the stack; sort numerically on the index so that,
# for example, 10_... sorts after 9_... (a plain lexical sort would not)
$locks = Get-ChildItem $queue -Filter *.lck |
    Sort-Object { [int]$_.Name.Split("_")[0] } |
    Select-Object -ExpandProperty Name

# if locks exist, append our lock after the last index on the stack
if ($locks) {
    # take the last lock file and extract its index by splitting on the delimiter
    [int]$last = ($locks | Select-Object -Last 1).Split("_")[0]
    New-Lock ($last + 1)
}
# if no locks exist, create one at the top of the stack
else {
    New-Lock 0
}

# wait until our PID is at the top of the stack
do {
    $locks = Get-ChildItem $queue -Filter *.lck |
        Sort-Object { [int]$_.Name.Split("_")[0] } |
        Select-Object -ExpandProperty Name

    # the PID embedded in the lock at the top of the stack
    [int]$top = ($locks | Select-Object -First 1).Split("_")[1].Split(".")[0]

    if ($pid -ne $top) {
        Write-Verbose "not at the top..."
        Start-Sleep 1
    }
} until ($pid -eq $top)

# if we're here, we've reached the top. it's our turn to do something
Write-Verbose "we've reached the top!"
# <do something. put your code here>
# might be good to add some Start-Sleep here
# </do something put your code here>

# now that we're done, delete our own lock; matching on our PID means we
# can never remove another instance's lock by mistake
Get-ChildItem $queue -Filter "*_$pid.lck" | Remove-Item
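One caveat with lock files: if an instance crashes (or the machine reboots) before it deletes its lock, that stale lock blocks the queue forever. A small hygiene pass before joining the queue can clear locks whose embedded PID no longer matches a running process. This is only a sketch, not part of the script above:

```powershell
# Hypothetical stale-lock cleanup: remove any lock whose PID is no longer
# a running process, so a crashed instance cannot block the queue.
$queue = "C:\locks\"
Get-ChildItem $queue -Filter *.lck | Where-Object {
    # extract the PID portion of "<index>_<pid>.lck"
    $lockPid = [int]$_.Name.Split("_")[1].Split(".")[0]
    -not (Get-Process -Id $lockPid -ErrorAction SilentlyContinue)
} | Remove-Item
```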

Below is a fictitious timeline in which you've downloaded three files (I've chosen random PIDs).

  • File 1 is downloaded and launches the script. There are no existing locks. Create lock "0_19831". We're at the top of the stack, so your code is executed. This is a big e-book, so your file transfer code will take a full minute to run.
  • File 2 is downloaded and launches the script. Lock(s) exist. Create lock "1_332". We're not at the top of the stack, so we'll wait in our do/until and keep checking until we're first in line.
  • File 1 finished copying. Delete lock "0_19831".
  • File 3 is downloaded and launches the script. Lock(s) exist. Create lock "2_7582". We're not at the top of the stack, wait until we are.
  • File 2 finished copying. Delete lock "1_332".
  • File 3 finished copying. Delete lock "2_7582".

This solution isn't bulletproof, but it might work depending on the scale.
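In particular, there is a small window between checking the lock directory and creating a lock in which two instances could pick the same index. If that ever bites, one alternative worth sketching is a named system mutex, where the OS itself serializes the waiters; "Global\BookCopyQueue" is an arbitrary name chosen here, not anything from the script above:

```powershell
# Sketch: single-concurrency gate via a named mutex instead of lock files.
# Waiting instances are queued by the OS, so only one runs its copy logic
# at a time, and the handle disappears automatically if a process dies.
$mutex = New-Object System.Threading.Mutex($false, "Global\BookCopyQueue")
try {
    # blocks until every older holder has released the mutex
    $null = $mutex.WaitOne()
    # <copy the newly downloaded files here>
}
finally {
    $mutex.ReleaseMutex()
    $mutex.Dispose()
}
```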

root

Posted 2018-03-26T23:39:55.110

Reputation: 2 992