Run a process only when the previous one is finished | Bash [duplicate]
This question already has an answer here:
Quick-and-dirty way to ensure only one instance of a shell script is running at a time
39 answers
I am using aria2 to download some data, with the option --on-download-complete to run a bash script automatically to process the data.
aria2c --http-user='***' --http-passwd='***' --check-certificate=false --max-concurrent-downloads=2 -M products.meta4 --on-download-complete=/my/path/script_gpt.sh
Focusing on my bash script:
#!/bin/bash
oldEnd=.zip
newEnd=_processed.dim
for i in $(ls -d -1 /my/path/S1*.zip)
do
    if [ -f ${i%$oldEnd}$newEnd ]; then
        echo "Already processed"
    else
        gpt /my/path/graph.xml -Pinput1=$i -Poutput1=${i%$oldEnd}$newEnd
    fi
done
Basically, every time a download finishes, the for loop starts. First it checks whether the downloaded product has already been processed, and if not it runs a specific task.
My issue is that the bash script is run every time a download completes. This means that if the analysis from the previous run has not finished yet, both tasks overlap and eat all my memory resources.
Ideally, I would like to: each time the bash script is run, check whether there is still an ongoing process; if so, wait until it is finished and only then run.
It is like creating a queue of tasks (as in a for loop where each iteration waits until the previous one is finished).
I have tried to implement a solution with wait or by identifying the PID, but nothing was successful.
Maybe I should change the approach: instead of using aria2 to process the data that has just been downloaded, implement another solution?
bash shell directory filesystems aria2
marked as duplicate by tripleee
Mar 23 at 7:44
This question has been asked before and already has an answer. If those answers do not fully address your question, please ask a new question.
You could try a file lock at the beginning of the script and wait or exit if the file is locked.
– Poshi
Mar 22 at 10:53
Any example of how to implement it? I'm not familiar with this type of implementation.
– GCGM
Mar 22 at 10:59
I think that something similar to this would work: aria2c --http-user='***' --http-passwd='***' --check-certificate=false --max-concurrent-downloads=2 -M products.meta4 --on-download-complete="flock -x /tmp/aria.lock /my/path/script_gpt.sh"
– Poshi
Mar 22 at 11:06
Getting the following error: Could not execute user command: flock -x /tmp/aria.lock /my/path/script_gpt.sh: No such file or directory
– GCGM
Mar 22 at 11:23
hummm... check if flock exists in your system: which flock
– Poshi
Mar 22 at 11:48
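[Editor's note on that error: aria2 appears to execute the hook command directly rather than through a shell, so the whole string "flock -x /tmp/aria.lock /my/path/script_gpt.sh" is treated as a single executable path. A common workaround is to put the flock call in a small wrapper script and point --on-download-complete at that single path. This is a sketch; /tmp/locked_hook.sh is a hypothetical name, and /my/path/script_gpt.sh is the script from the question.]

```shell
#!/bin/bash
# Create a tiny wrapper so aria2c only has to execute a single path.
# /tmp/locked_hook.sh is a hypothetical location; adjust to taste.
cat > /tmp/locked_hook.sh <<'EOF'
#!/bin/bash
# aria2 invokes the hook with its own arguments; forward them to the
# real script while holding an exclusive lock on /tmp/aria.lock.
exec flock -x /tmp/aria.lock /my/path/script_gpt.sh "$@"
EOF
chmod +x /tmp/locked_hook.sh
# Then call: aria2c ... --on-download-complete=/tmp/locked_hook.sh
```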
asked Mar 22 at 10:31 by GCGM
1 Answer
You can try to acquire an exclusive file lock and only run when the lock is released. Your code could be like:
#!/bin/bash
oldEnd=.zip
newEnd=_processed.dim
{
    flock -e 200
    while IFS= read -r -d '' i
    do
        if [ -f "${i%$oldEnd}$newEnd" ]; then
            echo "Already processed"
        else
            gpt /my/path/graph.xml -Pinput1="$i" -Poutput1="${i%$oldEnd}$newEnd"
        fi
    done < <(find /my/path -maxdepth 1 -name "S1*.zip" -print0)
} 200> /tmp/aria.lock
This code takes an exclusive lock on file descriptor 200 (the one we told bash to open and redirect to the lock file) and prevents other instances of the script from executing the code block until the descriptor is closed. It is closed as soon as the code block finishes, allowing other waiting processes to continue execution.
BTW, you should always quote your variables, and you should avoid parsing the ls output. Also, to avoid problems with whitespace and unexpected globbing, print the file list separated by NUL characters (find ... -print0) and read it back with read -d ''; that sidesteps both issues.
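[Editor's note: the serializing behaviour of flock can be seen in a minimal standalone sketch; the lock path, function name, and messages here are illustrative, not part of the original answer.]

```shell
#!/bin/bash
# Two concurrent invocations of the locked block run one after the other:
# whichever acquires fd 200 first finishes before the other may start.
lockfile=/tmp/demo.lock

locked_job() {
    {
        flock -e 200            # blocks here until the lock is free
        echo "start $1"
        sleep 1                 # simulate the long-running gpt task
        echo "end $1"
    } 200> "$lockfile"
}

locked_job A &
locked_job B &
wait
# Each job's "start" line is immediately followed by its own "end" line;
# the two jobs never interleave.
```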
answered Mar 22 at 14:05 by Poshi