
Run multiple python files simultaneously and terminate all when one has finished


I need to run multiple Python files simultaneously and, when one finishes, kill all of the rest. I only need the output of the Python file that finishes first. After much trial and error, I decided to use a bash script (I am running Ubuntu) to start all of the files simultaneously; however, I cannot stop the others once the first one is done. I have tried many answers I found online without success. I even wrote a Python file to kill the other Python files, which works when started from the terminal but not when called from the bash script or from another Python file. I feel like I am overcomplicating the problem. Any help is greatly appreciated.



Assuming I read in a variable named pass and pass it to each file, here are the commands I want to run in parallel:



read pass
python3 1.py "$pass"
python3 2.py "$pass"
python3 3.py "$pass"
python3 4.py "$pass"
python3 5.py "$pass"
python3 6.py "$pass"
python3 7.py "$pass"
python3 8.py "$pass"
python3 9.py "$pass"
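The same first-one-wins behavior can also be sketched directly in Python with the standard subprocess module, without a separate killer script. This is only an illustrative sketch: `run_until_first_exits` is a made-up helper name, and the inline `sleep` commands stand in for `1.py` through `9.py`.

```python
import subprocess
import sys
import time

def run_until_first_exits(commands):
    """Start every command at once; return the output of the first one
    to finish and kill all the others."""
    procs = [subprocess.Popen(cmd, stdout=subprocess.PIPE) for cmd in commands]
    try:
        while True:
            for p in procs:
                if p.poll() is not None:          # this one has finished
                    return p.stdout.read().decode()
            time.sleep(0.05)
    finally:
        for p in procs:
            if p.poll() is None:
                p.kill()                          # stop the stragglers

if __name__ == "__main__":
    # Stand-ins for the real commands, which would be something like
    # [["python3", f"{i}.py", password] for i in range(1, 10)]
    cmds = [[sys.executable, "-c", f"import time; time.sleep({t}); print({t})"]
            for t in (5, 0.1, 5)]
    print(run_until_first_exits(cmds).strip())
```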

































  • Why would you need to launch multiple processes but only take the output of the first to finish? This doesn't make sense to me

    – roganjosh
    Mar 22 at 13:00











  • Ideally I would utilize multiple processes or threads in C or Java, but I need to use python for other reasons and so each file is given the same input but it is trying operations on the same data. There is only one answer so only one of the files should work

    – mike
    Mar 22 at 13:03











  • can't you use forking? you can do all the processes you need, and when a process ends you can exit all the others manually, but I can't really think why you want to exit other processes if it is still working

    – Mobrine Hayde
    Mar 22 at 13:04












  • So all of the processes are doing the same thing and probably finish within less than a second of each other. Depending on how many processes you launch, they may be competing for resources.

    – roganjosh
    Mar 22 at 13:04












  • Yes they are all doing the same thing, but only one file will actually finish. All of the others will continue to look for an answer that is not there.

    – mike
    Mar 22 at 13:06
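The forking idea raised in the comments can be sketched with Python's standard multiprocessing module. This is a hypothetical stand-in, not the asker's code: each `worker` sleeps in place of a real search, the queue carries the first answer back, and `terminate()` plays the role of the kill step.

```python
import multiprocessing as mp
import time

def worker(seconds, queue):
    # Stand-in for one of the scripts: "search" for a while,
    # then report the answer it found.
    time.sleep(seconds)
    queue.put(seconds)

if __name__ == "__main__":
    queue = mp.Queue()
    procs = [mp.Process(target=worker, args=(t, queue)) for t in (3, 0.1, 3)]
    for p in procs:
        p.start()
    result = queue.get()          # blocks until the first worker reports
    for p in procs:
        p.terminate()             # kill the workers still searching
    print("first result:", result)
```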






































Tags: python, linux, bash






edited Mar 23 at 15:39







mike

















asked Mar 22 at 12:55









mike

297





















2 Answers
































Here is a small example of how you can start multiple Python scripts in parallel. After the first script prints its result, all of the others are killed.



Python script (runner.py)



#!/usr/bin/env python3
import sys
import time

seconds = sys.argv[1]
time.sleep(int(seconds))
print("runner sleep {} seconds.".format(seconds))


Bash script



#!/bin/bash

start() {              # function for starting python scripts
    echo "start $@"    # print which runner is started
    "$@" &             # start python script in background
    pids+=("$!")       # save pids in array
}

start ./runner.py 5    # start python scripts
start ./runner.py 7
start ./runner.py 1
start ./runner.py 2
start ./runner.py 3
start ./runner.py 9
start ./runner.py 8
start ./runner.py 3
start ./runner.py 4
start ./runner.py 7

wait -n "${pids[@]}"           # wait for the first process to finish
kill "${pids[@]}" 2>/dev/null  # kill all other processes


Output



start ./runner.py 5
start ./runner.py 7
start ./runner.py 1
start ./runner.py 2
start ./runner.py 3
start ./runner.py 9
start ./runner.py 8
start ./runner.py 3
start ./runner.py 4
start ./runner.py 7
runner sleep 1 seconds.






























  • You could get rid of the pids array by running wait -n without any PIDs to wait for all children, and pkill -P "$$" to kill all of the shell's children.

    – John Kugelman
    Mar 22 at 17:26












  • Thanks everyone, I think this will work!

    – mike
    Mar 22 at 23:41











  • @UtLox I am getting this error on line 6: Syntax error: word unexpected (expecting ")") For me, line 6 is this: pids+=("$!")

    – mike
    Mar 23 at 1:40







  • Your script starts with "#!/bin/sh" - correct? Use "#!/bin/bash" and it will run!

    – UtLox
    Mar 23 at 7:18











  • You were right! Thanks!!

    – mike
    Mar 23 at 15:48
































Updated Answer



Thank you for your inputs. In light of those, I would suggest you use:



parallel --dry-run -k --halt now,success=1 python3 {}.py "$pass" ::: {1..9}


Sample Output



python3 1.py "$pass"
python3 2.py "$pass"
python3 3.py "$pass"
python3 4.py "$pass"
python3 5.py "$pass"
python3 6.py "$pass"
python3 7.py "$pass"
python3 8.py "$pass"
python3 9.py "$pass"


If that looks correct, you can remove --dry-run and -k and simply run:



parallel --halt now,success=1 python3 {}.py "$pass" ::: {1..9}


Original Answer



I think you can leverage GNU Parallel for this, specifically its halt policy. So, as an example, start 10 jobs that sleep 10, 20, 30, 40... 100 seconds each, and kill the others as soon as any one succeeds:



parallel --halt now,success=1 sleep ::: $(seq 10 10 100)
parallel: This job succeeded:
sleep 10


If you add -k and --dry-run after parallel, you will see that it starts the following jobs:



parallel -k --dry-run --halt now,success=1 sleep ::: $(seq 10 10 100)

sleep 10
sleep 20
sleep 30
sleep 40
sleep 50
sleep 60
sleep 70
sleep 80
sleep 90
sleep 100



Another example which kills outstanding jobs once 20% of the jobs have succeeded:



parallel --halt now,success=20% sleep ::: $(seq 10 10 100)
parallel: This job succeeded:
sleep 10
parallel: This job succeeded:
sleep 20






























  • I see you have the same task, sleep, but you pass different variables to it: 10, 10, 100. How would I use this if I have the same input but different files?

    – mike
    Mar 23 at 2:15






  • Click edit under your question and add a list of the commands you want to run in parallel and I'll update my answer to show how.

    – Mark Setchell
    Mar 23 at 9:42











  • Ok, I edited my question

    – mike
    Mar 23 at 15:22






  • I have updated my answer - please have another look.

    – Mark Setchell
    Mar 23 at 19:28












2 Answers
2






active

oldest

votes








2 Answers
2






active

oldest

votes









active

oldest

votes






active

oldest

votes









3














Here is a little example of how you can start multible python scripts in parallel. After the first script print the output result, all other where be killed.



python script



import sys
import time
seconds = sys.argv[1]
time.sleep(int(seconds))
print("runner sleep seconds.".format(seconds))


bash script



start() # function for starting python scripts
echo "start $@" # print which runner is started
"$@" & # start python script in background
pids+=("$!") # save pids in array


start ./runner.py 5 # start python scripts
start ./runner.py 7
start ./runner.py 1
start ./runner.py 2
start ./runner.py 3
start ./runner.py 9
start ./runner.py 8
start ./runner.py 3
start ./runner.py 4
start ./runner.py 7

wait -n "$pids[@]" # wait for first finished process
kill "$pids[@]" 2>/dev/null # kill all/other process


output



start ./runner.py 5
start ./runner.py 7
start ./runner.py 1
start ./runner.py 2
start ./runner.py 3
start ./runner.py 9
start ./runner.py 8
start ./runner.py 3
start ./runner.py 4
start ./runner.py 7
runner sleep 1 seconds.





share|improve this answer

























  • You could get rid of the pids array by running wait -n without any PIDs to wait for all children, and pkill -P "$$" to kill all of the shell's children.

    – John Kugelman
    Mar 22 at 17:26












  • Thanks everyone, I think this will work!

    – mike
    Mar 22 at 23:41











  • @UtLox I am getting this error on line 6: Syntax error: word unexpected (expecting ")") For me, line 6 is this: pids+=("$!")

    – mike
    Mar 23 at 1:40







  • 1





    Your script starts with "#!/bin/sh" - correct? Use "#!/bin/bash" and it will be run!

    – UtLox
    Mar 23 at 7:18











  • You were right! Thanks!!

    – mike
    Mar 23 at 15:48















3














Here is a little example of how you can start multible python scripts in parallel. After the first script print the output result, all other where be killed.



python script



import sys
import time
seconds = sys.argv[1]
time.sleep(int(seconds))
print("runner sleep seconds.".format(seconds))


bash script



start() # function for starting python scripts
echo "start $@" # print which runner is started
"$@" & # start python script in background
pids+=("$!") # save pids in array


start ./runner.py 5 # start python scripts
start ./runner.py 7
start ./runner.py 1
start ./runner.py 2
start ./runner.py 3
start ./runner.py 9
start ./runner.py 8
start ./runner.py 3
start ./runner.py 4
start ./runner.py 7

wait -n "$pids[@]" # wait for first finished process
kill "$pids[@]" 2>/dev/null # kill all/other process


output



start ./runner.py 5
start ./runner.py 7
start ./runner.py 1
start ./runner.py 2
start ./runner.py 3
start ./runner.py 9
start ./runner.py 8
start ./runner.py 3
start ./runner.py 4
start ./runner.py 7
runner sleep 1 seconds.





share|improve this answer

























  • You could get rid of the pids array by running wait -n without any PIDs to wait for all children, and pkill -P "$$" to kill all of the shell's children.

    – John Kugelman
    Mar 22 at 17:26












  • Thanks everyone, I think this will work!

    – mike
    Mar 22 at 23:41











  • @UtLox I am getting this error on line 6: Syntax error: word unexpected (expecting ")") For me, line 6 is this: pids+=("$!")

    – mike
    Mar 23 at 1:40







  • 1





    Your script starts with "#!/bin/sh" - correct? Use "#!/bin/bash" and it will be run!

    – UtLox
    Mar 23 at 7:18











  • You were right! Thanks!!

    – mike
    Mar 23 at 15:48













3












3








3







Here is a little example of how you can start multible python scripts in parallel. After the first script print the output result, all other where be killed.



python script



import sys
import time
seconds = sys.argv[1]
time.sleep(int(seconds))
print("runner sleep seconds.".format(seconds))


bash script



start() # function for starting python scripts
echo "start $@" # print which runner is started
"$@" & # start python script in background
pids+=("$!") # save pids in array


start ./runner.py 5 # start python scripts
start ./runner.py 7
start ./runner.py 1
start ./runner.py 2
start ./runner.py 3
start ./runner.py 9
start ./runner.py 8
start ./runner.py 3
start ./runner.py 4
start ./runner.py 7

wait -n "$pids[@]" # wait for first finished process
kill "$pids[@]" 2>/dev/null # kill all/other process


output



start ./runner.py 5
start ./runner.py 7
start ./runner.py 1
start ./runner.py 2
start ./runner.py 3
start ./runner.py 9
start ./runner.py 8
start ./runner.py 3
start ./runner.py 4
start ./runner.py 7
runner sleep 1 seconds.





share|improve this answer















Here is a little example of how you can start multible python scripts in parallel. After the first script print the output result, all other where be killed.



python script



import sys
import time
seconds = sys.argv[1]
time.sleep(int(seconds))
print("runner sleep seconds.".format(seconds))


bash script



start() # function for starting python scripts
echo "start $@" # print which runner is started
"$@" & # start python script in background
pids+=("$!") # save pids in array


start ./runner.py 5 # start python scripts
start ./runner.py 7
start ./runner.py 1
start ./runner.py 2
start ./runner.py 3
start ./runner.py 9
start ./runner.py 8
start ./runner.py 3
start ./runner.py 4
start ./runner.py 7

wait -n "$pids[@]" # wait for first finished process
kill "$pids[@]" 2>/dev/null # kill all/other process


output



start ./runner.py 5
start ./runner.py 7
start ./runner.py 1
start ./runner.py 2
start ./runner.py 3
start ./runner.py 9
start ./runner.py 8
start ./runner.py 3
start ./runner.py 4
start ./runner.py 7
runner sleep 1 seconds.






share|improve this answer














share|improve this answer



share|improve this answer








edited Mar 22 at 17:19









John Kugelman

250k54407461




250k54407461










answered Mar 22 at 17:04









UtLoxUtLox

62425




62425












  • You could get rid of the pids array by running wait -n without any PIDs to wait for all children, and pkill -P "$$" to kill all of the shell's children.

    – John Kugelman
    Mar 22 at 17:26












  • Thanks everyone, I think this will work!

    – mike
    Mar 22 at 23:41











  • @UtLox I am getting this error on line 6: Syntax error: word unexpected (expecting ")") For me, line 6 is this: pids+=("$!")

    – mike
    Mar 23 at 1:40







  • 1





    Your script starts with "#!/bin/sh" - correct? Use "#!/bin/bash" and it will be run!

    – UtLox
    Mar 23 at 7:18











  • You were right! Thanks!!

    – mike
    Mar 23 at 15:48

















  • You could get rid of the pids array by running wait -n without any PIDs to wait for all children, and pkill -P "$$" to kill all of the shell's children.

    – John Kugelman
    Mar 22 at 17:26












  • Thanks everyone, I think this will work!

    – mike
    Mar 22 at 23:41











  • @UtLox I am getting this error on line 6: Syntax error: word unexpected (expecting ")") For me, line 6 is this: pids+=("$!")

    – mike
    Mar 23 at 1:40







  • 1





    Your script starts with "#!/bin/sh" - correct? Use "#!/bin/bash" and it will be run!

    – UtLox
    Mar 23 at 7:18











  • You were right! Thanks!!

    – mike
    Mar 23 at 15:48
















You could get rid of the pids array by running wait -n without any PIDs to wait for all children, and pkill -P "$$" to kill all of the shell's children.

– John Kugelman
Mar 22 at 17:26






You could get rid of the pids array by running wait -n without any PIDs to wait for all children, and pkill -P "$$" to kill all of the shell's children.

– John Kugelman
Mar 22 at 17:26














Thanks everyone, I think this will work!

– mike
Mar 22 at 23:41





Thanks everyone, I think this will work!

– mike
Mar 22 at 23:41













@UtLox I am getting this error on line 6: Syntax error: word unexpected (expecting ")") For me, line 6 is this: pids+=("$!")

– mike
Mar 23 at 1:40






@UtLox I am getting this error on line 6: Syntax error: word unexpected (expecting ")") For me, line 6 is this: pids+=("$!")

– mike
Mar 23 at 1:40





1




1





Your script starts with "#!/bin/sh" - correct? Use "#!/bin/bash" and it will be run!

– UtLox
Mar 23 at 7:18





Your script starts with "#!/bin/sh" - correct? Use "#!/bin/bash" and it will be run!

– UtLox
Mar 23 at 7:18













You were right! Thanks!!

– mike
Mar 23 at 15:48





You were right! Thanks!!

– mike
Mar 23 at 15:48













3














Updated Answer



Thank you for your inputs. In the light of those, I would suggest you use:



parallel --dry-run -k --halt now,success=1 python3 .py "$pass" ::: 1..9


Sample Output



python3 1.py "$pass"
python3 2.py "$pass"
python3 3.py "$pass"
python3 4.py "$pass"
python3 5.py "$pass"
python3 6.py "$pass"
python3 7.py "$pass"
python3 8.py "$pass"
python3 9.py "$pass"


If that looks correct, you can remove --dry-run and -k and simply run:



parallel --halt now,success=1 python3 .py "$pass" ::: 1..9


Original Answer



I think you can leverage GNU Parallel for this, specifically its halt policy. So, as an example, start 10 jobs, that sleep 10, 20, 30, 40... 100 seconds each and kill the others as soon as any one succeeds:



parallel --halt now,success=1 sleep ::: $(seq 10 10 100)
parallel: This job succeeded:
sleep 10


If you add -k and --dry-run after parallel, you will see that it starts the following jobs:



parallel -k --dry-run --halt now,success=1 sleep ::: $(seq 10 10 100)

sleep 10
sleep 20
sleep 30
sleep 40
sleep 50
sleep 60
sleep 70
sleep 80
sleep 90
sleep 100



Another example which kills outstanding jobs once 20% of the jobs have succeeded:



parallel --halt now,success=20% sleep ::: $(seq 10 10 100)
parallel: This job succeeded:
sleep 10
parallel: This job succeeded:
sleep 20





share|improve this answer

























  • I see you have the same task, sleep, but you pass different variables to it: 10, 10, 100. How would I use this if I have the same input but different files?

    – mike
    Mar 23 at 2:15






  • 1





    Click edit under your question and add a list of the commands you want to run in parallel and I'll update my answer to show how.

    – Mark Setchell
    Mar 23 at 9:42











  • Ok, I edited my question

    – mike
    Mar 23 at 15:22






  • 1





    I have updated my answer - please have another look.

    – Mark Setchell
    Mar 23 at 19:28















3














Updated Answer



Thank you for your inputs. In the light of those, I would suggest you use:



parallel --dry-run -k --halt now,success=1 python3 .py "$pass" ::: 1..9


Sample Output



python3 1.py "$pass"
python3 2.py "$pass"
python3 3.py "$pass"
python3 4.py "$pass"
python3 5.py "$pass"
python3 6.py "$pass"
python3 7.py "$pass"
python3 8.py "$pass"
python3 9.py "$pass"


If that looks correct, you can remove --dry-run and -k and simply run:



parallel --halt now,success=1 python3 .py "$pass" ::: 1..9


Original Answer



I think you can leverage GNU Parallel for this, specifically its halt policy. So, as an example, start 10 jobs, that sleep 10, 20, 30, 40... 100 seconds each and kill the others as soon as any one succeeds:



parallel --halt now,success=1 sleep ::: $(seq 10 10 100)
parallel: This job succeeded:
sleep 10


If you add -k and --dry-run after parallel, you will see that it starts the following jobs:



parallel -k --dry-run --halt now,success=1 sleep ::: $(seq 10 10 100)

sleep 10
sleep 20
sleep 30
sleep 40
sleep 50
sleep 60
sleep 70
sleep 80
sleep 90
sleep 100



Another example which kills outstanding jobs once 20% of the jobs have succeeded:



parallel --halt now,success=20% sleep ::: $(seq 10 10 100)
parallel: This job succeeded:
sleep 10
parallel: This job succeeded:
sleep 20
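For comparison, recent bash (4.3 or later) can approximate `--halt now,success=1` without GNU Parallel using the `wait -n` builtin, which blocks until the first background job exits. A minimal sketch, not part of the original answer — note that plain `wait -n` returns on the first job to *finish*, successful or not, so it is closer to `done=1` than `success=1`:

```shell
#!/bin/bash
# Start the competing jobs in the background
for t in 10 20 30; do
    sleep "$t" &
done

# wait -n blocks until the first background job exits
wait -n

# Kill whatever is still running
kill $(jobs -p) 2>/dev/null
```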





  • I see you have the same task, sleep, but you pass different variables to it: 10, 10, 100. How would I use this if I have the same input but different files? – mike, Mar 23 at 2:15

  • Click edit under your question and add a list of the commands you want to run in parallel and I'll update my answer to show how. – Mark Setchell, Mar 23 at 9:42

  • Ok, I edited my question – mike, Mar 23 at 15:22

  • I have updated my answer - please have another look. – Mark Setchell, Mar 23 at 19:28













answered Mar 22 at 17:27 by Mark Setchell (edited Mar 23 at 19:27)











