Fixed the `limit.sh` script according to the problem reported in #47.
The issues I found were:

- The variable `node_tag` was read as empty in the awk command, so the filter matched far too many processes. This was solved by storing the value in a global variable (see the first sketch below).
- After that, `cgclassify` was working, but it always added one extra process (a different PID on each call, never findable in the process list). This is caused by the pipe itself: the pipeline spawns a short-lived helper that matches the filter. To move only the correct processes into the group, I first capture the PID list in an auxiliary variable `aux` (see the second sketch below).
- The `limitFile`, where the information read from REST is stored, is now deleted every time the script runs (see the last sketch below).
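To make the first fix concrete, here is a minimal sketch of the pattern. The names (`node_tag`, `my-node`, `find_pids`, the `ps` columns) are illustrative, not taken from `limit.sh`; the point is that the tag lives in a global shell variable, so it is non-empty by the time the awk filter runs:

```bash
#!/usr/bin/env bash
# Global scope: assigned once, visible everywhere below. If node_tag
# expanded to "" here, the awk filter would match (almost) every process.
node_tag="my-node"

find_pids() {
    # Hand the shell value to awk via -v instead of interpolating it into
    # the program text; print the PID of every process whose command line
    # contains the tag, excluding the awk helper itself from the snapshot.
    ps -eo pid,args | awk -v tag="$node_tag" 'index($0, tag) && !/awk/ { print $1 }'
}
```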
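A sketch of the second fix under the same assumptions (the group name `limited_group` and controllers are hypothetical): the matching PIDs are captured in an auxiliary variable first, so the short-lived helper processes spawned by the pipeline, the source of the phantom extra PID, are already gone before anything is moved into the cgroup:

```bash
# Capture the PID list once; the command substitution's own ps/awk helpers
# have already exited by the time the loop below runs.
aux=$(ps -eo pid,args | awk -v tag="$node_tag" 'index($0, tag) && !/awk/ { print $1 }')

for pid in $aux; do
    # Skip any PID that disappeared between the snapshot and now.
    kill -0 "$pid" 2>/dev/null || continue
    cgclassify -g cpu,memory:limited_group "$pid"
done
```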
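And the third change in one line (`limitFile` is the name from the description; the path it holds is whatever the script uses):

```bash
# Delete the previous run's REST data so each run starts clean;
# -f keeps this quiet on the very first run, when the file does not exist.
rm -f "$limitFile"
```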
I tested it manually, using the following dummy script as the process:
```bash
#!/usr/bin/env bash
while :
do
    echo "Hello"
    echo $$
    sleep 10
done
```
and it worked, also with multiple instances of the dummy running at once.
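For reference, a hedged sketch of the manual test; `dummy.sh`, the cgroup name, and the cgroup-v1 mount path are assumptions, not taken from the repository:

```bash
# Start two instances of the dummy script in the background.
./dummy.sh &
./dummy.sh &

# Run the fixed limiter, then check that exactly the dummy PIDs
# (and no phantom extras) landed in the target group.
./limit.sh
cat /sys/fs/cgroup/cpu/limited_group/cgroup.procs
```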