rsalmei / alive-progress

A new kind of Progress Bar, with real-time throughput, ETA, and very cool animations!
MIT License

Using progress bar with multiprocessing #139

Closed. Tim1808 closed this issue 2 years ago.

Tim1808 commented 2 years ago

This is not a bug report, I just can't figure out how to do what I want.

Basically I have a multiprocessing_func, which is the target I pass to each process, and inside it I iterate over i (the different board sizes for chess).

#!./env/bin/python3
import csv
import subprocess
from time import perf_counter
from multiprocessing import Process
from alive_progress import alive_bar

BOARD_SIZES = range(4, 29)
PLAYER_TYPES_W = ['0', '2']
PLAYER_TYPES_B = range(0, 3)
QUEEN_ROOK = ['q', 'r']
MAX_GAME_LENGTH = 100
SEED = 123
PRINT = 0
SIMULATIONS = 100
CONFIGCOUNT = len(BOARD_SIZES)

....
....

def simulateCombinations(size):
    for type_w in PLAYER_TYPES_W:
        for type_b in PLAYER_TYPES_B:  # black
            for queen_or_rook in QUEEN_ROOK:
                sim = {"bin_path": "./chess",
                       "size": size,
                       "simulations": SIMULATIONS,
                       "max_game_length": MAX_GAME_LENGTH,
                       "type_W": type_w,
                       "type_B": type_b,
                       "queen_or_rook": queen_or_rook,
                       "print": PRINT,
                       "seed": SEED
                       }
                simulation = [str(x) for x in sim.values()]
                subprocess.run(simulation)

def multiprocessing_func(size):
    with alive_bar(CONFIGCOUNT) as bar:
        start = perf_counter()
        simulateCombinations(size)
        end = perf_counter()
        print(
            "Done simulating {2} games for each player on a {0}x{0} board in {1}s".format(size, round(end-start, 4), SIMULATIONS))
        bar()

if __name__ == '__main__':
    initCSV()
    start = perf_counter()
    processes = []
    for i in BOARD_SIZES:
        p = Process(target=multiprocessing_func, args=(i,))
        processes.append(p)
        p.start()

    for process in processes:
        process.join

    end = perf_counter()
    print("That took {0}s".format(round(end-start, 4)))
    orderCSV()

I would like to have an alive_bar that takes the length of the BOARD_SIZES range as its argument, and each time a process finishes I want to call bar() to advance the bar. Clearly, the way I coded it right now, I get 25 bars that each only fill 1 out of the possible 25, like so:

....
|█▋⚠︎                                     | (!) 1/25 [4%] in 31.5s (0.03/s)
on 0: Done simulating 100 games for each player on a 27x27 board in 36.1741s
|█▋⚠︎                                     | (!) 1/25 [4%] in 36.2s (0.03/s)
on 0: Done simulating 100 games for each player on a 26x26 board in 36.4138s
|█▋⚠︎                                     | (!) 1/25 [4%] in 36.4s (0.03/s)
on 0: Done simulating 100 games for each player on a 28x28 board in 41.364s
|█▋⚠︎                                     | (!) 1/25 [4%] in 41.4s (0.02/s)

How should I do this for my use case?

TheTechRobo commented 2 years ago

Related: #20

Tim1808 commented 2 years ago

> Related: #20

What I am trying to do seems to be simpler. I do not want multiple bars because I can't even get one to work. Would be cool though.

TheTechRobo commented 2 years ago

Oh, I thought that was what you were trying to do, sorry. (I saw the multiple bars in your output and didn't realize you hadn't meant for there to be 25 of them...)

TheTechRobo commented 2 years ago

Well, your current problem is that you keep making new bars in the function every time it's called, rather than using an existing one. Try making a bar and passing it to the multiprocessed function as a parameter?

Tim1808 commented 2 years ago

> Well, your current problem is that you keep making new bars in the function every time it's called, rather than using an existing one. Try making a bar and passing it to the multiprocessed function as a parameter?

#!./env/bin/python3
import csv
import subprocess
from time import perf_counter
from multiprocessing import Process
from alive_progress import alive_bar

BOARD_SIZES = range(4, 29)
PLAYER_TYPES_W = ['0', '2']
PLAYER_TYPES_B = range(0, 3)
QUEEN_ROOK = ['q', 'r']
MAX_GAME_LENGTH = 100
SEED = 123
PRINT = 0
SIMULATIONS = 100
CONFIGCOUNT = len(BOARD_SIZES)

def initCSV():
    subprocess.run(["make", "clean"])  # start with ` empty` csv file
    with open('output.csv', 'w') as file:
        fieldnames = ["average_game_length", "max_game_length", "queen", "size",
                      "stalemates", "stopped", "ties", "total_simulations", "type_black", "type_white", "wins"]
        writer = csv.DictWriter(file, fieldnames=fieldnames)
        writer.writeheader()
    subprocess.run(["make"])  # build cpp project

def orderCSV():
    with open('output.csv', 'r') as infile, open('outputord.csv', 'w') as outfile:
        fieldnames = ['size', 'total_simulations', 'max_game_length', 'type_white', 'type_black',
                      'queen', 'wins', 'stalemates', 'ties', 'stopped', 'average_game_length']
        writer = csv.DictWriter(outfile, fieldnames=fieldnames)
        writer.writeheader()
        for row in csv.DictReader(infile):
            writer.writerow(row)

def simulateCombinations(size):
    for type_w in PLAYER_TYPES_W:
        for type_b in PLAYER_TYPES_B:  # black
            for queen_or_rook in QUEEN_ROOK:
                sim = {"bin_path": "./chess",
                       "size": size,
                       "simulations": SIMULATIONS,
                       "max_game_length": MAX_GAME_LENGTH,
                       "type_W": type_w,
                       "type_B": type_b,
                       "queen_or_rook": queen_or_rook,
                       "print": PRINT,
                       "seed": SEED
                       }
                simulation = [str(x) for x in sim.values()]
                subprocess.run(simulation)

def multiprocessing_func(size, bar):
    start = perf_counter()
    simulateCombinations(size)
    end = perf_counter()
    print(
        "Done simulating {2} games for each player on a {0}x{0} board in {1}s".format(size, round(end-start, 4), SIMULATIONS))
    bar()

if __name__ == '__main__':
    with alive_bar(CONFIGCOUNT) as bar:
        initCSV()
        start = perf_counter()
        processes = []
        for i in BOARD_SIZES:
            p = Process(target=multiprocessing_func, args=(i, bar))
            processes.append(p)
            p.start()

        for process in processes:
            process.join

        end = perf_counter()
        print("That took {0}s".format(round(end-start, 4)))
        orderCSV()

I changed it to this, but I still get e.g.:

|                                        | ▁▃▅ 0/25 [0%] in 0s (0.0/s, eta: -)rm *.csv *.txt *.o *.aux *.log *.out *.synctex.gz PGN/*.pgn chess || true
rm: cannot remove '*.txt': No such file or directory
rm: cannot remove '*.aux': No such file or directory
rm: cannot remove '*.log': No such file or directory
rm: cannot remove '*.out': No such file or directory
rm: cannot remove '*.synctex.gz': No such file or directory
g++ -Wall -Wextra -pedantic -O0 -g -std=c++2a   -c -o chess2022.o chess2022.cc
|                                        | ▇▇▅ 0/25 [0%] in 1s (0.0/s, eta: -)g++ chess2022.o -o chess
on 0: That took 0.0106s
|⚠︎                                       | (!) 0/25 [0%] in 0.8s (0.00/s)
on 0: Done simulating 100 games for each player on a 4x4 board in 0.2886s
on 0: Done simulating 100 games for each player on a 5x5 board in 0.334s
on 0: Done simulating 100 games for each player on a 6x6 board in 0.4427s
on 0: Done simulating 100 games for each player on a 7x7 board in 0.693s
on 0: Done simulating 100 games for each player on a 8x8 board in 1.0546s
on 0: Done simulating 100 games for each player on a 9x9 board in 1.231s
on 0: Done simulating 100 games for each player on a 10x10 board in 1.4186s
on 0: Done simulating 100 games for each player on a 11x11 board in 1.8807s
on 0: Done simulating 100 games for each player on a 12x12 board in 2.4778s
on 0: Done simulating 100 games for each player on a 13x13 board in 2.8718s
on 0: Done simulating 100 games for each player on a 14x14 board in 3.7818s
on 0: Done simulating 100 games for each player on a 15x15 board in 4.7353s
on 0: Done simulating 100 games for each player on a 16x16 board in 5.6172s
on 0: Done simulating 100 games for each player on a 17x17 board in 6.704s
on 0: Done simulating 100 games for each player on a 18x18 board in 8.2397s
on 0: Done simulating 100 games for each player on a 19x19 board in 8.9021s
on 0: Done simulating 100 games for each player on a 20x20 board in 9.9686s
on 0: Done simulating 100 games for each player on a 21x21 board in 12.3204s
on 0: Done simulating 100 games for each player on a 22x22 board in 14.6251s
on 0: Done simulating 100 games for each player on a 23x23 board in 16.488s
on 0: Done simulating 100 games for each player on a 24x24 board in 18.7866s
on 0: Done simulating 100 games for each player on a 25x25 board in 20.1712s
on 0: Done simulating 100 games for each player on a 26x26 board in 22.8905s
on 0: Done simulating 100 games for each player on a 27x27 board in 27.2148s
on 0: Done simulating 100 games for each player on a 28x28 board in 29.5006s
rsalmei commented 2 years ago

Yes @Tim1808, as @TheTechRobo has said, you are calling multiprocessing_func several times in different processes, and that creates a new alive_bar inside each process. Another error: in the same func you call simulateCombinations, which by itself runs all the combinations, so when it returns, the bar is incremented only once...

To make it work, you should delete multiprocessing_func, move alive_bar to __main__, and use a Pool of processes: https://docs.python.org/3/library/multiprocessing.html#module-multiprocessing.pool Whenever the pool returns a result, you increment the bar. There's an example in the Python docs there. I think I have some examples here in the issues too, but I can't look for them now. Please search the closed issues about this.
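
A minimal sketch along those lines, assuming the simulateCombinations and BOARD_SIZES definitions from the snippets above (Pool's default worker count is just a convenient choice, not anything specific to alive-progress):

from multiprocessing import Pool
from alive_progress import alive_bar

# simulateCombinations(size) and BOARD_SIZES as defined in the snippets above

if __name__ == '__main__':
    # the bar lives only in the main process; the workers just run the simulations
    with Pool() as pool, alive_bar(len(BOARD_SIZES)) as bar:
        # imap_unordered yields as soon as any worker finishes,
        # so the bar advances once per completed board size
        for _ in pool.imap_unordered(simulateCombinations, BOARD_SIZES):
            bar()

Since the workers never touch the bar, there is nothing to share across processes: the main loop simply ticks the bar as each result comes back.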

rsalmei commented 2 years ago

Not even an answer? You just closed it, after my thorough response, in which I invested time and even searched the Python docs for you?

Tim1808 commented 2 years ago

> Not even an answer? You just closed it, after my thorough response, in which I invested time and even searched the Python docs for you?

Sorry, I didn't mean to be rude. I closed it because your solution fixed it for me. Thank you very much for your efforts.

rsalmei commented 2 years ago

Ah, that's great! No harm done. 👍