lgarithm / crystalnet -- a mini core AI library (being refactored, see https://github.com/lgarithm/stdnn-ops)

Profile numpy memory usage #89

lgarithm opened this issue 6 years ago

lgarithm commented 6 years ago

run-with-vm-limit.sh

#!/bin/sh

set -e

do_with_vmem_limit() {
    local VMEM_LIMIT_MB=$1
    shift
    local COMMAND=$1
    shift

    VMEM_LIMIT=$((VMEM_LIMIT_MB * 1000)) # ulimit -v takes kbytes

    echo "will run ${COMMAND} with virtual memory limit ${VMEM_LIMIT_MB} MB"
    ulimit -v ${VMEM_LIMIT} # doesn't work on OSX
    set +e
    ${COMMAND} "$@"
    local CODE=$?
    echo "${COMMAND} finished with ${CODE}"
    if [ ${CODE} -eq 0 ]; then
        local STATUS="OK"
    else
        local STATUS="FAIL"
    fi
    echo "${VMEM_LIMIT_MB} $* ${STATUS}" | tee -a result.log
}

# run the job in a background subshell so the lowered ulimit
# does not leak into the parent shell (it cannot be raised back)
with_vmem_limit() {
    do_with_vmem_limit "$@" &
    wait
}

echo -n >result.log # truncate the log
# for each array size n (in MiB), scan virtual memory limits from 163 MB to 180 MB
for n in $(seq 10); do
    for l in $(seq 163 180); do
        with_vmem_limit ${l} ./np-example.py ${n}
    done
done
np-example.py

#!/usr/bin/env python3

import numpy as np
import sys

def main(args):
    Ki = 1 << 10
    Mi = 1 << 20
    Gi = 1 << 30

    n = int(args[0])
    print('will consume %d MiB' % n)
    # x = np.array(range(n*Mi), np.uint8)
    x = np.zeros([n * Mi], np.uint8)
    # touch every byte so the pages are actually committed
    for i in range(n * Mi):
        x[i] = 1
    y = x.sum()
    print('done, y=%d' % y)

main(sys.argv[1:])
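
The portability caveat on ulimit -v can also be sidestepped by setting the limit from inside the process with Python's resource module (Unix-only). Below is a minimal sketch along the same lines as np-example.py, not part of the original experiment: RLIMIT_AS caps the virtual address space in bytes, roughly matching the kbyte-based ulimit -v, and an import under a tight limit may fail with ImportError rather than MemoryError.

#!/usr/bin/env python3
# sketch: impose the virtual memory limit from inside the process
# via resource.setrlimit, instead of `ulimit -v` in a wrapper script
import resource
import sys

LIMIT_MB = int(sys.argv[1])  # e.g. 180, analogous to the shell script's l
limit_bytes = LIMIT_MB * 1000 * 1000
# RLIMIT_AS caps the total virtual address space, in bytes
# (ulimit -v sets the same limit, but in kbytes)
resource.setrlimit(resource.RLIMIT_AS, (limit_bytes, limit_bytes))

try:
    # the import itself may fail under a tight limit,
    # surfacing as ImportError instead of MemoryError
    import numpy as np
    x = np.zeros([10 << 20], np.uint8)
    x[:] = 1
    print('OK, sum=%d' % x.sum())
except (MemoryError, ImportError):
    print('FAIL')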

The result shows that import numpy consumes 163 MB of virtual memory, including the Python prelude.

The Python prelude alone consumes 29 MB.
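
On Linux these figures can be cross-checked without the ulimit scan by reading VmSize from /proc/self/status before and after the import. A minimal sketch (Linux only; the numbers will differ a little from the limit-based measurement):

#!/usr/bin/env python3
# sketch: measure the virtual memory cost of `import numpy` directly,
# by sampling VmSize from /proc/self/status around the import

def vm_size_kb():
    # VmSize is the current virtual memory size, reported in kB
    with open('/proc/self/status') as f:
        for line in f:
            if line.startswith('VmSize:'):
                return int(line.split()[1])
    raise RuntimeError('VmSize not found')

before = vm_size_kb()  # roughly the Python prelude
import numpy as np     # imported here on purpose, after the first sample
after = vm_size_kb()

print('prelude: ~%d MB' % (before // 1000))
print('import numpy: +%d MB' % ((after - before) // 1000))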

lgarithm commented 6 years ago

A further investigation shows:

module        memory (MB)
numpy         162
tensorflow    457
tensorlayer   570

(The tensorflow result is interesting: the OK/FAIL boundary in the log below is not monotonic. Columns are VMEM_LIMIT_MB, script argument, status.)

441 1 FAIL
442 1 OK
443 1 OK
444 1 OK
445 1 FAIL
446 1 FAIL
447 1 FAIL
448 1 FAIL
449 1 FAIL
450 1 FAIL
451 1 FAIL
452 1 FAIL
453 1 FAIL
454 1 FAIL
455 1 FAIL
456 1 FAIL
457 1 OK
458 1 OK
459 1 OK
460 1 OK
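
Because the boundary is not monotonic, the interesting number is arguably the smallest limit above which every run succeeds (457 here, matching the table). A small sketch, assuming the three-column result.log format written by run-with-vm-limit.sh, that extracts that threshold:

#!/usr/bin/env python3
# sketch: summarize result.log, reporting per argument the smallest
# limit from which every larger limit also succeeded
from collections import defaultdict

runs = defaultdict(list)  # arg -> list of (limit_mb, ok)
with open('result.log') as f:
    for line in f:
        parts = line.split()
        if len(parts) < 3:
            continue
        limit_mb, status = int(parts[0]), parts[-1]
        arg = ' '.join(parts[1:-1])
        runs[arg].append((limit_mb, status == 'OK'))

for arg, rs in sorted(runs.items()):
    threshold = None
    # walk from the largest limit down while runs keep succeeding
    for limit_mb, ok in sorted(rs, reverse=True):
        if not ok:
            break
        threshold = limit_mb
    print('arg %s: stable from %s MB' % (arg, threshold))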