
Python Line-by-line Memory Profiler?

I'm looking to generate, from a large Python codebase, a summary of heap usage or memory allocations over the course of a function's run. I'm familiar with heapy, and it's served me well for snapshot-style inspection of the heap, but I haven't found a good way to track memory line by line.
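For reference, snapshot-style heapy usage looks roughly like this (a minimal sketch, assuming guppy is installed; the setrelheap call is just one common pattern for measuring relative to a starting point):

from guppy import hpy

h = hpy()
h.setrelheap()        # measure allocations relative to this point
# ... run the code of interest ...
print(h.heap())       # table of live objects grouped by type, with sizes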

Solution 1:

I would use sys.settrace at program startup to register a custom trace function. The trace function will then be called for each executed line of code, and inside it you can store information gathered by heapy or meliae in a file for later processing.
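As a quick refresher on the sys.settrace protocol (a minimal sketch, independent of heapy): the tracer receives (frame, event, arg) and must return a trace function to keep receiving 'line' events for that frame.

import sys

def tracer(frame, event, arg):
    # 'call' fires when a frame is entered; 'line' fires before each line executes
    if event == 'line':
        print('%s:%d' % (frame.f_code.co_filename, frame.f_lineno))
    return tracer     # returning the tracer keeps per-line events coming

sys.settrace(tracer)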

Here is a very simple example which logs the output of hpy().heap() at most once per second to a plain text file:

import sys
import time
import atexit
from guppy import hpy

_last_log_time = time.time()
_logfile = open('logfile.txt', 'w')

def heapy_profile(frame, event, arg):
    global _last_log_time
    currtime = time.time()
    if currtime - _last_log_time >= 1:
        _last_log_time = currtime
        filename = frame.f_code.co_filename
        lineno = frame.f_lineno                 # current line being executed
        idset = hpy().heap()                    # snapshot of all live objects
        _logfile.write('%s %s:%s\n%s\n\n' % (currtime, filename, lineno, idset))
        _logfile.flush()
    # returning the tracer keeps 'line' events coming for this frame
    return heapy_profile

atexit.register(_logfile.close)
sys.settrace(heapy_profile)
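To keep the tracing overhead confined to the code you care about, you could install and remove the tracer around a single call (my_function is a hypothetical stand-in for the function being profiled):

sys.settrace(heapy_profile)   # start sampling
try:
    my_function()             # hypothetical function under investigation
finally:
    sys.settrace(None)        # stop tracing so the rest of the program runs at full speed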

