Python Memory Footprint

This is a cross-posted entry: I originally published it on my company’s blog and am reposting it here because it covers a topic very few python developers understand well. Python has a high memory footprint, and understanding it is the key to writing space-efficient python programs.

Note: a follow-up post will discuss the memory footprint of the data structures used by numpy. Numpy provides its own data structures that can be more efficient depending on the use case, particularly for scientific computation (matrices, numerical methods, …) and hence for Machine Learning (NLP, AI, …).

This article applies to python 2.7 64-bit (32-bit and py3k may differ).

Edit: I added simpler estimate formulas along side the actual formulas, so memory footprint can be quickly visualized.

Edit2: 32-bit python seems to use around half the memory, apparently because it uses 32-bit pointers instead of 64-bit ones. That said, a 32-bit process is limited to 2GB of addressable memory.

Some developers are unaware of the memory footprint python has and tend to hit walls, especially when they try to load big data into memory instead of using efficient cache-oblivious algorithms and data structures.

This post demonstrates the memory footprint of basic python objects and data structures. You can use this data to estimate how much memory your program will need, or to lay it out better when memory starts to run out. The numbers were collected using the python profiler Guppy-PE.

  1. Boolean and Numerical Types
  2. Strings
  3. DataStructures (Lists, Tuples, Dict)

Boolean and Numerical Types

  • Boolean (bool): 24 bytes
  • Integers (int): 24 bytes
  • Long Integers (long): 32 + int(math.log(NUMBER, 2) / 60) * 8 bytes
  • Float: 24 bytes
  • Decimal (decimal.Decimal): 40 bytes
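You can check these numbers on your own interpreter with `sys.getsizeof`. The sketch below also wraps the long-integer estimate from the list above in a helper; `long_size` is an illustrative name, not a standard function, and the printed sizes match the table on CPython 2.7 64-bit (python 3 reports somewhat different values, e.g. 28 bytes for a small int).

```python
import sys
import math
import decimal

# Per-object sizes on the running interpreter; on CPython 2.7 64-bit
# these match the table above.
for obj in (True, 1, 1.0, decimal.Decimal("1.5")):
    print("%-8s %d bytes" % (type(obj).__name__, sys.getsizeof(obj)))

def long_size(n):
    # Estimate from the list above for a python 2 long on 64-bit builds:
    # a 32-byte header plus 8 bytes per extra 60 bits of magnitude.
    return 32 + int(math.log(n, 2) / 60) * 8

print(long_size(10))       # fits in the base 32 bytes
print(long_size(2 ** 70))  # one extra 8-byte chunk
```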

Strings

Note: in python 2, string concatenation via __add__ (the ‘+’ operator) creates intermediate strings, which can grab much more memory than you actually need. The efficient way to build a string is the string join method or ‘%s’ string formatting (for example). Just avoid use of ‘+’ with strings until you move to python 3.
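The difference can be sketched as follows (the variable names are just illustrative):

```python
# Building a large string from many pieces.
parts = ["chunk%d" % i for i in range(1000)]

# join collects all the pieces and allocates the result once.
joined = "".join(parts)

# String formatting likewise produces the result in one allocation.
label = "%s-%s" % ("a", "b")

# Repeated '+' (or '+=') may create a new intermediate string on
# every iteration, copying everything built so far each time.
concatenated = ""
for p in parts:
    concatenated += p

assert joined == concatenated
```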

Each additional 8 chars take 8 bytes, on top of an initial 40 bytes (which already covers up to 3 chars):

  • String: 40 + ((len(s) - 4) / 8 + 1) * 8 bytes ~= 40 + len(s)
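The formula above can be written as a small helper, using floor division (`//`) to mirror python 2’s integer `/`; `str_size` is a hypothetical name, and the values it returns are the python 2.7 64-bit estimates, not what python 3’s `sys.getsizeof` reports:

```python
def str_size(length):
    # Estimated bytes for a python 2.7 64-bit str of the given length:
    # a 40-byte header (which already covers the first few chars), then
    # the character buffer rounded up in 8-byte steps.
    return 40 + ((length - 4) // 8 + 1) * 8

for n in (0, 3, 4, 11, 12, 100):
    print("len %3d -> %d bytes" % (n, str_size(n)))
```

Note how the size only grows every 8 characters: lengths 4 through 11 all cost 48 bytes.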

DataStructures (Lists, Tuples, Dict)

The following is just the structure’s own memory usage, not that of the objects inside it:

  • Tuple: 56 + 8 * len(t) bytes
  • List: 72 + 64 * int(1 + (len(l) + 1) / 8) bytes ~= 72 + len(l) * 8
  • Dictionary (dict): memory depends on the number of buckets. Below is the pattern the measurements seem to exhibit; I have not tried to come up with a formula for this one, so feel free to solve it in the comments.
      • The first 5 elements are included in the initial 280 bytes.
      • The next 2**4 = 16 elements cost about 52.5 bytes per element.
      • The next 2**6 = 64 elements cost 36 bytes per element.
      • The next 2**8 = 256 elements cost 36 bytes per element.
      • The next 2**10 = 1024 elements cost 36 bytes per element, and so on.
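The tuple and list formulas above can be wrapped in helpers (`tuple_size` and `list_size` are illustrative names; `//` mirrors python 2’s integer `/`), and the dict’s bucket-based growth can be observed directly with `sys.getsizeof`: its size jumps when the bucket table resizes rather than growing per element. The exact sizes printed depend on the interpreter (python 3 reports different numbers than the 2.7 figures above), but the stepwise pattern is visible either way.

```python
import sys

def tuple_size(length):
    # python 2.7 64-bit estimate: 56-byte header plus one
    # 8-byte pointer per slot.
    return 56 + 8 * length

def list_size(length):
    # python 2.7 64-bit estimate: header plus pointer slots,
    # over-allocated in blocks of 8 (so roughly 72 + 8 per element).
    return 72 + 64 * (1 + (length + 1) // 8)

print(tuple_size(10))
print(list_size(10))

# Watch the dict's shallow size jump only at resize points.
d = {}
last = sys.getsizeof(d)
for i in range(2000):
    d[i] = None
    size = sys.getsizeof(d)
    if size != last:
        print("resized at %4d items: %d bytes" % (i + 1, size))
        last = size
```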