... and combines rather well with that other game-changing library I like, Numba.
To illustrate this, let's look at image erosion, which is the replacement of each pixel in an image by the minimum of its neighbourhood.
ndimage has a fast C implementation of erosion, which gives us the benchmark to beat for the generic version: a generic_filter call with min as the operator. Let's start with a 2048 x 2048 random image:
>>> %timeit ndi.generic_filter(image, LowLevelCallable(nbmin.ctypes), footprint=footprint)
10 loops, best of 3: 113 ms per loop
That's right: it's even marginally faster than the pure C version! I almost cried when I ran that.
Higher-order functions, i.e. functions that take other functions as input, enable powerful, concise, elegant expressions of many algorithms. Unfortunately, Python's function call overhead has long made them impractical for large-scale data processing. SciPy's latest update goes a long way towards redressing this.