Numpy or Scipy Variance Convolution

I was wondering what the most efficient / fastest way to compute a variance convolution in Python would be.

I currently have the following code, which takes a 3D NumPy array, creates a pixelwise maximum-intensity layer, and computes the windowed (convolved) variance of that image using OpenCV. This seems to be far too slow, and from what I can see, GPU support for OpenCV-Python isn't in place yet.

Any suggestions for a faster approach would be much appreciated.

import os
import sys
import numpy
import cv2
from PIL import Image

# imgArray (a 3D NumPy array), MaxFromMulti (the output directory) and
# filename are defined earlier in the script.
MaxFrom3DArray = numpy.amax(imgArray, axis=0)  # pixelwise maximum across frames
Back2ImMax = Image.fromarray(MaxFrom3DArray, 'P')
Back2ImMax.save(os.path.join(MaxFromMulti, filename), "TIFF")

ForVariance = cv2.imread(os.path.join(MaxFromMulti, filename), cv2.IMREAD_UNCHANGED)
wlen = 40

def winVar(img, wlen):
    # Windowed variance: E[x**2] - E[x]**2 over a wlen x wlen box.
    wmean, wsqrmean = (cv2.boxFilter(x, -1, (wlen, wlen),
                                     borderType=cv2.BORDER_REFLECT)
                       for x in (img, img * img))
    return wsqrmean - wmean * wmean

windowVar = winVar(ForVariance, wlen)
numpy.set_printoptions(threshold=sys.maxsize)  # print the full array
print(windowVar)
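
To make clear what I mean by a variance convolution, the same windowed variance can also be written with SciPy's ndimage.uniform_filter in place of cv2.boxFilter. This is just a sketch, assuming a 2D image array (the function name win_var_scipy and the float cast are mine), and I haven't benchmarked it against the OpenCV version:

import numpy
from scipy import ndimage

def win_var_scipy(img, wlen):
    # Windowed variance: E[x**2] - E[x]**2 over a wlen x wlen box.
    img = img.astype(numpy.float64)  # cast so img * img cannot overflow
    wmean = ndimage.uniform_filter(img, wlen, mode='reflect')
    wsqrmean = ndimage.uniform_filter(img * img, wlen, mode='reflect')
    return wsqrmean - wmean * wmean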

Best wishes

TWP