# Cython code for adaptive binning

Tags: python, performance, signal-processing, cython, scipy

## Question


Here is my Cython code used for adaptive binning. The `calcAdaptiveBinnedRateMap` function is called from another Python script. The script is compiled with Cython, but the execution speed is still not what I was hoping for. How can I improve it?

```python
import numpy as np
import scipy.ndimage.morphology as ndimmor

# define sampling rate and alpha
cdef float samplingRate = 30.0
cdef double alpha = 0.0001  # Skaggs and Sachin use this value while Jim uses 0.001


def mexAdaptwhile(spikeMap, occMap, alpha, Nocc, dists):
    """Run the adaptive-binning iteration until the criterion from
    Skaggs et al. 1996 is met.
    INPUT: spike map, occupancy map, alpha, occupancy count (Nocc),
           Euclidean distance transform (dists)
    OUTPUT: Nspikes (number of spikes), Noccout (occupancy count)
    """
    # initialize the variables
    cdef int Nspikes2 = 1
    cdef double rsq = 0
    cdef int EnoughPoints = 0
    cdef int Nspikes = 0

    # find the row and column count of the occupancy map
    cdef int rowLen = int(np.shape(occMap)[0])
    cdef int colLen = int(np.shape(occMap)[1])

    # while the squared radius is less than 200 and not enough points are covered
    while rsq < 200.00 and EnoughPoints == 0:
        r = np.sqrt(rsq)
        # reset Nspikes and Nocc, otherwise the for loops below keep
        # accumulating the same spikes across iterations
        Nspikes = 0
        Nocc = 0
        for i in range(rowLen):
            for j in range(colLen):
                # if the distance is within the radius
                if dists[i, j] <= r:
                    # add spikes and occupancy for this bin
                    Nspikes = Nspikes + spikeMap[i, j]
                    Nocc = Nocc + occMap[i, j]

        # if the number of spikes is greater than 0, use it as Nspikes2
        if Nspikes > 0:
            Nspikes2 = Nspikes
        # check the condition from Skaggs et al. 1996
        if alpha * alpha * Nocc * Nocc * rsq * Nspikes2 > 1:
            EnoughPoints = 1  # set the flag to end the loop
        # keep increasing the radius
        rsq = rsq + 1
        # output occupancy
        Noccout = Nocc
    return Nspikes, Noccout


def calcAdaptiveBinnedRateMap(spikeMap, occMap):
    """Calculate the adaptive binned rate map.
    INPUT: spike map, occupancy map (both just binned at 2cm/4cm, no
           other operation applied). NOTE: occupancy is still in number
           of frames.
    OUTPUT: adaptive binned rate map, colour adjusted
    """
    # set unoccupied occupancy and its corresponding spike count to 0
    spikeMap[0, 0] = 0
    occMap[0, 0] = 0

    # find the row/col indices of the minimum and maximum occupied pixels
    row, col = np.where(occMap)
    minrow = np.min(row)
    maxrow = np.max(row)
    mincol = np.min(col)
    maxcol = np.max(col)

    # select the best-fitting rectangle around the occupied area
    occMap = occMap[minrow:maxrow + 1, mincol:maxcol + 1]
    spikeMap = spikeMap[minrow:maxrow + 1, mincol:maxcol + 1]

    # matrix of zeros the same size as the occupancy map
    z = np.zeros(np.shape(occMap))
    # adaptive binned rate map
    abrMap = np.copy(z)
    # adaptive binned occupancy map
    abrOcc = np.copy(z)

    # check that the number of spikes is greater than 0
    if np.max(np.max(spikeMap)) > 0:
        # iterate over the bins
        for x in range(int(np.shape(occMap)[1])):
            for y in range(int(np.shape(occMap)[0])):
                if occMap[y, x] > 0:
                    # pretend there is at least 1 spike and 1 occupancy,
                    # needed to avoid a zero threshold
                    Nspikes2 = 1
                    Nocc = occMap[y, x]
                    d = np.copy(z)
                    d[y, x] = 1
                    # Euclidean distance transform: for each pixel, the
                    # distance to the nearest nonzero pixel of d
                    dists = ndimmor.distance_transform_edt(d == 0)
                    # iterate until the Skaggs et al. 1996 condition is met
                    Nspikes, Nocc = mexAdaptwhile(spikeMap, occMap, alpha, Nocc, dists)
                    if Nocc < 12:  # occupancy cutoff = 0.4 seconds
                        # less than 0.4 seconds of occupancy: set to 0
                        abrMap[y, x] = 0
                        abrOcc[y, x] = 0
                    else:
                        # rate = sampling rate * spikes / occupancy
                        abrMap[y, x] = samplingRate * float(Nspikes) / float(Nocc)
                        # adaptive binned occupancy map = Nocc
                        abrOcc[y, x] = Nocc

    # find the maximum value of the adaptive binned rate map
    cmax = np.max(np.max(abrMap))
    if cmax > 0:
        # minimum = maximum value found above / 60
        cmin = -(cmax / 60.0)
    else:
        cmin = -1
    # set the rate map to cmin wherever the occupancy map is 0
    abrMap[abrOcc == 0] = cmin

    # return the adaptive binned rate map
    return abrMap
```
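For context, `distance_transform_edt` does the geometric work above: for every nonzero pixel of its input it returns the Euclidean distance to the nearest zero pixel. A minimal sketch of the call as used in the code (shown here via the top-level `scipy.ndimage` namespace):

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

# seed a single pixel in a 3x3 grid, as the code does per bin
d = np.zeros((3, 3))
d[1, 1] = 1

# (d == 0) is False only at the seed, so every pixel gets its
# Euclidean distance to the seed at (1, 1)
dists = distance_transform_edt(d == 0)
# dists[1, 1] is 0.0, dists[0, 1] is 1.0, dists[0, 0] is sqrt(2)
```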

## Answers



Make sure to declare all variables with Cython types (otherwise lines using those variables basically run at Python speed). This is especially important for the for-loop variables (i.e. `i`, `j`, `x`, `y`).

You can also use `cython -a mycode.pyx` to create an annotated HTML page that shows which lines run at C speed and which at Python speed.
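Independent of Cython typing, the inner double loop in `mexAdaptwhile` can also be replaced by NumPy boolean masking, which runs in C with no Cython at all. A minimal sketch (the helper name `counts_within_radius` is illustrative, not from the original code):

```python
import numpy as np

def counts_within_radius(spikeMap, occMap, dists, r):
    """Sum spikes and occupancy over all bins within radius r of the seed."""
    mask = dists <= r              # boolean array, True inside the circle
    Nspikes = spikeMap[mask].sum()
    Nocc = occMap[mask].sum()
    return Nspikes, Nocc

# example: 2x2 maps, one bin lies outside radius 1
spikes = np.array([[1.0, 2.0], [3.0, 4.0]])
occ = np.ones((2, 2))
dists = np.array([[0.0, 1.0], [1.0, 1.5]])
ns, no = counts_within_radius(spikes, occ, dists, 1.0)
# three bins fall inside radius 1, so ns is 6.0 and no is 3.0
```

Since the radius only grows across iterations, sorting the flattened `dists` once and taking cumulative sums of `spikeMap` and `occMap` in that order would go further still, turning each radius query into a single index lookup.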
