I'm trying to use matplotlib to read in an RGB image and convert it to grayscale. In matlab I use this: `img = rgb2gray(imread('image.png'))`. In the matplotlib tutorial they don't cover it. They just read in the image with `import matplotlib.image as mpimg` and then they slice the array, but that's not the same thing as converting RGB to grayscale, from what I understand.

I find it hard to believe that numpy or matplotlib doesn't have a built-in function to convert from RGB to gray. Isn't this a common operation in image processing? I wrote a very simple function that works with the image imported using imread in 5 minutes, but it's horribly inefficient; that's why I was hoping for a professional implementation built in. Sebastian has improved my function, but I'm still hoping to find the built-in one. One suggested answer reimplements Matlab's (NTSC/PAL) conversion with numpy (`import numpy as np`).

Three of the suggested methods were tested for speed with 1000 RGBA PNG images (224 x 256 pixels), running with Python 3.5 on Ubuntu 16.04 LTS (Xeon E5-2670 with SSD):

```python
run_times = dict(sk=list(), pil=list(), scipy=list())

for z in images:
    start_time = time.time()
    img = np.array(Image.open(z).convert('L'))
    run_times['pil'].append(time.time() - start_time)
    # (the 'sk' and 'scipy' methods were timed the same way)

for k, v in run_times.items():
    print('{}: {} seconds'.format(k, sum(v) / len(v)))
```

PIL and SciPy gave identical numpy arrays (ranging from 0 to 255). To compare the outputs pixel by pixel:

```python
img2 = np.array(Image.open(z).convert('L'))
img_diff = np.ndarray(shape=img1.shape, dtype='float32')
```

In addition, the colors are converted slightly differently; see the example from the CUB-200 dataset.
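The NTSC/PAL luma conversion mentioned above can be sketched in a few lines of numpy. This is a minimal sketch, not the answerer's exact code: the coefficients are the standard ITU-R BT.601 luma weights (the same ones Matlab's `rgb2gray` documents), and the function name `rgb2gray` is chosen here to mirror Matlab's.

```python
import numpy as np

def rgb2gray(rgb):
    # Weighted sum over the channel axis with the ITU-R BT.601 luma
    # coefficients; the alpha channel of an RGBA image is ignored.
    return np.dot(rgb[..., :3], [0.2989, 0.5870, 0.1140])

# Usage with matplotlib (mpimg.imread returns floats in [0, 1] for PNGs):
# import matplotlib.image as mpimg
# gray = rgb2gray(mpimg.imread('image.png'))
```

Because the weighted sum is a single vectorized `np.dot`, this avoids the per-pixel Python loop that makes a naive implementation slow.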