I've been quite interested in image processing lately, and today at work I looked into how to get pixel values on an Apple device. To my surprise, it turns out to be quite easy: with just a few CGImage-related functions you can read pixel values, whether in an iPhone app or an OS X app. This makes the kinds of image processing you can do much broader and more flexible!
To get the pixel values of a CGImage, the steps are:
- Use malloc to allocate an array for the pixel values
- Allocate a color space
- Use CGBitmapContextCreate to create a CGContextRef (this is where you hand the array from step 1 to the context)
- Draw the CGImage into the context created in the previous step (this writes the image into the array you passed in)
Below is a template that converts RGB to gray level:
#import <Cocoa/Cocoa.h>

// Channel indices for a buffer created with kCGImageAlphaPremultipliedLast
// (no byte-order flag), i.e. R,G,B,A byte order in memory
#define RED   0
#define GREEN 1
#define BLUE  2

+ (NSImage *)grayScaleImage:(CGImageRef)cgimage{
    // Get cgimage width and height
    CGSize size;
    size.width = CGImageGetWidth(cgimage);
    size.height = CGImageGetHeight(cgimage);
    int width = (int)size.width;
    int height = (int)size.height;
    // Alloc the pixel array that we will do the image processing on
    uint32_t *pixels = (uint32_t *)malloc(width * height * sizeof(uint32_t));
    // Create color space
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    // Create bitmap context
    // here we get access to the pixel values: notice the "pixels" parameter
    CGContextRef context = CGBitmapContextCreate(pixels,
                                                 width,
                                                 height,
                                                 8,
                                                 width * sizeof(uint32_t),
                                                 colorSpace,
                                                 kCGImageAlphaPremultipliedLast);
    // Draw image into the context (this fills the "pixels" array)
    CGContextDrawImage(context, CGRectMake(0, 0, width, height), cgimage);
    // RGB to GRAY per pixel
    for (int y = 0; y < height; y++) {
        for (int x = 0; x < width; x++) {
            uint8_t *rgbaPixel = (uint8_t *)&pixels[y * width + x];
            uint32_t gray = 0.3 * rgbaPixel[RED] + 0.59 * rgbaPixel[GREEN] + 0.11 * rgbaPixel[BLUE];
            rgbaPixel[RED] = gray;
            rgbaPixel[GREEN] = gray;
            rgbaPixel[BLUE] = gray;
        }
    }
    // Create CGImage from the context
    CGImageRef image = CGBitmapContextCreateImage(context);
    // Release memory
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);
    free(pixels);
    // Create NSImage, then release the intermediate CGImage
    NSImage *newImage = [[NSImage alloc] initWithCGImage:image size:size];
    CGImageRelease(image);
    return newImage;
}
The above is based on an example I found on Stack Overflow. You can see the four key functions:
- malloc()
- CGColorSpaceCreateDeviceRGB()
- CGBitmapContextCreate()
- CGContextDrawImage()
The first allocates the array that holds the pixel values, the second creates the color space, the third creates the context, and the last draws the CGImage you want to process into the context. The whole idea is built from these four functions. The template above is essentially the first step of almost any image processing; everything I learned in this semester's image processing course starts by converting the image to gray level before doing anything else.
Below is a pencil-sketch example based on the paper AUTOMATIC GENERATION OF PENCIL-SKETCH LIKE DRAWINGS FROM PERSONAL PHOTOS. The idea is simple; the paper describes four steps:
- smoothing(Using Gaussian low pass filter)
- Laplacian operation
- transfer function
- smoothing again
The code is as follows:
+ (NSImage *)sketchPencil:(CGImageRef)cgimage{
CGSize size;
size.width = CGImageGetWidth(cgimage);
size.height = CGImageGetHeight(cgimage);
uint32_t *pixels = (uint32_t *)malloc(size.width*size.height*sizeof(uint32_t));
uint8_t *grayPixels = (uint8_t *)malloc(size.width*size.height*sizeof(uint8_t));
uint8_t *tmpPixels = (uint8_t *)malloc(size.width*size.height*sizeof(uint8_t));
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef context = CGBitmapContextCreate(pixels,
size.width,
size.height,
8,
size.width*sizeof(uint32_t),
colorSpace, kCGImageAlphaPremultipliedLast);
CGContextDrawImage(context, CGRectMake(0, 0, size.width, size.height), cgimage);
int y = 0;
int x = 0;
// RGB to gray level
for(y=0;y<size.height;y++){
for(x=0;x<size.width;x++){
uint8_t *rgbaPixel = (uint8_t *)&pixels[y * (int)size.width + x];
uint32_t gray = 0.3 * rgbaPixel[RED] + 0.59 * rgbaPixel[GREEN] + 0.11 * rgbaPixel[BLUE];
grayPixels[y*(int)size.width + x] = gray;
}
}
// gaussian smoothing
int gaussianMask[9] = {1,2,1,2,4,2,1,2,1};
int8_t laplacianMask[9] = {0,-1,0,-1,4,-1,0,-1,0};
for(y=1;y<(size.height-1);y++){
for(x=1;x<(size.width-1);x++){
int index = 0;
unsigned int sum = 0;
for(int j=y-1;j<y+2;j++){
for(int k=x-1;k<x+2;k++){
uint8_t Pixel = grayPixels[j * (int)size.width + k];
sum += Pixel*gaussianMask[index++];
}
}
sum /=16;
if(sum > 255){
sum = 255;
}
tmpPixels[y * (int)size.width + x] = sum;
}
}
memcpy(grayPixels, tmpPixels, size.width*size.height*sizeof(uint8_t));
// Laplacian operation
for(y=1;y<(size.height-1);y++){
for(x=1;x<(size.width-1);x++){
int g = 0;
int index = 0;
for(int j=y-1;j<y+2;j++){
for(int k=x-1;k<x+2;k++){
uint8_t Pixel = grayPixels[j * (int)size.width + k];
g += Pixel*laplacianMask[index++];
}
}
// transfer function
if(g<0 && (120-abs(g))> 4){
g = 120-abs(g);
}
else{
g = 255;
}
tmpPixels[y * (int)size.width + x] = g;
}
}
memcpy(grayPixels, tmpPixels, size.width*size.height*sizeof(uint8_t));
// smoothing (Gaussian filter)
for(y=1;y<(size.height-1);y++){
for(x=1;x<(size.width-1);x++){
int index = 0;
unsigned int sum = 0;
for(int j=y-1;j<y+2;j++){
for(int k=x-1;k<x+2;k++){
uint8_t Pixel = grayPixels[j * (int)size.width + k];
sum += Pixel*gaussianMask[index++];
}
}
sum /=16;
if(sum > 255){
sum = 255;
}
tmpPixels[y * (int)size.width + x] = sum;
}
}
// Gray level to RGB
for(y=1;y<(size.height-1);y++){
for(x=1;x<(size.width-1);x++){
uint8_t *rgbaPixel = (uint8_t *)&pixels[y * (int)size.width + x];
rgbaPixel[RED] = tmpPixels[y * (int)size.width + x];
rgbaPixel[BLUE] = tmpPixels[y * (int)size.width + x];
rgbaPixel[GREEN] = tmpPixels[y * (int)size.width + x];
}
}
// create CGImage
CGImageRef image = CGBitmapContextCreateImage(context);
// Release
CGContextRelease(context);
CGColorSpaceRelease(colorSpace);
free(pixels);
free(grayPixels);
free(tmpPixels);
NSImage *newImage = [[NSImage alloc] initWithCGImage:image size:size];
CGImageRelease(image);
return newImage;
}