581 Resizing a UIImage

A nice and simple category to resize a UIImage:
@interface UIImage (Resize)

- (UIImage*)scaleToSize:(CGSize)size;

@end

#import "UIImageResize.h"

@implementation UIImage (Resize)

- (UIImage*)scaleToSize:(CGSize)size {
	// create an offscreen context at the target size
	UIGraphicsBeginImageContext(size);
	[self drawInRect:CGRectMake(0, 0, size.width, size.height)];
	UIImage* scaledImage = UIGraphicsGetImageFromCurrentImageContext();
	UIGraphicsEndImageContext();
	return scaledImage;
}

@end
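A minimal usage sketch of the category (the image name and target size are placeholders):

```objc
#import "UIImageResize.h"

UIImage *original = [UIImage imageNamed:@"photo.png"];
UIImage *thumbnail = [original scaleToSize:CGSizeMake(64.0f, 64.0f)];
```

Note that scaleToSize: ignores the original aspect ratio; the image is stretched to exactly the requested size.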

569 Layering one UIImage on top of another UIImage

Combining two images is especially useful if the overlay image has an alpha value:
//  UIImage+Category.h
//  ImageOverlay
//  Created by Georg Tremmel on 29/04/2010.


@interface UIImage (combine)

- (UIImage*)overlayWith:(UIImage*)overlayImage;

@end

And the implementation file.
//  UIImage+Category.m
//  ImageOverlay
//  Created by Georg Tremmel on 29/04/2010.

#import "UIImage+Category.h"

@implementation UIImage (combine)

- (UIImage*)overlayWith:(UIImage*)overlayImage {
	// size is taken from the background image
	UIGraphicsBeginImageContext(self.size);
	[self drawAtPoint:CGPointZero];
	[overlayImage drawAtPoint:CGPointZero];
	// If image artifacts appear, replace the "overlayImage drawAtPoint:" call
	// above with the following two lines.
	// Yes, it's a workaround; yes, I filed a bug report.
	//CGRect imageRect = CGRectMake(0, 0, self.size.width, self.size.height);
	//[overlayImage drawInRect:imageRect blendMode:kCGBlendModeOverlay alpha:0.999999999];
	UIImage *combinedImage = UIGraphicsGetImageFromCurrentImageContext();
	UIGraphicsEndImageContext();
	return combinedImage;
}

@end

An update to 334 Combining Images with UIImage & CGContext – (Offscreen drawing). (Did I say how much I love categories…?)

Update: I came across some strange behaviour when layering a PNG image with transparency over another image. It did not show up in the Simulator, only on an iPhone 3GS (and probably also on other devices). The base image draws fine, but the overlay image appears to be truncated and the last pixels shifted, producing some bright green artifacts. Changing

[overlayImage drawAtPoint:CGPointZero];

to

CGRect imageRect = CGRectMake(0, 0, self.size.width, self.size.height);
[overlayImage drawInRect:imageRect blendMode:kCGBlendModeOverlay alpha:1.0];

did not really help; the green artifacts remained. Strangely, though, they did not appear with the other blend modes. Using

CGContextDrawImage(c, imageRect, [overlayImage CGImage]);

would also work, but then the images turn up upside down, which is not what I really needed. (Yes, I know there might not be a hard fix for that, but really – it shouldn't be that complicated.) After playing a bit more with the values, I found that setting alpha lower than 1.0 gets rid of the display artifact:

[overlayImage drawInRect:imageRect blendMode:kCGBlendModeOverlay alpha:0.9999999];

Bug filed at Apple's Bug Report, let's see. Or am I missing something here? Anyway, here the files are, zipped and ready for download.

Post Scriptum: Test project showing the visual artifact in action. It only appears on the device, NOT IN THE SIMULATOR.
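A minimal usage sketch of the category (the image names are placeholders):

```objc
#import "UIImage+Category.h"

UIImage *background = [UIImage imageNamed:@"background.png"];
UIImage *overlay = [UIImage imageNamed:@"overlay.png"];

// the combined image takes its size from the background image
UIImage *combined = [background overlayWith:overlay];
```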

347 UIImage → pixelData → UIImage Roundtrip

The question is rather simple: how do you manipulate single pixels of a UIImage? The answer is rather long, but includes a joyful trip into Quartz 2D graphics land…
// load image, convert to CGImageRef
UIImage *c = [UIImage imageNamed:@"c.png"];
CGImageRef cRef = CGImageRetain(c.CGImage);
// raw image pixel data
NSData* pixelData = (NSData*) CGDataProviderCopyData(CGImageGetDataProvider(cRef));
// compressed png data, for comparison
//NSData* pixelDataRep = UIImagePNGRepresentation(c);
//NSLog(@"pixelData %i", [pixelData length]);
//NSLog(@"pixelDataRep %i", [pixelDataRep length]);
//NSLog(@"pixelDataRep equal to pixelData: %@", [pixelData isEqualToData:pixelDataRep] ? @"YES" : @"NO");
// imageWithData: expects encoded image data (e.g. PNG), not raw pixels, so this does not work:
//UIImage* newImage = [UIImage imageWithData:pixelData];
//[newImage drawInRect:CGRectMake(10, 340, 65, 65)];

//NSLog(@"pixelData %@", pixelData);

// get a pointer to the raw bytes
unsigned char* pixelBytes = (unsigned char *)[pixelData bytes];
// step through the pixel data, 4 bytes per pixel (RGBA)
for(int i = 0; i < [pixelData length]; i += 4) {
	// change the channel values accordingly; here only alpha is forced to opaque
	pixelBytes[i]   = pixelBytes[i];   // R
	pixelBytes[i+1] = pixelBytes[i+1]; // G
	pixelBytes[i+2] = pixelBytes[i+2]; // B
	pixelBytes[i+3] = 255;             // A
}
// 1ms in Simulator, 5ms on iPhone 3GS, 65x65 pixel
// copy the bytes into a new NSData
NSData* newPixelData = [NSData dataWithBytes:pixelBytes length:[pixelData length]];
//NSLog(@"newPixelData %@", newPixelData);
//NSLog(@"newPixelData: %@", newPixelData ? @"ok" : @"nil");
//NSLog(@"newPixelData equal to pixelData: %@", [pixelData isEqualToData:newPixelData] ? @"YES" : @"NO");
// cast NSData to CFDataRef (toll-free bridged); pixelData's buffer was modified in place above
CFDataRef imgData = (CFDataRef)pixelData;
//NSLog(@"CFDataGetLength %i", CFDataGetLength(imgData) );
// make a data provider from the CFData
CGDataProviderRef imgDataProvider = CGDataProviderCreateWithCFData(imgData);
// testing... creating the data provider from a file also works:
//NSString* imageFileName = [[[NSBundle mainBundle] resourcePath] stringByAppendingPathComponent:@"c.png"];
//CGDataProviderRef imgDataProvider = CGDataProviderCreateWithFilename([imageFileName UTF8String]);
// CGImageCreateWithPNGDataProvider does not work like that; the new image
// would need PNG-encoded data, but the provider holds raw pixels:
//CGImageRef throughCGImage = CGImageCreateWithPNGDataProvider(imgDataProvider, NULL, true, kCGRenderingIntentDefault);
// get PNG properties from cRef
size_t width = CGImageGetWidth(cRef);
size_t height = CGImageGetHeight(cRef);
size_t bitsPerComponent = CGImageGetBitsPerComponent(cRef);
size_t bitsPerPixel = CGImageGetBitsPerPixel(cRef);
size_t bytesPerRow = CGImageGetBytesPerRow(cRef);
CGColorSpaceRef colorSpace = CGImageGetColorSpace(cRef);
CGBitmapInfo info = CGImageGetBitmapInfo(cRef);
CGFloat *decode = NULL;
BOOL shouldInterpolate = NO;
CGColorRenderingIntent intent = CGImageGetRenderingIntent(cRef);
// cRef PNG properties + imgDataProvider's data
CGImageRef throughCGImage = CGImageCreate(width, height, bitsPerComponent, bitsPerPixel, bytesPerRow, colorSpace, info, imgDataProvider, decode, shouldInterpolate, intent);
//NSLog(@"c %i, throughCGImage: %i", CGImageGetHeight(cRef), CGImageGetHeight(throughCGImage) );

// make UIImage with CGImage
UIImage* newImage = [UIImage imageWithCGImage:throughCGImage];
//NSLog(@"newImage: %@", newImage);
// draw UIImage
[newImage drawInRect:CGRectMake(10, 340, 65, 65)];
References:
- http://iphoneincubator.com/blog/tag/cgimageref
- http://www.nixwire.com/getting-uiimage-to-work-with-nscoding-encodewithcoder/
- http://developer.apple.com/mac/library/qa/qa2007/qa1509.html
- http://stackoverflow.com/questions/1282830/uiimagepickercontroller-uiimage-memory-and-more
- http://lists.apple.com/archives/Quartz-dev/2008//Aug/msg00073.html

338 Accessing RGBA Pixel Data

// pixelData obtained as in 347:
NSData* pixelData = (NSData*) CGDataProviderCopyData(CGImageGetDataProvider(cRef));
unsigned char* pixelBytes = (unsigned char *)[pixelData bytes];
// 4 bytes per pixel (RGBA)
for(int i = 0; i < [pixelData length]; i += 4) {
	NSLog(@"pixelBytes[%i] R:%i G:%i B:%i A:%i", i, pixelBytes[i], pixelBytes[i+1], pixelBytes[i+2], pixelBytes[i+3]);
	// example manipulation: copy the alpha value into the colour channels
	pixelBytes[i]   = pixelBytes[i+3];
	pixelBytes[i+1] = pixelBytes[i+3];
	pixelBytes[i+2] = pixelBytes[i+3];
	pixelBytes[i+3] = 0;
}
NSData* newPixelData = [NSData dataWithBytes:pixelBytes length:[pixelData length]];
UIImage* newImage = [UIImage imageWithData:newPixelData];

334 Combining Images with UIImage & CGContext – (Offscreen drawing)

In Cocoa, NSImage has a lockFocus method that allows drawing images offscreen and combining them into one.
[img lockFocus];
// ... draw other images into img here ...
[img unlockFocus];
On the iPhone, UIImage lacks the lockFocus method; use the following instead:
// Create new offscreen context with desired size
UIGraphicsBeginImageContext(CGSizeMake(64.0f, 64.0f));

// draw img at 0,0 in the context
[img drawAtPoint:CGPointZero];

// draw another at 0,0 in the context, maybe with an alpha value
[another drawAtPoint:CGPointZero];

// ... and other operations

// assign context to UIImage
UIImage *outputImg = UIGraphicsGetImageFromCurrentImageContext();

// end context
UIGraphicsEndImageContext();

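The pattern above can be wrapped in a small helper function (a sketch; the function name is my own):

```objc
UIImage *CombineImages(UIImage *base, UIImage *top, CGSize size) {
	// offscreen context at the desired output size
	UIGraphicsBeginImageContext(size);
	[base drawAtPoint:CGPointZero];
	[top drawAtPoint:CGPointZero];
	UIImage *result = UIGraphicsGetImageFromCurrentImageContext();
	UIGraphicsEndImageContext();
	return result;
}
```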
16 UIImage & cache

[UIImage imageNamed:@"image.jpg"]
caches the image for the lifetime of the app.
[UIImage imageWithContentsOfFile:[[NSBundle mainBundle] pathForResource:@"image" ofType:@"png"]]
does no caching; the image is always loaded from disk. The trade-off: cached images display faster, but too many cached images can run the app out of memory.
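A rule-of-thumb sketch (the file names are placeholders): use imageNamed: for small assets that are reused often, and imageWithContentsOfFile: for large images shown only once:

```objc
// small UI element, reused often – let UIKit cache it
UIImage *icon = [UIImage imageNamed:@"icon.png"];

// large photo, shown once – avoid filling the cache
NSString *path = [[NSBundle mainBundle] pathForResource:@"photo" ofType:@"jpg"];
UIImage *photo = [UIImage imageWithContentsOfFile:path];
```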