How To: iPhone SDK – Play and Record Audio concurrently

Unfortunately, it’s quite a fiddly process to record audio and play it back at the same time on the iPhone.  By default, the sound output is very quiet from the iPhone’s speaker when you are recording sound.  So how do we fix this?

First, setup your audio session to record audio:

NSURL *url = [NSURL fileURLWithPath:@"/dev/null"];
NSDictionary *settings = [NSDictionary dictionaryWithObjectsAndKeys:
   [NSNumber numberWithFloat: 44100.0],                 AVSampleRateKey,
   [NSNumber numberWithInt: kAudioFormatAppleLossless], AVFormatIDKey,
   [NSNumber numberWithInt: 1],                         AVNumberOfChannelsKey,
   [NSNumber numberWithInt: AVAudioQualityMax],         AVEncoderAudioQualityKey, nil];

NSError *error = nil;
recorder = [[AVAudioRecorder alloc] initWithURL:url settings:settings error:&error];
if (recorder) {
   [recorder prepareToRecord];
   recorder.meteringEnabled = YES;
   [recorder record];
} else {
   NSLog(@"Error: %@", error);
}

Then tell the device you want to record and play audio at the same time:

AVAudioSession *audioSession = [AVAudioSession sharedInstance];
NSError *err = nil;
[audioSession setCategory:AVAudioSessionCategoryPlayAndRecord error:&err];

Then, and this is the key step, override the audio route so that output goes to the main (loud) speaker even while recording:

UInt32 audioRouteOverride = kAudioSessionOverrideAudioRoute_Speaker;
AudioSessionSetProperty (kAudioSessionProperty_OverrideAudioRoute,
                         sizeof (audioRouteOverride),
                         &audioRouteOverride);

The end result? You can happily play audio at full volume while recording from the device’s microphone. Happy days.

How-To: Convert UIImage to Greyscale Equivalent

Here’s a simple code snippet for converting a UIImage to a greyscale equivalent:

-(UIImage *) convertToGreyscale:(UIImage *)i {
    int kRed = 1;
    int kGreen = 2;
    int kBlue = 4;
    int colors = kRed | kGreen | kBlue;
    int m_width = i.size.width;
    int m_height = i.size.height;
    uint32_t *rgbImage = (uint32_t *) malloc(m_width * m_height * sizeof(uint32_t));
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(rgbImage, m_width, m_height, 8, m_width * 4, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaNoneSkipLast);
    CGContextSetInterpolationQuality(context, kCGInterpolationHigh);
    CGContextSetShouldAntialias(context, NO);
    CGContextDrawImage(context, CGRectMake(0, 0, m_width, m_height), [i CGImage]);
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);

    // now convert to grayscale: average the selected channels for each pixel
    uint8_t *m_imageData = (uint8_t *) malloc(m_width * m_height);
    for (int y = 0; y < m_height; y++) {
        for (int x = 0; x < m_width; x++) {
            uint32_t rgbPixel = rgbImage[y * m_width + x];
            uint32_t sum = 0, count = 0;
            if (colors & kRed)   { sum += (rgbPixel >> 24) & 255; count++; }
            if (colors & kGreen) { sum += (rgbPixel >> 16) & 255; count++; }
            if (colors & kBlue)  { sum += (rgbPixel >> 8)  & 255; count++; }
            m_imageData[y * m_width + x] = sum / count;
        }
    }
    free(rgbImage);

    // convert from a gray scale image back into a UIImage
    uint8_t *result = (uint8_t *) calloc(m_width * m_height * sizeof(uint32_t), 1);

    // process the image back to rgb: write the grey value into each channel
    for (int i = 0; i < m_height * m_width; i++) {
        int val = m_imageData[i];
        result[i * 4] = 0;
        result[i * 4 + 1] = val;
        result[i * 4 + 2] = val;
        result[i * 4 + 3] = val;
    }
    free(m_imageData);

    // create a UIImage
    colorSpace = CGColorSpaceCreateDeviceRGB();
    context = CGBitmapContextCreate(result, m_width, m_height, 8, m_width * sizeof(uint32_t), colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaNoneSkipLast);
    CGImageRef image = CGBitmapContextCreateImage(context);
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);
    UIImage *resultUIImage = [UIImage imageWithCGImage:image];
    CGImageRelease(image);

    // make sure the pixel buffer is released by giving it to an autoreleased NSData
    [NSData dataWithBytesNoCopy:result length:m_width * m_height * sizeof(uint32_t)];
    return resultUIImage;
}

Thanks to the friendly people who shared this snippet.
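The per-pixel arithmetic above is easy to sanity-check in isolation. Here's a minimal plain-C sketch of the same channel-extraction and averaging step (the helper name `grey_from_rgb` is mine, not part of the snippet), assuming the same packed layout with red in the top byte:

```c
#include <stdint.h>

/* Average the R, G and B channels of a packed 32-bit pixel
 * (red in bits 24-31, green 16-23, blue 8-15) to one grey byte. */
uint8_t grey_from_rgb(uint32_t rgbPixel) {
    uint32_t sum = 0;
    sum += (rgbPixel >> 24) & 255;  /* red   */
    sum += (rgbPixel >> 16) & 255;  /* green */
    sum += (rgbPixel >> 8)  & 255;  /* blue  */
    return (uint8_t)(sum / 3);
}
```

A useful property to check: pure red, pure green and pure blue all map to the same grey (255 / 3 = 85), which is what the equal-weight average in the snippet produces.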

How-To: Convert NSData to NSString

How do you convert an NSData object into its string representation?  Should be easy, right?  It is, when you know how…

NSString* theString = [[NSString alloc] initWithData:theData encoding:NSASCIIStringEncoding];
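What that one-liner hides is the length bookkeeping: an NSData buffer is not NUL-terminated, so you can't treat its bytes as a C string directly. A plain-C sketch of the same conversion (the helper name `bytes_to_string` is mine, for illustration):

```c
#include <stdlib.h>
#include <string.h>

/* Copy a raw byte buffer into a freshly allocated, NUL-terminated
 * C string -- the bookkeeping -initWithData:encoding: does for you.
 * Caller frees the result. */
char *bytes_to_string(const unsigned char *bytes, size_t len) {
    char *str = malloc(len + 1);
    if (!str) return NULL;
    memcpy(str, bytes, len);
    str[len] = '\0';
    return str;
}
```

One caveat on the Objective-C version: NSASCIIStringEncoding only works for 7-bit data; for anything that might contain accented or non-Latin characters, NSUTF8StringEncoding is the safer choice.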

How-To: Detect when MKAnnotation, MKAnnotationView is selected

I’m using MapKit to display a satellite map in one of my apps.  I create custom annotations – and it all works great.  However, I wanted to play sounds when a user touches one of my MKAnnotations on my MKMapView, so that the sound matches the appearance of the callout (and its disappearance too).

There is no delegate method or other built-in mechanism for detecting when your annotation is selected; however, the annotation view does have a selected property.  So how do we detect when the MKAnnotation is selected and then play the sounds?  The answer is key-value observing.

Set up an observer on our imageAnnotationView:

// At the top of the .m file put:
static NSString* const GMAP_ANNOTATION_SELECTED = @"gMapAnnontationSelected";
// Then later somewhere in your code, add the observer
[imageAnnotationView addObserver:self
                      forKeyPath:@"selected"
                         options:NSKeyValueObservingOptionNew
                         context:GMAP_ANNOTATION_SELECTED];

Then we get a callback whenever the selected property of our annotation changes:

- (void)observeValueForKeyPath:(NSString *)keyPath
                      ofObject:(id)object
                        change:(NSDictionary *)change
                       context:(void *)context {

    NSString *action = (NSString *)context;

    if ([action isEqualToString:GMAP_ANNOTATION_SELECTED]) {
        BOOL annotationAppeared = [[change valueForKey:@"new"] boolValue];
        // do something, e.g. play the sound
    }
}

The value of annotationAppeared will change based on the state of the annotation view’s selected property.  GMAP_ANNOTATION_SELECTED is a constant string I set at the top of my file.

I’ve updated my post to make the GMAP_ANNOTATION_SELECTED constant more obvious.  Hopefully this helps people use the snippet above.