How To: iPhone SDK – Play and Record Audio concurrently

Unfortunately, it’s quite a fiddly process to record audio and play it back at the same time on the iPhone.  By default, the sound output is very quiet from the iPhone’s speaker when you are recording sound.  So how do we fix this?

First, setup your audio session to record audio:

NSURL *url = [NSURL fileURLWithPath:@"/dev/null"];
NSDictionary *settings = [NSDictionary dictionaryWithObjectsAndKeys:
   [NSNumber numberWithFloat: 44100.0],                 AVSampleRateKey,
   [NSNumber numberWithInt: kAudioFormatAppleLossless], AVFormatIDKey,
   [NSNumber numberWithInt: 1],                         AVNumberOfChannelsKey,
   [NSNumber numberWithInt: AVAudioQualityMax],         AVEncoderAudioQualityKey, nil];

NSError *error = nil;
recorder = [[AVAudioRecorder alloc] initWithURL:url settings:settings error:&error];
if (recorder) {
   [recorder prepareToRecord]; 
   recorder.meteringEnabled = YES;
   [recorder record]; 
} else {
   NSLog(@"Error: %@", error);
}

Then tell the device you want to record and play audio at the same time:

AVAudioSession *audioSession = [AVAudioSession sharedInstance];
NSError *err = nil;
[audioSession setCategory:AVAudioSessionCategoryPlayAndRecord error:&err];

Then, and this is the key, allow the volume from the speakers to also be loud when recording:

UInt32 audioRouteOverride = kAudioSessionOverrideAudioRoute_Speaker;
AudioSessionSetProperty (kAudioSessionProperty_OverrideAudioRoute,
sizeof(audioRouteOverride),&audioRouteOverride);
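
One gotcha worth noting (behaviour I’ve observed, so treat this as a hedge rather than gospel): the speaker override can be reset when the audio route changes – for example when headphones are plugged in or unplugged – so you may need to re-apply it.  A sketch using the Audio Session route-change listener (the listener function name is mine):

#import <AudioToolbox/AudioToolbox.h>

// Re-apply the speaker override whenever the audio route changes.
static void routeChangeListener(void *inClientData,
                                AudioSessionPropertyID inID,
                                UInt32 inDataSize,
                                const void *inData) {
    UInt32 audioRouteOverride = kAudioSessionOverrideAudioRoute_Speaker;
    AudioSessionSetProperty(kAudioSessionProperty_OverrideAudioRoute,
                            sizeof(audioRouteOverride), &audioRouteOverride);
}

// Register the listener once, e.g. right after setting up the session:
AudioSessionAddPropertyListener(kAudioSessionProperty_AudioRouteChange,
                                routeChangeListener, NULL);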

The end result? You can happily play audio at full volume while recording from the device’s microphone. Happy days.

How-To: Convert UIImage to Greyscale Equivalent

Here’s a simple code snippet for converting a UIImage to a greyscale equivalent:

-(UIImage *) convertToGreyscale:(UIImage *)i {
	
    // Bit flags selecting which channels contribute to the grey value.
    const int kRed = 1;
    const int kGreen = 2;
    const int kBlue = 4;

    int colors = kRed | kGreen | kBlue;
    int m_width = i.size.width;
    int m_height = i.size.height;
	
    uint32_t *rgbImage = (uint32_t *) malloc(m_width * m_height * sizeof(uint32_t));
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(rgbImage, m_width, m_height, 8, m_width * 4, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaNoneSkipLast);
    CGContextSetInterpolationQuality(context, kCGInterpolationHigh);
    CGContextSetShouldAntialias(context, NO);
    CGContextDrawImage(context, CGRectMake(0, 0, m_width, m_height), [i CGImage]);
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);
	
    // now convert to grayscale
    uint8_t *m_imageData = (uint8_t *) malloc(m_width * m_height);
    for(int y = 0; y < m_height; y++) {
        for(int x = 0; x < m_width; x++) {
			uint32_t rgbPixel=rgbImage[y*m_width+x];
			uint32_t sum=0,count=0;
			if (colors & kRed) {sum += (rgbPixel>>24)&255; count++;}
			if (colors & kGreen) {sum += (rgbPixel>>16)&255; count++;}
			if (colors & kBlue) {sum += (rgbPixel>>8)&255; count++;}
			m_imageData[y*m_width+x]=sum/count;
        }
    }
    free(rgbImage);
	
    // convert from a gray scale image back into a UIImage
    uint8_t *result = (uint8_t *) calloc(m_width * m_height *sizeof(uint32_t), 1);
	
    // process the image back to rgb
    for (int idx = 0; idx < m_height * m_width; idx++) {
        uint8_t val = m_imageData[idx];
        result[idx * 4] = 0;
        result[idx * 4 + 1] = val;
        result[idx * 4 + 2] = val;
        result[idx * 4 + 3] = val;
    }
    free(m_imageData);
	
    // create a UIImage
    colorSpace = CGColorSpaceCreateDeviceRGB();
    context = CGBitmapContextCreate(result, m_width, m_height, 8, m_width * sizeof(uint32_t), colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaNoneSkipLast);
    CGImageRef image = CGBitmapContextCreateImage(context);
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);
    UIImage *resultUIImage = [UIImage imageWithCGImage:image];
    CGImageRelease(image);
	
    // make sure the data will be released by giving it to an autoreleased NSData
    [NSData dataWithBytesNoCopy:result length:m_width * m_height];
	
    return resultUIImage;
}

Thanks to the friendly people at StackOverflow.com for this snippet.
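
As an aside, the same conversion can be done in one pass by letting Core Graphics redraw the image into a greyscale bitmap context.  This is a sketch I haven’t battle-tested (the method name is mine); note that Core Graphics applies a luminance-weighted conversion, so the tones will differ slightly from a straight channel average:

- (UIImage *)greyscaleImageFromImage:(UIImage *)image {
    int width = image.size.width;
    int height = image.size.height;

    // 8 bits per pixel, no alpha, device grey colour space.
    CGColorSpaceRef greySpace = CGColorSpaceCreateDeviceGray();
    CGContextRef context = CGBitmapContextCreate(NULL, width, height, 8, width,
                                                 greySpace, kCGImageAlphaNone);
    CGColorSpaceRelease(greySpace);
    if (!context) return nil;

    // Redrawing into the grey context performs the conversion for us.
    CGContextDrawImage(context, CGRectMake(0, 0, width, height), image.CGImage);
    CGImageRef greyImage = CGBitmapContextCreateImage(context);
    CGContextRelease(context);

    UIImage *result = [UIImage imageWithCGImage:greyImage];
    CGImageRelease(greyImage);
    return result;
}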

How-To: Convert NSData to NSString

How do you convert an NSData object into its string representation?  Should be easy, right?  It is, when you know how…

NSString* theString = [[NSString alloc] initWithData:theData encoding:NSASCIIStringEncoding];
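
One caveat: the encoding you pass in must match the encoding the bytes were produced with – NSASCIIStringEncoding will fail (returning nil) on non-ASCII bytes, so UTF-8 is usually the safer bet.  A quick round-trip sketch:

// Round trip: decode with the same encoding used to encode.
NSString *original = @"Hello, world";
NSData *theData = [original dataUsingEncoding:NSUTF8StringEncoding];
NSString *theString = [[NSString alloc] initWithData:theData
                                            encoding:NSUTF8StringEncoding];
// theString now equals original; initWithData:encoding: returns nil if the
// bytes are not valid in the given encoding.
[theString release];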

How-To: Detect when MKAnnotation, MKAnnotationView is selected

I’m using MapKit to display a satellite map in one of my apps.  I create custom annotations, and it all works great.  However, I wanted to be able to play sounds when a user touches one of my MKAnnotations on my MKMapView, so that the sound matches the appearance of the callout (and its disappearance too).

There is no delegate method or other built-in mechanism for detecting when your annotation is selected; however, MKAnnotationView does have a selected property.  So how do we detect when the annotation is selected and then play the sounds?  The answer is key-value observing.

Set up an observer on our imageAnnotationView:

// At the top of the .m file put:
static NSString * const GMAP_ANNOTATION_SELECTED = @"gMapAnnotationSelected";

// Then later, somewhere in your code, add the observer:
[imageAnnotationView addObserver:self
                      forKeyPath:@"selected"
                         options:NSKeyValueObservingOptionNew
                         context:GMAP_ANNOTATION_SELECTED];

Then we get a callback whenever the selected property of our annotation changes:

- (void)observeValueForKeyPath:(NSString *)keyPath
                      ofObject:(id)object
                        change:(NSDictionary *)change
                       context:(void *)context {

    // Compare the context pointer directly rather than casting it to a
    // string -- the context is an opaque pointer, not guaranteed to be
    // an NSString object.
    if (context == GMAP_ANNOTATION_SELECTED) {
        BOOL annotationAppeared = [[change valueForKey:@"new"] boolValue];
        // do something
    }
}

The value of annotationAppeared will change based on the state of the annotation’s selected property.  GMAP_ANNOTATION_SELECTED is a constant string I set at the top of my file.

I’ve updated my post to make the GMAP_ANNOTATION_SELECTED constant more obvious.  Hopefully this helps people use the snippet above.
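
One last housekeeping point: KVO observers must be removed before the observer is deallocated, otherwise the notification will be sent to a dead object and crash.  A sketch, assuming imageAnnotationView is a retained ivar and you’re tearing down in dealloc:

- (void)dealloc {
    // Must match the addObserver:forKeyPath:... call exactly.
    [imageAnnotationView removeObserver:self forKeyPath:@"selected"];
    [imageAnnotationView release];
    [super dealloc];
}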

How-To: Detect if users have turned off the iPhone GPS

Simple and common situation: you have an app that needs the GPS to function correctly.  However, users have the final say and can simply tap “Don’t Allow” when the iPhone asks whether your application can use their location.  If you don’t handle this situation, it’s likely that Apple will reject your application.

So what to do?  Simple, implement the following CLLocationManager delegate method:

- (void)locationManager:(CLLocationManager *)manager 
       didFailWithError:(NSError *)error

Inside this method, make sure you deal with not receiving any GPS locations in a sensible manner (show a popup, skip the feature, etc.).
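
One possible implementation (the alert wording is mine – adapt it to your app).  When the user refuses access, Core Location reports kCLErrorDenied, which you can check for explicitly:

- (void)locationManager:(CLLocationManager *)manager
       didFailWithError:(NSError *)error {
    // kCLErrorDenied means the user refused to let the app use location.
    if ([error.domain isEqualToString:kCLErrorDomain] &&
        error.code == kCLErrorDenied) {
        [manager stopUpdatingLocation];
        UIAlertView *alert = [[UIAlertView alloc]
            initWithTitle:@"Location Unavailable"
                  message:@"This app works best with location services enabled."
                 delegate:nil
        cancelButtonTitle:@"OK"
        otherButtonTitles:nil];
        [alert show];
        [alert release];
    }
}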

A solution to: Application failed codesign verification

Okay, so you’ve spent months working hard on your iPhone project, and you finally go gold.  You get sign off from your client, you’ve squashed the last of your bugs and you think “I’m ready to submit this bad boy!”.  You package it up for release as per Apple’s instructions, and then try to upload it using the Application Loader that Apple supplies.


How-To: Remove grey shadow from iPhone UIWebView

By default, when you touch a clickable HTML element in a webpage (or in HTML displayed in a UIWebView), the iPhone overlays a grey highlight box on it.

Sometimes you don’t want this highlight to appear, or you want it to be a different color.  Here is the CSS that will help you in either situation:

Remove the grey highlight completely:

-webkit-tap-highlight-color:rgba(0,0,0,0);

Change the color of the highlight:

-webkit-tap-highlight-color:your-color-here;
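
If you don’t control the page’s stylesheet, you can also apply the rule from the app side once the page has loaded.  This is a sketch (setting the style on document.body is my choice – target whichever elements you need):

// In your UIWebViewDelegate: inject the tap-highlight rule after the page loads.
- (void)webViewDidFinishLoad:(UIWebView *)webView {
    [webView stringByEvaluatingJavaScriptFromString:
        @"document.body.style.webkitTapHighlightColor = 'rgba(0,0,0,0)';"];
}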