Wednesday, December 8, 2010

Create a Basic iPhone Audio Player with AV Foundation Framework


Last week we received an email requesting a tutorial that covers playing sounds on the iPhone. I started digging into the documentation, and it appears the SDK offers two choices: the AV Foundation Framework, which makes it very easy, or Audio Queue Services, which is about the most difficult thing in the world. The person who emailed mentioned buffers and such, so it's clear they wanted a tutorial covering Audio Queue Services. I'll be getting to a tutorial on that in the near future, but for now I'm going to create a simple audio player using the AV Foundation Framework that should cover most people's needs.
I think my biggest criticism of programming for the iPhone is simply how hard it is to find documentation for the things I want to do. I actually got pretty far into Audio Queue Services before I even discovered the AV Foundation Framework, and the entire time in between I was cursing Apple's name for making something so simple - playing audio - so difficult to program. But that time's over; having discovered the AV Foundation Framework, I'm once again a content iPhone developer.
Below is a screenshot of what we'll be building today. It's simply a small application with two buttons - Play and Stop. Play begins playing an audio file I've embedded into the app as a resource, and Stop stops the playback.
Example Application Screenshot
I guess the first thing you're going to need to do is get some audio to play and add it to the project as a resource. The playback is built around the AVAudioPlayer class, which supports lots of different formats, including mp3. As far as actually adding the file, it's as simple as right-clicking on the Resources folder, selecting Add / Existing Files, and choosing your sound file.
Now we need to bring the AV Foundation Framework into our project. For me, this wasn't in the ordinary list of frameworks you see when you right-click on Frameworks and select Add / Existing Frameworks, so I added the framework from here:
/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS3.0.sdk/System/
Library/Frameworks/AVFoundation.framework
The next thing we need to do is create the user interface. I added a new UIViewController subclass called AudioPlayer to handle this. I then used Interface Builder to slap a couple of buttons on the screen and hook up the button presses to my code. If you're not familiar with how to create user interfaces using Interface Builder, I would recommend checking out our getting started tutorial. Here's the header file for my completed view controller.
#import <UIKit/UIKit.h>

@class AVAudioPlayer;

@interface AudioPlayer : UIViewController {
  IBOutlet UIButton *playButton;
  IBOutlet UIButton *stopButton;
  AVAudioPlayer *audioPlayer;
}

@property (nonatomic, retain) IBOutlet UIButton *playButton;
@property (nonatomic, retain) IBOutlet UIButton *stopButton;
@property (nonatomic, retain) AVAudioPlayer *audioPlayer;

-(IBAction)play;
-(IBAction)stop;

@end
Here I simply have some properties to hold my buttons, and another to hold the AVAudioPlayer that will actually be playing the sounds. I also stuck in a couple of methods that are invoked when the buttons are pressed.
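For completeness, here's a rough sketch of how the top of AudioPlayer.m might look. The header above only forward-declares AVAudioPlayer, so the implementation file is where the AVFoundation header actually gets imported; the @synthesize lines and dealloc are just the usual boilerplate for the retained properties.

#import "AudioPlayer.h"
#import <AVFoundation/AVFoundation.h>

@implementation AudioPlayer

@synthesize playButton;
@synthesize stopButton;
@synthesize audioPlayer;

// viewDidLoad and the play/stop actions shown below go here.

- (void)dealloc {
  [playButton release];
  [stopButton release];
  [audioPlayer release];
  [super dealloc];
}

@end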
Most of the work is placed in the viewDidLoad function for this view controller. Here's the contents of that function.
- (void)viewDidLoad {
  [super viewDidLoad];
  
  // Get the file path to the song to play.
  NSString *filePath = [[NSBundle mainBundle] pathForResource:@"TNG_Theme" 
                                                       ofType:@"mp3"];
  
  // Convert the file path to a URL.
  NSURL *fileURL = [[NSURL alloc] initFileURLWithPath:filePath];
  
  // Initialize the AVAudioPlayer. The retained property takes ownership,
  // so release the local reference afterwards to avoid leaking the player.
  AVAudioPlayer *player = [[AVAudioPlayer alloc] 
                           initWithContentsOfURL:fileURL error:nil];
  self.audioPlayer = player;
  [player release];
  
  // Preloads the buffer and prepares the audio for playing.
  [self.audioPlayer prepareToPlay];
  
  // fileURL was alloc'd above, so release it here. filePath is autoreleased
  // (it came from pathForResource:ofType:) and must not be released.
  [fileURL release];
  
}
The first thing we need to do is get a file path to the audio file. The file I'm using is the theme song for Star Trek - The Next Generation, and it's an mp3. I then convert that path into an NSURL object, which is what AVAudioPlayer needs when it's being initialized. Next up is actually initializing the audio player; because the audioPlayer property is declared with retain, I release the freshly alloc'd player once it's been assigned so it isn't over-retained. I pass nil for the error parameter since I'm not handling initialization errors in this tutorial. The last thing I do is call prepareToPlay. This call preloads the buffers and prepares the hardware for audio playback, which reduces the amount of time between when the play button is pressed and when audio is actually heard.
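If you do care about initialization failures (a missing file, an unsupported format), you can pass an NSError pointer instead of nil and check the result. Here's a quick sketch of what that might look like inside viewDidLoad; the player and error variable names are just for illustration.

NSError *error = nil;
AVAudioPlayer *player = [[AVAudioPlayer alloc] initWithContentsOfURL:fileURL
                                                                error:&error];
if (player == nil) {
  // Initialization failed; log why rather than silently doing nothing.
  NSLog(@"Could not create the audio player: %@", [error localizedDescription]);
} else {
  self.audioPlayer = player;
  [player release];
  [self.audioPlayer prepareToPlay];
}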
The only thing left is to actually start and stop the audio when the buttons are pressed.
-(IBAction)play {

  // Make sure the audio is at the start of the stream.
  self.audioPlayer.currentTime = 0;
  
  [self.audioPlayer play];
  
}

-(IBAction)stop {

  [self.audioPlayer stop];
  
}
These calls should be very straightforward. The only oddity is setting the currentTime property. Since stop does not reset the playback position to the beginning, this line ensures the audio plays from the start any time the play button is pressed.
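If you'd rather pause than stop, AVAudioPlayer also has a pause method, which leaves currentTime where it is so a later play resumes from the same spot. A hypothetical pause action (there's no pause button in the project above) would look like this:

-(IBAction)pause {

  // Unlike stop, pause keeps currentTime, so calling play afterwards
  // resumes from where playback left off.
  [self.audioPlayer pause];
  
}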
And that's it for this tutorial. I've attached my entire Xcode project below. As I mentioned before, I'm still working my way through Audio Queue Services, and I'll be creating a tutorial on that in the near future. If you've got any questions, feel free to ask in the comments.
