Monday, December 15, 2014

Designing your First Swift Voice App


Learn it through lectures on Udacity using this link: https://www.udacity.com/course/ud585.

Code for the full project is available on GitHub: Pitch Perfect

Open Xcode and create a new project using the Single View Application template.



Once created, your Navigator will show the default files, namely AppDelegate.swift and ViewController.swift.  Things are very similar to the Objective-C world for now.

We are designing an app that records your voice using the microphone and then lets you play it back at slow and fast speeds, with effects like making you sound like a chipmunk or like Darth Vader.

First, we list all the things we need to do to make this app happen:
  1. We need a view with a microphone button
  2. On pressing the button, the microphone should record our voice
  3. The recorded voice should be stored
  4. Once saved, we should have the options to play back the voice in different speeds and with different effects.
Going with Apple's Model View Controller Architecture, we can visualize the following components in our application:
  • Model - Stores our audio file
  • Controller - Allows us to input our voice and also allows us to add effects to it
  • View - Displays the microphone, provides a button to record voice, also provides a screen with options to add effects

Interactions:

  • Outlet - A way the view can be accessed from the controller.  The controller can use this outlet to make changes to the view, like changing the text of a label.
  • Action - A way the view can notify the controller when events/changes take place on the view.  An example would be a button clicked on the view invoking an action in the controller to handle the event.
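As a minimal sketch (the names statusLabel and buttonTapped are made up for illustration, not part of this project), an outlet and an action look like this in a view controller:

```swift
import UIKit

class ExampleViewController: UIViewController {
    // Outlet: lets the controller reach into the view,
    // e.g. to change a label's text
    @IBOutlet weak var statusLabel: UILabel!

    // Action: invoked by the view when the button is tapped
    @IBAction func buttonTapped(sender: UIButton) {
        statusLabel.text = "Button tapped"
    }
}
```

Both are wired up in the storyboard by Control-dragging between the control and the code.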

Life Cycle of a ViewController

  • viewDidLoad - used for one-time initialization
  • viewWillAppear - used to hide/show controls before the view appears
  • viewDidAppear - used to perform actions after the view appears
  • viewWillDisappear - used for wrap-up actions before the view disappears
  • viewDidDisappear - used for wrap-up actions after the view disappears
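These methods are overridden in your view controller; a sketch (remember to call super in each):

```swift
import UIKit

class ExampleViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        // One-time initialization
    }

    override func viewWillAppear(animated: Bool) {
        super.viewWillAppear(animated)
        // Hide/show controls just before the view becomes visible
    }

    override func viewWillDisappear(animated: Bool) {
        super.viewWillDisappear(animated)
        // Wrap up before the view goes away
    }
}
```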

Navigation View Controller

  • We can embed our view controllers in a navigation controller that allows us to navigate between different screens
  • One of the view controllers will be the root view controller, which is displayed first
  • Other view controllers can be segued to from the root view controller by linking them to events like button clicks
  • We navigate back to the calling view controller using the back button
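A segue wired in the storyboard can also be triggered from code; a one-line sketch (the identifier "showEffects" is a made-up example):

```swift
// Trigger a storyboard segue by its identifier,
// e.g. from a button's action ("showEffects" is hypothetical)
self.performSegueWithIdentifier("showEffects", sender: self)
```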

Auto Layout

  • Xcode now provides a way to place controls on the screen and then specify their layout.  
  • Clicking on the control and dragging up or down will allow you to specify constraints on the display of the control.
  • You can make controls horizontally and vertically central to the display
  • You can make controls be positioned at a constant distance from the vertical and horizontal borders
  • You can change the auto layout by clicking on the component and dragging or by clicking on the component and using the auto layout menu at the bottom right of the storyboard layout to specify values directly.

Image

  • Images.xcassets allows you to create image sets for displaying images in your app.
  • Images can be provided at 1x, 2x, and 3x scales depending on the device used for display.
  • Image sets can then be linked to UI components using their image attribute.
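Image sets can also be loaded in code rather than through the storyboard; a small sketch (the set name "microphone" is hypothetical):

```swift
// Load an image set from the asset catalog by name
// ("microphone" is a hypothetical image set name)
let micImage = UIImage(named: "microphone")
recordButton.setImage(micImage, forState: .Normal)
```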

Designing the First View





We have the following components:

1. A microphone button
2. A label to display "recording" when we are recording voice

Steps:
  • Create UIButtons called recordButton and stopButton.  Create a label called recordLabel.  Set the stopButton and recordLabel to be hidden when the view first appears.
  • Create outlets for the buttons and labels
  • Create actions for the recordButton and stopButton.
  • When the recordButton is pressed, show the label indicating that the audio is being recorded and also enable the stop button.  Also disable the recordButton so it cannot be pressed again.
  • When the stop button is pressed, hide the label that the audio is being recorded.

 import UIKit

 class ViewController: UIViewController {
   @IBOutlet weak var recordButton: UIButton!
   @IBOutlet weak var stopButton: UIButton!
   @IBOutlet weak var recordLabel: UILabel!

   override func viewDidLoad() {
     super.viewDidLoad()
     // Do any additional setup after loading the view, typically from a nib.
   }

   override func didReceiveMemoryWarning() {
     super.didReceiveMemoryWarning()
     // Dispose of any resources that can be recreated.
   }

   override func viewDidAppear(animated: Bool) {
     super.viewDidAppear(animated)
     // Hide the stop button and label until recording starts
     recordLabel.hidden = true
     stopButton.hidden = true
     recordButton.enabled = true
   }

   @IBAction func recordAudio(sender: UIButton) {
     // Show that recording is in progress and prevent a second tap
     recordLabel.hidden = false
     stopButton.hidden = false
     recordButton.enabled = false
   }

   @IBAction func stopRecording(sender: UIButton) {
     // Reset the UI so another recording can be made
     recordLabel.hidden = true
     stopButton.hidden = true
     recordButton.enabled = true
   }
 }

Create a Model

Create a new class and name it RecordedAudio.  In this class, we'll store the title and the file path for our recording.  When we switch between views in our application, this object will let us pass the recorded audio (the "data" of our application) along.  This is in line with the general design Apple recommends for storing an application's data.

 import Foundation  
 //This object represents the model  
 //The stored audio file has a title and URL  
 class RecordedAudio:NSObject  
 {  
   var title:String!  
   var filePathURL:NSURL!  
 }  

Declare your model in your ViewController

 import UIKit
 import AVFoundation

 class ViewController: UIViewController, AVAudioRecorderDelegate {
   @IBOutlet weak var recordLabel: UILabel!  
   @IBOutlet weak var recordButton: UIButton!  
   @IBOutlet weak var stopButton: UIButton!  
   @IBOutlet weak var playButton: UIButton!  
   var audioPlayer:AVAudioPlayer!  
   var audioRecorder:AVAudioRecorder!  
   var recordedAudio:RecordedAudio!  
   var filePath:NSURL!  
   ......  
 }  


Record your voice

Back in your ViewController, once you have your buttons and event handling working and your model created, add the ability to start and stop recording in the recordAudio and stopRecording button actions:
  1. First get the directory path in your app's Documents folder to save the recording
  2. Create a file name - you can use any filename; I'm appending the date/time to mine to make it unique every time.
  3. Print it out so you can play back and check the recording
  4. Create a new session for recording
  5. Assign yourself as the delegate for audio recording.  This requires your class to also implement the AVAudioRecorderDelegate protocol.  Making our class the recording delegate lets us handle events like "audioRecorderDidFinishRecording".  The advantage is that we can let Core Audio finish all the processing for the recording (like a 5 minute long session) and make our transitions (to other actions, like switching pages) happen only after the processing is done.

  
   @IBAction func recordAudio(sender: UIButton) {
     recordLabel.hidden = false
     stopButton.hidden = false
     recordButton.enabled = false

     // Get the directory to store the recorded file in the app's sandbox
     let dirPath = NSSearchPathForDirectoriesInDomains(.DocumentDirectory, .UserDomainMask, true)[0] as String

     // Name the file with the date/time to make it unique
     let currentDateTime = NSDate()
     let formatter = NSDateFormatter()
     formatter.dateFormat = "ddMMyyyy-HHmmss"
     let recordingName = formatter.stringFromDate(currentDateTime) + ".wav"
     let pathArray = [dirPath, recordingName]
     let filePath = NSURL.fileURLWithPathComponents(pathArray)
     println(filePath)

     // Configure the shared audio session for recording and playback
     let session = AVAudioSession.sharedInstance()
     session.setCategory(AVAudioSessionCategoryPlayAndRecord, error: nil)

     // Create a new audio recorder and start recording
     audioRecorder = AVAudioRecorder(URL: filePath, settings: nil, error: nil)
     audioRecorder.delegate = self
     audioRecorder.meteringEnabled = true
     audioRecorder.prepareToRecord()
     audioRecorder.record()
   }

Stop Recording

Now that we have a button to start recording, let's give the user the ability to stop the recording with a stop button:

   @IBAction func stopRecording(sender: UIButton) {
     recordLabel.hidden = true
     stopButton.hidden = true
     recordButton.enabled = true
     // Stop the recorder and deactivate the shared audio session
     audioRecorder.stop()
     let audioSession = AVAudioSession.sharedInstance()
     audioSession.setActive(false, error: nil)
   }

Once we have stopped recording, we can implement the "audioRecorderDidFinishRecording" delegate method to save the recorded file into our Model object (in our case, an instance of RecordedAudio) so that this object, now containing the recorded file, can be used in this or other views.

   func audioRecorderDidFinishRecording(recorder: AVAudioRecorder!, successfully flag: Bool) {
     if flag {
       // Store the recording in the Model
       recordedAudio = RecordedAudio()
       recordedAudio.filePathURL = recorder.url
       recordedAudio.title = recorder.url.lastPathComponent
       // Segue once we've finished processing the audio
     } else {
       println("recording not successful")
       recordButton.enabled = true
       stopButton.hidden = true
     }
   }

In the snippet above, the segue comment marks where you can implement transitions to other views.
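For example, the segue comment could be replaced by a programmatic segue that hands the model to the next screen. This is only a sketch: the identifier "stopRecording" and the PlaySoundsViewController class with a receivedAudio property are assumptions, not part of the snippets above:

```swift
// Inside audioRecorderDidFinishRecording, once the model is filled in:
self.performSegueWithIdentifier("stopRecording", sender: recordedAudio)

// Pass the model object to the destination before the transition
override func prepareForSegue(segue: UIStoryboardSegue, sender: AnyObject?) {
    if segue.identifier == "stopRecording" {
        let playSoundsVC = segue.destinationViewController as PlaySoundsViewController
        playSoundsVC.receivedAudio = sender as RecordedAudio
    }
}
```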

Playing an audio file

We need the AVFoundation framework to play an audio file; its AVAudioPlayer class provides this functionality.

Following is a good code snippet to play an audio file:
  • the audioPlayer variable is declared at class level so that different functions in the controller can access it.
  • filePath represents the path of the file.
  • NSBundle.mainBundle() looks in the app's main bundle for the resource "movie_quote", which is an mp3 file
  • we convert this filePath to a URL so that we can initialize the audio player with it
  • we set the enableRate property of the audio player to true to allow us to change the rate at which the audio is played.
   var audioPlayer: AVAudioPlayer!

   override func viewDidLoad() {
     super.viewDidLoad()
     if let filePath = NSBundle.mainBundle().pathForResource("movie_quote", ofType: "mp3") {
       let url = NSURL.fileURLWithPath(filePath)
       audioPlayer = AVAudioPlayer(contentsOfURL: url, error: nil)
       audioPlayer.enableRate = true
     } else {
       println("file path is incorrect")
     }
     // Do any additional setup after loading the view.
   }

The snippet above lets you test the player functionality with a generic recording to make sure your audio player is functioning properly.  You can tweak the code to play the audio file you stored in your model object as well.
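With enableRate set, the same player can drive the slow and fast playback this app is after; a sketch (the helper name and rate values are just examples):

```swift
// Play the loaded audio at a given speed
// (0.5 = half speed, 2.0 = double speed)
func playAudioAtRate(rate: Float) {
    audioPlayer.stop()
    audioPlayer.rate = rate
    audioPlayer.currentTime = 0.0   // restart from the beginning
    audioPlayer.play()
}

// playAudioAtRate(0.5)   // slow
// playAudioAtRate(2.0)   // fast
```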

4 comments:

  1. Hi Aparna,

    Can you please tel me how to record the voice in swift?

    1. First, thanks for reading ! I've added some more snippets to explain how to record and stop recording in new sections above. You can also download the code for the entire project from my github page. Hope it helps!

    2. Looks perfect!!
      Great!!!!
      It really helped...
      Thanks a lot Aparna!!!!

  2. May I know how you add the code snippets to your blog in the structured manner?
