Beginner's Tutorial: Facial Emotion Detection Using Keras Models on Amazon SageMaker
- To run the same Keras code on Amazon SageMaker that you run on your local machine, use script mode.
- To optimize hyperparameters, launch automatic model tuning.
- Deploy your models with Amazon Elastic Inference.
This example demonstrates training a simple convolutional neural network on the Facial Expression Recognition dataset. The data consists of 48×48 pixel grayscale images of faces. The task is to categorize each face based on the emotion shown in the facial expression into one of seven categories (0=Angry, 1=Disgust, 2=Fear, 3=Happy, 4=Sad, 5=Surprise, 6=Neutral).
- Grab optional parameters from the command line, or use default values if they’re missing.
- Download the dataset and save it to the /data directory.
- Normalize the pixel values and one-hot encode the labels.
- Build the convolutional neural network.
- Train the model.
- Save the model to TensorFlow Serving format for deployment.
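To make those steps concrete, here is a condensed, illustrative sketch, not the full fertrain.py. It assumes TensorFlow 2.x with tf.keras, a local data/fer2013.csv file, and an arbitrary small CNN; adjust the architecture and paths to match your own script.

```python
# Illustrative sketch of the training steps (assumes TensorFlow 2.x / tf.keras).
import os
import numpy as np
import pandas as pd
import tensorflow as tf

NUM_CLASSES = 7  # 0=Angry, 1=Disgust, 2=Fear, 3=Happy, 4=Sad, 5=Surprise, 6=Neutral

def load_fer_csv(path):
    """Parse the FER2013 CSV: each row has an emotion label and 48*48 space-separated pixels."""
    df = pd.read_csv(path)
    pixels = np.stack(df["pixels"].apply(lambda s: np.array(s.split(), dtype="float32")).values)
    x = pixels.reshape(-1, 48, 48, 1) / 255.0                              # normalize pixel values
    y = tf.keras.utils.to_categorical(df["emotion"].values, NUM_CLASSES)   # one-hot encode labels
    return x, y

x_train, y_train = load_fer_csv("data/fer2013.csv")

# A small CNN; the real fertrain.py may use a deeper architecture.
model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(32, 3, activation="relu", input_shape=(48, 48, 1)),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(64, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"])
model.fit(x_train, y_train, epochs=25, batch_size=64, validation_split=0.1)

# Export in TensorFlow Serving (SavedModel) format; the numbered subdirectory
# is the model version that TensorFlow Serving expects.
model.save(os.path.join("model", "1"), save_format="tf")
```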
Now check that this code works by running it on a local machine, without using Amazon SageMaker.
When the script runs on Amazon SageMaker, the service sets several environment variables that tell your code where to find its inputs and where to write its outputs:
- SM_NUM_GPUS: the number of GPUs present in the instance.
- SM_MODEL_DIR: the output location for the model.
- SM_CHANNEL_TRAINING: the location of the training dataset.
- SM_CHANNEL_VALIDATION: the location of the validation dataset.
In the script, read them with argparse, using the environment variables as defaults:

import argparse
import os

parser = argparse.ArgumentParser()
parser.add_argument('--gpu-count', type=int, default=os.environ['SM_NUM_GPUS'])
parser.add_argument('--model-dir', type=str, default=os.environ['SM_MODEL_DIR'])
parser.add_argument('--training', type=str, default=os.environ['SM_CHANNEL_TRAINING'])
parser.add_argument('--validation', type=str, default=os.environ['SM_CHANNEL_VALIDATION'])
What about hyperparameters? No work needed there. Amazon SageMaker passes them as command-line arguments to your code.
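For example, continuing the argparse setup above, hyperparameters you later pass to the estimator arrive as extra command-line arguments. The names and default values below are illustrative, not taken from fertrain.py:

```python
# Hyperparameters passed to the estimator show up as command-line arguments.
parser.add_argument('--epochs', type=int, default=25)
parser.add_argument('--batch-size', type=int, default=64)
parser.add_argument('--learning-rate', type=float, default=0.001)

# parse_known_args tolerates any additional arguments SageMaker may pass.
args, _ = parser.parse_known_args()
```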
For more information, see the updated script, fertrain.py.
Now let's implement this step by step.
1. Create an S3 bucket on AWS
   1. Sign in to the AWS Management Console and open the Amazon S3 console at https://console.aws.amazon.com/s3/.
   2. Choose Create bucket.
   3. In the Bucket name field, enter a unique, DNS-compliant name for your new bucket. (The example screenshot uses the bucket name admin-created; you cannot reuse this name because S3 bucket names must be globally unique.)
   4. For Region, choose US West (Oregon) as the Region where you want the bucket to reside.
   5. Choose Create, then download all of the files from the GitHub repository and upload them to your bucket (if you prefer, you can script this step; see the boto3 sketch after these setup steps).
2. Create a notebook instance on AWS
   1. Open the Amazon SageMaker console at https://console.aws.amazon.com/sagemaker/.
   2. Choose Notebook instances, then choose Create notebook instance.
   3. Configure the notebook instance settings as shown in the screenshot below.
   4. For IAM role, choose Create a new role and enter your S3 bucket name.
   5. Choose Create notebook instance and wait a few minutes for the instance to be created.
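If you would rather script the S3 part of this setup than click through the console, a minimal boto3 sketch might look like the following. The bucket name and file paths are hypothetical; remember that bucket names must be globally unique.

```python
# Hypothetical bucket name and file paths; adjust to your own.
import boto3

bucket = "my-fer-sagemaker-bucket"
s3 = boto3.client("s3", region_name="us-west-2")

# Create the bucket in US West (Oregon).
s3.create_bucket(
    Bucket=bucket,
    CreateBucketConfiguration={"LocationConstraint": "us-west-2"},
)

# Upload the dataset downloaded from the GitHub repository.
s3.upload_file("data/fer2013.csv", bucket, "data/fer2013.csv")
```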
Once the notebook instance is running, you can begin training your Keras model on Amazon SageMaker. For more information, see the fer-amazon-sagemakers.ipynb notebook.
- Load the dataset.
- Define the training and validation channels.
- Configure the TensorFlow estimator, enabling script mode and passing some hyperparameters.
- Train, deploy, and predict (sketched below).
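Putting those steps together, the notebook code might look roughly like this. It is a sketch, not the exact notebook: it assumes the SageMaker Python SDK v1, a hypothetical bucket name, that the training and validation data were uploaded under training/ and validation/ prefixes, and arbitrary instance types and hyperparameter values.

```python
# Sketch of the notebook workflow (SageMaker Python SDK v1 style).
import sagemaker
from sagemaker.tensorflow import TensorFlow

role = sagemaker.get_execution_role()
bucket = "my-fer-sagemaker-bucket"  # hypothetical bucket name

# Training and validation channels map to SM_CHANNEL_TRAINING / SM_CHANNEL_VALIDATION.
inputs = {
    "training": "s3://{}/data/training".format(bucket),
    "validation": "s3://{}/data/validation".format(bucket),
}

# Script mode: the entry point is the adapted Keras script, and the
# hyperparameters dict becomes command-line arguments to that script.
estimator = TensorFlow(
    entry_point="fertrain.py",
    role=role,
    train_instance_count=1,
    train_instance_type="ml.p3.2xlarge",
    framework_version="1.12.0",
    py_version="py3",
    script_mode=True,
    hyperparameters={"epochs": 25, "batch-size": 64},
)

estimator.fit(inputs)

# Deploy behind a real-time endpoint; attaching an Elastic Inference
# accelerator (accelerator_type) is optional and reduces inference cost.
predictor = estimator.deploy(
    initial_instance_count=1,
    instance_type="ml.m5.large",
    accelerator_type="ml.eia1.medium",
)

# test_images: a batch of preprocessed 48x48x1 grayscale face images.
# predictions = predictor.predict(test_images)
```

When you are done experimenting, delete the endpoint (for example, with predictor.delete_endpoint()) so you stop incurring charges.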