
Getting started with the Affectiva SDK

Date post: 15-Oct-2021

Tutorial written by Thomas Oropeza and Beste Filiz Yuksel

CS 686/486 Affective Computing, University of San Francisco

Affectiva’s SDKs enable developers to create interactive, emotion-aware apps and digital experiences. They capture and report emotion insights from facial expressions across different devices, analyzing faces obtained from the device camera, a video, or a single image, processed on-device and in real time. Affectiva has SDKs available for iOS, Android, the Web, and more. This tutorial walks through setting up a starter project with the Affectiva Android SDK in Android Studio, and assumes no previous knowledge of Android or mobile development.

Download Android Studio

The first step is to download Android Studio. If you already have it installed, you can skip to the next section.

1. Visit https://developer.android.com/studio/index.html and click “Download Android Studio”.

2. Follow the instructions in the installer to finish the installation process.

Create Your First Project

Now that you have Android Studio fully installed, you are ready to create the Android Studio project.

1. Open Android Studio and click “Start a new Android Studio project” in the main menu
2. Name the project and select the location to create it in
3. Select the “Target Android Device”


Target Android Device

The “Target Android Device” setting specifies the minimum version of the Android SDK that your app supports. The Affectiva Android SDK supports all devices from API 16 (Android Jelly Bean) and newer, which covers 95.2% of Android devices, so we will select the same minimum SDK for our app.

4. Click “Next”
5. Select “Empty Activity”
6. Click “Finish”

The Android Studio Workspace

After creating the Android Studio project, your app development workspace will open. Android Studio will also open the “MainActivity.java” file for you, located under app/java/<package name>/.


Activities

In mobile development, a common design pattern is MVC (Model-View-Controller), which splits the application into three components: the View, the Controller, and the Model. The View contains the visual components of the app (buttons, text, etc.). The Model contains all of the app’s data. The Controller connects the two, implementing the logic and functionality of the app.

Activities are the Controllers of an Android app. They provide functionality for the Views, such as implementing what happens when a button is clicked. The Views of an Android app are XML files that describe its view components and layout parameters.

This starter app will only have the View and Controller components, and the data will be coming from the Affectiva SDK.

Your MainActivity.java will have a method called onCreate(). This method will initialize the View layout for our app and is where new View Components are typically initialized.

In this method it is necessary to call the Activity superclass’s onCreate() method and to call setContentView(), which connects the XML layout file to this Activity. The static R class contains all of the app’s resources, which is where the XML View is located.

Including the Affectiva SDK

To include the Affectiva SDK in the app, a few configuration steps add the SDK as a library for the Android compiler. Open the Gradle Scripts directory under the project navigator. There will be two build.gradle files. In the build.gradle (Module: app) file, add the compile dependency under “dependencies”:

compile('com.affectiva.android:affdexsdk:3.1.2')


In the build.gradle (Project: <App Name>) file, add the maven URL so Gradle can find the Affectiva SDK:

url "http://maven.affectiva.com"

If implemented correctly, the two build.gradle files will look like the following.

build.gradle (Project: <App Name>)

build.gradle (Module: app)
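The screenshots of the two files are not reproduced in this transcript; the following is a minimal sketch of the relevant sections, assuming a default Android Studio project layout (other entries and version numbers will vary with your setup):

```groovy
// build.gradle (Project: <App Name>) — register the Affectiva maven repository
allprojects {
    repositories {
        jcenter()
        maven { url "http://maven.affectiva.com" }
    }
}
```

```groovy
// build.gradle (Module: app) — add the SDK dependency
dependencies {
    compile fileTree(dir: 'libs', include: ['*.jar'])
    compile('com.affectiva.android:affdexsdk:3.1.2')
}
```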

Then click the “Sync Now” message at the top of the editor; this will add the Affectiva SDK to your project. If the build fails, try changing the affdex SDK version to 3.+ like this:

compile('com.affectiva.android:affdexsdk:3.+')

Then go to Project Structure -> app -> Source Compatibility and set it to 1.7.


Adding the Permissions

Android apps must declare the permissions they use so that the user can agree to allow them. For example, this app uses the phone’s camera, so it is necessary to declare camera usage in a permission presented to the user. Permissions are added to the app’s Manifest.xml file, located under app/manifests. Open the Manifest.xml file and add the following permissions under the <manifest> tag. (If you are copying this from a PDF, you will have to type out the lines for the Manifest.xml file manually, since it won’t recognize the characters from a PDF. The other files’ code can be copied and pasted.)

<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
<uses-permission android:name="android.permission.CAMERA" />

A few more XML tags are necessary for the Affectiva SDK to work properly. Add the following attribute inside the manifest tag, before the package property:

xmlns:tools="http://schemas.android.com/tools"

Add the following three attributes inside the application tag and delete any conflicting properties that already exist there:

android:label="@string/app_name"
android:allowBackup="false"
tools:replace="android:allowBackup,android:label"

The end result Manifest file will look like the following.
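The original screenshot of the finished manifest is not reproduced in this transcript; below is a minimal sketch of how the pieces fit together. The package name com.example.myapp, the icon, and the theme are illustrative placeholders; yours will differ with your project.

```xml
<?xml version="1.0" encoding="utf-8"?>
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:tools="http://schemas.android.com/tools"
    package="com.example.myapp">

    <uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
    <uses-permission android:name="android.permission.INTERNET" />
    <uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
    <uses-permission android:name="android.permission.CAMERA" />

    <application
        android:icon="@mipmap/ic_launcher"
        android:theme="@style/AppTheme"
        android:label="@string/app_name"
        android:allowBackup="false"
        tools:replace="android:allowBackup,android:label">
        <activity android:name=".MainActivity">
            <intent-filter>
                <action android:name="android.intent.action.MAIN" />
                <category android:name="android.intent.category.LAUNCHER" />
            </intent-filter>
        </activity>
    </application>

</manifest>
```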


Building the View for the App

This sample app uses an Affectiva component called the Camera Detector, which reads the user’s face and returns information about it in real time. The Camera Detector needs to be embedded in a special view component called a SurfaceView. To add the SurfaceView to the XML layout file, open the file located under app/res/layout/activity_main.xml. When opening the XML file you might get a “Rendering Problems” error, which is a bug in Android Studio. If you do not get this error, go ahead and continue.


To fix the rendering error, click the Android alien icon in the toolbar above the layout view and select API 16 (the minimum SDK our app supports).

This should reveal the phone screen with the “Hello World” TextView. Click the Hello World text and press Delete on your keyboard. Now drag a SurfaceView component onto the phone screen; you will find it in the palette to the left of the phone screen, toward the bottom of the list of view components.


Now give that SurfaceView an id so that it can be referenced from the Activity through the app’s resource class.
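The layout screenshot is not reproduced in this transcript; a minimal sketch of the resulting layout file might look like the following. The RelativeLayout root is an assumption based on the default project template, and the id matches the one referenced later in the Activity code:

```xml
<!-- app/res/layout/activity_main.xml -->
<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="match_parent"
    android:layout_height="match_parent">

    <SurfaceView
        android:id="@+id/cameraDetectorSurfaceView"
        android:layout_width="match_parent"
        android:layout_height="match_parent" />

</RelativeLayout>
```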


Initializing the Camera Detector After creating the Surface View in the layout file we need to have a reference to it in the Activity class. This is done in the same way we implemented the setContentView() method in the Activity’s onCreate() method.

Hint: To use classes such as CameraDetector from the Affectiva SDK, you will need to import them at the top of the class. This can be done by importing them manually, or by clicking on the class name in the editor and using the “Option+Enter” shortcut on Mac and “Alt+Enter” on Windows.

Here is an example of initializing the Camera Detector with the SurfaceView. Each of the numbers commented in the code below is described in the numbered list.

1. Create instance variables for the SurfaceView and CameraDetector references
2. Specify the max processing rate used by the Camera Detector (this is in FPS, frames per second)
3. Grab the reference to the SurfaceView by calling the Activity’s findViewById() method, which returns the View object from the Activity’s layout for the given id. The id is retrieved from the resource class (R); this is where the id we gave the SurfaceView when we created it comes in handy. Then cast the result to SurfaceView, since findViewById() returns a general View type; the cast is safe because we know the object with this id is in fact a SurfaceView.
4. Initialize the CameraDetector, passing in: “this” (a reference to the Activity), the phone camera to use, and the SurfaceView to embed the detector in
5. Set the processing rate
6. Set the MainActivity class as the listener for the Camera Detector (this will cause an error that we will fix in the next section, “Implementing the Camera Detector”)
7. Specify that we want the Camera Detector to detect all emotions
8. Tell the Camera Detector to start detecting

//1
SurfaceView cameraDetectorSurfaceView;
CameraDetector cameraDetector;
//2
int maxProcessingRate = 10;

@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.activity_main);
    //3
    cameraDetectorSurfaceView = (SurfaceView) findViewById(R.id.cameraDetectorSurfaceView);
    //4
    cameraDetector = new CameraDetector(this, CameraDetector.CameraType.CAMERA_FRONT, cameraDetectorSurfaceView);
    //5
    cameraDetector.setMaxProcessRate(maxProcessingRate);
    //6
    cameraDetector.setImageListener(this);
    cameraDetector.setOnCameraEventListener(this);
    //7
    cameraDetector.setDetectAllEmotions(true);
    //8
    cameraDetector.start();
}

Implementing the Camera Detector

The Camera Detector sends emotion detection data to callback methods. The Affectiva SDK requires implementing these callbacks to describe what happens when a face is recognized. Two interfaces need to be implemented in order to create a basic camera detector.


CameraEventListener

The Camera Event Listener requires implementation of the method:

onCameraSizeSelected(int cameraHeight, int cameraWidth, Frame.ROTATE rotation)

This method is where the camera view’s height and width are specified. The SDK provides the cameraWidth, cameraHeight, and rotation parameters, which give the recommended sizing for the Camera Detector that works best with the Affectiva SDK and the current orientation of the phone. Therefore onCameraSizeSelected() is where the sizing of the Camera Detector is set.

ImageListener

The Image Listener requires implementation of the method:

onImageResults(List<Face> faces, Frame frame, float timeStamp)

This method delivers the main information for Affectiva’s emotion detection. It provides a list of Face objects, each of which wraps all sorts of information about a face (levels of emotions, expressions, characteristics, etc.); the Frame object, a wrapper for the captured image; and a timestamp of when the frame was captured.

Implement these two interfaces in the MainActivity class like so:

public class MainActivity extends AppCompatActivity implements CameraDetector.CameraEventListener, CameraDetector.ImageListener

This will require you to implement the two interface methods. Below are two basic implementations of the methods to give a general depiction of their uses. Each of the numbers commented in the code below is described in the numbered list.

onCameraSizeSelected()

1. Grab the Layout Parameters from the SurfaceView (every view component in Android has Layout Parameters)
2. Change the parameters’ height and width to the recommended sizing given by the Affectiva SDK
3. Set the SurfaceView’s Layout Params with the new sizing


/**
 * Sizing the Camera Detector container for what works best with the Affectiva SDK
 */
@Override
public void onCameraSizeSelected(int cameraHeight, int cameraWidth, Frame.ROTATE rotation) {
    //1
    ViewGroup.LayoutParams params = cameraDetectorSurfaceView.getLayoutParams();
    //2
    params.height = cameraHeight;
    params.width = cameraWidth;
    //3
    cameraDetectorSurfaceView.setLayoutParams(params);
}

onImageResults()

1. Check if the frame was processed
2. Check if any faces are currently recognized
3. Grab the first face
4. Grab any emotion data from the processed frame. This is where the actual implementation of the facial expressions will occur; the Face object provides all the different information that can be used.
5. Print the levels of emotion (the scale goes from 0 to 100)

/**
 * Process image results from the Affectiva SDK
 */
@Override
public void onImageResults(List<Face> faces, Frame frame, float timeStamp) {
    //1
    if (faces == null) return; // frame was not processed
    //2
    if (faces.size() == 0) return; // no face found
    //3
    Face face = faces.get(0);
    //4
    float joy = face.emotions.getJoy();
    float anger = face.emotions.getAnger();
    float surprise = face.emotions.getSurprise();
    //5
    System.out.println("Joy: " + joy);
    System.out.println("Anger: " + anger);
    System.out.println("Surprise: " + surprise);
}

Running the App on a Device

Take a look at the Android Studio instructions for running your app on an Android device: Android Developer Documentation on Running Apps on an Android Device.

*Running on an Android device requires that you grant the app permission to access the camera and local phone storage. It is possible that these permissions will reset after each run, so double-check that the necessary permissions are granted if the app is not running properly.
*If developing on Windows or Linux, this will potentially require a bit more setup.

Further Reading

For more information on the Affectiva Android SDK, refer to the following links:

Getting Started with the Affectiva SDK in Android (Affectiva Tutorial)


JavaDocs for the Affectiva Android SDK
Exploring Android Studio

Potential Errors

The following are errors and solutions that you might run into while following this tutorial.

Error: Execution failed for task ':app:processDebugManifest'
Solution: Open the “build.gradle (Module: app)” file and change the minSdkVersion to 16.
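As a sketch, the relevant section of build.gradle (Module: app) would then look something like this; the application id and the compile/target SDK versions shown are illustrative placeholders, only minSdkVersion 16 comes from the fix above:

```groovy
android {
    compileSdkVersion 25                   // illustrative; use your installed SDK version
    defaultConfig {
        applicationId "com.example.myapp"  // hypothetical package name
        minSdkVersion 16                   // the value required by the fix above
        targetSdkVersion 25                // illustrative
    }
}
```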

Tutorial Written by Thomas Oropeza and Beste Filiz Yuksel

