
Lightning Detection with Deep Learning and Tensorflow on Android: Setting Up a TFLite Model

16 Nov 2020 · CPOL · 3 min read
In this article, we'll set up the TFLite model in the Android environment and create a working demo application.
We'll look at the MainActivity.java file, the graphic overlay drawn on the portion of the screen where lightning is detected in camera mode, and the other class files.

Introduction

In the previous article, we walked through the basic setup of the model-based app in the Android environment.

The MainActivity.java File

This is a very important file - the one that facilitates the user's interaction with our Android app. We'll walk through this file, if not line by line, then chunk by chunk.

This file is located in the com.ruturaj.detectlightning package. The com.ruturaj.detectlightning.mlkit package contains all the class files required for object detection.

Let’s start with the packages:

Java
package com.ruturaj.detectlightning;

import android.content.Context;
import android.content.pm.PackageInfo;
import android.content.pm.PackageManager;
import android.os.Bundle;
import android.util.Log;
import android.widget.Toast;

import androidx.annotation.NonNull;
import androidx.appcompat.app.AppCompatActivity;
import androidx.core.app.ActivityCompat;
import androidx.core.content.ContextCompat;

import com.google.mlkit.common.model.LocalModel;
import com.google.mlkit.vision.objects.custom.CustomObjectDetectorOptions;
import com.ruturaj.detectlightning.mlkit.CameraSource;
import com.ruturaj.detectlightning.mlkit.CameraSourcePreview;
import com.ruturaj.detectlightning.mlkit.GraphicOverlay;
import com.ruturaj.detectlightning.mlkit.ObjectDetectorProcessor;
import com.ruturaj.detectlightning.mlkit.PreferenceUtils;

import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

public class MainActivity extends AppCompatActivity implements ActivityCompat.OnRequestPermissionsResultCallback {

Some important variables and strings are defined at the class level:

Java
private static final String OBJECT_DETECTION_CUSTOM = "Lightning";

private static final String TAG = "MainActivity";
private static final int PERMISSION_REQUESTS = 1;

private CameraSource cameraSource = null;
private CameraSourcePreview preview;
private GraphicOverlay graphicOverlay;
private String selectedModel = OBJECT_DETECTION_CUSTOM;

If you are familiar with the Android lifecycle, I bet you know the following method!

Java
@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.activity_main);

Next, we set up the preview for the live stream from the back camera.

Java
preview = findViewById(R.id.preview_view);
if (preview == null) {
    Log.e(OBJECT_DETECTION_CUSTOM, "Preview is null");
}

Next, we draw a graphic overlay (in this case, a bounding box) on the portion of the screen where we detect lightning while in camera mode.

Java
graphicOverlay = findViewById(R.id.graphic_overlay);
if (graphicOverlay == null) {
    Log.e(OBJECT_DETECTION_CUSTOM, "graphicOverlay is null");
}

Next, we check if we already have the permissions our app needs and if not, we ask for them at runtime:

Java
    if (allPermissionsGranted()) {
        createCameraSource(selectedModel);
    } else {
        getRuntimePermissions();
    }
}

The CameraSource class is called from createCameraSource() to get the camera view onto the screen using our model.

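Here is a minimal sketch of how createCameraSource() opens, assuming the ML Kit quickstart structure this class follows; the CameraSource constructor arguments are an assumption, and the method continues with the try block shown next.

Java
private void createCameraSource(String model) {
    // Create the camera source once, bound to the graphic overlay so detection
    // results can be drawn on top of the preview (assumed constructor signature).
    if (cameraSource == null) {
        cameraSource = new CameraSource(this, graphicOverlay);
    }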

To load the model stored in the assets folder, we build a LocalModel that points at it. CustomObjectDetectorOptions is typically used for live detection of the most prominent objects in the camera's field of view.

Java
    try {
        if (OBJECT_DETECTION_CUSTOM.equals(model)) {
            Log.e(OBJECT_DETECTION_CUSTOM, "Using Custom Object Detector Processor");
            LocalModel localModel =
                    new LocalModel.Builder()
                            .setAssetFilePath("model.tflite")
                            .build();
            CustomObjectDetectorOptions customObjectDetectorOptions =
                    PreferenceUtils.getCustomObjectDetectorOptionsForLivePreview(this, localModel);
            cameraSource.setMachineLearningFrameProcessor(
                    new ObjectDetectorProcessor(this, customObjectDetectorOptions));
        } else {
            Log.e(OBJECT_DETECTION_CUSTOM, "Unknown model: " + model);
        }
    } catch (Exception e) {
        Log.e(OBJECT_DETECTION_CUSTOM, "Can not create image processor: " + model, e);
        Toast.makeText(
             getApplicationContext(),
                "Can not create image processor: " + e.getMessage(),
                Toast.LENGTH_LONG)
                .show();
    }
}
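
PreferenceUtils.getCustomObjectDetectorOptionsForLivePreview() isn't listed here. As a rough illustration only (not the project's actual implementation), live-preview options are typically assembled with the ML Kit builder along these lines; the threshold and label count below are example values:

Java
// Illustrative sketch of the kind of options returned for a live preview.
CustomObjectDetectorOptions customOptions =
        new CustomObjectDetectorOptions.Builder(localModel)
                .setDetectorMode(CustomObjectDetectorOptions.STREAM_MODE)  // process live camera frames
                .enableClassification()                                    // attach labels such as "Lightning"
                .setClassificationConfidenceThreshold(0.5f)                // example threshold, not the app's value
                .setMaxPerObjectLabelCount(1)                              // one label per detected object
                .build();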

The camera starts as soon as permissions have been granted, and the app shows the preview.

Java
private void startCameraSource() {
    if (cameraSource != null) {
        try {
            if (preview == null) {
                Log.e(TAG, "resume: Preview is null");
            }
            if (graphicOverlay == null) {
                Log.e(TAG, "resume: graphOverlay is null");
            }
            preview.start(cameraSource, graphicOverlay);
        } catch (IOException e) {
            Log.e(TAG, "Unable to start camera source.", e);
            cameraSource.release();
            cameraSource = null;
        }
    }
}

The following code handles the Android UI lifecycle in case of an interruption; when the activity resumes, we recreate and restart the camera source:

Java
@Override
public void onResume() {
    super.onResume();
    Log.e(TAG, "onResume");
    createCameraSource(selectedModel);
    startCameraSource();
}

App pauses are handled as follows:

Java
@Override
protected void onPause() {
    super.onPause();
    preview.stop();
}

When the app is killed, the activity is destroyed and we release the camera source:

Java
@Override
public void onDestroy() {
    super.onDestroy();
    if (cameraSource != null) {
        cameraSource.release();
    }
}

The remaining helper methods read the permissions declared in the manifest, check whether each one has already been granted, and request any that are missing at runtime:

Java
private String[] getRequiredPermissions() {
    try {
        // Read the permissions declared in AndroidManifest.xml.
        PackageInfo info =
                this.getPackageManager()
                        .getPackageInfo(this.getPackageName(), PackageManager.GET_PERMISSIONS);
        String[] ps = info.requestedPermissions;
        if (ps != null && ps.length > 0) {
            return ps;
        } else {
            return new String[0];
        }
    } catch (Exception e) {
        return new String[0];
    }
}

private boolean allPermissionsGranted() {
    for (String permission : getRequiredPermissions()) {
        if (!isPermissionGranted(this, permission)) {
            return false;
        }
    }
    return true;
}

private void getRuntimePermissions() {
    // Collect the permissions that haven't been granted yet and request them in one batch.
    List<String> allNeededPermissions = new ArrayList<>();
    for (String permission : getRequiredPermissions()) {
        if (!isPermissionGranted(this, permission)) {
            allNeededPermissions.add(permission);
        }
    }

    if (!allNeededPermissions.isEmpty()) {
        ActivityCompat.requestPermissions(
                this, allNeededPermissions.toArray(new String[0]), PERMISSION_REQUESTS);
    }
}

@Override
public void onRequestPermissionsResult(
        int requestCode, @NonNull String[] permissions, @NonNull int[] grantResults) {
    Log.i(TAG, "Permission granted!");
    if (allPermissionsGranted()) {
        createCameraSource(selectedModel);
    }
    super.onRequestPermissionsResult(requestCode, permissions, grantResults);
}

private static boolean isPermissionGranted(Context context, String permission) {
    if (ContextCompat.checkSelfPermission(context, permission)
            == PackageManager.PERMISSION_GRANTED) {
        Log.i(TAG, "Permission granted: " + permission);
        return true;
    }
    Log.i(TAG, "Permission NOT granted: " + permission);
    return false;
}

}

You'll see quite a few instances of Log.e() in the code. This is just to make these messages stand out during Logcat inspection: error-level traces appear in red, so they're easy to spot.

Other Class Files

BitmapUtils: Utility functions for bitmap conversions.
CameraImageGraphic: Draws the camera image in the background.
CameraSource: Manages the camera and allows UI updates on top of it (e.g., overlaying extra graphics or displaying extra information). It receives preview frames from the camera at a specified rate and sends these frames to the child classes' detectors/classifiers as fast as it can process them.
CameraSourcePreview: Previews the camera image on the screen.
FrameMetaData: Contains additional information about the frame.
GraphicOverlay: Renders a series of custom graphics to be overlaid on top of a preview (i.e., the camera preview). The creator can add graphics objects, update these objects, and remove them, triggering the appropriate drawing and invalidation within the view.
InferenceInfoGraphic: A graphic instance for rendering inference info (latency, FPS, resolution) in an overlay view.
ObjectDetectorProcessor: A processor that runs an object detector.
ObjectGraphic: Draws the detected object info in the preview (a simplified sketch of such a graphic follows this list).
PreferenceUtils: A utility class for retrieving shared preferences.
ScopedExecutor: Wraps an existing executor to provide a method for subsequent cancellation of submitted Runnables.
VisionImageProcessor: An interface for processing images with the various vision detectors and custom image models.
VisionProcessorBase: The abstract base class for vision frame processors. Subclasses implement it to define what to do with the detection results and to specify the detector object.
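
As a concrete illustration of how these pieces fit together, here is a hypothetical, heavily simplified graphic in the spirit of ObjectGraphic. It assumes a quickstart-style GraphicOverlay.Graphic base class with a constructor that takes the overlay and a draw(Canvas) callback; the project's real ObjectGraphic likely also handles coordinate mapping and label text.

Java
import android.graphics.Canvas;
import android.graphics.Color;
import android.graphics.Paint;
import android.graphics.RectF;

// Hypothetical example: outlines a detected object's bounding box on the overlay.
class BoundingBoxGraphic extends GraphicOverlay.Graphic {
    private final RectF box;
    private final Paint boxPaint = new Paint();

    BoundingBoxGraphic(GraphicOverlay overlay, RectF box) {
        super(overlay);                        // register with the overlay (assumed base constructor)
        this.box = box;
        boxPaint.setColor(Color.YELLOW);
        boxPaint.setStyle(Paint.Style.STROKE);
        boxPaint.setStrokeWidth(5.0f);
    }

    @Override
    public void draw(Canvas canvas) {
        // Draw the bounding box in view coordinates onto the overlay's canvas.
        canvas.drawRect(box, boxPaint);
    }
}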

Next Steps

In the next article, we’ll carry out real-time testing of our app on an Android device.

This article is part of the series 'Live Lightning Detection with Deep Learning and Tensorflow on Android'.

License

This article, along with any associated source code and files, is licensed under The Code Project Open License (CPOL)

