
Running Face Recognition on Kubernetes

3 Aug 2021 · CPOL · 3 min read
In this next article of the series, we’ll show how to run the face recognition servers on Kubernetes.
Here, we cover practical cases where Kubernetes is needed to scale and deploy an AI solution in a real production environment, and show how to run the containerized API in a local Kubernetes cluster.


Face recognition is one area of artificial intelligence (AI) where the modern approaches of deep learning (DL) have had great success during the last decade. The best face recognition systems can recognize people in images and video with the same precision humans can – or even better.

Our series of articles on this topic is divided into two parts:

  • Face detection, where the client-side application detects human faces in images or in a video feed, aligns the detected face pictures, and submits them to the server.
  • Face recognition (this part), where the server-side application performs face recognition.

We assume that you are familiar with DNNs, Python, Keras, and TensorFlow. You are welcome to download this project code ...

In one of the previous articles, we learned how to run the face identification server in a Docker container. Containerization allows us to develop and test the application in a predefined environment and then deploy the software with ease. In this article, we’ll run our face recognition web server in a local Kubernetes cluster.

Kubernetes is a set of services designed to manage an orchestrated cluster of Docker containers. Kubernetes simplifies the software deployment, network routing, and server load balancing. This is what you would want to use when deploying a facial recognition system in a production environment, where you need to scale the face identification server to serve many client applications (edge devices running face detection).
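To make the load-balancing idea concrete, here is a toy round-robin dispatcher. It is only a conceptual sketch of what a load-balanced service does for the face identification servers; the real kube-proxy routing is more sophisticated, and the endpoint names here are made up.

```python
from itertools import cycle

class RoundRobinBalancer:
    """Conceptual sketch of a load balancer: spread incoming requests
    across replica endpoints in turn. (Kubernetes' actual behavior is
    more involved; this only illustrates the idea.)"""

    def __init__(self, endpoints):
        self._cycle = cycle(endpoints)

    def route(self, request):
        # Pick the next endpoint and pair it with the request.
        return next(self._cycle), request

lb = RoundRobinBalancer(["pod-1:50", "pod-2:50"])
targets = [lb.route(f"face-{i}")[0] for i in range(4)]
print(targets)  # alternates between the two pods
```

With two replicas, consecutive requests alternate between the pods, which is roughly the even distribution we will observe later in the experiment.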

Modifying the Container for Kubernetes

To run our AI face recognition container on Kubernetes, we need to modify it slightly. When the container starts, it must run the server-side Python application we created. Let’s create a new Docker image using a Dockerfile, a text file with instructions for building a new image from an existing one. In our case, the file contains only four lines:

FROM sergeylgladkiy/fr:v1

RUN rm /home/pi_fr/rec/*

EXPOSE 50

CMD ["python", "/home/pi_fr/", "", "50"]
  • Line 1: Specify the base image.
  • Line 2: Clean up the /home/pi_fr/rec/ directory.
  • Line 3: Expose container port 50.
  • Line 4: Run the Python application with the specified parameters.

Now we can build a new image via the command line. Put the created Dockerfile in the current directory of the terminal and execute the following command:

docker build -t "frrun:v1" .

This creates the image named frrun with the tag v1. At the start of the container, the Python code is executed, and the face recognition web server gets ready to receive face images.
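With the container running locally (for example, via `docker run -p 50:50 frrun:v1`), a client can submit face images to it over HTTP. The snippet below is a minimal sketch only: the `/recognize` route and the JPEG-over-POST protocol are assumptions for illustration, not the actual API of the server application.

```python
import urllib.request

def recognize_url(host: str, port: int) -> str:
    # Build the endpoint URL; the "/recognize" route is hypothetical.
    return f"http://{host}:{port}/recognize"

def send_face(jpeg_bytes: bytes, host: str = "localhost", port: int = 50) -> str:
    # POST one aligned face image and return the server's text reply.
    req = urllib.request.Request(
        recognize_url(host, port),
        data=jpeg_bytes,
        headers={"Content-Type": "image/jpeg"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode()
```

The same client code works unchanged once the server sits behind a Kubernetes service; only the host and port change.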

Installing Kubernetes

The next step is to install Kubernetes. Go to the Settings/Kubernetes tab in Docker Desktop and select the Enable Kubernetes check box.

(Screenshot: enabling Kubernetes in the Docker Desktop settings)

Because we are going to run a web application, we must create a service and a deployment for Kubernetes. We can put both of these into a single YAML file:

apiVersion: v1
kind: Service
metadata:
  name: face-rec-service
spec:
  selector:
    app: face-rec
  ports:
  - protocol: "TCP"
    port: 5050
    targetPort: 50
  type: LoadBalancer

---

apiVersion: apps/v1
kind: Deployment
metadata:
  name: face-rec
spec:
  selector:
    matchLabels:
      app: face-rec
  replicas: 2
  template:
    metadata:
      labels:
        app: face-rec
    spec:
      containers:
      - name: face-rec
        image: frrun:v1
        imagePullPolicy: Never
        ports:
        - containerPort: 50

The YAML file describes the load-balanced face-rec-service service with the appropriate ports and the face-rec deployment that runs two container replicas based on the frrun:v1 image.
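Two details make the manifests cooperate: the service’s selector must match the pod template’s labels, and its targetPort must match the container’s port. A small Python sanity check of that wiring (the dictionaries are simplified stand-ins for the YAML fields, not a Kubernetes API):

```python
# Client traffic hits the Service on port 5050; the Service forwards it to
# targetPort 50 on any pod whose labels match its selector; the container
# listens on containerPort 50 (the port exposed in the image).
service = {"port": 5050, "targetPort": 50, "selector": {"app": "face-rec"}}
deployment = {"labels": {"app": "face-rec"}, "containerPort": 50, "replicas": 2}

def wired_correctly(svc: dict, dep: dict) -> bool:
    """True when the selector matches the pod labels and the Service's
    target port matches the container port."""
    return (svc["selector"] == dep["labels"]
            and svc["targetPort"] == dep["containerPort"])

print(wired_correctly(service, deployment))  # True
```

If either pair disagrees, the service simply finds no endpoints and requests go unanswered, so this is worth double-checking before deploying.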

Running the Container on Kubernetes

Now we can run our system on Kubernetes with the following command:

kubectl apply -f C:\PI_FR\fr_deploy.yaml

After issuing the command, look at the Containers/Apps tab in Docker Desktop. You’ll see the running Kubernetes pods.

(Screenshot: the Kubernetes pods running in Docker Desktop)

Let’s test how our load-balanced service works and manages the deployed face recognition web application. We’ll run client applications on two PCs. Each client application will detect faces in a video file and send the face images to the same IP address. We can see the recognition results in the logs of each of the two containers. Once the process has finished, we can open terminals in the containers and analyze the results. Here are the lists of resulting images for the two containers.

(Screenshots: the lists of resulting face images in the two containers)

The results show that the load balancer distributed the requests almost evenly across the two pods: the first container received and processed 252 face images, while the second processed 223. You can see how easy it is to manage our servers with Kubernetes and scale the system to any number of client applications.
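As a quick sanity check on “almost evenly,” the observed counts translate into per-pod shares like so (a throwaway calculation, not part of the deployment):

```python
counts = {"pod-1": 252, "pod-2": 223}  # face images processed per container
total = sum(counts.values())           # 475 requests in total
shares = {pod: round(100 * n / total, 1) for pod, n in counts.items()}
print(shares)  # pod-1 handled ~53.1% of the requests, pod-2 ~46.9%
```

A roughly 53/47 split over a few hundred requests is well within what you would expect from a load balancer distributing independent requests across two replicas.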

Next Step

In the next article, we’ll discuss some aspects of developing a facial recognition system from scratch. Stay tuned!

This article is part of the series “Hybrid Edge AI for Facial Recognition.”


This article, along with any associated source code and files, is licensed under The Code Project Open License (CPOL)

Written By
Team Leader, VIPAKS
Russian Federation

Master’s degree in Mechanics. PhD in Mathematics and Physics. 15 years’ experience in developing scientific programs (C#, C++, Delphi, Java, Fortran). Areas of expertise: mathematical modeling, symbolic computer algebra, numerical methods, 3D geometry modeling, artificial intelligence, differential equations, boundary value problems.
