
CodeProject.AI Server: AI the easy way.

29 Feb 2024 · 16 min read
Version 2.6.2. Our fast, free, self-hosted Artificial Intelligence Server for any platform, any language
CodeProject.AI Server is a locally installed, self-hosted, fast, free and Open Source Artificial Intelligence server for any platform, any language. No off-device or out-of-network data transfer, no messing around with dependencies, and it can be called from any language on any platform. It runs as a Windows Service or a Docker container.


CodeProject.AI Server: An Artificial Intelligence Server

For those who want to integrate AI functionality into their applications without writing the AI code themselves, or dealing with the insanely painful task of ensuring everything is set up correctly: CodeProject.AI Server manages your MLOps for you.

Think of CodeProject.AI Server like a database server: you install it, it runs in the background, and provides AI operations for any application via a simple API. The AI operations are handled by drop-in modules that can be easily created using any language, any stack, as long as that stack runs on the host machine. Python, .NET, node - whatever works for you.

CodeProject.AI server runs as a Windows service, under systemd in Linux, or on startup on macOS. Alternatively there are multiple Docker images for x64, arm64 and CUDA enabled systems. Any language that can make HTTP calls can access the service, and the server does not require an external internet connection. Your data stays in your network.

Supported platforms: Windows, macOS, macOS-arm64, Ubuntu, Debian, Raspberry Pi, Orange Pi, Jetson Nano, Docker

What Does It Do?

[Image: The CodeProject.AI Server's Dashboard]

Currently CodeProject.AI Server contains AI modules that provide:

  • Object Detection (Python and .NET versions that use YOLO, plus a TensorFlow Lite module that's ultra-lightweight and great for Raspberry Pi and Coral USB sticks)
  • Face detection and recognition
  • Text processing such as sentiment analysis and summarization
  • Image processing such as background removal, background blur, cartoonisation and resolution enhancement
  • Model training, including dataset acquisition, for YOLO object detection

How Do I Use It?

Install the server and start making calls to the API. It's that easy.

Guides, Help, FAQs

CodeProject.AI Server · Home Assistant · Blue Iris

[Image: The CodeProject.AI Server's Explorer in action]

Why We Built CodeProject.AI Server

  • AI programming is something every single developer should be aware of

    We wanted a fun project we could use to help teach developers and get them involved in AI. We'll be using CodeProject.AI Server as a focus for articles and exploration to make it fun and painless to learn AI programming.

    We want your contributions!

  • AI coding examples have too many moving parts

    You need to install packages and languages and extensions to tools, and then updates and libraries (but version X, not version Y) and then you have to configure paths and...Oh, you want to run on Windows not Linux? In that case, you need to... It's all too hard. There was much yelling at CodeProject.

    CodeProject.AI Server includes everything you need in a single installer. CodeProject.AI Server also provides an installation script that will set up your dev environment and get you debugging within a couple of clicks.

  • AI solutions often require the use of cloud services

    If you trust the cloud provider, understand the billing structure, and can be sure you aren't sending sensitive data or blowing past the free tier, this is fine. If you have a webcam inside your house, or can't work out how much AWS will charge you, it's not so OK.

    CodeProject.AI Server can be installed locally. Your machine, your network, no data needs to leave your device.

1: Running and Playing With the Features

  1. Install and Run
    1. For a Windows Service, download the latest version, install, and launch the shortcut to the server's dashboard on your desktop or open a browser to http://localhost:32168.

      If you wish to take advantage of a CUDA-enabled NVIDIA GPU, please ensure you have the CUDA drivers installed before you install CodeProject.AI. We recommend CUDA 11.8 if running Windows.

    2. For a Docker Container for 64 Bit Linux, run:
      docker run -p 32168:32168 --name CodeProject.AI -d codeproject/ai-server

      For Docker GPU (supports NVIDIA CUDA), please use:

      docker run --gpus all -p 32168:32168 --name CodeProject.AI -d codeproject/ai-server:cuda11_7
  2. On the dashboard, at the top, is a link to the demo playground. Open that and play!
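
Once the server is running, you can confirm it is reachable from code before wiring it into an application. Below is a minimal sketch using Node.js 18+ (which has fetch built in) and the default port 32168; adjust the port if you changed it at install time.

JavaScript
// Node.js 18+ (fetch is built in). Checks that the server answers on the default port.
fetch("http://localhost:32168")
    .then(response => console.log(`CodeProject.AI Server responded: HTTP ${response.status}`))
    .catch(err => console.error("Server not reachable:", err.message));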

2: Running and Debugging the Code

  1. Clone the CodeProject CodeProject.AI Server repository.
  2. Make sure you have Visual Studio Code or Visual Studio 2019+ installed.
  3. Run the setup script in /src
  4. Debug the front-end server application (see notes below, but it's easy).

3: Using CodeProject.AI Server in My Application

Here's an example of using the API for scene detection using a simple JavaScript call:

HTML
<html>
<body>
Detect the scene in this file: <input id="image" type="file" />
<input type="button" value="Detect Scene" onclick="detectScene(image)" />

<script>
function detectScene(fileChooser) {
    var formData = new FormData();
    formData.append('image', fileChooser.files[0]);

    fetch('http://localhost:32168/v1/vision/detect/scene', {
        method: "POST",
        body: formData
    })
    .then(response => {
        if (response.ok) response.json().then(data => {
            console.log(`Scene is ${data.label}, ${data.confidence} confidence`)
        });
    });
}
</script>
</body>
</html>

You can include the CodeProject.AI Server installer (or just a link to the latest version of the installer) in your own apps and installers and voilà, you have an AI-enabled app.

See the API documentation for a complete rundown of functionality.
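
Object detection follows the same pattern as the scene detection call above, just with a different route. Here is a minimal sketch in Node.js 18+; the /v1/vision/detection route and the predictions array in the response follow the API documentation, and the file name street.jpg is only an example.

JavaScript
// Node.js 18+: fetch, FormData and Blob are built in.
const fs = require("fs");

async function detectObjects(imagePath) {
    const form = new FormData();
    // Send the image bytes as a file part named "image", as the API expects.
    form.append("image", new Blob([fs.readFileSync(imagePath)]), "image.jpg");

    const response = await fetch("http://localhost:32168/v1/vision/detection", {
        method: "POST",
        body: form
    });
    const data = await response.json();

    // Each prediction contains a label, a confidence score and a bounding box.
    for (const p of data.predictions ?? []) {
        console.log(`${p.label}: ${(p.confidence * 100).toFixed(1)}%`);
    }
}

detectObjects("street.jpg");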

Notes on the installers

The native installers (Windows, Ubuntu and macOS) all install the server as a service. On Windows it's a Windows Service, on Ubuntu it uses systemd, and on macOS it's simply a login item, so it will start each time you log in.

For all platforms, open http://localhost:32168 to view the dashboard.

To uninstall, please take note of the instructions when you install. For reference:

  • Windows uses the standard Windows installer, so use the Control Panel / Apps and Features applet to manage the installation.
     
  • Ubuntu uses dpkg, so to uninstall simply call
    Bash
    sudo dpkg -r codeproject.ai-server
  • macOS uninstall is via the command line
    Shell
    sudo bash "/Library/CodeProject.AI Server/<version>/uninstall.sh"

Notes on CUDA and Nvidia Support

If you have a CUDA enabled Nvidia card, please ensure you:

  1. Install the CUDA drivers (we recommend CUDA 11.7, or CUDA 11.8 if running Windows).
  2. Install CUDA Toolkit 11.8.
  3. Download and run our cuDNN install script to install cuDNN 8.9.4.

Nvidia downloads and drivers are challenging! Please ensure you download a driver that is compatible with CUDA 11.7+, which generally means the CUDA driver version 516.94 or below. Version 522.x or above may not work. You may need to refer to the release notes for each driver to confirm.

Our Docker images are based on CUDA 11.7 (for legacy reasons) and 12.2. As long as you have a driver installed that can handle 11.7 or 12.2, the Docker image will interface with your drivers and work fine.

CUDA 12.2 brings a few challenges with code that uses PyTorch due to the move to Torch 2.0, so we tend to favour 11.7. Some older cards will not be compatible with CUDA 12, or even CUDA 11.7. If you are struggling with older cards that don't support CUDA 11.7 then post a comment and we'll try and help.

Since we are using CUDA 11.7+ (which has support for compute capability 3.7 and above), we can only support Nvidia CUDA cards that are equal to or better than a GK210 or Tesla K80 card. Please refer to this table of supported cards to determine if your card has compute capability 3.7 or above.

Newer cards, such as the GTX 10xx series, the RTX 20xx and 30xx series, and the MX series, are fully supported.

AI is a memory-intensive operation. Some cards with 2GB of RAM or less may struggle in some situations. Using the dashboard, you can either disable modules you don't need, or disable GPU support entirely for one or more modules. This will free up memory and help get you back on track.

What Does It Include?

CodeProject.AI Server includes:

  • An HTTP REST API server. The server listens for requests from other apps, passes them to the backend analysis services for processing, and then passes the results back to the caller. It runs as a simple self-contained web service on your device.
  • Backend analysis services. The brains of the operation are the analysis services sitting behind the front-end API. All processing of data is done on the current machine. No calls to the cloud and no data leaving the device.
  • The source code, naturally.

CodeProject.AI Server can currently

  • Detect objects in images
  • Detect faces in images
  • Detect the type of scene represented in an image
  • Recognise faces that have been registered with the service
  • Perform detection on custom models
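
Detection on a custom model uses the same request shape; only the route changes to name the model. A minimal sketch is below; the /v1/vision/custom/<model-name> route is as described in the API documentation, and "my-model" is a placeholder for a custom model actually installed on your server.

JavaScript
// Node.js 18+. Same request shape as standard object detection, but targeting a named custom model.
const fs = require("fs");

async function detectWithCustomModel(imagePath, modelName) {
    const form = new FormData();
    form.append("image", new Blob([fs.readFileSync(imagePath)]), "image.jpg");

    const response = await fetch(`http://localhost:32168/v1/vision/custom/${modelName}`, {
        method: "POST",
        body: form
    });
    return response.json();   // contains a predictions array, as in standard object detection
}

detectWithCustomModel("backyard.jpg", "my-model").then(data => console.log(data.predictions));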

The development environment also provides modules that can

  • Remove a background from an image
  • Blur a background from an image
  • Enhance the resolution of an image
  • Pull out the most important sentences in text to generate a text summary
  • Provide sentiment analysis on text

We will be constantly expanding the feature list.

Our Goals

  • To promote AI development and inspire the AI developer community to dive in and have a go. Artificial Intelligence is a huge paradigm change in the industry and all developers owe it to themselves to experiment in and familiarize themselves with the technology. CodeProject.AI Server was built as a learning tool, a demonstration, and a library and service that can be used out of the box.
  • To make AI development easy. It's not that AI development is that hard. It's that there are so, so many options. Our architecture is designed to allow any AI implementation to find a home in our system, and for our service to be callable from any language.
  • To focus on core use-cases. We're deliberately not a solution for everyone. Instead, we're a solution for common day-to-day needs. We will be adding dozens of modules and scores of AI capabilities to our system, but our goal is always clarity and simplicity over a 100% solution.
  • To tap the expertise of the Developer Community. We're not experts but we know a developer or two out there who are. The true power of CodeProject.AI Server comes from the contributions and improvements from our AI community.

License

CodeProject.AI Server is licensed under the Server-Side Public License.

Release Notes

What's New - 2.6

  • You can now select, at install time, which modules you wish to have initially installed
  • Some modules (Coral, YOLOv8) now allow you to download individual models at runtime via the dashboard.
  • A new generative AI module (Llama LLM Chatbot)
  • A standardised way to handle (in code) modules that run long processes such as generative AI
  • Debian support has been improved
  • Small UI improvements to the dashboard
  • Some simplification of the modulesettings files
  • The inclusion, in the source code, of template .NET and Python modules (both simple and long process demos)
  • Improvements to the Coral and ALPR modules (thanks to Seth and Mike)
  • Docker CUDA 12.2 image now includes cuDNN
  • Install script fixes
  • Added Object Segmentation to the YOLOv8 module

Previous Versions

Release 2.5

  • Dynamic Explorer UI: Each module now supplies its own UI for the explorer
  • Improved dashboard and explorer
    • The module listing now shows module version history if you click the version number
    • Explorer benchmark has been updated to use the custom models of the currently active object detection module
    • The Info button on the dashboard now includes a status data dump from the module. For things like object detectors, it will include a dictionary of labels / counts so you can see what's being detected. For longer running modules such as training it will include the training status. This is here to enable better UI features in the future
  • Updated module settings schema that includes module author and original project acknowledgement
  • Installer fixes
  • Improved Jetson support
  • Lots of bug fixes, but specifically there was a script issue affecting module installs, and a modulesettings.json issue affecting the YOLOv5 6.2 module, as well as the SuperResolution module.
  • Updated ALPR, OCR (PP-OCR4 support thanks to Mike Lud) and Coral Object Detection (multi-TPU support thanks to Seth Price) modules
  • Pre-installed modules in Docker can now be uninstalled / reinstalled
  • A new Sound Classifier module has been included
  • 2.5.4: A separate status update from each module that decouples the stats for a module. This just cleans things up a little on the backend
  • 2.5.4: Minor modulesettings.json schema update, which introduces the concept of model requirements.
  • 2.5.5: Support for long-running processes with an accompanying Stable Diffusion module.

Release 2.4

  • Mesh support: automatically offload inference work to other servers on your network based on inference speed. Zero config, with dashboard support to enable/disable.
  • CUDA detection fixed
  • Module self-test performed on installation
  • YOLOv8 module added
  • YOLOv5 .NET module fixes for GPU, and YOLOv5 3.1 GPU support fixed
  • Python package and .NET installation issues fixed
  • Better prompts for admin-only installs
  • More logging output to help diagnose issues
  • VC Redist hash error fixed
  • General bug fixes.
  • Breaking: modulesettings.json schema changed

Release 2.3

  • A focus on improving the installation of modules at runtime. More error checks, faster re-install, better reporting, and manual fallbacks in situations where admin rights are needed
  • A revamped SDK that removes much (or all, in some cases) of the boilerplate code needed in install scripts
  • Fine grained support for different CUDA versions as well as systems such as Raspberry Pi, Orange Pi and Jetson
  • Support for CUDA 12.2
  • GPU support for PaddlePaddle (OCR and license plate readers benefit)
  • CUDA 12.2 Docker image
  • Lots of bug fixes in install scripts
  • UI tweaks
  • 2.3.4 ALPR now using GPU in Windows
  • 2.3.4 Corrections to Linux/macOS installers

Release 2.2.0

This release is still in testing and is focussed mainly on the installation process

  • An entirely new Windows installer offering more installation options and a smoother upgrade experience from here on.
  • New macOS and Ubuntu native installers, for x64 and arm64 (including Raspberry Pi)
  • A new installation SDK for making module installers far easier
  • Improved installation feedback and self-checks
  • Coral.AI support for Linux, macOS (version 11 and 12 only) and Windows
  • Updates:
    • 2.2.1 - 2.2.3 various installer fixes
    • 2.2.4 - Fix to remove chunking in order to allow HTTP1.1 access to the API (Blue Iris fix)

Release 2.1.x Beta

  • Improved Raspberry Pi support. A new, fast object detection module with support for the Coral.AI TPU, all within an Arm64 Docker image
  • All modules can now be installed / uninstalled (rather than having some modules fixed and impossible to uninstall).
  • Installer is streamlined: Only the server is installed at installation time, and on first run, we install Object Detection (Python and .NET) and Face Processing (which can be uninstalled).
  • Reworking of the Python module SDK. Modules are now child classes, not aggregators of our module runner.
  • Reworking of the modulesettings file to make it simpler and have less replication
  • Improved logging: quantity, quality, filtering and better information
  • Addition of two modules: ObjectDetectionTFLite for Object Detection on Raspberry Pi using Coral, and Cartoonise for some fun
  • Improvements to half-precision support checks on CUDA cards
  • Modules are now versioned and our module registry will now only show modules that fit your current server version.
  • Various bug fixes
  • Shared Python runtimes now in /runtimes.
  • All modules moved from the /AnalysisLayer folder to the /modules folder
  • Tested on CUDA 12
     
  • Patch 2.1.11: YOLO training module now allows you to use your own dataset. YOLO 6.2 / Face Processing reverted back to Torch 1.13.
  • Patch 2.1.10: Added YOLOv5 training module and support. Improved system info. Orange Pi and NVIDIA Jetson support. Added Triggers. Renamed VersionCompatibililty to ModuleReleases. Becoz speling.
  • Patch 2.1.9: Increased and adjustable module install timeout and improved install logs. Fixes around resource contention in PyTorch, Fixes to resource usage reporting, improved Native Linux/WSL CUDA setup. Async fixes. Improvements to half-precision support.
  • Patch 2.1.8: Reduced, drastically, the load on the system while getting CPU/GPU usage updates.
  • Patch 2.1.7: Fixed a memory / resource leak that may have been causing server shutdowns
  • Patch 2.1.6 and below: Installer fixes

Please see our CUDA Notes for information on setting up, and restrictions around, Nvidia cards and CUDA support.

If you are upgrading: when the dashboard launches, it might be necessary to force-reload (Ctrl+R on Windows) the dashboard to ensure you are viewing the latest version.

Release 2.0.x Beta

  • 2.0.8: Improved analysis process management. Stamp out those errant memory hogging Python processes!
  • 2.0.7: Improved logging, both file based and in the dashboard, module installer/uninstaller bug fixes
  • 2.0.6: Corrected issues with downloadable modules installer
  • Our new Module Registry: download and install modules at runtime via the dashboard
  • Improved performance for the Object Detection modules
  • Optional YOLO 3.1 Object Detection module for older GPUs
  • Optimised RAM use
  • Support for Raspberry Pi 4+. Code and run natively, directly on the Raspberry Pi, using VSCode
  • Revamped dashboard
  • New timing reporting for each API call
  • New, simplified setup and install scripts

Release 1.6.x Beta

  • Optimised RAM use
  • Ability to enable / disable modules and GPU support via the dashboard
  • REST settings API for updating settings on the fly
  • Apple M1/M2 GPU support
  • Workarounds for some Nvidia cards
  • Async processes and logging for a performance boost
  • Breaking: The CustomObjectDetection is now part of ObjectDetectionYolo
  • Performance fix for CPU + video demo
  • Patch 1.6.7: potential memory leak addressed
  • Patch 1.6.8: image handling improvements on Linux, multi-thread ONNX on .NET

Release 1.5.6.2 Beta

  • Docker nVidia GPU support
  • Further performance improvements
  • cuDNN install script to help with nVidia driver and toolkit installation
  • Bug fixes

Release 1.5.6 Beta

  • nVidia GPU support for Windows
  • Perf improvements to Python modules
  • Work on the Python SDK to make creating modules easier
  • Dev installers now drastically simplified for those creating new modules
  • Added SuperResolution as a demo module

Release 1.5 Beta

  • Support for custom models

Release 1.3.x Beta

  • Refactored and improved setup and module addition system
  • Introduction of modulesettings.json files
  • New analysis modules

Release 1.2.x Beta

  • Support for Apple Silicon for development mode
  • Native Windows installer
  • Runs as Windows Service
  • Run in a Docker Container
  • Installs and builds using VSCode in Linux (Ubuntu), macOS and Windows, as well as Visual Studio on Windows
  • General optimisation of the download payload sizes

Previous

  • We started with a proof of concept on Windows 10+ only. Installs were via a simple BAT script, and the code is full of exciting sharp edges. A simple dashboard and playground are included. Analysis is currently Python code only.
  • Version checks are enabled to alert users to new versions.
  • A new .NET implementation of scene detection using the YOLO model to ensure the codebase is platform and tech stack agnostic
  • Blue Iris integration completed.

Written By
CodeProject Solutions (Software Developer, Canada)
The CodeProject team have been writing software, building communities, and hosting CodeProject.com for over 20 years. We are passionate about helping developers share knowledge, learn new skills, and connect. We believe everyone can code, and every contribution, no matter how small, helps.

The CodeProject team is currently focussing on CodeProject.AI Server, a stand-alone, self-hosted server that provides AI inferencing services on any platform for any language. Learn AI by jumping in the deep end with us: codeproject.com/AI.

17:36:28:ALPR:   Using cached imutils-0.5.4-py3-none-any.whl
17:36:29:ALPR: Installing collected packages: imutils
17:36:30:ALPR: Successfully installed imutils-0.5.4
17:36:30:ALPR: WARNING: Target directory /usr/bin/codeproject.ai-server-2.5.4/modules/ALPR/bin/linux/python38/venv/lib/python3.8/site-packages/bin already exists. Specify --upgrade to force replacement.
17:36:31:ALPR: (✅ checked) Done
17:36:32:ALPR:   - Installing Pillow, a Python Image Library...Checking ...Check done...Already installed
17:36:33:ALPR:   - Installing OpenCV, the Computer Vision library for Python...Checking ...Check done...Already installed
17:36:34:ALPR:   - Installing NumPy, a package for scientific computing...Checking ...Check done...Already installed
17:36:34:ALPR: Installing Python packages for the CodeProject.AI Server SDK
17:36:34:ALPR: Ensuring PIP is installed and up to date...
17:36:34:ALPR: Searching for installed dependencies:
17:36:34:ALPR:  -> python3-pip Done
17:36:34:ALPR: All dependencies already installed.
17:36:34:ALPR: Ensuring PIP compatibility ...
17:36:35:ALPR: Looking in links: /tmp/tmpbtotuc2_
17:36:35:ALPR: Requirement already satisfied: setuptools in ./bin/linux/python38/venv/lib/python3.8/site-packages (69.1.1)
17:36:35:ALPR: Requirement already satisfied: pip in ./bin/linux/python38/venv/lib/python3.8/site-packages (24.0)
17:36:36:ALPR: Python packages will be specified by requirements.txt
17:36:37:ALPR:   - Installing Pillow, a Python Image Library...Checking ...Check done...Already installed
17:36:38:ALPR:   - Installing Charset normalizer...Checking ...Check done...Already installed
17:36:39:ALPR:   - Installing aiohttp, the Async IO HTTP library...Checking ...Check done...Installing aiohttp...Collecting aiohttp
17:36:39:ALPR:   Using cached aiohttp-3.9.3-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (7.4 kB)
17:36:39:ALPR: Collecting aiosignal>=1.1.2 (from aiohttp)
17:36:39:ALPR:   Using cached aiosignal-1.3.1-py3-none-any.whl.metadata (4.0 kB)
17:36:40:ALPR: Collecting attrs>=17.3.0 (from aiohttp)
17:36:40:ALPR:   Using cached attrs-23.2.0-py3-none-any.whl.metadata (9.5 kB)
17:36:40:ALPR: Collecting frozenlist>=1.1.1 (from aiohttp)
17:36:40:ALPR:   Using cached frozenlist-1.4.1-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (12 kB)
17:36:40:ALPR: Collecting multidict<7.0,>=4.5 (from aiohttp)
17:36:40:ALPR:   Using cached multidict-6.0.5-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (4.2 kB)
17:36:40:ALPR: Collecting yarl<2.0,>=1.0 (from aiohttp)
17:36:40:ALPR:   Using cached yarl-1.9.4-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (31 kB)
17:36:40:ALPR: Collecting async-timeout<5.0,>=4.0 (from aiohttp)
17:36:40:ALPR:   Using cached async_timeout-4.0.3-py3-none-any.whl.metadata (4.2 kB)
17:36:40:ALPR: Collecting idna>=2.0 (from yarl<2.0,>=1.0->aiohttp)
17:36:40:ALPR:   Using cached idna-3.6-py3-none-any.whl.metadata (9.9 kB)
17:36:40:ALPR: Using cached aiohttp-3.9.3-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (1.3 MB)
17:36:40:ALPR: Using cached aiosignal-1.3.1-py3-none-any.whl (7.6 kB)
17:36:40:ALPR: Using cached async_timeout-4.0.3-py3-none-any.whl (5.7 kB)
17:36:40:ALPR: Using cached attrs-23.2.0-py3-none-any.whl (60 kB)
17:36:40:ALPR: Using cached frozenlist-1.4.1-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (240 kB)
17:36:40:ALPR: Using cached multidict-6.0.5-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (129 kB)
17:36:40:ALPR: Using cached yarl-1.9.4-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (308 kB)
17:36:40:ALPR: Using cached idna-3.6-py3-none-any.whl (61 kB)
17:36:41:ALPR: Installing collected packages: multidict, idna, frozenlist, attrs, async-timeout, yarl, aiosignal, aiohttp
17:36:41:ALPR: Successfully installed aiohttp-3.9.3 aiosignal-1.3.1 async-timeout-4.0.3 attrs-23.2.0 frozenlist-1.4.1 idna-3.6 multidict-6.0.5 yarl-1.9.4
17:36:41:ALPR: WARNING: Target directory /usr/bin/codeproject.ai-server-2.5.4/modules/ALPR/bin/linux/python38/venv/lib/python3.8/site-packages/idna already exists. Specify --upgrade to force replacement.
17:36:41:ALPR: WARNING: Target directory /usr/bin/codeproject.ai-server-2.5.4/modules/ALPR/bin/linux/python38/venv/lib/python3.8/site-packages/idna-3.6.dist-info already exists. Specify --upgrade to force replacement.
17:36:43:ALPR: (✅ checked) Done
17:36:43:ALPR:   - Installing aiofiles, the Async IO Files library...Checking ...Check done...Installing aiofiles...Collecting aiofiles
17:36:43:ALPR:   Using cached aiofiles-23.2.1-py3-none-any.whl.metadata (9.7 kB)
17:36:43:ALPR: Using cached aiofiles-23.2.1-py3-none-any.whl (15 kB)
17:36:44:ALPR: Installing collected packages: aiofiles
17:36:44:ALPR: Successfully installed aiofiles-23.2.1
17:36:45:ALPR: (✅ checked) Done
17:36:46:ALPR:   - Installing py-cpuinfo to allow us to query CPU info...Checking ...Check done...Installing py-cpuinfo...Collecting py-cpuinfo
17:36:46:ALPR:   Using cached py_cpuinfo-9.0.0-py3-none-any.whl.metadata (794 bytes)
17:36:46:ALPR: Using cached py_cpuinfo-9.0.0-py3-none-any.whl (22 kB)
17:36:47:ALPR: Installing collected packages: py-cpuinfo
17:36:47:ALPR: Successfully installed py-cpuinfo-9.0.0
17:36:47:ALPR: WARNING: Target directory /usr/bin/codeproject.ai-server-2.5.4/modules/ALPR/bin/linux/python38/venv/lib/python3.8/site-packages/bin already exists. Specify --upgrade to force replacement.
17:36:48:ALPR: (✅ checked) Done
17:36:49:ALPR:   - Installing Requests, the HTTP library...Checking ...Check done...Already installed
17:36:49:ALPR: Executing post-install script for License Plate Reader
17:36:49:ALPR: Applying PaddleOCR patch
17:36:49:ALPR: SELF TEST START ======================================================
17:36:54:ALPR: Running verify PaddlePaddle program ... 
17:36:54:ALPR: PaddlePaddle works well on 1 CPU.
17:36:54:ALPR: PaddlePaddle is installed successfully! Let's start deep learning with PaddlePaddle now.
17:36:54:ALPR: Self-test passed
17:36:54:ALPR: SELF TEST END   ======================================================
17:36:54:ALPR: Module setup time 00:01:48
17:36:54:ALPR:                 Setup complete                                        
17:36:54:ALPR: Total setup time 00:01:48
17:36:54:Module ALPR installed successfully.
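The self-test shown near the end of the log is driven by PaddlePaddle's own installation check. If you want to re-run that check by hand after setup (for example, when diagnosing a CPU vs. GPU build), you can do so from the module's virtual environment. The snippet below is only a minimal sketch: the venv path is taken from the install log above and may differ on your machine, and paddle.utils.run_check() is PaddlePaddle's standard verification routine, which prints the same "PaddlePaddle is installed successfully!" message seen in the log.

# Activate the ALPR module's Python 3.8 virtual environment first
# (path as reported in the install log above; adjust for your install):
#   source /usr/bin/codeproject.ai-server-2.5.4/modules/ALPR/bin/linux/python38/venv/bin/activate

import paddle

# run_check() exercises a small PaddlePaddle program; on a CPU-only build it reports
# "PaddlePaddle works well on 1 CPU." followed by
# "PaddlePaddle is installed successfully! Let's start deep learning with PaddlePaddle now."
paddle.utils.run_check()

If this check passes but the module itself misbehaves, the server's Explorer page remains the quickest way to send a test image to the License Plate Reader and inspect the response.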

