
omni-lpr
An MCP server for automatic license plate recognition
Omni-LPR is a self-hostable server that provides automatic license plate recognition (ALPR) capabilities via a REST API and the Model Context Protocol (MCP). It can be used both as a standalone ALPR microservice and as an ALPR toolbox for AI agents and large language models (LLMs).
Why Omni-LPR?
Using Omni-LPR can have the following benefits:
- Decoupling. Your main application can be written in any programming language. It doesn't need to be tangled up with Python or specific ML dependencies, because the server handles all of that.
- Multiple Interfaces. You aren't locked into one way of communicating. You can use a standard REST API from any app, or you can use MCP, which is designed for AI agent integration.
- Ready-to-Deploy. You don't have to build it from scratch. Pre-built Docker images are available and easy to deploy and start using immediately.
- Hardware Acceleration. The server is optimized for the hardware you have. It supports generic CPUs (ONNX), Intel CPUs (OpenVINO), and NVIDIA GPUs (CUDA).
- Asynchronous I/O. It's built on Starlette, which means it has high-performance, non-blocking I/O. It can handle many concurrent requests without getting bogged down.
- Scalability. Because it's a separate service, it can be scaled independently of your main application. If you suddenly need more ALPR capacity, you can scale Omni-LPR up without touching anything else.
See the ROADMAP.md for the list of implemented and planned features.
[!IMPORTANT] Omni-LPR is in early development, so bugs and breaking API changes are expected. Please use the issues page to report bugs or request features.
Quickstart
You can get started with Omni-LPR in a few minutes by following the steps described below.
1. Install the Server
You can install Omni-LPR using pip:
pip install omni-lpr
2. Start the Server
Once installed, start the server with a single command:
omni-lpr
By default, the server will be listening on http://127.0.0.1:8000.
You can confirm it's running by accessing the health check endpoint:
curl http://127.0.0.1:8000/api/health
# Sample expected output: {"status": "ok", "version": "0.3.4"}
3. Recognize a License Plate
Now you can make a request to recognize a license plate from an image. The example below uses a publicly available image URL.
curl -X POST \
-H "Content-Type: application/json" \
-d '{"path": "https://www.olavsplates.com/foto_n/n_cx11111.jpg"}' \
http://127.0.0.1:8000/api/v1/tools/detect_and_recognize_plate_from_path/invoke
You should receive a JSON response with the detected license plate information.
Usage
Omni-LPR exposes its capabilities as "tools" that can be called via a REST API or over the MCP.
Available Tools
The server provides tools for listing models, recognizing plates from image data, and recognizing plates from a path.
- list_models: Lists the available detector and OCR models.
- Tools that process image data (provided as Base64 or file upload):
  - recognize_plate: Recognizes text from a pre-cropped license plate image.
  - detect_and_recognize_plate: Detects and recognizes all license plates in a full image.
- Tools that process an image path (a URL or local file path):
  - recognize_plate_from_path: Recognizes text from a pre-cropped license plate image at a given path.
  - detect_and_recognize_plate_from_path: Detects and recognizes plates in a full image at a given path.
For more details on how to use the different tools and provide image data, please see the API Documentation.
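For the tools that take image data directly, a common pattern is to Base64-encode the image file and place it in the JSON body. The sketch below shows that encoding step; the `"image"` field name is an assumption for illustration, so check the API Documentation for the exact field the server expects:

```python
import base64
import json


def build_image_payload(image_bytes: bytes) -> str:
    """Base64-encode raw image bytes into a JSON body for the
    image-data tools (recognize_plate / detect_and_recognize_plate).

    NOTE: the "image" field name is hypothetical; consult the API docs
    for the actual request schema.
    """
    encoded = base64.b64encode(image_bytes).decode("ascii")
    return json.dumps({"image": encoded})


# Example: build a payload from an image file on disk.
# with open("plate.jpg", "rb") as f:
#     body = build_image_payload(f.read())
```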
REST API
The REST API provides a standard way to interact with the server. All tool endpoints are available under the /api/v1
prefix. Once the server is running, you can access interactive API documentation in the Swagger UI
at http://127.0.0.1:8000/api/v1/apidoc/swagger.
MCP Interface
The server also exposes its tools over MCP for integration with AI agents and LLMs. The MCP endpoint is available at http://127.0.0.1:8000/mcp/ via the Streamable HTTP transport.
You can use a tool like MCP Inspector to explore the available MCP tools.
Integration
You can connect any MCP-compatible client to the server. The following examples show how to use the server with LM Studio.
LM Studio Configuration
{
"mcpServers": {
"omni-lpr-local": {
"url": "http://127.0.0.1:8000/mcp/"
}
}
}
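If the server runs on another machine, the same configuration applies with only the url changed. The host name below is a placeholder for your own deployment:

```json
{
  "mcpServers": {
    "omni-lpr-remote": {
      "url": "http://your-server-host:8000/mcp/"
    }
  }
}
```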
Tool Usage Examples
The screenshot below shows the list_models tool being used in LM Studio to list the available ALPR models.
The screenshot below shows using the detect_and_recognize_plate_from_path tool in LM Studio to detect and recognize
the license plate from an image available on the web.
Documentation
Omni-LPR documentation is available here.
Examples
Check out the examples directory for usage examples.
Contributing
Contributions are always welcome! Please see CONTRIBUTING.md for details on how to get started.
License
Omni-LPR is licensed under the MIT License (see LICENSE).
Acknowledgements
- This project uses the awesome fast-plate-ocr and fast-alpr Python libraries.
- The project logo is from SVG Repo.