Eyetelli
Technology

Self-Hosted vs Cloud: Where Should Face Data Live?

Cloud processing is easier to build, but self-hosted processing is better for biometric data. Here's why we chose self-hosted-first architecture.

The default: cloud processing

Most face recognition systems work the same way: a camera captures an image and uploads it to a cloud server; the server runs recognition and sends back a result. This is the easiest architecture to build. One powerful server handles everything. Clients are thin. Updates are centralized.

For many applications, cloud processing is fine. But for biometric data — especially face images — it introduces problems that are hard to solve.

The problem with uploading face data

When face images are uploaded to a cloud server, several things happen that you might not want:

Data leaves your premises. The moment a face image is transmitted over the internet, it's outside your control. Even with encryption in transit, the data now exists on a server you don't physically control.

Latency increases. Every recognition requires a round trip to the cloud. On a slow connection, this means seconds of delay. In a busy entrance, that delay stacks up.

Internet dependency. If your internet goes down, your attendance system goes down with it. For a business that needs reliable tracking, this is a deal-breaker.

Bandwidth costs scale. Streaming video or uploading images to the cloud uses significant bandwidth, especially with multiple cameras. This cost grows linearly with the number of cameras.
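The linear scaling is easy to quantify with back-of-the-envelope arithmetic. The numbers below (a 4 Mbps 1080p stream, a ~200-byte event record, 500 events per camera per day) are illustrative assumptions, not measured figures:

```python
# Back-of-the-envelope comparison: streaming video to the cloud vs
# syncing only lightweight events. All constants are assumptions.

STREAM_MBPS_PER_CAMERA = 4.0      # assumed 1080p H.264 stream
EVENT_BYTES = 200                 # assumed size of one attendance event
EVENTS_PER_CAMERA_PER_DAY = 500   # assumed busy entrance

def daily_upload_gb_streaming(cameras: int) -> float:
    """GB/day uploaded if every camera streams to the cloud."""
    seconds = 24 * 3600
    bits = cameras * STREAM_MBPS_PER_CAMERA * 1e6 * seconds
    return bits / 8 / 1e9

def daily_upload_gb_events(cameras: int) -> float:
    """GB/day uploaded if only recognition events sync upstream."""
    total_bytes = cameras * EVENTS_PER_CAMERA_PER_DAY * EVENT_BYTES
    return total_bytes / 1e9

for n in (1, 4, 16):
    print(f"{n:>2} cameras: streaming {daily_upload_gb_streaming(n):7.1f} GB/day"
          f" vs events {daily_upload_gb_events(n):.6f} GB/day")
```

Under these assumptions, even a single streaming camera uploads tens of gigabytes per day, while a day of event records fits in a fraction of a megabyte.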

Self-hosted processing: keep data where it belongs

Self-hosted processing takes the opposite approach. Instead of sending images to a cloud server, you run the processing power where the cameras are. Eyecore runs on your server, performs recognition locally, and only sends lightweight event data (who, when, where) to the cloud dashboard.
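The "lightweight event data (who, when, where)" can be as small as a single JSON record. A hypothetical shape, assuming illustrative field names rather than Eyecore's actual schema:

```python
import json
from datetime import datetime, timezone

# Hypothetical attendance event: only identity, time, and location
# leave the premises -- never the face image itself.
event = {
    "employee_id": "emp-1042",                            # who (illustrative ID)
    "timestamp": datetime.now(timezone.utc).isoformat(),  # when
    "camera_id": "entrance-cam-1",                        # where
    "direction": "in",                                    # entry or exit
}

payload = json.dumps(event).encode("utf-8")
print(f"event payload: {len(payload)} bytes")  # a few hundred bytes at most
```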

Face images never leave your hardware. Eyecore processes them locally and discards the raw images after recognition. Only attendance events sync upstream.

Works offline. If the internet goes down, Eyecore keeps running. Events are stored locally and sync when connectivity returns. No gap in your attendance records.

Low latency. Recognition happens on the local network. No round trip to a distant server. Real-time performance even with multiple cameras.
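The offline behavior described above is essentially store-and-forward: every event is written to a durable local queue first, and the queue is drained when connectivity returns. A minimal sketch, assuming a SQLite outbox and a hypothetical `push_to_cloud` callable (not Eyecore's actual sync code):

```python
import json
import sqlite3

def make_queue(path: str = ":memory:") -> sqlite3.Connection:
    """Open (or create) a durable local outbox for attendance events."""
    db = sqlite3.connect(path)
    db.execute("""CREATE TABLE IF NOT EXISTS outbox (
                      id INTEGER PRIMARY KEY AUTOINCREMENT,
                      event TEXT NOT NULL,
                      synced INTEGER NOT NULL DEFAULT 0)""")
    return db

def record_event(db: sqlite3.Connection, event: dict) -> None:
    """Always write locally first; syncing is a separate concern."""
    db.execute("INSERT INTO outbox (event) VALUES (?)", (json.dumps(event),))
    db.commit()

def sync(db: sqlite3.Connection, push_to_cloud) -> int:
    """Drain unsynced events. If push_to_cloud raises (offline),
    remaining events stay queued for the next attempt."""
    rows = db.execute("SELECT id, event FROM outbox WHERE synced = 0").fetchall()
    for row_id, event_json in rows:
        push_to_cloud(json.loads(event_json))   # may raise if offline
        db.execute("UPDATE outbox SET synced = 1 WHERE id = ?", (row_id,))
        db.commit()
    return len(rows)
```

Writing locally before attempting to sync is what guarantees "no gap in your attendance records": an outage delays upload, but never loses an event.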

The tradeoffs

Self-hosted processing isn't free. It requires a server at each location — an Ubuntu machine with a GPU. Updates need to be pushed to distributed devices rather than a single server. And the server needs enough processing power to run AI models in real time.

These are engineering challenges, not fundamental limitations. Modern GPUs can run face detection and recognition in real time on modest hardware. Over-the-air updates keep the software current. And the privacy and reliability benefits far outweigh the added complexity.

Why we chose self-hosted-first

We built Eyetelli as a self-hosted-first system because we believe it's the right architecture for biometric data. Your employees' faces shouldn't be uploaded to servers you don't control. Your attendance system shouldn't stop working when the internet hiccups. And your bandwidth shouldn't scale with the number of cameras.

The cloud dashboard exists for convenience — viewing attendance, running reports, managing settings. But the intelligence runs on your server, on your hardware, under your control.

The bottom line

Cloud processing is easier to build. Self-hosted processing is better for the people using it. We chose to do the harder thing because it's the right thing for a system that handles biometric data.