Project Details
Technology Stack
- Embedded hardware: Custom Arduino boards, Raspberry Pi
- Motion control: Servo motors, custom drivers
- Image processing: Custom image acquisition library, integration with proprietary object recognition engine
- Connectivity: Bluetooth Low Energy (BLE), Ethernet, PoE
- Smart device integration: Milight API, TV via COM port
- Voice & audio: Smart speaker control, voice call routing via LAN
Note that this minimum viable product (MVP) was developed in partnership with a UK-based startup several years ago. Our technology stack has since grown to include modern libraries and toolkits for object, motion, and face recognition (such as TensorFlow Lite and YOLOv8), as well as advanced IoT boards and edge AI modules.
Team Composition
1 project manager, 2 embedded engineers, 1 computer vision specialist, 1 mobile developer, 1 QA engineer
Project Duration
16 weeks / ~1,600 man-hours
Methodology & Engagement Model
Fixed Price
Customer’s Product
A UK-based smart home startup envisioned a premium intelligent lighting fixture that also functions as a high-performance user recognition and automation hub. The system would use video-based object recognition to track residents across rooms, learn their routines, and dynamically trigger personalized automation scenarios, such as lighting and climate control, media playback, and voice call handling.
To secure funding and validate product-market fit, the client collaborated with Expanice to create a working minimum viable product (MVP) that demonstrated the most complex technical aspects, including user recognition, motion tracking, and multi-device automation orchestration. As a company with extensive experience in smart home solutions, we gladly accepted the challenge.
What We Did

The Expanice team created a complete MVP that included embedded hardware, smart control modules, and mobile demo apps to simulate the desired smart home experience.
The solution's key components were:
- Camera motion control system. We enabled 360-degree rotation and tilt of Sony HD cameras embedded in the light fixture by using custom Arduino boards and drivers. The motion controller tracked the user's movements to ensure they were always in frame.
- Custom image processing stack. To achieve peak performance, we created a custom lightweight image acquisition library optimized for embedded CPU usage. This library captured and preprocessed video frames before sending them to a proprietary recognition engine that returned a confidence score for user identification.
- Precision motion sensing. After testing Doppler radar and PIR sensors, we chose a video sensor + Raspberry Pi setup that provided faster response times, better noise suppression, and precise directional sensing.
- BLE integration. BLE enabled low-power communication between the user's smartphone, smart lights, and other connected devices, laying the groundwork for automated routines and presence detection.
- Smart home feature demo. The demo included:
  - Milight and TV set integration via COM port, triggering personalized scenarios
  - Smart speakers enabling hands-free call pickup and call roaming as the user moved between rooms (accomplished via Ethernet/PoE due to Bluetooth's roaming limitations)
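The acquisition-and-recognition flow described above can be sketched in Python. The actual library and recognition engine are proprietary, so the function names, resolutions, and the placeholder "engine" below are purely illustrative assumptions:

```python
import numpy as np

FRAME_SHAPE = (480, 640, 3)   # capture resolution (assumed)

def preprocess(frame: np.ndarray) -> np.ndarray:
    """Downscale and normalize a BGR frame before recognition.

    Mirrors the kind of lightweight preprocessing the custom
    acquisition library performed; details here are illustrative.
    """
    # Grayscale via standard luminance weights (BGR channel order).
    gray = frame @ np.array([0.114, 0.587, 0.299])
    # Naive 4x downscale by striding -- cheap enough for an embedded CPU.
    small = gray[::4, ::4]
    # Normalize to [0, 1] before handing off to the recognition engine.
    return (small / 255.0).astype(np.float32)

def recognize(frame: np.ndarray) -> float:
    """Placeholder for the proprietary engine: returns a confidence score."""
    features = preprocess(frame)
    # A real engine would run a model here; we just return mean brightness.
    return float(features.mean())

frame = np.full(FRAME_SHAPE, 128, dtype=np.uint8)
score = recognize(frame)
print(round(score, 3))
```

Keeping capture, preprocessing, and recognition as separate stages is what allowed the heavy model to live in the proprietary engine while the embedded side stayed lightweight.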
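The directional sensing that tipped the choice toward a video sensor can be illustrated with simple frame differencing: the centroid of changed pixels tells you which side of the field of view the motion occurred on. The threshold below is an assumed noise floor, not a measured value:

```python
import numpy as np

def motion_direction(prev: np.ndarray, curr: np.ndarray,
                     threshold: int = 25) -> str:
    """Estimate horizontal motion direction from two grayscale frames.

    A frame-differencing sketch of video-based motion sensing;
    `threshold` is an illustrative noise floor.
    """
    # Pixels whose brightness changed more than the noise floor.
    diff = np.abs(curr.astype(int) - prev.astype(int))
    ys, xs = np.nonzero(diff > threshold)
    if xs.size == 0:
        return "none"
    # The centroid of changed pixels indicates where movement happened.
    mid = curr.shape[1] / 2
    return "left" if xs.mean() < mid else "right"
```

PIR sensors can only report "something moved"; even this naive video approach additionally reports *where*, which is what the product needed for directional sensing.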
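BLE-based presence detection of the kind described typically reduces to smoothing noisy RSSI readings and applying hysteresis so the state doesn't flap. A minimal sketch, with assumed thresholds (`RSSI_PRESENT`, `RSSI_ABSENT`) rather than the product's actual tuning:

```python
RSSI_PRESENT = -70   # dBm: smoothed signal above this means "in room" (assumed)
RSSI_ABSENT = -85    # dBm: smoothed signal below this means "left room" (assumed)

class PresenceDetector:
    """Infer room presence from smoothed BLE RSSI readings."""

    def __init__(self, alpha: float = 0.3):
        self.alpha = alpha   # EMA smoothing factor
        self.ema = None      # smoothed RSSI, dBm
        self.present = False

    def update(self, rssi: int) -> bool:
        # Exponential moving average damps per-packet RSSI noise.
        if self.ema is None:
            self.ema = float(rssi)
        else:
            self.ema = self.alpha * rssi + (1 - self.alpha) * self.ema
        # Two thresholds (hysteresis) prevent flapping near the boundary.
        if not self.present and self.ema >= RSSI_PRESENT:
            self.present = True
        elif self.present and self.ema <= RSSI_ABSENT:
            self.present = False
        return self.present
```

Feeding this detector the RSSI of the user's smartphone as seen from each room's fixture is enough to drive the automated routines mentioned above.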
Key Results
- Delivered a full-stack MVP enabling smart home automation through user identification
- Achieved continuous tracking with high-confidence video recognition at distances of up to 12 meters
- Demonstrated personalized lighting, media, and voice scenarios
- Positioned the client to raise its next funding round
Challenges We Navigated
Achieving Smooth Object Tracking in Real-Time
Maintaining camera focus on a moving user—without jitter or lag—was a critical MVP requirement. Expanice developed low-latency motor control logic and optimized image capture to ensure smooth, accurate tracking over a 12-meter range, even in complex lighting conditions.
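The low-latency tracking idea can be illustrated with a proportional controller that converts the subject's pixel offset into a capped pan step for the servo. The gain, deadband, and field-of-view scaling below are assumed values for illustration, not the production tuning:

```python
def pan_step(target_x: float, frame_width: int,
             gain: float = 0.1, max_step: float = 5.0) -> float:
    """Degrees to pan this control cycle to re-center the subject."""
    error = target_x - frame_width / 2       # horizontal offset in pixels
    if abs(error) < 10:                      # deadband: don't hunt near center
        return 0.0
    # Scale pixels to degrees assuming a ~90-degree horizontal FOV,
    # then cap the step so motion stays smooth rather than jerky.
    step = gain * error * (90.0 / frame_width)
    return max(-max_step, min(max_step, step))

print(pan_step(320, 640))   # subject centered: no movement
print(pan_step(640, 640))   # subject at the right edge: pan right
```

The deadband suppresses jitter when the subject is already near center, and the step cap keeps motion smooth even when the subject moves abruptly.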
Handling Audio Roaming Between Rooms
Some smart speaker systems don’t support Bluetooth audio handoff. To address this, we created an Ethernet-based audio routing system that allowed audio streams to follow the user from room to room while their presence was tracked using video.
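The handoff logic can be sketched as a small router that re-targets the audio stream whenever video tracking reports the user in a new room. The room names and speaker addresses below are illustrative placeholders:

```python
# Map each room's speaker to its LAN endpoint (addresses are illustrative).
SPEAKERS = {
    "living_room": "192.0.2.10",
    "kitchen": "192.0.2.11",
    "bedroom": "192.0.2.12",
}

class AudioRouter:
    """Route an active audio stream to the speaker in the user's room.

    A sketch of the Ethernet-based handoff: when video tracking
    reports a new room, the stream is re-targeted to that room's speaker.
    """

    def __init__(self):
        self.active_room = None
        self.log = []   # handoff history: (old_room, new_room)

    def on_presence(self, room: str) -> str:
        if room != self.active_room:
            self.log.append((self.active_room, room))
            self.active_room = room
        return SPEAKERS[room]   # endpoint the stream should target

router = AudioRouter()
print(router.on_presence("living_room"))
print(router.on_presence("living_room"))  # same room: no handoff
print(router.on_presence("kitchen"))      # user moved: re-target stream
```

Because the routing decision lives on the LAN side rather than in a Bluetooth pairing, the stream can follow the user across any number of rooms without the handoff limits of BLE audio.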
What’s Next
The MVP demonstrated the viability of the client’s vision and helped secure early-stage investor interest. Planned next steps include:
- Expansion into multi-user behavior profiling
- Integration with third-party ecosystems (Apple HomeKit, Google Home)
- Edge AI optimization for object detection
- Production-ready industrial design for the smart fixture