Job Description
This role owns the full perception pipeline for an embedded robotics platform operating in real-world construction environments. The successful candidate will integrate AI-based object recognition models — covering people, equipment, and obstacles — into real-time embedded systems with deterministic, resource-constrained runtime requirements. The position sits at the intersection of embedded systems engineering and applied AI, requiring both deep hardware awareness and practical deployment experience. This is not an AI research or cloud-side ML role; the focus is on building perception systems that work reliably on-device, in the field.
Job Requirements
Responsibilities
- Design and implement full perception pipelines for embedded robotics platforms
- Integrate AI-based object recognition models into real-time embedded systems
- Architect memory-efficient, deterministic runtime systems for edge deployment
- Optimise inference performance under resource constraints (memory, compute, thermal) across heterogeneous embedded accelerators
- Integrate multi-sensor inputs including camera, LiDAR, and IMU
- Connect perception outputs reliably to control and safety systems
- Analyse latency, memory usage, and numerical stability across the pipeline
- Ensure robust operation across diverse edge cases and real-world field environments
Qualifications
- Hands-on embedded software development experience in modern C/C++
- Deep understanding of real-time systems, scheduling, and performance optimisation
- Proven experience deploying AI/ML models in edge or embedded environments
- Development experience in resource-constrained embedded computing environments
- Working knowledge of computer vision fundamentals: camera models, projection, coordinate transformations
- Familiarity with model export and deployment pipelines (e.g. ONNX) and quantisation concepts
- Ability to navigate accuracy/latency/hardware trade-off decisions in production
Preferred
- AI model optimisation experience with heterogeneous embedded accelerators (DSP, NPU, or equivalent)
- Robotics or autonomous systems development background
- LiDAR processing or basic sensor fusion experience
- Safety-critical or deterministic system design experience