Ouster, Ottawa, ON
Software Developer
Software Developer Intern
- Implemented polygon-based zone filtering for LiDAR point clouds in C++, enabling real-time exclusion of detections inside configurable spatial boundaries and eliminating false positives from unwanted areas.
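A common building block for this kind of zone filtering is a ray-casting point-in-polygon test; the sketch below is illustrative only (the `Point2D` type and function name are assumptions, not Ouster's actual API):

```cpp
#include <vector>

// Hypothetical 2-D point type; real point-cloud types are richer.
struct Point2D { double x, y; };

// Ray-casting point-in-polygon test: count how many polygon edges a
// horizontal ray from `p` crosses; an odd count means `p` is inside.
// Detections landing inside an exclusion polygon can then be dropped.
bool insidePolygon(const Point2D& p, const std::vector<Point2D>& poly) {
    bool inside = false;
    for (std::size_t i = 0, j = poly.size() - 1; i < poly.size(); j = i++) {
        const bool crosses = (poly[i].y > p.y) != (poly[j].y > p.y);
        if (crosses) {
            const double xCross = poly[j].x + (p.y - poly[j].y) *
                (poly[i].x - poly[j].x) / (poly[i].y - poly[j].y);
            if (p.x < xCross) inside = !inside;
        }
    }
    return inside;
}
```

In a real pipeline this test would run per detection centroid against each configured exclusion zone, keeping the filter O(zones × vertices) per object.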
- Implemented and tuned an Extended Kalman Filter with an advanced kinematic model in C++ for vehicle motion estimation, fusing LiDAR point-cloud measurements with model-based state predictions to improve tracking continuity. Later collaborated on a significant retune and refinement of the filter to further improve tracking accuracy.
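For context, the prediction half of a Kalman-style filter looks roughly like the sketch below, here simplified to a constant-velocity model (the production filter used a more advanced kinematic model; the `State` layout and noise handling are illustrative assumptions):

```cpp
#include <array>

// State vector: [x, y, vx, vy]; P is the state covariance.
struct State {
    std::array<double, 4> x{};                  // state mean
    std::array<std::array<double, 4>, 4> P{};   // state covariance
};

// Propagate mean and covariance forward by dt with process noise q:
// x <- F x,  P <- F P F^T + q I,  where F = [[I, dt*I], [0, I]].
void predict(State& s, double dt, double q) {
    s.x[0] += s.x[2] * dt;  // x += vx * dt
    s.x[1] += s.x[3] * dt;  // y += vy * dt

    auto& P = s.P;
    // Left-multiply by F: rows 0,1 gain dt times rows 2,3.
    for (int c = 0; c < 4; ++c) {
        P[0][c] += dt * P[2][c];
        P[1][c] += dt * P[3][c];
    }
    // Right-multiply by F^T: columns 0,1 gain dt times columns 2,3.
    for (int r = 0; r < 4; ++r) {
        P[r][0] += dt * P[r][2];
        P[r][1] += dt * P[r][3];
    }
    // Simple isotropic process noise on the diagonal.
    for (int i = 0; i < 4; ++i) P[i][i] += q;
}
```

The "extended" part of an EKF enters in the update step, where a nonlinear kinematic model is linearized via its Jacobian before fusing each LiDAR-derived measurement.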
- Hardened the point cloud clustering pipeline in C++ to maintain robust object grouping under sparse and missing LiDAR returns, reducing segmentation fragmentation in low-return scenarios.
- Implemented a Gaussian Naive Bayes classifier in C++ for real-time vehicle classification from LiDAR-derived spatial features, improving classification accuracy.
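The scoring core of a Gaussian Naive Bayes classifier is compact; this sketch is an illustration under assumed names (`GnbClass`, `classify`), not the production code:

```cpp
#include <cmath>
#include <limits>
#include <vector>

// Per-class model: a log prior plus per-feature Gaussian mean/variance.
// Features are assumed conditionally independent given the class.
struct GnbClass {
    double logPrior;
    std::vector<double> mean;
    std::vector<double> var;
};

// Return the index of the class with the highest joint log-likelihood.
int classify(const std::vector<double>& features,
             const std::vector<GnbClass>& classes) {
    int best = 0;
    double bestScore = -std::numeric_limits<double>::infinity();
    for (std::size_t c = 0; c < classes.size(); ++c) {
        double score = classes[c].logPrior;
        for (std::size_t f = 0; f < features.size(); ++f) {
            const double d = features[f] - classes[c].mean[f];
            const double v = classes[c].var[f];
            // Log of the Gaussian pdf, dropping the constant term
            // (it is identical across classes, so it cannot change argmax).
            score += -0.5 * std::log(v) - d * d / (2.0 * v);
        }
        if (score > bestScore) {
            bestScore = score;
            best = static_cast<int>(c);
        }
    }
    return best;
}
```

Working in log space keeps the product of many per-feature likelihoods numerically stable, which matters when scoring at LiDAR frame rates.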
- Improved bounding box orientation alignment for detected vehicles through an iterative search over point cloud returns, yielding more spatially accurate object representations.
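One simple form of such an iterative orientation search is shown below: rotate the object's 2-D footprint through candidate headings and keep the heading whose axis-aligned bounding box has minimal area. This is a sketch of the idea, not the production algorithm, and all names are assumed:

```cpp
#include <algorithm>
#include <cmath>
#include <limits>
#include <vector>

struct Pt { double x, y; };

// Brute-force heading search over [0, 90) degrees. A rectangle's bounding
// box area is minimized exactly when the box is aligned with the rectangle,
// so the best-scoring theta approximates the vehicle's heading (mod 90°).
double bestHeading(const std::vector<Pt>& pts, int steps = 90) {
    const double kHalfPi = 1.5707963267948966;
    double bestTheta = 0.0;
    double bestArea = std::numeric_limits<double>::max();
    for (int i = 0; i < steps; ++i) {
        const double theta = i * kHalfPi / steps;
        const double c = std::cos(theta), s = std::sin(theta);
        double minX = std::numeric_limits<double>::max(), maxX = -minX;
        double minY = minX, maxY = -minX;
        for (const Pt& p : pts) {
            // Rotate the point into a frame aligned with heading `theta`.
            const double rx =  c * p.x + s * p.y;
            const double ry = -s * p.x + c * p.y;
            minX = std::min(minX, rx); maxX = std::max(maxX, rx);
            minY = std::min(minY, ry); maxY = std::max(maxY, ry);
        }
        const double area = (maxX - minX) * (maxY - minY);
        if (area < bestArea) { bestArea = area; bestTheta = theta; }
    }
    return bestTheta;
}
```

A production version would refine this with a finer second pass around the best coarse angle, or use rotating calipers on the convex hull for an exact answer.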
- Implemented a prediction-to-ground-truth matching algorithm using a distance-based cost matrix for the internal tracking evaluation framework, enabling MOTA and MOTP metric computation against labelled datasets.
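The shape of such a matcher can be sketched as follows: build a cost matrix of pairwise distances, then assign pairs under a gating threshold. This version uses greedy assignment for brevity; the names are hypothetical and the production evaluator's assignment strategy may differ:

```cpp
#include <cmath>
#include <utility>
#include <vector>

struct Det { double x, y; };

// Match predictions to ground-truth objects. Matched pairs feed MOTP
// (mean matched distance), while unmatched entries count as false
// positives and misses for MOTA.
std::vector<std::pair<int, int>> matchGreedy(const std::vector<Det>& preds,
                                             const std::vector<Det>& gts,
                                             double maxDist) {
    // cost[i][j] = Euclidean distance between prediction i and truth j.
    std::vector<std::vector<double>> cost(
        preds.size(), std::vector<double>(gts.size()));
    for (std::size_t i = 0; i < preds.size(); ++i)
        for (std::size_t j = 0; j < gts.size(); ++j)
            cost[i][j] = std::hypot(preds[i].x - gts[j].x,
                                    preds[i].y - gts[j].y);

    std::vector<bool> pUsed(preds.size(), false), gUsed(gts.size(), false);
    std::vector<std::pair<int, int>> matches;
    for (;;) {
        // Pick the cheapest remaining pair inside the gating threshold.
        double best = maxDist;
        int bi = -1, bj = -1;
        for (std::size_t i = 0; i < preds.size(); ++i) {
            if (pUsed[i]) continue;
            for (std::size_t j = 0; j < gts.size(); ++j) {
                if (!gUsed[j] && cost[i][j] < best) {
                    best = cost[i][j];
                    bi = static_cast<int>(i);
                    bj = static_cast<int>(j);
                }
            }
        }
        if (bi < 0) break;
        matches.emplace_back(bi, bj);
        pUsed[bi] = gUsed[bj] = true;
    }
    return matches;
}
```

Greedy matching is O(n³) in the worst case but simple and deterministic; an optimal alternative would solve the same cost matrix with the Hungarian algorithm.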
- Championed integration testing culture within the team, establishing the integration test project from its initial package structure through to a robust suite covering production Docker images. Introduced reusable fixtures for product installation to minimize code duplication and accelerate test authoring.
- Architected a scalable, distributed perception system to balance computational load across multiple nodes. Secured inter-node communication using mutual TLS client certificate authentication.
- Implemented an OAuth 2.0 Authorization Code Flow with PKCE using Keycloak to secure the perception product stack, enabling fine-grained access control and user management for customers.
- Redesigned and re-implemented the software installation and distribution system across Debian, RPM, and container-based delivery targets, establishing a reliable and maintainable packaging pipeline.
- Enforced Python code quality standards across the codebase by integrating Ruff linting and automated formatting checks into the CI pipeline.
- Architected a production-grade deep learning inference pipeline using a C++ backend for model execution and a Rust-based REST API for efficient data routing and request management.
- Extended the deep learning inference pipeline with automatic multi-GPU distribution, enabling data-parallel inference across all available GPUs without manual configuration.
- Designed and delivered a modular, Rust-based event-driven system under tight deadlines, enabling configurable condition-action pipelines (e.g., triggering webhooks when objects satisfy spatial or temporal criteria). Iterated significantly on the initial release in response to customer feedback, reducing system complexity, standardizing output payloads across all modules for easier downstream consumption, and broadening extensibility.
- Profiled the perception system with Linux perf to identify network and kernel bottlenecks. Evaluated, validated, and integrated Intel TBB to parallelize previously serial processing stages, achieving a 10x throughput improvement in many cases.
- Led the team migration to devcontainer-based development workflows, providing consistent, per-project isolated environments that eliminated environment drift and reduced onboarding friction.
- Established reliable, reproducible GCP VM infrastructure for integration testing using Packer for image provisioning and Terraform for infrastructure-as-code orchestration.
- Architected and co-maintained modular Jenkins-based CI/CD pipelines, integrating automated code quality checks and expanding C++ static analysis coverage. Optimized pipeline execution time to improve developer feedback cycles and ensure reliable image publishing.
- Refactored legacy Rust code that left errors unchecked at external-library boundaries, introducing mock and fake implementations to enable deterministic, reliable unit tests for the affected functionality.
- Developed Argo Workflows pipelines to automate the previously manual perception performance evaluation process, enabling scalable parallel job execution across datasets and dramatically reducing the time required to assess tracking quality.