hamdan.tc: ~
$ whoami
Hamdan Chaudhry
> SDE II · Amazon Robotics
$

From writing firmware for the ESA ExoMars Rover to shipping full-stack systems at Amazon Robotics — I build software that bridges the gap between the digital and physical worlds.

Skills

Technologies and domains I work with

Languages & Frameworks

Python C++ C TypeScript Lua React NoSQL

Cloud & DevOps

AWS CloudWatch Serverless Microservices CI/CD Git

Robotics & Embedded

ROS OpenCV NVIDIA Jetson Embedded Systems Arduino Raspberry Pi FPGA

Experience

Professional journey and key contributions

Amazon Robotics

Software Development Engineer II

June 2022 – Present | Toronto, ON

  • Own and develop the post-manufacturing certification service that gates shipment of every drive unit robot produced in the US — validating hundreds of units per day across manufacturing sites
  • Drove significant increases in certification automation and service reliability, reducing manual operator overhead and strengthening the robustness of a shipment-critical pipeline
  • Led large-scale simulation work to trial different certification workflow patterns, evaluating performance trade-offs to improve process efficiency and surface bottlenecks for floor technicians
  • Extended the platform to support autonomous robot validation, enabling a new class of robots to move through the certification pipeline end-to-end
  • Built and maintained full-stack tooling with a React/TypeScript frontend, Python Flask APIs, and an AWS serverless backend, with CloudWatch instrumentation for real-time observability
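At its core, a certification gate like this reduces to aggregating per-unit check results into a ship/no-ship decision. A minimal sketch of that idea in Python — the check names and data shapes here are invented for illustration, not the actual service:

```python
from dataclasses import dataclass

@dataclass
class CheckResult:
    name: str
    passed: bool
    required: bool = True  # optional checks warn but never block shipment

def shipment_verdict(results):
    """Gate one drive unit: ship only if every required check passed."""
    failed = [r.name for r in results if r.required and not r.passed]
    return len(failed) == 0, failed

# Hypothetical check names, for illustration only
checks = [CheckResult("drive_motor_torque", True),
          CheckResult("lidar_calibration", False),
          CheckResult("cosmetic_scan", False, required=False)]
ok, failed = shipment_verdict(checks)  # ok == False, failed == ["lidar_calibration"]
```

Keeping the verdict a pure function over check results makes it easy to replay historical units through new gating rules in simulation.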

MDA

Software Engineer

January 2021 – May 2022 | Brampton, ON

  • Developed and validated firmware in C and Lua for the ESA ExoMars Rover's actuator drive electronics, enabling precise movement and joint control for a Mars-bound system
  • Built a computer vision pipeline using OpenCV and MATLAB to automate robotic arm control for aerospace wing panel sealing, replacing a previously manual process
  • Implemented fiducial marker-based 3D pose estimation using AprilTags and ISAAC ROS on NVIDIA Jetson hardware, enabling accurate end effector positioning for a space robotic arm system
  • Developed image processing and computer vision algorithms in C++ and Python for embedded robotic systems running on ROS
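The geometry behind fiducial-based ranging is the pinhole camera model: a tag of known physical size appears smaller in the image the farther away it is. A rough one-line sketch of that relationship (illustrative only, not the ISAAC ROS pipeline; the numbers are assumptions):

```python
def tag_distance_m(focal_px, tag_size_m, tag_size_px):
    """Pinhole range estimate Z = f * S / s:
    f is the focal length in pixels, S the tag's physical edge length,
    and s its apparent edge length in the image, in pixels."""
    return focal_px * tag_size_m / tag_size_px

# A 10 cm tag spanning 60 px under a 600 px focal length sits about 1 m away
z = tag_distance_m(600.0, 0.10, 60.0)  # 1.0
```

Full AprilTag pose estimation solves for all six degrees of freedom from the four corner points, but the distance term falls out of this same proportionality.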

Autonomoose, University of Waterloo

Software Research Assistant

May 2017 – August 2017 | Waterloo, ON

  • Implemented motion planning algorithms for an autonomous vehicle simulator in Python
  • Developed a behavioural planner for autonomous navigation based on rule-based execution and dynamic systems modeling
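A rule-based behavioural planner of this kind maps perception inputs to a small set of high-level driving behaviours. A toy sketch of the pattern, with thresholds and behaviour names invented for the example:

```python
def select_behaviour(obstacle_dist_m, stop_line_dist_m):
    """Map perception inputs to a high-level driving behaviour.
    Rules are evaluated in priority order: safety first."""
    if obstacle_dist_m < 5.0 or stop_line_dist_m < 1.0:
        return "STOP"
    if stop_line_dist_m < 15.0:
        return "DECELERATE"
    return "FOLLOW_LANE"
```

The chosen behaviour then parameterises the lower-level motion planner, which generates the actual trajectory.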

Tesla

Firmware Engineer Intern

January 2016 – April 2016 | Palo Alto, CA

  • Contributed to the Firmware Validation Automation team responsible for verifying Model X firmware builds against a Hardware-in-the-Loop (HIL) test framework prior to release
  • Owned body systems test coverage — doors, locking mechanisms, and neutral-to-drive shifting — authoring new automated test scripts and extending hardware configurations to close subsystem gaps
  • Simulated vehicle component signals over CAN and LIN data buses to drive automated validation of embedded firmware across body system controllers
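Simulating component signals on a bus mostly comes down to packing signal values into a frame's data field at the right bit positions. A toy sketch of the packing step in Python — the signal layout here is invented, not any real vehicle's:

```python
import struct

def encode_body_status(driver_door_open, passenger_door_open, locked):
    """Pack three hypothetical 1-bit signals into byte 0 of an
    8-byte CAN data field; remaining bytes are zero padding."""
    flags = (int(driver_door_open) << 0) \
          | (int(passenger_door_open) << 1) \
          | (int(locked) << 2)
    return struct.pack("<B7x", flags)

frame = encode_body_status(True, False, True)  # byte 0 == 0b101
```

On a HIL rig, frames like this would be transmitted on the bus so the controller under test sees the same traffic it would in a real vehicle.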

Projects

A selection of projects demonstrating my work across robotics, embedded systems, and software

Spoteria
#01 Computer Vision

A smart mirror that watches your workout and tells you when your form breaks down — in real time.

Built as a capstone project, Spoteria is a smart mirror that monitors exercise form in real time and delivers colour-coded visual feedback across the x, y, and z planes. A Microsoft Kinect captures joint movement data, which is used to train a machine learning model in Turi to distinguish correct form from incorrect. An actuator adjusts the screen height automatically so the mirror works for seated and standing exercises alike. The core software was written in C# and C, with an Arduino handling actuator control.
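A typical feature for form classification from skeleton data is the angle at a joint, computed from three tracked points. A small sketch of that computation (in Python for brevity; the project itself used C#):

```python
import math

def joint_angle_deg(a, b, c):
    """Angle at joint b formed by 3D points a-b-c,
    e.g. shoulder-elbow-wrist for arm extension."""
    v1 = [a[i] - b[i] for i in range(3)]
    v2 = [c[i] - b[i] for i in range(3)]
    dot = sum(x * y for x, y in zip(v1, v2))
    n1 = math.sqrt(sum(x * x for x in v1))
    n2 = math.sqrt(sum(x * x for x in v2))
    # Clamp to guard against floating-point drift outside acos's domain
    cos_theta = max(-1.0, min(1.0, dot / (n1 * n2)))
    return math.degrees(math.acos(cos_theta))
```

A fully extended arm reads near 180°, so deviations per axis can drive the colour-coded feedback.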

Watch demo  |  Project site

C# C Machine Learning Kinect Arduino Actuator Control
RaspBot
#02 Robotics

A Raspberry Pi robot that navigates obstacle courses autonomously using computer vision.

RaspBot is a semi-autonomous ground robot built around a Raspberry Pi. Using OpenCV for real-time image processing and a suite of sensors for environmental awareness, the robot navigates miniature road-style courses without human input. The navigation logic is implemented in C++ and Python, with computer vision driving lane detection and obstacle avoidance decisions.
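Lane following of this kind often reduces to finding where the lane markings sit in a scanline and steering toward their centroid. A minimal pure-Python sketch of that decision step (the real pipeline used OpenCV; thresholds here are assumptions):

```python
def steering_offset(row, threshold=128):
    """Given one row of pixel intensities, return the normalized offset
    of the bright-lane centroid from image center, in [-1, 1].
    Positive means steer right; None means the lane was lost."""
    bright = [i for i, v in enumerate(row) if v >= threshold]
    if not bright:
        return None
    center = (len(row) - 1) / 2
    centroid = sum(bright) / len(bright)
    return (centroid - center) / center
```

The sign and magnitude of the offset then feed directly into the motor differential.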

C++ Python Raspberry Pi OpenCV Sensors
Cyclops ROV
#03 Robotics

A semi-autonomous underwater vehicle with a PD controller for stable depth and heading.

Cyclops is an Arduino-powered ROV designed for autonomous underwater navigation. A 9-DoF IMU feeds sensor fusion algorithms to estimate yaw, pitch, and roll, while a pressure sensor handles depth and light sensors provide environmental context. A PD controller coordinates the four-motor drivetrain to hold heading and depth accurately. The project went through rapid hardware and software iteration — every prototype test informed the next mechanical or firmware revision.
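The depth-hold loop is a textbook PD controller: a term proportional to the depth error plus a term damping its rate of change. A simplified sketch of one control step (in Python for readability; the firmware is C, and the gains here are illustrative, not the tuned values):

```python
def pd_depth_command(target_m, depth_m, prev_error, dt, kp=40.0, kd=8.0):
    """One PD step: return (thrust command, error) for the next iteration.
    Thrust is saturated to the drivetrain's [-100, 100] range."""
    error = target_m - depth_m
    derivative = (error - prev_error) / dt
    thrust = kp * error + kd * derivative
    return max(-100.0, min(100.0, thrust)), error
```

The returned error is carried into the next call so the derivative term can be computed without global state; the heading axis runs an identical loop on yaw.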

C Arduino Sensor Fusion IMU PD Control
Free Fly
#04 Robotics

Fly a Crazyflie quadcopter with nothing but your hands, using LEAP Motion gesture tracking.

An extension of the Maestro project, Free Fly takes gesture-based control into three dimensions. A LEAP Motion controller tracks hand position and orientation in 3D space, mapping movements to the yaw, roll, and pitch of a Crazyflie quadcopter. The interface is designed to feel natural — tilting your hand banks the drone, raising it climbs. Built in Python using the LEAP Motion SDK and Crazyflie Python API.
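The core of the mapping is turning raw hand pose into safe flight setpoints: a deadband so a resting hand hovers, and clamping so an abrupt gesture cannot command an extreme attitude. A simplified sketch (the limits and scaling are invented for illustration, not the project's tuned values):

```python
def hand_to_setpoint(roll_deg, pitch_deg, height_mm,
                     max_angle=20.0, deadband=3.0,
                     hover_mm=200.0, thrust_per_mm=0.1):
    """Map hand tilt and height to (roll, pitch, thrust) setpoints.
    Small tilts are ignored; large ones are clamped; thrust is
    proportional to hand height above a hover reference."""
    def shape(angle):
        if abs(angle) < deadband:
            return 0.0
        return max(-max_angle, min(max_angle, angle))
    thrust = 0.5 + thrust_per_mm * (height_mm - hover_mm) / 100.0
    return shape(roll_deg), shape(pitch_deg), max(0.0, min(1.0, thrust))
```

The shaped setpoints would then be streamed to the quadcopter's commander loop at a fixed rate.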

Python LEAP Motion Crazyflie Gesture Control
YouEye
#05 Computer Vision

A wearable assistant that narrates your surroundings in real time for visually impaired users.

Built at Hack the North, YouEye is a vision-to-speech assistant for people who are visually impaired. A forward-facing camera captures frames continuously, which are sent to the Google Cloud Vision API for object identification. The system estimates object distance and escalates its spoken feedback accordingly — quiet confirmation when the path is clear, a caution as objects approach, and a direct alert when they are close. The front end was built with HTML, CSS, and JavaScript.
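The escalation logic itself is a simple threshold ladder over estimated distance. A sketch of the idea (the cutoff distances here are assumptions, not the hackathon build's values):

```python
def alert_level(distance_m):
    """Map an estimated object distance to a spoken-feedback tier."""
    if distance_m < 1.0:
        return "alert"      # direct warning: object is close
    if distance_m < 3.0:
        return "caution"    # object approaching
    return "clear"          # quiet confirmation: path is clear
```

Each tier would select a different phrase and speech urgency in the text-to-speech layer.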

View on Devpost

Python Google Vision API JavaScript Text-to-Speech Hackathon
.WAV Media Player
#06 Embedded

A fully functional audio player built on an FPGA — custom drivers, interrupts, and all.

Programmed a DE1-SoC FPGA board from scratch to function as a .wav media player. This meant writing custom device drivers for every peripheral, handling SD card I/O at a low level, implementing audio processing pipelines, and managing hardware interrupts for playback controls. The Altera toolchain (Quartus, QSYS, NIOS II) was used throughout. Every layer of the stack — from register-level debugging to synchronisation between peripherals — was handled in C.
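Before any audio reaches the codec, the player has to pull format parameters out of the file's RIFF header. A sketch of parsing the canonical 44-byte PCM header (shown in Python for brevity; the project did this in C, and this version assumes the simple fixed layout rather than walking arbitrary chunks):

```python
import struct

def parse_wav_header(data):
    """Parse the canonical 44-byte PCM WAV header (little-endian)."""
    riff, _size, wave = struct.unpack_from("<4sI4s", data, 0)
    if riff != b"RIFF" or wave != b"WAVE":
        raise ValueError("not a RIFF/WAVE file")
    fmt_tag, channels, sample_rate = struct.unpack_from("<HHI", data, 20)
    bits_per_sample, = struct.unpack_from("<H", data, 34)
    return {"channels": channels, "sample_rate": sample_rate,
            "bits_per_sample": bits_per_sample, "pcm": fmt_tag == 1}
```

On the FPGA side, these fields configure the audio codec's sample rate and word length before streaming the data chunk.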

C FPGA Altera / NIOS II Device Drivers Audio DSP
Fuel Cell Car
#07 Embedded

A fuel-cell powered miniature car that navigates a maze using only sensor input — no shortcuts.

Programmed an MSP430 microcontroller to drive a miniature hydrogen fuel-cell car through a maze. Ultrasonic, touch, and light intensity sensors feed a decision algorithm that handles turning, reversing, and dead-end recovery. The tight energy budget of the fuel cell made efficiency a hard constraint — every unnecessary manoeuvre cost precious run time, so the navigation logic was heavily optimised for minimal movement.
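The decision algorithm can be sketched as a priority ladder that always prefers the cheapest manoeuvre, since straight-line driving costs the least energy. An illustrative version (in Python for readability; the firmware is C, and the distance thresholds are invented):

```python
def next_move(front_cm, left_cm, right_cm, bumper_hit):
    """Pick the cheapest viable manoeuvre: straight > turn > reverse."""
    if bumper_hit:
        return "reverse"          # touch sensor tripped: back off first
    if front_cm > 20:
        return "forward"          # clear ahead, keep going
    if left_cm >= right_cm and left_cm > 20:
        return "turn_left"        # turn toward the wider opening
    if right_cm > 20:
        return "turn_right"
    return "reverse"              # dead end: back out and retry
```

Evaluating the rules in cost order keeps the car from turning or reversing unless the sensors leave no cheaper option.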

C MSP430 Ultrasonic Sensors Autonomous Nav
Maestro 🏆
#09 Robotics

Gesture-driven robot control via LEAP Motion and ROS — 1st place at Deloitte Tech Exchange.

Developed at the Deloitte Tech Exchange competition, Maestro is a gesture interface that maps hand movements in 3D space to robot navigation commands. Using a LEAP Motion controller and the LEAP SDK, hand position and orientation are translated into speed, rotation direction, and linear movement for a ROS-controlled robot. The team took first place overall out of all competing universities.

ROS Python LEAP Motion Gesture Control 1st Place
Circuit Bot
#10 Embedded

A line-following robot built entirely from first principles — custom PCB, sensors, and all signal conditioning.

Circuit Bot was built from the ground up on a custom PCB — no off-the-shelf sensor modules. Hall-effect, thermistor, current, light intensity, and optical encoder sensors were each designed from first principles and verified with an oscilloscope. Op-amp filter circuits were built to clean up noisy signals before they reached the microcontroller. The navigation firmware, written in C, uses the optical encoders and light sensors to follow a line course with consistent accuracy.
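The op-amp filters do the heavy lifting in hardware, but firmware typically adds a final software smoothing stage on the sampled values. A sketch of the standard first-order IIR low-pass (the digital counterpart of an RC filter; shown in Python for readability, with an assumed smoothing factor):

```python
def ema_filter(samples, alpha=0.2):
    """Exponential moving average: y[n] = a*x[n] + (1-a)*y[n-1].
    Smaller alpha means heavier smoothing and more lag."""
    out, y = [], samples[0]
    for x in samples:
        y = alpha * x + (1 - alpha) * y
        out.append(y)
    return out
```

Applied to the light-sensor readings, this suppresses residual sample-to-sample jitter before the line-following logic thresholds them.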

C PCB Design Op-Amps Soldering Optical Encoders

Contact

Open to senior SDE and tech lead roles in robotics, systems, or full-stack software — feel free to reach out