hi there👋, I'm

Chenwan Halley Zhong

23, she/her

MS in Robotics student @ Northwestern University | Musician

about me.

As a robotics student at Northwestern University, I build robots designed to survive the transition from the lab to the real world. My passion lies in creating robust, life-enhancing robotics that solve tangible problems while maintaining a playful, artistic edge. I believe that the future of technology should be as much about soul and aesthetics as it is about sensors and code, and I am committed to developing reliable platforms that make a more beautiful, functional world possible.

projects.

Vocal2Piano: End-to-End Autonomous Robotic Piano Accompaniment System

This system integrates machine learning with precision mechatronics to perform real-time piano accompaniment from vocal input. I developed a Music Information Retrieval (MIR) pipeline that transcribes audio into MIDI, which high-performance C++ firmware then maps to a coordinated 60-actuator array. The hardware features a modular CAD assembly validated through motion simulation and 3D printing, powered by custom PCBs that manage high-current inductive loads through dedicated power isolation and transient suppression.

  • Embedded Systems
  • PCB
  • Max/MSP
  • CAD
  • DSP
  • Machine Learning
  • Audio Processing
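The note-to-actuator mapping stage described above can be sketched in a few lines. This is a minimal illustration, not the actual firmware logic: the lowest covered key (`LOWEST_MIDI = 36`) and the octave-folding policy for out-of-range notes are assumptions for the example.

```python
# Sketch: mapping transcribed MIDI notes onto a 60-actuator array.
# Assumes actuator 0 sits at MIDI note 36 (C2), one actuator per
# semitone, and folds out-of-range notes back by octaves.
# Constants and policy are illustrative, not the real firmware's.

LOWEST_MIDI = 36        # hypothetical lowest key covered by the array
NUM_ACTUATORS = 60      # one actuator per semitone

def midi_to_actuator(note: int) -> int:
    """Map a MIDI note number to an actuator index, transposing
    notes outside the array's range by whole octaves."""
    idx = note - LOWEST_MIDI
    while idx < 0:
        idx += 12          # transpose up an octave
    while idx >= NUM_ACTUATORS:
        idx -= 12          # transpose down an octave
    return idx
```

For example, middle C (MIDI 60) lands 24 semitones above the assumed lowest key, so it drives actuator 24; a note below the range is raised an octave at a time until it fits.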

Bug Catcher: ROS 2 Vision-Based Dynamic Sorting (FER Panda)

Engineered a modular ROS 2 and MoveIt 2 framework for the Franka Emika Panda to enable autonomous Hex-Bug sorting. A monocular sky camera feeds an HSV color classifier and AprilTag-based calibration pipeline that establishes a static TF tree, grounding all camera detections in robot base coordinates for reliable pick-and-place planning. For moving targets, a direct joint trajectory injection pathway bypasses MoveIt entirely, publishing IK-solved commands at 20 Hz to track and intercept live bugs in real time.

  • ROS 2
  • MoveIt 2
  • RViz
  • Computer Vision
  • OpenCV
  • IK/FK
  • Motion Planning
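The calibration idea above — grounding every sky-camera detection in robot base coordinates via a static transform — can be sketched as a back-projection onto the table plane. The intrinsics, camera height, and the `T_base_cam` matrix below are made-up placeholders standing in for the values the AprilTag calibration would actually recover.

```python
import numpy as np

# Sketch: express a sky-camera pixel detection in robot base coordinates.
# T_base_cam is the static camera->base transform that an AprilTag
# calibration would recover once; the values here are placeholders.
# The camera looks straight down, so a pixel maps to a unique point on
# the table plane at a known depth.

FX, FY, CX, CY = 600.0, 600.0, 320.0, 240.0   # hypothetical intrinsics
CAM_HEIGHT = 1.2                              # metres, camera to table

# Placeholder static transform (camera frame -> robot base frame):
# 180-degree rotation about x, plus a translation of the camera origin.
T_base_cam = np.array([
    [1.0,  0.0,  0.0, 0.30],
    [0.0, -1.0,  0.0, 0.00],
    [0.0,  0.0, -1.0, 1.20],
    [0.0,  0.0,  0.0, 1.00],
])

def pixel_to_base(u: float, v: float) -> np.ndarray:
    """Back-project a pixel onto the table plane, then express the
    resulting 3-D point in the robot base frame."""
    # Ray through the pixel, scaled to intersect the plane at CAM_HEIGHT.
    x_cam = (u - CX) / FX * CAM_HEIGHT
    y_cam = (v - CY) / FY * CAM_HEIGHT
    p_cam = np.array([x_cam, y_cam, CAM_HEIGHT, 1.0])
    return (T_base_cam @ p_cam)[:3]
```

In the real system this role is played by the TF tree: publishing `T_base_cam` once as a static transform lets every downstream node ask for detections directly in the base frame.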

Interactive Oropharyngeal-Swab Robot System: High-Precision Pandemic Response Platform

Developed an autonomous oropharyngeal-swab robotic system for high-efficiency, contactless pandemic response, integrating a multi-modal pipeline for precision and safety. We engineered a gesture-based HRI using Leap Motion to enable intuitive, touchless control over the sampling process. The system utilizes a vision-based detection pipeline for target localization and optimized motion planning algorithms to execute collision-free trajectories within the oral cavity. The integrated platform successfully streamlined the sampling cycle to 42 seconds while maintaining robust data synchronization through a custom mobile application.

  • Gesture Recognition
  • Embedded Systems
  • HRI
  • Medical Robotics
  • Computer Vision
  • IK/FK
  • Motion Planning
  • Leap Motion
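The touchless control loop above amounts to a small state machine driven by recognized gestures. The gesture names and sampling states below are illustrative placeholders, not the project's actual Leap Motion vocabulary.

```python
# Sketch: gesture-driven state machine for a swab-sampling cycle.
# Gesture labels and states are hypothetical, for illustration only.

from enum import Enum, auto

class SwabState(Enum):
    IDLE = auto()
    APPROACH = auto()
    SAMPLING = auto()
    RETRACT = auto()

# Hypothetical gesture vocabulary -> state transitions.
TRANSITIONS = {
    (SwabState.IDLE, "open_palm"):         SwabState.APPROACH,
    (SwabState.APPROACH, "pinch"):         SwabState.SAMPLING,
    (SwabState.SAMPLING, "pinch_release"): SwabState.RETRACT,
    (SwabState.RETRACT, "open_palm"):      SwabState.IDLE,
}

def step(state: SwabState, gesture: str) -> SwabState:
    """Advance the sampling cycle one gesture at a time. A 'fist'
    acts as a safety stop (immediate retract) from any state;
    unrecognized gestures leave the state unchanged."""
    if gesture == "fist":
        return SwabState.RETRACT
    return TRANSITIONS.get((state, gesture), state)
```

Keeping the safety gesture outside the transition table means the retract path is reachable from every state, which is the property a contactless medical platform cares about most.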

technical skills.

  • Software:

    Python (PyTorch, OpenCV), C, C++, C#, MATLAB, Git, Linux, Bash

  • Robotics & Simulation:

    ROS 2, SLAM, CoppeliaSim, Gazebo, RViz

  • Hardware & Fab:

    CAD Modeling (SolidWorks, Fusion 360), 3D Printing (SLA, FDM, SLS, MJF), CNC Machining & Milling, Laser Cutting, Rapid Prototyping

  • Embedded Systems:

    PCB Design (Altium/KiCad), Microcontrollers (STM32, ESP32, Arduino, Teensy, Raspberry Pi), Soldering, Communication Protocols (I2C, SPI, UART)

  • Digital Arts:

    Unity & Unreal Engine, Ableton Live (Max/MSP), Logic Pro, FL Studio

  • Languages:

    English (native), Mandarin Chinese (native), Cantonese, French, Spanish, German

github contributions.

Open to collaboration.

I'm always interested in new opportunities and exciting projects. Whether you have a project in mind or just want to chat about tech, I'd love to hear from you.

Currently available for internships and full-time opportunities.

Response time: Usually within 24 hours

© 2026 Chenwan Halley Zhong