Lingyi Zhou
Portfolio Website
All My Works

Former AR Engineer @ Snap Inc.
Former Unity Engineer @ Unity

MFA Design and Technology @ Parsons
MA Game Design & Development @ Columbia
XR Developer
AR Development in Lens Studio & Effect House
VR Development in Unity for Oculus Quest and Apple Vision Pro
Game Developer
Unity Development for desktop and mobile games
Shader knowledge and development experience
3D Designer
Blender and Maya for stylized and realistic modeling and rendering
Cinema 4D for animation and procedural modeling
Full Stack Engineer
React front end + FastAPI back end + PostgreSQL database
Graphic Designer
Adobe InDesign, Adobe Illustrator

AR Drawing – Research Project at Snap Inc.

During my internship at Snap Inc., I built an experimental AR experience that lets users draw in 3D mid-air using only hand gestures, with no controllers. A custom procedural mesh system in Lens Studio generates strokes in real time from tracked hand positions, supports gesture-based erasing, and lets creators apply custom stroke textures for expressive results on mobile and Snap Spectacles.
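At its core, the stroke system turns a polyline of tracked fingertip positions into a ribbon of triangles each frame. The sketch below is a simplified standalone version of that idea in plain JavaScript; the function name, point format, and flat-ribbon simplification are illustrative, not Snap's internal code (the production version feeds equivalent vertex data into Lens Studio's procedural mesh API).

```javascript
// Turn a polyline of 3D sample points (e.g. tracked fingertip positions)
// into a flat triangle "ribbon" of the given width.
// Illustrative sketch only: names and structure are hypothetical.
function buildRibbon(points, width) {
  const half = width / 2;
  const vertices = [];
  const indices = [];
  for (let i = 0; i < points.length; i++) {
    // Stroke direction at this sample (clamped at the endpoints).
    const prev = points[Math.max(i - 1, 0)];
    const next = points[Math.min(i + 1, points.length - 1)];
    const dx = next.x - prev.x, dy = next.y - prev.y;
    const len = Math.hypot(dx, dy) || 1;
    // Perpendicular in the XY plane, scaled to half the stroke width.
    const nx = (-dy / len) * half, ny = (dx / len) * half;
    // Two vertices per sample: left and right edge of the ribbon.
    const p = points[i];
    vertices.push({ x: p.x + nx, y: p.y + ny, z: p.z });
    vertices.push({ x: p.x - nx, y: p.y - ny, z: p.z });
    // Each new sample closes a quad (two triangles) with the previous one.
    if (i > 0) {
      const k = 2 * i;
      indices.push(k - 2, k - 1, k, k - 1, k + 1, k);
    }
  }
  return { vertices, indices };
}
```

Appending new samples as the hand moves means the mesh grows incrementally each frame instead of being rebuilt from scratch.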

Tech Stack

  • Platform: Lens Studio / Snap Spectacles
  • Languages: JavaScript, Lens Studio Scripting
  • Features: Procedural Mesh, Hand Tracking, AR Interaction
  • Role: AR Developer Intern
  • Year: 2022

My Contributions

  • Published official procedural mesh assets in Lens Studio built on real-time hand tracking, reducing creator setup time by 80% (from one week to under one day)
  • Showcased the procedural mesh asset by developing an AR drawing filter with over 200,000 plays in its first month, inspiring future spatial drawing interactions
  • Authored official developer-facing documentation for procedural mesh features, reducing onboarding and learning time by 80% for Lens Studio community creators

View Official Documentation

Demo Video

Watch the AR drawing experience in action, demonstrating real-time hand tracking, 3D stroke generation, and gesture-based interactions in Lens Studio.

Feature 1: Draw with Customized Texture

Strokes are generated as dynamic meshes whose material can be swapped at runtime. Artists can apply paint, neon, ribbon, or particle-driven textures to change mood and style without reauthoring code.
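Because each stroke only holds a reference to a shared material, switching styles is a single reassignment with no mesh rebuild. The sketch below illustrates that pattern in plain JavaScript; the style names, texture filenames, and fields are made up for the example (in Lens Studio the equivalent step is reassigning the stroke visual's material).

```javascript
// Shared material records, keyed by style name. All values here are
// hypothetical placeholders for the example.
const styles = {
  paint:  { texture: "paint_brush.png",  emissive: false },
  neon:   { texture: "neon_glow.png",    emissive: true  },
  ribbon: { texture: "ribbon_satin.png", emissive: false },
};

// Restyle an existing stroke: swap its material reference, leave the
// generated mesh data untouched.
function applyStyle(stroke, styleName) {
  const style = styles[styleName];
  if (!style) throw new Error("Unknown style: " + styleName);
  stroke.material = style;
  return stroke;
}
```

Keeping materials decoupled from mesh data is what lets artists add new looks without reauthoring any stroke-generation code.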

Feature 2: Gesture Erase

A second gesture switches the tool into erase mode. Strokes are tracked, segmented, and selectively culled so users can sculpt drawings in space.
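One way to sketch the erase step: cull every sample within the eraser's radius, then regroup the survivors into contiguous sub-strokes, so erasing the middle of a stroke splits it in two. The function below is a hypothetical standalone version of that idea, not the production code.

```javascript
// Remove all points within `radius` of the eraser position and split
// the stroke into the surviving contiguous segments.
// Illustrative sketch; names are hypothetical.
function eraseAt(strokePoints, eraser, radius) {
  const segments = [];
  let current = [];
  for (const p of strokePoints) {
    const d = Math.hypot(p.x - eraser.x, p.y - eraser.y, p.z - eraser.z);
    if (d > radius) {
      current.push(p);          // point survives the eraser
    } else if (current.length) {
      segments.push(current);   // eraser hit: close the current sub-stroke
      current = [];
    }
  }
  if (current.length) segments.push(current);
  return segments;
}
```

Each returned segment can then be rebuilt as its own ribbon mesh, which is what makes the erase feel like sculpting rather than deleting whole strokes.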

Documentation


I wrote internal notes and prototype scripts that later informed community creator guidance on procedural mesh workflows. For public reference, Snap's official documentation on Procedural Mesh in Lens Studio outlines the core setup. I also recorded a short tutorial walk-through demonstrating the setup process, hand tracking input, and real-time mesh updates—see the demo below.