Hi, I’m Somik — I build real-time perception and autonomy systems.

I’m an AI & Perception Engineer at NextLeap Aeronautics, where I design and deploy real-time vision and control pipelines for UAVs. My work spans low-latency video streaming, embedded inference, multi-threaded optimizations, sensor integration, and perception-guided flight under edge-compute constraints.

Before joining NextLeap, I completed my MS at NYU, where I worked on Visual Place Recognition (VPR) and built sequence-based localization algorithms for GPS-denied environments. That experience shaped my interest in robust perception, SLAM-style pipelines, and algorithmic evaluation at scale.

Across both industry and research, I’ve built systems involving:

* Real-time computer vision and ML inference
* Embedded deployment on Jetson (Xavier/Orin) and Raspberry Pi
* Multi-robot control, CLF/CBF controllers, and PID motion systems
* Vision-based navigation and early-stage GNSS-denied autonomy
* CUDA/GPU optimization for performance-critical pipelines
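To give a flavor of the control side, here is a minimal discrete PID loop — a generic textbook sketch, not code from NextLeap or NYU; the gains, the `PID` class name, and the toy first-order plant are all invented for illustration:

```python
# Minimal discrete PID controller (illustrative only; gains are made up).
class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0      # accumulated error (I term state)
        self.prev_error = 0.0    # last error (for the D term)

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return (self.kp * error
                + self.ki * self.integral
                + self.kd * derivative)

# Toy example: drive a 1-D "altitude" toward a 10 m target at 50 Hz.
pid = PID(kp=1.2, ki=0.1, kd=0.05, dt=0.02)
altitude = 0.0
for _ in range(1000):
    thrust = pid.update(setpoint=10.0, measurement=altitude)
    altitude += thrust * 0.02  # toy integrator plant, not real dynamics
```

On a real vehicle the same loop structure runs per-axis with anti-windup, output clamping, and filtered derivatives; the sketch above keeps only the core update.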

I enjoy working on problems where correctness, latency, and reliability matter—and where the solution requires both strong engineering fundamentals and iterative experimentation.

If you’re building ML, robotics, or high-performance perception systems, I’d love to connect.