Building physical AI for science — Robot Learning · Embodied AI · AI Agents
Researcher @ Massachusetts General Hospital · Harvard University · Boston, MA
I build learning-based robots and AI agents that act in the physical world — and I'm increasingly applying these tools to biology and neuroscience, where embodiment matters as much as intelligence.
- Robot Learning — RL, sim-to-real, mobile manipulation, foundation models for robots
- AI Agents — LLM/VLM agents that can perceive, plan, and use tools to control physical systems
- Physical AI for Science — bringing the above to wet-lab, microscopy, and brain-inspired settings
Robot Learning & Embodied AI
- awesome-isaac-gym — Curated resources for GPU-accelerated robot learning · 1.2k ⭐
- dual_ur5_husky_mujoco — Mobile manipulation simulation: dual UR5 arms on a Husky base in MuJoCo
- mujoco-mcp — Control MuJoCo simulations from any LLM via the Model Context Protocol
AI Agents
- gemini-robotics-er-playground — Hands-on playground for Gemini Robotics-ER
- chatgpt2agent — Turning chat models into tool-using agents
LinkedIn · X / Twitter · Google Scholar
"The best way to predict the future is to build it — one robot, one agent, one experiment at a time."