
The Robot Learning project focuses on training robots with virtual reality (VR) or mixed reality (MR) approaches.

Zhenliang Zhang
Research Scientist in AI
My research interests include wearable computing, machine learning, cognitive reasoning, and mixed/virtual reality.
Related
- [IROS 2020] Graph-based hierarchical knowledge representation for robot task transfer from virtual to physical world
- [TURC 2019] VRGym: A virtual testbed for physical and interactive AI
- [ICRA 2019] High-fidelity grasping in virtual reality using a glove-based system
- [VR 2020] Extracting and transferring hierarchical knowledge to robots using virtual reality
Publications
[IROS 2020] Graph-based hierarchical knowledge representation for robot task transfer from virtual to physical world
We study the hierarchical knowledge transfer problem using a cloth-folding task: the agent is first given a set of human demonstrations in the virtual world using an Oculus headset, and the learned knowledge is later transferred to and validated on a physical Baxter robot.
[TURC 2019] VRGym: A virtual testbed for physical and interactive AI
We propose VRGym, a virtual reality testbed for realistic human-robot interaction. Unlike existing toolkits and virtual reality environments, VRGym emphasizes building and training both physical and interactive agents for robotics, machine learning, and cognitive science.