Zhenliang Zhang | BIGAI
Publications
[Virtual Reality 2023] DexHand: dexterous hand manipulation motion synthesis for virtual reality
We propose a neural network-based finger movement generation approach, enabling the generation of plausible hand motions interacting with objects.
Haiyan Jiang, Dongdong Weng, Zhen Song, Xiaonuo Dongye, Zhenliang Zhang
[VR 2023] Building symmetrical reality systems for cooperative manipulation
In this paper, we discuss symmetrical reality-based human-robot interaction and present the general pipeline for building a symmetrical reality system with two minds.
Zhenliang Zhang
[Engineering 2023] A reconfigurable data glove for reconstructing physical and virtual grasps
We present a reconfigurable data glove design to capture different modes of human hand-object interactions, which are critical in training embodied artificial intelligence (AI) agents for fine manipulation tasks.
Hangxin Liu, Zeyu Zhang, Ziyuan Jiao, Zhenliang Zhang, Mingchen Li, Chenfanfu Jiang, Yixin Zhu, Song-Chun Zhu
[VR 2021] Symmetrical cognition between physical humans and virtual agents
In this paper, we discuss the cognition problem from several perspectives, namely attention, perception, pattern recognition, and communication.
Zhenliang Zhang
[ISMAR 2020] Understanding physical common sense in symmetrical reality
We emphasize bi-directional mechanical control within the symmetrical reality framework and explain why the free will of machines can break physical common sense.
Zhenliang Zhang
[ISMAR 2020] Machine intelligence matters: Rethink human-robot collaboration based on symmetrical reality
We introduce the contents of symmetrical reality-based human-robot collaboration and interpret human-robot collaboration from the perspective of equivalent interaction.
Zhenliang Zhang, Xuejiao Wang
[IROS 2020] Graph-based hierarchical knowledge representation for robot task transfer from virtual to physical world
We study the hierarchical knowledge transfer problem using a cloth-folding task, wherein the agent is first given a set of human demonstrations collected in the virtual world using an Oculus headset, and the learned knowledge is later transferred to and validated on a physical Baxter robot.
Zhenliang Zhang, Yixin Zhu, Song-Chun Zhu
[VR 2020] Extracting and transferring hierarchical knowledge to robots using virtual reality
We study the knowledge transfer problem by training the task of folding clothes in the virtual world using an Oculus Headset and …
Zhenliang Zhang, Jie Guo, Dongdong Weng, Yue Liu, Yongtian Wang
[VR 2020] Exploring the differences of visual discomfort caused by long-term immersion between virtual environments and physical environments
To investigate the effects of visual discomfort caused by long-term immersion in virtual environments (VEs), we conducted a comparative …
Jie Guo, Dongdong Weng, Hui Fang, Zhenliang Zhang, Jiaming Ping, Yue Liu, Yongtian Wang
[IROS 2019] Toward an efficient hybrid interaction paradigm for object manipulation in optical see-through mixed reality
Human-computer interaction (HCI) plays an important role in near-field mixed reality, in which hand-based interaction is one of …
Zhenliang Zhang, Dongdong Weng, Jie Guo, Yue Liu, Yongtian Wang
[ISMAR 2019] Mixed reality office system based on Maslow's hierarchy of needs: Towards the long-term immersion in virtual environments
In a mixed reality (MR) environment that combines the physical objects with the virtual environments, users’ feelings are …
Jie Guo, Dongdong Weng, Zhenliang Zhang, Haiyan Jiang, Yue Liu, Yongtian Wang, Henry Been-Lirn Duh
[Sensors 2019] HiFinger: One-handed text entry technique for virtual environments based on touches between fingers
We present a text entry technique called HiFinger, which is an eyes-free, one-handed wearable text entry technique for immersive …
Haiyan Jiang, Dongdong Weng, Zhenliang Zhang, Feng Chen
[TURC 2019] VRGym: A virtual testbed for physical and interactive AI
We propose VRGym, a virtual reality testbed for realistic human-robot interaction. Different from existing toolkits and virtual reality environments, VRGym emphasizes building and training both physical and interactive agents for robotics, machine learning, and cognitive science.
Xu Xie, Hangxin Liu, Zhenliang Zhang, Yuxing Qiu, Feng Gao, Siyuan Qi, Yixin Zhu, Song-Chun Zhu
[ICRA 2019] High-fidelity grasping in virtual reality using a glove-based system
This paper presents a design that jointly provides hand pose sensing, hand localization, and haptic feedback to facilitate real-time …
Hangxin Liu, Zhenliang Zhang, Xu Xie, Yixin Zhu, Yue Liu, Yongtian Wang, Song-Chun Zhu
[VR 2019] Symmetrical reality: Toward a unified framework for physical and virtual reality
In this paper, we review the background of physical reality, virtual reality, and some traditional mixed forms of them. Based on the …
Zhenliang Zhang, Cong Wang, Dongdong Weng, Yue Liu, Yongtian Wang
[VR 2019] Evaluation of Maslow's hierarchy of needs on long-term use of HMDs - A case study of office environment
Long-term exposure to VR will become more and more important, but what we need for long-term immersion to meet users' fundamental needs …
Jie Guo, Dongdong Weng, Zhenliang Zhang, Yue Liu, Yongtian Wang
[SID 2019] Vision-tangible interactive display method for mixed and virtual reality: Toward the human-centered editable reality
Building a human-centered editable world can be fully realized in a virtual environment. Both mixed reality (MR) and virtual reality …
Zhenliang Zhang, Yue Li, Jie Guo, Dongdong Weng, Yue Liu, Yongtian Wang
[SID 2019] Subjective and objective evaluation of visual fatigue caused by continuous and discontinuous use of HMDs
During continuous use of displays, a short rest can relax users’ eyes and relieve visual fatigue. As one of the most important …
Jie Guo, Dongdong Weng, Zhenliang Zhang, Yue Liu, Henry B.L. Duh, Yongtian Wang
[PhotonicsAsia 2018] Depth-aware interactive display method for vision-tangible mixed reality
Vision-tangible mixed reality (VTMR) is a further development of the traditional mixed reality. It provides an experience of directly …
Zhenliang Zhang, Yue Li, Jie Guo, Dongdong Weng, Yue Liu, Yongtian Wang
[ISMAR 2018] Inverse augmented reality: A virtual agent's perspective
We propose a framework called inverse augmented reality (IAR), which describes the scenario in which a virtual agent living in the virtual …
Zhenliang Zhang, Dongdong Weng, Haiyan Jiang, Yue Liu, Yongtian Wang
[ISMAR 2018] HiKeyb: High-efficiency mixed reality system for text entry
Text entry is an imperative issue to be addressed in current entry systems for virtual environments (VEs). The entry method using a …
Haiyan Jiang, Dongdong Weng, Zhenliang Zhang, Yihua Bao, Yufei Jia, Mengman Nie
[SID 2018] Task-driven latent active correction for physics-inspired input method in near-field mixed reality applications
Calibration accuracy is one of the most important factors affecting the user experience in mixed reality applications. For a typical …
Zhenliang Zhang, Yue Li, Jie Guo, Dongdong Weng, Yue Liu, Yongtian Wang
[Optical Engineering 2018] Enhancing data acquisition for calibration of optical see-through head-mounted displays
The single point active alignment method is a widely used calibration method for optical see-through head-mounted displays (OST-HMDs) since …
Zhenliang Zhang, Dongdong Weng, Jie Guo, Yue Liu, Yongtian Wang, Hua Huang
[ChineseCHI 2018] Employing different viewpoints for remote guidance in a collaborative augmented environment
This paper details the design, implementation, and initial evaluation of a collaborative platform named OptoBridge, which is aimed at …
Hongling Sun, Yue Liu, Zhenliang Zhang, Xiaoxu Liu, Yongtian Wang
[VR 2018] Physics-inspired input method for near-field mixed reality applications using latent active correction
Calibration accuracy is one of the most important factors affecting the user experience in mixed reality applications. For a typical …
Zhenliang Zhang, Yue Li, Dongdong Weng, Yue Liu, Yongtian Wang
[VR 2018] Inverse virtual reality: Intelligence-driven mutually mirrored world
Since artificial intelligence has been integrated into virtual reality, a new branch of virtual reality, which is called inverse …
Zhenliang Zhang, Benyang Cao, Jie Guo, Dongdong Weng, Yue Liu, Yongtian Wang
[VR 2018] Evaluation of hand-based interaction for near-field mixed reality with optical see-through head-mounted displays
Hand-based interaction is one of the most widely used interaction modes in applications based on optical see-through head-mounted …
Zhenliang Zhang, Benyang Cao, Dongdong Weng, Yue Liu, Yongtian Wang, Hua Huang
[ISMAR 2017] An accurate calibration method for optical see-through head-mounted displays based on actual eye-observation model
The single point active alignment method (SPAAM) has become the basic calibration method for optical see-through head-mounted displays …
Zhenliang Zhang, Dongdong Weng, Jie Guo, Yue Liu, Yongtian Wang
[VR 2017] RIDE: Region-induced data enhancement method for dynamic calibration of optical see-through head-mounted displays
The most commonly used single point active alignment method (SPAAM) is based on a static pinhole camera model, in which it is assumed …
Zhenliang Zhang, Dongdong Weng, Yue Liu, Yongtian Wang, Xinjun Zhao
[OzCHI 2016] OptoBridge: Assisting skill acquisition in the remote experimental collaboration
In this paper, an experimental teaching platform named OptoBridge is presented, which supports the sharing of the collaborative space for …
Hongling Sun, Zhenliang Zhang, Yue Liu, Henry BL Duh
[ICVRV 2016] A modular calibration framework for 3D interaction system based on optical see-through head-mounted displays in augmented reality
With the rapid development of Virtual and Augmented Reality systems, it becomes more and more important to develop an efficient …
Zhenliang Zhang, Dongdong Weng, Yue Liu, Yongtian Wang
[ICOPEN 2015] 3D optical see-through head-mounted display based augmented reality system and its application
The combination of health and entertainment becomes possible due to the development of wearable augmented reality equipment and …
Zhenliang Zhang, Dongdong Weng, Yue Liu, Xiang Li