While Telexistence systems built around human-scale robots can remotely transmit physical movements together with interactive sensations, a sense of immersion, and presence, installing such a system in everyday situations is difficult because of its cost, complexity, and maintenance requirements. To explore how “Minimal Telexistence” that preserves users’ sense of immersion and presence could be provided in an office or collaborative working environment, we are developing a system that transmits the views of multiple omnidirectional cameras placed in a room to a remotely connected user. At the local site, the system also conveys the remote user’s presence through sound and 360-degree directional LED illumination. For the remote user, it provides the experience of freely and quickly walking, or rather leaping, around the local site with a sense of immersion, while local users can communicate with the remote user as if he or she were really there. In this research, we installed the system in our laboratory, connected it to remote places, and comprehensively observed the resulting behaviors, actions, and communications. To understand the unique experiences and phenomena the system affords, we analyzed data recorded at both the remote and local sites and conducted situated interviews with several participants. Finally, we describe how the system could be improved in future development and discuss the requirements for realizing “minimal Telexistence” with simple implementations.
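To make the “leaping” transition concrete, the sketch below is a minimal, illustrative Python model, not the authors’ implementation: it assumes each node is an omnidirectional camera whose 360-degree video is drawn on a sphere around the remote viewer, and that moving to another node crossfades the opacity of the two spheres. The node names and the render_frame hook are hypothetical placeholders for a real renderer.

```python
# Minimal sketch of a node-to-node "leap" as a linear opacity crossfade.
# Not the authors' implementation; names and the render hook are assumptions.
from dataclasses import dataclass
import time


@dataclass
class Node:
    """One omnidirectional-camera node installed somewhere at the local site."""
    name: str
    opacity: float = 0.0  # 0.0 = hidden from the viewer, 1.0 = fully shown


class LeapController:
    def __init__(self, nodes: list[Node], start: str) -> None:
        self.nodes = {n.name: n for n in nodes}
        self.current = self.nodes[start]
        self.current.opacity = 1.0

    def leap_to(self, name: str, duration: float = 0.5, fps: int = 30) -> None:
        """Crossfade from the current node's video sphere to the target's."""
        target = self.nodes[name]
        steps = max(1, int(duration * fps))
        for i in range(1, steps + 1):
            t = i / steps                   # blend factor in [0, 1]
            self.current.opacity = 1.0 - t  # fade the old view out
            target.opacity = t              # fade the new view in
            self.render_frame()
            time.sleep(duration / steps)
        self.current = target

    def render_frame(self) -> None:
        # Placeholder: a real client would update the sphere materials here.
        state = ", ".join(f"{n.name}:{n.opacity:.2f}" for n in self.nodes.values())
        print(f"render [{state}]")


if __name__ == "__main__":
    ctrl = LeapController([Node("desk"), Node("whiteboard"), Node("lounge")],
                          start="desk")
    ctrl.leap_to("whiteboard")  # the remote user "jumps" across the room
```

Because each node is only a camera plus a small transmitter, the same controller can address any number of nodes, which is what keeps the installation cheap enough to leave running in an ordinary office.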
Publication

1. Atsushi Izumihara, Daisuke Uriu, Atsushi Hiyama, and Masahiko Inami. 2019. ExLeap: Minimal and highly available telepresence system creating leaping experience. In 2019 IEEE Virtual Reality (VR).
2. Atsushi Izumihara, Atsushi Hiyama, and Masahiko Inami. 2018. Development of a telepresence system enabling movement between multiple 360-degree cameras (複数の360度カメラ間を移動できるテレプレゼンスシステムの開発). In Proceedings of the 23rd Annual Conference of the Virtual Reality Society of Japan (第23回日本バーチャルリアリティ学会大会). Retrieved from http://conference.vrsj.org/ac2018/program2018/pdf/11C-3.pdf