1. VR Entertainment goes to XR metaverse
About Akihiko SHIRAI, Ph.D
- VR1.0: Dawn of Virtual Reality
- VR2.0: Era of human research in lab
- VR3.0: Era of contents explosion
- VR4.0: Dawn of XR Metaverse
Conclusion
Twitter@o_ob Video: j.mp/VRSYT
websites: https://akihiko.shirai.as
https://vr.gree.net/lab/
2. Akihiko SHIRAI, Ph.D in a Manga
From my book ”The future of game design - Science in Entertainment systems” (2013)
▶ To be Continued!!
3. Akihiko SHIRAI, Ph.D in a Manga
Manga from my book ”The future of game design - Science in Entertainment systems” (2013)
1983 [10 y.o.] Learned programming from magazines.
Built a 3D flight simulator with character graphics on a PC-6001mkII.
1988 [15 y.o.] Played NES so much he almost forgot about high-school admission.
Encountered cameras, publishing (newspaper editing), and the sciences.
1990 [18 y.o.] Worked as an arcade game operator and a paperboy.
(Financially, it was difficult to go to university directly.)
From 1992 to 1998 I studied at Tokyo Polytechnic University.
"Photo engineering" covers a lot: optics, functional chemistry,
electronics, laser holography, printing... As a club activity, I
immersed myself in art photography.
Fortunately, I encountered Photoshop and a ray-tracing program in 1994.
In my first laboratory, Prof. Yuichiro Kume gave me the opportunity
to study virtual reality, and I became feverish about real-time graphics,
the latest technology since the video game.
In 1998, I was offered a job at Namco as a planner (rare). However, I joined Canon instead, a job offered on the
recommendation of my professor. I was then assigned to the office administration section at the Fukushima Plant.
4. Akihiko SHIRAI, Ph.D in a Manga
Manga from my book ”The future of game design - Science in Entertainment systems” (2013)
In 2000, I moved to Criterion, Inc. in the UK, the developer of the
game engine "RenderWare", which was adopted as an official game
engine for PlayStation 2. So I got the chance to become a
game-development consultant within the Canon family.
It was a very good opportunity to learn about game
development and its industry.
After the launch fever of PlayStation 2, I had some hit titles thanks to the middleware,
multi-platform technologies, and shaders in the early GPU environment.
In that era the common sense was "Once graphics become greater, games become
more fun," but I also held a different hypothesis: "Exploring the VR experience
makes games more interesting and meaningful."
In 2001, after years of practice in the Japanese mass-production industry and its IT division,
I moved to Tokyo Institute of Technology as a full-time PhD student.
(Prof. Sato's lab, well known for the SPIDAR interface)
5. Fantastic Phantom Slipper (1997)
Another key technology is the "phantom sensation", a special
psychophysical phenomenon on human skin. When two mechanical
stimuli of the same intensity are applied to different locations on the
skin surface with appropriate spacing, the two stimuli fuse and
one sensation is perceived. When the intensity of one stimulus
increases, the location of the fused sensation shifts toward the
stronger stimulus. This phenomenon, known as the phantom sensation,
was discovered by Békésy, a Nobel Prize winner, in 1961. This project
was realized in 1997, before vibrotactile rumble pads became
common in game controllers.
The system was realized with a real-time optical
motion-capture system using PSDs and infrared LEDs. Two
LEDs are fixed on each slipper, and the locations and
directions of the slippers on the floor are measured in
real time. Since feet are usually on the floor,
two-dimensional measurement is sufficient for this
application.
A similar technology was later used in Nintendo's
Wii Remote, with its image sensor and sensor bar.
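The phantom-sensation rule above (the fused sensation shifts toward the stronger stimulus) is often modeled as intensity-weighted interpolation between the two actuator positions. A minimal sketch under that assumption; the function name and linear weighting are illustrative, not taken from the original system:

```python
def phantom_position(pos_a, pos_b, intensity_a, intensity_b):
    """Estimate the perceived location of a fused "phantom" stimulus.

    Two vibrotactile stimuli at pos_a and pos_b (floor coordinates)
    are perceived as one sensation whose location shifts toward the
    stronger stimulus.  A simple linear-weighting model is assumed
    here; the real psychophysical mapping is more complex.
    """
    total = intensity_a + intensity_b
    if total == 0:
        raise ValueError("at least one stimulus must be non-zero")
    w = intensity_b / total  # 0 -> at pos_a, 1 -> at pos_b
    return tuple(a + w * (b - a) for a, b in zip(pos_a, pos_b))

# Equal intensities: the sensation is perceived midway between the feet.
print(phantom_position((0.0, 0.0), (0.3, 0.0), 1.0, 1.0))  # (0.15, 0.0)
```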
7. Fantastic Phantom Slipper (1997)
400 fps mocap (built in-house)
Dynamic phantom sensation (built in-house)
DirectX 5 graphics (built in-house)
Hemisphere dome screen (built in-house)
Vibrator-equipped slippers (built in-house)
Driven by a 300 MHz CPU + VGA projector
Prototype, handmade, in the lab
8. Tangible Playroom: Penguin Hockey (2001-2003)
Demo at SIGGRAPH KIDS 2003, Best Multilingual and/or Omnilingual award.
"Tangible Playroom" was designed as a future computer
entertainment system for children. It provides virtual
reality content with safe force feedback and a large
floor image. Players interact with the content's world
using their full body. To feel force feedback, the player grasps a
tangible (graspable, perceptible by touch) grip like a cork
ball. The grip is linked to encoder motors by strings, so it can
input a 3D location to the system. When the player touches an
object, the system calculates the correct force feedback and
delivers it to the player via the strings and the tangible grip.
Our system provides high-quality force representation more
safely than a metal linked-arm system.
"Penguin Hockey" is a typical demonstration content: a
simple 3D hockey game played with autonomous penguins.
The pucks are snowmen; to score points, players throw
snowmen into the enemy's goal hole. Our software has a real-time
dynamics simulator, so children can interact with
computer-generated characters that react realistically.
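The grip's 3D position can be recovered from the string lengths measured by the encoder motors, essentially trilateration from the known motor anchor points (the SPIDAR family of devices works this way). A self-contained sketch; the anchor geometry and function names are illustrative assumptions, not the actual system:

```python
import math

def _det3(m):
    # Determinant of a 3x3 matrix given as rows of 3 numbers.
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
            - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
            + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

def trilaterate(anchors, lengths):
    """Recover a grip position from four string lengths.

    anchors: four known motor positions (x, y, z); lengths: the
    measured string length from each motor to the grip.  Subtracting
    the sphere equations pairwise gives a 3x3 linear system, solved
    here with Cramer's rule.
    """
    (x0, y0, z0), d0 = anchors[0], lengths[0]
    rows, rhs = [], []
    for (xi, yi, zi), di in zip(anchors[1:], lengths[1:]):
        rows.append((2 * (xi - x0), 2 * (yi - y0), 2 * (zi - z0)))
        rhs.append((xi**2 - x0**2) + (yi**2 - y0**2) + (zi**2 - z0**2)
                   - di**2 + d0**2)
    d = _det3(rows)
    coords = []
    for col in range(3):
        m = [list(r) for r in rows]
        for row, b in zip(m, rhs):
            row[col] = b  # replace one column with the right-hand side
        coords.append(_det3(m) / d)
    return tuple(coords)

# Four motors on a hypothetical 2 m frame; grip at the centre (1, 1, 1).
anchors = [(0, 0, 0), (2, 0, 0), (0, 2, 0), (0, 0, 2)]
lengths = [math.dist(a, (1, 1, 1)) for a in anchors]
print(trilaterate(anchors, lengths))  # ~ (1.0, 1.0, 1.0)
```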
9. Akihiko SHIRAI, Ph.D in a Manga (PhD to Post-Doc era)
Manga from my book ”The future of game design - Science in Entertainment systems” (2013)
2003-2005 [Post-doc in NHK-ES]
Axi-Vision & Projection Lighting for next-generation digital
TV broadcasting and production environments.
2005-2007 [Post-doc in Laval France]
ENSAM Presence & Innovation Lab, in Laval France.
Studying theme park attraction development and social
deployment through “Laval Virtual ReVolution” exposition.
2008-2010 [Science Communicator in Museum "Miraikan"]
At a national science museum in the Tokyo Bay area, he
learned the specialized skills of communicators, interpreters,
and service engineering through the science communicator
training program, and developed exhibits that bring
entertainment ideas into the world of science.
10. “Entertainment Systems”
●Definition (in the post-doc era)
○Computer systems designed to affect
human amusement.
○Video games, media arts, real-time interactive art, and
entertainment virtual reality can be included.
○Cinema and DVDs can also be included, but the focus
should be on "computer system" qualities such as interactivity.
11. [Miraikan] Songs of ANAGURA (2008-2010)
In the Miraikan permanent exhibition since 2010.
https://www.miraikan.jst.go.jp/sp/anagura/
Concept design and foundational technical research.
- 10 x 16 m projection clusters
- Laser tracker clusters
- Players do not wear any tracking devices
- "Wall of Wisdom" observation console
- Unity-based content creation
12. ReVolution in Laval Virtual
Akihiko SHIRAI (Session chair of ReVolution, Japan)
● The role of ReVolution
○ Bringing experiences of new technologies and possible futures to Laval Virtual since 2006.
○ For the ReVolution exhibitor: a chance to get feedback from the VR industry and the general public.
○ The art festival "RectoVRso" has run since 2017,
so ReVolution can now focus on technology revolutions from all over the world.
● The themes
[2019] VR5.0
[2018] 1+1=∞ (one plus one equals unlimited)
[2017] TransHumanism++
[2016] REAL VIRTUALITY
[2015] Kiddy Dream in Virtual Reality
[2014] Frontier village in Virtual Reality
[2013] The NEXT BIG STEP
[2012] Virtual Reality That Moves You
[2011] Converging
[2010]
13. Akihiko SHIRAI, Ph.D in a Manga (post Post-Doc era)
Manga from my book ”The future of game design - Science in Entertainment systems” (2013)
2010-2018 [Teacher] Associate Professor
in Information Media at KAIT, Japan.
"Creating people who create!"
Details:
https://blog.shirai.la/
15. [Labo Project] Real Baby - Real Family
“Real Baby - Real Family: Holdable tangible baby VR” (SIGGRAPH 2017)
"Real Baby - Real Family" is a project aimed at
expressing love and family ties using a virtual
reality (VR) system. The paper describes the
version presented at the International collegiate
Virtual Reality Contest (IVRC 2016), reporting
the system design and implementation, the user
evaluation during the exhibition, and future
possibilities. The single-player version of
"Real Baby" exhibited at the IVRC 2016 preview
was made up of three main elements. The first
element is a holdable baby mock-up that works
without an HMD. The second is a generator of a
baby face based on a user's 2D facial image.
The third consists of visual, vocal, and haptic
feedback as well as event generation. In the
future, we hope to examine alternative
expressions of "love in family ties" with this
project.
16. [Display] 2x3D & ExPixel(2011-2016)
“2x3D: Real time shader for simultaneous 2D/3D hybrid theater” (SIGGRAPH ASIA 2012)
“ExPixel: PixelShader for multiplex-image hiding in consumer 3D flat panels” (SIGGRAPH 2014)
17. “Dynamic Persona”
From "The future of game design - Science in Entertainment systems", authored by A. Shirai.
Born in 2000. Her dream is to become a princess.
In 2014, in junior high school, she loves to play "Pokemon X/Y".
In 2016, in high school, she loves to sing Hatsune Miku songs at karaoke. Her dream is to become an idol. "Pokemon Go" taught her what AR is.
In 2018, in college, she is studying fashion. She still loves to sing at karaoke, but is a bit bored with RPGs.
In 2020, she wants to watch the Olympic Games in her cosplay costume.
The content you experience affects your life.
18. Akihiko SHIRAI, Ph.D (post educator era)
Since 2018 [Director]
GREE VR Studio Laboratory
A researcher in VR entertainment systems, Director of GREE VR Studio Laboratory at
REALITY, Inc., and invited professor at Digital Hollywood University Graduate
School. He has held various posts in this domain (Criterion RenderWare,
NHK-ES, the ENSAM Presence & Innovation Laboratory in France, and science
communicator and exhibition planner at Miraikan) and currently contributes to
exploring VR live entertainment and the XR metaverse in academic research.
https://www.dhw.ac.jp/en/contact.html
[Invited Professor]
Digital Hollywood Univ. Graduate School
19. VR Entertainment goes to XR metaverse
21. GREE, Inc. • Founded in 2004. Headquartered in Tokyo, Japan.
• Social mobile game publisher & platform.
• Listed on Tokyo Stock Exchange.
• 1,603 (group total, as of June 2021)
27. Virtual YouTuber
3D Avatar Technology
Voice Recording & Lip Sync
Motion Capture / Finger Capture
Hair & Accessories
Face Capture & Blend Shapes
Body & Cloth
Xsens MVN Animate - Products - Xsens 3D motion tracking
https://www.xsens.com/products/xsens-mvn-animate/
Perception Neuron
https://www.noitom.com/solutions/perception-neuron
Real Time Rendering
Network Broadcasting
Gifting in mobile + cloud
28. Live demo in front of 1,500 seats at the Tokyo International Forum
A live demo of only 7 minutes.
This was one of the most difficult demo sessions at SIGGRAPH.
29. [REALITY] Virtual Live App
https://reality.app
REALITY Expands Distribution to 62 Countries and Territories
Worldwide
- The number of active subscribers has increased by approximately
600% year-over-year through global expansion
GREE, Inc. July 9, 2021
https://prtimes.jp/main/html/rd/p/000000187.000021973.html
31. Avatars for Education
“Effectiveness of facial animated avatar and voice transformer in elearning programming course”
Rex Hsieh, Akihiko Shirai and Hisashi Sato, SIGGRAPH '19: ACM SIGGRAPH 2019 Posters
➔ 15 weeks, 186 students in a first-year university programming class.
➔ Students preferred a female avatar, but that group did not get the best grades :-(
➔ A teacher's gender cannot be chosen by the student!
https://dl.acm.org/doi/10.1145/3306214.3338540
37. Prototype in our lab
https://www.slideshare.net/vrstudiolab/ccse2019-gree-vr-studio-lab-vtuber
38. Amount of applause from the audience / amount of cheering from the audience
The performer's (real) LEDs glow in sync with the left and right bars.
The performers all wear Hapbeat devices, and the vibrations convey how
excited the audience is.
The audience can visually see that their cheers are reaching the performers, which
enhances the excitement.
Vibeshare::Performer at SIGGRAPH ASIA 2019 Real-Time Live!
Please watch the video here:
https://www.youtube.com/watch?v=yTrDRKazksM&t=1s
41. VibeShare consists of the following components:
(1) “Performer” shares the audience’s passion with the performer by
implementing haptics, sound, illumination LED, and projection mapping.
(2) “Vote” makes the live programs more dynamic and interactive by
allowing live voting. A live chart and live tagging also increases audience
involvement with the program.
(3) “Emote” converts audience reactions into animated emoticons to
encourage non-verbal communication between international users.
(4) "VirtualGifting" brings the current gifting ecosystem into hybrid live
performance. Developers can design an original gifting system and various
animated virtual gifts for entertaining payment.
(5) “Touch” produces haptic feedback from contact, direction, and
distance by using an algorithm.
(6) “GamePlay” provides a game system that enhances the experience
between player and audience and strengthens the bond between them.
(7) “Director” controls the program by broadcasting “Future Cue Cards,”
which can eliminate broadcast delays for the platforms, and by syncing
audience’s applause and reactions with the performer’s timeline.
(8) "Analyzer" improves a live performance based on evidence. A
powerful logger and event tracker visualizes the session.
(9) “TimeShift” enriches the archived content by sharing audiences’
emotions beyond real-time.
These components use a WebSocket connection and are selectable by
programs and experiences designed for any form of XR live entertainment.
VibeShare is a set of experimental interactive technologies for XR live
entertainment that shares the "non-verbal emotions between performer and
viewer" that were difficult to convey through conventional live streaming.
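The components above share one WebSocket connection and are selected per program. One plausible way to do that, an assumption rather than the actual VibeShare protocol, is to tag each JSON message with a component name and route it through a dispatcher:

```python
import json

class ComponentRouter:
    """Hypothetical message router for VibeShare-style components.

    Each component registers a handler, and incoming WebSocket text
    frames are dispatched by the "component" field of a JSON envelope.
    Component names follow the slide; the envelope format is assumed.
    """

    def __init__(self):
        self.handlers = {}

    def register(self, component, handler):
        self.handlers[component] = handler

    def dispatch(self, frame: str):
        msg = json.loads(frame)
        handler = self.handlers.get(msg["component"])
        if handler is None:
            raise KeyError(f"no handler for {msg['component']!r}")
        return handler(msg["payload"])

router = ComponentRouter()
router.register("Emote", lambda p: f"show emoticon {p['emotion']}")
router.register("Vote", lambda p: f"count vote for {p['choice']}")

frame = json.dumps({"component": "Emote", "payload": {"emotion": "applause"}})
print(router.dispatch(frame))  # show emoticon applause
```

Because all components share the same envelope, a program can enable any subset of them simply by registering the handlers it needs.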
https://vr.gree.net/lab/vibeshare/
43. Zone 1: introduction and character design
Zone 2: scenario design
Zone 3: filming area
WebXR: Sugoroku method
The 25th International ACM Conference on 3D Web Technology
We applied the "step-by-step" network play system of "Sugoroku" to
WebXR spatial design using avatars. A player remains in a zone before
moving on to the next stage. In a creative workshop, each zone has a
sequence of steps to follow, such as (1) creating an avatar,
(2) studying a scenario, and then (3) filming. This reduces the spread
in experience time caused by players' different skill levels.
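The zone gating described above can be sketched as a small state machine: a player advances only after completing every step of the current zone. The class and step names are illustrative, not the actual WebXR implementation:

```python
class SugorokuPlayer:
    """Sketch of the "Sugoroku method": step-gated zone progression."""

    ZONES = [
        ("Zone 1", ["create avatar"]),
        ("Zone 2", ["study scenario"]),
        ("Zone 3", ["film"]),
    ]

    def __init__(self):
        self.zone_index = 0
        self.done = set()

    @property
    def zone(self):
        return self.ZONES[self.zone_index][0]

    def complete(self, step):
        name, steps = self.ZONES[self.zone_index]
        if step not in steps:
            raise ValueError(f"{step!r} does not belong to {name}")
        self.done.add(step)
        # Advance only when every step of the current zone is finished.
        if self.done.issuperset(steps) and self.zone_index + 1 < len(self.ZONES):
            self.zone_index += 1
            self.done = set()

p = SugorokuPlayer()
p.complete("create avatar")
print(p.zone)  # Zone 2
```

Keeping slower players inside a zone until its steps are done is what equalizes the total experience time across skill levels.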
44. "Sugoroku" is a traditional Japanese board game, of which there
are two forms: one is similar to the Western game backgammon,
while the other has more in common with Snakes and Ladders. In
both cases, the "step-by-step" organization of the games, whereby
the players have to follow the steps to achieve their goals, was
of great interest to us.
45. The 25th International ACM Conference on 3D Web Technology
E-Sugoroku (1925), Creative Commons, Ōsaka Mainichi Shimbun.
Japanese "Sugoroku" board game of trains.
http://digitalmuseum.rekibun.or.jp/edohaku/app/collection/detail?id=0199002055&sr=%93%53%93%B9%8B%A3%91%88%82%B7%82%B2%82%EB%82%AD&y1=1907&y2=1927
http://ymonjo.ysn21.jp/user_data/upload/File/ags/4-3-5-010.pdf
https://en.wikipedia.org/wiki/Sugoroku#/media/File:Sugoroku2500.jpg
Sugoroku method
Sugoroku board game of Japanese trains (1925)
47. References: VibeShare and Directional Haptics
1. Yuichiro Kume. 1998. Foot interface: fantastic phantom slipper. In ACM SIGGRAPH 98
Conference abstracts and applications (SIGGRAPH '98). Association for Computing
Machinery, New York, NY, USA, 114. DOI:https://doi.org/10.1145/280953.284801
2. Yusuke Yamazaki, Shoichi Hasegawa, Hironori Mitake, and Akihiko Shirai. 2019. NeckStrap
Haptics: An Algorithm for Non-Visible VR Information Using Haptic Perception on the Neck.
In ACM SIGGRAPH 2019 Posters (Los Angeles, California) (SIGGRAPH '19). Association for
Computing Machinery, New York, NY, USA, Article 60, 2 pages.
https://doi.org/10.1145/3306214.333
3. Yusuke Yamazaki and Akihiko Shirai. 2021. Pseudo Real-Time Live Event: Virtualization for
Nonverbal Live Entertainment and Sharing. In 2021: Laval Virtual VRIC, ConVRgence
Proceedings 2021 (Laval, France) (VRIC '21). https://doi.org/10.20870/IJVR.2021.1.1.479
48. References: Map and Walking interactions
1. Makiko Suzuki Harada, Hidenori Watanave, and Shuuichi Endou: "Tuvalu Visualization Project - Net Art on
Digital Globe: Telling the Realities of Remote Places", pages 559–572, 2011.
2. Akihiko Shirai, Kiichi Kobayashi, Masahiro Kawakita, Shoichi Hasegawa, Masayuki Nakajima, and Makoto
Sato. 2004. Entertainment applications of human-scale virtual reality systems. In Proceedings of the 5th
Pacific Rim conference on Advances in Multimedia Information Processing - Volume Part III (PCM'04).
Springer-Verlag, Berlin, Heidelberg, 31–38. DOI:https://doi.org/10.1007/978-3-540-30543-9_5
3. Akihiko Shirai, Yuki Kose, Kumiko Minobe, and Tomoyuki Kimura. 2015. Gamification and construction of
virtual field museum by using augmented reality game "Ingress". In Proceedings of the 2015 Virtual
Reality International Conference (VRIC '15). Association for Computing Machinery, New York, NY, USA,
Article 4, 1–4. DOI:https://doi.org/10.1145/2806173.2806182
49. References: Avatar and metaverse in Education
1. Rex Hsieh, Akihiko Shirai, and Hisashi Sato. 2019. Evaluation of Avatar and Voice Transform in Programming
E-Learning Lectures. In Proceedings of the 19th ACM International Conference on Intelligent Virtual Agents
(IVA '19). Association for Computing Machinery, New York, NY, USA, 197–199.
DOI:https://doi.org/10.1145/3308532.3329430
2. Rex Hsieh, Akihiko Shirai, and Hisashi Sato. 2019. Effectiveness of facial animated avatar and voice
transformer in elearning programming course. In ACM SIGGRAPH 2019 Posters (SIGGRAPH '19). Association for
Computing Machinery, New York, NY, USA, Article 82, 1–2. DOI:https://doi.org/10.1145/3306214.3338540
3. Liudmila Bredikhina, Toya Sakaguchi, and Akihiko Shirai. 2020. Web3D Distance Live Workshop for Children in
Mozilla Hubs. In The 25th International Conference on 3D Web Technology (Virtual Event, Republic of
Korea) (Web3D '20). Association for Computing Machinery, New York, NY, USA, Article 27, 2 pages.
https://doi.org/10.1145/3424616.342472
4. Stewart Culin. 1920. The Japanese Game of Sugoroku. The Brooklyn Museum Quarterly 7, 4 (1920), 213–233.
http://www.jstor.org/stable/2645
50. References: XR Live Entertainment
1. Akihiko Shirai. 2019. REALITY: broadcast your virtual beings from
everywhere. In ACM SIGGRAPH 2019 Appy Hour (SIGGRAPH '19). Association
for Computing Machinery, New York, NY, USA, Article 5, 1–2.
DOI:https://doi.org/10.1145/3305365.3329727
2. Bredikhina, Liudmila et al. “Avatar Driven VR Society Trends in Japan.”
2020 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts
and Workshops (VRW) (2020): 497-503.
3. Akihiko Shirai, et al. 2019. Global Bidirectional Remote Haptic Live
Entertainment by Virtual Beings. ACM SIGGRAPH ASIA 2019 Real-Time Live!
51. Collaboration with “Hapbeat” through internship
Product
Technical Info
Hapbeat introduction movie:
https://www.youtube.com/watch?v=DAXH5MMpBDY
Paper link to the origin of Hapbeat:
"Tension-Based Wearable Vibroacoustic Device for Music Appreciation"
https://bit.ly/3kicJif
Hapbeat-Duo is an easy-to-wear necklace-type device that can present
powerful, impactful vibration.
Hapbeat-Duo has two coreless DC motors linked by a neck strap. The dynamic
range of frequency and amplitude is high enough for the modulation to be
perceived. Vibrations on either side of the neck do not resonate with each
other, because the connector is not a rigid body but a flexible neck strap
made of satin ribbon. Therefore, the device can render distinguishable
vibration waves on both sides of the neck independently.
Hapbeat-Duo
52. Description of HapticMineSweeper
Neck Strap Haptics: An Algorithm for Non-Visible VR Information Using
Perceptual Characteristics of the Neck (SIGGRAPH 2019)
Please watch the video here:
https://youtu.be/AgmFWBu5ZnM
A haptic rendering algorithm that dynamically modulates wave parameters to convey distance, direction, and object type
by utilizing neck perception and the Hapbeat-Duo, a haptic device composed of two actuators linked by a neck strap.
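One plausible mapping in this spirit (a sketch, not the published algorithm): let the target's direction set the left/right amplitude balance across the two actuators, and its distance set the overall intensity. The function name, falloff, and panning law are illustrative assumptions:

```python
import math

def neck_haptic_amplitudes(bearing_deg, distance, max_dist=10.0):
    """Map a target's direction and distance to left/right actuator gains.

    bearing_deg: target direction relative to the user's facing
    (-90 = left, 0 = ahead, +90 = right); distance in metres.
    The left/right balance encodes direction and the overall level
    encodes proximity.  Illustrative mapping, not the published one.
    """
    # Closer targets vibrate harder (linear falloff, clamped to [0, 1]).
    level = max(0.0, 1.0 - distance / max_dist)
    # Pan between the two actuators with a constant-power law.
    pan = max(-1.0, min(1.0, bearing_deg / 90.0))  # -1 left .. +1 right
    left = level * math.cos((pan + 1.0) * math.pi / 4.0)
    right = level * math.sin((pan + 1.0) * math.pi / 4.0)
    return left, right

# Target dead ahead at 5 m: equal gains on both sides of the neck.
l, r = neck_haptic_amplitudes(0.0, 5.0)
print(round(l, 3), round(r, 3))  # 0.354 0.354
```

Object type could then be encoded on a third parameter, such as the vibration waveform or pulse pattern, independently of the gains.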
55. VR Entertainment goes to XR metaverse
57. Team1 “Praying Tour”
(Izumi, Massu, Tiger)
Concept : Japanese historical places such as shrines and temples.
Kyoto
Sumida River
Asakusa
Sumida River
Kaminarimon
Fushimiinari Shrine
Kinkakuji Temple
Kiyomizu Temple
Sensoji Temple
58. Team 2 “Instagenic Tour”
(Hiroka Iwamoto, Takaaki Ogiso, Takashi Soda)
Concept : Let's take Instagenic photos in Yokohama!
Kanagawa Historical
Museum
Yokohama Red Brick
Warehouse
Osanbashi
59. Team 3 “Tokyo in the sky”
(Yuki Tanaka, Shunki Tamura, Chen Yu-An)
Concept: enjoying both the day and night views of Tokyo, experiencing a traditional Japanese shopping street,
and visiting a Japanese temple.
Tokyo Tower
Tokyo Skytree
62. [Review-1] Good sight, but...
Kinkakuji Temple. I like that people holding their smartphones appear in the shot. On the other hand, I wonder
how this will be assembled as a mission. Famous historic sites look photogenic, but they tend to become
"just looking", so think more deeply about designing them as experiences.
63. [Review-1] Asakusa - Atmosphere - Ending
From the bustle of the Nakamise shopping street to the students on a school trip and prayers under the
chrysanthemum crest. A good choice, with a contrast in the "atmosphere of the place". On the other hand,
the ending was a little unclear; design the closing of the "Praying Tour" together with the player's
actions and mindset.
69. Overall comments:
We believe the teams achieved the experience of creating the basics of
tour content in a short time.
It is great to have had a virtual date experience (regardless of gender or
age). Conversation is also important.
Wasn't trying out tour navigation also a good experience as work experience?
70. How can we enhance it as XR metaverse UX?
To make it better as an XR metaverse UX:
"Create an experience that is not just looking."
"A very personal point of view, where the player knows this is a special experience."
"Achieve something together at the same time" → "Take a photo together!"
(which is not the same as taking a photo of the Tower).
"Design the before and after of the experience:
what change in understanding will it produce?"
71. Making some bridges to the program syllabus
Global Awareness for Technology Implementation(GATI 6)
[As an introduction...]
- Interact and discuss thoroughly with Thai/Japanese students to learn their ways of thinking.
- Expand global knowledge about xR technology and its application in the new-normal society.
- Make friends in Thailand/Japan in order to visit there in the future.
[Theme] "Social implementation of xR technology in the new-normal society"
In other words, experience the social implementation of near-future xR technology for DX, and create a PoC to put
ideas into action. Choose a common xR technology field, find an issue, come up with a concept of how best to
implement it in society, and create a video to prove it. This time the field was actually "tourism under the
COVID-19 pandemic", but it could be your own field, such as medicine, transportation, or education. You can work
on it by contrasting Japanese and Thai societies, or show one solution that aims at DX in both. You can also
introduce it from other perspectives, such as entertainment, culture, or digital content, as in this case, but
consider a clear target persona and the common/particular issues of Japan and Thailand.