VTubers originally come from Japan. The term stands for "virtual YouTuber", although most VTubers are no longer found exclusively on YouTube. VTubers create online content through a digitally generated virtual avatar, and the characters are often heavily inspired by anime. Estimates put the number of active VTubers at over 10,000 in the early 2020s, and that number has grown rapidly over the last two years. VTubers use characters customised by artists and bring them to life with software such as Live2D, which allows both streamers and YouTubers to hide their real identities behind an avatar. While the main audience for VTubing currently sits outside Japan, among fans of Japanese culture and anime, there is real potential for the approach to grow beyond these origins: creators could use avatars to represent themselves rather than to disguise their identity.
1. Typically, the advantages lie in content creation. The content belongs to the streamer, because it is their own "identity" and personality that appeals to the audience rather than a separate brand.

2. VTubers often separate their personal identity from the avatar they play. This means they can have millions of views and viewers online and still go unnoticed in real life: they can walk down the street, create content under a different avatar, or attend fan conventions without attracting the attention that other creators are exposed to.

3. A content creator can control the avatar together with other people. For example, one person performs the facial expressions and gestures while another provides the voice. As long as the character is carefully crafted and its styling and personality traits are established, other actors can also take the reins. This reduces the burden on a single VTuber and may even mean that the avatar's IP can be sold.

4. If a streamer like Ninja were to sell his brand to another streamer, it would be almost impossible for the new owner to maintain or expand the IP. This problem does not exist in the world of VTubing, a world that is still in its early stages and will continue to grow in the coming years.
We are an independent development studio creating entertainment software of various kinds. Our portfolio ranges from business apps and marketing solutions to educational applications. Feel free to contact us at any time.
The Unreal Engine is a 3D game engine from Epic Games. Its range of applications now goes far beyond pure "gaming": besides film production, the engine is used in particular for real-time applications such as visualisations and VR. For the VStreaming project, the Unreal Engine offers an excellent opportunity to create a world that delivers impressive visual results "out of the box". With suitable hardware and blueprint setup, 3D avatars can be controlled via live motion capture and face tracking and streamed directly via OBS.
Unity is a 2D/3D game engine that is now also used for a wide range of real-time applications. With Unity, 3D models created in Blender, for example, can be converted into the VRM format. In this format, the avatar can be used in various programmes such as VSeeFace, Luppet and others. Converting an avatar from Blender to VRM requires the UniVRM plugin and Unity 2019.4 LTS or a newer Unity version.
VSeeFace is a free, highly configurable face- and hand-tracking programme for puppeteering VRM and VSFAvatar models. It is designed for virtual YouTubers with a focus on robust tracking and high image quality, and offers functionality similar to Luppet, 3tene, Wakaru and comparable programmes. VSeeFace runs on Windows 8 and higher. It can also send, receive and combine tracking data via the VMC protocol, which also enables iPhone "perfect sync" support through Waidayo. Face tracking, including gaze, blink, eyebrow and mouth tracking, works with a conventional webcam; a Leap Motion device is required for optional hand and gesture tracking.
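The VMC protocol mentioned above is built on OSC (Open Sound Control), a simple binary message format sent over UDP. As a rough illustration of what travels over the wire, here is a minimal, stdlib-only Python sketch that encodes an OSC message shaped like a VMC bone packet. The address `/VMC/Ext/Bone/Pos` and its argument layout (bone name plus position and rotation floats) are assumptions based on the publicly documented VMC specification; a real integration would typically use a library such as python-osc instead of hand-rolling the encoding:

```python
import struct

def osc_pad(b: bytes) -> bytes:
    # OSC strings are null-terminated and padded to a 4-byte boundary
    # (a string whose length is already a multiple of 4 still gets 4 nulls).
    return b + b"\x00" * (4 - len(b) % 4)

def osc_message(address: str, *args) -> bytes:
    """Encode a minimal OSC message supporting only string and float32 arguments."""
    typetags = ","
    payload = b""
    for a in args:
        if isinstance(a, str):
            typetags += "s"
            payload += osc_pad(a.encode("utf-8"))
        else:
            typetags += "f"
            payload += struct.pack(">f", float(a))  # OSC floats are big-endian
    return (osc_pad(address.encode("utf-8"))
            + osc_pad(typetags.encode("utf-8"))
            + payload)

# Hypothetical bone packet: bone name, position x/y/z, rotation quaternion x/y/z/w
packet = osc_message("/VMC/Ext/Bone/Pos", "Head",
                     0.0, 1.5, 0.0,       # position
                     0.0, 0.0, 0.0, 1.0)  # rotation
```

Sent over UDP to a listening application, packets like this carry bone and blendshape data frame by frame; this is how programmes such as VSeeFace and Waidayo exchange tracking data.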
VTubing requires at least a webcam to control the character. Better streaming quality calls for more expensive hardware: very accurate face tracking, for example, needs an iPhone X or newer, while a Leap Motion or an HTC Vive tracker helps to visibly move hands and arms in the stream. If money is no object, a mocap suit can be used to track the entire body.
Did you enjoy the article? Feel free to share, quote or recommend our articles. If you have any questions, you are welcome to contact us or leave us a little feedback. We look forward to hearing from you!