
Multiplayer WebXR Readyplayer.me Avatars (part 3)

An explanation of how the Front End and WebGL were built to enable multiplayer WebXR readyplayer.me avatars


In the last blog post, I detailed how the Back End and Infrastructure were set up to enable real-time multiplayer WebXR experiences with readyplayer.me avatars (like in this GIF).

In this post I’ll go through the changes made in the Front End to enable multiplayer real-time avatars with Ready Player Me.

Ready Player Me Iframe

The first step is to create the iframe that connects to readyplayer.me’s API and opens a modal that allows users to create and select the avatar they will use in your app.

Once they’ve gone through this process, a URL pointing to a GLB file is returned to your app; this contains the user’s avatar, which can then be rendered in WebGL.

We store this URL using setAvatar, and it will be available in our app’s state as avatar (you will see this further down in this blog post).
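The avatar store itself isn’t covered in this post. As a rough idea, here is a minimal sketch of what ./stores/avatar could look like, assuming a zustand-style store; the exact fields and defaults are assumptions rather than the repo’s actual code.

// Hypothetical sketch of ./stores/avatar; the real store in the repo may differ.
import { create } from "zustand";

const avatarStore = create((set) => ({
  avatar: null,       // GLB URL returned by Ready Player Me
  userMode: "image",  // "image" (2D plane) or "avatar" (3D model)
  showIFrame: true,   // whether the Ready Player Me modal is visible
  setAvatar: (avatar) => set({ avatar }),
  setUserMode: (userMode) => set({ userMode }),
  setShowIFrame: (showIFrame) => set({ showIFrame }),
}));

export default avatarStore;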

				
					import React, { useEffect, useRef, useState } from "react";
import styled from "styled-components";
import avatarStore from "./../../stores/avatar";

export default function RpmPopUp() {
  const iFrameRef = useRef(null);
  const { setAvatar, setUserMode, showIFrame, setShowIFrame } = avatarStore();

  // Point the iframe at your Ready Player Me subdomain once it has mounted
  useEffect(() => {
    let iFrame = iFrameRef.current;
    if (iFrame && process.env.ready_player_me) {
      iFrame.src = `https://${process.env.ready_player_me}.readyplayer.me/avatar?frameApi`;
    }
  }, []);

  // Listen for messages posted by the Ready Player Me iframe; clean up on unmount
  useEffect(() => {
    window.addEventListener("message", subscribe);
    document.addEventListener("message", subscribe);
    return () => {
      window.removeEventListener("message", subscribe);
      document.removeEventListener("message", subscribe);
    };
  }, []);

  function subscribe(event) {
    const json = parse(event);
    if (json?.source !== "readyplayerme") {
      return;
    }
    // Subscribe to all events sent from Ready Player Me
    // once frame is ready
    if (json.eventName === "v1.frame.ready") {
      let iFrame = iFrameRef.current;
      if (iFrame && iFrame.contentWindow) {
        iFrame.contentWindow.postMessage(
          JSON.stringify({
            target: "readyplayerme",
            type: "subscribe",
            eventName: "v1.**",
          }),
          "*"
        );
      }
    }
    // Get avatar GLB URL
    if (json.eventName === "v1.avatar.exported") {
      setAvatar(
        `${json.data.url}?quality=medium&useDracoMeshCompression=true&useMeshOptCompression=true`
      );
      setUserMode("avatar");
      setShowIFrame(false);
    }
    // Get user id
    if (json.eventName === "v1.user.set") {
      console.log(`User with id ${json.data.id} set:
      ${JSON.stringify(json)}`);
    }
  }

  return (
    <AppIframe>
      <iframe
        allow="camera *; microphone *"
        className="iFrame"
        id="frame"
        width="100%"
        height="90%"
        ref={iFrameRef}
        style={{
          display: `${showIFrame ? "block" : "none"}`,
        }}
        title={"Ready Player Me"}
      />
    </AppIframe>
  );
}

function parse(event) {
  try {
    return JSON.parse(event.data);
  } catch (error) {
    return null;
  }
}

const AppIframe = styled("div")`
  width: 100%;
  height: 100%;
`;
				
			

Avatar Visualisation

Now that we have the 3D model, we can visualise this avatar in WebGL.

Depending on the user’s settings and the type of GLB provided, we will need to render it differently.

The following logic demonstrates how to do this.

The Avatar Controller

In this file we first check the user’s userMode preference, whether it is image (a 2D graphic on a plane) or avatar (the 3D model from readyplayer.me), and then render the appropriate component for that avatar.

An image avatar only requires the image to be displayed, along with the position and rotation co-ordinates.

A non-VR 3D model avatar renders the loaded GLB file with its loaded animations, using that user’s movement and body co-ordinates.

A VR 3D model doesn’t require animations or the user’s movement, and instead uses the left and right hand co-ordinates.

 

				
					import React from "react";
import { useLoader } from "@react-three/fiber";
import * as THREE from "three";
import AvatarModel from "./AvatarModel";
import AvatarImage from "./Image";
import { GLTFLoader } from "three/examples/jsm/loaders/GLTFLoader";
import { FBXLoader } from "three/examples/jsm/loaders/FBXLoader";
import { DRACOLoader } from "three/examples/jsm/loaders/DRACOLoader";
import { useFBX } from "@react-three/drei";
import VRAvatarModel from "./VRAvatarModel";
const defaultPosition = {
  position: new THREE.Vector3(0, 0, 0),
  rotation: new THREE.Vector3(0, 0, 0),
};

const Avatar = (props) => {
  const body = props.body ? props.body : defaultPosition;
  const leftHand = props.leftHand ? props.leftHand : defaultPosition;
  const rightHand = props.rightHand ? props.rightHand : defaultPosition;

  if (props.userMode === "image") {
    return (
      <AvatarImage
        position={[body.position.x, 0, body.position.z]}
        rotation={[body.rotation.x, body.rotation.y, body.rotation.z]}
        image={props.image}
      />
    );
  } else if (props.userMode === "avatar" && props.avatar) {
    const gltf = LoadModel(props.avatar);
    // GLBs exposing an AvatarRoot node are treated as VR avatars (body and hands)
    const model = gltf.nodes.AvatarRoot ? gltf.nodes : gltf.scene;
    const animations = loadAnimations(["idle", "run", "jump"]);

    if (props.movement && !gltf.nodes.AvatarRoot) {
      return (
        <AvatarModel
          model={model}
          animations={animations}
          body={body}
          movement={props.movement}
        />
      );
    } else if (gltf.nodes.AvatarRoot && !props.activeUser) {
      return (
        <VRAvatarModel
          model={model}
          body={body}
          leftHand={leftHand}
          rightHand={rightHand}
        />
      );
    }
  }
  // Nothing to render until an avatar or image is available
  return null;
};

// Load the Ready Player Me GLB, with Draco decoding for the compressed mesh
const LoadModel = (avatar) => {
  const { nodes, scene } = useLoader(GLTFLoader, avatar, (loader) => {
    const dracoLoader = new DRACOLoader();
    dracoLoader.setDecoderPath("/draco/");
    loader.setDRACOLoader(dracoLoader);
  });
  return { nodes, scene };
};

// Load each FBX animation clip and name it after its source file
const loadAnimations = (animationArray) => {
  const clips = animationArray.map((animation) => {
    useFBX.preload(`./${animation}.fbx`);
    const { animations } = useLoader(FBXLoader, `./${animation}.fbx`);
    animations[0].name = animation;
    return animations[0];
  });
  return clips;
};

export default Avatar;
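The Avatar component above is rendered once per connected user. The exact wiring lives in the repo, but as a rough illustration, the scene could map over the networked users’ state along these lines (the users object and its field names are assumptions):

// Hypothetical usage; the real scene component in the repo may differ.
// Assumes a `users` map kept in sync by the real-time layer from part 2.
import React from "react";
import Avatar from "./Avatar";

const RemoteAvatars = ({ users }) => (
  <>
    {Object.entries(users).map(([id, user]) => (
      <Avatar
        key={id}
        userMode={user.userMode}   // "image" or "avatar"
        avatar={user.avatar}       // Ready Player Me GLB URL
        image={user.image}         // 2D image fallback
        body={user.body}           // { position, rotation }
        leftHand={user.leftHand}   // VR only
        rightHand={user.rightHand} // VR only
        movement={user.movement}   // { forward, jump } flags
        activeUser={false}         // remote users are never the active user
      />
    ))}
  </>
);

export default RemoteAvatars;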
				
			

Avatar Model

Looking further at the logic that loads a 3D model, this component updates the model’s animation based on the user’s movement: if they are moving forwards the model runs, and if the space bar is pressed the model jumps.

In addition to this, the position and rotation co-ordinates are used to place the model within the Three.js scene.

				
					import React, { useRef, useEffect } from "react";
import { useAnimations } from "@react-three/drei";
import deviceStore from "../../../stores/device";

const AvatarModel = (props) => {
  const { device } = deviceStore();

  const group = useRef();
  const { actions, mixer } = useAnimations(props.animations, group);
  // Treat the jump action as paused until it is available
  const paused = actions["jump"] !== undefined ? actions["jump"].paused : true;
  // Use a fixed vertical position for the model when the viewer is not in VR
  if (device !== "webVR") props.body.position.y = -1.2;

  useEffect(() => {
    if (props.movement && props.movement.jump) {
      if (paused) actions["jump"].stop();
      actions["jump"].play();
      actions["jump"].clampWhenFinished = true;
      actions["jump"].repetitions = 0;
    }
    if (props.movement.forward === true) {
      if (paused) actions["jump"].stop();
      actions["idle"].stop();
      actions["run"].play();
    } else if (props.movement.forward === false) {
      actions["run"].stop();
      actions["idle"].play();
    }
  }, [mixer, props.movement, paused, device]);

  return (
    <primitive
      object={props.model}
      position={[
        props.body.position.x,
        props.body.position.y,
        props.body.position.z,
      ]}
      rotation={[
        props.body.rotation.x,
        props.body.rotation.y,
        props.body.rotation.z,
      ]}
      ref={group}
    />
  );
};

export default AvatarModel;
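The movement prop isn’t defined in this component; it is assumed to be a small object of flags kept in state and synced over the network. A minimal illustration of the shape AvatarModel reads:

// Hypothetical shape of the `movement` prop that drives the animation switches.
// In the real app these flags would be set from user input and synced over the network.
const movement = {
  forward: false, // true while the user moves forwards, so the "run" clip plays
  jump: false,    // true when the space bar is pressed, so the "jump" clip plays
};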
				
			

Avatar VR Model

A 3D model used in VR has no need to play animations; instead it just updates the co-ordinates within Three.js for where the body and hands are expected to be.

				
					import React, { useRef } from "react";

const VRAvatarModel = (props) => {
  const { AvatarRoot, LeftHand, RightHand } = props.model;
  if (props.leftHand && LeftHand) setHandPosition(LeftHand, props.leftHand);
  if (props.rightHand && RightHand) setHandPosition(RightHand, props.rightHand);

  const group = useRef();

  return (
    <primitive
      object={AvatarRoot}
      position={[
        props.body.position.x,
        props.body.position.y,
        props.body.position.z,
      ]}
      rotation={[
        props.body.rotation.x,
        props.body.rotation.y,
        props.body.rotation.z,
      ]}
      ref={group}
    />
  );
};

const setHandPosition = (hand, { position, rotation }) => {
  hand.position.x = position.x;
  hand.position.y = position.y;
  hand.position.z = position.z;
  hand.rotation.x = rotation.x;
  hand.rotation.y = rotation.y;
  hand.rotation.z = rotation.z;
};

export default VRAvatarModel;
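setHandPosition only reads plain x, y and z values, so the leftHand and rightHand props are assumed to arrive from the networking layer as simple serialisable objects, for example:

// Hypothetical payload for one of a remote user's hands, matching what
// setHandPosition reads: plain x/y/z values for position and rotation (radians).
const leftHand = {
  position: { x: -0.3, y: 1.4, z: -0.2 },
  rotation: { x: 0, y: Math.PI / 2, z: 0 },
};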
				
			

Image Avatar

The 2D avatar renders a plane, with the passed-in image prop used as a texture applied to the plane.

				
					import React, { useRef } from "react";
import { useLoader } from "@react-three/fiber";
import * as THREE from "three";

const AvatarImage = (props) => {
  const avatarMesh = useRef();
  let avatarImage = props.image === undefined ? "jamesmiller.png" : props.image;
  const texture = useLoader(THREE.TextureLoader, `/images/${avatarImage}`);

  return (
    <mesh ref={avatarMesh} {...props}>
      <planeGeometry attach="geometry" args={[0.5, 0.5]} />
      <meshBasicMaterial
        attach="material"
        side={THREE.DoubleSide}
        map={texture}
      />
    </mesh>
  );
};

export default AvatarImage;
				
			

Conclusion

So that’s it: these are the core components that make up the creation of 3D avatars using readyplayer.me within a real-time multiplayer web app!

The full repo can be found here. I hope this has been helpful, and in the meantime have fun making 😀
