Version: 1.x

Neighborhood Showcase

We have built a simple showcase based on Unity and Mirror Networking: players run around a foggy neighborhood and can experience 3D voice, adjusted to their position in 3D space, as well as a position-independent walkie-talkie with multiple channels.

In addition to voice, this demo leverages UserData to sync each player's location on the map with the ODIN servers. A simple web app shows player positions, whether they are talking, and which direction they are running. You can also talk to other players directly from within the Commander App.

Screenshots

These shots give you an impression of the demo. Try to find your friends and colleagues in this foggy world.

Downloads

You can download the showcase demo with the URLs below. Try it with your friends and have fun :-).

| Platform        | Size   | Download |
| --------------- | ------ | -------- |
| Windows x86_64  | 1.1 GB | Download |
| Linux amd64     | 1.1 GB | Download |
| macOS Universal | 1.1 GB | Download |

Commander

In this web app, you can see all players that are active within the demo showcase. Download the demo, run it, and you'll see yourself running around in the app below. This app is built with the Web SDK.

You can run the demo standalone on your mobile device using this link:

Open commander in new tab

As you can see, ODIN integrates easily anywhere. Check out this live demo, embedded directly into our developer documentation.

Whitepaper

Some technical details about how everything works.

Showcase Demo

This demo is built with Unity and Mirror Networking. A dedicated server runs in our Fleet Server Hosting Service, and the default IP address in the Lobby points to it. You can also host your own server-client combo by clicking the Host button in the Lobby.

While building this app, we simply followed our own guide on how to integrate ODIN into Unity and Mirror Networking. Follow that guide to learn how the basic network topology is designed and how 3D positional audio is integrated into the game. Don't worry, it's a short read: the whole process is simple and straightforward.

The demo features two ways to talk to each other: 3D positional audio, where volume and direction are adjusted automatically based on the positions of the other players in the scene, and walkie-talkie communication with audio effects and multiple channels. Our Event handling guide describes how to handle multiple voice rooms simultaneously.
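To build intuition for what Unity does automatically here, the distance and direction logic can be sketched roughly like this (illustrative only — the demo relies on Unity's built-in spatialization, and the linear falloff below is an assumption, not the engine's actual rolloff curve):

```typescript
// Illustrative only: Unity's spatializer does all of this (and more) for you.
interface Vec2 {
  x: number;
  y: number;
}

function positionalAudio(listener: Vec2, speaker: Vec2, maxDistance: number) {
  const dx = speaker.x - listener.x;
  const dy = speaker.y - listener.y;
  const distance = Math.hypot(dx, dy);

  // Assumed linear falloff: full volume up close, silent at maxDistance.
  const volume = Math.max(0, 1 - distance / maxDistance);

  // Pan from the horizontal offset: -1 = hard left, +1 = hard right
  // (ignores the listener's heading to keep the sketch short).
  const pan = distance > 0 ? dx / distance : 0;

  return { volume, pan };
}
```

The walkie-talkie channels skip this step entirely: their volume is constant regardless of where the other player stands.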

Commander App

The commander app is a very basic web application built with Angular 13, integrating our ODIN Web SDK. It's based on our Angular Demo, which is available in our GitHub repository.

Sending Player Positions

To integrate ODIN into Mirror Networking, we leverage User Data to map Mirror Networking IDs to ODIN Peer IDs (so the voice of Player A is attached to the corresponding player game object in the scene).
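Conceptually, this mapping is a lookup from the network id carried in the user data to the player object in the scene. A minimal sketch (the registry and helper names here are illustrative, not the demo's actual code):

```typescript
// Illustrative sketch (not the demo's actual helper): resolve a player
// object from an ODIN peer's user data. In this demo, the "seed" field
// carries the Mirror Networking id of the player.
interface PlayerUserData {
  name: string;
  seed: string;
}

// Hypothetical registry of spawned players, keyed by network id.
const playersByNetworkId = new Map<string, { name: string }>();
playersByNetworkId.set("14", { name: "John Mclain" });

function getPlayerForOdinPeer(userData: PlayerUserData) {
  // The seed links the ODIN peer back to its Mirror player object;
  // undefined means this peer has no player in the scene (a spectator).
  return playersByNetworkId.get(userData.seed);
}
```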

For the Commander App, we extended the PlayerUserDataJsonFormat class with a few additional properties:

PlayerUserDataJsonFormat.cs

```csharp
[Serializable]
public class PlayerUserDataJsonFormat : IUserData
{
    public string name;
    public string seed;

    // These are new
    public float xPosition;
    public float yPosition;
    public float heading;
    public bool spatialTalking;
    public bool walkieTalkieTalking;
}
```

Then, ten times a second, we call the UpdatePlayerPosition function on our GameManager singleton. It creates a JSON representation of the PlayerUserDataJsonFormat and sends it to the ODIN server, which keeps this user data in sync across all clients:

Updating player position

```csharp
public void UpdatePlayerPosition(PlayerController controller)
{
    // Update heading, position (top-down) and talking indicators
    playerUserData.xPosition = controller.transform.position.x;
    playerUserData.yPosition = controller.transform.position.z;
    playerUserData.heading = controller.transform.eulerAngles.y;
    playerUserData.spatialTalking = controller.spatialTalking;
    playerUserData.walkieTalkieTalking = controller.walkieTalkieTalking;

    // Make sure we only send data if it has changed, and only every 100 ms
    if (Time.time - _lastPositionUpdateTime > updatePositionIntervalInSeconds)
    {
        // Compare the serialized form so unchanged data is not re-sent
        string serialized = playerUserData.ToString();
        if (serialized != _lastSentUserData)
        {
            // Send the JSON representation of the user data to the ODIN server
            OdinHandler.Instance.UpdateUserData(playerUserData.ToUserData());
            _lastPositionUpdateTime = Time.time;
            _lastSentUserData = serialized;
        }
    }
}
```

User Data is a very powerful feature. You define a data structure to attach to each peer in the ODIN room, and ODIN ensures this data is shared with each client connected. User Data can be anything, as internally, it’s just a byte array. JSON is a good format as it’s cross-platform, easy to process, and human-readable so you can check via console logs if everything works as expected.
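Since the data travels as plain bytes, any client can decode it as long as both sides agree on the encoding. A minimal sketch of the JSON round trip in a web client:

```typescript
// User data is just bytes on the wire; JSON is one convenient encoding.
const userData = {
  name: "John Mclain",
  xPosition: -21.31,
  spatialTalking: true,
};

// Encode before sending to the ODIN server...
const bytes: Uint8Array = new TextEncoder().encode(JSON.stringify(userData));

// ...and decode on any receiving client, regardless of platform.
const decoded = JSON.parse(new TextDecoder().decode(bytes));
```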

Leveraging player positions

Our web-based commander app connects to the same ODIN room. Once connected, it receives PeerUserDataChanged events containing the JSON representation built within our Unity application. This user data looks like this:

```json
{
  "heading": 109.49986267089844,
  "name": "John Mclain",
  "seed": "14",
  "spatialTalking": true,
  "walkieTalkieTalking": false,
  "xPosition": -21.31228256225586,
  "yPosition": -0.24028445780277252
}
```

Using this data, we map the position in Unity units to a 2D position on the top-down map shown in the commander. Then we use the CSS properties left and top to position the location pin, and transform: rotate(..deg) to rotate the direction indicator.
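That mapping boils down to a scale and an offset per axis. A small sketch, where MAP_SCALE and MAP_CENTER are assumed placeholder values, not constants from the demo:

```typescript
// Assumed map constants — tune these to your map image.
const MAP_SCALE = 4; // pixels per Unity unit (assumption)
const MAP_CENTER = { x: 256, y: 256 }; // pixel position of the world origin (assumption)

interface PinData {
  xPosition: number;
  yPosition: number;
  heading: number;
}

function pinStyle(data: PinData) {
  return {
    left: `${MAP_CENTER.x + data.xPosition * MAP_SCALE}px`,
    // Screen y grows downward, so the world y axis is flipped.
    top: `${MAP_CENTER.y - data.yPosition * MAP_SCALE}px`,
    transform: `rotate(${data.heading}deg)`,
  };
}
```

Applying the returned object to the pin element's style is then all the positioning logic the commander needs.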

A simple icon within the direction pin reflects if the user is talking.

Commander’s voice

We also wanted the commander to talk to players to showcase ODIN’s multiplatform support. Everyone can talk with everyone, regardless of OS, framework, or ecosystem, within the same ODIN room (with permissions handled by our token system if needed).

warning

We disabled audio output in this demo, so the commander can say something to players in the game but cannot hear them. As players might not be aware of the commander app, we did not want to expose their voices to everyone with an internet connection. Our token system allows fine-grained permission handling to ensure only authorized users can hear others, but since this is a simple tech demo, we kept it straightforward.

Keep the lower button pressed and start talking. All players in the game will hear your voice with a nice audio effect.

When connecting to a room, we create an OdinMedia object but don't start it yet. Everything is prepared, but no data is sent to ODIN servers:

```typescript
odinRoom.createMedia(mediaStream).then((media) => {
  this.ownMedia = media;
});
```

When the button is pressed, the startTalking function is called, and once the button is released, the stopTalking function is called, stopping the media. The mic stays active, but no data is sent to ODIN anymore; the user is muted again.

```typescript
startTalking() {
  if (this.ownMedia) {
    this.ownMedia.start();
  }
}

stopTalking() {
  if (this.ownMedia) {
    this.ownMedia.stop();
  }
}
```

When the commander app starts, the user joins the ODIN room as a “spectator.” We need to handle that within the game differently than if a “real” user joins the room with a player object in the scene. Whenever a peer joins a room, all clients receive a PeerJoined event.

While it might look complex at first, the logic is simple:

  • If a peer joins one of the walkie-talkie rooms, we attach the user’s voice (a dynamic AudioSource provided by the ODIN SDK) to a walkie-talkie game object of the player’s “skin.”
  • If a user joins the “ShooterSample” room (just a name), we find a player with a presence in the game and attach the voice to this player object. This way, sound is positioned in 3D space, and Unity processes audio so that volume and direction reflect the position in 3D space.
  • If there is no corresponding player in the scene, this is a spectator (from a web app, iOS/Android app, or even someone who launches the game and joins as a spectator). In this case, we attach the voice to any object in the scene and set spatialBlend to 0.0f so that the volume is always the same (i.e., god’s voice).
OnMediaAdded

```csharp
public void OnMediaAdded(object sender, MediaAddedEventArgs eventArgs)
{
    Room room = sender as Room;
    Debug.Log($"ODIN MEDIA ADDED. Room: {room?.Config.Name}, PeerId: {eventArgs?.PeerId}, MediaId: {eventArgs?.Media.Id}");

    // Check if this is 3D sound or Walkie Talkie
    if (room.Config.Name.StartsWith("WalkieTalkie"))
    {
        // A player connected Walkie Talkie. Attach to the local player's Walkie Talkie
        var localPlayerController = GameManager.Instance.GetLocalPlayerController();
        if (localPlayerController && localPlayerController.walkieTalkie)
        {
            PlayerUserDataJsonFormat userData = PlayerUserDataJsonFormat.FromUserData(eventArgs.Peer.UserData);
            PlayerController player = GetPlayerForOdinPeer(userData);
            if (player)
            {
                AttachWalkieTalkiePlayback(localPlayerController, player, room, eventArgs.PeerId, eventArgs.Media.Id);
            }
            else
            {
                Debug.LogWarning("Attaching Walkie Talkie failed, could not find player");
            }
        }
    }
    else
    {
        // This is 3D sound: find the player object for this stream and attach the Audio Source to it
        if (!eventArgs.Peer.UserData.IsEmpty())
        {
            PlayerUserDataJsonFormat userData = PlayerUserDataJsonFormat.FromUserData(eventArgs.Peer.UserData);
            PlayerController player = GetPlayerForOdinPeer(userData);
            if (player)
            {
                AttachOdinPlaybackToPlayer(player, room, eventArgs.PeerId, eventArgs.Media.Id);
            }
            else
            {
                Debug.Log("Spectator with user data joined");
                AttachSpectator(room, eventArgs.PeerId, eventArgs.Media.Id);
            }
        }
        else
        {
            Debug.Log("Spectator joined");
            AttachSpectator(room, eventArgs.PeerId, eventArgs.Media.Id);
        }
    }
}
```