
Getting Started

At the end of this guide, you'll be able to chat with friends and colleagues using real-time voice positioned in 3D. Completing this guide takes around 20 minutes.

info

Please note: You don't need to understand the basic concepts of ODIN to follow this guide, but it definitely helps. We do our best to explain things along the way; however, if you are not in a hurry, we recommend reading through the introduction first.

Create an empty Unity project

First, you'll need to create an empty Unity project. Please note that ODIN requires Unity version 2019.4 or later. In this guide we used version 2020.3.17f1, but feel free to use whatever you have installed. We also used Unity Hub version 3.0.0 - if you are using an earlier version of the Hub, your screen might look a bit different, but the steps are basically the same.

Creating a new 3D project in Unity with the Unity Hub.

Please choose the 3D template, select a folder on your hard disk and click on Create Project. After a few moments the Unity Editor should fire up, presenting an empty scene in a new project.

Install ODIN SDK

Next step is to install the ODIN Unity SDK in the new project.

danger

Important: Please remove the files of any previous version before installing a new one. Unity does not remove old files during installation, and this sometimes causes issues. So just remove the old package and install the latest one. 4Players ODIN also supports Apple Silicon. Check out our FAQ to learn more about the prerequisites for running ODIN on Apple Silicon.

4Players ODIN supports Unity 2019.4 or any later version. The latest version is always available in our Github repository.

There are numerous ways to install ODIN into your project. We recommend using the Package Manager.

Install via Asset Store

We are in the process of getting our package back into the Unity Asset Store. In the meantime, please use the Package Manager or the Unity Package to install the SDK.

Install via Unity Package

Please download the latest version of the ODIN Unity SDK as a .unitypackage from the [Github releases page](https://github.com/4Players/odin-sdk-unity/releases). Just double-click the .unitypackage to import it into your current Unity editor project.

Package Manager

Using the Package Manager will ensure that all dependencies are set up correctly and that you will have the most up-to-date version of the SDK. In most cases, using the Package Manager is the way to go.

To open the Package Manager, navigate to Window and then click Package Manager in your Unity Editor menu bar.

Using our Git Repository

Click the + button in the upper left and select Add package from git URL. Next, enter this URL and hit enter to import the package:

https://github.com/4Players/odin-sdk-unity.git

Using a Tarball Archive

Click the + button in the upper left and select Add package from tarball. Next, select the odin.tgz archive you've downloaded from the Github releases page to import the package.

Install the samples

In the next step, we need to install the samples so we can use them directly in the Unity editor. Open the Package Manager by navigating to Window and then Package Manager. A new window will pop up:

The Package manager

Make sure that Packages: In Project is selected in the dropdown (see highlighted box in screenshot above), otherwise click on it and choose Packages: In Project to show all packages installed in this project. In the list, you should see 4Players ODIN.

Importing the samples

Click on the 4Players ODIN item in the list on the left side. You'll see some package details. Expand the Samples collapsible to reveal the available samples and click on the Import button to install them.

Loading the sample scene

After a couple of seconds, the samples will be installed in your Assets folder as you can see in this screenshot:

ODIN sample

Navigate to the Assets/Samples/4Players ODIN/__VERSION__/Examples/Positional Audio folder in your asset browser.

tip

Please note, __VERSION__ stands for the current version number of the ODIN SDK. As you can see in the screenshot above, the latest version number was 0.4.3 when this guide was written.

Double click on the sample scene Sample3dScene to load the scene (highlighted in the screenshot) in the Unity editor.

Understanding the scene

The sample scene loaded in the editor

The scene has a couple of Game Objects. Let's dig through them one after the other:

Main Camera : This is the main camera of the scene.

Event System : This sample requires an EventSystem (you can create it in your own project in the menu GameObject -> UI -> Event System)

Directional Light : A light, otherwise it's kind of dark

Player : An instance of the Player prefab. You can find the prefab in the Assets/Samples/4Players ODIN/__VERSION__/Examples/Positional Audio/Player.prefab file. This object represents the player in this scene, i.e. the person sitting in front of the device. We'll have a closer look at this object later in this guide.

ODIN Manager 3D Variant : This object manages communication with ODIN servers and allows you to set up various settings. If you are curious, you can find a manual of all those settings here

Plane : The plane that acts as a simple "floor" in this scene.

Setting an ODIN access key

Select the ODIN Manager 3D Variant GameObject in the hierarchy. The inspector on the right side will reveal all ODIN settings. We now need to set an access key that will allow the client (in this case the Unity Editor) to connect to ODIN servers.

info

This is an important topic that we don't want to dive into more deeply right now, but you should take a minute to understand it later. Check out our guide on this topic here: Understanding Access Keys.

As we want to keep things simple, you can generate an access key right from Unity. Expand the Client Authorization collapsible to reveal the authorization settings:

The client authentication settings

Click on the Manage Access button. You'll see a dialog like this:

Generate an access token within Unity

Click on Generate Access Key to generate a free access key. It will allow you to connect with up to 25 users. Don't use this access key in production!

Close the Access Key Manager window and you should see the access key set in the Client Authorization inspector.

warning

Some users have reported that the generated access key is deleted by the Unity editor after navigating around. This seems to be a Unity bug. As a workaround, after pressing Generate Access Key, copy the access key from the inspector field, clear the field with DELETE and paste the key in again. This way the access key is stored by the inspector.

Running the sample

Congratulations. Let's test the sample scene. Press the Play button in the Unity Editor. You should see something like this:

The example scene in Play mode

You'll see the Player GameObject from above, and you'll eventually notice that the microphone is recording (on macOS, for example, you might need to give Unity access to your microphone).

You can drag & drop the Player cube with your mouse to position it somewhere else in the scene. That will become important later to experience 3D audio.

As always with voice chat, it's boring if you are alone. So let's get some other people into the room.

Building the project

To get more users, we'll need to build the project so we can share the binary with friends or colleagues. Open the build settings window by navigating to the menu File -> Build Settings.

Unity Build Settings

Click on the Add Open Scenes button to add the current scene to the list. That's important; otherwise you cannot build the project.

warning

Very important for Mac users: If you want to build for macOS or iOS, you need to set a microphone usage description string in the Player Settings. Otherwise, the Unity editor will fail to build the target with obscure error messages. We have compiled a guide on how to set that, and you can find it here.

Click on the Build and run button to build the current scene and run it on your machine. You'll need to provide a folder on your hard disk where the binary will be created.

After some time, you should see the application running fullscreen on your machine - Unity typically defaults to fullscreen. It's easier to test with multiple instances of the app if they run in windowed mode, so let's change that setting.

Navigate to the menu Edit -> Project Settings. The Project Settings window will appear. Choose the Player settings on the left side, expand the Resolution and Presentation collapsible and set Fullscreen Mode to Windowed.

Setting windowed mode

Testing ODIN

OK, now that we've optimized our app settings, let's build and run the app again. Choose File -> Build and Run. The app should now run in windowed mode.

Now, get back into the Unity Editor and click on Play.

Sample app running as app and in the editor

Now, finally, something interesting happens: you see another cube in the scene. If your volume is too high, or your mic is too sensitive, you might hear an echo. That is because both peers are running on the same machine with the same input - which does not make much sense, but it's fine for testing.

If you say something, you'll hear your own voice. Now start dragging the cubes around and notice how the direction of the sound changes depending on the cubes' positions relative to each other.

You can send the binary to your friend or colleague to test ODIN in a more realistic scenario.

warning

Please note: The position of the cubes is not synced over the network, as we want to keep things as simple as possible in this example. Every peer can drag & drop the cubes around and will hear the voices from the direction they have set up on their own screen! Please keep that in mind when testing: if you ask your colleague "Do you hear me from the right?", they might hear you from the front or from the left, as they might have positioned the cubes differently on their screen. This shows that ODIN is a purely client-side, deep engine integration - and thus very flexible and easy to use.

How does it work?

Congratulations, you now have a working example of ODIN integrated with 3D audio. Let's dig a bit deeper to understand how everything works.

info

ODIN is a client-server solution: ODIN clients - in this case applications built with Unity, or the Unity Editor itself - connect to ODIN servers and join rooms. Every client connected to a room can listen to all audio streams of the other clients connected to that room. Every client may send audio into the room, but does not have to; clients may be muted or may just listen to the audio streams (i.e. spectators).

These days, everyone is talking about 3D spatial audio (Apple AirPods use this technology, for example). It means that the audio solution tries to replicate how things work in the real world: humans can tell which direction a sound comes from, and they can even estimate the distance, because sound is louder when it's close and quieter when it's far away.

The opposite would be a phone call with headphones on. It does not make any difference how far away the other person on the line (i.e. in the same room, in ODIN terminology) is: the voice always comes from the same "direction", it sounds as if it's right in front of you, and the volume stays the same.
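In Unity terms, this difference comes down to a few properties on an AudioSource. Here is a minimal sketch using only standard Unity API - the class and method names are purely illustrative and not part of the sample:

Configuring an AudioSource as positional or "phone call" audio
using UnityEngine;

public class VoiceAudioSetup : MonoBehaviour
{
    // Positional voice: direction and distance to the source matter.
    void ConfigureAsPositional(AudioSource voice)
    {
        voice.spatialBlend = 1.0f;                    // fully 3D
        voice.rolloffMode = AudioRolloffMode.Linear;  // volume falls off linearly with distance
        voice.minDistance = 1f;                       // full volume within 1 unit
        voice.maxDistance = 10f;                      // silent beyond 10 units (with linear rolloff)
    }

    // "Phone call" voice: same volume and no direction, wherever the source is.
    void ConfigureAsPhoneCall(AudioSource voice)
    {
        voice.spatialBlend = 0.0f;                    // fully 2D
    }
}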

ODIN is deeply integrated into the game engine instead of being a layer above. ODIN, therefore, uses core Unity technology:

Every peer and its attached media on the ODIN servers need to be replicated within Unity at runtime, as shown in the diagram below. Every peer in ODIN is typically replicated in Unity as a GameObject (depending on the use case); this can be just a player name in a list in a chat application, or a Player object in a multiplayer game. Every media (i.e. microphone) attached to a peer within an ODIN room is replicated in Unity with a PlaybackComponent, which is typically a child of the player object and acts as the AudioSource of the microphone linked to that peer.

To build this hierarchy, we need to listen to certain events and handle them. Don't worry, it's super easy and just requires a couple of lines of code.
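Condensed to its core, the pattern looks like this. This is only a sketch of what the sample scripts below do - the PeerPrefab field and the class name are illustrative, and the real sample adds additional checks and labels:

Sketch: building the peer hierarchy from ODIN events
using UnityEngine;

using OdinNative.Unity.Audio;

namespace OdinNative.Unity.Samples
{
    public class PeerSpawnerSketch : MonoBehaviour
    {
        public GameObject PeerPrefab; // illustrative prefab representing a remote player

        void Start()
        {
            // Spawn an object whenever a remote peer starts sending audio, then join a room.
            OdinHandler.Instance.OnCreatedMediaObject.AddListener(OnMediaCreated);
            OdinHandler.Instance.JoinRoom("default");
        }

        void OnMediaCreated(string roomName, ulong peerId, int mediaId)
        {
            // The real sample also skips media created by the local peer - omitted here for brevity.
            GameObject peerObject = Instantiate(PeerPrefab, Vector3.zero, Quaternion.identity);

            // AddPlaybackComponent wires an AudioSource on peerObject to that peer's media stream.
            PlaybackComponent playback =
                OdinHandler.Instance.AddPlaybackComponent(peerObject, roomName, peerId, mediaId);
            playback.PlaybackSource.spatialBlend = 1.0f; // make the voice positional
        }
    }
}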

In the next section, we'll have a closer look at how this is done:

The ODIN Manager Object

The first object that we'll have a closer look at is the OdinManager 3D Variant object. You already know that object as you used it to set your access key.

The scripts attached to the OdinManager object

If you look closer, you'll notice there are a couple of scripts attached to this GameObject:

Odin Editor Config : The Odin Editor Config script allows you to set up ODIN within your game and handles client authorization. It is part of the ODIN SDK.

Odin Handler : This script does all the event handling and exposes various settings for the microphone and audio effects. See the manual of Odin Handler for documentation of all the settings available and how to use the API. This class is also part of the ODIN SDK.

Microphone Reader : This class is part of the ODIN SDK. It captures the audio input from the microphone connected to the player's PC and sends it to ODIN servers once the client has joined an ODIN room. See the Microphone Reader manual for more info on this component.

Simple Push To Talk : This script is part of the sample code and uses the ODIN SDK to implement joining a room and "Push to Talk" (i.e. the microphone is always muted, and you need to press a button to unmute while talking). We'll have a closer look at this script in the next paragraph.

Simple Push To Talk

Let's have a look at the SimplePushToTalk.cs script provided with the sample. It uses various APIs provided by the ODIN SDK to join a room and to implement push-to-talk logic.

Basic Push to Talk
using System;
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

using OdinNative.Unity.Audio;

namespace OdinNative.Unity.Samples
{
    public class SimplePushToTalk : MonoBehaviour
    {
        public string RoomName;
        [SerializeField]
        public KeyCode PushToTalkHotkey;
        public bool UsePushToTalk = true;
        public MicrophoneReader AudioSender;

        private void Reset()
        {
            RoomName = "default";
            PushToTalkHotkey = KeyCode.C;
            UsePushToTalk = true;
        }

        // Start is called before the first frame update
        void Start()
        {
            if (AudioSender == null)
                AudioSender = FindObjectOfType<MicrophoneReader>();

            OdinHandler.Instance.JoinRoom(RoomName);
        }

        // Update is called once per frame
        void Update()
        {
            if (AudioSender)
                AudioSender.RedirectCapturedAudio = UsePushToTalk ? Input.GetKey(PushToTalkHotkey) : true;
        }
    }
}

As said before, we need to join a room so that we can chat with other peers in that same room. To do this, the SimplePushToTalk script takes the RoomName parameter given in the Inspector and connects to that room with the Odin Handler instance.

The OdinManager 3D Variant also has a Microphone Reader script attached. This script handles microphone input and sends it to all ODIN rooms that the player has joined. But, as we only want to send audio input while a specific button is pressed, we get a reference to the MicrophoneReader instance in the Start function and set its RedirectCapturedAudio property to true if audio should be sent (i.e. if Push to Talk is activated and the selected button is pressed) or false if no audio should be sent to the server.

So, that's it. As you can see, the ODIN Manager handles a couple of settings like client authorization (the access key), captures microphone input, and sends data to the server while the user holds down the push-to-talk button.

info

Please note: Per default, ODIN has RedirectCapturedAudio set to true, which means that microphone input is always sent to ODIN servers. You need to change the setting in the inspector or use a script like the one shown above to customize that to your needs. Push to talk is just one example.
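Another example would be a simple mute toggle instead of push-to-talk. As a minimal sketch, you could replace the Update method of SimplePushToTalk with something like this - assuming you keep the AudioSender reference from the script above and that RedirectCapturedAudio can also be read back; the M key is an arbitrary choice:

Toggling the microphone instead of push-to-talk
// Update is called once per frame
void Update()
{
    // Flip the mute state whenever the (arbitrarily chosen) M key is pressed.
    if (AudioSender != null && Input.GetKeyDown(KeyCode.M))
        AudioSender.RedirectCapturedAudio = !AudioSender.RedirectCapturedAudio;
}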

The Player Object

The second object that we want to look at more closely is the Player object in the scene. Highlight it, and you'll notice a couple of scripts attached to this object:

The scripts attached to the Player object

Audio Listener : In many games, the audio listener is attached to the camera. In this case, we want the player's "ear" to be at the player's position, which is important because we want Unity to calculate audio from other sources relative to the player's position and not relative to the static camera above the scene. Please note, the AudioListener is a standard Unity component.

Drag Object : This is just a simple script that allows you to drag & drop the cube around. We'll not go into detail on how this is implemented, but feel free to have a look in your preferred IDE.

Odin 3d Trigger : This script handles some ODIN events and creates the peer cubes. We'll dive into this script in a minute.

Odin 3d Trigger

The Odin3dTrigger.cs script handles a couple of ODIN events and creates the peer cubes whenever a peer connects. It's a bit longer, so we'll go through it step by step.

In the Start function, we listen to some ODIN events. In this example, we do it in code, but feel free to connect them via the Unity Inspector instead (see the Odin Handler manual for more info on that).

Handling events
void Start()
{
    OdinHandler.Instance.OnCreatedMediaObject.AddListener(Instance_OnCreatedMediaObject);
    OdinHandler.Instance.OnDeleteMediaObject.AddListener(Instance_OnDeleteMediaObject);
    OdinHandler.Instance.OnRoomLeft.AddListener(Instance_OnRoomLeft);

    var SelfData = OdinHandler.Instance.GetUserData();
    //Set Player
    GameObject player = GameObject.FindGameObjectsWithTag("Player").FirstOrDefault();
    if (player != null)
    {
        TextMesh label = player.GetComponentInChildren<TextMesh>();
        label.text = CustomUserDataJsonFormat.FromUserData(SelfData)?.name ?? player.name;
    }
}

So, we listen to the OnCreatedMediaObject, OnDeleteMediaObject, and OnRoomLeft events:

OnCreatedMediaObject : Triggered whenever a peer joined the room and started sending audio data

OnDeleteMediaObject : Triggered whenever a peer stopped sending audio data

OnRoomLeft : Triggered when a peer left a room.

We also read the user's User Data and set the text label of the player object to the player name (if available).

Now, as you can see, we need callback methods for these events. Let's have a look at them step by step:

Handling On Media Created Events

This event is triggered whenever a peer joined a room and started to send audio data. There is a difference between a peer joining a room and media being added to that room for that peer, because peers may join a room without sending their own audio (i.e. spectators). So, to simplify things, we just handle the case when a player who joined the room starts sending audio (which is the OnCreatedMediaObject event):

The OnMediaAdded implementation
GameObject CreateObject()
{
    return Instantiate(prefab, new Vector3(0, 0.5f, 6), Quaternion.identity);
}

private void Instance_OnCreatedMediaObject(string roomName, ulong peerId, int mediaId)
{
    Room room = OdinHandler.Instance.Rooms[roomName];
    if (room == null || room.Self == null || room.Self.Id == peerId) return;

    var peerContainer = CreateObject();

    //Add PlaybackComponent to new dummy PeerCube
    PlaybackComponent playback = OdinHandler.Instance.AddPlaybackComponent(peerContainer, room.Config.Name, peerId, mediaId);

    //Some AudioSource test settings
    playback.PlaybackSource.spatialBlend = 1.0f;
    playback.PlaybackSource.rolloffMode = AudioRolloffMode.Linear;
    playback.PlaybackSource.minDistance = 1;
    playback.PlaybackSource.maxDistance = 10;

    //set dummy PeerCube label
    var data = CustomUserDataJsonFormat.FromUserData(room.RemotePeers[peerId]?.UserData);
    playback.gameObject.GetComponentInChildren<TextMesh>().text = data == null ?
        $"Peer {peerId} (Media {mediaId})" :
        $"{data.name} (Peer {peerId} Media {mediaId})";

    PeersObjects.Add(playback.gameObject);
}

This callback does four things:

It ignores media created by the local peer (room.Self), so the player does not get a cube and playback for their own microphone.

It instantiates a new "peer cube" via CreateObject.

It attaches a PlaybackComponent to that cube with AddPlaybackComponent and configures its AudioSource for positional audio (spatialBlend, rolloff and distance settings).

It sets the cube's text label from the peer's User Data (or a generic peer/media label) and stores the cube in the PeersObjects list.
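The Start function also registered Instance_OnDeleteMediaObject and Instance_OnRoomLeft. Their bodies are not shown in this guide, but conceptually they just undo the work above by destroying the peer cubes again. A minimal sketch of such cleanup, assuming you track the cubes in a dictionary keyed by peer id - the dictionary and helper method are illustrative and not the sample's actual code:

Sketch: removing peer cubes again
// Illustrative bookkeeping inside Odin3dTrigger (requires using System.Collections.Generic):
// map each remote peer id to the cube that was created for it.
private readonly Dictionary<ulong, GameObject> PeerCubesById = new Dictionary<ulong, GameObject>();

// Call this from the OnDeleteMediaObject / OnRoomLeft handlers for the affected peer.
private void RemovePeerCube(ulong peerId)
{
    if (PeerCubesById.TryGetValue(peerId, out GameObject cube))
    {
        Destroy(cube); // also removes the attached PlaybackComponent and its AudioSource
        PeerCubesById.Remove(peerId);
    }
}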

Some experiments

Let's do some experiments with the sample that we have right now to showcase the power of ODIN and to give you an idea of what you can do with it in your game.

In the Odin3dTrigger.cs script, in the Instance_OnCreatedMediaObject function, adjust some of the settings applied to PlaybackComponent.PlaybackSource. As this is a standard Unity AudioSource, you'll find it documented in the Unity documentation. Here are a couple of things you can try:

Switching to "god's voice"

Change the line where spatialBlend is set into this:

Disabling 3D audio
playback.PlaybackSource.spatialBlend = 0.0f; //Disable 3D audio

This just disables 3D audio, and every ODIN AudioSource in the scene will always have the same volume. That is basically the same experience as if all players were using an external voice chat solution like TeamSpeak or Discord: the voice is completely "decoupled" from the gameplay. It may be useful if you want to replicate phone calls or CB radio communication between team members.

Please note that spatialBlend is a float value. You can also set it to 0.5f to have something in between.

Adjusting voice attenuation

Right now, peers can only be heard up to 10 units away. Reduce that to 2, and you'll only hear peers that are very close to you. You can use these properties to implement "sound damping" depending on the current location of the peer relative to the player.

Setting voice attenuation
playback.PlaybackSource.minDistance = 1;
playback.PlaybackSource.maxDistance = 2;

Summary

Step 1 - ODIN SDK : Install the ODIN SDK into your project.

Step 2 - Add ODIN Manager : Drag & drop the ODIN Manager prefab into your scene.

Step 3 - Access Key : Create and set an access key in the Unity inspector.

Step 4 - Handle ODIN events : Handle some ODIN events as shown above or as described in the Event Handling guide, and use AddPlaybackComponent to create the other players' audio sources and attach them to the player objects in your scene.

That's it. There is not much more to do to integrate ODIN into any project. Of course, there are many APIs to customize the experience to your own liking. As you can see, ODIN integration is pure Unity: you just work with Unity as you always do - no strange APIs and no sending additional data to other servers.

Of course, if you are working on a multiplayer game, things are a little bit more complex - but only a little bit. We have created a guide where you build a complete multiplayer game with Mirror Networking and add ODIN to it. After a couple of minutes, you'll have the same example as this one, but with player positions synced between all peers.

Next steps

Now that you have a working example, find out more about the components ODIN has to offer.