
Vue 3

This guide walks through integrating the ODIN Voice Web SDK into a Vue 3 application. It uses the Composition API with <script setup> and TypeScript. The code below is based on the tested example in the ODIN Web SDK repository.

info

This example uses @4players/odin-tokens to generate tokens client-side for simplicity. In production, tokens should be generated on your backend server to protect your access key. See Getting Started for details.
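
For production, fetching the token from your own backend might look like the sketch below. The /api/odin-token endpoint and its { token } response shape are assumptions; adapt them to whatever your server exposes.

// Hypothetical helper: assumes your backend exposes POST /api/odin-token and
// uses @4players/odin-tokens server-side with your secret access key.
async function fetchToken(roomId: string, userId: string): Promise<string> {
  const response = await fetch('/api/odin-token', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ roomId, userId }),
  });
  if (!response.ok) {
    throw new Error(`Token request failed: ${response.status}`);
  }
  const { token } = await response.json(); // assumed response shape: { token: string }
  return token;
}

In joinRoom, the TokenGenerator lines could then be replaced with const token = await fetchToken('default', 'vue-user');.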

Dependencies

npm install @4players/odin @4players/odin-tokens

Full Example

App.vue
<template>
  <div>
    <p v-if="error">{{ error }}</p>

    <template v-if="!isConnected">
      <button @click="joinRoom" :disabled="isConnecting">
        {{ isConnecting ? 'Connecting...' : 'Join Voice Chat' }}
      </button>
    </template>

    <template v-else>
      <button @click="toggleMute">{{ isMuted ? 'Unmute' : 'Mute' }}</button>
      <button @click="leaveRoom">Leave</button>

      <h3>Peers ({{ peers.length }})</h3>
      <ul>
        <li v-for="peer in peers" :key="peer.id">
          {{ peer.isLocal ? 'You' : peer.userId }}
          <span v-if="speakingPeers.has(peer.id)"> (speaking)</span>
        </li>
      </ul>
    </template>
  </div>
</template>

<script setup lang="ts">
import { ref, shallowRef, onUnmounted } from 'vue';
import { Room, DeviceManager, setOutputDevice, type AudioInput as AudioInputType } from '@4players/odin';
import { TokenGenerator } from '@4players/odin-tokens';

// Type for tracking peer information in component state
interface PeerInfo {
  id: number;
  userId: string;
  isLocal: boolean;
}

// ---------------------------------------------------------------------------
// Reactive state
// ---------------------------------------------------------------------------

const isConnected = ref(false);
const isConnecting = ref(false);
const peers = ref<PeerInfo[]>([]);
// Track which peers are currently speaking via their peer ID
const speakingPeers = ref<Set<number>>(new Set());
const isMuted = ref(false);
const error = ref<string | null>(null);

// Use shallowRef for SDK objects — Vue should not deeply observe them.
// These are complex class instances that don't need reactive proxy tracking.
const room = shallowRef<Room | null>(null);
const audioInput = shallowRef<AudioInputType | null>(null);

// ---------------------------------------------------------------------------
// Join a voice room
// ---------------------------------------------------------------------------

const joinRoom = async () => {
  error.value = null;
  isConnecting.value = true;

  try {
    // Generate a token (in production, fetch this from your backend)
    const generator = new TokenGenerator('YOUR_ACCESS_KEY');
    const token = await generator.createToken('default', 'vue-user');

    // Create a new Room instance for this session
    room.value = new Room();

    // ----- Register event handlers BEFORE joining -----

    // Called once the room has been successfully joined
    room.value.onJoined = () => {
      isConnected.value = true;
      isConnecting.value = false;
    };

    // Called when we leave or get disconnected from the room
    room.value.onLeft = () => {
      isConnected.value = false;
      isConnecting.value = false;
      peers.value = [];
    };

    // Called whenever a peer joins — including ourselves and peers already present
    room.value.onPeerJoined = (payload) => {
      // Compare with room.ownPeerId to identify the local peer
      const isLocal = payload.peer.id === room.value!.ownPeerId;

      peers.value = [
        ...peers.value.filter((p) => p.id !== payload.peer.id),
        { id: payload.peer.id, userId: payload.peer.userId, isLocal },
      ];

      // Attach a per-peer audio activity handler to track who is speaking
      payload.peer.onAudioActivity = ({ media }) => {
        const next = new Set(speakingPeers.value);
        if (media.isActive) {
          next.add(payload.peer.id);
        } else {
          next.delete(payload.peer.id);
        }
        // Replace the Set to trigger Vue reactivity
        speakingPeers.value = next;
      };
    };

    // Called when a peer leaves the room
    room.value.onPeerLeft = (payload) => {
      peers.value = peers.value.filter((p) => p.id !== payload.peer.id);
      const next = new Set(speakingPeers.value);
      next.delete(payload.peer.id);
      speakingPeers.value = next;
    };

    // Set the output device — required to hear other peers
    await setOutputDevice({});

    // Join the room with the token
    await room.value.join(token, { gateway: 'https://gateway.odin.4players.io' });

    // Create a microphone input. Defaults: system mic with echo cancellation,
    // noise suppression, and automatic gain control enabled.
    audioInput.value = await DeviceManager.createAudioInput();

    // Attach the microphone to the room — audio transmission starts here
    await room.value.addAudioInput(audioInput.value);
  } catch (e) {
    error.value = e instanceof Error ? e.message : 'Failed to join room';
    isConnecting.value = false;
  }
};

// ---------------------------------------------------------------------------
// Leave the room
// ---------------------------------------------------------------------------

const leaveRoom = () => {
  if (audioInput.value) {
    // Remove the audio input from the room (stops the encoder)
    room.value?.removeAudioInput(audioInput.value);
    // Close the AudioInput (releases the microphone)
    audioInput.value.close();
    audioInput.value = null;
  }
  if (room.value) {
    // Disconnect from the room
    room.value.leave();
    room.value = null;
  }
  // Reset all state
  isConnected.value = false;
  peers.value = [];
  speakingPeers.value = new Set();
  isMuted.value = false;
};

// ---------------------------------------------------------------------------
// Mute / Unmute toggle
// ---------------------------------------------------------------------------

const toggleMute = async () => {
  if (!audioInput.value) return;

  if (isMuted.value) {
    // Unmute: restore volume first, then re-add to room to resume encoding
    await audioInput.value.setVolume(1);
    await room.value?.addAudioInput(audioInput.value);
  } else {
    // Mute: remove from room to stop the encoder (saves CPU),
    // then use 'muted' to stop the MediaStream so the browser's
    // recording indicator disappears
    room.value?.removeAudioInput(audioInput.value);
    await audioInput.value.setVolume('muted');
  }
  isMuted.value = !isMuted.value;
};

// ---------------------------------------------------------------------------
// Cleanup on unmount — close audio and leave room
// ---------------------------------------------------------------------------

onUnmounted(() => {
  leaveRoom();
});
</script>

Step-by-Step Breakdown

Plugin

The audio plugin is registered automatically by the SDK when needed; no manual initialization is required. See Customize the Plugin if you need to change its configuration.

warning

Modern browsers require a user gesture (click/tap) before an AudioContext can be started. Ensure joinRoom is called from a button click handler (e.g. @click="joinRoom").

Reactive State with ref and shallowRef

Vue's ref() is used for simple reactive values (booleans, arrays, sets). For SDK objects (Room, AudioInput), shallowRef() is used instead — this prevents Vue from deeply observing complex class instances that have internal state the proxy system should not intercept.
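
A brief illustration of the difference, reusing the Room type from the SDK:

import { ref, shallowRef } from 'vue';
import { Room } from '@4players/odin';

// ref(): deep reactivity; mutating the wrapped value (e.g. push) triggers updates
const peers = ref<string[]>([]);
peers.value.push('new-peer'); // tracked, the template re-renders

// shallowRef(): only reassignment of .value is tracked, which is all the
// component needs for SDK class instances; Vue never proxies their internals
const room = shallowRef<Room | null>(null);
room.value = new Room(); // tracked (reassignment of .value)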

Event Handlers

All event handlers must be registered before calling room.join(). The example registers four handlers:

Handler              Purpose
room.onJoined        Update the isConnected ref when the room has been successfully joined
room.onLeft          Reset refs when disconnected
room.onPeerJoined    Track peers; attach a per-peer onAudioActivity handler for speaking detection
room.onPeerLeft      Remove the peer from the refs

The per-peer onAudioActivity handler (see PeerEvents) receives { media }, where media.isActive indicates whether the peer is currently speaking. To trigger Vue reactivity when updating the Set, the example assigns a new Set instance rather than mutating the existing one.

Joining and Audio

Two critical calls must happen in order, as shown in the condensed sketch after this list:

  1. setOutputDevice({}) — Must be called before room.join() to hear other peers
  2. room.addAudioInput(audioInput) — Attaches the microphone and starts audio transmission
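
Condensed from the full example above (reusing its room, token, and audioInput variables):

// 1. Route remote audio to the output device before joining
await setOutputDevice({});

// 2. Connect to the room
await room.value.join(token, { gateway: 'https://gateway.odin.4players.io' });

// 3. Capture the microphone and start transmitting
audioInput.value = await DeviceManager.createAudioInput();
await room.value.addAudioInput(audioInput.value);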

Muting

The example uses the full mute approach for maximum resource savings. See Muting & Volume Control for all available approaches.
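
If you prefer a lighter toggle that leaves the AudioInput attached to the room (so the encoder keeps running between toggles), a volume-only variant reusing the example's audioInput and isMuted refs could look like the sketch below. This variant is an assumption built on the setVolume calls shown above, not a documented recommendation; check Muting & Volume Control for the supported options.

// Sketch: volume-only mute. The AudioInput stays attached to the room, so
// unmuting does not need to re-add it; per the example above, setVolume('muted')
// stops the captured MediaStream and setVolume(1) restores normal capture.
const toggleMuteVolumeOnly = async () => {
  if (!audioInput.value) return;
  await audioInput.value.setVolume(isMuted.value ? 1 : 'muted');
  isMuted.value = !isMuted.value;
};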

Cleanup

onUnmounted calls leaveRoom() to ensure the microphone is released and the room connection is closed when the component is destroyed.