Pre-warm the audio connection for iOS?

Can someone point me to example code with the livekit js client sdk where the audio connection is pre-warmed?

Right now, setting audio={true} on the component always results in audible lag, and it almost always coincides with the first few words of the agent’s intro speech (on iOS this happens like 90% of the time), so the first few words are always cut off.

This one is easy: with the LiveKit JS SDK, the key is not audio={true} alone. You need to call room.startAudio() from a real user gesture before the agent starts speaking.

Specifically, browsers, especially iOS Safari, may block or delay autoplay; Room.startAudio() must be called in a click/tap handler, and Room.canPlaybackAudio / RoomEvent.AudioPlaybackStatusChanged will tell you whether audio is actually unlocked.

Use a “Join / Start call” button like this:

import { Room, RoomEvent } from "livekit-client";
import { LiveKitRoom, RoomAudioRenderer } from "@livekit/components-react";
import { useMemo, useState } from "react";

export function VoiceRoom({ serverUrl, token }) {
  const room = useMemo(() => new Room(), []);
  const [connected, setConnected] = useState(false);

  async function joinWithPrewarmedAudio() {
    // Must happen directly inside the click/tap handler on iOS/Safari
    await room.startAudio();

    await room.connect(serverUrl, token, {
      autoSubscribe: true,
    });

    setConnected(true);
  }

  return (
    <>
      {!connected && (
        <button onClick={joinWithPrewarmedAudio}>
          Join voice chat
        </button>
      )}

      {connected && (
        <LiveKitRoom room={room} audio={true} video={false}>
          <RoomAudioRenderer />
        </LiveKitRoom>
      )}
    </>
  );
}

I would personally add a fallback button for cases where the browser still blocks playback:

room.on(RoomEvent.AudioPlaybackStatusChanged, () => {
  if (!room.canPlaybackAudio) {
    console.warn("Audio playback blocked. Show an unlock-audio button.");
  }
});
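That warning can be wired to an actual unlock button. A minimal plain-DOM sketch (the `wireUnlockButton` helper and its event-name parameter are my own scaffolding; `canPlaybackAudio`, `startAudio()`, and `on()` are the real livekit-client Room API):

```javascript
// Show `button` whenever playback is blocked; hide it once audio is
// unlocked. Pass RoomEvent.AudioPlaybackStatusChanged as `playbackEvent`.
function wireUnlockButton(room, button, playbackEvent) {
  const sync = () => {
    // canPlaybackAudio is true once the browser allows playback
    button.hidden = room.canPlaybackAudio;
  };
  room.on(playbackEvent, sync);
  button.addEventListener("click", async () => {
    // startAudio() runs inside a genuine user gesture here, which is
    // exactly what iOS Safari requires to unlock the audio context
    await room.startAudio();
  });
  sync();
}
```

Call it once after creating the room, e.g. `wireUnlockButton(room, unlockButtonEl, RoomEvent.AudioPlaybackStatusChanged)`.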

LiveKit’s React components also ship a helper for exactly this:

import { StartAudio } from "@livekit/components-react";

<LiveKitRoom room={room}>
  <RoomAudioRenderer />
  <StartAudio label="Tap to enable audio" />
</LiveKitRoom>

StartAudio only appears when browser autoplay policy blocks audio playback.

For the “agent intro gets clipped” issue, try this sequence:

  • User taps Start / Join
  • Call room.startAudio() immediately
  • Connect to room
  • Wait until the client is connected and audio playback is allowed
  • Then tell the agent to begin its intro speech
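The “wait until connected and playback is allowed” step can be sketched as a small helper that resolves only once both conditions hold (the `waitForAudioReady` name is mine; pass livekit-client’s RoomEvent.Connected and RoomEvent.AudioPlaybackStatusChanged as the event arguments, and `state`, `canPlaybackAudio`, and `on`/`off` are the members the real Room exposes):

```javascript
// Resolves once the room is connected AND the browser allows audio
// playback; only then should the agent be told to start its intro.
function waitForAudioReady(room, connectedEvent, playbackEvent) {
  return new Promise((resolve) => {
    const check = () => {
      // room.state is "connected" once signaling is up;
      // canPlaybackAudio flips to true once autoplay is unlocked
      if (room.state === "connected" && room.canPlaybackAudio) {
        room.off(connectedEvent, check);
        room.off(playbackEvent, check);
        resolve();
      }
    };
    room.on(connectedEvent, check);
    room.on(playbackEvent, check);
    check(); // both conditions may already hold
  });
}
```

After `await waitForAudioReady(room, RoomEvent.Connected, RoomEvent.AudioPlaybackStatusChanged)` you can signal the agent to begin speaking, for example via `room.localParticipant.publishData(...)` on a topic your agent listens to.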

Do not let the agent speak immediately on room creation, especially on iOS. Safari often needs the audio context unlocked first.

Hope this helps.

This makes me think there should really be some official example code for this. In my experience, iOS is aggressive about AVAudioSession control.

Also, just curious: have you tested this code snippet yourself, and where did you get it? I’ve tried like 3 different approaches of my own that didn’t work. My patch fix is to leave audio={false} (the default) and only acquire the audio hardware on the user’s first unmute. The problem is that I’ve timed this, and on a modern iPhone it takes ~4 seconds for the app to fully connect to the mic: 4 seconds from when the user presses unmute to when they can start talking.
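One way to hide that ~4-second acquisition is to open the mic during the join gesture and publish it muted, so a later unmute only flips the mute flag instead of opening the hardware. A sketch, untested on iOS (the `joinWithWarmMic` helper is hypothetical; `createTrack` is meant to be livekit-client’s `createLocalAudioTrack`, passed in only so the sketch stays self-contained, and `publishTrack`/`mute`/`unmute` are real livekit-client APIs):

```javascript
// Hypothetical helper: acquire the mic while still inside the join
// gesture and publish it muted. The slow hardware/AVAudioSession setup
// then overlaps with connecting, and a later micTrack.unmute() only
// flips the mute flag, which is near-instant.
async function joinWithWarmMic(room, serverUrl, token, createTrack) {
  const micTrack = await createTrack(); // opens the hardware up front
  await room.connect(serverUrl, token);
  await room.localParticipant.publishTrack(micTrack);
  await micTrack.mute(); // start muted; unmute later is cheap
  return micTrack;
}
```

Whether iOS keeps the AVAudioSession warm for a muted published track is exactly the kind of thing that deserves official example code.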