React SDK for real-time AI avatar interactions with GWM-1.
- React 18+
- A Runway API secret (available from the Runway Developer Portal)
- A server-side endpoint to create sessions (API secrets must not be exposed to the client)
```bash
npm install @runwayml/avatars-react
```

Add an avatar call to your app with just a few lines:

```tsx
import { AvatarCall } from '@runwayml/avatars-react';

function App() {
  return (
    <AvatarCall
      avatarId="music-superstar"
      connectUrl="/api/avatar/connect"
    />
  );
}
```

That's it! The component handles session creation and the WebRTC connection, and renders a default UI with the avatar video and controls.
You can use preset avatars like `music-superstar`, `cat-character`, `fashion-designer`, `cooking-teacher`, and more. See the Runway Developer Portal for the full list and for instructions on creating custom avatars.
Import the optional stylesheet for a polished look out of the box:

```tsx
import '@runwayml/avatars-react/styles.css';
```

The styles use CSS custom properties for easy customization:

```css
[data-avatar-call] {
  --avatar-bg-connecting: #8b5cf6; /* Video background color */
  --avatar-radius: 16px; /* Container border radius */
  --avatar-control-size: 40px; /* Control button size */
  --avatar-end-call-bg: #ff552f; /* End call button color */
  --avatar-screen-share-active-bg: #fff; /* Active share button background */
}
```

See `examples/` for complete working examples:
- `nextjs-simple`: Minimal single-avatar demo (great starting point)
- `nextjs`: Next.js App Router with preset grid and custom avatars
- `nextjs-client-events`: Client event tools (trivia game)
- `nextjs-rpc`: Backend RPC + client events (trivia with server-side questions)
- `nextjs-rpc-weather`: Backend RPC only (weather assistant)
- `nextjs-server-actions`: Next.js with Server Actions
- `react-router`: React Router v7 framework mode
- `express`: Express + Vite
Scaffold an example with one command:
```bash
npx degit runwayml/avatars-sdk-react/examples/nextjs my-avatar-app
cd my-avatar-app
npm install
```

When a call starts, the flow is:

- Client calls your server endpoint with the `avatarId`
- Server uses your Runway API secret to create a realtime session via `@runwayml/sdk`
- Server polls until the session is ready, then returns `sessionId` and `sessionKey` to the client
- Client establishes a WebRTC connection for real-time video/audio

This flow keeps your API secret secure on the server while enabling low-latency communication.
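The client half of this handshake can be sketched as plain TypeScript. This is an illustration, not SDK code: `fetchCredentials` and `FetchLike` are hypothetical names, and `AvatarCall` performs the equivalent internally when you pass `connectUrl`. The fetch implementation is injectable so the sketch stays testable outside a browser:

```typescript
// Shape the connect endpoint returns once the session is ready.
interface SessionCredentials {
  sessionId: string;
  sessionKey: string;
}

// Minimal structural fetch type so the sketch has no browser dependency.
type FetchLike = (
  url: string,
  init: { method: string; headers: Record<string, string>; body: string },
) => Promise<{ ok: boolean; status: number; json(): Promise<unknown> }>;

// Hypothetical helper mirroring what AvatarCall does with connectUrl:
// POST the avatarId, expect { sessionId, sessionKey } back.
async function fetchCredentials(
  connectUrl: string,
  avatarId: string,
  fetchImpl: FetchLike,
): Promise<SessionCredentials> {
  const res = await fetchImpl(connectUrl, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ avatarId }),
  });
  if (!res.ok) throw new Error(`Connect failed with status ${res.status}`);
  return (await res.json()) as SessionCredentials;
}
```

With the credentials in hand, the component can open the WebRTC connection; the server-side half of the exchange is shown in the next section.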
Your server endpoint receives the `avatarId` and returns session credentials. Use `@runwayml/sdk` to create and poll the session:

```ts
// /api/avatar/connect (Next.js App Router example)
import Runway from '@runwayml/sdk';

const client = new Runway(); // Uses RUNWAYML_API_SECRET env var

export async function POST(req: Request) {
  const { avatarId } = await req.json();

  const { id: sessionId } = await client.realtimeSessions.create({
    model: 'gwm1_avatars',
    avatar: { type: 'runway-preset', presetId: avatarId },
  });

  // Poll until the session is ready
  const deadline = Date.now() + 30_000;
  while (Date.now() < deadline) {
    const session = await client.realtimeSessions.retrieve(sessionId);
    if (session.status === 'READY') {
      return Response.json({ sessionId, sessionKey: session.sessionKey });
    }
    await new Promise((resolve) => setTimeout(resolve, 1_000));
  }

  return Response.json({ error: 'Session creation timed out' }, { status: 504 });
}
```

For more control over the connection flow:
```tsx
<AvatarCall
  avatarId="music-superstar"
  connect={async (avatarId) => {
    const res = await fetch('/api/avatar/connect', {
      method: 'POST',
      headers: {
        'Authorization': `Bearer ${token}`, // your app's own auth token
        'Content-Type': 'application/json',
      },
      body: JSON.stringify({ avatarId }),
    });
    return res.json();
  }}
/>
```

Use the built-in components for custom layouts:
```tsx
import { AvatarCall, AvatarVideo, ControlBar, UserVideo } from '@runwayml/avatars-react';

<AvatarCall avatarId="music-superstar" connectUrl="/api/avatar/connect">
  <div className="call-layout">
    <AvatarVideo className="avatar" />
    <UserVideo className="self-view" />
    <ControlBar className="controls" />
  </div>
</AvatarCall>
```

All display components support render props for complete control. `AvatarVideo` receives a discriminated union with `status`:
```tsx
<AvatarVideo>
  {(avatar) => {
    switch (avatar.status) {
      case 'connecting': return <Spinner />;
      case 'waiting': return <Placeholder />;
      case 'ready': return <VideoTrack trackRef={avatar.videoTrackRef} />;
    }
  }}
</AvatarVideo>
```

Style components with the namespaced `data-avatar-*` attributes:

```tsx
<AvatarCall avatarId="music-superstar" connectUrl="/api/avatar/connect" className="my-avatar" />
```

```css
/* Style avatar video by connection status */
[data-avatar-video][data-avatar-status="connecting"] {
  opacity: 0.5;
}

[data-avatar-video][data-avatar-status="ready"] {
  opacity: 1;
}

/* Style control buttons */
[data-avatar-control][data-avatar-enabled="false"] {
  opacity: 0.5;
}
```

```tsx
<AvatarCall
  avatarId="music-superstar"
  connectUrl="/api/avatar/connect"
  onEnd={() => console.log('Call ended')}
  onError={(error) => console.error('Error:', error)}
/>
```

The avatar can see your webcam feed or screen share, enabling visual interactions: show a plant for identification, hold up a Pokémon card for trivia, get real-time coaching while you play a game, walk through a presentation, or ask for feedback on a design you're working on.
**Compatibility:** Webcam and screen sharing are supported by all preset avatars and custom avatars that use a preset voice. Custom avatars with a custom voice do not support webcam or screen sharing.
The webcam is enabled by default. The `video` prop controls whether the camera activates on connect, and the `<UserVideo>` component renders the local camera feed:

```tsx
<AvatarCall avatarId="music-superstar" connectUrl="/api/avatar/connect">
  <AvatarVideo />
  <UserVideo />
  <ControlBar />
</AvatarCall>
```

To disable the webcam, set `video={false}`:

```tsx
<AvatarCall avatarId="music-superstar" connectUrl="/api/avatar/connect" video={false} />
```

Enable the screen share button by passing `showScreenShare` to `ControlBar`, and use `<ScreenShareVideo>` to display the shared content:

```tsx
<AvatarCall avatarId="music-superstar" connectUrl="/api/avatar/connect">
  <AvatarVideo />
  <ScreenShareVideo />
  <ControlBar showScreenShare />
</AvatarCall>
```

While sharing is active, the default `ControlBar` UI shows a sharing banner with a quick Stop action.
You can also start screen sharing automatically by passing a pre-captured stream via `initialScreenStream`. This is useful when you want to prompt the user for screen share permission before the session connects:

```tsx
function ScreenShareCall() {
  const [stream, setStream] = useState<MediaStream | null>(null);

  async function startWithScreenShare() {
    const mediaStream = await navigator.mediaDevices.getDisplayMedia({ video: true });
    setStream(mediaStream);
  }

  if (!stream) {
    return <button onClick={startWithScreenShare}>Share Screen & Start Call</button>;
  }

  return (
    <AvatarCall
      avatarId="music-superstar"
      connectUrl="/api/avatar/connect"
      initialScreenStream={stream}
    >
      <AvatarVideo />
      <ScreenShareVideo />
      <ControlBar showScreenShare />
    </AvatarCall>
  );
}
```

Use the `useLocalMedia` hook for full programmatic control over camera and screen sharing:
```tsx
function MediaControls() {
  const {
    isCameraEnabled,
    isScreenShareEnabled,
    toggleCamera,
    toggleScreenShare,
  } = useLocalMedia();

  return (
    <div>
      <button onClick={toggleCamera}>{isCameraEnabled ? 'Hide Camera' : 'Show Camera'}</button>
      <button onClick={toggleScreenShare}>{isScreenShareEnabled ? 'Stop Sharing' : 'Share Screen'}</button>
    </div>
  );
}
```

Use hooks for custom components within an `AvatarCall` or `AvatarSession`. Also available: `useClientEvent` and `useClientEvents` for client events, and `useTranscription` for real-time transcription.

Access session state and controls:
```tsx
function MyComponent() {
  const { state, sessionId, error, end } = useAvatarSession();

  if (state === 'connecting') return <Loading />;
  if (state === 'error') return <Error message={error.message} />;

  return <button onClick={end}>End Call</button>;
}
```

Access the remote avatar's video:

```tsx
function CustomAvatar() {
  const { videoTrackRef, hasVideo } = useAvatar();

  return (
    <div>
      {hasVideo && <VideoTrack trackRef={videoTrackRef} />}
    </div>
  );
}
```

Control local camera, microphone, and screen sharing:
```tsx
function MediaControls() {
  const {
    isMicEnabled,
    isCameraEnabled,
    isScreenShareEnabled,
    toggleMic,
    toggleCamera,
    toggleScreenShare,
  } = useLocalMedia();

  return (
    <div>
      <button onClick={toggleMic}>{isMicEnabled ? 'Mute' : 'Unmute'}</button>
      <button onClick={toggleCamera}>{isCameraEnabled ? 'Hide' : 'Show'}</button>
      <button onClick={toggleScreenShare}>{isScreenShareEnabled ? 'Stop Sharing' : 'Share Screen'}</button>
    </div>
  );
}
```

**Compatibility:** Client events (tool calling) are supported on avatars that use a preset voice. Custom voice avatars do not currently support client events.
Avatars can trigger UI events via tool calls sent over the data channel. Define tools, pass them when creating a session, and subscribe on the client:

```ts
// lib/tools.ts — shared between server and client
import { clientTool, type ClientEventsFrom } from '@runwayml/avatars-react/api';

export const showCaption = clientTool('show_caption', {
  description: 'Display a caption overlay',
  args: {} as { text: string },
});

export const tools = [showCaption];
export type MyEvent = ClientEventsFrom<typeof tools>;
```

```ts
// Server — pass tools when creating the session
const { id } = await client.realtimeSessions.create({
  model: 'gwm1_avatars',
  avatar: { type: 'custom', avatarId: '...' },
  tools,
});
```

```tsx
// Client — subscribe to events inside AvatarCall
import { useClientEvent } from '@runwayml/avatars-react';
import type { MyEvent } from '@/lib/tools';

function CaptionOverlay() {
  const caption = useClientEvent<MyEvent, 'show_caption'>('show_caption');
  return caption ? <p>{caption.text}</p> : null;
}
```

See the nextjs-client-events example for a full working demo.
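The typing pattern behind `ClientEventsFrom` is a discriminated union over tool names. The sketch below illustrates how such a union narrows in plain TypeScript; the event shape (`{ name, args }`) and the `update_score` tool are illustrative assumptions, not the SDK's actual wire format:

```typescript
// Assumed illustrative shape: each client event is tagged with its tool name.
type CaptionEvent = { name: 'show_caption'; args: { text: string } };
type ScoreEvent = { name: 'update_score'; args: { points: number } };
type ToolEvent = CaptionEvent | ScoreEvent;

// Switching on the `name` discriminant narrows `args` automatically,
// so each branch gets the correct argument type with no casts.
function describeEvent(event: ToolEvent): string {
  switch (event.name) {
    case 'show_caption':
      return `caption: ${event.args.text}`;
    case 'update_score':
      return `score +${event.args.points}`;
  }
}
```

This is why `useClientEvent<MyEvent, 'show_caption'>` can return a value whose fields match the `args` you declared for that specific tool.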
Let the avatar click buttons, scroll to sections, and highlight elements on your page. The SDK provides pre-built tool definitions and a component that handles the DOM interactions automatically.

```ts
import { pageActionTools } from '@runwayml/avatars-react/api';

const { id } = await client.realtimeSessions.create({
  model: 'gwm1_avatars',
  avatar: { type: 'runway-preset', presetId: 'music-superstar' },
  tools: pageActionTools,
});
```

Combine with your own tools by spreading both arrays:

```ts
import { pageActionTools } from '@runwayml/avatars-react/api';
import { clientEventTools } from '@/lib/tools';

tools: [...pageActionTools, ...clientEventTools],
```

```tsx
import { AvatarCall, AvatarVideo, ControlBar, PageActions } from '@runwayml/avatars-react';

function App() {
  return (
    <AvatarCall avatarId="music-superstar" connectUrl="/api/avatar/connect">
      <AvatarVideo />
      <ControlBar />
      <PageActions />
    </AvatarCall>
  );
}
```

The avatar can now reference elements by `id` or by a `data-avatar-target` attribute:
```html
<button id="signup">Sign Up</button>
<section data-avatar-target="pricing">...</section>
```

| Action | What it does |
|---|---|
| `click` | Calls `.click()` on the target element |
| `scroll_to` | Scrolls the target into view with smooth scrolling |
| `highlight` | Pulses an outline around the target, then removes it |

Import the default stylesheet for a ready-made highlight animation:
```tsx
import '@runwayml/avatars-react/styles.css';
```

Elements are highlighted via the `data-avatar-highlighted="true"` attribute. Override the defaults with CSS:

```css
[data-avatar-highlighted="true"] {
  outline-color: hotpink;
}
```

The animation respects `prefers-reduced-motion`.
```tsx
<PageActions
  highlightDuration={3000}
  scrollBehavior="instant"
  scrollBlock="center"
  resolveElement={(target) => document.querySelector(`[data-custom="${target}"]`)}
/>
```

| Prop | Default | Description |
|---|---|---|
| `highlightDuration` | `2000` | Milliseconds before the highlight clears |
| `scrollBehavior` | `'smooth'` | `'smooth'` or `'instant'` |
| `scrollBlock` | `'start'` | `'start'`, `'center'`, `'end'`, or `'nearest'` |
| `resolveElement` | by `id`, then `data-avatar-target` | Custom function to find the target DOM element |
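The default resolution order (match by `id`, then by `data-avatar-target`) can be sketched as a plain function. This is an illustration of the lookup order, not the SDK's actual implementation; the `DocLike` interface stands in for `document` so the logic is testable outside a browser:

```typescript
// Minimal stand-in for the two DOM lookups the resolver needs.
interface DocLike {
  getElementById(id: string): object | null;
  querySelector(selector: string): object | null;
}

// Hypothetical default resolver: try the element id first,
// then fall back to the data-avatar-target attribute.
function resolveTarget(doc: DocLike, target: string): object | null {
  return (
    doc.getElementById(target) ??
    doc.querySelector(`[data-avatar-target="${target}"]`)
  );
}
```

Passing your own `resolveElement` replaces this chain entirely, as in the `data-custom` example above.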
For advanced use cases, the underlying `usePageActions` hook accepts the same options:
```tsx
import { usePageActions } from '@runwayml/avatars-react';

function MyCustomPageActions() {
  usePageActions({ highlightDuration: 5000 });
  return null;
}
```

For full control over session management, use `AvatarSession` directly with pre-fetched credentials:
```tsx
import { AvatarSession, AvatarVideo, ControlBar } from '@runwayml/avatars-react';

function AdvancedUsage({ credentials }) {
  return (
    <AvatarSession
      credentials={credentials}
      audio={true}
      video={true}
      onEnd={() => console.log('Ended')}
      onError={(err) => console.error(err)}
    >
      <AvatarVideo />
      <ControlBar />
    </AvatarSession>
  );
}
```

| Component | Description |
|---|---|
| `AvatarCall` | High-level component that handles session creation |
| `AvatarSession` | Low-level wrapper that requires credentials |
| `AvatarVideo` | Renders the remote avatar video |
| `UserVideo` | Renders the local user's camera |
| `ControlBar` | Media control buttons (mic, camera, screen share, end call) |
| `ScreenShareVideo` | Renders screen share content |
| `PageActions` | Handles click, scroll, and highlight events from the avatar |
| `AudioRenderer` | Handles avatar audio playback |
All components and hooks are fully typed:

```ts
import type {
  AvatarCallProps,
  SessionCredentials,
  SessionState,
} from '@runwayml/avatars-react';
```

This SDK uses WebRTC for real-time communication. Supported browsers:
- Chrome 74+
- Firefox 78+
- Safari 14.1+
- Edge 79+
Users must grant camera and microphone permissions when prompted.
**"Failed to connect" or timeout errors**

- Verify your server endpoint is returning the correct credential format
- Check that `RUNWAYML_API_SECRET` is set correctly on your server
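To sanity-check the credential format, a small runtime guard can verify that the JSON your endpoint returns matches the `{ sessionId, sessionKey }` shape the client expects. The guard itself is illustrative, not part of the SDK:

```typescript
// Illustrative guard: checks that a parsed response body looks like
// the { sessionId, sessionKey } payload the connect endpoint should return.
function isValidCredentials(
  body: unknown,
): body is { sessionId: string; sessionKey: string } {
  if (typeof body !== 'object' || body === null) return false;
  const b = body as Record<string, unknown>;
  return typeof b.sessionId === 'string' && typeof b.sessionKey === 'string';
}
```

Logging the raw response body alongside this check will quickly reveal mismatched field names or an error payload being returned with a 200 status.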
**No video/audio**

- Ensure the user has granted camera/microphone permissions
- Check the browser console for WebRTC errors
- Verify the device has a working camera/microphone

**CORS errors**

- Your server endpoint must accept requests from your client's origin
- For local development, ensure both client and server are on compatible origins
This SDK ships with Agent Skills that teach AI coding assistants how to integrate Runway avatars into your app. Install the SDK skill with:

```bash
npx skills add runwayml/avatars-sdk-react
```

For the full Runway platform (video generation, image generation, audio, knowledge documents, and more):

```bash
npx skills add runwayml/skills
```

Once installed, agents like Claude Code, Cursor, Codex, Cline, and others will have access to SDK documentation, integration patterns, and best practices.
Drop this into `.cursor/rules/runway-avatars.mdc` (or your project's `AGENTS.md`) to give your AI assistant context about the SDK:

```markdown
# Runway Avatar SDK

When building with `@runwayml/avatars-react`:

- Session creation requires a server endpoint — never expose `RUNWAYML_API_SECRET` to the client
- Use `AvatarCall` for quick setup (handles session creation) or `AvatarSession` for full control with pre-fetched credentials
- Preset avatars use `{ type: 'runway-preset', presetId }`, custom avatars use `{ type: 'custom', avatarId }`
- Client events require a custom avatar with a **preset voice**; backend RPC tools work with any voice type
- Import `clientTool` and `pageActionTools` from `@runwayml/avatars-react/api` (server-safe, no React)
- All hooks (`useAvatarSession`, `useAvatar`, `useLocalMedia`, `useClientEvent`) must be used inside `<AvatarCall>` or `<AvatarSession>`
- Session states flow: `idle` → `connecting` → `active` → `ending` → `ended` (or `error`)
- See https://github.com/runwayml/avatars-sdk-react for full documentation and examples
```

MIT