React Native: Getting Started
Ship a working Orga session from your mobile app in one guided path. You will install the React Native SDK, configure native permissions, proxy credentials through a secure backend, and render camera/audio controls to verify the full round trip on a physical device.
What you will build
- Native camera/microphone permissions wired for Expo SDK 50+ (or bare RN).
- A backend endpoint that issues ephemeral tokens plus ICE servers.
- An initialized Orga provider and a demo screen with OrgaAICameraView and built-in controls.
Prerequisites
- Node.js 18+, Expo SDK 50+ (development build) or bare React Native project.
- Physical device for testing; simulators and emulators lack reliable media support.
- Orga AI account with an API key from the Orga dashboard.
- Secure server-side storage for environment variables (the API key must never live on-device).
Examples use Expo Router with API Routes for clarity. Bare projects can use any backend framework as long as it exposes an HTTPS endpoint for the session config.
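Whatever backend you choose, the payload the client expects (based on the examples in this guide) is an ephemeral token plus a list of ICE servers. As a sketch, a TypeScript type and runtime check could look like the following — the `SessionConfig` and `isSessionConfig` names are illustrative, not exports of the SDK:

```typescript
// Illustrative types only -- the SDK may export its own equivalents.
interface SessionConfig {
  ephemeralToken: string;
  iceServers: unknown[]; // RTCIceServer-shaped objects from your backend
}

// Narrow an arbitrary JSON payload to SessionConfig before trusting it.
function isSessionConfig(value: unknown): value is SessionConfig {
  if (typeof value !== 'object' || value === null) return false;
  const v = value as Record<string, unknown>;
  return typeof v.ephemeralToken === 'string' && Array.isArray(v.iceServers);
}
```

Validating the payload up front turns a misconfigured backend into a clear error instead of a silent connection failure later.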
Install dependencies
Install the React Native SDK plus the native media helpers it depends on:

```bash
npm install @orga-ai/react-native react-native-webrtc react-native-incall-manager
```

Expo Go is not supported because it cannot expose native camera/mic APIs. Create an Expo development client and run on a physical device. Follow Expo's guide for development builds.
Configure permissions (Expo)
Update app.json (or app.config.js) so iOS and Android request camera/mic access:
```json
{
  "expo": {
    "ios": {
      "infoPlist": {
        "NSCameraUsageDescription": "Allow $(PRODUCT_NAME) to access your camera",
        "NSMicrophoneUsageDescription": "Allow $(PRODUCT_NAME) to access your microphone"
      }
    },
    "android": {
      "permissions": [
        "android.permission.CAMERA",
        "android.permission.RECORD_AUDIO"
      ]
    }
  }
}
```

Then build and install your development client:

```bash
npx expo prebuild
eas build --platform all --profile development
```

Create a secure backend proxy
Mobile apps must never store your API key. Instead, call a backend endpoint that returns an ephemeral token and ICE servers, generated via the Orga Node SDK (or another server SDK).
```typescript
import 'dotenv/config';
import express from 'express';
import cors from 'cors';
import { OrgaAI } from '@orga-ai/node';

const app = express();
app.use(cors());

const orga = new OrgaAI({
  apiKey: process.env.ORGA_API_KEY!,
});

app.get('/api/orga-client-secrets', async (_req, res) => {
  try {
    const { ephemeralToken, iceServers } = await orga.getSessionConfig();
    res.json({ ephemeralToken, iceServers });
  } catch (error) {
    console.error('Failed to get session config:', error);
    res.status(500).json({ error: 'Internal server error' });
  }
});

app.listen(5000, () => console.log('Proxy running on http://localhost:5000'));
```

```bash
# .env
ORGA_API_KEY=sk_orga_ai_******************************
```

Never embed the API key in the app bundle. Point your mobile client at this backend (e.g., https://api.yourdomain.com/orga-client-secrets) over HTTPS.
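On the client side, it can help to validate and fail fast on this request before handing the result to the SDK. A minimal sketch — the `getSessionConfig` helper name and the injectable `fetchFn` parameter are our own, not part of the Orga SDK:

```typescript
// Hypothetical client-side helper -- not part of the Orga SDK.
type SessionConfig = { ephemeralToken: string; iceServers: unknown[] };

async function getSessionConfig(
  url: string,
  fetchFn: typeof fetch = fetch, // injectable for unit testing
): Promise<SessionConfig> {
  const response = await fetchFn(url);
  if (!response.ok) {
    throw new Error(`Session config request failed: ${response.status}`);
  }
  const body = await response.json();
  if (typeof body?.ephemeralToken !== 'string' || !Array.isArray(body?.iceServers)) {
    throw new Error('Malformed session config payload');
  }
  return { ephemeralToken: body.ephemeralToken, iceServers: body.iceServers };
}
```

Accepting `fetchFn` as a parameter makes the helper testable with a fake response, without a live backend.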
Initialize the SDK
Run OrgaAI.init() exactly once—ideally inside your root layout—then wrap the tree with OrgaAIProvider.
```tsx
import { Stack } from 'expo-router';
import { OrgaAI, OrgaAIProvider } from '@orga-ai/react-native';

OrgaAI.init({
  logLevel: 'debug',
  model: 'orga-1-beta',
  voice: 'alloy',
  fetchSessionConfig: async () => {
    const response = await fetch('https://api.yourdomain.com/orga-client-secrets');
    if (!response.ok) throw new Error('Failed to fetch session config');
    const { ephemeralToken, iceServers } = await response.json();
    return { ephemeralToken, iceServers };
  },
});

export default function RootLayout() {
  return (
    <OrgaAIProvider>
      <Stack />
    </OrgaAIProvider>
  );
}
```

If you use Expo API Routes (app/api/*), replace the fetch URL with your relative path while developing locally.
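If you do go the Expo API Routes route, the backend proxy can live alongside your app. A sketch of app/api/orga-client-secrets+api.ts — the file name follows Expo Router's +api convention, and the Orga calls mirror the Express example above; treat the details as an assumption, not a verified recipe:

```typescript
// app/api/orga-client-secrets+api.ts (Expo Router API Route -- illustrative sketch)
import { OrgaAI } from '@orga-ai/node';

const orga = new OrgaAI({ apiKey: process.env.ORGA_API_KEY! });

export async function GET(): Promise<Response> {
  try {
    const { ephemeralToken, iceServers } = await orga.getSessionConfig();
    return Response.json({ ephemeralToken, iceServers });
  } catch (error) {
    console.error('Failed to get session config:', error);
    return Response.json({ error: 'Internal server error' }, { status: 500 });
  }
}
```

With this in place, the client's fetchSessionConfig can call fetch('/api/orga-client-secrets') during local development.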
Use it inside a screen
Render the built-in camera view and controls to start/stop sessions and flip/toggle media.
```tsx
import { StyleSheet, View } from 'react-native';
import {
  OrgaAICameraView,
  OrgaAIControls,
  useOrgaAI,
} from '@orga-ai/react-native';

export default function HomeScreen() {
  const {
    connectionState,
    isCameraOn,
    isMicOn,
    userVideoStream,
    startSession,
    endSession,
    toggleCamera,
    toggleMic,
    flipCamera,
  } = useOrgaAI();

  return (
    <View style={styles.container}>
      <OrgaAICameraView
        streamURL={userVideoStream?.toURL()}
        containerStyle={styles.cameraContainer}
        style={{ width: '100%', height: '100%' }}
      >
        <OrgaAIControls
          connectionState={connectionState}
          isCameraOn={isCameraOn}
          isMicOn={isMicOn}
          onStartSession={startSession}
          onEndSession={endSession}
          onToggleCamera={toggleCamera}
          onToggleMic={toggleMic}
          onFlipCamera={flipCamera}
        />
      </OrgaAICameraView>
    </View>
  );
}

const styles = StyleSheet.create({
  container: { flex: 1, backgroundColor: '#0f172a' },
  cameraContainer: { width: '100%', height: '100%' },
});
```

Build and test on device
Run the development client on the same network as your backend (or expose it publicly via tunneling).
```bash
npx expo start --dev-client
```

Accept camera/mic prompts when they appear. Keep logLevel: 'debug' enabled to watch ICE negotiation and media events in the Expo logs.
Next steps
- Read the React Native SDK architecture overview.
- Explore the React Native API reference.
- Need troubleshooting help? Use the React Native issues how-to.