
LiveKit Flutter SDK #
Official Flutter SDK for LiveKit. Easily add real-time video and audio to your Flutter apps.
This package is published to pub.dev as livekit_client.
Docs #
More docs and guides are available at https://docs.livekit.io
Current supported features #

| Feature | Subscribe/Publish | Simulcast | Background audio | Screen sharing |
| :--- | :---: | :---: | :---: | :---: |
| Web | 🟢 | 🟢 | 🟢 | 🟢 |
| iOS | 🟢 | 🟢 | 🟢 | 🟢 |
| Android | 🟢 | 🟢 | 🟢 | 🟢 |
| Mac | 🟢 | 🟢 | 🟢 | 🟢 |
| Windows | 🟢 | 🟢 | 🟢 | 🟢 |
🟢 = Available
🟡 = Coming soon (Work in progress)
🔴 = Not currently available (Possibly in the future)
Example app #
We built a multi-user conferencing app as an example in the example/ folder. You can join the same room from any supported LiveKit clients.
Installation #
Add this package to your pubspec.yaml:
```yaml
dependencies:
  livekit_client: <version>
```
iOS #
Camera and microphone usage need to be declared in your Info.plist file.
```xml
<dict>
  ...
  <key>NSCameraUsageDescription</key>
  <string>$(PRODUCT_NAME) uses your camera</string>
  <key>NSMicrophoneUsageDescription</key>
  <string>$(PRODUCT_NAME) uses your microphone</string>
</dict>
```
If background mode is enabled, your application can keep a voice call running when it is switched to the background. Select the app target in Xcode, click the Capabilities tab, enable Background Modes, and check Audio, AirPlay, and Picture in Picture.
Your Info.plist should have the following entries.
```xml
<dict>
  ...
  <key>UIBackgroundModes</key>
  <array>
    <string>audio</string>
  </array>
</dict>
```
Notes
Since Xcode 14 no longer supports 32-bit builds, and our latest version is based on libwebrtc m104+, the iOS framework no longer supports 32-bit builds. We strongly recommend upgrading to Flutter 3.3.0+. If you are using Flutter 3.0.0 or below, there is a high chance that your Flutter app cannot be compiled correctly due to the missing i386 and arm 32-bit frameworks (#132, #172).
You can modify your {projects_dir}/ios/Podfile to work around this issue:
```ruby
post_install do |installer|
  installer.pods_project.targets.each do |target|
    flutter_additional_ios_build_settings(target)
    target.build_configurations.each do |config|
      # Workaround for https://github.com/flutter/flutter/issues/64502
      config.build_settings['ONLY_ACTIVE_ARCH'] = 'YES' # <= this line
    end
  end
end
```
Android #
A set of permissions needs to be declared in your AndroidManifest.xml. These are required by Flutter WebRTC, which we depend on.
```xml
<manifest xmlns:android="http://schemas.android.com/apk/res/android" package="com.your.package">
  <uses-feature android:name="android.hardware.camera" />
  <uses-feature android:name="android.hardware.camera.autofocus" />
  <uses-permission android:name="android.permission.CAMERA" />
  <uses-permission android:name="android.permission.RECORD_AUDIO" />
  <uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
  <uses-permission android:name="android.permission.CHANGE_NETWORK_STATE" />
  <uses-permission android:name="android.permission.MODIFY_AUDIO_SETTINGS" />
  <uses-permission android:name="android.permission.BLUETOOTH" android:maxSdkVersion="30" />
  <uses-permission android:name="android.permission.BLUETOOTH_ADMIN" android:maxSdkVersion="30" />
  <uses-permission android:name="android.permission.BLUETOOTH_CONNECT" />
  ...
</manifest>
```
Desktop support #
To enable Flutter desktop development, please follow the instructions here.
On Apple Silicon (M1) Macs, you will also need to install the x86_64 version of FFI:
```shell
sudo arch -x86_64 gem install ffi
```
On Windows, VS 2019 is needed (the link in the Flutter docs will download VS 2022).
Usage #
Connecting to a room and publishing video & audio #
```dart
final roomOptions = RoomOptions(
  adaptiveStream: true,
  dynacast: true,
  // ... your room options
);

final room = await LiveKitClient.connect(url, token, roomOptions: roomOptions);
try {
  // video will fail when running in the iOS simulator
  await room.localParticipant.setCameraEnabled(true);
} catch (error) {
  print('Could not publish video, error: $error');
}

await room.localParticipant.setMicrophoneEnabled(true);
```
Screen sharing #
Screen sharing is supported across all platforms. You can enable it with:
```dart
room.localParticipant.setScreenShareEnabled(true);
```
Android
On Android, you have to define a foreground service in your AndroidManifest.xml:
```xml
<manifest xmlns:android="http://schemas.android.com/apk/res/android">
  <application>
    ...
    <service
        android:name="de.julianassmann.flutter_background.IsolateHolderService"
        android:enabled="true"
        android:exported="false"
        android:foregroundServiceType="mediaProjection" />
  </application>
</manifest>
```
iOS
On iOS, a broadcast extension is needed in order to capture screen content from other apps. See the setup guide for instructions.
Advanced track manipulation #
The setCameraEnabled/setMicrophoneEnabled helpers are wrappers around the Track API.
You can also manually create and publish tracks:
```dart
var localVideo = await LocalVideoTrack.createCameraTrack();
await room.localParticipant.publishVideoTrack(localVideo);
```
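Audio works the same way. A minimal sketch, assuming the audio counterparts to the video APIs above (LocalAudioTrack.create() and publishAudioTrack(); check your SDK version for exact names):

```dart
// Assumed audio counterpart to the video example above.
var localAudio = await LocalAudioTrack.create();
await room.localParticipant.publishAudioTrack(localAudio);
```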
Rendering video #
Each track can be rendered separately with the provided VideoTrackRenderer widget.
```dart
VideoTrack? track;

@override
Widget build(BuildContext context) {
  if (track != null) {
    return VideoTrackRenderer(track);
  } else {
    return Container(
      color: Colors.grey,
    );
  }
}
```
Audio handling #
Audio tracks are played automatically as long as you are subscribed to them.
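If you want to react when audio playback begins, you can listen for subscription events. A sketch, assuming a TrackSubscribedEvent type as found in events.dart (field names may differ by version):

```dart
// Sketch: observe when a remote audio track becomes active.
// TrackSubscribedEvent and its fields are assumptions; see events.dart.
final listener = room.createListener();
listener.on<TrackSubscribedEvent>((event) {
  if (event.publication.kind == TrackType.AUDIO) {
    // Playback starts automatically; update your UI here if needed.
    print('Subscribed to audio from ${event.participant.identity}');
  }
});
```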
Handling changes #
LiveKit client makes it simple to build declarative UI that reacts to state changes. It notifies of changes in two ways:

- ChangeNotifier - a generic notification of changes. This is useful when you are building reactive UI and only care about changes that may impact rendering.
- EventsListener<Event> - a listener pattern to listen for specific events (see events.dart).

This example will show you how to use both to react to room events.
```dart
class RoomWidget extends StatefulWidget {
  final Room room;

  RoomWidget(this.room);

  @override
  State<StatefulWidget> createState() {
    return _RoomState();
  }
}

class _RoomState extends State<RoomWidget> {
  late final EventsListener<RoomEvent> _listener = widget.room.createListener();

  @override
  void initState() {
    super.initState();
    // used for generic change updates
    widget.room.addListener(_onChange);

    // used for specific events
    _listener
      ..on<RoomDisconnectedEvent>((_) {
        // handle disconnect
      })
      ..on<ParticipantConnectedEvent>((e) {
        print("participant joined: ${e.participant.identity}");
      });
  }

  @override
  void dispose() {
    // be sure to dispose the listener to stop receiving further updates
    _listener.dispose();
    widget.room.removeListener(_onChange);
    super.dispose();
  }

  void _onChange() {
    // perform computations and then call setState
    // setState will trigger a build
    setState(() {
      // your updates here
    });
  }

  @override
  Widget build(BuildContext context) {
    // your build function
    return Container();
  }
}
```
Similarly, you could do the same when rendering participants. Reacting to changes makes it possible to handle tracks being published/unpublished or to re-order participants in your UI.
```dart
class VideoView extends StatefulWidget {
  final Participant participant;

  VideoView(this.participant);

  @override
  State<StatefulWidget> createState() {
    return _VideoViewState();
  }
}

class _VideoViewState extends State<VideoView> {
  TrackPublication? videoPub;

  @override
  void initState() {
    super.initState();
    widget.participant.addListener(_onParticipantChanged);
    // trigger initial change
    _onParticipantChanged();
  }

  @override
  void dispose() {
    widget.participant.removeListener(_onParticipantChanged);
    super.dispose();
  }

  @override
  void didUpdateWidget(covariant VideoView oldWidget) {
    oldWidget.participant.removeListener(_onParticipantChanged);
    widget.participant.addListener(_onParticipantChanged);
    _onParticipantChanged();
    super.didUpdateWidget(oldWidget);
  }

  void _onParticipantChanged() {
    var subscribedVideos = widget.participant.videoTracks.values.where((pub) {
      return pub.kind == TrackType.VIDEO &&
          !pub.isScreenShare &&
          pub.subscribed;
    });

    setState(() {
      if (subscribedVideos.isNotEmpty) {
        var videoPub = subscribedVideos.first;
        // when muted, show a placeholder
        if (!videoPub.muted) {
          this.videoPub = videoPub;
          return;
        }
      }
      this.videoPub = null;
    });
  }

  @override
  Widget build(BuildContext context) {
    var videoPub = this.videoPub;
    if (videoPub != null) {
      return VideoTrackRenderer(videoPub.track as VideoTrack);
    } else {
      return Container(
        color: Colors.grey,
      );
    }
  }
}
```
Mute, unmute local tracks #
On LocalTrackPublications, you can control whether the track is muted by setting its muted property. Changing the mute status will generate an onTrackMuted or onTrackUnmuted delegate call for the local participant. Other participants will receive the status change as well.
```dart
// mute track
trackPub.muted = true;

// unmute track
trackPub.muted = false;
```
Subscriber controls #
When subscribing to remote tracks, the client has precise control over the status of its subscriptions. You can subscribe or unsubscribe from a track, change its quality, or disable the track temporarily.
These controls are accessible on the RemoteTrackPublication object.
For more info, see Subscriber controls.
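To illustrate, a sketch of these controls on a RemoteTrackPublication. The method names (subscribe, setVideoQuality, disable) are assumptions based on the description above; consult the Subscriber controls docs for the exact API in your SDK version:

```dart
// Hypothetical usage of subscriber controls; verify names against your SDK.
Future<void> tunePublication(RemoteTrackPublication pub) async {
  await pub.subscribe();                       // opt in to receiving the track
  await pub.setVideoQuality(VideoQuality.LOW); // request a lower simulcast layer
  await pub.disable();                         // temporarily pause delivery
}
```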
Getting help / Contributing #
Please join us on Slack to get help from our devs and community members. We welcome your contributions (PRs), and details can be discussed there.
License #
Apache License 2.0
Thanks #
A huge thank you to flutter-webrtc for making it possible to use WebRTC in Flutter.
