How to feed custom video source (Snap Camera Kit) to LiveKit on Android?

Hi, I am trying to use Snap Camera Kit with LiveKit on Android. My local video track displays correctly, but after I call publishVideoTrack the remote participant's view is blank. Has anyone succeeded in using Snap Camera Kit with LiveKit on Android?
(I just found this issue thread about the same problem: How to feed custom video source (Snap Camera Kit) to LiveKit on Android? · Issue #843 · livekit/client-sdk-android · GitHub)

Sorry, I am not familiar with Snap Camera Kit. But generally, if your local preview works while the remote side sees a blank track, it usually means one of the following:

  • You are rendering CameraKit output to a SurfaceView/TextureView, but not pushing frames into the LiveKit track.

  • Or the track is published without frames being continuously captured.

  • Or the track source/encoding is misconfigured.

See Camera & microphone | LiveKit Documentation.

From LiveKit’s side, publishing video means creating a LocalVideoTrack and continuously supplying frames to it, as shown in the Camera & microphone guide and the video publishing section:
https://docs.livekit.io/transport/media/publish/
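As a rough sketch of that publishing model (assuming the custom-capturer overload of createVideoTrack in the Android SDK; MyFrameCapturer is a hypothetical capturer, not a real class):

```kotlin
// Sketch: publish a track backed by a custom VideoCapturer instead of the
// SDK's built-in camera capturer. `MyFrameCapturer` stands in for a capturer
// that receives frames from an external source (e.g. CameraKit) rather than
// opening the camera itself.
val capturer = MyFrameCapturer()

// The track is created from the capturer, so LiveKit never opens the camera.
val track = room.localParticipant.createVideoTrack("custom-video", capturer)
track.startCapture()

// Publish with source CAMERA so remote clients treat it as the camera feed.
room.localParticipant.publishVideoTrack(
    track,
    VideoTrackPublishOptions(source = Track.Source.CAMERA)
)

// From here on, frames must keep flowing into `capturer` -- if nothing is
// pushed after publishing, remote participants see a blank track.
```

The key point is that the capturer, not LiveKit, decides where frames come from; publishing alone does not produce any video.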

Are you using CameraKit’s frame processor (YUV/texture output), or just rendering to a view? If you share how you’re creating and publishing the LocalVideoTrack, I can pinpoint exactly what’s missing.

I am using CameraKit’s frame processor (I assume you mean cameraKitSession?.processor?.connectOutput()). I think the issue is likely your first assumption, but I couldn’t find a solution. I also checked the logs and I am getting CameraService cameraserver E Conflicts with: Device 1, client package *** (PID 18496, score 0, state 2) — it seems Snap Camera Kit and LiveKit are both trying to use the camera. Below is my relevant code:

   var imageProcessorSource by remember {
       mutableStateOf(CameraXImageProcessorSource(context = context, lifecycleOwner = lifecycleOwner))
   }

AndroidView(
    modifier = Modifier.fillMaxSize(),
    factory = { ctx ->

        // Inflate the layout for the preview
        LayoutInflater.from(ctx).inflate(R.layout.camera_layout, null).apply {
            val viewStub = findViewById<ViewStub>(R.id.camera_kit_stub)

            cameraKitSession = Session(context = ctx) {
                imageProcessorSource(imageProcessorSource)
                attachTo(viewStub)
            }.apply {
                lenses.repository.observe(
                    LensesComponent.Repository.QueryCriteria.ById(lensId, lensGroupId)
                ) { result ->
                    result.whenHasFirst { requestedLens ->
                        lenses.processor.apply(requestedLens)
                    }
                }

            }

            // The API returns a Closeable that must be managed.
            imageProcessorScope.launch {
                // A. Create our custom capturer
                val cameraKitCapturer = CameraKitCapturer2() // copied LiveKit's BitmapFrameCapturer and exposed the SurfaceTexture

                localParticipant = room.localParticipant

                track = localParticipant?.createVideoTrack("camera-kit", cameraKitCapturer) as LocalVideoTrack

                room.localParticipant.publishVideoTrack(track!!, options = VideoTrackPublishOptions(source = Track.Source.CAMERA))
                track?.startCapture() // Start capture to ensure the surface is ready
                if(cameraKitCapturer.surfaceTexture == null) Toast.makeText(context, "Surface texture is null", Toast.LENGTH_LONG).show()
                cameraKitCapturer.surfaceTexture?.let {surfaceTexture ->
                    val liveKitOutput = LiveKitOutput(Surface(surfaceTexture))
                    closable = cameraKitSession?.processor?.connectOutput(liveKitOutput)

                }

            }
        }
    },
    update = {
        // E. Apply lenses when lensId changes
        cameraKitSession?.lenses?.repository?.observe(
            LensesComponent.Repository.QueryCriteria.ById(lensId, lensGroupId)
        ) { result ->
            result.whenHasFirst { requestedLens ->
                cameraKitSession?.lenses?.processor?.apply(requestedLens)
            }
        }
    }
)

Your logs confirm the real issue: both Snap CameraKit and LiveKit are trying to open the physical camera. That’s why you see the CameraService Conflicts with: Device 1 error. Android allows only one client to own a camera device at a time.

From LiveKit’s perspective, publishing video means continuously feeding frames into a LocalVideoTrack source — it does not require LiveKit to open the camera itself. See the publishing model in the Camera & microphone guide.

Right now, your CameraKitCapturer2() is likely backed by Camera2 internally, so LiveKit is competing with CameraKit for the camera.

The correct architecture is:

Snap CameraKit owns the camera → you receive processed frames → you push those frames into a custom LiveKit video source → LiveKit publishes that track.
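Putting that pipeline together, a sketch could look like this (untested against the real SDKs; CameraKitSurfaceCapturer is a hypothetical capturer modeled on your CameraKitCapturer2 / LiveKit's BitmapFrameCapturer, and LiveKitOutput wraps a Surface as a CameraKit Output, as in your code):

```kotlin
// 1. LiveKit side: a capturer that only exposes a SurfaceTexture and never
//    touches the camera. CameraKit will render processed frames into it.
val capturer = CameraKitSurfaceCapturer()  // hypothetical, like your CameraKitCapturer2

val track = room.localParticipant.createVideoTrack("camera-kit", capturer)
track.startCapture()  // allocate the SurfaceTexture before connecting CameraKit

// 2. CameraKit side: CameraKit remains the only camera owner. Its processor
//    draws lens-processed frames into LiveKit's surface.
val surface = Surface(capturer.surfaceTexture)
val closeable = cameraKitSession.processor.connectOutput(LiveKitOutput(surface))

// 3. Publish. LiveKit encodes whatever CameraKit draws into the surface and
//    never calls Camera2 itself, so there is no CameraService conflict.
room.localParticipant.publishVideoTrack(
    track,
    VideoTrackPublishOptions(source = Track.Source.CAMERA)
)
```

The important invariant: nothing on the LiveKit path may open a camera device. If your capturer (or any code it inherits) still creates a Camera2/CameraX session, you will see exactly the Conflicts with: Device 1 log you posted.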