help with camera feed live processing #843
Comments
This is my solution: "Graphics.Blit(renderTexture, videoMaterial);" You can then get the image from the renderTexture: Texture2D tex = new Texture2D(renderTexture.width, renderTexture.height, TextureFormat.RGB24, false); Hope this helps, and please forgive my English.
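For reference, here is a minimal sketch of one way to implement the blit step described above. It assumes videoMaterial is the material the WebRTC video renderer draws with (e.g. the MeshRenderer's material from the tutorial) and that renderTexture has already been created at the feed's resolution; both names are placeholders, not identifiers from the tutorial.

```csharp
using UnityEngine;

// Hedged sketch: run the video material through a Blit so its current output
// (e.g. the YUV->RGB conversion done by the feed's shader) lands in a RenderTexture.
public class VideoFrameCapture : MonoBehaviour
{
    public Material videoMaterial;       // assumed: material fed by the video stream
    public RenderTexture renderTexture;  // assumed: created with the feed's width/height

    void LateUpdate()
    {
        // Passing null as the source is fine here only if the material's shader
        // samples its own textures rather than _MainTex.
        Graphics.Blit(null, renderTexture, videoMaterial);
    }
}
```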
Excuse me, I'm a newbie in Unity. I have the renderTexture on a RawImage object. I'm confused about the "Graphics.Blit" command.
Did you want to get the camera image, or only a Texture2D? You cannot use a single Texture2D to get the raw image; the material is the real raw image.
Sorry for the delay, I want the current camera image in a Texture2D.
Sorry for the delay, I was on New Year's holiday. Note:
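Putting the replies together, a hedged sketch of reading the RenderTexture contents back into a Texture2D on the CPU follows. This is the standard Unity ReadPixels pattern, not anything specific to MixedReality-WebRTC; "renderTexture" is whatever RenderTexture your RawImage (or the Blit above) is showing.

```csharp
using UnityEngine;

public static class RenderTextureReader
{
    // Copies the current contents of a RenderTexture into a new Texture2D.
    public static Texture2D ToTexture2D(RenderTexture renderTexture)
    {
        var previous = RenderTexture.active;
        RenderTexture.active = renderTexture;

        var tex = new Texture2D(renderTexture.width, renderTexture.height,
                                TextureFormat.RGB24, false);
        // ReadPixels reads from the currently active RenderTexture.
        tex.ReadPixels(new Rect(0, 0, renderTexture.width, renderTexture.height), 0, 0);
        tex.Apply();

        RenderTexture.active = previous;
        return tex;
    }
}
```

Be aware that ReadPixels stalls the GPU, so calling it every frame is expensive; Unity's AsyncGPUReadback may be worth looking into if the per-frame cost becomes a problem.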
Hi there,
I’m working on MS HoloLens 2. My goal is to analyze in real time the frames coming from the webcam feed and process them with the Barracuda ML engine. I’m following this tutorial to view the camera feed on a MeshRenderer component: https://microsoft.github.io/MixedReality-WebRTC/manual/unity/helloworld-unity-localvideo.html.
However, my problem is getting a Texture2D with the raw images from the camera live stream. Any suggestions?
Thanks,
Davide
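On the Barracuda side, a rough sketch of feeding a captured frame into an inference worker is shown below. It assumes the Barracuda package is installed and that modelAsset is an imported ONNX model assigned in the Inspector; none of these names come from the thread, and the frame passed in can be the Texture2D (or RenderTexture) produced by the snippets above.

```csharp
using Unity.Barracuda;
using UnityEngine;

public class FrameInference : MonoBehaviour
{
    public NNModel modelAsset;   // assumed: an imported .onnx asset assigned in the Inspector
    private IWorker worker;

    void Start()
    {
        var model = ModelLoader.Load(modelAsset);
        worker = WorkerFactory.CreateWorker(WorkerFactory.Type.Auto, model);
    }

    public void RunOnFrame(Texture frame)
    {
        // Barracuda can build an input tensor directly from a Texture (3 channels here).
        using (var input = new Tensor(frame, channels: 3))
        {
            worker.Execute(input);
            Tensor output = worker.PeekOutput();
            // ... consume the output tensor here ...
            output.Dispose();
        }
    }

    void OnDestroy()
    {
        worker?.Dispose();
    }
}
```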