I'm currently developing a drone simulator in Unity 5 and I have four cameras in my scene. I'm now working on a way to stream the view of one virtual camera to my Android smartphone in real time. I was thinking of these steps:
1. Read the screen pixels into an empty Texture2D using Texture2D.ReadPixels
2. Encode the Texture2D using EncodeToJPG or EncodeToPNG
3. Send the encoded bytes to the device (through a socket, or with this WebRTC solution)
4. On the device, read the bytes into an image (using a combination of ByteArrayInputStream and BitmapFactory, for example)
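For steps 3 and 4, a plain-socket approach needs some framing so the receiver knows where one JPEG ends and the next begins. Below is a minimal sketch of one possible scheme (a hypothetical 4-byte big-endian length prefix per frame; the class and method names are made up for illustration). It is plain Java, so on Android the returned bytes would then go to BitmapFactory.decodeByteArray:

```java
import java.io.ByteArrayOutputStream;
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.IOException;
import java.io.InputStream;

// Hypothetical framing: each frame is a 4-byte big-endian length
// followed by that many bytes of JPEG data.
public class FrameProtocol {

    // Sender side: wrap one encoded frame with its length prefix.
    public static byte[] frame(byte[] jpegBytes) throws IOException {
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        DataOutputStream out = new DataOutputStream(buf);
        out.writeInt(jpegBytes.length);
        out.write(jpegBytes);
        out.flush();
        return buf.toByteArray();
    }

    // Receiver side: read exactly one complete frame from the stream.
    // On Android you would pass the result to
    // BitmapFactory.decodeByteArray(bytes, 0, bytes.length).
    public static byte[] readFrame(InputStream in) throws IOException {
        DataInputStream data = new DataInputStream(in);
        int length = data.readInt();   // blocks until the 4 length bytes arrive
        byte[] jpegBytes = new byte[length];
        data.readFully(jpegBytes);     // blocks until the whole frame is in
        return jpegBytes;
    }
}
```

In a real app the InputStream would come from socket.getInputStream() on a TCP Socket, and readFrame would run in a loop on a background thread so it never blocks the UI.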
I've already finished the first two steps. I have no idea how to work with sockets, so I'd like to use WebRTC instead. But how can I integrate the JavaScript code into my Unity project? Can someone help me with that?
I also found a WebRTC API in a forum post, but my networking skills aren't good enough to follow it. Could somebody tell me where the peer-to-peer connection is created in this code?
Forum: https://forum.unity3d.com/threads/unitypeerjs-simple-webrtc-support-for-unity-webgl.310166/
Code (WebRTC API for Unity): https://www.pastiebin.com/embed/5880b2815c96a
You will not be able to stream encoded JPEGs with Unity in any practical way. One major reason is that JPEG decoding blocks the main thread until the image is decoded and the resulting pixel data has been uploaded to the GPU, which causes serious performance issues on every platform. There is also currently no bullet-proof way to stream video from any kind of video sink; enabling video streaming from the built-in video player is somewhere on the Unity roadmap.
If your Android app does not depend on Unity, you can use HLS or MPEG-DASH for streaming. You will need a video buffer either way, which can be as small as 5-10 frames. One option would be to pipe the video frames directly to FFmpeg, which lets you mux them into an HLS stream. Other options, if the distance between the source device and the Android device is not very large, would be WebRTC or RTSP streams.
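To make the "pipe frames to FFmpeg" idea concrete, here is a minimal sketch assuming ffmpeg is on the PATH. The HlsPipe class, method names, and parameter choices are made up for illustration; the ffmpeg flags (image2pipe, -framerate, -i -, -f hls, -hls_time) are real options:

```java
import java.io.IOException;
import java.io.OutputStream;
import java.util.Arrays;
import java.util.List;

public class HlsPipe {

    // Build the ffmpeg invocation: read concatenated JPEG frames
    // from stdin and mux them into an HLS playlist plus segments.
    public static List<String> ffmpegCommand(int fps, String playlist) {
        return Arrays.asList(
            "ffmpeg",
            "-f", "image2pipe",              // concatenated images on stdin
            "-framerate", String.valueOf(fps),
            "-i", "-",                       // "-" means read from stdin
            "-c:v", "libx264",
            "-f", "hls",
            "-hls_time", "2",                // 2-second segments
            playlist);
    }

    // Launch ffmpeg and return its stdin; the capture loop then
    // writes each encoded JPEG frame to this stream as it is produced.
    public static OutputStream start(int fps, String playlist) throws IOException {
        Process ffmpeg = new ProcessBuilder(ffmpegCommand(fps, playlist))
            .redirectErrorStream(true)
            .start();
        return ffmpeg.getOutputStream();
    }
}
```

The Android player then just points at the generated playlist URL; the few-second segment length is what creates the small buffer mentioned above.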
If you still insist on using Unity on Android, you might have luck with this asset: https://www.assetstore.unity3d.com/en/#!/content/56355 (they offer a free and a reduced version). We use it in production quite a lot, and it gives you access to the system-default encoders inside Unity.
Depending on your use case, JPEG 2000 could work, which would then just need a browser to connect to. JPEG 2000 basically lets you keep an open socket that pushes out JPEG data. (Unity does not support this either.)
Update (2023)
Not much has changed since then. However, there is now a reasonable NewTek NDI implementation out there that should work across a wide range of Unity versions: https://github.com/keijiro/KlakNDI
NDI expects low-latency, virtually packet-loss-free network scenarios and is primarily optimized for low-latency transmission from sender to receiver. It may be a poor choice for remote rendering, for which Unity now provides its own solution to stream from Unity to somewhere else: https://docs.unity3d.com/Packages/[email protected]/manual/index.html