Hello everyone,
I want to make a live video stream (projected onto a texture) over the Photon Network. To do that, I created a function that calls GetPixels32 on my WebCamTexture (160x120 to limit the data size), which returns a pixel array. I serialize that array and send it with an RPC.
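For reference, here is a minimal sketch of what my sender/receiver look like, assuming Photon PUN 2. The class and method names (WebcamStreamer, SendFrame, ReceiveFrame) are just mine, not anything from the Photon API:

```csharp
using UnityEngine;
using Photon.Pun;

public class WebcamStreamer : MonoBehaviourPun
{
    WebCamTexture cam;
    Texture2D receivedTex;

    void Start()
    {
        cam = new WebCamTexture(160, 120); // small resolution to limit payload size
        cam.Play();
        receivedTex = new Texture2D(160, 120, TextureFormat.RGBA32, false);
        InvokeRepeating(nameof(SendFrame), 1f, 0.5f); // any faster and clients disconnect
    }

    void SendFrame()
    {
        // 160 * 120 pixels * 4 bytes (Color32) = 76,800 bytes per frame
        Color32[] pixels = cam.GetPixels32();
        byte[] payload = new byte[pixels.Length * 4];
        for (int i = 0; i < pixels.Length; i++)
        {
            payload[i * 4]     = pixels[i].r;
            payload[i * 4 + 1] = pixels[i].g;
            payload[i * 4 + 2] = pixels[i].b;
            payload[i * 4 + 3] = pixels[i].a;
        }
        photonView.RPC(nameof(ReceiveFrame), RpcTarget.Others, payload);
    }

    [PunRPC]
    void ReceiveFrame(byte[] payload)
    {
        // Rebuild the frame on the receiving side; receivedTex is assigned
        // to the target material elsewhere.
        receivedTex.LoadRawTextureData(payload);
        receivedTex.Apply();
    }
}
```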
The problem is that I can't call my function too often (at most every 0.5 s, so far from 30 or 60 fps), otherwise my clients get disconnected, most likely because of the volume of incoming messages: with my current configuration, I only get log messages like "QueueIncomingReliableWarning: this client buffers many incoming messages. This is OK temporarily, etc."
- Am I doing it right, or is there a more efficient solution?
- If not, what about lossless compression?
Thanks