Quote Originally Posted by Maeka
Not sure where else to put this on forums, BUT...

I run the game at 1920x1080 now that I have a display capable of it. I have a Ryzen 5 1500X, 16 GB of 2400 MHz RAM, and a GTX 1050 Ti.

I was streaming with OBS (64-bit) back when I had to run 1360x768 on my previous display. I wanted to stream for a friend, but I told him, "Eh, I dunno if that will work now that I run a higher resolution."

Well, I set OBS to downscale (Bicubic, 16 samples) to 1440x810 and it was OK. The video wasn't skipping, though I was seeing a minor framerate loss (50-55 FPS). My upload speed isn't a factor; I have 10 Mbit and OBS was only using 4-6 Mbit.

So my question is....

Does the computer have to do extra work to downscale? Would it use less CPU to simply stream at the native 1920x1080, so that it doesn't have to modify the video output?
Yes. Downscaling and upscaling incur a 2-frame latency penalty, and they require a full copy of the frame in memory to be passed through the scaling filter.
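
To put a rough number on that extra work, here's a quick Python sketch that times a bicubic 1080p-to-1440x810 resize on the CPU. It assumes the opencv-python and numpy packages are installed; the exact milliseconds will vary with your hardware, and OBS's own scaler isn't necessarily implemented the same way.

# Rough timing of the bicubic downscale OBS would apply to every captured frame.
# Assumes opencv-python and numpy; numbers will differ from machine to machine.
import time
import numpy as np
import cv2

# Fake 1080p BGR frame, standing in for one captured game frame.
frame = np.random.randint(0, 256, (1080, 1920, 3), dtype=np.uint8)

start = time.perf_counter()
for _ in range(60):  # one second's worth of frames at 60 FPS
    cv2.resize(frame, (1440, 810), interpolation=cv2.INTER_CUBIC)
elapsed = time.perf_counter() - start

print(f"bicubic 1080p -> 1440x810: {elapsed / 60 * 1000:.2f} ms per frame")

Whatever number that prints, it's work the machine skips entirely when you stream at native resolution.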

If you have a GeForce card, you should have OBS use the built-in NVENC encoder so it doesn't bleed off CPU power. It's a dedicated block on the GPU and tends to result in no frame drops unless the GPU gets too hot. (The reason I stopped trying to stream FFXIV on the GTX 760 is that, while the encoder could keep up, it would stop streaming after a few minutes; OBS would still say it was streaming, but everyone on the other end just saw a black screen.)

My suggestion would be to record at 1920x1080 but sync-lock it to 30 FPS, and if you need to downscale for bandwidth reasons, go to 720p30.
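
For comparison, here's what that "NVENC + 720p30" pipeline looks like outside of OBS, as a minimal Python sketch that drives ffmpeg. The capture source, bitrate, and stream key are placeholder assumptions, it assumes a Windows machine and an ffmpeg build that includes the h264_nvenc encoder, and it skips audio to keep the example short.

# Minimal sketch: capture the desktop at 30 FPS, downscale to 720p, and encode
# on the GPU with NVENC so the CPU isn't doing the heavy lifting.
# Assumes ffmpeg with h264_nvenc; the stream key below is a placeholder.
import subprocess

STREAM_KEY = "live_xxxxxxxx"  # placeholder; use your own Twitch stream key

cmd = [
    "ffmpeg",
    "-f", "gdigrab", "-framerate", "30", "-i", "desktop",       # capture the Windows desktop at 30 FPS
    "-vf", "scale=1280:720",                                     # downscale to 720p
    "-c:v", "h264_nvenc",                                        # encode on the GeForce card's NVENC block
    "-b:v", "3500k", "-maxrate", "3500k", "-bufsize", "7000k",   # stay under the bitrate cap
    "-f", "flv", f"rtmp://live.twitch.tv/app/{STREAM_KEY}",      # Twitch RTMP ingest (audio omitted here)
]
subprocess.run(cmd, check=True)

Inside OBS the equivalent is just picking NVENC as the encoder, setting the output to 1280x720 at 30 FPS, and capping the bitrate.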

Keep in mind that Twitch won't let you stream at anything higher than 3.5 Mbit; if you go over, you get dropped frames, which makes the video unwatchable.