I read a fascinating post about how to balance bitrate per pixel properly. Basically, it boils down to the fact that you need around 0.1 BPP for good fidelity. I use the maximum bitrate the Twitch ingest servers will accept (3500 kbps), and I broadcast at 1280x720 60 FPS, downscaled from my 1920x1080 144Hz screen. There is a formula to figure out BPP:
BPP = (bitrate in kbps * 1000) / (width * height * fps)
If we plug in my stream settings, we end up with only around 0.06 BPP, just over half of what would be considered "nice fidelity".
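If you want to play with the numbers yourself, here's a minimal Python sketch of that formula (my own illustration, not from the post; the function name is just something I made up):

```python
# Bits per pixel: how many bits the encoder can spend on each pixel of each frame.
def bits_per_pixel(bitrate_kbps: float, width: int, height: int, fps: int) -> float:
    # The * 1000 converts kbps to bits per second.
    return (bitrate_kbps * 1000) / (width * height * fps)

# My current stream settings:
print(bits_per_pixel(3500, 1280, 720, 60))  # ~0.063 BPP
```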
Considering I cannot increase the bitrate any further, and I wasn't exactly pleased with how Overwatch was looking on the stream, I have been testing a 30 FPS broadcast for FPS games while keeping lower-motion games at 60 FPS.
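The math behind that test: halving the framerate doubles the bits available per pixel, so the same sketch gives:

```python
print(bits_per_pixel(3500, 1280, 720, 30))  # ~0.127 BPP, comfortably above the 0.1 target
```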
I'm comparing the videos back and forth and hope to make a more solid decision soon.
If you want to weigh in, shoot me a message or tweet @sw33tp34. :)