I've noticed a distinct dip in perceived sharpness on many high-bitrate 1440p uploads lately. Is this an encoding inefficiency we should be tracking more closely, or is YouTube's dynamic bitrate adjustment simply failing to optimize for these mid-tier resolutions? Curious whether others are benchmarking this drop-off against results from even six months ago.
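For anyone who wants to spot-check this themselves, here's a minimal sketch of one way to compare the advertised video bitrates across resolutions for a given upload, using yt-dlp's Python API. The URL is a placeholder, and this only surfaces the bitrates yt-dlp reports per format (the `tbr` field, in kbps), not perceptual quality, so treat it as a rough proxy rather than a real sharpness benchmark.

```python
# Rough sketch: list the best advertised bitrate per resolution for one upload.
# Assumes yt-dlp is installed; the URL below is a hypothetical placeholder.
from yt_dlp import YoutubeDL

VIDEO_URL = "https://www.youtube.com/watch?v=PLACEHOLDER"  # replace with a real upload

with YoutubeDL({"quiet": True, "skip_download": True}) as ydl:
    info = ydl.extract_info(VIDEO_URL, download=False)

# Keep video-only formats that report a height, so 1440p can be compared
# against neighbouring resolutions like 1080p and 2160p.
video_formats = [
    f for f in info.get("formats", [])
    if f.get("vcodec") not in (None, "none") and f.get("height")
]

by_height = {}
for f in video_formats:
    # 'tbr' is the total bitrate in kbps as reported by yt-dlp; it can be
    # missing for some formats, so fall back to 0.
    by_height.setdefault(f["height"], []).append(f.get("tbr") or 0)

for height in sorted(by_height, reverse=True):
    best = max(by_height[height])
    print(f"{height}p: best advertised bitrate ~ {best:.0f} kbps")
```

Running this on the same video IDs over time (or on re-uploads of the same source) would at least show whether the 1440p ladder rung is getting thinner; for actual perceived sharpness you'd still need a reference-based metric like VMAF against a local master.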