GPUs can efficiently scale an image by an arbitrary amount (within limits, though display options fall within those limits by design), either using a 3D rendering operation or as the signal is sent from the GPU to the display. If a game is run at a non-native resolution on a laptop (TVs and desktop displays have their own scalers, but laptop displays generally rely on the GPU), one of these two methods will be used. Both of these paths have fully dedicated hardware for arbitrary resizing and are not likely to be optimized for doubling or halving. Both support a technique called "bilinear filtering", though advances in display hardware may provide higher quality. Display scaling is essentially "free", so it is preferred, and the lower the resolution the game is originally rendered at, the faster the frame rate (or the lower the power consumption) - with no special performance benefit for, say, doubling. That said, I do think it'd be easier to render at half the resolution than at something like 40%. This idea does apply to Photoshop when increasing or decreasing resolution: always scale by halves and doubles. I'm not only very interested in the actual answer; it would perhaps also make it easier to choose a resolution for the battery vs. performance trade-off :)
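To make "bilinear filtering" concrete, here is a minimal pure-Python sketch (not the actual hardware scaler, which works in dedicated silicon): each output pixel samples the four nearest source pixels and blends them by their fractional distance. Note that exact integer doubling or halving is just a special case where the weights line up, which is why the hardware gains nothing from it:

```python
# Minimal sketch of bilinear scaling on a grayscale image stored as a
# list of rows. Hardware scalers do the equivalent with dedicated logic.

def bilinear_sample(img, x, y):
    """Sample img at fractional coordinates (x, y) with bilinear weights."""
    h, w = len(img), len(img[0])
    x0, y0 = int(x), int(y)
    x1, y1 = min(x0 + 1, w - 1), min(y0 + 1, h - 1)
    fx, fy = x - x0, y - y0
    top = img[y0][x0] * (1 - fx) + img[y0][x1] * fx
    bot = img[y1][x0] * (1 - fx) + img[y1][x1] * fx
    return top * (1 - fy) + bot * fy

def scale(img, new_w, new_h):
    """Resize img to new_w x new_h - any target size, not just halves."""
    h, w = len(img), len(img[0])
    return [[bilinear_sample(img,
                             x * (w - 1) / max(new_w - 1, 1),
                             y * (h - 1) / max(new_h - 1, 1))
             for x in range(new_w)]
            for y in range(new_h)]

src = [[0, 64, 128, 255],
       [0, 64, 128, 255],
       [0, 64, 128, 255],
       [0, 64, 128, 255]]
half = scale(src, 2, 2)  # an "odd" target like scale(src, 3, 3) costs the same
```

The point of the sketch is that the loop structure is identical for every target size; nothing in it gets cheaper when `new_w` happens to be exactly half of `w`.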
This might be an odd question, but I was wondering if you could improve GPU performance in games, comparatively, by using doubles/halves of the native resolution instead of an odd ratio of it. Let's compare the situation with a metaphor first. If I'd give you a standard A4 sheet of paper, I bet you could more easily fold it in half than into thirds. The same goes if I'd give you two sheets of A4 and ask you to create an A3-size rectangle, instead of a not-A3-but-larger-than-A4 shape. To go back to the actual situation: my MacBook has a native resolution of 2880x1800, but the game offers a variety of resolutions, such as 1440x900, 1680x1050, 2048x1280, 2560x1600 and, for some reason, also 3360x2100, which is above the native resolution. If we apply the sheet-of-paper thought to resolution scaling in games, I'd say that, compared to the native resolution, 1440x900 could run even better than 1280x800, because the GPU could simply say: reduce all pixels by 50%. Not only is that an easier calculation, but it's also much easier to interpolate, if that's the correct word for it, to calculate what each pixel should look like. Sudden thought: this does assume the game environment is rendered at the native resolution and thereafter downscaled to the set resolution, which is probably wrong. The majority of computer monitors and TVs produced today are widescreen, and thus feature a native widescreen resolution. Most games today will feature built-in support for widescreen resolutions, while some games may require tweaking to force such a resolution. Widescreen resolutions can cause problems with older games that were built for a 4:3 (1.33:1) screen, resulting in either a stretched screen or black bars of unused screen space. If a full-screen game is rendered at the wrong aspect ratio, the effects vary based on the monitor: some may stretch the image, while others may instead pillarbox. Use GPU scaling to override monitor behaviour.
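The performance side of the question can be made concrete with a bit of arithmetic: the GPU's rendering work is roughly proportional to the number of pixels drawn, so each resolution the game offers can be expressed as a fraction of the native 2880x1800 pixel count. This is a back-of-the-envelope sketch (it ignores per-frame fixed costs), using the resolutions from the question:

```python
# Rough pixel-count comparison for the offered resolutions,
# relative to the MacBook's native 2880x1800 panel.
NATIVE = (2880, 1800)

options = [(1440, 900), (1680, 1050), (2048, 1280),
           (2560, 1600), (2880, 1800), (3360, 2100)]

native_pixels = NATIVE[0] * NATIVE[1]
results = {}
for w, h in options:
    frac = (w * h) / native_pixels
    # An "integer" halving relationship holds only when both axes divide evenly.
    exact = (NATIVE[0] % w == 0) and (NATIVE[1] % h == 0)
    results[(w, h)] = (frac, exact)
    print(f"{w}x{h}: {frac:.0%} of native pixels, integer ratio: {exact}")
```

Running this shows 1440x900 is exactly 25% of the native pixel count (and the only option besides native with an integer ratio), while 3360x2100 renders more pixels than the panel can show - which is why it costs extra despite being downscaled.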
Widescreen resolutions are resolutions with a 16:9 (1.78:1, more common) or 16:10 (1.60:1, slightly less common) aspect ratio. For a list of games, see games with widescreen resolution support.
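To check which aspect-ratio bucket a given resolution falls into, you can reduce width and height by their greatest common divisor. A small sketch (note that 16:10 reduces to 8:5 in lowest terms, which is the same 1.60:1 ratio):

```python
from math import gcd

def aspect_ratio(w, h):
    """Reduce a resolution to its simplest aspect ratio, e.g. 1920x1080 -> (16, 9)."""
    g = gcd(w, h)
    return (w // g, h // g)

print(aspect_ratio(1920, 1080))  # 16:9 widescreen, 1.78:1
print(aspect_ratio(2880, 1800))  # 8:5, i.e. 16:10 widescreen, 1.60:1
print(aspect_ratio(1024, 768))   # classic 4:3, 1.33:1
```

Comparing the reduced ratio of the game's render resolution against the panel's ratio is how you can predict whether a given option will fill the screen, stretch, or get pillarboxed.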