I wanted to share my experience with The Last of Us Part One. It recently got updated with FSR 3 and frame generation, so I decided to test it at both 1080p and 720p to see what difference that makes.
I ran the game on the 30-watt Turbo mode, and right off the bat I noticed some differences. One warning, though: changing the resolution while in-game on the Ally itself can cause it to crash, so be careful with that.
At 1080p with FSR 3 quality mode and frame generation enabled, which renders internally at 720p and upscales back up, and on the 30-watt plugged-in mode, I was getting 50 to 60 frames per second, even in the worst cases with load spikes and lots of greenery and other effects going on outside. Interestingly, the Ally was reporting only 2.5 GB of VRAM, even though I had the VRAM allocation set to the 6 GB default, which is what you want, or at least 5 GB. With these settings, the game thinks it needs around 5 GB of VRAM.
If I dropped the settings down to the 720p low quality preset with no FSR or other upscaling, I was dipping into the 40s quite regularly indoors, even though it is generally a lot better there, going up into the 50s, looking very clean, and with no added input latency since no upscaling or frame generation was running.
If you want to run the game at 720p with no upscaling, you will be able to sit around the 40-to-50 mark, and outside you will occasionally dip into the 30s, but it still runs very well, even with lots of effects going off around the monsters.
When I turned frame generation on and set the resolution to 1080p, it rendered internally at 720p, and with that frame generation I was up into the 60s and even 70s in the same section. While playing, I noticed a very minor amount of input delay, but it was nothing unmanageable and a lot better than in some other games. Although it didn't look quite as stable, it definitely felt extremely smooth, and with the Ally's variable refresh rate screen, having frame generation on really put it in the sweet spot.
Overall, FSR 3 has been implemented very well in The Last of Us Part One, and I am using the ROG Ally's stock drivers and the 6 GB VRAM allocation to get the best performance out of it. You could drop down to a 20-watt or 25-watt profile when you're on battery and push FSR from quality to balanced or performance, but I don't think you need to, especially as most of us are near power or have a power bank nearby.
During load spikes outside it does dip into the 50s, but once the shaders compile and everything settles down, I was seeing a good, solid 60 frames per second, especially moving in and out of buildings.