Not only is it realistic, I've been doing it myself for years. All my sites take at least daily backups to Amazon S3. Early in the morning a CRON job runs a Bash script which uses s3cmd to download them to my NAS for archiving, and also creates a copy of the latest versions in a standard location, with predictable names.
Another computer on my network runs another CRON job about 30 minutes later to fire up UNiTE. Since the NAS is mounted via NFS on that computer, as far as PHP (and UNiTE) knows the latest backups are local files with a predictable location and a predictable name. So it just goes ahead and restores those backups.
The restored sites are used for development and testing. In fact, to prevent repeating past mistakes, I use the extrafiles and extrasql sections of UNiTE to change the graphics and colors of the template, reminding me that I'm on a test site (I don't want to mix it up with the live site!), and to disable things like email and other third-party service integrations on the restored site.
But I digress.
The thing you were missing when considering your use case is how to download just the latest backup from S3. This can't be done with UNiTE itself, but it can be scripted in Bash using s3cmd and awk.
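Here's a minimal sketch of that idea, not my exact script: the bucket path and destination below are hypothetical, so adjust them to your own setup. It relies on the fact that `s3cmd ls` prints one line per object, starting with an ISO-style date, so a plain lexical sort puts the newest upload last.

```shell
#!/bin/bash
# Hypothetical paths — replace with your own bucket and NAS mount point.
BUCKET_PATH="s3://example-bucket/backups/"
DEST="/mnt/nas/backups/latest.jpa"

# "s3cmd ls" prints one line per object: DATE TIME SIZE s3://…
# Sorting those lines puts the newest upload last; awk keeps the URL column.
pick_latest() {
    sort | tail -n 1 | awk '{print $4}'
}

# Only attempt the download when s3cmd is actually installed.
if command -v s3cmd >/dev/null; then
    LATEST=$(s3cmd ls "$BUCKET_PATH" | pick_latest)
    # Save under a predictable name so UNiTE can pick it up later.
    s3cmd get --force "$LATEST" "$DEST"
fi
```

From there, UNiTE just sees a local file at a known path, exactly as in my setup above.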
Nicholas K. Dionysopoulos
Lead Developer and Director
🇬🇷Greek: native 🇬🇧English: excellent 🇫🇷French: basic • 🕐 My time zone is Europe / Athens
Please keep in mind my timezone and cultural differences when reading my replies. Thank you!