#21164 Allowed memory size of 268435456 bytes exhausted

Posted in ‘Site restoration’

Environment Information

PHP version: n/a
CMS Type: Other
CMS Version: n/a
Backup Tool Version: n/a
Kickstart version: n/a

Latest post by carcam on Monday, 06 October 2014 10:39 CDT

carcam
I'm trying to restore a JPS file from one of our backups and I'm having issues with the extraction process. I get this error message:

Invalid AJAX data received:

Warning: gzinflate(): data error in /home/studio/public_html/newbuylifestraw/kickstart.php on line 8834

Fatal error: Allowed memory size of 268435456 bytes exhausted (tried to allocate 1311093794 bytes) in /home/studio/public_html/newbuylifestraw/kickstart.php on line 5235

After some testing on my local machine, I think this error is due to a couple of big files (around 100 MB each) in the backup; they are videos, actually. The maximum allowed memory in the PHP configuration is 256 MB (the 268435456 bytes in the error message), while the extraction tried to allocate about 1.2 GB.

I am able to restore the backup on my localhost, but as these are pretty big sites it's not practical to extract the files locally and upload the result: that would take a lot of time on my internet connection.

Is there a way to fix this situation and prevent these restoration errors?

Best!!

dlb
You can exclude the two problem files from your backup. The two video files are most likely static files; they don't change. Backing them up to your local computer or to a cloud service would be a one-time process. You would then need to restore those two files manually.

This will reduce the size of your backup archive and allow Kickstart to extract it. The disadvantage is that you need to copy those two files to your site after the restoration is complete.


Dale L. Brackin
Support Specialist

🇺🇸 English: native

Please keep in mind my timezone and cultural differences when reading my replies. Thank you!

My time zone is EST (UTC -5) (Philadelphia, PA)

carcam
Hi Dale,

Thank you very much for your reply. I have also considered that approach, but it's a bit more work than I want to take on today :P

Let me rephrase my question: is there any chance that you could make Kickstart work with big files and avoid these memory errors? If not, I think exclusion by file size would be a great feature in Akeeba Backup, as sometimes you do not control what your users upload to the site ;)

dlb
In most cases, automatic file exclusion would be a really bad idea. What happens if that really big file is vital to your site and it gets excluded from your backup because it grew one byte over the limit? Generally data that "grows" is stored in the database, but not always.

Depending on what extension your users are using to upload files, you may be able to set a maximum file size there.

Kickstart is a pretty simple program. All it does is extract the archive. There isn't much that can be done to make it more efficient and enable it to run in a smaller memory footprint.


Dale L. Brackin
Support Specialist

🇺🇸 English: native

Please keep in mind my timezone and cultural differences when reading my replies. Thank you!

My time zone is EST (UTC -5) (Philadelphia, PA)

carcam
Hi Dale,

Thank you very much for your reply.

Obviously the "automatic file exclusion" would need to have a warning message somewhere ;).

Anyway, I guess there is no workaround for this. Thank you very much for your time.

Best!!

nicholas
Akeeba Staff
Manager
File exclusion by size, while possible, is very dangerous. It's easy to implement as a custom stack filter placed in administrator/components/com_akeeba/akeeba/filters/stack. You need an INI and a PHP file: the INI defines the configuration interface and the PHP file holds the implementation. You could clone the dateconditional filter already there, modifying lines 49-62 to check for the size instead of the date. Please note that stack filters make the backup really slow: they are called once for each backed-up file, i.e. about 5,000 times on a small site.
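
For illustration only, here is a minimal sketch of what such a size-based filter could look like. The class name, base class and method names mirror the pattern used by the bundled dateconditional filter and should be treated as assumptions; check the actual file shipped with your version before copying anything:

```php
<?php
// Hypothetical sizeconditional.php, to be placed in
// administrator/components/com_akeeba/akeeba/filters/stack.
// All names below follow the dateconditional filter's pattern and are
// assumptions; verify against the filter in your installation.

defined('AKEEBAENGINE') or die('Restricted access');

class AEFilterStackSizeconditional extends AEAbstractFilter
{
	public function __construct()
	{
		$this->object      = 'file'; // act on files, not directories
		$this->subtype     = 'all';  // consider every file in the backup set
		$this->method      = 'api';  // decide per file in PHP code
		$this->filter_name = 'sizeconditional';

		parent::__construct();
	}

	/**
	 * Return true to exclude a file from the backup set.
	 *
	 * @param string $test The path of the file being considered
	 * @param string $root The root directory the path is relative to
	 *
	 * @return bool
	 */
	protected function is_excluded_by_api($test, $root)
	{
		// Hard-coded 50 MB threshold for the sketch; a real filter would
		// read this from the profile configuration defined in the INI file.
		$threshold = 50 * 1024 * 1024;

		$filename = $root . '/' . $test;

		if (!@is_file($filename))
		{
			return false;
		}

		return @filesize($filename) > $threshold;
	}
}
```

Keep the performance caveat above in mind: this method runs once per file, so the extra filesize() call happens thousands of times on every backup.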

Regarding Kickstart, since you are talking about JPS archives that would be impossible anyway :) JPS contents are totally encrypted. You need to decrypt the file header block to figure out which file it is, then you have to keep on decrypting file blocks to see where the data ends. Since you have a problem with the decryption taking too much memory I guess such a feature would simply drill a spectacularly big hole in the water :D

Alternatively, you can extract the JPS archive locally using Akeeba eXtract Wizard. You can then compress the files as ZIP using Finder (Mac OS X), Windows Explorer (Win XP and later) or Nautilus/Dolphin (Linux). The resulting ZIP file can be extracted by Akeeba Kickstart just fine.

Nicholas K. Dionysopoulos

Lead Developer and Director

🇬🇷 Greek: native 🇬🇧 English: excellent 🇫🇷 French: basic • 🕐 My time zone is Europe / Athens
Please keep in mind my timezone and cultural differences when reading my replies. Thank you!

dlb
Yes, we would put a line in the log file to show that a file was skipped. That is easy; the excluded files are logged now. Finding that excluded file is the tricky part: log files can easily run 20,000 or 30,000 lines, and you'll go blind looking if you don't know what to search for.

But that does give me another idea. In the log, you normally get a step break before a large file is backed up. You can search for "large file" and find all of the files that Akeeba thinks are large (the threshold is set in your Configuration screen). That may be interesting information if you're looking for potential problem files.
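
For example, something along these lines would pull out every such entry; the log file name here is just an example, so use whatever name the log you download from the backend has:

```bash
# List every "large file" step break in an Akeeba Backup log,
# with line numbers; the file name is an example.
grep -in "large file" akeeba.backup.log
```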


Dale L. Brackin
Support Specialist

🇺🇸 English: native

Please keep in mind my timezone and cultural differences when reading my replies. Thank you!

My time zone is EST (UTC -5) (Philadelphia, PA)

carcam

Since you have a problem with the decryption taking too much memory I guess such a feature would simply drill a spectacularly big hole in the water :D

I love your analogies!!! :P

Alternatively, you can extract the JPS archive locally using Akeeba eXtract Wizard. You can then compress the files as ZIP using Finder (Mac OS X), Windows Explorer (Win XP and later) or Nautilus/Dolphin (Linux). The resulting ZIP file can be extracted by Akeeba Kickstart just fine.

That sounds like a good improvement on Dale's first suggestion, so I will probably go for it, although I have to say Akeeba eXtract Wizard is not as reliable as kickstart.php: the eXtract Wizard (tried on Windows and Mac) can rarely decompress my files :(.

I really would love to see Amazon S3 direct restoration in Akeeba UNiTE :P

Anyway, thank you very much for your great help and support.

carcam
You can search for "large file" and find all of the files that Akeeba thinks are large (the threshold is set in your Configuration screen). That may be interesting information if you're looking for potential problem files.

That's an awesome idea, Dale!! I'll look into some way to automate this, as it seems a nice thing to take into account.

Thank you very much!!

nicholas
Akeeba Staff
Manager
Akeeba UNiTE doesn't need an S3 import. Being a CLI tool, you can use it inside a bash script. Have you heard of the s3cmd command-line utility? It's the cat's meow for managing S3 from the CLI! I have my Raspberry Pi fetch the latest backup of our site every night from S3 using s3cmd, then let UNiTE restore it on a test site. It's working really well.
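
As a rough sketch of that nightly job: the bucket name, paths and file names below are made-up examples, and it assumes the usual UNiTE setup where an XML job file describes the restoration and points at a fixed archive path:

```bash
#!/bin/bash
# Nightly test restore, roughly as described above.
# Bucket, paths and file names are examples, not real values.

# Mirror the backup output directory from S3 to the Pi
s3cmd sync s3://example-backup-bucket/site-backups/ /home/pi/backups/

# Find the newest archive that came down (.jpa here; use .jps for encrypted ones)
LATEST=$(ls -t /home/pi/backups/*.jpa 2>/dev/null | head -n 1)
[ -z "$LATEST" ] && exit 1

# The UNiTE job file points at a fixed path, so put the newest archive there,
# then let UNiTE restore it to the test site described in the job file
cp "$LATEST" /home/pi/unite/latest.jpa
php /home/pi/unite/unite.php /home/pi/unite/job.xml
```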

Nicholas K. Dionysopoulos

Lead Developer and Director

🇬🇷 Greek: native 🇬🇧 English: excellent 🇫🇷 French: basic • 🕐 My time zone is Europe / Athens
Please keep in mind my timezone and cultural differences when reading my replies. Thank you!

carcam
I have to admit my complete lack of knowledge of, and interest in learning, S3 stuff. I only use it because I think it's the best storage solution to use with Akeeba Backup, as you can instruct Kickstart to get the files from there.

It's the cat's meow for managing S3 from the CLI! I have my Raspberry Pi fetch the latest backup of our site every night from S3 using s3cmd, then let UNiTE restore it on a test site.


Anyway, you got my full attention by mentioning the Raspberry Pi. I'll do some further research. Thank you very much!!!
