#32594 Cron error

Posted in ‘Akeeba Solo (standalone)’
This is a public ticket

Everybody will be able to see its contents. Do not include usernames, passwords or any other sensitive information.

Environment Information

PHP version
n/a
Akeeba Solo version
n/a

Latest post by CORBISER on Wednesday, 04 March 2020 06:44 CST

CORBISER
Please look at the bottom of this page (under Support Policy Summary) for our support policy summary, containing important information regarding our working hours and our support policy. Thank you!

IMPORTANT! Please remember to ZIP and attach your backup log file. Without it we are unlikely to be able to help you with backup issues. Thank you!

I can run a backup "manually" without any problem, but not with a cron job.
The total size is about 3 GB, and I have configured the backup to be split into 256 MB part files. It always stops at the .j09 part file, and that file is always 84,570,654 bytes in size.

The maximum execution time on the server has been increased to 400 seconds.
I am sending you the log file.

Thanks
Carlos

CORBISER
More info.
I use this "cron job" :
https://www.skaterootsbcn.com/shop/solo/remote.php?view=remote&key=abababab

(The abababab key is not the real one )

Carlos

dlb
Carlos,

Your command for the front-end backup is wrong. That command requires wget or curl at the beginning, and it requires the max-redirects argument for either of them. You can look at the Scheduled Backups option in Akeeba Backup for the exact format of the command for your site.
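For illustration, such a command usually looks like one of the following (a sketch only; the domain and YourSecretKey below are placeholders, and the exact command for your site is shown on the Scheduled Backups page):

    wget --max-redirect=10000 "https://www.example.com/solo/remote.php?view=remote&key=YourSecretKey" -O - >/dev/null 2>&1
    curl -L --max-redirs 1000 "https://www.example.com/solo/remote.php?view=remote&key=YourSecretKey" >/dev/null 2>&1

The high redirect limit matters because the front-end backup script keeps redirecting the client to run each backup step, so a client with the default redirect limit gives up before the backup finishes.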


Dale L. Brackin
Support Specialist


English: native


Please keep in mind my timezone and cultural differences when reading my replies. Thank you!


My time zone is EST (UTC -5) (Philadelphia, PA).

CORBISER
Good morning Dale.
On my server, when I configure the cron job, I cannot enter a command starting with curl or wget.
It must begin with http.

I also tried "webcron.org" and "cronless.com", and there is always an error.

Any ideas?

Thanks
Carlos

CORBISER
Dale, I talked with my hosting provider, and tomorrow we will write the cron job directly through the command line interface instead of the "standard" way in their control panel.
I am not at home today, so we will do it tomorrow.

Please wait for my reply, and thanks
Carlos

CORBISER
Hi Dale, some time ago I set up a cron job on another server, and before the "wget ..." command that you indicate in your docs, we had to write something like "*/..." with the days/hours at which the cron should run.
Is that correct?
If not, how can I know when it will be executed?

Carlos

dlb
Please let me know how you make out with the alternate CRON. If that doesn't work, we can take a look at why Webcron isn't working.


Yes, that strange string controls when the CRON job will be executed. It is usually filled in on a form in your cPanel; only hard-core geeks do it by hand. :-)
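To make that concrete, a crontab entry puts the schedule string and the command on one line (a sketch with placeholder values, not Carlos's real URL or key):

    # m  h  dom mon dow  command
    0 3 * * * wget --max-redirect=10000 "https://www.example.com/solo/remote.php?view=remote&key=YourSecretKey" -O - >/dev/null 2>&1

The five fields mean minute, hour, day of month, month and day of week, so this example would run the backup every day at 03:00 server time.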


Dale L. Brackin
Support Specialist


English: native


Please keep in mind my timezone and cultural differences when reading my replies. Thank you!


My time zone is EST (UTC -5) (Philadelphia, PA).

CORBISER
Dale, it is working now!
With PuTTY and the help of a "cronjob generator" (to write the "strange string" correctly), we saved the cron jobs directly on the Linux server through the command line interface.
We also had the great help of my hosting provider's technical service, who explained to me how to write and edit a cron job in the Linux editor.

I ran the backup "manually" from the command line interface several times with no problem, and it was faster than the "Solo" control panel: about 10 minutes instead of 35-40 minutes.

Now I will wait for tonight's scheduled backup...

I will inform you tomorrow.
Carlos
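For anyone reading along, editing the crontab over SSH as Carlos describes is typically done with the crontab utility (a minimal sketch; the actual schedule and URL go on the line you add inside the editor):

    crontab -e    # opens the current user's crontab in the default editor; add the schedule + command line there
    crontab -l    # lists the saved entries so you can verify the new job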

Support Information

Working hours: We are open Monday to Friday, 9am to 7pm Cyprus time zone (EET / EEST). Support is provided by the same developers writing the software, all of whom live in Europe. You can still file tickets outside of our working hours, but we cannot respond to them until we're back at the office.

Support policy: We would like to kindly inform you that when using our support you have already agreed to the Support Policy which is part of our Terms of Service. Thank you for your understanding and for helping us help you!