You are thinking about this the wrong way.
Your backup archives have a name similar to this: site-www.example.com-20260120-123456eest-random.jpa.
The prefix is always the same: site-www.example.com-.
The date and time are always in the following format: YYYYMMDD-HHmmssT- where YYYY is the four-digit year, MM the two-digit month, DD the two-digit day of the month, HH the two-digit hour in 24-hour format, mm the two-digit minutes, and ss the two-digit seconds. The timezone (T) is of variable length, but it is always followed by a dash.
Finally, the random suffix always has the exact same length: 16 characters.
Armed with that information, you can strip the fixed-length prefix and suffix, which leaves you with a timestamp in the YYYYMMDD-HHmmssT format. This can be easily converted to a date and time.
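For example, assuming GNU date is available, a minimal sketch of that conversion could look like this (the archive name below is just a placeholder; substitute one of your own files and your own prefix):
#!/bin/bash
# Hypothetical archive name; substitute one of your own files.
filename="site-www.example.com-20260120-123456eest-abcdef0123456789.jpa"
# Strip the fixed-length prefix...
stamp="${filename#site-www.example.com-}"
# ...and keep the first 15 characters, i.e. YYYYMMDD-HHmmss.
stamp="${stamp:0:15}"
# Rearrange into a format GNU date understands and print it.
date -d "${stamp:0:4}-${stamp:4:2}-${stamp:6:2} ${stamp:9:2}:${stamp:11:2}:${stamp:13:2}"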
If you are managing the files manually, e.g. over SFTP, you can list the files ordered by filename descending; this will place them in reverse chronological order (newest first, oldest last) thanks to what I explained above.
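If you also have shell access, the same trick works on the command line; a quick example, assuming the archives are in the current directory:
# List backup archives newest-first by reverse lexical sort of the filename.
ls -1 site-www.example.com-* | sort -r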
If you want to automatically manage removal of old, remotely stored backups archive files you can of course use the Remote Quotas setting in Akeeba Backup.
If you prefer to run a shell script, you can adapt something like the following. It lists the files in reverse chronological order and removes everything older than 7 days; note that the www.example.com domain is hardcoded into the script:
#!/bin/bash
# Directory containing the files (change this to your directory)
DIR="/path/to/your/directory"
# Current date and time (for comparison)
current_date=$(date +%s)
# Function to extract the timestamp from the filename and convert to seconds since epoch
get_timestamp() {
    local filename=$1
    # Extract the YYYYMMDD-HHmmssT part (e.g. 20260120-123456eest)
    local timestamp_part=$(echo "$filename" | grep -oP 'site-www\.example\.com-\K[^-]+-[^-]+')
    # Rearrange it into "YYYY-MM-DD HH:mm:ss" and convert to seconds since epoch
    date -d "${timestamp_part:0:4}-${timestamp_part:4:2}-${timestamp_part:6:2} ${timestamp_part:9:2}:${timestamp_part:11:2}:${timestamp_part:13:2}" +%s 2>/dev/null || echo "0"
}
# Find all matching files, sort by filename (newest first), and list them
find "$DIR" -type f -name 'site-www.example.com-*-*.j*' | \
grep -E '^.*-[0-9]{8}-[0-9]{6}[a-zA-Z]{2}-[0-9a-zA-Z.-]{16}\.(jpa|j01|j02|j[0-9]{2})$' | \
sort -r | \
while read file; do
echo "$file"
done
# Find files older than 7 days (based on filename timestamp) and delete them
find "$DIR" -type f -name 'site-www.example.com-*-*.j*' | \
grep -E '^.*-[0-9]{8}-[0-9]{6}[a-zA-Z]{2}-[0-9a-zA-Z.-]{16}\.(jpa|j01|j02|j[0-9]{2})$' | \
while read file; do
timestamp=$(get_timestamp "$file")
# Calculate days difference (7 days = 604800 seconds)
days_diff=$(( (current_date - timestamp) / 86400 ))
if [ "$days_diff" -gt 7 ]; then
echo "Deleting: $file"
rm "$file"
fi
done
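If you want that cleanup to run unattended you could, for example, save the script somewhere on your server, make it executable, and call it from a CRON job; the path and schedule below are only placeholders:
chmod +x /path/to/prune-backups.sh
# Example crontab entry: run the cleanup every day at 03:00 server time.
0 3 * * * /bin/bash /path/to/prune-backups.sh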
Finally, I would like to kindly note that it's easier for me to reply to tickets when you tell me what you want to do, not how you are trying to do it. Reading your ticket, I am unsure whether your problem is that you are not aware of remote quotas, that you did not think of sorting by name descending, that you don't know how to write a shell or PHP script to remove old files, or something else entirely which I cannot divine just by knowing that you're somehow confused about the file naming.
Nicholas K. Dionysopoulos
Lead Developer and Director
🇬🇷Greek: native 🇬🇧English: excellent 🇫🇷French: basic • 🕐 My time zone is Europe / Athens
Please keep in mind my timezone and cultural differences when reading my replies. Thank you!