.sh script not working with crontab but does run standalone fine. Ubuntu 18.04
I'm having trouble getting this script to work with crontab. The script works fine when run standalone. I'm running Ubuntu 18.04.
Here is the script. It's a simple backup script that uploads a compressed .zip file of a folder to Google Cloud Storage.
#!/bin/bash
### Config ###
# Backup Name
server_name='demo'
# Bucket Name
bucket_name='demo'
# Array of directories to backup
backup_dirs=(
'/var/www'
)
# Directory to store the backups while being created
temp_backup_dir='/tmp/backups/'
# Today's date for the folder name
todaysdate=$(date --date="today" +%d-%m-%Y_%T)
# Today's Date Variable
todays_tmp_backup_dir=$temp_backup_dir$todaysdate
### Backup Script ###
# Check if temp directory is writeable
if [ -w "$temp_backup_dir" ]
then
echo 'Found writeable directory and is working correctly: '$temp_backup_dir
else
echo "Unable to write to: "$temp_backup_dir
exit
fi
echo ''
# Create today's backup directory and make it writeable
echo ''
echo 'Making directory: '"$todays_tmp_backup_dir"
mkdir "$todays_tmp_backup_dir"
chmod 0777 "$todays_tmp_backup_dir"
echo ''
# zip the files and put them into the backup temp folder
echo ''
for i in "${backup_dirs[@]}"
do
filename="backup$(echo "$i" | tr '/' '-').zip"
echo "Backing up $i to $todays_tmp_backup_dir/$filename"
zip -r "$todays_tmp_backup_dir/$filename" "$i"
done
# Upload the files to the Google Cloud Storage Bucket
echo ''
echo 'Syncing '$todays_tmp_backup_dir' to '$bucket_name''
gsutil -m rsync -r -C -d "$todays_tmp_backup_dir" gs://$bucket_name/$todaysdate
# Cleanup of local backup
echo ''
echo 'Removing local backup: '$todays_tmp_backup_dir
rm -R "$todays_tmp_backup_dir"
echo ''
# Done
echo 'Completed :)'
echo ''
The crontab entry:
30 * * * * bash /root/googlestorage.sh
Any help or suggestions will be appreciated. Thanks!
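[Editor's note] A frequent cause of "works in a terminal, fails under cron" is cron's minimal PATH: gsutil usually lives outside /usr/bin:/bin. A hedged sketch of a crontab that sets PATH and captures all output (the /usr/local/bin location for gsutil is an assumption; check it with `command -v gsutil`):

```shell
# Crontab sketch: give the job a PATH where gsutil and zip resolve,
# and log stdout+stderr so failures leave a trace.
PATH=/usr/local/bin:/usr/bin:/bin
30 * * * * bash /root/googlestorage.sh >/tmp/backup.log 2>&1
```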
command-line bash scripts cron
Replace the cronjob with 30 * * * * bash /root/googlestorage.sh 2>/tmp/err.log and look into that file after the job has run. Usually it's cron's minimal environment ($PATH, $DISPLAY).
– PerlDuck
Nov 26 at 14:05
That doesn't work with this script; it doesn't output an error file. I have another cron job that syncs to Amazon S3 and works fine with cron; it's just this script I'm having issues with.
– Tyeio
Nov 26 at 14:22
Why wouldn't it redirect its STDERR to a file? I don't see any code in the script that prevents this.
– PerlDuck
Nov 26 at 14:29
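[Editor's note] PerlDuck's point that nothing in the script prevents the redirect can be checked with a throwaway script (/tmp/demo.sh below is a hypothetical stand-in, not the backup script):

```shell
# Write a tiny script that emits both stdout and stderr.
cat > /tmp/demo.sh <<'EOF'
echo "normal output"
echo "an error message" >&2
EOF

# Redirect only stderr, as the suggested cron line does.
bash /tmp/demo.sh 2>/tmp/err.log >/dev/null

# The error line lands in the log file.
cat /tmp/err.log   # prints: an error message
```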
Do the following: 1) Create a cronjob for the backup user: * * * * * env > /tmp/cron-env and let it run once (delete the job after it has run). 2) Open a new terminal (as that user) and type env -i $(cat /tmp/cron-env) /bin/sh. Now you have a shell with crontab's (poor) environment. 3) Now issue bash /root/googlestorage.sh and look at how it behaves and what it complains about.
– PerlDuck
Nov 26 at 14:39
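[Editor's note] The environment gap this procedure hunts for can be made visible with env -i, which starts from an empty environment much like cron does (the three variables below are an assumption about what cron typically provides):

```shell
# Compare a login shell's environment with a cron-like one.
# env -i starts empty; cron typically adds only a handful of variables.
full_count=$(env | wc -l)
cron_count=$(env -i PATH=/usr/bin:/bin HOME=/root SHELL=/bin/sh env | wc -l)
echo "login shell variables: $full_count"
echo "cron-like variables:   $cron_count"   # prints 3
```

Anything the script silently relies on (gcloud credentials paths, a wide PATH) is in the first set but not the second.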
asked Nov 26 at 13:56 on Ask Ubuntu by Tyeio (edited Nov 26 at 14:01)