How do I run a single command at startup using systemd?
I'd like to start up an Apache Spark cluster after boot using the following command:
sudo ./path/to/spark/sbin/start-all.sh
Then run this command when the system prepares to reboot or shut down:
sudo ./path/to/spark/sbin/stop-all.sh
How can I get started? Is there a basic template I can build on?
I've tried an extremely simple service file (/lib/systemd/system/spark.service):
[Unit]
Description=Spark service
[Service]
ExecStart=sudo ./path/to/spark/sbin/start-all.sh
Which doesn't work.
Tags: startup, systemd
Have a look at: wiki.ubuntu.com/SystemdForUpstartUsers
– user680858
May 26 '17 at 9:03
Hi @WillemK, I had already looked at this page. The issue I found is that I can't just replace exec with ExecStart=. Plus, I haven't used Upstart before.
– macourtney7
May 26 '17 at 9:07
The dot before the path of your script looks extremely suspicious.
– Andrea Lazzarotto
May 26 '17 at 9:09
@AndreaLazzarotto I think OP is trying to run the script the way they would in the terminal, hence the . ...
– George Udosen
May 26 '17 at 9:25
Hi @AndreaLazzarotto, this is correct. Apologies for any confusion caused.
– macourtney7
May 26 '17 at 10:28
1 Answer
Your .service file should look like this:
[Unit]
Description=Spark service
[Service]
ExecStart=/path/to/spark/sbin/start-all.sh
[Install]
WantedBy=multi-user.target
Now do a few more steps to enable and use the .service file:
Place it in the /etc/systemd/system directory with, say, the name myfirst.service
Make your script executable with:
chmod u+x /path/to/spark/sbin/start-all.sh
Start it:
sudo systemctl start myfirst
Enable it to run at boot:
sudo systemctl enable myfirst
Stop it:
sudo systemctl stop myfirst
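A minimal end-to-end sketch of those steps (assuming the unit was saved as /etc/systemd/system/myfirst.service; the daemon-reload step makes systemd pick up a newly created or edited unit file):
# copy the unit file into place (target name assumed from the steps above)
sudo cp spark.service /etc/systemd/system/myfirst.service
# tell systemd to re-read unit files after creating or editing one
sudo systemctl daemon-reload
# make the start script executable
sudo chmod u+x /path/to/spark/sbin/start-all.sh
# start it now and enable it at boot
sudo systemctl start myfirst
sudo systemctl enable myfirst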
Notes:
You don't need to launch Spark with sudo in your service, as the default service user is already root.
Look at the links below for more systemd options.
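If the unit refuses to start, systemd itself usually says why. A quick check, using the myfirst unit name from the steps above:
# current state plus the most recent log lines for the unit
systemctl status myfirst
# the unit's full log since the last boot
journalctl -u myfirst -b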
UPDATE: What we have above is rudimentary; here is a complete setup for Spark:
[Unit]
Description=Apache Spark Master and Slave Servers
After=network.target
After=systemd-user-sessions.service
After=network-online.target
[Service]
User=spark
Type=forking
ExecStart=/opt/spark-1.6.1-bin-hadoop2.6/sbin/start-all.sh
ExecStop=/opt/spark-1.6.1-bin-hadoop2.6/sbin/stop-all.sh
TimeoutSec=30
Restart=on-failure
RestartSec=30
StartLimitInterval=350
StartLimitBurst=10
[Install]
WantedBy=multi-user.target
To set up the service:
sudo systemctl start spark.service
sudo systemctl stop spark.service
sudo systemctl enable spark.service
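Note that this unit assumes a dedicated spark user exists (User=spark) and that Spark is installed under /opt/spark-1.6.1-bin-hadoop2.6; adjust both to match your installation. A hedged sketch of creating such a system user and verifying that the forking service came up:
# create a system user for the service (name chosen to match User=spark)
sudo useradd -r -M -s /usr/sbin/nologin spark
# re-read unit files, then enable and start the service
sudo systemctl daemon-reload
sudo systemctl enable spark.service
sudo systemctl start spark.service
# confirm the master and worker daemons forked successfully
systemctl status spark.service
journalctl -u spark.service -b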
Further reading
Please read through the following links. Spark is a complex setup, so you should understand how it integrates with Ubuntu's init system (systemd).
https://datasciencenovice.wordpress.com/2016/11/30/spark-stand-alone-cluster-as-a-systemd-service-ubuntu-16-04centos-7/
https://www.digitalocean.com/community/tutorials/understanding-systemd-units-and-unit-files
https://www.freedesktop.org/software/systemd/man/systemd.unit.html
You don't need the bash -c either
– muru
May 26 '17 at 9:13
Noted and updated
– George Udosen
May 26 '17 at 9:16
Thanks for this, I've created a file based on what you suggested. Upon running sudo systemctl start spark I receive the following error: Failed to start spark.service: Unit spark.service is not loaded properly: Invalid argument. See system logs and 'systemctl status spark.service' for details.
– macourtney7
May 26 '17 at 10:24
The main part of systemctl status spark.service is as follows: Executable path is not absolute and spark.service: Service lacks both ExecStart= and ExecStop= setting. Refusing.
– macourtney7
May 26 '17 at 10:27
The issues are: 1) the Spark binary path is needed (it should replace what we have in the service file), 2) Spark has a shutdown command, what is it? 3) Did you go through the links I gave you? I don't use Spark, so please supply them.
– George Udosen
May 26 '17 at 11:05