I'm in a situation where I have to deploy an Ubuntu server for a client (they do not have a sysadmin), and it will serve as an HTTP API backend for an app.

The catch is that after the deployment they will no longer allow me remote SSH access to it, so if something breaks they will ask me to come on-site to fix it (a scenario I'd prefer to avoid).

Let's assume the app itself is secure; I'm only worried about vulnerabilities in the operating system and the server application.

I've been managing quite a lot of Ubuntu servers over the last three years, and I've only had a problem with automatic updates once.

Is it better to leave the server as-is, or to create a cron job that updates everything weekly? Something like:

    0 0 * * 6 (apt-get update && apt-get dist-upgrade -y && apt-get autoremove -y && reboot) > /tmp/autoupdate.log 2>&1
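
A more defensive variant of the same idea (an untested sketch; the script path is my own choice) would also capture stderr, keep apt non-interactive, and only reboot when an updated package actually asks for it:

    #!/bin/sh
    # /usr/local/sbin/weekly-upgrade.sh -- helper invoked from cron.
    # Keep apt from prompting; on config-file conflicts, keep the existing file.
    export DEBIAN_FRONTEND=noninteractive

    apt-get update &&
    apt-get -o Dpkg::Options::="--force-confold" dist-upgrade -y &&
    apt-get autoremove -y

    # Ubuntu creates this flag file when an update (e.g. a new kernel)
    # needs a reboot; otherwise stay up.
    [ -f /var/run/reboot-required ] && reboot

with the crontab entry reduced to:

    0 0 * * 6 /usr/local/sbin/weekly-upgrade.sh > /var/log/autoupdate.log 2>&1
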
Sevron
  • Can it be reached from the evil internet, or is it _only_ a local backend? Even for a local-only backend, security updates might be a must, depending on what's being used, etc. Whatever it is, I'd talk to them about the benefits/drawbacks of either solution. And _then_, whatever the outcome, never use a cron job for updates if there are built-in services like [unattended-upgrades](https://help.ubuntu.com/community/AutomaticSecurityUpdates#Using_the_.22unattended-upgrades.22_package). – Lenniey Aug 30 '17 at 09:58
  • @Lenniey Yes, it is exposed to the internet. I will look into the unattended-upgrades package, which looks like a good solution, but what is the reason behind "never use cron for updates"? (Using cron for updates is actually mentioned on the same page you linked.) – Sevron Aug 30 '17 at 12:24
  • _if there are included services like unattended-upgrades._ If you don't have any other means, then you can / should / must / whatever use cron. – Lenniey Aug 30 '17 at 12:25
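
For reference, the unattended-upgrades route suggested above typically comes down to configuration along these lines (a minimal sketch; the exact file contents vary slightly between Ubuntu releases):

    # Install the package and enable the periodic runs:
    apt-get install unattended-upgrades
    dpkg-reconfigure -plow unattended-upgrades

    # The reconfigure step writes /etc/apt/apt.conf.d/20auto-upgrades:
    APT::Periodic::Update-Package-Lists "1";
    APT::Periodic::Unattended-Upgrade "1";

    # Optionally, in /etc/apt/apt.conf.d/50unattended-upgrades, let the
    # machine reboot on its own when an update requires it:
    Unattended-Upgrade::Automatic-Reboot "true";
    Unattended-Upgrade::Automatic-Reboot-Time "02:00";

By default it applies only the security pocket, which matches the concern described in the question.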
