For obvious reasons we teach users: "Always update... updates are good... never forget to update, et cetera!"
Setting aside the danger of malicious advertisements that exploit this by telling users they need to "update", I want this question to focus specifically on automatic update features in software.
A lot of applications have introduced automatic update features, where users basically don't have to do anything anymore to get their software renewed, patched, or stuffed with (sometimes undesirable) new features. As a security professional, my first feeling about this is "Great, this makes the world a bit safer", especially because it removes the human interaction that so often causes failure: laziness, forgetting to update, forgetting to even check for updates, or whatever other "human" reason leads to a lack of updating. Known vulnerabilities simply start getting patched automatically.
On the other hand, people receive tons of different mobile app updates, operating system updates, and desktop application updates. Doesn't all this updating increase security risk by possibly introducing new vulnerabilities at a high tempo and on a massive scale?
So on one side you have the extreme of no auto-updaters at all; on the other, a scenario where almost everything gets automatically updated. I consider the downside of auto-updaters to be that they can rapidly introduce new vulnerabilities.
It would be interesting to see some kind of statistical analysis comparing the risk of both scenarios.