10

I have a small web application and I just realized that some of my JavaScript libraries are outdated.

An example:

<script src="https://cdnjs.cloudflare.com/ajax/libs/crypto-js/3.1.2/components/core-min.js"></script>
<script src="https://cdnjs.cloudflare.com/ajax/libs/crypto-js/3.1.2/components/enc-utf16.js"></script>
<script src="https://cdnjs.cloudflare.com/ajax/libs/crypto-js/3.1.2/components/enc-base64.js"></script>

I found a new version:

https://cdnjs.cloudflare.com/ajax/libs/crypto-js/3.1.9-1/core.min.js

Can an outdated JavaScript library present a security risk for my app? And one more thing: what is the best way of keeping libraries up to date?

S.L. Barth
user134969

4 Answers

14

Using any out-of-date software is inherently a security risk; any flaw patched between the version you're running and the latest version can be exploited. For example, maybe there's a specific payload you can feed it to cause an infinite loop, potentially causing a denial of service (DoS), or maybe there's a bug that can somehow break the JavaScript sandbox and wreak havoc on the system.

I don't see any critical security issues mentioned on the Internet, but that doesn't mean they don't exist or haven't been patched. Using the latest version is always recommended when possible, but that also means testing your code against the new version to make sure nothing breaks functionally when you update (major releases, for example, are often incompatible with prior major releases).

The best way to keep your libraries up to date is to install them through a package manager that provides one-command upgrades. For example, if you copy the libraries to your local server using npm install, you can then serve them directly (Cloudflare will still provide protection for your site if you use it as a proxy in front of your own server). From there, updating your libraries is as easy as running npm update from the command line.
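
For illustration, a minimal npm-based workflow might look like the following (crypto-js is the library from the question; the commands are standard npm, nothing project-specific):

npm install crypto-js --save   # record the library as a dependency in package.json
npm outdated                   # list dependencies that have newer versions available
npm update                     # upgrade within the version ranges declared in package.json
npm audit                      # report known published vulnerabilities (newer npm versions)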

It's strongly recommended that you use a TDD (Test-Driven Development) suite, such as Mocha, to make sure that your scripts do not break when libraries are upgraded, as well as to generally reduce the chance you'll break your code during development. Also, remember to use version control (like Git or Subversion) so that you can roll back changes if the npm update breaks something.
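
To make the "don't break on upgrade" point concrete, here is a minimal sketch of a Mocha regression test, assuming crypto-js and Mocha have been installed locally through npm (the expected value is simply the Base64 encoding of "hello"):

// test/crypto.test.js -- run with ./node_modules/.bin/mocha
const assert = require('assert');
const CryptoJS = require('crypto-js');

describe('crypto-js behaviour after an upgrade', function () {
  it('still Base64-encodes UTF-8 input the same way', function () {
    const words = CryptoJS.enc.Utf8.parse('hello');
    assert.strictEqual(CryptoJS.enc.Base64.stringify(words), 'aGVsbG8=');
  });
});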

phyrfox
  • Just nitpicking on a side note, but Mocha is not a TDD suite; it's simply a test suite / framework. It can be used with or without TDD; they're completely independent. – Seb D. Mar 28 '17 at 09:53
  • Breaking the sandbox environment is not a security vulnerability of the JavaScript library. It's a **browser** vulnerability. If a JS library can be used to break the sandbox, I can infect as many machines as I like by willingly using a flawed JS script in a viral page. – usr-local-ΕΨΗΕΛΩΝ Mar 28 '17 at 10:15
2

That depends on the (known) vulnerabilities in the outdated JavaScript library. Newer versions often fix (minor) security issues as well.

A great tool for checking your libraries against a list of known vulnerabilities is Retire.js.
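
As a sketch (the install command is standard npm; check the Retire.js documentation for the exact scanner options in your version):

npm install -g retire    # install the Retire.js command-line scanner
cd path/to/your/webapp   # hypothetical project directory
retire                   # scan the directory for known-vulnerable JavaScript libraries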

Also note that the use of a content delivery network (CDN) like CloudFlare might be a minor risk in itself, because in theory the CDN can change the contents of the JavaScript that is included in your page. To mitigate this type of risk I would recommend checking out Subresource Integrity (SRI).
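
For instance, an SRI-protected script tag for the file mentioned in the question could look like the sketch below. The integrity value is a placeholder: generate the real digest from the exact file you reference (for example with openssl dgst -sha384 -binary core.min.js | openssl base64 -A).

<script src="https://cdnjs.cloudflare.com/ajax/libs/crypto-js/3.1.9-1/core.min.js"
        integrity="sha384-PLACEHOLDER_BASE64_DIGEST_OF_THE_FILE"
        crossorigin="anonymous"></script>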

Bob Ortiz
2

I say no, and here's why. Libraries just run code that the browser lets them run. They should never be able to do bad things. If a lib reveals a security problem, you're either not using it correctly (e.g. user input handling in a template), or it revealed a browser vulnerability. At any rate, the sandbox and policies are supposed to protect against malicious scripts.

The only exception I can think of would be client-side encryption, which should be maintained and not be undertaken lightly. Library authors are not security experts, which is why it's so important for the browser to be the police. While off to a shaky start, the last few years have seen browsers do a great job of that, especially now that plugins like PDF and Flash are diminished.

One last thought: I'm referring exclusively to client-side, browser-based JavaScript libs; if you have outdated libs for a server runtime like Node.js, then YES, bad things can happen!

EDIT: this includes external servers besides your own; CDNs, partner sites, ads, etc. If a web exploit occurred without a server, it wouldn't really be an exploit. There are occasional underpinning issues with the browser/plugins/etc., but the code that runs on top of them bears no responsibility to correct those: vendors do.

dandavis
  • You are forgetting about script injections. There are many situations where developers rely on libraries to properly prevent XSS injections while they do their thing. When this doesn't work as reliably as the developer expects, due to an oversight on the part of the library developers, you can have information leakage and even session-stealing exploits wherever users view input generated by other users. – Philipp Mar 28 '17 at 13:30
  • I didn't forget: "_e.g. user input handling in a template_". XSS is not a client-side problem, it's a server issue. Clients don't get others' input from thin air, and you cannot trust a client to be able to fix a server mistake. – dandavis Mar 28 '17 at 14:04
  • Thinking that client-side anything causes XSS is like thinking "d-day" _caused_ WW2; it's just the battlefield. – dandavis Mar 28 '17 at 14:11
  • @dandavis it's not a problem for the server, right. But it's still a problem for the final user. Let's say I'm able to exploit your library because it's loading something else from an unsafe (http) resource. I may be able to inject any JS I like and steal your users' credentials. That doesn't affect the server in any way, but your users are still going to hold you accountable for it. – BgrWorker Mar 28 '17 at 15:12
  • @BgrWorker: it IS a server problem, yours or otherwise; external baddies _must_ come from a server. No bad can come from a lib unto itself. Yes, you should be held accountable for your implementation. If an update makes you safer, you were doing something wrong in the first place. My main point is: don't fall into the same mindset that we do with binaries where patching _is_ the defense. – dandavis Mar 28 '17 at 16:29
1

It depends.

If the new version was released as a patch for a security issue found in the old library, then continuing to use the old version is a risk.

But JavaScript is a client-side language, so it puts your server at risk only if there is a flaw in the Ajax request functions.

A JS library loaded from an http source may cause other issues, because it can be tampered with in transit; loading the same library over https avoids that particular problem.

Another thing to note: as a web developer, read the migration notes carefully before updating, because if a function you use in your page has been deprecated or removed, the page won't work properly.
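
As a small, hypothetical illustration of guarding against a component that may have been removed or renamed after an upgrade (Utf16 is one of the optional crypto-js components loaded in the question):

// Guard before calling an optional component that a newer build may not include.
if (window.CryptoJS && CryptoJS.enc && CryptoJS.enc.Utf16) {
  var words = CryptoJS.enc.Utf16.parse('hello');
  console.log(CryptoJS.enc.Base64.stringify(words));
} else {
  console.warn('CryptoJS.enc.Utf16 is not available in this build');
}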

i--
  • I'd also recommend hosting the updated JS files locally. Even if it is unlikely, if someday the CDN is down, at least your website will keep on working. – niilzon Mar 28 '17 at 07:51
  • Strongly disagree. The main advantage of a CDN is caching. CDNs don't have downtime, because they use multiple servers at multiple locations. If a JS library is cached from one site, the same file is used for all the sites that use the same CDN. For improved performance @niilzon – i-- Mar 28 '17 at 07:54
  • Would you mind telling me why the downvote? – i-- Mar 28 '17 at 07:55
  • @SagarV _"CDNs don't have downtime."_ – it's funny hearing that just a month after the big Amazon AWS outage. – user11153 Mar 28 '17 at 09:14
  • A client-side language it may be, but it can still have security implications. Some people use JS libraries for crypto purposes (see: LastPass, etc.); an incorrect crypto implementation is a major issue. Often, JS libs are used to render user-entered data on other users' machines; this can lead to XSS attacks if that data is not escaped properly. All of these can be due to a bug in a JS library. – Bob Mar 28 '17 at 11:23
  • @SagarV I did not downvote you; however, as user11153 points out, you are wrong about the 100% certainty of 24/7 availability of CDNs. Also, client-side caching would be of no help for a user who needs the file for the first time at the moment the CDN is down. In that case your website would be "broken". – niilzon Mar 29 '17 at 07:07