Not the best answer, but I'd say developers need to learn enough to make good decisions with respect to the risk to their project.
I have seen that any project needs someone who knows enough about the entire system to be able to:
- Diagnose and fix cases where parts won't connect - this need increases with the number of diverse systems you try to connect together.
- Test to make sure that cryptography is being implemented as expected.
- Make the judgement calls on "how good" the crypto functions need to be and where they should be placed.
It's a good practice to have one person who really, really knows this stuff and can set up a pattern for everyone else. In most web development projects I've worked on, we've had a pattern for how to initiate secure sessions and which cases require the pattern, and a second set of patterns relating to secure authentication. This gets written as a reusable toolset, so it can be used by everyone else, working at the most abstract level.
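To illustrate what that kind of reusable toolset looks like, here's a minimal Python sketch of a shared authentication helper (the module layout, function names, and iteration count are my own invention for illustration, not from any particular project). The one crypto-literate engineer makes the algorithm and parameter decisions once; everyone else just calls the two functions and never touches the primitives:

```python
import hashlib
import hmac
import os

# Decisions the "one person who knows this stuff" makes once, centrally.
# The iteration count is an assumption here - tune it to current guidance.
PBKDF2_ITERATIONS = 200_000
SALT_BYTES = 16

def hash_password(password: str) -> bytes:
    """Return salt + derived key as one blob; store this per user."""
    salt = os.urandom(SALT_BYTES)
    key = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, PBKDF2_ITERATIONS)
    return salt + key

def verify_password(password: str, stored: bytes) -> bool:
    """Re-derive from the stored salt and compare in constant time."""
    salt, key = stored[:SALT_BYTES], stored[SALT_BYTES:]
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, PBKDF2_ITERATIONS)
    return hmac.compare_digest(candidate, key)
```

The point of the wrapper isn't the specific algorithm; it's that salt handling, iteration counts, and constant-time comparison are decided in one place instead of re-decided (and botched) per developer.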
How the technology base is chosen is as much a senior-level architectural decision as anything else, and it's an area where, if you have a big risk, you need to hire people with the expertise to know what to vet. One of the common problems I see is that the average software engineer will focus on a particular type of concern (often the strength of the algorithm) and miss the larger picture: the system is only as secure as its weakest link, so good practices that are totally separate from cryptography (memory handling, input verification, appropriate measures for credential distribution) are as important - if not more so - as the strength of the crypto apparatus.
In a low-cost, low-risk environment, I favor the idea of slavish devotion to doing security exactly the way the platform provider says. It reduces the complexity of integration and it usually isn't insane... so long as you follow the guidance exactly. So it's taking the "never roll your own" maxim to the nth degree.
As risk and security requirements increase, so does the need for someone who can evaluate tradeoffs and try other options. In just about every platform I've encountered, there are severe limitations to what the default crypto libraries can do. Breaking out of the standard guidance can vary from pretty simple and low-risk to impossible, highly risky to security, or simply expensive to integrate. The problems in this area can make or break a product in two main ways: either integrating an uncommon scenario drives product costs into the stratosphere, or you introduce a security risk you didn't see coming that hits the product way down the line and causes business catastrophe post-release... Both situations can be mitigated by hiring people with security control integration experience, but those folks cost money, so the costs will go up as you mitigate the risks.
Having worked in a high-end security development shop, I can say the best practice I've seen is a lot of mentoring in this area: developers who have shipped previous products mentor newer engineers through vetting the architectural choices and figuring out the level of abstraction required in each case.
There's no right answer on some of the questions above... for example:
Should they build upon the cryptography libraries present in the programming language?
Sometimes. .NET and Java both have perfectly acceptable implementations of SSL, for example, but they both get difficult if you change authentication mechanisms or try to integrate with specialized devices that have very limited options for algorithm choices. In many cases, the default implementation of standard crypto protocols limits your ability to make configuration choices or to bundle more complex add-ons without cracking the cover.
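The same dynamic shows up in Python's standard library, which makes for a compact illustration of "defaults are fine until you deviate" (Python here rather than Java/.NET purely for brevity):

```python
import ssl

# Taking the platform's defaults: sane, vetted settings with no
# decisions for the application developer to make.
ctx = ssl.create_default_context()
assert ctx.check_hostname           # hostname verification on by default
assert ctx.verify_mode == ssl.CERT_REQUIRED  # cert validation enforced

# The moment requirements deviate - say, a policy mandates TLS 1.3 -
# you start turning knobs, and some platforms expose far fewer of
# these knobs than Python's ssl module does.
ctx.minimum_version = ssl.TLSVersion.TLSv1_3
```

When the knob you need doesn't exist in the default stack at all (a nonstandard authentication handshake, a device-mandated cipher), that's when you're forced out of the defaults entirely, which is where the cost and risk live.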
Should they use third party cryptography libraries?
Sometimes - I do this especially when I need a more complicated option than the programming language provides natively. The challenge here is that in many high-end environments, some third-party libraries are not sufficiently vetted for customer approval, or you may be at the mercy of an evolving product that is unstable for what you need. That said, over time, developers experienced in this area build up a collection of favorite APIs in a given framework.
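One way teams hedge against that "evolving, unstable product" risk is to hide the third-party library behind a thin internal interface, so swapping it out later is a one-line change rather than a codebase-wide hunt. A minimal Python sketch of the idea, with a stdlib stand-in where the third-party backend would go (the interface and class names are hypothetical):

```python
import hashlib
from typing import Protocol

class Digest(Protocol):
    """Internal interface: application code depends on this, not on
    whichever third-party library currently implements it."""
    def digest(self, data: bytes) -> bytes: ...

class StdlibSha256:
    """Stand-in backend using the standard library; in practice this
    slot might hold a third-party or vendor implementation instead."""
    def digest(self, data: bytes) -> bytes:
        return hashlib.sha256(data).digest()

# The only place that names a concrete backend. If the third-party
# library churns or fails vetting, this line changes and nothing else.
backend: Digest = StdlibSha256()
```

This doesn't make an unvetted library safe, but it limits the blast radius when you have to replace one.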
Or should they take things one step further and use a library that takes care of everything, such as uLogin for PHP, for example?
If it works, sure. I'll admit that, in my experience, I have had to crack the cover too many times - to provide specialized authentication methods or to integrate with not-regularly-supported equipment - so I rarely think of a total package like this as a viable option. But I'm willing to bet that my experience is the extreme case.
Whenever you delegate this many choices, you have to hope that the implementor made the same choices you'd make. It's worth investigating hacks in this area to see what the vulnerabilities are.
If a third party library is to be used, how can a non-cryptographer verify the security of said library, besides going by its reputation?
This is the one easy one - there are a few standards out there:
FIPS 140 - certifies that a crypto library or device meets specifications for a certain degree of protection. The lowest level includes software-only implementations.
Common Criteria - covers a wide range of devices, OSes and other products - it's a test that the product does what the documentation says it does.
Those are my two most common sources, but I'd bet there are others. I also check length of time in the industry and the support model for upgrades. When I look for a crypto product, I look for a certain longevity, a wide range of interoperability testing, and a clear way of distributing patches and responding to detected vulnerabilities. For example, seeing a long lag between US-CERT vulnerability releases and product upgrades would be a bad sign. But not seeing any US-CERT vulnerabilities is not a good sign either - it can mean the product's market share is so low that no one is even trying to break it.
If you ask whether any of these standards are perfect - I'd say no. But they are a good sign that the product has been out long enough for the vendor to have invested in some of these certifications, which means they've put some serious energy into making sure the security characteristics are up to snuff.