In software, code auditing can be used as a means to gain trust in a piece of software. Closed, proprietary software complicates this, of course, and forces you into reverse engineering, but at least for software there seems to be a way to reduce distrust to some extent.
Recent events have once again shown that hardware components in computers have become a powerful attack vector. DMA attacks in particular are problematic and can be mounted from several hardware components: hard disks and SSDs, PCI devices, and USB devices if they manage to compromise the USB host controller. Of course the CPUs themselves and the auxiliary microcontrollers (e.g. Intel vPro/AMT/ME) can also be backdoored at the IC design level.
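As a side note on the DMA point, here is a minimal sketch of how one might check, on a Linux host, whether an IOMMU is active and remapping DMA, which limits what a malicious peripheral can reach. It only assumes the standard sysfs path /sys/kernel/iommu_groups; it is an illustrative check, not a hardware audit.

```python
#!/usr/bin/env python3
"""Rough check for an active IOMMU (DMA remapping) on a Linux host.

When an IOMMU is enabled, the kernel exposes one directory per IOMMU
group under /sys/kernel/iommu_groups; no directories usually means
peripherals can DMA into arbitrary physical memory.
"""
import os

IOMMU_GROUPS = "/sys/kernel/iommu_groups"

def iommu_groups():
    # Each numeric subdirectory corresponds to one group of devices
    # that the IOMMU isolates together.
    if not os.path.isdir(IOMMU_GROUPS):
        return []
    return sorted(os.listdir(IOMMU_GROUPS), key=int)

if __name__ == "__main__":
    groups = iommu_groups()
    if groups:
        print(f"IOMMU appears active: {len(groups)} groups found")
    else:
        print("No IOMMU groups found: DMA from peripherals may be unrestricted")
```

This obviously only tells you what the (possibly compromised) platform reports about itself, which is exactly the limitation my question is about.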
Given the technology used in manufacturing ICs, it seems doubtful that one can "look into the hardware" in a way that would correspond to code auditing, or at least to reverse engineering, of software.
Are there established best practices for auditing hardware, at least to some extent, against malicious backdoors?
The only mitigation I have found so far is to connect components across air gaps or to configure them with hardware jumpers, thereby providing some sort of isolation. Obviously this is the exact opposite of what an integrated circuit or a SoC (System on Chip) is designed for.