
I had a server go offline yesterday complaining of a disk failure. I re-seated the disks in the set and all of the lights went green. I rebooted the machine, only to find that it now won't boot into ESXi.

It was complaining about a foreign configuration, and one of the disks was in status "rebuild". Following advice from a Dell forum, I cleared the foreign config and rebooted. Now all of the disks are in status "ready", but the machine still won't boot (VD 0 not found).
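(For anyone finding this later: on PERC controllers managed with MegaCli, the foreign-config operations look roughly like the sketch below. The adapter index -a0 is an assumption for a single-controller box, and on newer controllers the utility is perccli with the same concepts.)

    MegaCli -CfgForeign -Scan -a0      # list foreign configs found on the disks
    MegaCli -CfgForeign -Import -a0    # import: keeps the on-disk RAID metadata
    MegaCli -CfgForeign -Clear -a0     # clear: discards that metadata (what I did)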

I don't believe I did anything to lose data? I was under the impression that the RAID setup was there to help survive a disk failure.

Any tips or suggestions on how I can get the machine to boot into ESXi again would be appreciated. I would also be happy with a way to get a VM off the machine so that I could move it to another machine.

Thanks.

stormdrain
  • Clearing the foreign config may have wiped the RAID configuration, which is stored on the disks. You could try to boot with only a subset of the disks, e.g. only 2 disks. – Thomas Jan 21 '17 at 14:04
  • Thanks. How would the config come back though? Would it just work with the proper number of functioning disks? – stormdrain Jan 21 '17 at 15:42

1 Answer


What I ended up doing was re-creating the configuration by re-creating the VD with the seated drives. The setup was exactly the same: the same drives in the same slots, striped the same way as RAID 10. Importantly, I did not initialize the VD, as that would have erased the disks.
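For the record, re-creating a RAID 10 VD from the command line with MegaCli would look roughly like the sketch below. The enclosure:slot IDs 32:0 through 32:3, the WB/RA/Direct cache flags, and the adapter index -a0 are placeholders, not my actual layout. Make sure no initialization (not even a fast init) runs as part of the create, since that can overwrite the start of the disks where the VMFS metadata lives.

    # RAID 10 as a span of two mirrored pairs; re-creating the VD rewrites
    # the controller metadata but leaves the data blocks alone, as long as
    # no initialization is performed afterwards
    MegaCli -CfgSpanAdd -r10 -Array0[32:0,32:1] -Array1[32:2,32:3] WB RA Direct -a0
    MegaCli -LDInfo -Lall -a0    # confirm the new VD shows up (state: Optimal)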

Once I rebooted, the machine booted into ESXi, and once the array had finished its consistency check, I re-added the datastore and the VMs were available again.
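If the datastore does not reappear on its own after boot, ESXi can list and mount it from the host shell. A sketch, assuming SSH access to the host; -M mounts the volume persistently while keeping its existing signature, and <uuid-or-label> is a placeholder for the value -l reports:

    esxcfg-volume -l                   # list VMFS volumes detected but not mounted
    esxcfg-volume -M <uuid-or-label>   # mount persistently, keeping the signature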

stormdrain