How to recover a failed hard drive in SLES 11

0

I am attempting to recover important data from the primary HDD of a machine running SLES 11. When the error occurred, I forced a reboot, and the terminal login then showed "(repair filesystem)".

I tried to follow the command suggested there, which asked me to run an ext3 filesystem check, but it did not work.

I then reinstalled SLES 11 on another HDD and pulled out the failed one. Now I have attached the failed HDD as a secondary drive, but it is not listed in the output of df -h, which only shows /dev/sda1 (the new HDD).

I also ran cat /proc/partitions, which shows:

8        0 1953514584 sda
8        1 1951486818 sda1
8        2    2024188 sda2
8       16 1953514584 sdb
8       17 1951486818 sdb1
8       18    2024188 sdb2

sdb is the failed HDD. How can I make it readable so that I can copy the important data from it, and what are the appropriate steps to avoid damaging that data while I attempt to recover it?

akauts

Posted 2014-04-29T04:00:46.577

Reputation: 1

df only shows mounted filesystems. So, mount the appropriate slices (/dev/sdb{1,2}) somewhere, or scan for LVM volumes in case you have used LVM. – Sami Laine – 2014-04-29T14:47:07.297
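As a hedged sketch of that suggestion: before mounting anything, you can check what actually lives on the failed disk's partitions with blkid (standard on SLES 11); the vgscan/vgchange step applies only if the output reports LVM2_member rather than a plain ext3 filesystem.

blkid /dev/sdb1 /dev/sdb2        # report the filesystem type and UUID of each partition
vgscan                           # only if blkid shows LVM2_member: look for volume groups
vgchange -ay                     # activate any volume groups found so their LVs can be mounted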

Answers

0

df is not the right tool to look at unmounted file systems.

Run fsck /dev/sdb1 to check for recoverable fs errors.
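A minimal sketch of that step, assuming the partition holds the ext3 filesystem described in the question; the -n run is read-only and reports problems without changing anything, which is a safer first pass on an already failing disk.

fsck -n /dev/sdb1    # dry run: report errors, write nothing to the disk
fsck /dev/sdb1       # actual check/repair; answer the prompts to let fsck fix errors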

Then, mount the recovered file systems through the command line (mount) or YaST.
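A sketch of the command-line route, using /mnt/sdb1 and /backup as hypothetical paths; mounting read-only (-o ro) keeps the copy operation from writing anything to the failed drive.

mkdir -p /mnt/sdb1                       # hypothetical mount point
mount -o ro /dev/sdb1 /mnt/sdb1          # read-only mount of the failed partition
cp -a /mnt/sdb1/home /backup/            # example: copy the important data to another disk
umount /mnt/sdb1                         # detach once the copy is done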

user4004936

Posted 2014-04-29T04:00:46.577

Reputation: 131