My build:
-Asus Z270i
-2 Samsung NVMe 960 EVO 1TB M.2 drives
-one OCZ Vector 180 SSD 280GB
The Backstory
I originally had two Samsung M.2 SSDs, but after trying to install Solus for a second time on one of them, I received a plethora of errors indicating a drive issue, specifically an inability to write to the drive. Because this was a relatively new device, I brushed it off as a mere bad unit. Fast forward 8 months: I have been using Windows 10 on the other Samsung drive, since I don't want to risk damaging or breaking another near-$800 slab of memory chips, so a friend donated an OCZ Vector SSD to me for experimental installs. It worked!... So I did my usual nooby coding and snooping to figure out and familiarize myself with Linux, which soon led to the destruction of that first install. "No worries", I thought, "I can just reinstall". And so I did. 10 times, each install eventually breaking the OS for reasons that are irrelevant here.
The Problem Begins
Come the 11th install, I give up on installing drivers, so I try installing Arch instead. I get this error:
ERROR: unable to install packages to new root
This error appeared in both arch-anywhere and Anarchy, and it only occurred on the OCZ drive, but a similar output appeared on the Samsung drive that broke. Both failures seem to involve the drives somehow locking write access, and what seems to confirm this is that the ISOs work fine on the working Samsung drive I mentioned has Windows, but not on the others. Mind you, neither GParted nor Windows Disk Management has had much luck making any changes to the drives, although both can read them just fine.
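For context, the checks I'm prepared to run next from a Linux live USB look roughly like this (device names such as /dev/sda are just placeholders for my drives, so treat this as a sketch of my plan rather than anything I've confirmed yet):

# Check whether the kernel has flagged the device read-only (1 = read-only)
sudo blockdev --getro /dev/sda

# Query SMART data; some SSDs report a failing or end-of-life state here
sudo smartctl -a /dev/sda

# Watch the kernel log for I/O or write-protect errors while retrying a change in GParted
sudo dmesg --follow

If those would not show anything useful for this kind of problem, corrections are welcome.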
The Question
Does anyone have a clue as to what could be causing this? And if so, how might I go about recovering a drive from this issue? I haven't yet found any other forum posts describing a similar enough issue to help at all. This happened on two different drives with different interfaces (both M.2 and SATA), which suggests to me a motherboard quirk, but I am a total and utter noob at drive troubleshooting. Any and all suggestions about what to do, and about whether installing other Linux distros on my still-working drive is safe, will be greatly appreciated. I know this was long, but it seems to be a very specific issue, so I welcome and really appreciate any form of help.
Happy hardware!
If the drive is technically fine (it may not be), I would suspect trimming issues. What filesystem(s) do you use? I have bad experiences with Btrfs clogging my SSD, unless I run
btrfs balance ...
and fstrim
frequently enough. I imagine your SSD may consider blocks used by previous (now nonexistent) installs as still in use. If I'm right, this may help you to "reset" the drive: How to TRIM/DISCARD a whole SSD partition on Linux? Warning: trimming your whole device will discard all data on it. On the other hand, my guess may be totally wrong. – Kamil Maciorowski – 2018-02-08T06:59:57.703
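A minimal sketch of the maintenance the comment refers to, assuming a Btrfs filesystem mounted at / and with /dev/sda standing in as a placeholder for the drive to be wiped (the last command destroys all data on that device):

# Rebalance Btrfs chunks that are less than 50% full, returning unused space to the device
sudo btrfs balance start -dusage=50 /

# Tell the SSD which filesystem blocks are no longer in use
sudo fstrim -v /

# Discard (TRIM) the entire device -- WARNING: this erases everything on it
sudo blkdiscard /dev/sda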