Can anyone make sense of the G-sense error rate?


Before I get to my problem, I have a preliminary question: can the normalized value ever be 0, or even negative? Many sources I have read say there is no such thing as a normalized value of 0 or less, and that the lowest possible normalized value is 1.

My brother and I both have laptops with Western Digital hard drives installed, and we both run Windows 8.1. Mine is an ASUS N56JR; my brother's is an Acer (I forget the model).

Here are some facts specific to Western Digital hard drives, or at least to my brother's drive and mine. Every time the raw value goes up by one, the normalized value goes down by one; for example, a raw value of 98 (decimal) corresponds to a normalized value of 2. In other words, the raw value in decimal and the normalized value always sum to 100, as long as the raw value is below 100.
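Based purely on the pattern just described (not on anything Western Digital publishes), the normalization looks like a simple subtraction. Here is a minimal sketch in Python, where the function name and the floor of 1 are my assumptions:

    def normalized_g_sense(raw: int) -> int:
        """Hypothetical mapping inferred from the observed pattern:
        normalized = 100 - raw while raw < 100, with a floor of 1
        (assuming the sources that say 1 is the lowest value are right)."""
        return max(100 - raw, 1)

    # The example from the text: a raw value of 98 gives a normalized value of 2.
    assert normalized_g_sense(98) == 2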

For each drive, here are four numbers, in this order: normalized value, worst value, threshold, and raw value (in decimal).

My laptop: 12 12 0 88

My brother's laptop: 91 91 0 19
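For anyone who wants to check these numbers themselves, they come from SMART attribute 191 (G-Sense_Error_Rate). A minimal sketch that pulls that row out of smartmontools output, assuming smartctl is installed and the drive is /dev/sda (both assumptions on my part):

    import subprocess

    # Dump the SMART attribute table; the device path is an assumption.
    out = subprocess.run(
        ["smartctl", "-A", "/dev/sda"],
        capture_output=True, text=True, check=True,
    ).stdout

    # Attribute rows look like:
    # 191 G-Sense_Error_Rate 0x0032 091 091 000 Old_age Always - 19
    for line in out.splitlines():
        fields = line.split()
        if fields and fields[0] == "191":
            value, worst, thresh, raw = fields[3], fields[4], fields[5], fields[9]
            print(f"normalized={value} worst={worst} threshold={thresh} raw={raw}")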

Which of us would you say moves his laptop around while it is on, and which one leaves it on a desk 24 hours a day and never moves it at all?

If you guessed that I am the one who moves his laptop around while it is on, you would be wrong. The opposite is true.

This raises the question: why? We both use Western Digital hard drives, and my laptop is only 3 months older than my brother's.

Is the G-sense sensor in my hard drive oversensitive, or is the one in my brother's undersensitive? Doesn't Western Digital install the same kind of G-sense sensor in all its hard drives? And should I panic when the G-sense error rate's normalized value reaches 1 and the raw value goes past 100?

G-sense doesn't make sense
