Are NVidia ESA values standardized?


I'm presently reverse-engineering the Dell XPS 630 and 730 Master IO boards to create cross-platform tools that aren't tied to a particular OS or motherboard. I've completely analyzed the protocol and have a fully functional program to control the fans, lights, and whatnot. However, reading the temperatures and fan speeds requires more math.

These boards are supposed to be NVidia ESA Certified, which at the time was intended to be a magical open standard that would solve everyone's problems forever! Unfortunately, I can't find this "open standard" anywhere. Various posts indicate that what I'm looking for should be a part of the standard, but no one has seen fit to mention how.

Mostly, I'm looking to find out how these obscure values are parsed into usable units - whether the ESA specification says "0x00 to 0xFF maps to a range from 10 to 200 degrees C", or whether Dell just knows precisely which thermal sensor is in place and can therefore convert the values in their own software.

EDIT: After quite a bit of experimenting, I have determined that the algorithm for temperatures in this case is a very simple one: T = n - 64, where T is the temperature in Celsius and n is the byte value reported by the controller. At least, allegedly; this is the same controller and software in which the reported fan speed never goes above 95%, yet exceeds the fan model's maximum RPM.
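For concreteness, here's a minimal Python sketch of that conversion (the function name is my own invention; only the offset formula comes from my testing):

    def esa_temp_celsius(raw: int) -> int:
        """Convert a one-byte controller reading to degrees Celsius.

        Empirical: T = n - 64. Whether that offset comes from the ESA
        spec or is Dell-specific is exactly the open question here.
        """
        if not 0 <= raw <= 0xFF:
            raise ValueError("expected a single unsigned byte")
        return raw - 64

    # Example: a raw reading of 0x59 (89) decodes to 25 degrees C.
    print(esa_temp_celsius(0x59))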

However, I am leaving this question unanswered, because I've not yet found out whether this conversion is part of any standard.

DigitalMan

Posted 2015-05-07T03:19:27.890


Where did you get 0x00 to 0xFF? That would be 256 values that could be used... – Austin T French – 2015-05-07T04:02:15.967

@AthomSfere Correct - the temperature sensors use one byte, and it is not directly a Celsius or Fahrenheit reading. Fans use three bytes, with the first two being a short that approximately equates to a percentage when divided by 65535. – DigitalMan – 2015-05-07T04:09:33.633
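To make that concrete, here's a similar sketch of the fan-speed decoding (the byte order is an assumption; only the divide-by-65535 scaling is from observation):

    def esa_fan_percent(data: bytes) -> float:
        """Decode a three-byte fan record to an approximate percentage.

        The first two bytes form an unsigned 16-bit value; dividing by
        65535 gives a 0.0-1.0 fraction. Big-endian order is assumed.
        """
        if len(data) != 3:
            raise ValueError("expected a three-byte fan record")
        value = int.from_bytes(data[:2], "big")  # assumed byte order
        return value / 65535.0 * 100.0

    # Example: bytes 0xF3 0x33 decode to roughly 95%.
    print(esa_fan_percent(bytes([0xF3, 0x33, 0x00])))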

No answers