Ever since the birth of the computer revolution, all semi-autonomous equipment has had some sort of "processor" in it; it's just that, until now, it was never really flagged as such.
What you're seeing here is the ongoing corruption and half-truths spread through our society by over-zealous marketing agencies, where salespeople are increasingly encouraged to believe they are the stars of the show, simply because they are the ones making the profits.
The fact of the matter, however, is this: anything that has to perform a set of tasks, where the next iteration of a process can differ from the previous one, must have some kind of interpreter that can make sense of the instructions the device is given and then react to them in some fashion.
Back in the mists of time, terminology such as "controllers" was the norm, but these still boiled down to the same thing.
Take, for example, an IDE hard drive with its on-board IDE controller. While this is not a CPU in the same sense as the CPU on your PC's main board, it is nevertheless still a form of CPU.
The host PC sends op codes (short for "operation codes") across the bus (PCI, ISA, MCA, PCIe or whatever) to the drive's controller. The controller then reads each code, along with any data that accompanies it, and turns them into physical operations that cause the drive to move its heads to the correct place and read the requested data.
Routers have an even longer history. Cisco has been building networking gear for the best part of 30 years now, and every single one of those devices has had a custom controller/CPU in it all that time. That CPU was designed by Cisco, for Cisco, expressly for the purpose of programming and controlling their entire range of routers and switches.
Graphics cards are another example. You hear people bandy the term "GPU" around as if it's some mystical thing that only does graphics. It's not: it's a massively parallel mathematical processor. I've just finished doing the technical edit on a book on Nvidia CUDA, and what I learned about Nvidia GPUs was rather surprising. These things are processors in their own right, designed to do a specialist set of jobs, but they are still semi-intelligent and capable of many different types of operation.
As has been pointed out already, the Netgear ReadyNAS is actually more like a full PC in its own right. It's just specially designed to function only as a remote storage device.
If you wanted to, there would be nothing stopping you from re-programming the Netgear device with new software and making it function perfectly well as a web server, database server or even a small Linux development server. (A quick search will turn up more than a handful of projects aimed at doing exactly that with these NAS units.)
As for processors in general, it might surprise you to learn that it's not just hard drives that have them these days. Try this little experiment.
Go and stand in your kitchen and see just how many CPUs you can count.
I'm willing to bet that your fridge/freezer, washing machine, dishwasher, oven and microwave (at the very least) all have some sort of processor in them. It may not be an Intel Core i7, but it's still a processor, and it's designed to sit there quietly, interpreting instructions sent to it by other electrical/digital circuits, which it then turns into the physical operations you see.
So what is the definition of a Processor?
Well, it's a bit hard to pin down these days, but in general the definition of a "processor" is something along the lines of: "any self-contained unit that is capable of acting on external inputs in a semi-intelligent way and producing a known set of outputs derived from those inputs".
So any stand-alone unit, circuit, chip or autonomous machine that can effect a physical manifestation of some known process, based on a set of pre-defined inputs, can in its most basic and generic sense be considered a processor of some description.
That Netgear isn't just a router but a full-fledged file server. As for the hard drives, the controller just does some preprocessing on one and I/O on the others. Theoretically a bit faster, but an SSD is still the king of speed. It looks like the ASUS router has some VPN features and other fanciness that need some processing power, hence the dual core. – user341814 – 2014-08-20T04:22:25.960
The von Neumann model says nothing about the structure of I/O devices. You still need a graphics card to drive a monitor, even though that model lumps it all under a single "output" block. – user253751 – 2014-08-20T10:11:09.733
The von Neumann architecture (from 1945) is a great starting point (conceptually) to understand stored-program computers. The actual implementation of modern computers (including most peripherals) is significantly more detailed. In 1945 there were no "smart peripherals" so they would not be represented in the diagram. Cars are conceptually the same as they were in 1945 (four wheels, an engine, steering wheel) but you'd not expect a simplified diagram of a car from 1945 to give you a comprehensive understanding of them today. – Maxx Daymon – 2014-08-20T14:07:28.623
The von Neumann architecture diagram also doesn't include an arrow between "Memory" and "Storage". Consider DMA. – a CVn – 2014-08-20T14:26:14.787
All that "von Neumann architecture" means is that the processor is "programmable", and the program memory is shared with the data memory. (As opposed to a "Harvard machine", where the program memory is separate from the data memory.) – Daniel R Hicks – 2014-08-20T15:19:01.613
Did you know that (apart from Apple - because of Woz), every early home microcomputer (that I can think of) had a processor in the floppy drive? Remember the chunk-chunk-chunk sound of early Apple floppy drives? That was because they found sector zero by moving the drive arm to the maximum distance three times. – Elliott Frisch – 2014-08-22T17:43:10.320
@ElliottFrisch I thought that was done by the OS? – user253751 – 2014-08-24T03:15:04.827
@ElliottFrisch: The reason that most 1980's home computers used a microprocessor in the floppy drive was that reading a floppy drive requires that a byte of data be accepted about once every 20-30 microseconds while a sector is being read; that required either using DMA circuitry, or else having a processor which could devote many thousands of consecutive cycles to the task of reading a disk. On a machine like the Commodore 64, the video chip takes over the processor bus 1500 times per second, delaying code execution by 40-43 microseconds each time. Many other machines with fancy graphics... – supercat – 2014-08-25T15:38:30.927
...such as the Atari 800 also use cycle-stealing. The ability to steal memory cycles allows the Commodore and Atari to display much fancier graphics than the Apple, but means that their main processors cannot perform any task which would require their undivided attention. Although the Apple II clock is slightly irregular because of the video (most cycles are 977.8ns, but every 65th cycle is 139.6ns longer), that discrepancy is small enough to be ignored. The loss of groups of 43 consecutive cycles isn't. – supercat – 2014-08-25T15:45:04.877
Arguably, the VIC 20 (which preceded the Commodore 64) could have used its own processor to handle floppy drive access if it disabled the 60Hz keyboard-scanning interrupt during floppy access, but the machine only had 5K RAM and a single CPU-bus slot. The amount of circuitry that would have been needed to let the VIC 20's processor control a floppy drive "directly" while still having a slot to plug in RAM expansion units would have been sufficiently great that adding the processor as well represented a minimal added expense. – supercat – 2014-08-25T15:55:02.223
From an engineering standpoint, it might not have been a bad idea for Commodore to have produced an interface cartridge which could connect to either the floppy drive or printer, but from a marketing standpoint saying the computer could connect directly to a roughly-$400 (IIRC) printer and a $599 floppy was probably better than saying it would require a $100 controller, even if the $100 controller would have allowed the prices of the floppy and the printer to be reduced by $100 each. – supercat – 2014-08-25T15:59:16.113
@supercat Fair enough; and I did know most of that. My point was that integrating processors onto disk drives was not a recent phenomenon. I also found it amusing to reflect that the C64 had a MOS 6502 as a main processor... and every 1541 floppy drive also had a 6502. Of course, Commodore bought MOS Technology so they could source them cheap. – Elliott Frisch – 2014-08-25T16:17:24.637
@ElliottFrisch: I do not believe the IBM PC floppy drives, nor the IBM PC floppy controller cards, included anything that would be considered a microprocessor in the usual sense [the Apple's floppy controller card contained a discrete logic machine that executed two "instructions" from its own ROM for every 6502 cycle; I think that's just as much of a "processor" as anything in the PC drives or controller]. The PC's controller could perform a more sophisticated sequence of steps without processor involvement than could that of the Apple II, but... – supercat – 2014-08-25T16:31:51.973
@ElliottFrisch: I think things like "fetch N bytes of data and stop" were implemented by using a counter which was hard-wired to count bytes, rather than by using a shared ALU to decrement the value of a register which supported only simple "read value" and "write value" functionality. – supercat – 2014-08-25T16:36:41.230