What is the use of a built-in graphics card on a "gaming" motherboard?

39

9

Many motherboards marketed as "gaming" have integrated Intel graphics. Examples are the ASUS B150I PRO GAMING/WIFI/AURA and the Gigabyte GA-Z170N-Gaming 5, but these are just a couple of many. Note the word "Gaming" in their respective names.

Now I understand that if you want to build a gaming PC, you would most likely opt for Nvidia or AMD. This is because integrated video doesn't stand a chance against higher-end Nvidia/AMD offerings. Correct me if I'm wrong.

I understand that putting integrated graphics into a motherboard increases its cost. So there must be a reason why manufacturers do this. It looks to me that putting an integrated GPU on a gaming motherboard is the rule rather than the exception.

However, I cannot figure out what this integrated graphics is good for. Could you please explain what it can be used for (I'm guessing the intended use, but any other possible uses too), given that for a gaming PC one is most likely to use a discrete GPU?

If you think any of my assumptions are wrong, please point that out; since the whole thing does not make a lot of sense to me, it is quite likely that my assumptions are wrong somewhere.

Andrew Savinykh

Posted 2016-05-29T03:32:25.830

Reputation: 1 521

A somewhat esoteric use would be if you're running Linux but want to play in a Windows virtual machine by giving it control of your graphics card. In that case, you need to have a second GPU to display the Linux OS (there is no way to have a GPU shared with both the host and the VM) and this little built-in GPU becomes very useful. – André Borie – 2016-05-29T15:38:46.417
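
(A minimal sketch of that scenario, assuming a Linux host with VT-d/AMD-Vi enabled and Python available: the script below lists the IOMMU groups so you can check whether the discrete GPU can be isolated for the VM while the built-in GPU keeps driving the host's display. The sysfs paths are standard; everything else is illustrative.)

    # Minimal sketch: list IOMMU groups and the PCI devices in each one, to see
    # whether the discrete GPU sits in its own group and could be handed to a VM
    # via vfio-pci while the integrated GPU keeps driving the host's display.
    # Assumes the IOMMU is enabled (e.g. intel_iommu=on on the kernel command line).
    from pathlib import Path

    GROUPS = Path("/sys/kernel/iommu_groups")
    PCI_DEVICES = Path("/sys/bus/pci/devices")

    def list_iommu_groups() -> None:
        if not GROUPS.is_dir():
            print("No IOMMU groups found - is VT-d/AMD-Vi enabled in the firmware?")
            return
        for group in sorted(GROUPS.iterdir(), key=lambda p: int(p.name)):
            devices = []
            for dev in (group / "devices").iterdir():
                # Each entry is a PCI address; read its class code to label GPUs.
                cls = (PCI_DEVICES / dev.name / "class").read_text().strip()
                label = "display controller" if cls.startswith("0x03") else cls
                devices.append(f"{dev.name} ({label})")
            print(f"IOMMU group {group.name}: {', '.join(devices)}")

    if __name__ == "__main__":
        list_iommu_groups()

A discrete GPU (together with its HDMI audio function) that occupies a group by itself is a good passthrough candidate; the built-in GPU then stays bound to the host.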

To get right to the point (since the answers are correct but avoid the critical detail): those motherboards do NOT have any graphics processor integrated. All they have is a video output connector that lets you use the integrated graphics if you buy a CPU that has them. But if you buy a CPU without integrated graphics (prior to Skylake, you could buy Xeon CPUs and use them with gaming motherboards, and on Xeons the iGPU is optional), the motherboard will not provide it. – Ben Voigt – 2016-05-29T17:41:22.823

11"Gaming" is a meaningless marketing term and nothing more. You can buy non-gaming motherboards and have better performance for less money – Keltari – 2016-05-29T22:21:54.183

Don't forget that you're most likely not going to be gaming 24/7 - discrete GPUs tend to be rather power hungry even today, even when they're barely doing anything; an integrated GPU can handle desktop composition just fine, with vastly lower power requirements. Given their cost, there's hardly any reason not to include them. Even when you are gaming, the other GPU can be used for tasks like desktop composition or hardware video decoding, offloading work (and memory usage) from the main GPU. – Luaan – 2016-05-30T09:54:00.650

@Keltari but it glows in red! – downrep_nation – 2016-05-31T05:10:38.957

"Nvidia or Radeon": these are not the same kind of thing. It should be either Nvidia or AMD (company names) or GeForce or Radeon (GPU brand names). – A.L – 2016-05-31T11:09:11.203

@Keltari Then you are not a gamer – paparazzo – 2016-05-31T13:28:29.967

@Paparazzi real gamers don't fall for the hype. We make fun of those who do – Keltari – 2016-05-31T15:06:24.053

@Keltari If you don't know there are real differences in a gaming board, that is on you – paparazzo – 2016-05-31T15:27:04.313

@A.L - I felt something did not read right there, but could not put my finger on it. You are spot on, thank you very much; edited. – Andrew Savinykh – 2016-05-31T18:46:30.790

@Keltari: he drank the kool-aid, you won't win – Yorik – 2016-05-31T19:37:36.577

Answers

16

There are a few wrong assumptions here, and they have led you to wrong conclusions:

Many motherboards marketed as "gaming" have integrated Intel graphics.

The GPU is on the CPU. Intel made this decision, not the motherboard maker. When buying Intel, the GPU cannot be avoided.

I understand that putting integrated graphics into a motherboard increases its cost.

It depends on what you're looking at. If you look at the price of the chips alone, the costs are not groundbreaking. On the LGA 775 platform the GPU was integrated into the chipset, so some chipsets had an integrated GPU while others genuinely lacked that processing power. However, the low-end chipsets with a GPU (e.g. G41) were actually cheaper than high-end chipsets without one (e.g. P45). So we can conclude that while an integrated GPU must increase the price of a chip, the increase is not really enough to justify the cost of making two lines of chips, with and without. This is probably why Intel decided to put a GPU on every single consumer CPU.

Now, since the GPU is already on the silicon, we can consider the costs that the motherboard designer actually controls. If he wants to make the GPU work, he adds the connectors (probably the most expensive part of implementing onboard video these days), traces, and a handful of dirt-cheap passive components like those tiny resistors and capacitors. Those costs are still negligible. If we were talking about the lowest-end budget motherboard, shaving off a few dollars would probably at least be seriously considered - but on a high-end motherboard that is already expensive, any possible savings are negligible.

This is because integrated video doesn't stand a chance against higher-end Nvidia/Radeon offerings.

I can't really call you wrong on this one. They can't compare with the high end. However, the old wives' tale that integrated GPUs are useless isn't true anymore! There are two desktop Intel processors (LGA 1150 Broadwell: the Core i5-5675C and Core i7-5775C) with the integrated Iris Pro Graphics 6200, which was a shock when it was released in Q2 2015. Its performance is comparable to a low-end discrete GPU, so it can actually be used to play most games at lower detail settings. If you're a gamer on a tight power or space budget (e.g. a console-sized living-room PC), I believe this would be the way to go. This integrated GPU was probably quite expensive, which is why it's seen only on a $276 CPU.

There is also an elephant in the room here. I believe you've assumed that "gaming" means "top performance". Well, it does not. It's simply a marketing strategy. Nobody can really tell what the "gaming" label means, except that it features aggressive styling and a higher price tag: basically a premium product. So, when in doubt, just add every feature you can and you'll have one more point on the feature list. Like pretty RGB lights that most users will lock up in a case shoved under the desk, never to be seen again, or shiny metal over the PCIe slot that does nothing but look cool. (Seriously, lights? How are they in ANY way useful for gaming? I can't believe you questioned the integrated GPU while there are lights on the mobo!)

Agent_L

Posted 2016-05-29T03:32:25.830

Reputation: 1 493

There are mobos with integrated graphics, not just processors. – Madeyedexter – 2016-05-29T20:53:37.990

"When buying Intel, the GPU cannot be avoided" - wrong! If you buy a multi-thousand-dollar Intel CPU, chances are the die space that would be used for the GPU is used instead to give the CPU more transistors, which means more throughput. Just check ark.intel.com (this includes the latest generation). However, I'm not sure the video output of the motherboard would work. – user2284570 – 2016-05-30T00:01:20.273

And as far as "gaming" goes, it just makes for a better question in my opinion. I'd rather ask about "gaming" motherboards, indicate explicitly in the very first sentence of the question that I understand it is a marketing term, and put the word gaming in quotation marks, than try to come up with a more awkward definition of the mobo class I'm inquiring about. Apparently even then people feel compelled to point out that it's marketing. Thank you, I already know that =) – Andrew Savinykh – 2016-05-30T08:20:25.920

1"he adds the connectors [...] and a handful of dirt-cheap passive components like those tiny resistors and capacitors. Those costs are still negligible." I suspect a lot of people have little idea how negligable these costs are. Note that if you contract a chinese PCB manufacturer, they will usually throw in as many surface-mount resistors/transistors/etc as your design requires. They just don't bother charging for them. Even in the UK in small volumes, those components cost significantly less than a penny each. In small volumes, HDMI connectors are < 30p each, delivered from China. – Jules – 2016-05-30T13:16:33.920

To a motherboard manufacturer, dealing in large bulk purchases of components, I would doubt that the extra cost of adding an HDMI port to a motherboard amounts to more than about 20p. The only reason they might choose not to do it is if they're struggling to make enough space for other components, particularly connectors that need to contend for precious space on the edge of the board. – Jules – 2016-05-30T13:17:52.457

@Madeyedexter Yes, and I mentioned them - the LGA 775 platform. If you believe there are modern, "gaming" platforms with the GPU on-board instead of on the CPU, then please do correct me. – Agent_L – 2016-05-31T06:05:55.543

@user2284570 Yes, you can buy a Xeon; half of them come without a GPU. No, they don't have extra processing power, just an approximately $10 lower price tag. Yes, the video out will not work with such a Xeon CPU; it's clearly stated in just about every motherboard manual. And most importantly, I deliberately omitted niche CPUs. They offer no benefits, only drawbacks, for the average home, office or gaming user, so there is no point in mentioning them. https://en.wikipedia.org/wiki/List_of_Intel_Xeon_microprocessors#Skylake-based_Xeons

– Agent_L – 2016-05-31T06:19:09.787

@zespri I believe that for GPUs the "gaming" tag is one of the few places where it actually belongs (or rather it's applied by default). Almost all popular GPUs have very little use except for playing games. There are some models oriented toward professional graphics (e.g. Nvidia Quadro) or supercomputing (e.g. early Titans), but they're clearly marked as non-gaming. My rant about "gaming" was about using it for products that have many uses and no features that would make gaming clearly stand out in some way. Like motherboards, headsets, etc. – Agent_L – 2016-05-31T06:25:51.260

"shiny metal over PCI slot that does nothing but looks cool" Well, that, and helping to ensure that your computer meets RF radiation shielding requirements and thus can be sold commercially at all. – a CVn – 2016-06-07T07:48:53.310

@MichaelKjörling All other motherboards do without such shields. If this one requires a shield to meet EMI requirements, then it means there is some underlying flaw in the design (which I doubt). Besides, it's not advertised as an anti-radiation device; it's advertised as mechanical reinforcement of a part that should not carry the bulk of the mechanical stress in the first place. – Agent_L – 2016-06-07T08:17:53.023

@Agent_L I thought you were referring to case slot access holes. If you weren't, you might want to edit to clarify that you are talking about something else. I agree that a motherboard should not need to support significant mechanical stresses (possibly with the exception of the CPU cooler, but that's what the retainer support plate is for). – a CVn – 2016-06-07T08:44:10.083

@MichaelKjörling I clearly stated "shiny metal over PCI slot" and you even quoted me on that. There is no need for more clarification. – Agent_L – 2016-06-07T08:50:25.053

45

There are a few reasons. Firstly, nearly every single modern mainstream1 processor has an integrated on-die GPU, and the chipset supports it. Essentially your only cost is the traces and connectors, so it's a 'free' feature you can design in - unlike with older designs. Interestingly, many of the Sandy and Ivy Bridge-era Intel chipsets outside the Z series made you pick one or the other (H series) or didn't have onboard video at all (P series). Many earlier processor families used a PCIe 'slot' for an onboard chip, but now most integrated graphics is on die.

Modern integrated GPUs do neat stuff like Quick Sync, which means that even with a discrete card the iGPU part of your CPU can be doing work. With earlier drivers you needed a display (or a dummy display), but you can set up Quick Sync to work without one for faster transcodes or video playback. I'm sure AMD has something similar on their APUs - though I've not used them recently - they're somewhat more powerful than Intel's models, and paired with a discrete Radeon they might do switchable graphics to save power.
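
To illustrate the transcoding point, here is a minimal sketch - assuming an ffmpeg build with Quick Sync (QSV) support and using hypothetical file names - of handing an H.264 transcode to the integrated GPU from Python, leaving the discrete card free for the game:

    # Minimal sketch: decode and re-encode H.264 on the Intel iGPU via Quick Sync,
    # using ffmpeg's QSV decoder/encoder. The file names are placeholders and the
    # bitrate is arbitrary; this assumes ffmpeg was built with QSV support.
    import subprocess

    def qsv_transcode(src: str, dst: str, bitrate: str = "6M") -> None:
        cmd = [
            "ffmpeg",
            "-hwaccel", "qsv",       # decode on the integrated GPU
            "-c:v", "h264_qsv",
            "-i", src,
            "-c:v", "h264_qsv",      # encode on the integrated GPU as well
            "-b:v", bitrate,
            "-c:a", "copy",          # pass the audio stream through untouched
            dst,
        ]
        subprocess.run(cmd, check=True)

    if __name__ == "__main__":
        qsv_transcode("gameplay.mkv", "gameplay_qsv.mp4")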

It's also handy if your main video card has blown and you don't have a spare. I genuinely found this useful with my last PC, which had a GPU failure. Sure, you can replace the card, but it's totally worth it to be able to check just by yanking out the old card and changing the output the monitor is plugged into.

So, in short: "All the pricy stuff is already there and Intel insists, so why not add a cheap feature?"

1 I'd consider most Intel LGA 115x processors and AMD APUs to be mainstream. The AMD FX series and Intel LGA 2011 are enthusiast-focused, though the FX series kinda overlaps with Intel's mainstream products on price. AMD fans may disagree.

As of 2018, things get a bit more complex. Intel's Core i3s and i5s are solidly mainstream. The i7 and i9 badges cover both mainstream and server-inspired models. As for AMD, Ryzen is the mainstream line and Threadripper is the enthusiast one.

Journeyman Geek

Posted 2016-05-29T03:32:25.830

Reputation: 119 122

Might be nice to add that AMD doesn't have integrated graphics in order to keep prices down. Well, that's what they say anyways. (Last I heard at least) – Insane – 2016-05-29T07:14:59.810

AMD's cheaper processors are APUs. I suppose AMD's gaming/performance-oriented FX processors don't have on-die GPUs... – Journeyman Geek – 2016-05-29T09:09:50.347

The third paragraph happened to me last year. The GPU failed and I just had to change the HDMI cable from the GPU to the motherboard (Intel CPU graphics) to be able to use the PC until I bought a new GPU. – Edu – 2016-05-29T11:21:03.267

The difficulty in finding a mid-range CPU without integrated graphics is a bit beside the point. Even if only 25% of CPUs had an iGPU, it would still make sense for the m/b manufacturer to add the output connector, allowing you to use the iGPU if you chose to buy it. – Ben Voigt – 2016-05-29T17:37:01.200

Integrated devices are so nice. I'm dating myself, but I remember when motherboards had a CPU slot, memory slots, and ISA slots and that was it. You had to buy serial/parallel cards, video cards, and even IDE HDD cards separately. You kids today are spoiled with integrated sound cards, network cards, GPUs, SATA controllers, USB, etc. Plus we had to hand-crank the machines for power and walk uphill both ways to school in the snow - wearing newspaper for shoes. – Keltari – 2016-05-29T22:26:00.603

@Keltari -- too true. And those IDE cards were so much cheaper than the MFM/RLL controller cards they replaced... but don't forget the slot for the 8087: what folks today are really spoiled by is the fact that they get floating point for free when they buy a processor, rather than having to get another one just for that purpose... :) – Jules – 2016-05-30T13:07:30.047

Hmm. Got me reminiscing now. I reckon the disk controller in my first PC was probably one of these. Note that the card is substantially bigger than most modern motherboards. :)

– Jules – 2016-05-30T13:33:18.003

5

Could you please explain what it can be used for (I'm guessing the intended use, but any other possible uses too), given that for a gaming PC one is most likely to use a discrete GPU?

There are two uses I can think of for integrated video in enthusiast hardware:

  1. It can drive an additional monitor. Have one or two monitors driven by the expensive PCIe GPU. Use those for your games that demand performance. Drive an extra monitor off the integrated video and use that for email and web browsing. Modern high-end cards tend to be able to drive more and more monitors, but I still think this is a valid point.

    My primary system has a monitor I use to keep up a web browser while gaming. It is great for having a wiki, forum, or Arqade up with information I can use in the game. My GPU has enough outputs that I can drive it off the primary video. If it did not, I would not hesitate to plug it into the integrated video connector.

  2. Troubleshooting if your GPU is damaged. You still have backup video to use your system while troubleshooting your primary video, or while ordering a replacement and waiting for it to arrive.

user76225

Posted 2016-05-29T03:32:25.830

Reputation:

Good points, especially number 1. I had not considered that because I usually use graphics cards that can support two monitors, but if you don't have such a card or want a third monitor, it is certainly a possible use. – Andrew Savinykh – 2016-05-30T08:23:47.640

Re additional monitor -- it's been a while since I bought a new computer with an Intel chip, but the last AMD machine I bought had an internal GPU but the chipset on the motherboard couldn't enable the internal GPU if I wanted to be able to use an external graphics card. I presume Intel systems (and/or more recent AMD ones... this was maybe 4 years ago now) have removed that restriction? – Jules – 2016-05-30T13:21:50.290

0

I have a high-end Nvidia card, and at some point the commonly used boot loader on a lot of utility discs stopped working with it. So I try to boot a clone disc for backing up, and can't see the boot menu! Likewise with newer copies of UBCD and any "live" CD. Even Windows could not be installed, nor the Win10 update run, using the normal video card. I have to re-enable the built-in Intel video and plug a monitor into it in order to do this stuff.

So, I'm glad I spent $10 more and got the CPU with the integrated GPU, in this case.

JDługosz

Posted 2016-05-29T03:32:25.830

Reputation: 597

I think this is wrong. The whole reason you could not see anything on screen is exactly that you have the integrated graphics and the output is redirected there. If you did not have the integrated graphics, your output would appear on the external card, because it has nowhere else to go, and you would not have the problem. – Andrew Savinykh – 2016-05-30T08:25:45.233

No, the built-in video can be disabled in the boot settings (not just left on auto, which detects what's plugged in), and some boot discs (that stay in text mode) work fine. – JDługosz – 2016-05-30T08:33:07.823

0

Sometimes when you upgrade your graphics card, the new card's driver software might not match, so you have to use the graphics built into the motherboard/CPU to start your PC and download the right driver.

M.Huo

Posted 2016-05-29T03:32:25.830

Reputation: 1

I've never had a situation where my video card couldn't downgrade to standard SVGA mode so I could get Windows to boot and download a new driver... – Jules – 2016-05-30T13:25:22.997

I had such a situation last week. I tried to install an Nvidia driver update, and got a driver that does not work under Windows Vista - it will not even allow the computer to boot, so there was no way to request such a downgrade. I finally found the installation DVDs, booted from one of them instead of from the hard drive, and found that a certain repair program CLAIMED to do nothing useful, but actually restored the ability to boot from the hard drive. If you are running an Nvidia-based graphics board, don't even TRY to install a driver newer than 365.19. – milesrf – 2016-05-31T20:27:20.000

0

It has been seen at times that, when we upgrade our graphics card, the new card's driver software may not match. The best way around this problem is to use the graphics built into the motherboard to download the driver.

Michael Steven

Posted 2016-05-29T03:32:25.830

Reputation: 1

Take a look at How do I write a good answer

– Sam – 2017-06-01T07:48:49.320