Why weren't discrete x86 CPUs ever used in game hardware?



First off, you don't need to point out that the current generation of home game consoles uses APUs with x86_64 cores. Re-read the question title, if necessary.

I cannot recall a single arcade system or game console that ever used x86 for its CPU. I'm happy to be corrected in the comments if there were any. Even allowing for such exceptions, which must be incredibly rare, it sure seems that gaming hardware steered away from this otherwise incredibly popular CPU family.



Why is this the case? What tradeoffs in game hardware design likely led engineers away from using x86 for the CPU?







Tags: hardware, gaming, arcade, x86, game-consoles
















asked 9 hours ago by Brian H







  • 5





    I think Mad Planets, Krull, and Q*Bert were all based on a 16-bit x86 platform.

    – supercat
    9 hours ago






  • 4





    Adding to supercat’s examples, there were a few x86-based games consoles: the FM Towns Marty (1993) used a 386SX, the WonderSwan (1999) a NEC V30 MZ. The Xbox nearly counts as retro now ;-).

    – Stephen Kitt
    8 hours ago







  • 2





    Several Irem arcade machines also used NEC V30 CPUs.

    – Stephen Kitt
    8 hours ago






  • 2





    I'm not sure why you're making a distinction about "discrete" CPUs. The PlayStation 4 and Xbox One have x86 CPUs with integrated graphics, like almost all x86 PCs these days. In any case, you're going to have to come up with different criteria to exclude the original Xbox and its discrete Pentium III.

    – Ross Ridge
    7 hours ago






  • 2





    Just looking at the list of Sega arcade systems, there are multiple systems using discrete x86 processors: Chihiro, Lindbergh, Europa-R, RingEdge, RingWide, RingEdge 2, Nu, ALLS. With the shift towards home gaming causing the decline of arcades, I'd venture to say that you aren't going to find any non-PC-based arcade systems anymore.

    – user71659
    6 hours ago













3 Answers
The original 8086 was quickly overshadowed by the Z80, which was somewhat compatible but easier to work with as it required less support hardware. Also, many arcade developers preferred the 6502 and its derivatives, and then later the 68000, which was easier to work with on both the hardware and software fronts.



Another issue was that the development machines available for testing code were often 68000-based as well; one prime example was the Sharp X68000. A lot of game programmers of that era were self-taught too, and among hobbyist home computer systems the Z80 and 6502 dominated, with very few using the 8086.



Finally, the 8086 was much more expensive than the Z80, while offering no real advantages over it unless you were expecting to buy millions and sell a line of compatible computers for years to come.






answered 8 hours ago by user


















  • 4





    This would make sense if the CPU in question was the 8080, but the 8086? Yes, it was much more expensive than the Z80, but the Z80 wasn’t compatible with it, and the 8086 was much more capable than the Z80...

    – Stephen Kitt
    7 hours ago






  • 2





    @StephenKitt Not as much as you might think for games in the 80's. If you compare a 4 MHz Z80, a 1 MHz 6502 and a 4.77 MHz 8-bit 8088, they all accessed memory at about the same speed, which is most of what a game of that era is doing. A 16-bit 8086 could potentially double this speed by accessing two bytes at a time, but only by greatly increasing the cost. If that cost could be justified then it would also justify using a 68000.

    – Ross Ridge
    7 hours ago











  • @Ross ah, right, yes, I was thinking in terms of registers, register sizes, address space, faster arithmetic etc.

    – Stephen Kitt
    7 hours ago











    @RossRidge: If fulfilling a device's RAM and ROM requirements would require an even number of banks of each, the relative cost advantage of putting them all on one 8-bit bus, versus having half on the upper 8 bits of a 16-bit bus and half on the lower 8 bits, would be relatively slight. The instruction size vs capability tradeoffs for the 8086 were designed so that code fetching and execution could be asynchronous tasks that would happen at about the same speed. The 8088 effectively cuts prefetch speed in half, which means the processor spends most of its time awaiting code fetches.

    – supercat
    1 hour ago











    @RossRidge The 8086 did have several essential features that would have made it great for console development, not just its 16-bit speed advantage (which the 8088 mostly misses). Most notably, a 1 MiB address range without the need for external helpers. But its 40-pin package forced a multiplexed AD bus, needing latches for demultiplexing, so it's somewhat of a draw. A (sped-up) Z80 with some banking might give the same result at lower cost.

    – Raffzahn
    58 mins ago
































Video game hardware, whether for home consoles or arcade machines, is designed pretty much from scratch. Hardware designers have largely free rein in choosing what CPU to use, basing their choice on factors like cost and ease of programming. The Intel 8086, quite frankly, was a poorly designed processor and was never well regarded. While you could argue it made reasonable compromises at the time it was released (1978), these compromises ended up hanging around its neck like an albatross. If IBM hadn't picked the Intel 8088 for its Personal Computer in 1981, you probably wouldn't be asking this question.



We take the x86 architecture for granted today, but before the IBM PC it was fairly obscure, and afterwards it was widely ridiculed. In particular, it compared poorly with the Motorola 68000, which had a flat 24-bit address space, a more orthogonal instruction set and sixteen 32-bit registers. The 8086 used a segmented 20-bit address space, placed more restrictions on how various registers could be used, and only had eight 16-bit registers. It also wasn't particularly cheap, though the 8088 with its 8-bit data bus helped reduce overall costs compared to the 8086.
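
To make the segmentation point concrete, here is a minimal C sketch (added for illustration, not code from any game system; it just applies the standard real-mode formula, physical address = segment * 16 + offset):

    #include <stdint.h>
    #include <stdio.h>

    /* Real-mode 8086 addressing: physical = (segment << 4) + offset,
       truncated to 20 bits. Offsets are only 16 bits wide, so a single
       segment register "sees" at most 64 KiB at a time. */
    static uint32_t phys(uint16_t seg, uint16_t off)
    {
        return (((uint32_t)seg << 4) + off) & 0xFFFFF;
    }

    int main(void)
    {
        /* Many segment:offset pairs alias the same physical byte... */
        printf("%05X\n", (unsigned)phys(0x1234, 0x0010));  /* 12350 */
        printf("%05X\n", (unsigned)phys(0x1235, 0x0000));  /* 12350 again */

        /* ...and crossing a 64 KiB boundary means adjusting the segment,
           not just the offset -- bookkeeping that a flat address space
           like the 68000's avoids entirely. */
        printf("%05X\n", (unsigned)phys(0x1000, 0xFFFF));  /* 1FFFF */
        printf("%05X\n", (unsigned)phys(0x2000, 0x0000));  /* 20000 */
        return 0;
    }

Every far pointer is two 16-bit values, and any object larger than 64 KiB forces this segment arithmetic onto the programmer or compiler.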



During the 70's and first half of the 80's, 16-bit CPUs like the 8086 and 68000 weren't really much of a consideration. The games of this era didn't demand anything more powerful than an 8-bit Z80 or 6502. While there were Gottlieb/Mylstar arcade games like Q*Bert in the early 80's that used a 5 MHz 8088 CPU, it's not clear what advantage this gave the machines. Performance in games of this era was mostly limited by how fast the CPU could access memory. Because of how the 8086/8 was designed, this made the 8088 effectively about as fast as a Z80 or 6502. These Gottlieb/Mylstar arcade games also only had 64k (16-bit) memory maps, so they didn't benefit from the 8088's 20-bit address space.
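
As a rough back-of-the-envelope illustration of that bandwidth argument (the per-access clock counts below are nominal approximations assumed for this sketch, not figures from the answer):

    #include <stdio.h>

    /* Very rough peak memory-access rates. Assumptions: the 6502 makes
       one access per clock, the Z80 needs ~3-4 T-states per access, and
       the 8088/8086 use a 4-clock bus cycle (1 byte per cycle on the
       8088, up to 2 bytes on the 8086). */
    int main(void)
    {
        double r6502 = 1.00 / 1.0;        /* 1 MHz, 1 clock per access   */
        double rz80  = 4.00 / 3.5;        /* 4 MHz, ~3.5 clocks average  */
        double r8088 = 5.00 / 4.0;        /* 5 MHz, 4-clock bus, 1 byte  */
        double r8086 = 2.0 * 5.00 / 4.0;  /* 5 MHz, 4-clock bus, 2 bytes */

        printf("6502 @ 1 MHz : ~%.2f MB/s\n", r6502);
        printf("Z80  @ 4 MHz : ~%.2f MB/s\n", rz80);
        printf("8088 @ 5 MHz : ~%.2f MB/s\n", r8088);
        printf("8086 @ 5 MHz : ~%.2f MB/s (16-bit bus)\n", r8086);
        return 0;
    }

All three 8-bit-bus parts land around 1-1.25 MB/s of peak memory traffic; only a full 16-bit 8086 bus, at extra board cost, would roughly double that -- the same tradeoff discussed in the comments on the previous answer.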



Starting around the mid-80's, games started moving beyond the capabilities of 8-bit CPUs. While the dominance of the IBM PC in the personal computer market at this point would've meant there were programmers out there familiar with the 8086, there would've been few people singing its praises. By and large, 68000 CPUs were chosen for new arcade game hardware designs that needed more power than 8-bit CPUs offered. Console hardware, being more cost sensitive, stuck with 8-bit CPUs for the rest of the decade, though most of the next generation went with 16-bit CPUs, either the 68000 or the 65816. It's also worth mentioning that the two major new home computer designs of the mid-80's, the Commodore Amiga and Atari ST, also went with the 68000.



While the 80386, introduced in 1985, arguably solved a lot of the 8086's problems, with a more orthogonal instruction set, a flat 32-bit address space and 32-bit registers, it wasn't until the early 90's that games started demanding the level of performance it offered, and that its price had dropped enough to make it competitive in new hardware designs. It's not entirely clear to me why it didn't attract more interest at this point. The early 90's was also about the time that the IBM PC became the premier platform for home computer gaming. It would've inherited the disdain its predecessors had, but there were some arcade boards designed in the early 90's that used the 8086-compatible NEC V30 family of CPUs. I think the main factor against it at the time was that RISC-based architectures were considered the future, while CISC-based architectures like the x86 and 68k were considered obsolete. Still, that didn't stop Sega from using the CISC-based NEC V60 CPU in its arcade hardware designs in the early 90's.



For the rest of the 90's, though, RISC-based CPUs like the Hitachi SH and IBM PowerPC dominated arcade hardware designs, at least at the high-performance end. At the lower-performance end, cheaper 68k- and NEC V30-based designs were still in use. In the home console market, the 5th generation was almost all RISC CPUs, though notably the Japan-only FM Towns Marty used an AMD 386SX CPU. For the most part, this situation continued to around the turn of the century, with both arcade games and the 6th generation of consoles.



A big exception is Microsoft's Xbox. A 6th generation console released in 2001, it has an Intel Pentium III CPU, much like PCs of the time. It's not surprising that Microsoft, with its long experience with the x86 CPU, made this design choice, but it's only a few years after this that mainstream Intel and AMD CPUs start appearing in arcade hardware. These x86-based arcade machines aren't really new hardware designs, though; they're PC clones running Windows or Linux. The 7th generation of home consoles went exclusively with PowerPC CPUs, but I suspect this had more to do with the prices IBM was offering than with the relative technical merits of the CPUs. Arcade games went increasingly with PC-clone-based hardware.



Today the choice of CPU in current game hardware designs is unremarkable. Home consoles and arcade games use x86 CPUs just like our personal computers do. Handheld consoles use ARM CPUs just like our phones do.



So, in the early days of game hardware design, x86 CPUs weren't chosen simply because there wasn't a good reason to use one except for IBM PC compatibility. Later 32-bit x86 CPUs solved a lot of the architecture's problems, but RISC CPUs were seen as more modern. Today the ubiquity of the x86 architecture, combined with its unrivalled speed, has turned it into the dominant CPU architecture for game hardware that doesn't need to run off a battery.





























  • Certainly x86 is ubiquitous, but the tablet/netbook cores they use in the Xbox One/PS4 are mediocre in performance at best. I think the driving factor is that a major GPU vendor, AMD, had in-house SoC IP that they could use and integrate. As Nintendo shows, NVidia+ARM, benefiting from all the game engine work on smartphone gaming, is a viable competitor.

    – user71659
    2 hours ago












  • I think Microsoft even named the X-Box in homage to the DirectX API.

    – Brian H
    2 hours ago











  • The 8086 included some design missteps, but it was pretty well designed for a 16-bit processor. No 16-bit processor is going to be able to access large amounts of memory as efficiently as a 32-bit one, but I can't think of any better general design for how to have a 16-bit processor access 1MiB of address space. The 8088 performs poorly, but that's largely because the 8086 was designed to be used in 16-bit systems rather than 8-bit ones. The 80286 performs poorly, but that's because its designers failed to recognize and retain some of the key benefits of the 8086 design.

    – supercat
    2 hours ago











  • Also you should emphasize the fact that the 8088 (which belongs to the x86 family despite the last digit of its part number) was used in arcade machines.

    – supercat
    2 hours ago











  • @supercat Most of my third paragraph deals with the use of the 8088 in arcade games. The 68000 is an example of a much better designed 16-bit CPU. The 8088's performance problems, while made worse by the 8-bit bus, exist in the 8086 too: en.wikipedia.org/wiki/Intel_8086#Performance

    – Ross Ridge
    1 hour ago
































Konix Multisystem: 6 MHz 8086 (1989).



Sure, it was cancelled just before release, but it got amazing press (I remember Jeff Minter raving about it at Earls Court) and some of it lived on as the (68k-based) Atari Jaguar.





























  • Great find, nailed it :))

    – Raffzahn
    1 hour ago











Your Answer








StackExchange.ready(function()
var channelOptions =
tags: "".split(" "),
id: "648"
;
initTagRenderer("".split(" "), "".split(" "), channelOptions);

StackExchange.using("externalEditor", function()
// Have to fire editor after snippets, if snippets enabled
if (StackExchange.settings.snippets.snippetsEnabled)
StackExchange.using("snippets", function()
createEditor();
);

else
createEditor();

);

function createEditor()
StackExchange.prepareEditor(
heartbeatType: 'answer',
autoActivateHeartbeat: false,
convertImagesToLinks: false,
noModals: true,
showLowRepImageUploadWarning: true,
reputationToPostImages: null,
bindNavPrevention: true,
postfix: "",
imageUploader:
brandingHtml: "Powered by u003ca class="icon-imgur-white" href="https://imgur.com/"u003eu003c/au003e",
contentPolicyHtml: "User contributions licensed under u003ca href="https://creativecommons.org/licenses/by-sa/3.0/"u003ecc by-sa 3.0 with attribution requiredu003c/au003e u003ca href="https://stackoverflow.com/legal/content-policy"u003e(content policy)u003c/au003e",
allowUrls: true
,
noCode: true, onDemand: true,
discardSelector: ".discard-answer"
,immediatelyShowMarkdownHelp:true
);



);













draft saved

draft discarded


















StackExchange.ready(
function ()
StackExchange.openid.initPostLogin('.new-post-login', 'https%3a%2f%2fretrocomputing.stackexchange.com%2fquestions%2f9738%2fwhy-werent-discrete-x86-cpus-ever-used-in-game-hardware%23new-answer', 'question_page');

);

Post as a guest















Required, but never shown

























3 Answers
3






active

oldest

votes








3 Answers
3






active

oldest

votes









active

oldest

votes






active

oldest

votes









7














The original 8086 was quickly overshadowed by the Z80, which was somewhat compatible but easier to work with as it required less support hardware. Also many arcade developers preferred the 6502 and derivatives, and then later the 68000 which was easier to work with on both the hardware and software fronts.



Another issue was that the development machines available for testing code were often 68000 based as well. One prime example was the Sharp X68000. A lot of game programmers of that era were self taught too, and for hobbyist home computer systems Z80 and 6502 dominated with very few using 8086.



Finally, the 8086 was much more expensive than the Z80, while offering no real advantages over it unless you were expecting to buy millions and sell a line of compatible computers for years to come.






share|improve this answer


















  • 4





    This would make sense if the CPU in question was the 8080, but the 8086? Yes, it was much more expensive than the Z80, but the Z80 wasn’t compatible with it, and the 8086 was much more capable than the Z80...

    – Stephen Kitt
    7 hours ago






  • 2





    @StephenKitt Not as much you might think for games in the 80's. If you compare a 4 MHz Z80, a 1 MHz 6502 and a 4.77 MHz 8-bit 8088 they all accessed memory at about same speed, which is most of what a game of that era is doing. A 16-bit 8086 could potentially double this speed by accessing two bytes at a time, but only by greatly increasing the cost. If that cost could be justified then it would also justify using a 68000.

    – Ross Ridge
    7 hours ago











  • @Ross ah, right, yes, I was thinking in terms of registers, register sizes, address space, faster arithmetic etc.

    – Stephen Kitt
    7 hours ago











  • @RossRidge: If fulfilling a device's RAM and ROM requirements would require an even number of banks or each, the relative cost advantage of putting them all one one 8-bit bus, versus having half on the the upper 8 bits of a 16-bit bus and half on the lower 8 bits, would be relatively slight. The instruction size vs capability tradeoffs for the 8086 were designed so that code fetching and execution could be asynchronous tasks that would happen at about the same speed. The 8088 effectively cuts prefetch speed in half which means the processor spends most of its time awaiting code fetches.

    – supercat
    1 hour ago











  • @RossRidge THe 8086 did have several essential features that would have made it great for console development, not just it's 16 bit speed advantage (which the 8088 misses for most parts). Most notably here a 1 MiB address range without the need of external helpers. But its 40 pin package made it use a multiplexed AD-bus, needing latches for demultiplexing, so somewhat a draw. Here a (speed up) Z80 with some banking might give the same result at lower cost.

    – Raffzahn
    58 mins ago















7














The original 8086 was quickly overshadowed by the Z80, which was somewhat compatible but easier to work with as it required less support hardware. Also many arcade developers preferred the 6502 and derivatives, and then later the 68000 which was easier to work with on both the hardware and software fronts.



Another issue was that the development machines available for testing code were often 68000 based as well. One prime example was the Sharp X68000. A lot of game programmers of that era were self taught too, and for hobbyist home computer systems Z80 and 6502 dominated with very few using 8086.



Finally, the 8086 was much more expensive than the Z80, while offering no real advantages over it unless you were expecting to buy millions and sell a line of compatible computers for years to come.






share|improve this answer


















  • 4





    This would make sense if the CPU in question was the 8080, but the 8086? Yes, it was much more expensive than the Z80, but the Z80 wasn’t compatible with it, and the 8086 was much more capable than the Z80...

    – Stephen Kitt
    7 hours ago






  • 2





    @StephenKitt Not as much you might think for games in the 80's. If you compare a 4 MHz Z80, a 1 MHz 6502 and a 4.77 MHz 8-bit 8088 they all accessed memory at about same speed, which is most of what a game of that era is doing. A 16-bit 8086 could potentially double this speed by accessing two bytes at a time, but only by greatly increasing the cost. If that cost could be justified then it would also justify using a 68000.

    – Ross Ridge
    7 hours ago











  • @Ross ah, right, yes, I was thinking in terms of registers, register sizes, address space, faster arithmetic etc.

    – Stephen Kitt
    7 hours ago











  • @RossRidge: If fulfilling a device's RAM and ROM requirements would require an even number of banks or each, the relative cost advantage of putting them all one one 8-bit bus, versus having half on the the upper 8 bits of a 16-bit bus and half on the lower 8 bits, would be relatively slight. The instruction size vs capability tradeoffs for the 8086 were designed so that code fetching and execution could be asynchronous tasks that would happen at about the same speed. The 8088 effectively cuts prefetch speed in half which means the processor spends most of its time awaiting code fetches.

    – supercat
    1 hour ago











  • @RossRidge THe 8086 did have several essential features that would have made it great for console development, not just it's 16 bit speed advantage (which the 8088 misses for most parts). Most notably here a 1 MiB address range without the need of external helpers. But its 40 pin package made it use a multiplexed AD-bus, needing latches for demultiplexing, so somewhat a draw. Here a (speed up) Z80 with some banking might give the same result at lower cost.

    – Raffzahn
    58 mins ago













7












7








7







The original 8086 was quickly overshadowed by the Z80, which was somewhat compatible but easier to work with as it required less support hardware. Also many arcade developers preferred the 6502 and derivatives, and then later the 68000 which was easier to work with on both the hardware and software fronts.



Another issue was that the development machines available for testing code were often 68000 based as well. One prime example was the Sharp X68000. A lot of game programmers of that era were self taught too, and for hobbyist home computer systems Z80 and 6502 dominated with very few using 8086.



Finally, the 8086 was much more expensive than the Z80, while offering no real advantages over it unless you were expecting to buy millions and sell a line of compatible computers for years to come.






share|improve this answer













The original 8086 was quickly overshadowed by the Z80, which was somewhat compatible but easier to work with as it required less support hardware. Also many arcade developers preferred the 6502 and derivatives, and then later the 68000 which was easier to work with on both the hardware and software fronts.



Another issue was that the development machines available for testing code were often 68000 based as well. One prime example was the Sharp X68000. A lot of game programmers of that era were self taught too, and for hobbyist home computer systems Z80 and 6502 dominated with very few using 8086.



Finally, the 8086 was much more expensive than the Z80, while offering no real advantages over it unless you were expecting to buy millions and sell a line of compatible computers for years to come.







share|improve this answer












share|improve this answer



share|improve this answer










answered 8 hours ago









useruser

4,255819




4,255819







  • 4





    This would make sense if the CPU in question was the 8080, but the 8086? Yes, it was much more expensive than the Z80, but the Z80 wasn’t compatible with it, and the 8086 was much more capable than the Z80...

    – Stephen Kitt
    7 hours ago






  • 2





    @StephenKitt Not as much you might think for games in the 80's. If you compare a 4 MHz Z80, a 1 MHz 6502 and a 4.77 MHz 8-bit 8088 they all accessed memory at about same speed, which is most of what a game of that era is doing. A 16-bit 8086 could potentially double this speed by accessing two bytes at a time, but only by greatly increasing the cost. If that cost could be justified then it would also justify using a 68000.

    – Ross Ridge
    7 hours ago











  • @Ross ah, right, yes, I was thinking in terms of registers, register sizes, address space, faster arithmetic etc.

    – Stephen Kitt
    7 hours ago











  • @RossRidge: If fulfilling a device's RAM and ROM requirements would require an even number of banks or each, the relative cost advantage of putting them all one one 8-bit bus, versus having half on the the upper 8 bits of a 16-bit bus and half on the lower 8 bits, would be relatively slight. The instruction size vs capability tradeoffs for the 8086 were designed so that code fetching and execution could be asynchronous tasks that would happen at about the same speed. The 8088 effectively cuts prefetch speed in half which means the processor spends most of its time awaiting code fetches.

    – supercat
    1 hour ago











  • @RossRidge THe 8086 did have several essential features that would have made it great for console development, not just it's 16 bit speed advantage (which the 8088 misses for most parts). Most notably here a 1 MiB address range without the need of external helpers. But its 40 pin package made it use a multiplexed AD-bus, needing latches for demultiplexing, so somewhat a draw. Here a (speed up) Z80 with some banking might give the same result at lower cost.

    – Raffzahn
    58 mins ago












  • 4





    This would make sense if the CPU in question was the 8080, but the 8086? Yes, it was much more expensive than the Z80, but the Z80 wasn’t compatible with it, and the 8086 was much more capable than the Z80...

    – Stephen Kitt
    7 hours ago






  • 2





    @StephenKitt Not as much you might think for games in the 80's. If you compare a 4 MHz Z80, a 1 MHz 6502 and a 4.77 MHz 8-bit 8088 they all accessed memory at about same speed, which is most of what a game of that era is doing. A 16-bit 8086 could potentially double this speed by accessing two bytes at a time, but only by greatly increasing the cost. If that cost could be justified then it would also justify using a 68000.

    – Ross Ridge
    7 hours ago











  • @Ross ah, right, yes, I was thinking in terms of registers, register sizes, address space, faster arithmetic etc.

    – Stephen Kitt
    7 hours ago











  • @RossRidge: If fulfilling a device's RAM and ROM requirements would require an even number of banks or each, the relative cost advantage of putting them all one one 8-bit bus, versus having half on the the upper 8 bits of a 16-bit bus and half on the lower 8 bits, would be relatively slight. The instruction size vs capability tradeoffs for the 8086 were designed so that code fetching and execution could be asynchronous tasks that would happen at about the same speed. The 8088 effectively cuts prefetch speed in half which means the processor spends most of its time awaiting code fetches.

    – supercat
    1 hour ago











  • @RossRidge THe 8086 did have several essential features that would have made it great for console development, not just it's 16 bit speed advantage (which the 8088 misses for most parts). Most notably here a 1 MiB address range without the need of external helpers. But its 40 pin package made it use a multiplexed AD-bus, needing latches for demultiplexing, so somewhat a draw. Here a (speed up) Z80 with some banking might give the same result at lower cost.

    – Raffzahn
    58 mins ago







4




4





This would make sense if the CPU in question was the 8080, but the 8086? Yes, it was much more expensive than the Z80, but the Z80 wasn’t compatible with it, and the 8086 was much more capable than the Z80...

– Stephen Kitt
7 hours ago





This would make sense if the CPU in question was the 8080, but the 8086? Yes, it was much more expensive than the Z80, but the Z80 wasn’t compatible with it, and the 8086 was much more capable than the Z80...

– Stephen Kitt
7 hours ago




2




2





@StephenKitt Not as much you might think for games in the 80's. If you compare a 4 MHz Z80, a 1 MHz 6502 and a 4.77 MHz 8-bit 8088 they all accessed memory at about same speed, which is most of what a game of that era is doing. A 16-bit 8086 could potentially double this speed by accessing two bytes at a time, but only by greatly increasing the cost. If that cost could be justified then it would also justify using a 68000.

– Ross Ridge
7 hours ago





@StephenKitt Not as much you might think for games in the 80's. If you compare a 4 MHz Z80, a 1 MHz 6502 and a 4.77 MHz 8-bit 8088 they all accessed memory at about same speed, which is most of what a game of that era is doing. A 16-bit 8086 could potentially double this speed by accessing two bytes at a time, but only by greatly increasing the cost. If that cost could be justified then it would also justify using a 68000.

– Ross Ridge
7 hours ago













@Ross ah, right, yes, I was thinking in terms of registers, register sizes, address space, faster arithmetic etc.

– Stephen Kitt
7 hours ago





@Ross ah, right, yes, I was thinking in terms of registers, register sizes, address space, faster arithmetic etc.

– Stephen Kitt
7 hours ago













@RossRidge: If fulfilling a device's RAM and ROM requirements would require an even number of banks or each, the relative cost advantage of putting them all one one 8-bit bus, versus having half on the the upper 8 bits of a 16-bit bus and half on the lower 8 bits, would be relatively slight. The instruction size vs capability tradeoffs for the 8086 were designed so that code fetching and execution could be asynchronous tasks that would happen at about the same speed. The 8088 effectively cuts prefetch speed in half which means the processor spends most of its time awaiting code fetches.

– supercat
1 hour ago





@RossRidge: If fulfilling a device's RAM and ROM requirements would require an even number of banks or each, the relative cost advantage of putting them all one one 8-bit bus, versus having half on the the upper 8 bits of a 16-bit bus and half on the lower 8 bits, would be relatively slight. The instruction size vs capability tradeoffs for the 8086 were designed so that code fetching and execution could be asynchronous tasks that would happen at about the same speed. The 8088 effectively cuts prefetch speed in half which means the processor spends most of its time awaiting code fetches.

– supercat
1 hour ago













@RossRidge THe 8086 did have several essential features that would have made it great for console development, not just it's 16 bit speed advantage (which the 8088 misses for most parts). Most notably here a 1 MiB address range without the need of external helpers. But its 40 pin package made it use a multiplexed AD-bus, needing latches for demultiplexing, so somewhat a draw. Here a (speed up) Z80 with some banking might give the same result at lower cost.

– Raffzahn
58 mins ago





@RossRidge THe 8086 did have several essential features that would have made it great for console development, not just it's 16 bit speed advantage (which the 8088 misses for most parts). Most notably here a 1 MiB address range without the need of external helpers. But its 40 pin package made it use a multiplexed AD-bus, needing latches for demultiplexing, so somewhat a draw. Here a (speed up) Z80 with some banking might give the same result at lower cost.

– Raffzahn
58 mins ago











4














Video game hardware, whether for home consoles or arcade machines, is designed pretty much from scratch. Hardware designers have pretty much free reign on choosing what CPU to use, basing their choice factors like cost and ease of programming. The Intel 8086, quite frankly, was a poorly designed processor and was never well regarded. While you could argue it made reasonable compromises at the time it was released (1978), these compromises ended up hanging around its neck like an albatross. If IBM hadn't picked the Intel 8088 for its Personal Computer in 1981, you probably wouldn't be asking this question.



We take the x86 architecture for granted today, but before the IBM PC it was fairly obscure and afterwords widely ridiculed. In particular, it compared poorly with the Motorola 68000 which had a flat 24-bit address space, more orthogonal instruction set and 16 32-bit registers. The 8086 used a segmented 20-bit address space, placed more restrictions on how various registers could be used, and only had 8 16-bit registers. It also wasn't particularly cheap, though the 8088 with its 8-bit data bus helped reduce overall costs compared to the 8086.



During the 70's and first half of the 80's, 16-bit CPUs like the 8086 and 68000 weren't really much of a consideration. The games of this era didn't demand anything more powerful than an 8-bit Z80 or 6502. While there were Gottlieb/Mylstar arcade games like Q*Bert in the early 80's that used a 5 Mhz 8088 CPU, it's not clear what advantage this gave the machines. Performance in games of this era was mostly limited by how fast the CPU could access memory. Because of how the 8086/8 was designed, this made the 8088 effectively about as fast as a Z80 or 6052. These Gottlieb/Mystar arcade games also only had 64k (16-bit) memory maps, so didn't benefit the 8088's 20-bit address space.



Starting around the mid-80's, games had started moving beyond the capabilities of 8-bit CPUs. While the dominance of the IBM PC in the personal computer market at this point would've meant there would be programmers out there familiar with the 8086, there would've been few people singing its praises. By and large 68000 CPUs were chosen for new arcade game hardware designs that needed more power than 8-bit CPUs offered. Console hardware being more cost sensitive stuck with 8-bit CPUs for the rest of the decade, though most of the next generation went with 16-bit CPUs, either the 68000 or 65816. It's also worth mentioning that the two major new home computer designs of the mid-80's, the Commodore Amiga and Atari ST, also went with the 68000.



While arguably the 80386, introduced in 1985, solved a lot of the 8086's problems, with a more orthogonal instruction set, 32-bit flat address space and 32-bit registers, it wasn't until the early 90's that games started demanding the level of performance it offered, and when its price would dropped to make it competitive in new hardware designs. It's not entirely clear to me why it didn't attract more interest at this point. The early 90's was also about the time that the IBM PC became the premier platform for home computer gaming. It would've inherited the disdain its predecessors had, but there were some arcade boards designed in the early 90's that use the 8086-compatible NEC V30 type CPUs. I think the main factor against at the time was back then RISC-based architectures were considered the future, while CISC-based architectures like the x86 and 68k considered obsolete. Still that didn't stop Sega from using the CISC-based NEC V60 CPU its arcade hardware designs in the early 90's.



For the rest of the 90's though, RISC-based CPUs like the Hitachi SH and IBM PowerPC, dominated arcade hardware designs, at least at the high performance end. At the lower performance end, cheaper 68k and NEC V30 based designs were still in use. In the home console market the 5th generation was almost all RISC CPUs, though notably the Japan-only FM Towns Marty used an AMD 386SX CPU. For the most part, this situation continues to around the turn of the century, with both arcade games and the 6th generation of consoles.



A big exception is Microsoft's Xbox. A 6th generation console, released in 2001, it has an Intel Pentium III CPU, much like PCs of the time. It's not surprising that Microsoft, with its long experience using the x86 CPU made this design choice, but its only a few years after this that mainstream Intel and AMD CPUs start appearing in arcade hardware. Although these x86-based arcade machines aren't really new hardware designs, they're PC clones running Windows or Linux. The 7th generation of home consoles went exclusively with PowerPC CPUs, but I suspect this had more do with the prices IBM was offering rather than the relative technical merits of the CPUs. Arcade games went increasingly with PC clone based hardware.



Today the choice of CPU in current game hardware designs is unremarkable. Home consoles and arcade games use x86 CPUs just like our personal computers do. Handheld consoles use ARM CPUs just like our phones do.



So, in the early days of game hardware design, x86 CPUs weren't chosen simply because there wasn't good reason to use one except for IBM PC compatibility. Later 32-bit x86 CPUs solved a lot the architecture's problems but RISC CPUs were seen as more modern. Today the ubiquity x86 architecture combined with its unrivalled speed has turned it into the dominant CPU architecture for game hardware that doesn't need to run off a battery.






share|improve this answer























  • Certainly x86 is ubiquitous, but the tablet/netbook cores they use in the Xbox One/PS4 are mediocre in performance at best. I think the driving factor is that a major GPU vendor, AMD, had in house SoC IP that they could use and integrate. As Nintendo shows, NVidia+ARM, benefiting from all the game engine work on smartphone gaming, is a viable competitor.

    – user71659
    2 hours ago












  • I think Microsoft even named the X-Box in homage to the DirectX API.

    – Brian H
    2 hours ago











  • The 8086 included some design missteps, but it was pretty well designed for a 16-bit processor. No 16-bit processor is going to be able to access large amounts of memory as efficiently as a 32-bit one, but I can't think of any better general design for how to have a 16-bit processor access 1MiB of address space. The 8088 performs poorly, but that's largely because the 8086 was designed to be used in 16-bit systems rather than 8-bit ones. The 80286 performs poorly, but that's because its designers failed to recognize and retain some of the key benefits of the 8086 design.

    – supercat
    2 hours ago











  • Also you should emphasize the fact that the 8088 (which belongs to the x86 family despite the last digit of its part number) was used in arcade machines.

    – supercat
    2 hours ago











  • @supercat Most of the my third paragraph deals with the use of the 8088 in arcade games. The 68000 is an example of a much better designed 16-bit CPU. The 8088 performance problems, while made worse because of the 8-bit bus, exist in the 8086:. en.wikipedia.org/wiki/Intel_8086#Performance

    – Ross Ridge
    1 hour ago















4














Video game hardware, whether for home consoles or arcade machines, is designed pretty much from scratch. Hardware designers have pretty much free reign on choosing what CPU to use, basing their choice factors like cost and ease of programming. The Intel 8086, quite frankly, was a poorly designed processor and was never well regarded. While you could argue it made reasonable compromises at the time it was released (1978), these compromises ended up hanging around its neck like an albatross. If IBM hadn't picked the Intel 8088 for its Personal Computer in 1981, you probably wouldn't be asking this question.



We take the x86 architecture for granted today, but before the IBM PC it was fairly obscure and afterwords widely ridiculed. In particular, it compared poorly with the Motorola 68000 which had a flat 24-bit address space, more orthogonal instruction set and 16 32-bit registers. The 8086 used a segmented 20-bit address space, placed more restrictions on how various registers could be used, and only had 8 16-bit registers. It also wasn't particularly cheap, though the 8088 with its 8-bit data bus helped reduce overall costs compared to the 8086.



During the 70's and first half of the 80's, 16-bit CPUs like the 8086 and 68000 weren't really much of a consideration. The games of this era didn't demand anything more powerful than an 8-bit Z80 or 6502. While there were Gottlieb/Mylstar arcade games like Q*Bert in the early 80's that used a 5 Mhz 8088 CPU, it's not clear what advantage this gave the machines. Performance in games of this era was mostly limited by how fast the CPU could access memory. Because of how the 8086/8 was designed, this made the 8088 effectively about as fast as a Z80 or 6052. These Gottlieb/Mystar arcade games also only had 64k (16-bit) memory maps, so didn't benefit the 8088's 20-bit address space.



Starting around the mid-80's, games had started moving beyond the capabilities of 8-bit CPUs. While the dominance of the IBM PC in the personal computer market at this point would've meant there would be programmers out there familiar with the 8086, there would've been few people singing its praises. By and large 68000 CPUs were chosen for new arcade game hardware designs that needed more power than 8-bit CPUs offered. Console hardware being more cost sensitive stuck with 8-bit CPUs for the rest of the decade, though most of the next generation went with 16-bit CPUs, either the 68000 or 65816. It's also worth mentioning that the two major new home computer designs of the mid-80's, the Commodore Amiga and Atari ST, also went with the 68000.



While arguably the 80386, introduced in 1985, solved a lot of the 8086's problems, with a more orthogonal instruction set, 32-bit flat address space and 32-bit registers, it wasn't until the early 90's that games started demanding the level of performance it offered, and when its price would dropped to make it competitive in new hardware designs. It's not entirely clear to me why it didn't attract more interest at this point. The early 90's was also about the time that the IBM PC became the premier platform for home computer gaming. It would've inherited the disdain its predecessors had, but there were some arcade boards designed in the early 90's that use the 8086-compatible NEC V30 type CPUs. I think the main factor against at the time was back then RISC-based architectures were considered the future, while CISC-based architectures like the x86 and 68k considered obsolete. Still that didn't stop Sega from using the CISC-based NEC V60 CPU its arcade hardware designs in the early 90's.



For the rest of the 90's though, RISC-based CPUs like the Hitachi SH and IBM PowerPC, dominated arcade hardware designs, at least at the high performance end. At the lower performance end, cheaper 68k and NEC V30 based designs were still in use. In the home console market the 5th generation was almost all RISC CPUs, though notably the Japan-only FM Towns Marty used an AMD 386SX CPU. For the most part, this situation continues to around the turn of the century, with both arcade games and the 6th generation of consoles.



A big exception is Microsoft's Xbox. A 6th generation console, released in 2001, it has an Intel Pentium III CPU, much like PCs of the time. It's not surprising that Microsoft, with its long experience using the x86 CPU made this design choice, but its only a few years after this that mainstream Intel and AMD CPUs start appearing in arcade hardware. Although these x86-based arcade machines aren't really new hardware designs, they're PC clones running Windows or Linux. The 7th generation of home consoles went exclusively with PowerPC CPUs, but I suspect this had more do with the prices IBM was offering rather than the relative technical merits of the CPUs. Arcade games went increasingly with PC clone based hardware.



Today the choice of CPU in current game hardware designs is unremarkable. Home consoles and arcade games use x86 CPUs just like our personal computers do. Handheld consoles use ARM CPUs just like our phones do.



So, in the early days of game hardware design, x86 CPUs weren't chosen simply because there wasn't good reason to use one except for IBM PC compatibility. Later 32-bit x86 CPUs solved a lot the architecture's problems but RISC CPUs were seen as more modern. Today the ubiquity x86 architecture combined with its unrivalled speed has turned it into the dominant CPU architecture for game hardware that doesn't need to run off a battery.






share|improve this answer























  • Certainly x86 is ubiquitous, but the tablet/netbook cores they use in the Xbox One/PS4 are mediocre in performance at best. I think the driving factor is that a major GPU vendor, AMD, had in house SoC IP that they could use and integrate. As Nintendo shows, NVidia+ARM, benefiting from all the game engine work on smartphone gaming, is a viable competitor.

    – user71659
    2 hours ago












  • I think Microsoft even named the X-Box in homage to the DirectX API.

    – Brian H
    2 hours ago











  • The 8086 included some design missteps, but it was pretty well designed for a 16-bit processor. No 16-bit processor is going to be able to access large amounts of memory as efficiently as a 32-bit one, but I can't think of any better general design for how to have a 16-bit processor access 1MiB of address space. The 8088 performs poorly, but that's largely because the 8086 was designed to be used in 16-bit systems rather than 8-bit ones. The 80286 performs poorly, but that's because its designers failed to recognize and retain some of the key benefits of the 8086 design.

    – supercat
    2 hours ago











  • Also you should emphasize the fact that the 8088 (which belongs to the x86 family despite the last digit of its part number) was used in arcade machines.

    – supercat
    2 hours ago











  • @supercat Most of the my third paragraph deals with the use of the 8088 in arcade games. The 68000 is an example of a much better designed 16-bit CPU. The 8088 performance problems, while made worse because of the 8-bit bus, exist in the 8086:. en.wikipedia.org/wiki/Intel_8086#Performance

    – Ross Ridge
    1 hour ago














Konix Multisystem: 6 MHz 8086 (1989).



Sure, it was cancelled just before release, but it got amazing press (I remember Jeff Minter raving about it at Earls Court) and some of it lived on as the (68k based) Atari Jaguar.






answered 2 hours ago by scruss

  • Great find, nailed it :))

    – Raffzahn
    1 hour ago














