Reduced Instruction Set Computers (RISC) vs. Complex Instruction Set Computers (CISC)
Computers are everywhere, even in places most people wouldn't expect to find a “computer” as they'd know it. Servers, desktops, laptops, tablets, and cellphones are some of the most commonly recognized computers, but computers don't have to be complex or powerful in the ways we're used to. Other common computers are things you use every day and never think about: TVs, wireless routers, dishwashers, ovens, and even vehicles, which use computers called ECUs (Engine Control Units). Computers range from the world-class supercomputers in data centers to the cheapest toy drone your children play with. These differences in purpose result in a wide range of needs, and no single chip type best fits every scenario. Nobody wants a phone whose battery lasts one hour, and nobody wants a server that can't handle its intended workload, so we end up with different types of chips. Although more categories exist than the two listed here (e.g., the SIMD designs used in GPUs), we tend to classify most CPUs into one of two groups: RISC or CISC. Let's take a look at how they differ.
Pros of CISC:
Cons of CISC:
It's not all bad for CISC. Many people would be surprised at what a “complex” instruction actually is. Do you use encryption, or maybe compression? These are two very common uses of complex instructions. It's the difference between using a basic four-function calculator (the RISC-like example), where it takes several operations to reach an answer, and using a TI-84 graphing calculator (the CISC-like example) with a built-in function that accepts the variables and spits out the answer in one operation. Realistically, though, the average individual's common usage means that 90% of the time their computer isn't executing complex instructions at all.
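To make the calculator analogy concrete, here's a minimal sketch of my own (not from any vendor's documentation) using Intel's AES-NI intrinsics in C. It assumes an x86 CPU with AES-NI and a compiler flag like gcc -maes. The single AESENC instruction performs an entire AES encryption round (SubBytes, ShiftRows, MixColumns, and AddRoundKey), work that a chip without such an instruction would spread across dozens of simple loads, shifts, and table lookups.

```c
/* Sketch: one "complex" CISC instruction doing the work of many simple ones.
 * Requires an x86 CPU with AES-NI; build with: gcc -maes aesdemo.c */
#include <wmmintrin.h> /* AES-NI intrinsics */
#include <stdio.h>

int main(void) {
    __m128i state    = _mm_set1_epi32(0x11223344); /* 128-bit data block     */
    __m128i roundkey = _mm_set1_epi32(0x55667788); /* one expanded round key */

    /* One instruction = one full AES round. Without AES-NI this would be
     * dozens of table lookups, shifts, and XORs in a loop. */
    state = _mm_aesenc_si128(state, roundkey);

    unsigned long long out[2];
    _mm_storeu_si128((__m128i *)out, state);
    printf("block after one AESENC round: %016llx%016llx\n", out[1], out[0]);
    return 0;
}
```

Compression and checksum code leans on similar single-instruction helpers (CRC32, carry-less multiply) in much the same way.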
Pros of RISC:
Cons of RISC:
It's not all bad for RISC either. It's nice to have a computer in your pocket with all-day battery life that isn't burning your leg. We also know you don't use a tablet or a phone for heavy-duty computing needs, so the additional functionality is rarely needed in these mobile devices. Most people wouldn't even realize they're missing any features or instructions, considering they'd probably never use them for everyday activities like browsing Facebook, watching videos, or taking photos.
What is the difference between a Xeon and a Core i7? It's crazy how often I ask this question and get a bad answer. In many interviews I've heard people try to confidently fake their way to a job. When I ask someone how they'd build their own server, money being no object, it's extremely common to hear they'd choose a Core i7 and add a gaming GPU. When I ask why they didn't choose a Xeon for their server, I get blank stares, and a common reply is “I don't like AMD stuff.” That answer immediately shows the person has no clue what they're talking about, since Xeon is an Intel brand. I'm not against adding GPUs to servers; they can easily serve a purpose there, but I rarely get a good answer as to why they want one in a SERVER. Some answers have been great, such as Folding@home, remotely building 3D models, or compile workloads that can use GPUs. Others have been terrible, claiming servers have more cores and are therefore way better for gaming, which is absolutely false. Understanding the differences between these grades is crucial.
Then you want a Consumer Grade CPU or computer. A perfect example: I could buy a Core i7-10700 for $275, or I could buy a much slower Xeon Silver for over $500. It doesn't stop there. If you buy the more expensive CPU, you're going to pay for a more expensive motherboard to pair with it, as well as more expensive RAM, and the trend of costing 2x or more continues. Consumer Grade equipment is really meant to serve a limited number of users (usually one).
Then you want Workstation or Server Grade components. Workstation components could get by with Consumer Grade amounts of RAM, but it all depends on the use case. If you need multiple GPUs for 3D model rendering (Pixar anyone? game devs? etc.), you need a CPU that offers enough PCIe lanes to support that many GPUs, and Workstation and Server Grade CPUs offer more PCIe lanes than Consumer Grade ones. Depending on how large your jobs are, you may require more than 32 GB of RAM. These jobs are heavily threaded, so while single-thread performance is helpful, it's not king: more cores and threads mean more work done in less time. A perfect example: I need to move 50 people. I could take a sports car and move them two at a time a little faster per trip, or I could take a Greyhound bus and move all 50 at once, just slower. The sketch below puts rough numbers on that trade-off.
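Here's a tiny C sketch of the bus-versus-sports-car math. The seat counts and trip times are invented purely for illustration; the point is that throughput (capacity per trip) beats raw speed once the job is big enough.

```c
/* Throughput vs. speed: moving 50 "people" (units of work).
 * All numbers are made up for illustration only. */
#include <stdio.h>

int main(void) {
    int people = 50;
    int car_seats = 2,  car_trip_min = 10; /* fast core: low capacity   */
    int bus_seats = 50, bus_trip_min = 25; /* slow cores: high capacity */

    int car_trips = (people + car_seats - 1) / car_seats; /* round up */
    int bus_trips = (people + bus_seats - 1) / bus_seats;

    printf("sports car: %2d trips x %d min = %3d min total\n",
           car_trips, car_trip_min, car_trips * car_trip_min);
    printf("bus:        %2d trips x %d min = %3d min total\n",
           bus_trips, bus_trip_min, bus_trips * bus_trip_min);
    return 0;
}
```

With these made-up numbers the car needs 25 trips (250 minutes) while the bus finishes in one trip (25 minutes), which is exactly why heavily threaded workloads favor more cores over slightly faster ones.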
Workstation and Server Grade doesn't always mean more CPUs, cores, threads, or GPUs. Sometimes it just means more reliability: 24/7 uptime and error correction.
Full Disclosure: I don't own stock in either of the companies listed, and I do not stand to gain anything by supporting one or the other.
Firstly, I want to say: go with whoever your research tells you to go with. You don't have to go by my page alone. There will always be things Intel is better at, as well as things AMD is better at; decide from there based on your use case. AMD tends to have longer upgrade paths, keeping a CPU socket around longer and allowing for easier upgrades. AMD also allows unbuffered ECC memory to be used with its chips, so long as the motherboard manufacturer decided to enable it. AMD chips can seem to blur the line between consumer and workstation grade. By allowing the use of unbuffered ECC memory and offering consumer/workstation-grade chips of up to 64 cores and 128 threads, AMD has remained fairly flexible about its chips and sockets and given the customer more options. Due to their larger core counts, high demand, and inability to keep stock, AMD has become the more expensive option if you only look at CPUs. Be aware that Intel motherboards are notoriously more expensive, but there are always budget options for both brands. Intel CPUs being cheaper could save you money if all you want is a gaming machine and you aren't really security-minded (read the history below). Let's face reality: hackers usually aren't gunning for individuals, but for larger corporate targets. Competition is good here, as it reduces prices and makes CPUs more affordable.
For the longest time the question has been: do I go with Intel or AMD? Intel is the original x86 CISC processor vendor, and many old-school users know Intel was first, the original, and choose Intel based on that alone. Intel and AMD have duked it out in performance battles for years. Early on, during the Pentium days, it was a battle of trading blows. AMD was very innovative, competitive, and usually cheaper, but eventually AMD started to lose out. In 2011 AMD released its FX line of processors to compete against Intel's new Core line. Both were each company's top-tier processors, but AMD's had a flawed design in which multiple cores shared FPUs (Floating Point Units). This greatly reduced performance, as each core was not truly independent of the others. AMD had other innovative wins, but the performance crown was no longer within reach at this time. Intel also had the benefit of Hyper-Threading (its implementation of SMT, Simultaneous Multi-Threading). Until then, AMD had consistently offered better multi-threaded performance and, if overclocked, comparable single-threaded performance for less money. But Intel's new Core series processors were much better in single-threaded performance than AMD's FX line, and the FX line's multi-threaded performance didn't win by much even though Intel had half the number of cores (best AMD = 8 cores/8 threads; best Intel = 4 cores/8 threads). For several years AMD seemed to lose every performance battle and was considered the budget option.
To make matters worse, Intel was no stranger to unlawful exclusionary practices, making deals to block competitors and purchase the competition. Intel and AMD had a lawsuit in 1991 (partially dismissed due to the statute of limitations) which was settled out of court in 1995. Intel was attempting to become a monopoly, and in 1998 the FTC halted its acquisitions of other companies to prevent this. Around that time, the FTC filed another antitrust complaint against Intel for threatening to stop selling microprocessors to companies as leverage, pushing them to drop legal actions over Intel's abuse of the microprocessor patents it held. In 1999 a settlement was reached in the 1998 antitrust case, though the agreement was not an “admission of guilt.” In 2004 Japan claimed Intel violated antitrust laws again; Intel refuted the findings but agreed to change its business practices to appease regulators. Later the same year AMD filed another antitrust lawsuit for anti-competitive behavior, suing Intel for $50M in damages. In 2006 AMD filed yet another antitrust complaint, this time in Germany, against Intel for making a deal that blocked a retailer from selling any computers based on AMD processors. During this time AMD had just come to an agreement to purchase ATI Graphics. That move was a gamble and nearly bankrupted AMD, since all the anti-competitive behavior was blocking the sale of its products. In 2007 Intel was charged by the European Commission for paying or offering rebates to manufacturers to delay or cancel AMD products, as well as for blocking the sale of AMD products. In 2008 Intel was again hit with antitrust charges by the EU Commission, once more for aiming to exclude competitors (AMD) from the market. In 2009 Intel was found guilty of the EU Commission's complaints, and the US wasn't far behind in reaching the same conclusion later that year. Intel and AMD agreed to a settlement that resulted in AMD being paid $1.25B and acquiring a cross-licensing deal.
By this time, AMD had been missing out on a lot of money for R&D. It was struggling to stay competitive and was always a year behind. $1.25B was nowhere near the amount of money lost during that time, and AMD already had a $2B loan to pay down from buying ATI. ATI would end up being a successful gamble for AMD, but its processor division was hurting. The FX line proved lacking, and AMD had to sell the processors cheap to move product. They were effectively a good deal money-wise, but they weren't going to win any performance awards. AMD had also made literal gambling on its processors worthwhile: since the processors were not locked and CPUs were mostly binned from the same dies, you could buy a 6-core processor and unlock it to 8 cores, if the other two cores were healthy, all from within the BIOS. This allowed AMD to keep earning, but the margins weren't great. AMD struggled like this for 6+ years. Then, in 2017, AMD released a product that breathed new life into the company, and the Zen architecture was born.
AMD's new Zen architecture was amazing. It was a completely new micro-architecture that included SMT (Simultaneous Multi-Threading) to compete against Intel's HT (Hyper-Threading). Intel had been the reigning performance champion for several years, uncontested, and without competition there's usually no real reason to struggle or improve your product. While Intel did improve over the years, the improvements from year to year weren't exactly amazing. In many cases, new-generation CPUs didn't have much to entice people to spend the extra money on a new machine; the differences between Broadwell and Skylake, for example, didn't seem big enough to matter. Generational differences became less perceivable during this time. AMD had just released a new series of CPUs where you got double the CPU for a third less money compared to Intel. Intel had just released Kaby Lake 4-core/8-thread CPUs; it wasn't long before AMD released 8-core/16-thread CPUs for less money, and the single-threaded performance difference wasn't far off. Intel was still winning on single-threaded performance, but not for long.
Less than a year after Zen 1 (Ryzen 1XXX, 14nm) released, an Intel CPU security flaw was discovered. Meltdown was found to allow programs to access other places in memory. While that doesn't sound scary at first, the actual result was a nightmare: passwords, credit card numbers, any of your secrets could be picked out of memory and passed on to an attacker. Nothing was isolated and nothing was safe. This flaw was the result of Intel's chip designers chasing performance and neglecting security, a hardware design that couldn't easily be fixed. Software patching to forgo the unsafe performance tricks typically cost 5-30% performance, with an average loss of over 12%. After that, AMD became more competitive, as Intel's single-threaded performance was only about 8-10% ahead of AMD's. That's a lot to gamers, but not much to the rest of the world.

It didn't stop there. Spectre was discovered next, attacking ALL CPUs based on speculative execution. ARM and AMD lost as little as 1-3% performance from the patching, but Intel was not so lucky, losing ~15% performance from a patch that wasn't considered foolproof; the guaranteed fix was to disable Intel's Hyper-Threading, resulting in a 20+% loss of performance. AMD then released Zen+ (the 2XXX series, 12nm), which improved upon the original Zen with a boost of ~10% performance. The only winning feature Intel had left was gaming, with the ability to output more FPS.

Then MDS/ZombieLoad was discovered, once again affecting Intel and AMD disproportionately. RIDL and Fallout (more MDS attacks) followed, hardware design flaws affecting Intel chips that allowed cross-application and cross-VM data theft. Patching for AMD (which wasn't affected by all variants) cost ~3% performance, but Intel once again lost ~16% performance to patch all the flaws. Then SWAPGS was discovered, affecting only newer Intel chips on Windows machines, and LVI was discovered to attack and leak data from Intel's SGX enclaves. LVI required a hardware redesign to fix! LVI wasn't the only new flaw to attack Intel SGX, either: Foreshadow (L1TF) and Snoop were discovered shortly after, reaching into VMs, hypervisors, etc. Spectre wasn't done either, as a Spectre v2 variant was found to affect all chips with SMT/HT, and the performance loss once again hit Intel disproportionately. Intel's performance was hurt so badly that in some situations the patch penalty was a 50% performance loss. That patch ended up disabled by default because the cost was so severe; it was cheaper and more reliable to just disable Hyper-Threading on Intel CPUs, which still meant a massive performance loss. This wasn't the end of Intel's disastrous situation. Many more flaws have been found and patched over time. You can read more on them here.
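On Linux you don't have to take anyone's word for which of these flaws affect your machine: the kernel reports its findings, per flaw, under /sys/devices/system/cpu/vulnerabilities/. Here's a small C sketch of mine that simply dumps that directory; it assumes a reasonably recent kernel.

```c
/* Print the kernel's per-flaw vulnerability/mitigation status (Linux only). */
#include <dirent.h>
#include <stdio.h>

int main(void) {
    const char *dirpath = "/sys/devices/system/cpu/vulnerabilities";
    DIR *dir = opendir(dirpath);
    if (!dir) { perror(dirpath); return 1; }

    struct dirent *entry;
    while ((entry = readdir(dir)) != NULL) {
        if (entry->d_name[0] == '.') continue; /* skip "." and ".." */
        char path[512], line[256];
        snprintf(path, sizeof path, "%s/%s", dirpath, entry->d_name);
        FILE *f = fopen(path, "r");
        if (!f) continue;
        if (fgets(line, sizeof line, f))
            printf("%-20s %s", entry->d_name, line);
        fclose(f);
    }
    closedir(dir);
    return 0;
}
```

Each line reads something like “meltdown: Mitigation: PTI” or “spectre_v2: Not affected,” depending on the CPU and kernel version.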
While Intel was performing damage control, AMD released Zen 2 (Ryzen 3XXX, 7nm) in 2019. At this point AMD CPUs had better performance than Intel CPUs the vast majority of the time. The security patching really hurt Intel's performance, to the point where it just couldn't compete anymore on price/performance. Intel had to lower prices and change its marketing campaign to “real world performance” benchmarks, because it lost nearly all synthetic benchmarks unless it compared processors from different weight classes to stack the odds in its favor. You could find benchmarks with security mitigations disabled that made Intel look like the performance KING, but many of those get called out when people do their research. Intel was losing in power consumption as well as performance. For the first time ever, AMD took the performance-per-watt crown, and Ryzen laptops, including gaming laptops, were becoming standard options. AMD's stock recovered and grew at a rate previously considered unbelievable. The company has been able to put more into R&D and hire programmers to supply better drivers, a known weak spot in its GPU market. It seems everything is on the up and up. You can learn about CPU performance benchmarks at Phoronix.com.
CPUs were originally limited in function, supporting only the basic compute tasks we know them for today. As time has moved forward, more and more features and hardware have moved into the CPU. We tend to refer to chips that contain multiple jobs as “SoCs,” or Systems on Chips. One example is graphics: several CPUs, from both Intel and AMD, offer an iGPU (integrated GPU). While they don't compare in performance to dedicated GPUs (dGPUs), not everyone needs the horsepower of a dGPU, and now a single chip offers both CPU and GPU to the user. Let's take this a step further. AMD offers CPUs with a TPM (Trusted Platform Module) in the CPU for trusted certs and encryption uses. Now we have CPU, GPU, and TPM in a single chip. The TPM may not sound impressive, but once you know how it was implemented, it becomes very interesting. The TPM (referred to as fTPM in the UEFI BIOS) is actually a smaller ARM RISC CPU with TrustZone technology, and the implementation is called the Platform Security Processor (PSP for short). So we're talking about a separate RISC CPU inside a CISC CPU! In addition to having a CPU, GPU, and RISC CPU, you also have an I/O die for basic input/output such as SATA and USB connectivity, plus a slew of sensors!
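As a quick way to see whether your own system exposes a TPM (a discrete chip or a firmware fTPM like the PSP-backed one described above), here's a short Linux-only C sketch. The device node and sysfs paths are the standard kernel ones, though the version file only exists on newer kernels.

```c
/* Check whether the Linux kernel has registered a TPM (discrete or fTPM). */
#include <stdio.h>
#include <unistd.h>

int main(void) {
    /* Any TPM the kernel recognizes shows up as this character device. */
    if (access("/dev/tpm0", F_OK) == 0)
        printf("TPM device found at /dev/tpm0\n");
    else
        printf("No TPM device node (none present, or fTPM disabled in the UEFI BIOS)\n");

    /* On newer kernels, sysfs also reports the TPM spec major version. */
    FILE *f = fopen("/sys/class/tpm/tpm0/tpm_version_major", "r");
    if (f) {
        char v[8];
        if (fgets(v, sizeof v, f))
            printf("TPM major version: %s", v);
        fclose(f);
    }
    return 0;
}
```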
There are several different methods of connecting a CPU into a full system:
You have the older PGA (Pin-Grid Array), which has pins on the bottom of the CPU that fit into a socket with pin holes.
You also have the newer LGA (Land-Grid Array), which has pads/contacts on the bottom of the CPU, requiring spring-like pins in the socket to touch the individual pads.
Then you have BGA (Ball-Grid Array), which is used in smaller and mobile electronics. It's effectively a method of soldering the CPU directly to the board: small balls of solder on the pads of the chip are melted onto matching pads on the board.
Don't be fooled into thinking CPU and socket choice is as simple as choosing between three grid arrays. As CPUs have advanced, the number of pins, pads, etc. has increased. If that isn't enough, feature changes have also resulted in entire socket changes, both to support newer hardware and to lock out unsupported hardware. Pin counts also aren't easy to distinguish by sight alone. You need to pay very close attention to the CPU you wish to buy and the corresponding socket it requires.