Sleigh bells ring
Are you listening
In the lane
Snow is glistening
A beautiful sight
We’re happy tonight
Waiting for our Xbox One X
Oh, and what a glorious day tomorrow will be! At work, staring at the phone, wondering when reception will call to advise there is a package for you. At home, a knock on the door from Santa’s delivery men. If they come through the chimney, it’s probably not the delivery of your XB X… and you should probably call the police. The cool kids are getting their toys. Some of us are staring with jealous eyes, whilst we wait for the next batch of XB Xs to arrive.
Microsoft’s whole marketing campaign was built around the power of the X. They made sure everybody understood that this is the most powerful console yet. I find it a bit ironic that, in the end, the only thing gamers seem to be concerned about is the 4K rendering. I’ve seen way too many people advise others NOT to get the X unless they have a 4K TV, because the X would supposedly be useless without one. This simply isn’t true, and I feel like pulling my hair out every time I see somebody say it. Yes, the experience will be much richer on a 4K TV. Yes, with a 4K TV you will get the “full experience”. No, I am not saying don’t go buy that 4K TV. I am simply saying that the X will NOT be “useless” on your current 1080p TV.
Want to know why? Let’s dive a little deeper into the hardware of this machine. I want to help people understand the spec side of things. Life is confusing enough, so when people throw words like teraflops of what-what at you and you have no idea what they mean, of course your focus will drift to what you think you understand. Pretty, 4K imagery.
CPU – Custom @ 2.3GHz, 8 cores
The Xbox One X features 8 cores based on the AMD Jaguar microarchitecture. Sounds similar to the Xbox One, doesn’t it? The CPU technology has been customized by Microsoft to the point that they no longer refer to it as Jaguar architecture, but rather as a “custom” CPU. They achieved a frequency bump from 1.75GHz to 2.3GHz, increasing performance by roughly 31%.
Now we know the CPU performance has been increased by 31%. How about the 8 cores they speak of? This just means that the CPU has 8 cores, and each core is capable of running one task independently of the others. So basically, it can run 8 tasks simultaneously. To understand it better, let’s compare this to a real-world scenario. Say there is an office with 8 clerks. When a job comes in, one clerk (core) can tackle it, whilst the rest sit idle, saving energy. When two jobs come in, instead of splitting one clerk (core) between two jobs, or making the second job wait until the first is done, a second clerk (core) can jump in and handle it, leaving the first clerk in peace to complete the original job. So, when multiple jobs come in, multiple clerks can handle them simultaneously. In short, more cores help when the workload is high.
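The clerk analogy can be sketched in a few lines of Python. The job itself is a made-up placeholder, and the threads here only illustrate the scheduling idea; it’s the CPU’s 8 physical cores that let 8 jobs genuinely run at the same time.

```python
from concurrent.futures import ThreadPoolExecutor

def handle_job(job_id):
    # One "clerk" takes one job and works it to completion.
    return f"job {job_id} done"

# An office of 8 clerks: up to 8 jobs can be in progress at once.
with ThreadPoolExecutor(max_workers=8) as office:
    results = list(office.map(handle_job, range(8)))

print(results)  # all 8 jobs handled, none had to queue behind another
```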
Getting into the GHz is another story. The clock speed determines how many cycles the processor can complete per second, so it has a significant effect on your processor’s real-world speed. Generally, only the internal clock gets a mention when it comes to marketing. In this case, 2.3GHz from Microsoft. The higher the GHz, the better. It’s a lot more complicated than that, but in essence a faster clock decreases the time the CPU spends on a single frame, meaning a higher framerate.
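A quick back-of-the-envelope sketch of that last point: if a frame’s CPU work takes a fixed number of cycles, a faster clock finishes that work sooner. The cycle count below is invented purely for illustration.

```python
CYCLES_PER_FRAME = 25_000_000  # hypothetical CPU work per frame

def frame_time_ms(clock_ghz):
    # Time = work / speed: same cycles, divided by cycles per second.
    cycles_per_second = clock_ghz * 1e9
    return CYCLES_PER_FRAME / cycles_per_second * 1000

print(round(frame_time_ms(1.75), 1))  # original Xbox One clock → 14.3
print(round(frame_time_ms(2.3), 1))   # Xbox One X clock → 10.9
```

Same workload, roughly a third more frames per second out of the clock bump alone.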
GPU – Custom @ 1.172GHz, 40 CUs, Polaris features, 6.0 TFLOPS
The Xbox One X boasts the most powerful GPU in any console, featuring an AMD custom GPU. It offers 40 compute units (CUs) clocked at 1172MHz, 2560 stream processors and 32 ROPs. This goes over most heads. Let’s see if we can break it down without getting too technical. Try being the operative word here.
Cores are what process data inside a processor. The more cores you have working together, the faster a computer will perform tasks. CPU cores were designed for serial tasks like productivity applications, while GPUs were designed for more parallel and graphics-intensive tasks like video editing, gaming and rich web browsing. AMD’s Heterogeneous System Architecture (HSA) bridges the gap between CPU and GPU cores and delivers an innovation called compute cores. This technology allows CPU and GPU cores to speak the same language, share workloads and share the same memory. With HSA, CPU and GPU cores are designed to work together in a single accelerated processing unit (APU), creating a faster, more efficient and seamless way to accelerate applications.
Compute core: Any core capable of running at least one process in its own context and virtual memory space, independently from other cores.
The term “compute core” consistently and transparently describes the total number of cores available to do work or run applications on these next-gen accelerated processors. So back to our office analogy. Here you have 40 workers able to handle tasks independently! Productivity just shot through the roof!
Stream processors are one of the most important parts of the GPU, and they largely decide how much power your GPU has. Stream processors are also called processor cores or pixel pipelines. You know about multi-core processors (read the CPU section if you skipped it). Where a CPU gets by with a handful of big cores, a GPU packs thousands of these tiny stream processors, each handling a small individual task, which massively increases parallel processing and multi-tasking. This ultimately leads to better performance and efficiency.
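You can even check the headline 6.0 TFLOPS figure yourself from the specs above. The 64 stream processors per CU and 2 floating-point operations per cycle (one fused multiply-add) are standard for AMD’s GCN-based designs:

```python
compute_units = 40
sp_per_cu = 64                 # GCN: 64 stream processors per compute unit
stream_processors = compute_units * sp_per_cu  # the 2560 from the spec sheet

clock_ghz = 1.172
flops_per_cycle = 2            # a fused multiply-add counts as 2 FLOPs

tflops = stream_processors * flops_per_cycle * clock_ghz / 1000
print(round(tflops, 1))        # → 6.0
```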
If you want to get into benchmark detail, look up the AMD Radeon RX 480. The Xbox One X offers 11% more compute hardware than the RX 480. There are, of course, more custom blocks here as well. A console designed for 4K and HDR still needs to work with SDR 1080p displays, so the Xbox display controller can supersample down from 4K to 1080p, or even 1440p, as needed. There are media blocks for HEVC as well, to handle the 4K video requirements for Blu-ray and streaming, and Xbox game capture can also record at 4K.
12GB of GDDR5 RAM @ 6.8GHz w/ 326 GB/s bandwidth
I’m happy to see that Microsoft decided to go with GDDR5 instead of the DDR3 they used in the original Xbox One. GDDR5 is better for games, and if you need one type of unified memory, you want it to be GDDR5. DDR3 is used in lower-end systems, and commonly in PCs as main memory, because many small apps do regular memory operations that aren’t optimized for GDDR5, versus things like games, streaming movies, or other high-throughput operations. GDDR5 has a higher bandwidth than DDR3, which means GDDR5 RAM will allow more information through in a shorter amount of time. Memory bandwidth is the rate at which data can be read from or stored in memory by the processor.
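The 326 GB/s figure in the heading also falls straight out of the spec sheet. Assuming the Xbox One X’s widely reported 384-bit memory bus, and reading the “6.8GHz” as the effective data rate per pin:

```python
data_rate_gbps = 6.8    # effective GDDR5 data rate per pin (the "6.8GHz")
bus_width_bits = 384    # reported Xbox One X memory bus width
bytes_per_bit = 1 / 8

bandwidth_gb_s = data_rate_gbps * bus_width_bits * bytes_per_bit
print(round(bandwidth_gb_s, 1))  # → 326.4
```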
Word of warning: do not jump onto PC forums laughing at how the console peasants have surpassed the PC master race in terms of performance value. You see, this 12GB of GDDR5 RAM is split between the CPU and GPU. Also, if I can jump back to the CPU for a second: this custom CPU is roughly comparable to current-gen i3 CPUs. So yes, for a console the hardware is superior (for now), but you simply cannot compare it to a PC. You are not comparing apples with apples.
These are the components I wanted to dive into. I want you to understand how powerful the hardware is. I need you to understand that even though the hardware will be able to deliver an immersive 4K experience for those who have 4K TVs, it will also be a significant improvement on 1080p TVs. The hardware itself is not dependent on whether you have a 4K TV or not.
The Xbox One X will do something called supersampling to create better-looking images. Supersampling sounds complex, but the basic idea is that the game renders itself in 4K as if it were connected to a 4K screen, which means objects are rendered with four times the detail. Obviously, with the TV only being 1080p, not all of that detail can be shown, but the resulting image will still be much richer than the image you get on your original Xbox One. Images will look sharper. The environment in games will look more realistic. If in general you do not give a rat’s ass about how pretty the trees are in the game (for example), then you might not even notice the change. Performance-wise you should notice a difference, from faster loading times to quicker switching between apps.
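To make the idea concrete, here is a toy sketch of the downscaling step: render at double the resolution in each dimension (which is exactly the 4K-to-1080p ratio), then average each 2×2 block of pixels into one output pixel. The grey-level values are invented for illustration.

```python
def downsample_2x(image):
    # Average each 2x2 block of the high-res render into one output
    # pixel. Averaging four samples gives a smoother, richer result
    # than simply rendering one sample per output pixel.
    h, w = len(image), len(image[0])
    out = []
    for y in range(0, h, 2):
        row = []
        for x in range(0, w, 2):
            block = (image[y][x] + image[y][x + 1] +
                     image[y + 1][x] + image[y + 1][x + 1])
            row.append(block / 4)
        out.append(row)
    return out

# A tiny 4x4 "4K render" (grey levels 0–255) becomes a 2x2 "1080p" image.
hi_res = [[0, 0, 255, 255],
          [0, 0, 255, 255],
          [255, 255, 0, 0],
          [255, 255, 0, 0]]
print(downsample_2x(hi_res))  # → [[0.0, 255.0], [255.0, 0.0]]
```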
A 4K HDR TV will unleash the beast. No doubt about it. Games will render at a native 4K, so up to 4 times the detail of a regular 1080p TV. Thanks to HDR, some games will look even better, with a wider array of colour options. It will basically turn into a love song, where the skies are bluer and the grass greener.
So, even though you will only unleash the full potential on a 4K HDR TV, the X is definitely not a “complete waste” if you’re still on that 1080p TV.
Have any thoughts on this? I’d love to hear them!