

What hardware acceleration should I expect from my graphics card? - 7/8

Published on June 16, 2016   |  Updated on November 27, 2019


Our computers today are full of power, yet often only a small part of it is actually used at any given moment by the processor and graphics card. Software publishers are beginning to understand this and are delegating some of their calculations to the graphics card rather than to the processor alone. This process of assigning a particular part of the computer to a particular calculation in order to speed it up is called hardware acceleration.
Alongside this welcome development, there has also been an improvement in the display of gradients thanks to the famous 10-bit LUTs. What are we talking about?

Our beloved graphics cards today can do much more than just display our beautiful images or video games. Some of them can now support the processor (CPU) in certain complex calculations, thanks to various software or hardware improvements - this is what we call hardware acceleration - making the processing times of our images and videos ever shorter. They can also sometimes give access to ever-better display quality thanks to 10-bit encoding (with a compatible screen, operating system and software), as we will see in more detail on the next page: 10-bit display. It is therefore wise to take stock of who does what, and whether you really need it...




Hardware acceleration and specific computation technologies: CUDA, OpenCL, OpenGL, Metal

Let's start with the most important thing! Many of you ask me for advice about your future graphics card, and whether the graphics card has an influence on the display quality of photos and colors. Well, let's start by debunking an old myth:

To calculate and therefore display our images or videos smoothly, the two graphics card manufacturers, NVidia and ATI, do not use the same calculation engines. Each has its own know-how and its own optimizations. However, these optimizations and singularities will not be visible during the simple display of images in Photoshop, because these are tasks that are "simple" enough and, above all, universal. Every graphics card on the market can do this without breaking a sweat. So when does the choice of a graphics card become more sensitive?

In recent years - since INTEL and AMD, the processor manufacturers, began struggling to double their raw computing power every 18 months - they have had the idea of not only increasing the power of CPUs but also multiplying the number of computing units within the same CPU: these are called cores.


This Intel Core i7 processor has, for example, six compute cores.


However, for our software to use this "different" computing power, it must be rewritten to benefit from it. This process of running a given treatment no longer on a single processor but on multiple cores is called parallelization: several calculations are made by the different cores at the same time. A simple example: let's imagine that you need to do 10 different calculations. "Before", the CPU had to do all 10 operations in single file! Today, if you have ten cores in your CPU, each one does a single calculation and the final result comes back 10 times faster! (In theory, because in real life it still goes a little slower than that!)

Essential!  Calculation parallelization is only possible if you have a multi-core processor AND software rewritten to take advantage of this increased computing power.
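To make the idea above concrete, here is a minimal sketch in Python (purely illustrative, not from the article): the same ten calculations run "in single file", then spread across the cores with the standard multiprocessing module. The function name `slow_calculation` is of course hypothetical.

```python
from multiprocessing import Pool

def slow_calculation(x):
    # Hypothetical stand-in for one of the "10 different calculations";
    # imagine a costly image filter instead of a simple square.
    return x * x

if __name__ == "__main__":
    inputs = list(range(10))

    # "Before": the CPU does all 10 operations in single file.
    sequential = [slow_calculation(x) for x in inputs]

    # Parallelized: each core takes a share of the ten calculations.
    with Pool() as pool:
        parallel = pool.map(slow_calculation, inputs)

    # Same results; with real workloads, close to N times faster on N cores.
    assert parallel == sequential
```

With operations as trivial as this one, the overhead of starting worker processes actually dominates, which echoes the caveat above that "in real life it still goes a little slower".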

Similarly, software developers realized they could also tap the power of the many cores of graphics cards, which, especially when editing in Photoshop, do not have much to do - the life of a graphics card that only displays images in Photoshop is pretty cushy! And strangely enough, rather than rewriting the software once and for all, it had to be rewritten yet again to take this other source of power into account. What a waste of time!

Note!  Rather than parallelization, we now speak of hardware acceleration or GPU acceleration: the hardware power of a machine is better used to support the CPU and reduce processing times.

Enabling hardware acceleration linked to the graphics card in Photoshop

Enabling GPU acceleration in Photoshop CC. It only speeds up a few tools or filters (so you may not always feel it), but when it applies it is quite spectacular. Since it works with almost all graphics cards, you might as well enable it!


In the best of all worlds, one would think that all hardware and software manufacturers had sat down around a table once and for all to create a universal way of doing things. Well, you're not going to believe it: they haven't! Don't say I didn't warn you. Everyone came up with their OWN solution, with, of course, tons of good reasons. It's for our own good, let's never forget it! I'm being ironic, of course, and I can perfectly understand some very logical arguments. That said, it doesn't help our business either!

Several hardware accelerations... for different software!

These hardware acceleration engines are called CUDA at NVidia, OpenCL at ATI and NVidia, OpenGL, or now Metal at Apple, which precisely wants to be done with OpenCL and OpenGL!!! All NVidia cards support CUDA (some NVidia Quadro cards even make it a specialty) AND OpenCL, but ATI cards only support OpenCL. So when the stars align - uh, no! - between your software, your operating system AND the type of hardware acceleration of your graphics card, you will really save computing time. But because of these different configurations, you may very well end up with a super-powerful graphics card but not the right software optimized for it (or not the right operating system, if you are under macOS Catalina)... or the other way around!

Essential!  Hardware acceleration is only possible if you use the "right" graphics card with the "right" software under the "right" operating system (the latter mainly matters with macOS Catalina). Outside this combination, you will save little computing time despite all the raw power available. Hardware acceleration therefore almost has an ON/OFF quality to it.

Designers of panoramic stitching software such as Autopano or PTGui, and even more so of 3D modeling software, image editing software such as DxO and Capture One, and video processing software such as Adobe Premiere - and Photoshop, to a lesser extent - have started to rewrite their software (or part of it) so that it uses all the computing resources of our computers and knows how to distribute the calculations between all these devices. When the optimization is well written, the result is spectacular!

In summary!  Today, hardware acceleration is present "everywhere", and more and more software uses it, but only partially, hence a rather small gain. Still, it's better than nothing! It provides a little extra on almost all machines, because cases of optimal, spectacular optimization remain rare. Those cases are more common with very demanding professional software, as in the worlds of 3D, video, medicine or physics, and partly in the world of photo retouching - though, paradoxically, Photoshop is the one that uses it the least.


To read, in French! On this subject, Christophe Métairie's tests are extremely interesting and instructive. A basic NVidia Quadro card - which I will discuss below - does much better under Capture One than a big gaming card: computation times are halved! On the other hand, under Camera Raw or Lightroom, the gain is unfortunately minimal... - Hardware acceleration and graphics card - cmp-color

Hardware acceleration tests with graphics cards, by Christophe Métairie



My advice!  If you use one of these resource-intensive programs that you know has been optimized for a certain type of graphics card, you will probably start hunting for this Grail, but you may have to make a difficult choice: what helps with one piece of software can hurt with another (games, for example).

Second piece of advice, about macOS! Be careful if you switch to macOS Catalina, because Apple has clearly announced that it wants to drop OpenCL and OpenGL in favor of its own hardware acceleration: Metal. Of course, all your software will continue to work, but without the benefit of your computer's hardware acceleration. Oh, my God! Oh, my God!

Examples of hardware optimization:

 Capture One or DxO OpticsPro - These image editing programs are highly appreciated by their users and love CUDA technology cards like the NVidia Quadro series. With these graphics cards, they can divide processing times by up to 3... and along the way give access to the 10-bit display (which I will discuss again on the next page). On the other hand, these very specialized cards will be of no use to you with your favorite games if you are an occasional FPS fan, because games do not use the same resources!

  FinalCut Pro X - Video editors who work with FinalCut Pro X (Apple software only) often use a Mac Pro with ATI FirePro D700, D500 or D300 cards, installed in pairs since 2013. The whole thing has been perfectly optimized by Apple and the results are there: working with a 4K stream in real time seems easy. On the other hand, don't expect ATI FirePro cards to help you reduce computation times in Capture One, DxO, Camera Raw or Lightroom. Just a few display accelerations under certain tools... as with less powerful or less specialized cards.

 Adobe Premiere is optimized for CUDA or OpenCL hardware acceleration. On the Mac Pro 2013 and later, it will use the OpenCL hardware acceleration of the D700, D500 or D300. Under Windows, you have the choice of CUDA acceleration with NVidia Quadro cards, but also some GT or GTX models (list on Adobe's website), or OpenCL acceleration with ATI FirePro cards and some Radeons (list on Adobe's website). A priori, Adobe Premiere still prefers CUDA, if possible with two cards.

 Curiously, Photoshop CC, even in its latest 2019 version, is not that optimized for hardware acceleration. Only a few filters benefit from it, such as optimized sharpening or the upsampling function - there, the gain is very noticeable. It also speeds up some image refreshes a little during zoom or rotation. We would so much like more functions and filters to take advantage of it... NVidia Quadro cards and Photoshop CC

PTGui - This panoramic stitching software makes perfect use of the OpenCL of my 2013 Mac (High Sierra), and the result is spectacular compared to Autopano Giga: at least 10 times faster!




Display quality and bit number of graphics cards

Images can be in 8 or 16 bits. Graphics cards can be in 8 or 10 bits. The LUTs of displays can be in 8, 10, 14 or 16 bits. But what advantages do these differences bring to the display of our images?

Essential!  Let's be clear: there are many reasons why some of our images show tonal breaks, and in the vast majority of cases this is absolutely not due to the 8- or 10-bit output of the graphics card or the 8- or 14-bit LUT of the screen, but to somewhat brutal or limited processing of our images. The appearance of tonal breaks in the gradients of our images is therefore multifactorial, and high bit depths will not solve every problem!

But let's now see, starting from a beautiful gradient file (created in Photoshop, in 16 bits, on a new document filled with a grayscale gradient), how it can be degraded by the hardware and how it will be displayed.

Color gradients and tonal breaks: what are we talking about?

The best panels display very beautiful, very progressive grayscale or color gradients (from quality files, it goes without saying, but let's say it anyway!), without the "tonal breaks" you can see in the image below. If, from these same gradient files, your screen displays something like the image below, there is a hardware problem - and it can have several origins, which we will describe in more detail...

Tonal breaks in a gray gradient created in Photoshop

Preamble! I find that more and more displays, calibrated with the latest X-Rite or Datacolor colorimeters, offer much cleaner gradations - even "low-end" displays like my Dell P2419H, with "normal" graphics cards and without the hardware calibration that is supposed to be the cream of the crop! Hardware progress is therefore moving in the right direction...

Let's start with the bits in the images: 8 or 16 bits?

As we saw on the page dedicated to the human eye and colors, a human eye with very good visual acuity can distinguish about two hundred shades of each RGB primary color. Since our vision is based on an RGB mixing model, that gives about eight million possibilities - hence the famous L*a*b* color space. To display these colors, a byte-based computer model (8 bits) is enough, because it allows 256 values per primary color (in 7 bits we would have had only 128). We thus end up with sixteen million possible combinations when we can see at most eight million colors! To be displayed correctly, an image therefore does not need to be in 16 bits: 8 bits are more than enough. But then, what are 16-bit images for? Only for the retouching phase. Each primary color is then described on 65,536 levels, so profile conversions, levels, saturation and other adjustments are done with such precision that losses are largely minimized.

Key point!  To be displayed correctly, an image does not need to be in 16 bits: 8 bits are enough. On the other hand, the risk of damage during the various retouching operations (when levels are significantly tightened several times in a row, for example) is greatly reduced on a 16-bit image. The display quality of a gradient, even in 8 bits, can therefore be perfect. Display quality is determined elsewhere...
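The arithmetic behind these figures can be checked in a few lines of Python (a simple illustration, not from the article):

```python
# Levels available per RGB primary at each coding depth.
levels_8bit = 2 ** 8        # 256 levels per channel
levels_16bit = 2 ** 16      # 65,536 levels per channel

# Total color combinations with 8 bits per channel.
colors_8bit = levels_8bit ** 3          # 16,777,216: "sixteen million"

# What a very sharp eye can distinguish: ~200 shades per primary.
visible_colors = 200 ** 3               # 8,000,000: "eight million"

# 8-bit coding already offers about twice as many combinations as we
# can see, which is why 8 bits are enough for *display*.
print(colors_8bit, visible_colors, colors_8bit > visible_colors)
```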

8, 10, 14 or 16 bits... for the screen's LUT?

The LUTs of screens (tables that convert the received signal - that of the image - into the signal sent to the panel) can therefore be in 8, 10, 14 or even 16 bits. But what for? It's all a matter of rounding errors during conversions! Let's see...

Vocabulary, for a good understanding - A little technique to start with: the RGB signal of your photo must be sent to your graphics card and then to your screen to be displayed correctly, if possible without loss. However, it can be damaged during its transmission from one device to the other. The correction of the original signal (that of your digital file) is done in what is called a LUT, short for Look-Up Table - a conversion table. There is a LUT in your graphics card and another in your screen. The problem is very simple: when an RGB value from your file is sent to another device - here to your graphics card and then to your screen - and since we work digitally, in discrete steps (256 levels) rather than continuously, approximations can creep in during each handoff, in what is called a conversion. The source file has a given RGB value, and this value must be slightly modified by the destination device (for various reasons that I explain on my page about conversions). It is during this conversion calculation that approximation errors - also called rounding errors - can occur, and they materialize as... those famous tonal breaks. Our source file contains a beautiful sky-blue gradient, and yet it appears on screen with tonal breaks...

Key point!  Note that we are talking about beautiful image files, not over-processed images that are necessarily full of real tonal breaks and therefore have every reason to appear as such on screen! We are talking about not adding breaks to a file that did not contain any.

To avoid this as much as possible, manufacturers install LUTs in their displays that work in 10 bits (or more) rather than the traditional 8 bits - on 1,024 levels instead of 256 - in order to "smooth out" these losses. But since marketing never rests, and 10-bit LUTs have been around for more than five years, we now hear about 16-bit LUTs. And why not 32 bits while we're at it!!! Since the objective is simply to make the rounding error tiny, and this is already achieved in 10 bits and a fortiori in 14 bits, the 16 bits are a marketing argument.
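As a rough illustration of why a higher-precision LUT helps (a deliberately simplified Python sketch, not a model of any particular screen): apply two successive corrections to an 8-bit gray ramp, rounding at the LUT's working precision after each step, and count how many distinct output levels survive.

```python
def round_trip(value, lut_bits):
    """Darken then re-brighten an 8-bit value, rounding at the LUT's
    internal precision after each step, then return to 8 bits."""
    levels = 2 ** lut_bits - 1
    x = round(value / 255 * levels)   # rescale to the LUT's precision
    x = round(x * 0.5)                # first correction, with rounding
    x = round(x * 2.0)                # second correction, with rounding
    return max(0, min(255, round(x / levels * 255)))

gradient = range(256)  # a clean 8-bit gray ramp, as in the article
distinct_8 = len({round_trip(v, 8) for v in gradient})
distinct_10 = len({round_trip(v, 10) for v in gradient})

# The 8-bit pipeline merges roughly half the levels (visible banding),
# while the 10-bit one keeps far more of the 256 steps distinct.
print(distinct_8, distinct_10)
```

The operations themselves are arbitrary; the point is that each rounding step costs less when the intermediate values live on 1,024 levels instead of 256.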

My opinion: indeed, screens with a 10-bit (or more) LUT always display very progressive gradients from "clean" files (with under-exposed and over-brightened photos, you will get tonal breaks even on these luxury hardware configurations). Then again, the gradients of a recent iMac are very beautiful too, and yet in 8 bits! As for screen LUTs beyond 10 bits, you now know what I think... A 10-bit display is always a good sign, because it reflects the manufacturer's commitment to quality, but you can also find very beautiful 8-bit displays. The 10 bits improve things only at the margin, so never expect something spectacular: with clean files and recent, correctly calibrated screens, you often have to lean in close to the screen, so small are the conversion errors.

And finally, what about the 10-bit display of graphics cards?

This deserved a specific page:

What is the purpose of the 10-bit display ?  



What is hardware calibration (vs software)?

To perform a hardware calibration, you will need:

  • a compatible screen,
  • a dedicated USB cable,
  • compatible calibration software (such as ColorNavigator or Spectraview).

A calibration always happens in two steps: calibration proper, then characterization, the process during which the ICC profile is actually created. Usually, during calibration, you choose target values (brightness, illuminant, contrast, etc.) in the software and then place your colorimeter on the screen so that it helps you reach these values. To do this, you use the keys and OSD menus of the display: by following the on-screen instructions, you know whether to lower or raise the brightness with the dedicated keys. Only when the screen is well adjusted - well calibrated - does the software scroll through the colored patches to characterize it.

With some high-end displays and calibration software such as ColorNavigator or Spectraview II, the calibration part can be handled automatically, without you having to touch the screen's controls. You define the target values in the software and it makes the changes directly in the screen's LUT. This is supposed to produce more regular curves, and therefore gradients without tonal breaks. I had this possibility on my old Quato and I have it on my Eizo CS240 (replaced by the Eizo CS2420), and frankly, I don't see the difference with or without it (always starting from beautiful 16-bit gradients). In theory it's better, but in practice, on very good screens and in a blind test, you don't necessarily see the difference... A final advantage is the possibility of injecting the profile directly into the screen's LUT, so that once calibrated, the screen can be used on another computer.

Next page...

8/8 - What is the purpose of the 10-bit display?














