When it comes to processor speed, whether it's a CPU or a GPU, you're probably thinking in megahertz or gigahertz at first. But those numbers aren't actually very useful if you're comparing processors across different models.

That's why graphics card makers in particular have started leaning on a specification called FLOPS to describe their latest and greatest hardware. But what does that mean?

FLOPS is a performance metric that stands for FLoating point Operations Per Second.

Now, this might sound like the time it takes for your magic wand to give you an answer, but it's really a measure of how quickly your processor can do math that involves a mix of large, small, and fractional numbers. It matters because computers only have a finite amount of space to store the numbers they work with: for a single number, this is typically either 32 or 64 bits, depending on which processor and program you're using.

In order to express a range of very large and very small numbers, some of these bits are allocated to store the digits of the number itself, while others are reserved to specify where the decimal point should go, a little bit like scientific notation. This makes it easy to express huge or tiny numbers in a limited number of bits.
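You can actually peek at this bit layout yourself. Here's a minimal sketch in Python, using the standard `struct` module to pull apart a 32-bit float into its sign bit, 8-bit exponent (the "where the decimal point goes" part), and 23-bit fraction (the digits themselves); the function name is just illustrative:

```python
import struct

def float32_parts(x):
    """Unpack a 32-bit float into its sign, exponent, and fraction fields."""
    bits = struct.unpack(">I", struct.pack(">f", x))[0]
    sign = bits >> 31                # 1 bit: positive or negative
    exponent = (bits >> 23) & 0xFF   # 8 bits: position of the "decimal point"
    fraction = bits & 0x7FFFFF       # 23 bits: the digits of the number itself
    return sign, exponent, fraction

# 6.5 is 1.625 x 2^2 in binary scientific notation,
# so the stored exponent is 2 plus a fixed bias of 127, i.e. 129.
print(float32_parts(6.5))   # (0, 129, 5242880)
```

The same idea scales up: a 64-bit float just spends more bits on each field (11 for the exponent, 52 for the fraction), which is why it can hold bigger and more precise numbers.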

Keep in mind that floating point operations are less straightforward for your processor to carry out than ones that involve only integers, because the computer has to deal with ever-changing exponents, convert between representations, and round the results off.
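That rounding is easy to see for yourself. A quick sketch of the difference between exact integer math and rounded floating point math:

```python
# Integer addition is exact.
print(1 + 2 == 3)         # True

# Floating point addition has to round: 0.1 and 0.2 can't be stored
# exactly in binary, so their sum isn't exactly 0.3 either.
print(0.1 + 0.2 == 0.3)   # False
print(0.1 + 0.2)          # 0.30000000000000004
```

None of this makes floats "broken"; it's just the price of squeezing an enormous range of values into a fixed number of bits.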

Much of the math your GPU needs to do in order to render the images you see on your screen uses vectors to determine where each line and shape should go, and crunching those numbers involves many different floating point values whose exponents can vary quite a bit.
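To give a toy-scale flavor of that vector math, here's a minimal sketch of a single 2D rotation, one of the simplest transforms a GPU applies to vertices. Even this tiny operation costs four floating point multiplies and two adds per point, and a real scene repeats work like this millions of times per frame (the function name is illustrative, not a real graphics API):

```python
import math

def rotate_2d(x, y, angle):
    """Rotate a point about the origin: 4 multiplies + 2 adds,
    all floating point, for every single point."""
    c, s = math.cos(angle), math.sin(angle)
    return (x * c - y * s, x * s + y * c)

# Rotating one vertex by 90 degrees: (1, 0) ends up at roughly (0, 1).
print(rotate_2d(1.0, 0.0, math.pi / 2))
```

Multiply that handful of operations by every vertex and pixel on screen, sixty or more times a second, and it becomes clear why GPU makers advertise floating point throughput.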

Now, supercomputers and more powerful workstations used for scientific research and weather modeling are also often described in terms of FLOPS, as they also rely heavily on floating point numbers.
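For a sense of scale, the prefixes on FLOPS figures work like any other metric prefixes. A short sketch of the arithmetic, using the 110-teraflop marketing figure mentioned later in this piece:

```python
# Metric prefixes commonly seen on spec sheets
GIGA = 10**9    # gigaflops: billions of operations per second
TERA = 10**12   # teraflops: trillions of operations per second

tflops = 110                    # advertised figure for a high-end card
ops_per_second = tflops * TERA
print(f"{ops_per_second:.2e}")  # 1.10e+14 floating point operations per second
```

So a triple-digit teraflops number means over a hundred trillion floating point operations every second, which is exactly why supercomputer rankings use the same unit.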

But what exactly does any of this mean for the average user? Should you just go for the GPU with the highest FLOPS number in your budget? The answer is no. Even though more FLOPS does indicate more raw computational power, it only means it's better in the same limited way that a CPU with more megahertz or a digital camera with more megapixels is better.

Many other things will play a huge role in how good an experience you'll have with your graphics card, including memory capacity and bandwidth, the specific architecture your GPU uses, and even how nicely the drivers play with the specific games in your library.

So, in conclusion: unless you're an AI or big data researcher, there's no need to get all starry-eyed over the new Titan V's hundred and ten teraflops of power. But at least you now know that the specification on the back of the box doesn't refer to the sound your wallet makes as it flops down on the checkout counter.
