Rocksolid Light


Subject -- Author
* How can I tell if a GPU is using double precision? -- Commander Kinsey
+- Re: How can I tell if a GPU is using double precision? -- Chris
+* Re: How can I tell if a GPU is using double precision? -- Paul
|`* Re: How can I tell if a GPU is using double precision? -- Commander Kinsey
| `* Re: How can I tell if a GPU is using double precision? -- Paul
|  `- Re: How can I tell if a GPU is using double precision? -- Commander Kinsey
`* Re: How can I tell if a GPU is using double precision? -- Bill
 `* Re: How can I tell if a GPU is using double precision? -- Commander Kinsey
  +* Re: How can I tell if a GPU is using double precision? -- Bill
  |`* Re: How can I tell if a GPU is using double precision? -- Commander Kinsey
  | `- Re: How can I tell if a GPU is using double precision? -- B. R. 'BeAr' Ederson
  +* Re: How can I tell if a GPU is using double precision? -- Bill
  |+- Re: How can I tell if a GPU is using double precision? -- Paul
  |`* Re: How can I tell if a GPU is using double precision? -- Commander Kinsey
  | `- Re: How can I tell if a GPU is using double precision? -- Paul
  `- Re: How can I tell if a GPU is using double precision? -- Chris

How can I tell if a GPU is using double precision?

<op.1kcsnhi5mvhs6z@ryzen.lan>


https://www.novabbs.com/computers/article-flat.php?id=61536&group=alt.comp.os.windows-10#61536

Path: i2pn2.org!i2pn.org!aioe.org!news.uzoreto.com!newsfeed.xs4all.nl!newsfeed8.news.xs4all.nl!news-out.netnews.com!news.alt.net!fdc2.netnews.com!peer03.ams1!peer.ams1.xlned.com!news.xlned.com!fx11.ams1.POSTED!not-for-mail
Content-Type: text/plain; charset=iso-8859-15; format=flowed; delsp=yes
Newsgroups: alt.computer.workshop,alt.comp.os.windows-10,alt.comp.os.windows-11,alt.comp.freeware
Subject: How can I tell if a GPU is using double precision?
MIME-Version: 1.0
Content-Transfer-Encoding: 7bit
From: CK1...@nospam.com (Commander Kinsey)
Message-ID: <op.1kcsnhi5mvhs6z@ryzen.lan>
User-Agent: Opera Mail/1.0 (Win32)
X-Antivirus: AVG (VPS 220408-0, 8/4/2022), Outbound message
X-Antivirus-Status: Clean
Lines: 1
X-Complaints-To: abuse(at)newshosting.com
NNTP-Posting-Date: Sat, 09 Apr 2022 10:27:08 UTC
Organization: Newshosting.com - Highest quality at a great price! www.newshosting.com
Date: Sat, 09 Apr 2022 11:27:07 +0100
X-Received-Bytes: 916
 by: Commander Kinsey - Sat, 9 Apr 2022 10:27 UTC

How can I tell if a GPU is using double precision? Some sort of measurement of processing and whether it's single, double, or a mixture.

Re: How can I tell if a GPU is using double precision?

<t2sgh5$njo$1@dont-email.me>


https://www.novabbs.com/computers/article-flat.php?id=61539&group=alt.comp.os.windows-10#61539

Path: i2pn2.org!i2pn.org!eternal-september.org!reader02.eternal-september.org!.POSTED!not-for-mail
From: ithink...@gmail.com (Chris)
Newsgroups: alt.comp.os.windows-11,alt.computer.workshop,alt.comp.os.windows-10,alt.comp.freeware
Subject: Re: How can I tell if a GPU is using double precision?
Date: Sat, 9 Apr 2022 17:41:57 -0000 (UTC)
Organization: A noiseless patient Spider
Lines: 8
Message-ID: <t2sgh5$njo$1@dont-email.me>
References: <op.1kcsnhi5mvhs6z@ryzen.lan>
Mime-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit
Injection-Date: Sat, 9 Apr 2022 17:41:57 -0000 (UTC)
Injection-Info: reader02.eternal-september.org; posting-host="c20aa955d2979a1cb1c6279cc0a5c955";
logging-data="24184"; mail-complaints-to="abuse@eternal-september.org"; posting-account="U2FsdGVkX19Evimaf466WPxiQ5zhwjOZJc6P8iErl/o="
User-Agent: NewsTap/5.5 (iPhone/iPod Touch)
Cancel-Lock: sha1:ZTPiGKnh874j8LH1bSNxnQLo2w4=
sha1:zdrkYwXpovD3Busy4MO7dF98W0Q=
 by: Chris - Sat, 9 Apr 2022 17:41 UTC

Commander Kinsey <CK1@nospam.com> wrote:
> How can I tell if a GPU is using double precision? Some sort of
> measurement of processing and whether it's single, double, or a mixture.

That's a function of software, not hardware. GPUs are minimum 64-bit, but
more commonly 128-bit, and have some larger registers too. How the address
space is used is up to the developer and/or the API.

Re: How can I tell if a GPU is using double precision?

<t2si3j$8ot$1@dont-email.me>


https://www.novabbs.com/computers/article-flat.php?id=61540&group=alt.comp.os.windows-10#61540

Path: i2pn2.org!i2pn.org!eternal-september.org!reader02.eternal-september.org!.POSTED!not-for-mail
From: nos...@needed.invalid (Paul)
Newsgroups: alt.computer.workshop,alt.comp.os.windows-10,alt.comp.os.windows-11,alt.comp.freeware
Subject: Re: How can I tell if a GPU is using double precision?
Date: Sat, 9 Apr 2022 14:08:44 -0400
Organization: A noiseless patient Spider
Lines: 72
Message-ID: <t2si3j$8ot$1@dont-email.me>
References: <op.1kcsnhi5mvhs6z@ryzen.lan>
Mime-Version: 1.0
Content-Type: text/plain; charset=iso-8859-15; format=flowed
Content-Transfer-Encoding: 7bit
Injection-Date: Sat, 9 Apr 2022 18:08:51 -0000 (UTC)
Injection-Info: reader02.eternal-september.org; posting-host="263c54d8c98ca95affd8fcf10d7c54d4";
logging-data="8989"; mail-complaints-to="abuse@eternal-september.org"; posting-account="U2FsdGVkX1/g5EPaEfFNfWkx078gX7wpgrSwIKn4RYY="
User-Agent: Ratcatcher/2.0.0.25 (Windows/20130802)
Cancel-Lock: sha1:QfkUNLPsUN3ZtdYiYeQ66WeiSS4=
In-Reply-To: <op.1kcsnhi5mvhs6z@ryzen.lan>
Content-Language: en-US
 by: Paul - Sat, 9 Apr 2022 18:08 UTC

On 4/9/2022 6:27 AM, Commander Kinsey wrote:
> How can I tell if a GPU is using double precision? Some sort of measurement of processing and whether it's single, double, or a mixture.

Using FP is a function of coding.

If you write C code and use a C-to-SASS or CUDA
compiler of some sort, then what you asked for is
what you get. Your declarations are what determine
the processing type. My INT code won't be using any
(almost unavailable) FP resources. And the FP supports
multiply and multiply-add, and does not do division.
To do division, you use reciprocal and multiply.

int i;      /* integer ALU, no FP resources used */
float x;    /* single precision, FP32 units */
double y;   /* double precision, FP64 units */

The FP64 is like the Clive Sinclair version of IEEE754. It
may not do exactly what the CPU does for its FP, leaving many
details to the coder. Maybe your Intel processor uses 80 bit for
intermediary stuff, and 64 bit as "output" to be put back
in the register. Those would be guard bits.

But before you run the actual code, your program
has to do DeviceQuery and determine whether the "type"
is even present, and in what numbers. FP16, FP32, FP64.

On mine, the FP64 ratio is a miserable 1:32. But
unless I can find a DeviceQuery demo, I may not be
able to give you an example.
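
As a rough, untested sketch of that DeviceQuery step (assuming the CUDA
runtime API, compiled with nvcc as something like query.cu), this prints
what the runtime actually exposes; the FP64:FP32 ratio itself is not a
queryable property, you still look that up per architecture:

#include <cstdio>
#include <cuda_runtime.h>

int main(void)
{
    int count = 0;
    if (cudaGetDeviceCount(&count) != cudaSuccess || count == 0) {
        printf("No CUDA device found.\n");
        return 1;
    }
    for (int dev = 0; dev < count; ++dev) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, dev);
        printf("Device %d: %s\n", dev, prop.name);
        printf("  Compute capability : %d.%d\n", prop.major, prop.minor);
        printf("  Multiprocessors    : %d\n", prop.multiProcessorCount);
        printf("  Core clock (kHz)   : %d\n", prop.clockRate);
    }
    return 0;
}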

On the professional cards (Tesla, versus your consumer GPU),
the ratio is stated as 1:2, and that's enough to make
the card get very hot. I think my card on FP16 (used
perhaps for neural networks) is only 1:128, or "a joke".
This means if I want some of these oddball cases (FP16 for
neural networks) to be satisfied, I have to spend thousands.

There's a seven page article that will give you some keywords,
but it takes an actual programmer site to make sense of it.

"GPU Programming and Streaming Multiprocessors" (1 chapter from a book) Aug 6, 2013

https://www.informit.com/articles/article.aspx?p=2103809

"How to determine if my GPU does 16/32/64 bit arithmetic operations?"
(about half of an answer, not as satisfying as it could be)

https://stackoverflow.com/questions/43478827/how-to-determine-if-my-gpu-does-16-32-64-bit-arithmetic-operations

"streaming multiprocessor number"

https://superuser.com/questions/198119/streaming-multiprocessor-number

"And see this line:

Multiprocessors x Cores/MP = Cores: 14 (MP) x 32 (Cores/MP) = 448 (Cores)"

So if I have 2560 CUDA cores and 32 cores per MP, I have 80 SMs.
And then you have to figure out whether the FP units are available
per core or per SM, and use the available ratio info to figure out
how many you've got. Tesla has FP64 at 1:2, so you can "do double()
all day". Consumer cards can be very weak, but might have some, so
that your code doesn't just crash.

Well-written code would have if-then-else to guard against
the user running the code on prehistoric cards. And you will still
find hobby coders who have written and tuned their CUDA code
for their *particular* card, not giving a rat's ass that the
code is sub-optimal for the resources on other cards.

Paul

Re: How can I tell if a GPU is using double precision?

<op.1kdwahtqmvhs6z@ryzen.lan>


https://www.novabbs.com/computers/article-flat.php?id=61549&group=alt.comp.os.windows-10#61549

Path: i2pn2.org!i2pn.org!aioe.org!news.uzoreto.com!news-out.netnews.com!news.alt.net!fdc2.netnews.com!peer01.ams1!peer.ams1.xlned.com!news.xlned.com!fx01.ams1.POSTED!not-for-mail
Content-Type: text/plain; charset=iso-8859-15; format=flowed; delsp=yes
Newsgroups: alt.computer.workshop,alt.comp.os.windows-10,alt.comp.os.windows-11,alt.comp.freeware
Subject: Re: How can I tell if a GPU is using double precision?
References: <op.1kcsnhi5mvhs6z@ryzen.lan> <t2si3j$8ot$1@dont-email.me>
MIME-Version: 1.0
Content-Transfer-Encoding: 7bit
From: CK1...@nospam.com (Commander Kinsey)
Message-ID: <op.1kdwahtqmvhs6z@ryzen.lan>
User-Agent: Opera Mail/1.0 (Win32)
X-Antivirus: AVG (VPS 220409-4, 9/4/2022), Outbound message
X-Antivirus-Status: Clean
Lines: 74
X-Complaints-To: abuse(at)newshosting.com
NNTP-Posting-Date: Sun, 10 Apr 2022 00:43:20 UTC
Organization: Newshosting.com - Highest quality at a great price! www.newshosting.com
Date: Sun, 10 Apr 2022 01:43:19 +0100
X-Received-Bytes: 4263
 by: Commander Kinsey - Sun, 10 Apr 2022 00:43 UTC

On Sat, 09 Apr 2022 19:08:44 +0100, Paul <nospam@needed.invalid> wrote:

> On 4/9/2022 6:27 AM, Commander Kinsey wrote:
>> How can I tell if a GPU is using double precision? Some sort of measurement of processing and whether it's single, double, or a mixture.
>
> Using FP is a function of coding.
>
> If you write C code, and use a C to SASS or CUDA
> compiler of some sort, then what you asked for, is
> what you get. If you do your declarations, that's how
> the processing type is determined. My INT code won't
> be using any (almost unavailable) FP resources. And
> the FP supports multiply and multiplyAdd, and does not
> do division. To do division, you use Reciprocal and Multiply.
>
> int i
> float x
> double y
>
> The FP64 is like the Clive Sinclair version of IEEE754. It
> may not do exactly what the CPU does for its FP, leaving many
> details to the coder. Maybe your Intel processor uses 80 bit for
> intermediary stuff, and 64 bit as "output" to be put back
> in the register. Those would be guard bits.
>
> But before you run the actual code, your program
> has to do DeviceQuery and determine whether the "type"
> is even present, and in what numbers. FP16, FP32, FP64.
>
> On mine, the FP64 ratio is a miserable 1:32. But
> unless I can find a DeviceQuery demo, I may not be
> able to give you an example.
>
> The professional cards (Tesla versus your consumer GPU),
> the ratio is stated as 1:2 , and that's enough to make
> the card get very hot. I think my card on FP16 (used
> perhaps for neural networks) is only 1:128 or "a joke".
> This means if I want some of these oddball cases (FP16 for
> neural networks) to be satisfied, I have to spend thousands.
>
> There's a seven page article that will give you some keywords,
> but it takes an actual programmer site to make sense of it.
>
> "GPU Programming and Streaming Multiprocessors" (1 chapter from a book) Aug 6, 2013
>
> https://www.informit.com/articles/article.aspx?p=2103809
>
> "How to determine if my GPU does 16/32/64 bit arithmetic operations?"
> (about half of an answer, not as satisfying as it could be)
>
> https://stackoverflow.com/questions/43478827/how-to-determine-if-my-gpu-does-16-32-64-bit-arithmetic-operations
>
> "streaming multiprocessor number"
>
> https://superuser.com/questions/198119/streaming-multiprocessor-number
>
> "And see this line:
>
> Multiprocessors x Cores/MP = Cores: 14 (MP) x 32 (Cores/MP) = 448 (Cores)"
>
> So if I have 2560 Cuda cores and 32 cores per MP, I have 80 SM.
> And then you have to figure out whether FP are available per core
> or per SM, and use the available ratio info to figure out how many
> you got. Tesla has FP64 at 1:2 so you can "do double() all day".
> Consumer cards can be very weak, but might have some, so that
> your code doesn't just crash.
>
> Well written code would have if-then-else to guard against
> the user running the code on prehistoric cards. And you will still
> find hobby coders who have written and tuned their CUDA code
> for their *particular* card, not giving a rats ass that the
> code is sub-optimal for the resources on other cards.

But given a certain task, isn't it possible you have to use FP64? You need the precision. And I use older 4:1 cards which are very suitable. Modern cards are a piece of shit, especially Nvidia (spit!).

Re: How can I tell if a GPU is using double precision?

<t30urv$12f$1@dont-email.me>


https://www.novabbs.com/computers/article-flat.php?id=61583&group=alt.comp.os.windows-10#61583

Path: i2pn2.org!i2pn.org!eternal-september.org!reader02.eternal-september.org!.POSTED!not-for-mail
From: nos...@needed.invalid (Paul)
Newsgroups: alt.computer.workshop,alt.comp.os.windows-10,alt.comp.os.windows-11,alt.comp.freeware
Subject: Re: How can I tell if a GPU is using double precision?
Date: Mon, 11 Apr 2022 06:11:12 -0400
Organization: A noiseless patient Spider
Lines: 14
Message-ID: <t30urv$12f$1@dont-email.me>
References: <op.1kcsnhi5mvhs6z@ryzen.lan> <t2si3j$8ot$1@dont-email.me>
<op.1kdwahtqmvhs6z@ryzen.lan>
Mime-Version: 1.0
Content-Type: text/plain; charset=iso-8859-15; format=flowed
Content-Transfer-Encoding: 8bit
Injection-Date: Mon, 11 Apr 2022 10:11:11 -0000 (UTC)
Injection-Info: reader02.eternal-september.org; posting-host="6cd4f86cbc9614ecdfd707892010f574";
logging-data="1103"; mail-complaints-to="abuse@eternal-september.org"; posting-account="U2FsdGVkX1/CdNFWaAXD1RFoBDipHA1QWlnHl9+gQhM="
User-Agent: Ratcatcher/2.0.0.25 (Windows/20130802)
Cancel-Lock: sha1:f7DWJXYCX66O4PElmrlhzy9QSEE=
In-Reply-To: <op.1kdwahtqmvhs6z@ryzen.lan>
Content-Language: en-US
 by: Paul - Mon, 11 Apr 2022 10:11 UTC

On 4/9/2022 8:43 PM, Commander Kinsey wrote:

>
> But given a certain task, isn't it possible you have to use FP64?
> You need the precision.  And I use older 4:1 cards which are
> very suitable.  Modern cards are a piece of shit, especially Nvidia (spit!).

There are expensive cards with 1:2 and that's how they
can price them in the thousands per card.

It's common for high tech companies to "ration" stuff
that does not need rationing.

Paul

Re: How can I tell if a GPU is using double precision?

<op.1kgk5he5mvhs6z@ryzen.lan>


https://www.novabbs.com/computers/article-flat.php?id=61588&group=alt.comp.os.windows-10#61588

Path: i2pn2.org!i2pn.org!aioe.org!feeder1.feed.usenet.farm!feed.usenet.farm!news-out.netnews.com!news.alt.net!fdc2.netnews.com!peer03.ams1!peer.ams1.xlned.com!news.xlned.com!fx07.ams1.POSTED!not-for-mail
Content-Type: text/plain; charset=iso-8859-15; format=flowed; delsp=yes
Newsgroups: alt.computer.workshop,alt.comp.os.windows-10,alt.comp.os.windows-11,alt.comp.freeware
Subject: Re: How can I tell if a GPU is using double precision?
References: <op.1kcsnhi5mvhs6z@ryzen.lan> <t2si3j$8ot$1@dont-email.me>
<op.1kdwahtqmvhs6z@ryzen.lan> <t30urv$12f$1@dont-email.me>
MIME-Version: 1.0
Content-Transfer-Encoding: Quoted-Printable
From: CK1...@nospam.com (Commander Kinsey)
Message-ID: <op.1kgk5he5mvhs6z@ryzen.lan>
User-Agent: Opera Mail/1.0 (Win32)
X-Antivirus: AVG (VPS 220411-0, 11/4/2022), Outbound message
X-Antivirus-Status: Clean
Lines: 16
X-Complaints-To: abuse(at)newshosting.com
NNTP-Posting-Date: Mon, 11 Apr 2022 11:35:33 UTC
Organization: Newshosting.com - Highest quality at a great price! www.newshosting.com
Date: Mon, 11 Apr 2022 12:35:31 +0100
X-Received-Bytes: 1544
 by: Commander Kinsey - Mon, 11 Apr 2022 11:35 UTC

On Mon, 11 Apr 2022 11:11:12 +0100, Paul <nospam@needed.invalid> wrote:

> On 4/9/2022 8:43 PM, Commander Kinsey wrote:
>
>> But given a certain task, isn't it possible you have to use FP64?
>> You need the precision. And I use older 4:1 cards which are
>> very suitable. Modern cards are a piece of shit, especially Nvidia (spit!).
>
> There are expensive cards with 1:2 and that's how they
> can price them in the thousands per card.
>
> It's common for high tech companies to "ration" stuff
> that does not need rationing.

Fuck paying £1000 for a 1:2 when I can get an old 1:4 for £50.

Re: How can I tell if a GPU is using double precision?

<cmb5K.238340$41E7.76648@fx37.iad>


https://www.novabbs.com/computers/article-flat.php?id=61612&group=alt.comp.os.windows-10#61612

Path: i2pn2.org!i2pn.org!aioe.org!news.uzoreto.com!newsfeed.xs4all.nl!newsfeed8.news.xs4all.nl!news-out.netnews.com!news.alt.net!fdc2.netnews.com!peer01.ams1!peer.ams1.xlned.com!news.xlned.com!peer01.iad!feed-me.highwinds-media.com!news.highwinds-media.com!fx37.iad.POSTED!not-for-mail
MIME-Version: 1.0
User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:91.0) Gecko/20100101
Thunderbird/91.7.0
Subject: Re: How can I tell if a GPU is using double precision?
Content-Language: en-US
Newsgroups: alt.computer.workshop,alt.comp.os.windows-10,alt.comp.os.windows-11,alt.comp.freeware
References: <op.1kcsnhi5mvhs6z@ryzen.lan>
From: nonegi...@att.net (Bill)
In-Reply-To: <op.1kcsnhi5mvhs6z@ryzen.lan>
Content-Type: text/plain; charset=UTF-8; format=flowed
Content-Transfer-Encoding: 7bit
X-Antivirus: Avast (VPS 220411-2, 4/11/2022), Outbound message
X-Antivirus-Status: Clean
Lines: 19
Message-ID: <cmb5K.238340$41E7.76648@fx37.iad>
X-Complaints-To: https://www.astraweb.com/aup
NNTP-Posting-Date: Tue, 12 Apr 2022 09:10:32 UTC
Date: Tue, 12 Apr 2022 05:10:31 -0400
X-Received-Bytes: 2045
 by: Bill - Tue, 12 Apr 2022 09:10 UTC

On 4/9/2022 6:27 AM, Commander Kinsey wrote:
> How can I tell if a GPU is using double precision? Some sort of
> measurement of processing and whether it's single, double, or a mixture.

You received a lot of good responses. But, to a programmer, the obvious
answer is to examine the machine code. Intel publishes a list of all of
the "assembly language and machine instructions" that each of its CPUs
knows how to run, because, in a nutshell, running those instructions is
ALL a CPU knows how to do. A purist might say that I am omitting
"microcode".

If you took a course in Assembly language (a topic which doesn't receive
anywhere near as much attention as it used to), this would be second
nature to you. When you have time, at least take a "look" at the
instructions I mentioned. In Intel's manuals, the instructions are
described in so much detail (all of the "flags" that an instruction may
set, for instance) that the amount of detail involved is simply
incredible (maybe 2 pages, say, for one instruction... I haven't had the
need to look in a while).
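
A tiny, hypothetical illustration of that point (my own sketch, not
Intel's wording): compile the two functions below on x86-64 with, say,
gcc -O2 -S, and the listing shows mulss for the single-precision
multiply and mulsd for the double-precision one, so the machine code
itself tells you which width is in use.

#include <stdio.h>

float  scale_f(float x)  { return x * 1.5f; }  /* emitted as mulss (scalar single) */
double scale_d(double x) { return x * 1.5;  }  /* emitted as mulsd (scalar double) */

int main(void)
{
    printf("%f %f\n", scale_f(2.0f), scale_d(2.0));
    return 0;
}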

Re: How can I tell if a GPU is using double precision?

<op.1kh9p9hqmvhs6z@ryzen.lan>


https://www.novabbs.com/computers/article-flat.php?id=61613&group=alt.comp.os.windows-10#61613

Path: i2pn2.org!i2pn.org!paganini.bofh.team!news.fcku.it!news.uzoreto.com!news-out.netnews.com!news.alt.net!fdc2.netnews.com!peer01.ams1!peer.ams1.xlned.com!news.xlned.com!fx10.ams1.POSTED!not-for-mail
Content-Type: text/plain; charset=iso-8859-15; format=flowed; delsp=yes
Newsgroups: alt.computer.workshop,alt.comp.os.windows-10,alt.comp.os.windows-11,alt.comp.freeware
Subject: Re: How can I tell if a GPU is using double precision?
References: <op.1kcsnhi5mvhs6z@ryzen.lan> <cmb5K.238340$41E7.76648@fx37.iad>
MIME-Version: 1.0
Content-Transfer-Encoding: 7bit
From: CK1...@nospam.com (Commander Kinsey)
Message-ID: <op.1kh9p9hqmvhs6z@ryzen.lan>
User-Agent: Opera Mail/1.0 (Win32)
X-Antivirus: AVG (VPS 220411-2, 11/4/2022), Outbound message
X-Antivirus-Status: Clean
Lines: 23
X-Complaints-To: abuse(at)newshosting.com
NNTP-Posting-Date: Tue, 12 Apr 2022 09:24:00 UTC
Organization: Newshosting.com - Highest quality at a great price! www.newshosting.com
Date: Tue, 12 Apr 2022 10:23:59 +0100
X-Received-Bytes: 2303
 by: Commander Kinsey - Tue, 12 Apr 2022 09:23 UTC

On Tue, 12 Apr 2022 10:10:31 +0100, Bill <nonegiven@att.net> wrote:

> On 4/9/2022 6:27 AM, Commander Kinsey wrote:
>> How can I tell if a GPU is using double precision? Some sort of
>> measurement of processing and whether it's single, double, or a mixture.
>
> You received a lot of good responses. But, to a programmer, the obvious
> answer is to examine the machine code. Intel publishes list of all of
> the "assembly language and machine instructions" that each of it's CPUs
> know how to run--because, in a nutshell, that's ALL a CPU know how to do
> is to run those instructions. A purist might say that I am omitting
> "microcode".
>
> If you took a course in Assembly language (a topic which doesn't receive
> anywhere near as much as it used to), this would be second nature to
> you. When you have time, at least take a "look" at the instructions I
> mentioned. In Intel's manuals, the instructions are described in so much
> detail (all of the "flags" that an instruction may set for instance)
> that it is simply incredible the amount of detail that is involved
> (maybe 2 pages, say, for one instruction...I haven't had the need to
> look in a while).

You're all answering the wrong question. I'm not a programmer. I've nothing to do with the program running. But I want (as a user) to know if it's using DP or SP and in what proportion, as it affects my choice of GPU purchase in the future.

Re: How can I tell if a GPU is using double precision?

<LIg5K.67395$Kdf.29834@fx96.iad>


https://www.novabbs.com/computers/article-flat.php?id=61619&group=alt.comp.os.windows-10#61619

Path: i2pn2.org!i2pn.org!weretis.net!feeder8.news.weretis.net!feeds.phibee-telecom.net!newsfeed.xs4all.nl!newsfeed8.news.xs4all.nl!news-out.netnews.com!news.alt.net!fdc2.netnews.com!peer01.ams1!peer.ams1.xlned.com!news.xlned.com!peer02.iad!feed-me.highwinds-media.com!news.highwinds-media.com!fx96.iad.POSTED!not-for-mail
MIME-Version: 1.0
User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:91.0) Gecko/20100101
Thunderbird/91.7.0
Subject: Re: How can I tell if a GPU is using double precision?
Content-Language: en-US
Newsgroups: alt.computer.workshop,alt.comp.os.windows-10,alt.comp.os.windows-11,alt.comp.freeware
References: <op.1kcsnhi5mvhs6z@ryzen.lan> <cmb5K.238340$41E7.76648@fx37.iad>
<op.1kh9p9hqmvhs6z@ryzen.lan>
From: nonegi...@att.net (Bill)
In-Reply-To: <op.1kh9p9hqmvhs6z@ryzen.lan>
Content-Type: text/plain; charset=UTF-8; format=flowed
Content-Transfer-Encoding: 8bit
X-Antivirus: Avast (VPS 220412-3, 4/11/2022), Outbound message
X-Antivirus-Status: Clean
Lines: 37
Message-ID: <LIg5K.67395$Kdf.29834@fx96.iad>
X-Complaints-To: https://www.astraweb.com/aup
NNTP-Posting-Date: Tue, 12 Apr 2022 15:15:55 UTC
Date: Tue, 12 Apr 2022 11:15:54 -0400
X-Received-Bytes: 3162
 by: Bill - Tue, 12 Apr 2022 15:15 UTC

On 4/12/2022 5:23 AM, Commander Kinsey wrote:
> On Tue, 12 Apr 2022 10:10:31 +0100, Bill <nonegiven@att.net> wrote:
>
>> On 4/9/2022 6:27 AM, Commander Kinsey wrote:
>>> How can I tell if a GPU is using double precision? Some sort of
>>> measurement of processing and whether it's single, double, or a mixture.
>>
>> You received a lot of good responses. But, to a programmer, the obvious
>> answer is to examine the machine code.  Intel publishes list of all of
>> the "assembly language and machine instructions" that each of it's CPUs
>> know how to run--because, in a nutshell, that's ALL a CPU know how to do
>> is to run those instructions.  A purist might say that I am omitting
>> "microcode".
>>
>> If you took a course in Assembly language (a topic which doesn't receive
>>   anywhere near as much as it used to), this would be second nature to
>> you.  When you have time, at least take a "look" at the instructions I
>> mentioned. In Intel's manuals, the instructions are described in so much
>> detail (all of the "flags" that an instruction may set for instance)
>> that it is simply incredible the amount of detail that is involved
>> (maybe 2 pages, say, for one instruction...I haven't had the need to
>> look in a while).
>
> You're all answering the wrong question.  I'm not a programmer.  I've
> nothing to do with the program running.  But I want (as a user) to know
> if it's using DP or SP and in what proportion, as it affects my choice
> of GPU purchase in the future.

How so? I don't think modern GPUs rely on the CPU to assist them with
tedious graphics-related calculations of the sort you are describing.
If you wish to compare GPUs, compare the "benchmark" results; that is
precisely what those numbers are for. Of course, the "driver" is a
critical part of the GPU (system) too! With a poor driver, it doesn't
matter how "good" your CPU or GPU is. Established performance results
are the only thing you can depend on, and of course your motherboard
and memory are components of the system too, and those are sometimes
documented along with benchmark results.

Re: How can I tell if a GPU is using double precision?

<qTg5K.151775$dln7.13181@fx03.iad>


https://www.novabbs.com/computers/article-flat.php?id=61620&group=alt.comp.os.windows-10#61620

Path: i2pn2.org!i2pn.org!aioe.org!news.uzoreto.com!npeer.as286.net!npeer-ng0.as286.net!peer01.ams1!peer.ams1.xlned.com!news.xlned.com!peer03.iad!feed-me.highwinds-media.com!news.highwinds-media.com!fx03.iad.POSTED!not-for-mail
MIME-Version: 1.0
User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:91.0) Gecko/20100101
Thunderbird/91.7.0
Subject: Re: How can I tell if a GPU is using double precision?
Content-Language: en-US
Newsgroups: alt.computer.workshop,alt.comp.os.windows-10,alt.comp.os.windows-11,alt.comp.freeware
References: <op.1kcsnhi5mvhs6z@ryzen.lan> <cmb5K.238340$41E7.76648@fx37.iad>
<op.1kh9p9hqmvhs6z@ryzen.lan>
From: nonegi...@att.net (Bill)
In-Reply-To: <op.1kh9p9hqmvhs6z@ryzen.lan>
Content-Type: text/plain; charset=UTF-8; format=flowed
Content-Transfer-Encoding: 8bit
X-Antivirus: Avast (VPS 220412-3, 4/11/2022), Outbound message
X-Antivirus-Status: Clean
Lines: 36
Message-ID: <qTg5K.151775$dln7.13181@fx03.iad>
X-Complaints-To: https://www.astraweb.com/aup
NNTP-Posting-Date: Tue, 12 Apr 2022 15:27:18 UTC
Date: Tue, 12 Apr 2022 11:27:18 -0400
X-Received-Bytes: 3104
 by: Bill - Tue, 12 Apr 2022 15:27 UTC

On 4/12/2022 5:23 AM, Commander Kinsey wrote:
> On Tue, 12 Apr 2022 10:10:31 +0100, Bill <nonegiven@att.net> wrote:
>
>> On 4/9/2022 6:27 AM, Commander Kinsey wrote:
>>> How can I tell if a GPU is using double precision? Some sort of
>>> measurement of processing and whether it's single, double, or a mixture.
>>
>> You received a lot of good responses. But, to a programmer, the obvious
>> answer is to examine the machine code.  Intel publishes list of all of
>> the "assembly language and machine instructions" that each of it's CPUs
>> know how to run--because, in a nutshell, that's ALL a CPU know how to do
>> is to run those instructions.  A purist might say that I am omitting
>> "microcode".
>>
>> If you took a course in Assembly language (a topic which doesn't receive
>>   anywhere near as much as it used to), this would be second nature to
>> you.  When you have time, at least take a "look" at the instructions I
>> mentioned. In Intel's manuals, the instructions are described in so much
>> detail (all of the "flags" that an instruction may set for instance)
>> that it is simply incredible the amount of detail that is involved
>> (maybe 2 pages, say, for one instruction...I haven't had the need to
>> look in a while).
>
> You're all answering the wrong question.  I'm not a programmer.  I've
> nothing to do with the program running.  But I want (as a user) to know
> if it's using DP or SP and in what proportion, as it affects my choice
> of GPU purchase in the future.

I have a hunch it's using two's complement (i.e. integer operations),
but like I said in my other post, you shouldn't care what system it is
using as long as it gives you the performance you want. If you compare
the specifications of a GPU, they will give you all sorts of criteria to
consider. Besides things like the number of "Vcores", in a given
category the power rating is the most telling; but with power comes heat.
So, for instance, I'd go with the RTX 3070 at 220 W over the RTX 3070 Ti
at 290 W, even though the latter is a little faster.

Re: How can I tell if a GPU is using double precision?

<t348rf$mcf$1@dont-email.me>


https://www.novabbs.com/computers/article-flat.php?id=61621&group=alt.comp.os.windows-10#61621

Path: i2pn2.org!i2pn.org!eternal-september.org!reader02.eternal-september.org!.POSTED!not-for-mail
From: ithink...@gmail.com (Chris)
Newsgroups: alt.computer.workshop,alt.comp.os.windows-10,alt.comp.os.windows-11,alt.comp.freeware
Subject: Re: How can I tell if a GPU is using double precision?
Date: Tue, 12 Apr 2022 17:19:58 +0100
Organization: A noiseless patient Spider
Lines: 35
Message-ID: <t348rf$mcf$1@dont-email.me>
References: <op.1kcsnhi5mvhs6z@ryzen.lan> <cmb5K.238340$41E7.76648@fx37.iad>
<op.1kh9p9hqmvhs6z@ryzen.lan>
Mime-Version: 1.0
Content-Type: text/plain; charset=UTF-8; format=flowed
Content-Transfer-Encoding: 8bit
Injection-Date: Tue, 12 Apr 2022 16:19:59 -0000 (UTC)
Injection-Info: reader02.eternal-september.org; posting-host="212a0e133a13be3f29ed6d2f9e74580c";
logging-data="22927"; mail-complaints-to="abuse@eternal-september.org"; posting-account="U2FsdGVkX1/qPKA4ZrwW0d5VAXfQI7CIHoJ3BLs+bL4="
User-Agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10.15; rv:91.0)
Gecko/20100101 Thunderbird/91.7.0
Cancel-Lock: sha1:69ZzZLzX+OTnPW8xbRpZVgp1HoU=
In-Reply-To: <op.1kh9p9hqmvhs6z@ryzen.lan>
Content-Language: en-GB
 by: Chris - Tue, 12 Apr 2022 16:19 UTC

On 12/04/2022 10:23, Commander Kinsey wrote:
> On Tue, 12 Apr 2022 10:10:31 +0100, Bill <nonegiven@att.net> wrote:
>
>> On 4/9/2022 6:27 AM, Commander Kinsey wrote:
>>> How can I tell if a GPU is using double precision? Some sort of
>>> measurement of processing and whether it's single, double, or a mixture.
>>
>> You received a lot of good responses. But, to a programmer, the obvious
>> answer is to examine the machine code.  Intel publishes list of all of
>> the "assembly language and machine instructions" that each of it's CPUs
>> know how to run--because, in a nutshell, that's ALL a CPU know how to do
>> is to run those instructions.  A purist might say that I am omitting
>> "microcode".
>>
>> If you took a course in Assembly language (a topic which doesn't receive
>>   anywhere near as much as it used to), this would be second nature to
>> you.  When you have time, at least take a "look" at the instructions I
>> mentioned. In Intel's manuals, the instructions are described in so much
>> detail (all of the "flags" that an instruction may set for instance)
>> that it is simply incredible the amount of detail that is involved
>> (maybe 2 pages, say, for one instruction...I haven't had the need to
>> look in a while).
>
> You're all answering the wrong question.

That's because your question doesn't make sense.

> I'm not a programmer.  I've
> nothing to do with the program running.  But I want (as a user) to know
> if it's using DP or SP and in what proportion, as it affects my choice
> of GPU purchase in the future.

If the program requires specific hardware to run, the developer will
tell you. If there's no such requirement, then it is irrelevant. None of
this has anything to do with DP or SP.

Re: How can I tell if a GPU is using double precision?

<t34v0p$a2c$1@dont-email.me>


https://www.novabbs.com/computers/article-flat.php?id=61630&group=alt.comp.os.windows-10#61630

Path: i2pn2.org!i2pn.org!eternal-september.org!reader02.eternal-september.org!.POSTED!not-for-mail
From: nos...@needed.invalid (Paul)
Newsgroups: alt.computer.workshop,alt.comp.os.windows-10,alt.comp.os.windows-11,alt.comp.freeware
Subject: Re: How can I tell if a GPU is using double precision?
Date: Tue, 12 Apr 2022 18:38:17 -0400
Organization: A noiseless patient Spider
Lines: 53
Message-ID: <t34v0p$a2c$1@dont-email.me>
References: <op.1kcsnhi5mvhs6z@ryzen.lan> <cmb5K.238340$41E7.76648@fx37.iad>
<op.1kh9p9hqmvhs6z@ryzen.lan> <qTg5K.151775$dln7.13181@fx03.iad>
Mime-Version: 1.0
Content-Type: text/plain; charset=utf-8; format=flowed
Content-Transfer-Encoding: 8bit
Injection-Date: Tue, 12 Apr 2022 22:38:17 -0000 (UTC)
Injection-Info: reader02.eternal-september.org; posting-host="5ab84ae0d0a55c83f15ef79fb81b8b02";
logging-data="10316"; mail-complaints-to="abuse@eternal-september.org"; posting-account="U2FsdGVkX19GVLXtIIWvqMCCeinIW9cWdwh04SWU0Lg="
User-Agent: Ratcatcher/2.0.0.25 (Windows/20130802)
Cancel-Lock: sha1:j/3I01bD/VwwUGCNDcr8C8pQ3Gw=
In-Reply-To: <qTg5K.151775$dln7.13181@fx03.iad>
Content-Language: en-US
 by: Paul - Tue, 12 Apr 2022 22:38 UTC

On 4/12/2022 11:27 AM, Bill wrote:
> On 4/12/2022 5:23 AM, Commander Kinsey wrote:
>> On Tue, 12 Apr 2022 10:10:31 +0100, Bill <nonegiven@att.net> wrote:
>>
>>> On 4/9/2022 6:27 AM, Commander Kinsey wrote:
>>>> How can I tell if a GPU is using double precision? Some sort of
>>>> measurement of processing and whether it's single, double, or a mixture.
>>>
>>> You received a lot of good responses. But, to a programmer, the obvious
>>> answer is to examine the machine code.  Intel publishes list of all of
>>> the "assembly language and machine instructions" that each of it's CPUs
>>> know how to run--because, in a nutshell, that's ALL a CPU know how to do
>>> is to run those instructions.  A purist might say that I am omitting
>>> "microcode".
>>>
>>> If you took a course in Assembly language (a topic which doesn't receive
>>>   anywhere near as much as it used to), this would be second nature to
>>> you.  When you have time, at least take a "look" at the instructions I
>>> mentioned. In Intel's manuals, the instructions are described in so much
>>> detail (all of the "flags" that an instruction may set for instance)
>>> that it is simply incredible the amount of detail that is involved
>>> (maybe 2 pages, say, for one instruction...I haven't had the need to
>>> look in a while).
>>
>> You're all answering the wrong question.  I'm not a programmer.  I've nothing to do with the program running.  But I want (as a user) to know if it's using DP or SP and in what proportion, as it affects my choice of GPU purchase in the future.
>
> I have a hunch it's using two's complement (i.e. integer operations), but like I said in my other post, you shouldn't care what system it is using as long as it gives you the performance you want.  If you compare
> the specifications of a GPU, they will give you all sorts of criteria to consider. Besides for things like the number of "Vcores", in a given
> category "voltage" is the most telling; but with voltage comes heat.
> So for instance, I'd go with the RTX 3070 at 220v over the RTX-3070ti at 290v, even though the latter is a little faster.

https://www.int.washington.edu/PROGRAMS/12-2c/week3/clark_02.pdf Page 11

32 CUDA Cores per SM
32 fp32 ops/clock \
16 fp64 ops/clock \___ Ratios found on expensive cards
32 int32 ops/clock /

When you write your original code, before it is compiled, that
determines whether an fp32 or an fp64 would be used.

My card from NVidia has awful ratios, like 1:32 instead
of the 1:2 shown in the above table. It makes you wonder how
the dispatcher knows a particular piece of code belongs
on a particular logic block.

https://www.techpowerup.com/gpu-specs/geforce-gtx-1080.c2839

"FP32 (float) performance 8.873 TFLOPS

FP64 (double) performance 277.3 GFLOPS (1:32)"
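
As a throwaway sanity check of that 1:32 figure (just a sketch, nothing
more): peak FP64 is simply peak FP32 divided by the ratio, and it lands
on the same 277.3 GFLOPS the spec page quotes.

#include <stdio.h>

int main(void)
{
    double fp32_tflops = 8.873;   /* GTX 1080 FP32 peak, from the quote above */
    double ratio       = 32.0;    /* FP64:FP32 = 1:32 */
    printf("FP64 peak ~ %.1f GFLOPS\n", fp32_tflops / ratio * 1000.0);
    return 0;
}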

Paul

Re: How can I tell if a GPU is using double precision?

<op.1kjrpfujmvhs6z@ryzen.lan>


https://www.novabbs.com/computers/article-flat.php?id=61639&group=alt.comp.os.windows-10#61639

Path: i2pn2.org!i2pn.org!weretis.net!feeder8.news.weretis.net!feeder1.feed.usenet.farm!feed.usenet.farm!news-out.netnews.com!news.alt.net!fdc2.netnews.com!peer03.ams1!peer.ams1.xlned.com!news.xlned.com!fx01.ams1.POSTED!not-for-mail
Content-Type: text/plain; charset=iso-8859-15; format=flowed; delsp=yes
Newsgroups: alt.computer.workshop,alt.comp.os.windows-10,alt.comp.os.windows-11,alt.comp.freeware
Subject: Re: How can I tell if a GPU is using double precision?
References: <op.1kcsnhi5mvhs6z@ryzen.lan> <cmb5K.238340$41E7.76648@fx37.iad>
<op.1kh9p9hqmvhs6z@ryzen.lan> <LIg5K.67395$Kdf.29834@fx96.iad>
MIME-Version: 1.0
Content-Transfer-Encoding: 7bit
From: CK1...@nospam.com (Commander Kinsey)
Message-ID: <op.1kjrpfujmvhs6z@ryzen.lan>
User-Agent: Opera Mail/1.0 (Win32)
X-Antivirus: AVG (VPS 220412-4, 12/4/2022), Outbound message
X-Antivirus-Status: Clean
Lines: 41
X-Complaints-To: abuse(at)newshosting.com
NNTP-Posting-Date: Wed, 13 Apr 2022 04:49:54 UTC
Organization: Newshosting.com - Highest quality at a great price! www.newshosting.com
Date: Wed, 13 Apr 2022 05:49:53 +0100
X-Received-Bytes: 3390
 by: Commander Kinsey - Wed, 13 Apr 2022 04:49 UTC

On Tue, 12 Apr 2022 16:15:54 +0100, Bill <nonegiven@att.net> wrote:

> On 4/12/2022 5:23 AM, Commander Kinsey wrote:
>> On Tue, 12 Apr 2022 10:10:31 +0100, Bill <nonegiven@att.net> wrote:
>>
>>> On 4/9/2022 6:27 AM, Commander Kinsey wrote:
>>>> How can I tell if a GPU is using double precision? Some sort of
>>>> measurement of processing and whether it's single, double, or a mixture.
>>>
>>> You received a lot of good responses. But, to a programmer, the obvious
>>> answer is to examine the machine code. Intel publishes list of all of
>>> the "assembly language and machine instructions" that each of it's CPUs
>>> know how to run--because, in a nutshell, that's ALL a CPU know how to do
>>> is to run those instructions. A purist might say that I am omitting
>>> "microcode".
>>>
>>> If you took a course in Assembly language (a topic which doesn't receive
>>> anywhere near as much as it used to), this would be second nature to
>>> you. When you have time, at least take a "look" at the instructions I
>>> mentioned. In Intel's manuals, the instructions are described in so much
>>> detail (all of the "flags" that an instruction may set for instance)
>>> that it is simply incredible the amount of detail that is involved
>>> (maybe 2 pages, say, for one instruction...I haven't had the need to
>>> look in a while).
>>
>> You're all answering the wrong question. I'm not a programmer. I've
>> nothing to do with the program running. But I want (as a user) to know
>> if it's using DP or SP and in what proportion, as it affects my choice
>> of GPU purchase in the future.
>
> How so? I don't think modern GPUs rely on the CPU to assist them with
> tedious graphics-related calculations of the sort you are describing.
> If you wish to compare GPUs, compare the "benchmark" results, that is
> precisely what those numbers are for. Of course, the "driver" is a
> critical part of the GPU (system) too! With a poor driver, it doesn't
> matter how "good" your CPU or GPU is. Established performance results
> are the only thing that you can depend on, and of course your
> motherboard and memory are a component of the system too, and those
> are sometimes documented along with benchmark results.

You still don't understand. I know what DP and SP each card does. But I don't know what type the program is using! I want some kind of task manager graph to show current usage separated into DP and SP.

Re: How can I tell if a GPU is using double precision?

<op.1kjyrcanmvhs6z@ryzen.lan>


https://www.novabbs.com/computers/article-flat.php?id=61640&group=alt.comp.os.windows-10#61640

Path: i2pn2.org!i2pn.org!weretis.net!feeder8.news.weretis.net!news.uzoreto.com!news-out.netnews.com!news.alt.net!fdc2.netnews.com!peer02.ams1!peer.ams1.xlned.com!news.xlned.com!fx13.ams1.POSTED!not-for-mail
Content-Type: text/plain; charset=iso-8859-15; format=flowed; delsp=yes
Newsgroups: alt.computer.workshop,alt.comp.os.windows-10,alt.comp.os.windows-11,alt.comp.freeware
Subject: Re: How can I tell if a GPU is using double precision?
References: <op.1kcsnhi5mvhs6z@ryzen.lan> <cmb5K.238340$41E7.76648@fx37.iad>
<op.1kh9p9hqmvhs6z@ryzen.lan> <qTg5K.151775$dln7.13181@fx03.iad>
MIME-Version: 1.0
Content-Transfer-Encoding: 7bit
From: CK1...@nospam.com (Commander Kinsey)
Message-ID: <op.1kjyrcanmvhs6z@ryzen.lan>
User-Agent: Opera Mail/1.0 (Win32)
X-Antivirus: AVG (VPS 220412-4, 12/4/2022), Outbound message
X-Antivirus-Status: Clean
Lines: 42
X-Complaints-To: abuse(at)newshosting.com
NNTP-Posting-Date: Wed, 13 Apr 2022 07:22:16 UTC
Organization: Newshosting.com - Highest quality at a great price! www.newshosting.com
Date: Wed, 13 Apr 2022 08:22:14 +0100
X-Received-Bytes: 3571
 by: Commander Kinsey - Wed, 13 Apr 2022 07:22 UTC

On Tue, 12 Apr 2022 16:27:18 +0100, Bill <nonegiven@att.net> wrote:

> On 4/12/2022 5:23 AM, Commander Kinsey wrote:
>> On Tue, 12 Apr 2022 10:10:31 +0100, Bill <nonegiven@att.net> wrote:
>>
>>> On 4/9/2022 6:27 AM, Commander Kinsey wrote:
>>>> How can I tell if a GPU is using double precision? Some sort of
>>>> measurement of processing and whether it's single, double, or a mixture.
>>>
>>> You received a lot of good responses. But, to a programmer, the obvious
>>> answer is to examine the machine code. Intel publishes list of all of
>>> the "assembly language and machine instructions" that each of it's CPUs
>>> know how to run--because, in a nutshell, that's ALL a CPU know how to do
>>> is to run those instructions. A purist might say that I am omitting
>>> "microcode".
>>>
>>> If you took a course in Assembly language (a topic which doesn't receive
>>> anywhere near as much as it used to), this would be second nature to
>>> you. When you have time, at least take a "look" at the instructions I
>>> mentioned. In Intel's manuals, the instructions are described in so much
>>> detail (all of the "flags" that an instruction may set for instance)
>>> that it is simply incredible the amount of detail that is involved
>>> (maybe 2 pages, say, for one instruction...I haven't had the need to
>>> look in a while).
>>
>> You're all answering the wrong question. I'm not a programmer. I've
>> nothing to do with the program running. But I want (as a user) to know
>> if it's using DP or SP and in what proportion, as it affects my choice
>> of GPU purchase in the future.
>
> I have a hunch it's using two's complement (i.e. integer operations),
> but like I said in my other post, you shouldn't care what system it is
> using as long as it gives you the performance you want. If you compare
> the specifications of a GPU, they will give you all sorts of criteria to
> consider. Besides for things like the number of "Vcores", in a given
> category "voltage" is the most telling; but with voltage comes heat.
> So for instance, I'd go with the RTX 3070 at 220v over the RTX-3070ti at
> 290v, even though the latter is a little faster.

A modern mainstream Nvidia GPU is 32:1. A modern mainstream GPU from AMD is 16:1. The old AMDs I buy are 4:1. I need to know how much DP and SP a particular program is doing so I can choose what card to buy next.

Milkyway@Home is pretty much just DP. Einstein@Home and Folding@Home I think are about 1/4 DP and 3/4 SP. But I'm guessing those numbers by comparing how fast they run on different cards. I'd love to have a precise figure.

Re: How can I tell if a GPU is using double precision?

<t36mfi$1ae$1@dont-email.me>


https://www.novabbs.com/computers/article-flat.php?id=61642&group=alt.comp.os.windows-10#61642

Path: i2pn2.org!i2pn.org!eternal-september.org!reader02.eternal-september.org!.POSTED!not-for-mail
From: nos...@needed.invalid (Paul)
Newsgroups: alt.computer.workshop,alt.comp.os.windows-10,alt.comp.os.windows-11,alt.comp.freeware
Subject: Re: How can I tell if a GPU is using double precision?
Date: Wed, 13 Apr 2022 10:24:50 -0400
Organization: A noiseless patient Spider
Lines: 108
Message-ID: <t36mfi$1ae$1@dont-email.me>
References: <op.1kcsnhi5mvhs6z@ryzen.lan> <cmb5K.238340$41E7.76648@fx37.iad>
<op.1kh9p9hqmvhs6z@ryzen.lan> <qTg5K.151775$dln7.13181@fx03.iad>
<op.1kjyrcanmvhs6z@ryzen.lan>
Mime-Version: 1.0
Content-Type: text/plain; charset=iso-8859-15; format=flowed
Content-Transfer-Encoding: 8bit
Injection-Date: Wed, 13 Apr 2022 14:24:50 -0000 (UTC)
Injection-Info: reader02.eternal-september.org; posting-host="5ab84ae0d0a55c83f15ef79fb81b8b02";
logging-data="1358"; mail-complaints-to="abuse@eternal-september.org"; posting-account="U2FsdGVkX1+VXfWGYFl15Un8+Vy1q7r31Dg20x/vO6E="
User-Agent: Ratcatcher/2.0.0.25 (Windows/20130802)
Cancel-Lock: sha1:4xUHfHlZpgrDdmcQ6WTqNTRROOE=
In-Reply-To: <op.1kjyrcanmvhs6z@ryzen.lan>
Content-Language: en-US
 by: Paul - Wed, 13 Apr 2022 14:24 UTC

On 4/13/2022 3:22 AM, Commander Kinsey wrote:
> On Tue, 12 Apr 2022 16:27:18 +0100, Bill <nonegiven@att.net> wrote:
>
>> On 4/12/2022 5:23 AM, Commander Kinsey wrote:
>>> On Tue, 12 Apr 2022 10:10:31 +0100, Bill <nonegiven@att.net> wrote:
>>>
>>>> On 4/9/2022 6:27 AM, Commander Kinsey wrote:
>>>>> How can I tell if a GPU is using double precision? Some sort of
>>>>> measurement of processing and whether it's single, double, or a mixture.
>>>>
>>>> You received a lot of good responses. But, to a programmer, the obvious
>>>> answer is to examine the machine code.  Intel publishes list of all of
>>>> the "assembly language and machine instructions" that each of it's CPUs
>>>> know how to run--because, in a nutshell, that's ALL a CPU know how to do
>>>> is to run those instructions.  A purist might say that I am omitting
>>>> "microcode".
>>>>
>>>> If you took a course in Assembly language (a topic which doesn't receive
>>>>   anywhere near as much as it used to), this would be second nature to
>>>> you.  When you have time, at least take a "look" at the instructions I
>>>> mentioned. In Intel's manuals, the instructions are described in so much
>>>> detail (all of the "flags" that an instruction may set for instance)
>>>> that it is simply incredible the amount of detail that is involved
>>>> (maybe 2 pages, say, for one instruction...I haven't had the need to
>>>> look in a while).
>>>
>>> You're all answering the wrong question.  I'm not a programmer.  I've
>>> nothing to do with the program running.  But I want (as a user) to know
>>> if it's using DP or SP and in what proportion, as it affects my choice
>>> of GPU purchase in the future.
>>
>> I have a hunch it's using two's complement (i.e. integer operations),
>> but like I said in my other post, you shouldn't care what system it is
>> using as long as it gives you the performance you want.  If you compare
>> the specifications of a GPU, they will give you all sorts of criteria to
>> consider. Besides for things like the number of "Vcores", in a given
>> category "voltage" is the most telling; but with voltage comes heat.
>> So for instance, I'd go with the RTX 3070 at 220v over the RTX-3070ti at
>> 290v, even though the latter is a little faster.
>
> A modern mainstream Nvidia GPU is 32:1.  A modern mainstream GPU from AMD is 16:1.  The old AMDs I buy are 4:1.  I need to know how much DP and SP a particular program is doing so I can choose what card to buy next.
>
> Milkyway@Home is pretty much just DP.  Einstein@Home and Folding@Home I think are about 1/4 DP and 3/4 SP.  But I'm guessing those numbers by comparing how fast they run on different cards.  I'd love to have a precise figure.

A skim through a table of cards shows there aren't
a lot of particularly good ones. A couple of Titans; then,
to get anywhere close to amazing, you have to buy an entire
box full of hardware and mezzanine cards.

*******

The closest thing I could find is NSight running in Visual Studio.
There is a swath labeled "Compute", but who knows what that means.

https://pbs.twimg.com/media/DpJeUz0X4AADpcj?format=jpg&name=medium

It's hard to say what kind of build you need to collect stats,
whether Debug or Release. I couldn't get it to "trigger" and capture
anything here, using Release code. And my Debug build didn't finish.

*******

The metric mentioned here isn't very interesting. It mentions
that when doing DP, the efficiency cannot be higher than 3%
on a card like mine. So your "dynamic range" when watching
what is going on would be between 0% and 3%.

https://forums.developer.nvidia.com/t/fp-efficiency-and-utilization/164693

"In the metrics, I see two FP related stats:

flop_sp_efficiency
smsp__sass_thread_inst_executed_ops_fadd_fmul_ffma_pred_on.avg.pct_of_peak_sustained_elapsed

single_precision_fu_utilization
smsp__pipe_fma_cycles_active.avg.pct_of_peak_sustained_active
"

You can bench the FLOPs with CUDA-Z, but then you don't know
to what extent the MilkyWay run uses that capability. At least,
not unless you could get at those metrics.

https://phoenixnap.dl.sourceforge.net/project/cuda-z/cuda-z/0.10/CUDA-Z-0.10.251-64bit.exe

There is also something called NVPROF, but you can see in the
description this is hardly usable on an application you don't control.

https://stackoverflow.com/questions/55638254/flop-efficiency-in-cuda
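
If you can launch the application yourself under the profiler (a big
"if" here), the usual recipe from that thread is something like
nvprof --metrics flop_count_sp,flop_count_dp ./the_app, and the two
counters give you the SP/DP mix directly. The toy kernel below (my own
invention, just something to profile) has one FP32 and one FP64
operation per thread to show the idea.

#include <cuda_runtime.h>

__global__ void mix(float *a, double *b, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) {
        a[i] = a[i] * 2.0f + 1.0f;   /* shows up under flop_count_sp */
        b[i] = b[i] * 2.0  + 1.0;    /* shows up under flop_count_dp */
    }
}

int main(void)
{
    const int n = 1 << 20;
    float  *a = 0;
    double *b = 0;
    cudaMalloc(&a, n * sizeof(float));
    cudaMalloc(&b, n * sizeof(double));
    mix<<<(n + 255) / 256, 256>>>(a, b, n);
    cudaDeviceSynchronize();
    cudaFree(a);
    cudaFree(b);
    return 0;
}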

Summary: Get the MilkyWay developer to add some metric
to the control panel graphics. I can't tell from that discussion
whether some library has an API to extract the raw
counts so you would know. Since I don't have proof
a flop_dp_efficiency exists, it may still not be possible.
It makes you wonder how CUDA-Z does it, in the benching tab.

https://aws1.discourse-cdn.com/nvidia/optimized/3X/1/8/18913cd57df079a2e97edb72b6b9af6bfdf28047_2_690x462.png

CUDA-Z source should be in here. In cudainfo.cu, you
can see that no effort is made to check that the code
launched is 100% efficient, and there's an implicit
assumption the measured value is running the hardware
as hard as possible. Which might not be true.

https://sourceforge.net/p/cuda-z/code/HEAD/tree/

Paul

Re: How can I tell if a GPU is using double precision?

<1wzkrqe1vjr5m$.dlg@br-ederson.eternal-september.org>


https://www.novabbs.com/computers/article-flat.php?id=61643&group=alt.comp.os.windows-10#61643

Path: i2pn2.org!i2pn.org!eternal-september.org!reader02.eternal-september.org!br-ederson.eternal-september.org!.POSTED!not-for-mail
From: use.repl...@this.is.invalid (B. R. 'BeAr' Ederson)
Newsgroups: alt.computer.workshop,alt.comp.os.windows-10,alt.comp.os.windows-11,alt.comp.freeware
Subject: Re: How can I tell if a GPU is using double precision?
Date: Wed, 13 Apr 2022 16:50:43 +0200
Organization: A noiseless patient Spider
Lines: 21
Message-ID: <1wzkrqe1vjr5m$.dlg@br-ederson.eternal-september.org>
References: <op.1kcsnhi5mvhs6z@ryzen.lan> <cmb5K.238340$41E7.76648@fx37.iad> <op.1kh9p9hqmvhs6z@ryzen.lan> <LIg5K.67395$Kdf.29834@fx96.iad> <op.1kjrpfujmvhs6z@ryzen.lan>
Reply-To: br.ederson@arcor.de
Mime-Version: 1.0
Content-Type: text/plain; charset="us-ascii"
Content-Transfer-Encoding: 7bit
Injection-Info: br-ederson.eternal-september.org; posting-host="eb415c66225cde5e5cf538b25e91df50";
logging-data="14700"; mail-complaints-to="abuse@eternal-september.org"; posting-account="U2FsdGVkX1/zx6irk9Xmkv4GzBYoQwMiqkGLeVeSIME="
User-Agent: 40tude_Dialog/2.0.15.41 (b4e86176.224.456)
Cancel-Lock: sha1:wtevS/ajR2avdK6YcxwNGH4amOs=
 by: B. R. 'BeAr' Ederson - Wed, 13 Apr 2022 14:50 UTC

On Wed, 13 Apr 2022 05:49:53 +0100, Commander Kinsey wrote:

> I know what DP and SP each card does. But I don't know what type the
> program is using! I want some kind of task manager graph to show current
> usage seperated into DP and SP.

That's not something you can expect to see from a simple task manager
view or process monitor result. You'd need a profiler application like
Intel VTune or HPCToolkit:

https://www.intel.com/content/www/us/en/developer/tools/oneapi/vtune-profiler.html
http://hpctoolkit.org

Neither suite is for the faint of heart, though. They require detailed
knowledge to use, as well as to interpret the results...

BeAr
--
===========================================================================
= What do you mean with: "Perfection is always an illusion"? =
===============================================================--(Oops!)===
