Re: Synchronous Systems

https://www.novabbs.com/computers/article-flat.php?id=9859&group=comp.ai.philosophy#9859

Newsgroups: comp.ai.philosophy
Date: Sun, 24 Jul 2022 19:53:04 -0700 (PDT)
Subject: Re: Synchronous Systems
From: eagleson...@gmail.com (Douglas Eagleson)
Message-ID: <c777e2a6-1b79-446a-86d1-5690c61a760en@googlegroups.com>
In-Reply-To: <t9u2k6$3al8l$1@dont-email.me>
References: <fe4e7cf2-8849-4732-9414-ba34bc95ca8fn@googlegroups.com>
 <t7lqhm$2j8$1@dont-email.me> <t7qlqt$isa$1@dont-email.me> <724bc56a-2538-43d1-af12-03e698187c8an@googlegroups.com>
 <t7tt73$5lv$1@dont-email.me> <056c68bf-8636-4e70-87b4-e56bbdc32ba9n@googlegroups.com>
 <t7ubec$lnm$1@dont-email.me> <53b8834b-3da2-4bf6-a790-dabaf6119277n@googlegroups.com>
 <t9u2k6$3al8l$1@dont-email.me>
 by: Douglas Eagleson - Mon, 25 Jul 2022 02:53 UTC

On Monday, July 4, 2022 at 2:50:51 PM UTC+8, Jeff Barnett wrote:
> On 7/3/2022 10:29 PM, Douglas Eagleson wrote:
> > On Friday, June 10, 2022 at 10:48:49 AM UTC+8, Jeff Barnett wrote:
> >> On 6/9/2022 8:19 PM, Douglas Eagleson wrote:
> >>> On Friday, June 10, 2022 at 6:45:59 AM UTC+8, Jeff Barnett wrote:
> >>>> On 6/9/2022 3:32 PM, Douglas Eagleson wrote:
> >>>>> On Thursday, June 9, 2022 at 1:21:36 AM UTC+8, Jeff Barnett wrote:
> >>>>>> On 6/6/2022 3:11 PM, Jeff Barnett wrote:
> >>>>>>> On 6/6/2022 2:10 PM, Douglas Eagleson wrote:
> >>>>>>>> I am working on a project. How does a synchronous system get
> >>>>>>>> stated in my version of AI, old Greek theory? Such systems have
> >>>>>>>> two or more subsystems with a common state. Synchrony can be
> >>>>>>>> caused randomly/coincidentally or by a variable/human action.
> >>>>>>>> These point to an inferable AI relation.
> >>>>>>>>
> >>>>>>>> Two clocks can be made synchronous by a simple relative time.
> >>>>>>>> Actions at known times can make this relative time solvable,
> >>>>>>>> making synchrony an abstract relation.
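
A minimal sketch of the two-clock idea in C (the clock readings, the known
offset, and the tolerance are illustrative values, not anything from the
thread):

#include <stdio.h>
#include <math.h>

/* Two clocks are treated as synchronous relative to each other once their
   offset (the "simple relative time") is known; a tolerance, given as a
   percentage, allows for clock error. All values are illustrative. */
static int synchronous(double clock_a, double clock_b,
                       double offset, double tolerance_pct)
{
    double expected_b = clock_a + offset;        /* relative time      */
    double error = fabs(clock_b - expected_b);   /* observed deviation */
    return error <= fabs(expected_b) * tolerance_pct / 100.0;
}

int main(void)
{
    double a = 100.0, b = 103.1;   /* two clock readings  */
    double offset = 3.0;           /* known relative time */
    printf("synchronous: %s\n",
           synchronous(a, b, offset, 0.5) ? "yes" : "no");
    return 0;
}
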
> >>>>>>>
> >>>>>>> Warning: my spelling of names is probably not accurate.
> >>>>>>>
> >>>>>>> In the mid 1960s, Alan Perlis, then at CMU, was dissertation advisor to
> >>>>>>> two PhD students working in foundations of Computer Science vis à vis
> >>>>>>> program language semantics.
> >>>>>>>
> >>>>>>> One student was Tim Standish who wrote about data structure definition
> >>>>>>> primitives. One could use the proposed set of primitives to explain data
> >>>>>>> structure definition in your favorite languages. In other words, his
> >>>>>>> primitives could be used as a macro language to define the intent of
> >>>>>>> data declarations. This dissertation was noted by a big chunk of the CS
> >>>>>>> community who was, at the time, trying to develop better tools for
> >>>>>>> inventing and implementing new languages. Last I knew, Tim was at
> >>>>>>> University of California at Irvine.
> >>>>>>>
> >>>>>>> The other student was Bob(?) Fisher(?) and he did something that on the
> >>>>>>> surface sounded similar to Tim's work. The difference was that he
> >>>>>>> wanted primitives to define the meaning of /control/ structures. Not
> >>>>>>> only did he handle the usual (sequence, parallel, conditional, etc.) he
> >>>>>>> also dealt with sexier things such as atomic-with-respect-to,
> >>>>>>> wait-for-condition (join), indivisible-with-respect-to, priorities
> >>>>>>> (e.g., to model interrupts), and more. I think Bob(?) was at DARPA soon
> >>>>>>> after school and then disappeared into the wood work.
> >>>>>>>
> >>>>>>> I don't know how you might get a copy of Bob's dissertation but, if you
> >>>>>>> could, a whole panorama of interesting possibilities might be made
> >>>>>>> apparent to you and your endeavor.
> >>>>>>>
> >>>>>>> I'm sorry that I can't be more specific with references and citations
> >>>>>>> but my encounters with the individuals mentioned happened 50+ years ago.
> >>>>>> I did a little poking around and found a correct name: "Dave A. Fisher".
> >>>>>> His dissertation is also available online. Google "Fisher, Control
> >>>>>> Structures" and the first hit is a PDF at the pseudo URL
> >>>>>> "https://citeseers.ist.psu.edu>viewdoc>download". Just click on this
> >>>>>> item in the Google output and whatever your setup does for PDF will happen.
> >>>>>> --
> >>>>>> Jeff Barnett
> >>>>> I downloaded Fisher's dissertation. I need a while to read it.
> >>>>> Basically, my first look is at understanding the general/abstract
> >>>>> control structure.
> >>>>>
> >>>>> Is there a control structure definable using object theory?
> >>>>> Generalizing the meaning of its primitive.
> >>>> A short answer to your question is probably no but maybe. The issue is
> >>>> that control cliches define behavior, not "static" relations among data.
> >>>> The "maybe" comes from local nests of related behaviors as abstract
> >>>> objects then defining relations among these sorts of objects.
> >>>>
> >>>> It's been a long time since I read it so I can't rely on my memory for
> >>>> any real details. What I do remember is that it was a thrill to see a
> >>>> thesis take on such a difficult, abstract problem and get some of it
> >>>> right (IMO). There was nothing like it in the literature so it was a
> >>>> first hack at nailing down one of the most important aspects of
> >>>> computational systems and the whole notion of a computation.
> >>>> Unfortunately, this work was not followed by a second tier of research.
> >>>> --
> >>>> Jeff Barnett
> >>> I did a google search and found some later work of Fisher.
> >>> I believe he went to help with DOD on the foundations of
> >>> the ADA language.
> >> That sounds right. I bumped into him once after he was done at CMU. We
> >> talked for a while - he was amazed that anyone had read his
> >> dissertation. I can't recall where this happened but he mentioned having
> >> been at DARPA and I assumed that he was there in the Information
> >> Technology Office as a Program Manager (PM). In the 1960s and most of
> >> the 70s, most of that office's PMs were recent PhD graduates. Later on,
> >> PMs were either military or civilians who were comfortable in suits
> >> and ties. Big change.
> >>
> >> As I said above, I was disappointed that nobody picked up and continued
> >> his line of research. If taken to the next step it would have had an
> >> impact on hardware design and on compiling programs with tons of
> >> parallelism, and would have made it possible to better reason about
> >> covert channels when trying to determine security properties of systems.
> >>
> >> Good luck with your endeavor.
> >> --
> >> Jeff Barnett
> > Well, I read the first 50 pages and it turned into Einstein-level logic.
> > He describes a control structure as defining an interpreter for an
> > interpreter. This creator interpreter does not need to compile itself,
> > as per Turing's advice. It can be implemented in a language such as C
> > using subroutines, functions, and other C structures.
> >
> > He goes into great detail designing a syntax for his language Sol.
> >
> > I did have difficulty telling which "interpreter" he was writing about.
> >
> > He introduced the operation of a process "monitor": basically
> > a list of objects held in a main process. He stated quite nicely the
> > idea of only exercising a list item when an input variable has a state
> > change. Maybe a poor man's object monitor can implement an object process?
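
A minimal sketch of such a poor man's monitor in C (the item list, the watched
fields, and the action are hypothetical, just to show the "only exercise on a
state change" idea):

#include <stdio.h>

/* Poor man's object monitor: the main process holds a list of items and
   only exercises an item when its watched input has changed state. */
struct item {
    int input;       /* current value of the watched variable       */
    int last_seen;   /* value at the last time the item was handled */
    void (*action)(struct item *);
};

static void react(struct item *it)
{
    printf("item reacted to new input %d\n", it->input);
}

static void monitor(struct item *items, int n)
{
    for (int i = 0; i < n; i++)
        if (items[i].input != items[i].last_seen) {  /* state change? */
            items[i].action(&items[i]);
            items[i].last_seen = items[i].input;
        }
}

int main(void)
{
    struct item items[2] = { { 0, 0, react }, { 5, 5, react } };
    items[1].input = 7;   /* an external input changes state    */
    monitor(items, 2);    /* only the changed item is exercised */
    return 0;
}
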
> >
> > I am still looking at the synchronous issue. Basically I need to make
> > a black box that checks for this state. The issue of clock error occurs,
> > so one class of input must be a degree of accuracy, maybe as a percentage.
> >
> > I got lost in the realm of math. Does a function simply define
> > a synchronous path? I need some advice.
> >
> > I do believe two numbers always existing together can be called
> > synchronous.
> I'll start with the last questions first: You ask what he meant by a
> function and that is not so easy to answer. Mathematically a function is
> an entity that maps some values to other values, where the input values
> are always mapped to the same output values. In the world of software we
> don't mean that at all. Take a similar question: What's a structure? The
> answer is that it's the thing defined by your language's primitive with
> a name like DEFSTRUCT. Similarly a function, subroutine, etc., is that
> thing defined by the primitive your language provides to define such
> things. Does a function define a synchronous path? Depends on the
> language in which it is defined.
>
> I don't know what it means for two numbers to always exist together so I
> couldn't determine if they were synchronous.
>
> It's been (quite) a while since I read this thesis but I might be able
> to add some to what you have got out of it so far. What was said about
> multiple interpreters was the following: In order to interpret a control
> structure, the interpreter must "do" the control structure. Let's take
> an example: PARALLEL(x, y, z), where x, y, and z are program pieces. In
> order to really get the effect of parallel execution, the interpreter
> must, in general, start Ix, Iy, and Iz; three interpreter routines, one
> to interpret x, one to interpret y, and one to interpret z and they must
> execute in parallel. And the same thing needs to happen when each of the
> other control primitives are encountered. (There are, of course,
> optimizations such as subsuming a sequential element that appears in a
> sequential, etc.) One may think that you could simulate PARALLEL by some
> sort of interleaving on sequential hardware but when you mix in other
> control relations the interpreter can't be faithful to the implied
> semantics.
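
A minimal sketch in C of what interpreting PARALLEL(x, y, z) forces on the
interpreter (pthreads assumed; the three program pieces are just stand-in
functions, not anything from Fisher's Sol):

#include <pthread.h>
#include <stdio.h>

/* Stand-ins for interpreter instances Ix, Iy, Iz, one per program piece. */
static void *interpret_x(void *arg) { (void)arg; puts("interpreting x"); return NULL; }
static void *interpret_y(void *arg) { (void)arg; puts("interpreting y"); return NULL; }
static void *interpret_z(void *arg) { (void)arg; puts("interpreting z"); return NULL; }

/* To be faithful to PARALLEL(x, y, z) the interpreter must itself start
   three interpreter routines in parallel and wait for all of them. */
static void interpret_parallel(void *(*pieces[])(void *), int n)
{
    pthread_t tid[n];
    for (int i = 0; i < n; i++)
        pthread_create(&tid[i], NULL, pieces[i], NULL);
    for (int i = 0; i < n; i++)
        pthread_join(tid[i], NULL);
}

int main(void)
{
    void *(*pieces[3])(void *) = { interpret_x, interpret_y, interpret_z };
    interpret_parallel(pieces, 3);
    return 0;
}
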
>
> The synchronization issue is that, for example, a monitor must
> instantaneously spot that its condition has been satisfied so that a
> declared reaction will occur. This is a hell of a burden on any
> interpretation scheme. Let's look at an example: Let the variable X be a
> sixteen bit integer; let the variable H be the high order 8 bits of X
> and L be the low order 8 bits of X. Assume that there is a monitor on
> the value of X, then that monitor must actively take a peek when either
> H or L is modified. Similarly if there is a monitor on either H or L,
> it must take a peek any time X is modified. This example may seem quite
> artificial but it isn't. Consider interrupt structures of your favorite
> computer. Bits are flipped in registers and interpreted as signals to
> and by the OS. Describing and simulating such capabilities as they
> actually work is quite difficult.
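
A minimal sketch in C of the X/H/L example (a purely software monitor; the
write helpers are hypothetical, the point is only that every alias of X has
to poke the same monitor):

#include <stdio.h>
#include <stdint.h>

/* X is a 16-bit integer; H and L are views of its high and low bytes.
   A monitor on X must be checked whenever H or L is written, and vice versa. */
static uint16_t X;
static void (*monitor_on_X)(uint16_t);

static void check_monitor(void)
{
    if (monitor_on_X)
        monitor_on_X(X);
}

static void write_X(uint16_t v) { X = v; check_monitor(); }
static void write_H(uint8_t h)  { X = (uint16_t)((h << 8) | (X & 0x00FFu)); check_monitor(); }
static void write_L(uint8_t l)  { X = (uint16_t)((X & 0xFF00u) | l);        check_monitor(); }

static void report(uint16_t v) { printf("monitor saw X = 0x%04x\n", v); }

int main(void)
{
    monitor_on_X = report;
    write_X(0x1234);   /* direct write to X        */
    write_H(0xAB);     /* write through the H view */
    write_L(0xCD);     /* write through the L view */
    return 0;
}
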
>
> For a moment, set aside the issue of interpreting programs written in
> the control structure language and consider using the language to write
> a detailed spec for a modern CPU with multiple cores and multiple
> threads per core. You want to specify what range of behaviors is
> allowed. If you think about this for a while, I believe that you will
> appreciate why the dissertation seems so convoluted. It's too bad that a
> second dissertation on the same topic did not follow and clarify all of
> these issues.
>
> At one point in the 1970s, I wanted to abstract the control flow and
> data flow within a speech understanding system so I invented a language
> called CSL (Control Structure Language) in which modules did not know
> about each other. Data communication was over a set of software buses
> (think pipes) and common data stores. CSL provided the primitives to
> move data from module to module and to enforce sequential execution
> among threads, an "I don't care what order they run in" directive
> (pseudo parallel), and condition monitors. There were some tokens
> pushed around to simulate
> control signals, etc., something like Petri nets. The point of this
> exercise was to put together a problem solver that did not commit to
> order-of-computation constraints when there was no reason to do so. As we
> learned more, we could modify the CSL to exhibit more directed behavior.
> By the way, the pseudo parallel directive assigned random numbers
> dynamically to parallel threads as priorities so that running the system
> on the same data multiple times could exhibit multiple behaviors and
> generate different answers.
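
A minimal sketch in C of the pseudo-parallel directive (the "modules" are just
functions run in a randomly chosen order, so repeated runs can differ; this is
an illustration of the idea, not the CSL implementation):

#include <stdio.h>
#include <stdlib.h>
#include <time.h>

/* Stand-ins for three modules whose relative order is declared "don't care". */
static void mod_a(void) { puts("module a"); }
static void mod_b(void) { puts("module b"); }
static void mod_c(void) { puts("module c"); }

int main(void)
{
    void (*mods[3])(void) = { mod_a, mod_b, mod_c };
    srand((unsigned)time(NULL));

    /* Assign random priorities each run by shuffling (Fisher-Yates), so
       different runs can exhibit different behaviors and answers. */
    for (int i = 2; i > 0; i--) {
        int j = rand() % (i + 1);
        void (*tmp)(void) = mods[i];
        mods[i] = mods[j];
        mods[j] = tmp;
    }
    for (int i = 0; i < 3; i++)
        mods[i]();
    return 0;
}
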
>
> I don't necessarily recommend reading another obscure paper (on CSL) but
> if you are interested, a pdf copy is at
> https://notatt.com/large-systems.pdf
> --
> Jeff Barnett
I am still writing a reply.
Pure synchronous computation is running the same code on multiple
identical computers.

Synthetic synchronous computation has data changing between the
running codes. Are the processes/code the same?

I don't know.

Having an answer appear at the same time for all running
monitored processes is the goal. A process dwell time could be used
to enforce the synchrony.
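
A minimal sketch in C of the dwell-time idea (a pthread barrier stands in for
the dwell: each monitored process holds its answer until every process has
one, so the answers appear together; all names and timings are illustrative):

#include <pthread.h>
#include <stdio.h>
#include <unistd.h>

#define NPROC 3

static pthread_barrier_t dwell;

/* Each worker computes its answer at a different speed, then dwells at the
   barrier; only when all monitored processes are ready do the answers appear. */
static void *worker(void *arg)
{
    long id = (long)arg;
    usleep((useconds_t)((id + 1) * 100000));  /* unequal compute times */
    long answer = id * id;
    pthread_barrier_wait(&dwell);             /* dwell until all ready */
    printf("process %ld answer %ld\n", id, answer);
    return NULL;
}

int main(void)
{
    pthread_t tid[NPROC];
    pthread_barrier_init(&dwell, NULL, NPROC);
    for (long i = 0; i < NPROC; i++)
        pthread_create(&tid[i], NULL, worker, (void *)i);
    for (int i = 0; i < NPROC; i++)
        pthread_join(tid[i], NULL);
    pthread_barrier_destroy(&dwell);
    return 0;
}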
