Rocksolid Light


Re: Question words, and what's an answer

https://www.novabbs.com/devel/article-flat.php?id=52751&group=comp.theory#52751
Subject: Re: Question words, and what's an answer
Newsgroups: sci.logic,comp.theory
References: <d9acc9d2-09f4-49b1-8371-6c3e76fb5038n@googlegroups.com> <cec8e8b0-b143-47f5-ba59-4224e828f177n@googlegroups.com> <2183cbfd-c70d-44de-af70-8bb0a8e282a1n@googlegroups.com> <efdca991-a5db-4fb1-833f-c27577005ac5n@googlegroups.com> <7b287b64-a820-4365-8e20-2cbd5b8bd8a2n@googlegroups.com> <677be551-49c1-47b5-9ba1-b993fd759937n@googlegroups.com> <0cd2b747-849a-475c-8b47-714c4ecb3dfbn@googlegroups.com> <01faee89-0417-4cda-99a7-23feb8b85b40n@googlegroups.com> <e50b3837-74c7-4826-a9e5-eaf3e948d0ccn@googlegroups.com> <2772a6b3-4dc3-4fd9-8ccd-eb62932b6561n@googlegroups.com>
From: ross.a.f...@gmail.com (Ross Finlayson)
Date: Sun, 11 Feb 2024 12:12:27 -0800
In-Reply-To: <2772a6b3-4dc3-4fd9-8ccd-eb62932b6561n@googlegroups.com>
Message-ID: <pwqdnf28GtuBt1T4nZ2dnZfqnPqdnZ2d@giganews.com>
 by: Ross Finlayson - Sun, 11 Feb 2024 20:12 UTC

On 08/05/2023 05:27 PM, Ross Finlayson wrote:
> On Sunday, June 18, 2023 at 9:29:01 PM UTC-7, Ross Finlayson wrote:
>> On Thursday, June 8, 2023 at 2:24:53 PM UTC-7, Ross Finlayson wrote:
>>> On Wednesday, April 26, 2023 at 11:00:27 AM UTC-7, Ross Finlayson wrote:
>>>> On Tuesday, April 25, 2023 at 8:21:14 AM UTC-7, Ross Finlayson wrote:
>>>>> On Monday, March 27, 2023 at 5:46:48 PM UTC-7, Ross Finlayson wrote:
>>>>>> On Friday, March 24, 2023 at 6:59:35 PM UTC-7, Ross Finlayson wrote:
>>>>>>> On Friday, March 24, 2023 at 12:39:19 AM UTC-7, Ross Finlayson wrote:
>>>>>>>> On Wednesday, March 22, 2023 at 6:18:44 PM UTC-7, Ross Finlayson wrote:
>>>>>>>>> On Sunday, March 19, 2023 at 12:02:51 PM UTC-7, Ross Finlayson wrote:
>>>>>>>>>> There are about seven question words in English. What do question words do?
>>>>>>>>>> Question words pose placeholders for statements of referents that when fulfilled make answers.
>>>>>>>>>>
>>>>>>>>>> There are these:
>>>>>>>>>> who why when where
>>>>>>>>>> about referents of persons, reasons, times, locations,
>>>>>>>>>> what which
>>>>>>>>>> about type of referents
>>>>>>>>>> how
>>>>>>>>>> about means, then sorts compound question words
>>>>>>>>>> how many
>>>>>>>>>> how much
>>>>>>>>>> how often
>>>>>>>>>> and various variations and archaic types
>>>>>>>>>> whither whence howso
>>>>>>>>>> that question words form usual parts-of-speech that reflect each when they are posed and well-formed that the
>>>>>>>>>> various parts of speech in statement establish the considerations of the facts or statements that fulfill resolving
>>>>>>>>>> the placeholder of the question words, with a constant or a predicate.
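The taxonomy above can be sketched directly as a lookup from question word to the kind of referent that fulfills its placeholder. This is a toy illustration only; all names here are hypothetical, not from any established library:

```python
# Hypothetical sketch: question words as typed placeholders.
# Each word maps to the category of referent that can fulfill it.
REFERENT_TYPE = {
    "who": "person",
    "why": "reason",
    "when": "time",
    "where": "location",
    "what": "type",        # the general question word
    "which": "element",    # selects members of a known type
    "how": "means",
}

# Compound question words specialize "how" toward quantities.
COMPOUND = {
    "how many": "count",
    "how much": "amount",
    "how often": "frequency",
}

def referent_type(question_word: str) -> str:
    """Return the category of referent a question word stands for,
    or "unknown" for variations and archaic types not catalogued here."""
    return COMPOUND.get(question_word) or REFERENT_TYPE.get(question_word, "unknown")
```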
>>>>>>>>>>
>>>>>>>>>> Question words in clauses may be pronouns. The general question word is "what", and the usual
>>>>>>>>>> referent of a pronoun can be replaced with "that", or "what":
>>>>>>>>>>
>>>>>>>>>> "what is it that it is so how"
>>>>>>>>>> "what is it that it is so which"
>>>>>>>>>> "what is it that it is so why"
>>>>>>>>>>
>>>>>>>>>> and so on.
>>>>>>>>>>
>>>>>>>>>> (C.f. Faith No More's Epic's "what is it" and Brad Pitt's "what" and "what's in the box" from Fight Club or Seven.
>>>>>>>>>> "What is it" is "you want it all and can't have it" and "what" is "let's fight" and "what's in the box" is "the death of my love".)
>>>>>>>>>>
>>>>>>>>>> There are "would", "could", and "should", with respect to "why", also known as 'wudda cudda shudda',
>>>>>>>>>> posing in the immediate what follows a referent with respect to the posing of a question without question words,
>>>>>>>>>> in English simply reversing object and verb.
>>>>>>>>>>
>>>>>>>>>> "If I would, could you?"
>>>>>>>>>> "What time is love?"
>>>>>>>>>> "So what?"
>>>>>>>>>> ("In case you didn't feel...")
>>>>>>>>>>
>>>>>>>>>> The declaration of a question with question words or immediately poses that there are referents
>>>>>>>>>> that fulfill the placeholders under general substitution. I.e., that a question is well-posed and
>>>>>>>>>> well-formed makes for implicits and explicits that there are referents of objects in their statements
>>>>>>>>>> what result answers to the question, then in terms of right and wrong answers or questions without
>>>>>>>>>> answers or questions with multiple answers or questions with yes/no answers and questions without
>>>>>>>>>> yes/no answers when an answer is assumed or an answer at all is assumed or not assumed.
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> "..., desu-ka?": "what?"
>>>>>>>>>> "..., no?": "?"
>>>>>>>>>>
>>>>>>>>>> There's for "what" and "which", that what is for types and which is for elements, in a usual sense. This gets into quantification,
>>>>>>>>>> where for various specialization of quantification:
>>>>>>>>>>
>>>>>>>>>> universal:
>>>>>>>>>>
>>>>>>>>>> for-each
>>>>>>>>>> for-any
>>>>>>>>>> for-every
>>>>>>>>>> for-all
>>>>>>>>>>
>>>>>>>>>> existential:
>>>>>>>>>>
>>>>>>>>>> exists
>>>>>>>>>> exists-unique
>>>>>>>>>> exists-multiple
>>>>>>>>>>
>>>>>>>>>> and various forms of kinds and types and categories.
>>>>>>>>>>
>>>>>>>>>> "what categories ..."
>>>>>>>>>> "which members ..."
>>>>>>>>>>
>>>>>>>>>> have these imply there are whether the question has an answer that categories do or don't exist,
>>>>>>>>>> and members do or don't exist, which match predicates fulfilled by the referents so place-holded.
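A minimal sketch of these quantifier refinements over a finite domain, assuming nothing beyond plain Python; it shows how "exists-unique" and "exists-multiple" refine the bare existential by counting the referents that fulfill the predicate:

```python
# Toy encodings of the universal and existential refinements above,
# evaluated over finite domains.
def for_all(domain, pred):
    """Universal: for-each / for-any / for-every / for-all."""
    return all(pred(x) for x in domain)

def exists(domain, pred):
    """Existential: at least one member fulfills the predicate."""
    return any(pred(x) for x in domain)

def exists_unique(domain, pred):
    """Exactly one member fulfills the predicate."""
    return sum(1 for x in domain if pred(x)) == 1

def exists_multiple(domain, pred):
    """More than one member fulfills the predicate."""
    return sum(1 for x in domain if pred(x)) > 1
```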
>>>>>>>>>>
>>>>>>>>>> These have implicits
>>>>>>>>>>
>>>>>>>>>> for -what -all
>>>>>>>>>> for -which -each
>>>>>>>>>> for -how -every
>>>>>>>>>> for -if -any
>>>>>>>>>>
>>>>>>>>>> There's a usual sort of negation that goes along with the fulfillment of predicates, or lack thereof.
>>>>>>>>>>
>>>>>>>>>> not
>>>>>>>>>> none
>>>>>>>>>> never
>>>>>>>>>> not necessarily
>>>>>>>>>>
>>>>>>>>>> There's that "necessity" goes with "negation".
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> So, usually derivations are posed as statements, but, they answer questions. Any posed statement
>>>>>>>>>> automatically or as a complement of its declaration, poses questions so answered, that all posed
>>>>>>>>>> statements together must answer the same questions, or that all posed questions together, must
>>>>>>>>>> refer to the same referents and facts, or after science and probability, estimations.
>>>>>>>>>>
>>>>>>>>>> These duals, of statements making referents to fulfilled nouns and questions posing frameworks
>>>>>>>>>> of placeholders and abstractions of referents, make for space in usual languages that admit a
>>>>>>>>>> very direct statement in logic.
>>>>>>>>>>
>>>>>>>>>> I suppose then there's "if", and its role. There's "if", for what's "given".
>>>>>>>>>>
>>>>>>>>>> (For what that that's given, for what that that that's given, and so on, make for that the
>>>>>>>>>> use of "that" can fill in implicits.)
>>>>>>>>>>
>>>>>>>>>> So, let us consider what that and that what defines a well-posed question, and words.
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> What's a "gift of words".
>>>>>>>>> So, it seems reflexive for "tests of reflexes" that any given stipulation, makes for according
>>>>>>>>> to arrangements of relations in types, answers, of questions more and less specific and
>>>>>>>>> general, according to specialization of type and abstraction of type, for inference of type.
>>>>>>>>>
>>>>>>>>> Then, there's a usual sort of notion that "according to extensionality, what answer same
>>>>>>>>> questions are same things". This is just like model theory, that "structural" or "duck" typing
>>>>>>>>> of a sort models an interface or value, or behavior, that it is so that behavior is simply a value
>>>>>>>>> over time and space, what is described as a model a configuration space what is described in
>>>>>>>>> a field theory a lattice occupation.
>>>>>>>>>
>>>>>>>>> Then, for various sorts of quantifiers, and where quantifiers are refined into the behavior
>>>>>>>>> of the numbering or counting, the reckoning, the results after quantifiers, what is so answered
>>>>>>>>> and how things so answer, is for a first-class arrangement of questions as establishing of sorts,
>>>>>>>>> types.
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> So, I'm curious a development already of what with respect to type theory and the theory of
>>>>>>>>> types and model theory of the theory of types, and, questions and answers, about what in
>>>>>>>>> the affirmative and negative and multivalent or indeterminate, there are types statements,
>>>>>>>>> how it can be formalized quite neatly to encode, that a given collection of evaluated knowledge,
>>>>>>>>> can result a read-out of a cataloged indexed question-and-answer setting, for example to
>>>>>>>>> automatically establish what questions do and don't have answers given various facts,
>>>>>>>>> and what those are, inclusive of the specifics and the general, and incorporating how to
>>>>>>>>> address the many dimensions of reference and relation in fact, as about the implicit quantifiers
>>>>>>>>> so described after formation in natural language, that of course it results that a subset of natural
>>>>>>>>> language is curated to a perfectly and demonstrably optimal symbolic language, that reads out
>>>>>>>>> in plain, natural language.
>>>>>>>>>
>>>>>>>>> I'd imagine that the formal theories of linguistics and knowledge representation have these,
>>>>>>>>> then for what it results how there are fundamentally and elementarily normal forms, of these
>>>>>>>>> cataloged and collected projections and their referents, to make it so that it's easy to define
>>>>>>>>> and refine and narrow and expand, all sorts matters making for analytical deduction.
>>>>>>>> The idea of coding up the question words or a "query language, strongly typed", and including for
>>>>>>>> example that "questions are answers to what kind of questions are answers" and in the usual notion
>>>>>>>> of cultural reference that "Alex Trebek's gameshows include Jeopardy a trivia game in the format of
>>>>>>>> your response must be in the form of a question, that is posed as if the given answer was instead a
>>>>>>>> question that it's unambiguous according to unique existence it's right or wrong there's one question",
>>>>>>>> here is for a brief formal outlay of an encoding of question words, that represent a subset of natural
>>>>>>>> language, a symbolic language, that happens to structurally reflect category, naturally.
>>>>>>>>
>>>>>>>> "This ... is a referent."
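One rough way to sketch such a "strongly typed query" (all names and structures here are hypothetical, invented for illustration, not any established library): a question pairs a referent type with a predicate that has a hole in it, and an answer is any fact of the right type whose referent fulfills the predicate.

```python
# Toy sketch of a strongly typed query: a question is a typed
# placeholder plus a predicate; answers are the facts that fulfill it.
from dataclasses import dataclass
from typing import Any, Callable

@dataclass(frozen=True)
class Fact:
    referent: Any
    kind: str          # e.g. "person", "time", "location"

@dataclass(frozen=True)
class Question:
    kind: str                        # the referent type the question word poses
    pred: Callable[[Any], bool]      # the statement with a placeholder hole

    def answers(self, facts):
        """Facts of the right kind whose referents fulfill the predicate."""
        return [f for f in facts if f.kind == self.kind and self.pred(f.referent)]

# A tiny knowledge base, hand-built for the example.
facts = [Fact("Tesnière", "person"), Fact("1959", "time"), Fact("Chomsky", "person")]

# "Who ...?" quantifies over persons; the predicate fills the placeholder.
who = Question("person", lambda r: r.endswith("e"))
```

A question without answers simply yields the empty list, which models "questions without answers" directly rather than as an error.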
>>>>>>>>
>>>>>>>> So I'm curious if anyone here can tell me that of course there must be all sorts various efforts in
>>>>>>>> linguistics and knowledge representation, and, _lack-of-knowledge_ representation, here for
>>>>>>>> what results a, "theory of potentials", for knowledge, after questions words, and as above,
>>>>>>>> for "what" and "that" or "that that that", and, quantifier disambiguation according to comprehension,
>>>>>>>> and, the conditional or "what-if", and, necessity and negation, working up a brief algebra, in symbolic
>>>>>>>> terms, that results a natural encoding of statement and question, of a "theory of potentials" or
>>>>>>>> potential, for classical theories of knowledge, and, a, "theory of potentials" theory of knowledge.
>>>>>>>>
>>>>>>>> Then, it might help if you're already familiar with "theory of potentials" in other usual theories,
>>>>>>>> like mechanics and physics and here for the notions of conservation of knowledge for conservation
>>>>>>>> of truth for continuity and conservation laws, usual theories, or, at least one.
>>>>>>>>
>>>>>>>> Then, it's figured "it may be totally easy to make all sorts knowledge representation what is
>>>>>>>> fulfilled by a terse encoding, what makes for most usual cases of knowledge inference, in
>>>>>>>> small and well-defined and contained blocks of code, not requiring much more than a few
>>>>>>>> kilobytes of parallelizable algorithmic code according to an abstract machine's general-purpose
>>>>>>>> machine code", to represent knowledge.
>>>>>>>>
>>>>>>>>
>>>>>>>> It's like, today I was reading a McGann's "Towards a Literature of Knowledge", how brief can ontology be?
>>>>>>>>
>>>>>>>>
>>>>>>>> Then, the idea that after "what" and for "that", and "that*", then for types and group nouns
>>>>>>>> and collection verbs and collected nouns, usual genera of the other question words according
>>>>>>>> these, for the proper and individuals and so on, isn't it so that this makes a dual and complement,
>>>>>>>> and, a space of after affinity, a "theory of potentials of quest in knowledge"?
>>>>>>>>
>>>>>>>> Then, this idea of a "normal quest" goes along pretty well with other usual, questions,
>>>>>>>> in metaphysics and logic and of course for that language is intent, natural symbols for knowledge,
>>>>>>>> and more-than-less a rigorous, quantified, strongly-typed, working subset of formalizable,
>>>>>>>> symbolically, natural language.
>>>>>>>>
>>>>>>>> Then, of course, that readily extends to the verification of natural language generally,
>>>>>>>> accordingly to well-formed and well-reasoned natural language.
>>>>>>> Here's some more reading as getting into Maugin's development,
>>>>>>> for phonons and about Rayleigh then into Feynman and London,
>>>>>>> with comment about reading and dogma and then for models of light.
>>>>>>>
>>>>>>> https://www.youtube.com/watch?v=gMBcectYDws
>>>>>>>
>>>>>>>
>>>>>>> After the three R's, you figure there's relevance three G's:
>>>>>>> grammar, geometry, and gnosomy
>>>>>>> about then three C's:
>>>>>>> calculus, construction, conventions (or calluses)
>>>>>>> that there are already words and English is a reasonably
>>>>>>> rich language, and there are conventions like IUPAC naming
>>>>>>> outrageously long compounds, I expect to research some terms
>>>>>>> and find various sorts of detail about them.
>>>>>>>
>>>>>>> technical detail
>>>>>>> artistic detail
>>>>>>> abstract figurative detail
>>>>>>>
>>>>>>> So, I wonder to look for "algebraic grammar", in terms of that parts of
>>>>>>> speech assemble to clauses combine to phrases, in sentences and of
>>>>>>> their para-graphs.
>>>>>>>
>>>>>>> "Algebraic Grammar": Clough, 1930
>>>>>>> "Algebraic Grammar, Geometric Grammar": Vailati, 2020
>>>>>>>
>>>>>>> I generally consider to rely on the field of linguistics with respect to
>>>>>>> natural languages, where in terms of "formal languages" those are
>>>>>>> mostly as totally unstructured "the alphabet of a formal language"
>>>>>>> as being untyped symbols-of-coding, then that grammars in the usual
>>>>>>> context of program languages, are as up after Chomsky's hierarchy of
>>>>>>> the context of lookahead and context-free and what are formal methods
>>>>>>> about the "parse-ability" of languages, and various results from usual
>>>>>>> old formal grammars what make "too long, didn't read" and "language
>>>>>>> parse too hard you obviously aren't doing it".
>>>>>>>
>>>>>>> I.e., even parsing railroad-diagrams makes for a simple approach of
>>>>>>> recursion and back-and-forth which result much larger classes of parse-able
>>>>>>> grammars than usual right-linear productions in Backus-Naur grammars
>>>>>>> with usual old "the head only moves forward" and "the window is fixed
>>>>>>> and closed", that incorporating references in natural language is a simple
>>>>>>> enough matter of multiple passes over the content what results the
>>>>>>> bracketing and grouping and commenting and joining and up after the
>>>>>>> identification of the parts of speech, thus that it's framed from the beginning
>>>>>>> in a higher-order allotment of resources, resulting being closed up in some
>>>>>>> square and factorial resources, instead of blowing up and running out on down.
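The point about recursion can be illustrated in a few lines: a short recursive-descent routine recognizes center-embedded nesting, which no right-linear production in a Backus-Naur grammar can express. A toy sketch, assuming a bracket alphabet standing in for nested clauses:

```python
# Recursive descent over S -> '(' S ')' | '' : center-embedding that
# regular (right-linear) grammars cannot capture, parsed by recursion
# rather than a fixed forward-only window.
def parse(tokens, i=0):
    """Parse one S starting at index i; return (nesting depth, next index).

    Raises ValueError if a closing bracket is missing."""
    if i < len(tokens) and tokens[i] == "(":
        depth, i = parse(tokens, i + 1)       # recurse into the embedded S
        if i >= len(tokens) or tokens[i] != ")":
            raise ValueError("expected ')'")
        return depth + 1, i + 1
    return 0, i

depth, end = parse(list("((()))"))
```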
>>>>>>>
>>>>>>> "A Methodical English Grammar"
>>>>>>> "A Comprehensive Grammar of the English Language"
>>>>>>>
>>>>>>>
>>>>>>> "Fundamentals of Language", Jakobson, 1956
>>>>>>>
>>>>>>> "Introduction to Language and Linguistics", Fasold, Connon-Linton, eds., 2006
>>>>>>>
>>>>>>> "The Cambridge Grammar of the English Language",
>>>>>>> https://en.wikipedia.org/wiki/The_Cambridge_Grammar_of_the_English_Language
>>>>>>>
>>>>>>> "... why should Oxford have all the fun."
>>>>>>>
>>>>>>>
>>>>>>> "Standard American English, Definition, Accent, & Use", not so much
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> "[CamGEL] is both a modern complement to existing descriptive grammars
>>>>>>> (Quirk et al. 1985; Biber et al. 1999) and an important resource for anyone interested
>>>>>>> in working with or finding out about English. In addition, the book is a very complete
>>>>>>> and convincing demonstration that the ideas of modern theoretical linguistics can be
>>>>>>> deployed in the detailed description of a particular language.[20]" -- Brew
>>>>>>>
>>>>>>> https://en.wikipedia.org/wiki/Raising_(linguistics)
>>>>>>>
>>>>>>> https://en.wikipedia.org/wiki/Phrase_structure_grammar
>>>>>>> https://en.wikipedia.org/wiki/Dependency_grammar
>>>>>>>
>>>>>>> hmm, it seems these are the two major diagrams of grammars, I wonder which we learned.
>>>>>>>
>>>>>>> Ah, it is surely the "dependency grammar" of the two, not the "AST" or "constituency".
>>>>>>>
>>>>>>>
>>>>>>> "The distinction between dependency and phrase structure grammars
>>>>>>> derives in large part from the initial division of the clause.
>>>>>>> The phrase structure relation derives from an initial binary division,
>>>>>>> whereby the clause is split into a subject noun phrase (NP) and a
>>>>>>> predicate verb phrase (VP). This division is certainly present in the
>>>>>>> basic analysis of the clause that we find in the works of, for instance,
>>>>>>> Leonard Bloomfield and Noam Chomsky. Tesnière, however, argued
>>>>>>> vehemently against this binary division, preferring instead to position
>>>>>>> the verb as the root of all clause structure. Tesnière's stance was that
>>>>>>> the subject-predicate division stems from term logic and has no place
>>>>>>> in linguistics.[8] The importance of this distinction is that if one acknowledges
>>>>>>> the initial subject-predicate division in syntax is real, then one is likely
>>>>>>> to go down the path of phrase structure grammar, while if one rejects
>>>>>>> this division, then one must consider the verb as the root of all structure,
>>>>>>> and so go down the path of dependency grammar.
>>>>>>> -- https://en.wikipedia.org/wiki/Dependency_grammar#Dependency_vs._phrase_structure
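The contrast in the quoted passage can be shown on a toy clause, with hand-built structures that are purely illustrative: in the dependency analysis every word points at a head and the verb is the root, while the phrase-structure analysis makes the initial NP/VP split.

```python
# Toy illustration: one clause, two analyses.
clause = "Tesnière rejected the division"

# Dependency analysis: each word depends on a head; the verb heads all,
# so the subject-predicate split never occurs.
dependency = {
    "rejected": None,           # root (the verb)
    "Tesnière": "rejected",     # subject depends on the verb
    "division": "rejected",     # object depends on the verb
    "the": "division",          # determiner depends on its noun
}

# Phrase-structure analysis: the clause splits first into NP and VP.
phrase_structure = ("S",
    ("NP", "Tesnière"),
    ("VP", "rejected", ("NP", "the", "division")))

# The root of the dependency tree is the unique word with no head.
root = [word for word, head in dependency.items() if head is None]
```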
>>>>>>>
>>>>>>> Oh, no, "phrase structure" is only for parsing "dit dit dit dot dot dot Morse code"
>>>>>>> not "the structure of an utterance its meaning in words".
>>>>>>>
>>>>>>>
>>>>>>> "The traditional focus on hierarchical order generated the impression
>>>>>>> that DGs have little to say about linear order, and it has contributed to
>>>>>>> the view that DGs are particularly well-suited to examine languages with
>>>>>>> free word order. A negative result of this focus on hierarchical order, however,
>>>>>>> is that there is a dearth of DG explorations of particular word order phenomena,
>>>>>>> such as of standard discontinuities. Comprehensive dependency grammar accounts
>>>>>>> of topicalization, wh-fronting, scrambling, and extraposition are mostly absent
>>>>>>> from many established DG frameworks. This situation can be contrasted with
>>>>>>> phrase structure grammars, which have devoted tremendous effort to exploring
>>>>>>> these phenomena. "
>>>>>>>
>>>>>>> https://en.wikipedia.org/wiki/Dependency_grammar#References
>>>>>>>
>>>>>>> Hm. "Delta, Delta, Delta, can I help ya help ya help ya."
>>>>>>>
>>>>>>> "Yes? This is Letters...."
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> http://attempto.ifi.uzh.ch/site/
>>>>>>>
>>>>>>> "ACE texts are computer-processable and can be unambiguously translated
>>>>>>> into discourse representation structures, a syntactic variant of first-order logic."
>>>>>>>
>>>>>>> https://en.wikipedia.org/wiki/Discourse_representation_theory
>>>>>>>
>>>>>>> "In one sense, DRT offers a variation of first-order predicate calculus—its
>>>>>>> forms are pairs of first-order formulae and the free variables that occur in them."
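The quoted characterization of a DRS as a pair of free variables and first-order conditions can be sketched as follows; a toy model, not the Attempto/ACE implementation, with all names invented for illustration:

```python
# Minimal sketch of a discourse representation structure: a pair of
# discourse referents (free variables) and first-order conditions.
from dataclasses import dataclass, field

@dataclass
class DRS:
    referents: set = field(default_factory=set)     # free variables
    conditions: list = field(default_factory=list)  # conditions as strings

    def merge(self, other: "DRS") -> "DRS":
        """Merging DRSs models incremental discourse: a later sentence
        may bind referents the earlier discourse introduced."""
        return DRS(self.referents | other.referents,
                   self.conditions + other.conditions)

# "A farmer owns a donkey." introduces two referents.
s1 = DRS({"x", "y"}, ["farmer(x)", "donkey(y)", "owns(x, y)"])
# "He beats it." introduces none; its pronouns resolve to x and y.
s2 = DRS(set(), ["beats(x, y)"])
discourse = s1.merge(s2)
```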
>>>>>>>
>>>>>>> https://en.wikipedia.org/wiki/Simplified_Technical_English
>>>>>>>
>>>>>>> "What is question word, that."
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> Alright then, that's sort of what I'm looking for, then to make for the abstraction
>>>>>>> above that, and meaning, to be, what is the question, to make for natural views of
>>>>>>> representation, according to the reader, and their knowledge.
>>>>>> https://www.youtube.com/watch?v=3LnC4srnwtY
>>>>>>
>>>>>> Here's some more reading from Maugin and also some
>>>>>> about the path integral and atomic theory, or Knuth, Noether, and Feynman.
>>>>>>
>>>>>>
>>>>>> I found a copy of Curme Volume 1 or "Parts of Speech, and Accidence", so feel pretty
>>>>>> comfortable that for English at least those are mostly the terms I would use. The bookseller
>>>>>> referred me to Skeat's Etymology and so I'm also interested in the Wörterbuch, or some
>>>>>> apocryphal "a dictionary of all the words in all the languages, giving definitions and pointing
>>>>>> to the languages and origins". I haven't found a copy of Cambridge Grammar of the English
>>>>>> Language yet, but Curme goes a long way, and I like that mostly he only has examples of
>>>>>> the words and their parts of speech, not much entanglement with examples of usage,
>>>>>> so I imagine that this kind of multi-pass parser approach and resulting as of dependency
>>>>>> grammars for building terse, concise models of the referents, will go a long ways towards
>>>>>> natural, normal forms for the codification of natural language, and for of course showing
>>>>>> directly where it's possible that according to type theory there's natural theorem-proving.
>>>>>
>>>>>
>>>>> Reading through Curme, I've been learning some names of parts-of-speech
>>>>> that really help a lot, about all the appositives and adjectival and adverbial force,
>>>>> copulas and the English languages' success courtesy copulas, lots of things.
>>>>>
>>>>> "Once an argument has been transformed into symbolic form
>>>>> an important part of logical analysis has been accomplished.
>>>>> The next step involves the applications of the Rules of Distribution". -- Huppe and Kaminsky
>>>>>
>>>>> Then they go on to describe enthymemes and Sorites.
>>>>>
>>>>> So, I'll be looking to go through the type theory expressed in Curme,
>>>>> and making out for it a neat system of derivations and hierarchy.
>>>> It's said that "Roger Bacon" was working on "Universal Grammar" back in the "Dark Ages".
>>> I found a copy of Curme's "Volume II: Syntax" to complement "Volume I: Parts of
>>> Speech and Accidence", I'm quite excited about it and looking forward to figure
>>> out how to codify it into a model of natural speech.
>> Here's an interesting paper about inference: https://aclanthology.org/L14-1119/
>>
>> The authors describe that there are _simple_ and there are _hard_ inferences,
>> and about "contextual" readings and "generosity".
>>
>> Lots of issues in the semantics of knowledge representation are explored in https://aclanthology.org papers.
>>
>> There's something to be said for looking through those and learning models of models
>> of learning of languages of languages.
>>
>> So, from Volumes I and II of Curme, Parts of Speech and Accidence and Syntax, one might aver that
>> this would be treating English like a dead language, but, it had already had a very long life.
>> Indeed it's not a dead language, neither for that matter is Latin in Italian or Sanskrit in Hindi,
>> but what Curme's descriptive grammar outlines is a very full treatment of usage and structure
>> of statement in English (and what would be models of such statements in other natural languages).
>
> So far my favorite part of Curme is the index at the end of Volume II. There is
> collected quite a useful ontology of the elements of syntax and their intent
> and their force and their organization. In a section of about twenty pages of
> entries, it seems a summary richer even than those of the usual linguists of the day.
>
>

