Syntactic Transformations
To discuss the import of the above definitions in greater depth, it serves to establish a number of logical relations and set-theoretic identities that can be found to hold among this array of conceptions and constructions. Facilitating this task requires in turn a number of auxiliary concepts and notations.
The diverse notions of indication under discussion are expressed in a variety of different notations, in particular, the logical language of sentences, the functional language of propositions, and the geometric language of sets. Thus, one way to explain the relationships that exist among these concepts is to describe the translations that they induce among the allied families of notation.
Syntactic Transformation Rules
A good way to summarize these translations and to organize their use in practice is by means of the syntactic transformation rules (STRs) that partially formalize them. A rudimentary example of a STR is readily mined from the raw materials that are already available in this area of discussion. To begin, let the definition of an indicator function be recorded in the following form:
\(\text{Definition 1}\!\) | |||
\(\text{If}\!\) | \(Q \subseteq X\) | ||
\(\text{then}\!\) | \(\upharpoonleft Q \upharpoonright ~:~ X \to \underline\mathbb{B}\) | ||
\(\text{is defined such that, for all}~ x \in X :\!\) | ||
\(\upharpoonleft Q \upharpoonright (x) = \underline{1} ~\Leftrightarrow~ x \in Q\) |
In practice, a definition like this is commonly used to substitute one logically equivalent expression or sentence for another in a context where the conditions of using the definition this way are satisfied and where the change is perceived to advance a proof. This employment of a definition can be expressed in the form of a STR that allows one to exchange two expressions of logically equivalent forms for one another in every context where their logical values are the only consideration. To be specific, the logical value of an expression is the value in the boolean domain \(\underline\mathbb{B} = \{ \underline{0}, \underline{1} \} = \{ \operatorname{false}, \operatorname{true} \}\) that the expression stands for in its context or represents to its interpreter.
In the case of Definition 1, the corresponding STR permits one to exchange a sentence of the form \(x \in Q\) with an expression of the form \(\upharpoonleft Q \upharpoonright (x)\) in any context that satisfies the conditions of its use, namely, the conditions of the definition that lead up to the stated equivalence. The relevant STR is recorded in Rule 1. By way of convention, I list the items that fall under a rule roughly in order of their ascending conceptual subtlety or their increasing syntactic complexity, without regard to their normal or typical orders of exchange, since this can vary widely from case to case.
\(\text{Rule 1}\!\) | |||
\(\text{If}\!\) | \(Q \subseteq X\) | ||
\(\text{then}\!\) | \(\upharpoonleft Q \upharpoonright ~:~ X \to \underline\mathbb{B}\) | ||
\(\text{and if}\!\) | \(x \in X\) | ||
\(\text{then}\!\) | \(\text{the following are equivalent:}\!\) | ||
\(\text{R1a.}\!\) | \(x \in Q\) | ||
\(\text{R1b.}\!\) | \(\upharpoonleft Q \upharpoonright (x)\) |
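In computational terms, Rule 1 amounts to a license for trading a membership test for a function application. The following Python sketch assumes a finite universe; the names `indicator`, `X`, `Q`, and `f` are illustrative choices, not notation fixed by the text.

```python
# A minimal sketch of Rule 1 over a finite universe X.
# The names X, Q, indicator, f are illustrative, not from the text.

def indicator(Q, X):
    """Return the indicator function of Q relative to the universe X,
    a map from X into the boolean domain B = {0, 1}."""
    Q = set(Q)
    def f(x):
        assert x in X, "the indicator is only defined on the universe X"
        return 1 if x in Q else 0
    return f

X = {1, 2, 3, 4, 5}
Q = {2, 3, 5}
f = indicator(Q, X)

# Rule 1: for every x in X, the sentence "x in Q" and the sentence
# "f(x) = 1" agree as logical values.
assert all((x in Q) == (f(x) == 1) for x in X)
```

Reading `f(x) == 1` rather than Python's native truthiness keeps the exchange within the boolean domain \(\underline\mathbb{B} = \{ \underline{0}, \underline{1} \}.\)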
Conversely, any rule of this sort, properly qualified by the conditions under which it applies, can be turned back into a summary statement of the logical equivalence that is involved in its application. This mode of conversion between a static principle and a transformational rule, that is, between a statement of equivalence and an equivalence of statements, is so automatic that it is usually not necessary to make a separate note of the "horizontal" versus the "vertical" versions.
As another example of a STR, consider the following logical equivalence, that holds for any \(Q \subseteq X\) and for all \(x \in X.\)
\(\upharpoonleft Q \upharpoonright (x) ~\Leftrightarrow~ \upharpoonleft Q \upharpoonright (x) = \underline{1}.\) |
In practice, this logical equivalence is used to exchange an expression of the form \(\upharpoonleft Q \upharpoonright (x)\) with a sentence of the form \(\upharpoonleft Q \upharpoonright (x) = \underline{1}\) in any context where one has a relatively fixed \(Q \subseteq X\) in mind and where one is conceiving \(x \in X\) to vary over its whole domain, namely, the universe \(X.\!\) This leads to the STR that is given in Rule 2.
\(\text{Rule 2}\!\) | |||
\(\text{If}\!\) | \(f : X \to \underline\mathbb{B}\) | ||
\(\text{and}\!\) | \(x \in X\) | ||
\(\text{then}\!\) | \(\text{the following are equivalent:}\!\) | ||
\(\text{R2a.}\!\) | \(f(x)\!\) | ||
\(\text{R2b.}\!\) | \(f(x) = \underline{1}\) |
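Rule 2 can be exercised in the same computational setting: for a map into \(\underline\mathbb{B} = \{ \underline{0}, \underline{1} \},\) asserting the expression \(f(x)\!\) as a sentence comes to the same thing as asserting the equation \(f(x) = \underline{1}.\) A sketch under the same illustrative assumptions (the particular `f` below is an arbitrary choice):

```python
# Sketch of Rule 2: for f : X -> B with B = {0, 1}, the expression
# f(x) read as a truth value (R2a) agrees with the sentence
# "f(x) = 1" (R2b).  Names X and f are illustrative.

X = range(10)
f = lambda x: 1 if x % 2 == 0 else 0   # some proposition f : X -> B

for x in X:
    # R2a "f(x)" as a truth value, versus R2b "f(x) = 1".
    assert bool(f(x)) == (f(x) == 1)
```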
Rules like these can be chained together to establish extended rules, just so long as their antecedent conditions are compatible. For example, Rules 1 and 2 combine to give the equivalents that are listed in Rule 3. This follows from a recognition that the function \(\upharpoonleft Q \upharpoonright ~:~ X \to \underline\mathbb{B}\) that is introduced in Rule 1 is an instance of the function \(f : X \to \underline\mathbb{B}\) that is mentioned in Rule 2. By the time one arrives in the "consequence box" of either Rule, then, one has in mind a comparatively fixed \(Q \subseteq X,\) a proposition \(f\!\) or \(\upharpoonleft Q \upharpoonright\) about things in \(X,\!\) and a variable argument \(x \in X.\)
\(\text{Rule 3}\!\) | |||
\(\text{If}\!\) | \(Q \subseteq X\) | ||
\(\text{and}\!\) | \(x \in X\) | ||
\(\text{then}\!\) | \(\text{the following are equivalent:}\!\) | ||
\(\text{R3a.}\!\) | \(x \in Q\) | ||
\(\text{R3b.}\!\) | \(\upharpoonleft Q \upharpoonright (x)\) | ||
\(\text{R3c.}\!\) | \(\upharpoonleft Q \upharpoonright (x) = \underline{1}\) |
A large stock of rules can be derived in this way, by chaining together segments that are selected from a stock of previous rules, with the whole process of derivation leading back to a core stock of rules that rests in turn on an axiomatic basis. In order to keep track of their derivations, since these pedigrees help one to remember the reasons for trusting a rule's use in the first place, derived rules can be annotated by citing the rules from which they are derived.
In the present discussion, I am using a particular style of annotation for rule derivations, one that is called proof by grammatical paradigm or proof by syntactic analogy. The annotations in the right hand margin of the Rule Box interweave the numerators and the denominators of the paradigm being employed, in other words, the alternating terms of comparison in a sequence of analogies. Taking the syntactic transformations marked in the Rule Box one at a time, each step is licensed by its formal analogy to a previously established rule.
For example, the annotation \(X_1 : A_1 :: X_2 : A_2\!\) may be read to say that \(X_1\!\) is to \(A_1\!\) as \(X_2\!\) is to \(A_2,\!\) where the step from \(A_1\!\) to \(A_2\!\) is permitted by a previously accepted rule.
This can be illustrated by considering the derivation of Rule 3 in the augmented form that follows:
\(\begin{array}{lcclc} \text{R3a.} & x \in Q & \text{is to} & \text{R1a.} & x \in Q \\[6pt] & & \text{as} & & \\[6pt] \text{R3b.} & \upharpoonleft Q \upharpoonright (x) & \text{is to} & \text{R1b.} & \upharpoonleft Q \upharpoonright (x) \\[6pt] & & \text{and} & & \\[6pt] \text{R3b.} & \upharpoonleft Q \upharpoonright (x) & \text{is to} & \text{R2a.} & f(x) \\[6pt] & & \text{as} & & \\[6pt] \text{R3c.} & \upharpoonleft Q \upharpoonright (x) = \underline{1} & \text{is to} & \text{R2b.} & f(x) = \underline{1} \end{array}\) |
Notice how the sequence of analogies pivots on the term \(\text{R3b},\!\) viewing it first under the aegis of \(\text{R1b},\!\) as the second term of the first analogy, and then turning to view it again under the guise of \(\text{R2a},\!\) as the first term of the second analogy.
By way of convention, rules that are tailored to a particular application, case, or subject, and rules that are adapted to a particular goal, object, or purpose, I frequently refer to as Facts.
Besides linking rules together into extended sequences of equivalents, there is one other way that is commonly used to get new rules from old. Novel starting points for rules can be obtained by extracting pairs of equivalent expressions from a sequence that falls under an established rule and then stating their equality in the appropriate form of equation.
For example, extracting the expressions \(\text{R3a}\!\) and \(\text{R3c}\!\) that are given as equivalents in Rule 3 and explicitly stating their equivalence produces the equation recorded in Corollary 1.
\(\text{Corollary 1}\!\) | |||
\(\text{If}\!\) | \(Q \subseteq X\) | ||
\(\text{and}\!\) | \(x \in X\) | ||
\(\text{then}\!\) | \(\text{the following statement is true:}\!\) | ||
\(\text{C1a.}\!\) | \(x \in Q ~\Leftrightarrow~ \upharpoonleft Q \upharpoonright (x) = \underline{1}\) | \(\text{R3a} \Leftrightarrow \text{R3c}\) |
There are a number of issues, that arise especially in establishing the proper use of STRs, that are appropriate to discuss at this juncture. The notation \(\downharpoonleft s \downharpoonright\) is intended to represent the proposition denoted by the sentence \(s.\!\) There is only one problem with the use of this form. There is, in general, no such thing as "the" proposition denoted by \(s.\!\) Generally speaking, if a sentence is taken out of context and considered across a variety of different contexts, there is no unique proposition that it can be said to denote. But one is seldom ever speaking at the maximum level of generality, or even found to be thinking of it, and so this notation is usually meaningful and readily understandable whenever it is read in the proper frame of mind. Still, once the issue is raised, the question of how these meanings and understandings are possible has to be addressed, especially if one desires to express the regulations of their syntax in a partially computational form. This requires a closer examination of the very notion of context, and it involves engaging in enough reflection on the contextual evaluation of sentences that the relevant principles of its successful operation can be discerned and rationalized in explicit terms.
A sentence that is written in a context where it represents a value of \(\underline{1}\) or \(\underline{0}\) as a function of things in the universe \(X,\!\) where it stands for a value of \(\operatorname{truth}\) or \(\operatorname{falsehood},\) depending on how the signs that constitute its proper syntactic arguments are interpreted as denoting objects in \(X,\!\) in other words, where it is bound to lead its interpreter to view its own truth or falsity as determined by a choice of objects in \(X,\!\) is a sentence that might as well be written in the context \(\downharpoonleft \ldots \downharpoonright,\) whether this frame is explicitly marked around it or not.
More often than not, the context of interpretation fixes the denotations of most of the signs that make up a sentence, and so it is safe to adopt the convention that only those signs whose objects are not already fixed are free to vary in their denotations. Thus, only the signs that remain in default of prior specification are subject to treatment as variables, with a decree of functional abstraction hanging over all of their heads.
\(\downharpoonleft x \in Q \downharpoonright ~=~ \lambda (x, \in, Q).(x \in Q).\) |
Going back to Rule 1, we see that it lists a pair of concrete sentences and authorizes exchanges in either direction between the syntactic structures that have these two forms. But a sentence is any sign that denotes a proposition, and so there are any number of less obvious sentences that can be added to this list, extending the number of items that are licensed to be exchanged. For example, a larger collection of equivalent sentences is recorded in Rule 4.
\(\text{Rule 4}\!\) | |||
\(\text{If}\!\) | \(Q \subseteq X ~\text{is fixed}\) | ||
\(\text{and}\!\) | \(x \in X ~\text{is varied}\) | ||
\(\text{then}\!\) | \(\text{the following are equivalent:}\!\) | ||
\(\text{R4a.}\!\) | \(x \in Q\) | ||
\(\text{R4b.}\!\) | \(\downharpoonleft x \in Q \downharpoonright\) | ||
\(\text{R4c.}\!\) | \(\downharpoonleft x \in Q \downharpoonright (x)\) | ||
\(\text{R4d.}\!\) | \(\upharpoonleft Q \upharpoonright (x)\) | ||
\(\text{R4e.}\!\) | \(\upharpoonleft Q \upharpoonright (x) = \underline{1}\) |
The first and last items on this list, namely, the sentence \(\text{R4a}\!\) stating \(x \in Q\) and the sentence \(\text{R4e}\!\) stating \(\upharpoonleft Q \upharpoonright (x) = \underline{1},\) are just the pair of sentences from Rule 3 whose equivalence for all \(x \in X\) is usually taken to define the idea of an indicator function \(\upharpoonleft Q \upharpoonright ~:~ X \to \underline\mathbb{B}.\) At first sight, the inclusion of the other items appears to involve a category confusion, in other words, to mix the modes of interpretation and to create an array of mismatches between their ostensible types and the ruling type of a sentence. On reflection, and taken in context, these problems are not as serious as they initially seem. For example, the expression \(^{\backprime\backprime} \downharpoonleft x \in Q \downharpoonright \, ^{\prime\prime}\) ostensibly denotes a proposition, but if it does, then it evidently can be recognized, by virtue of this very fact, to be a genuine sentence. As a general rule, if one can see it on the page, then it cannot be a proposition but can at most be a sign of one.
The use of the basic logical connectives can be expressed in the form of a STR as follows:
[Table missing from this copy: the STR for the basic logical connectives.]
As a general rule, the application of a STR involves the recognition of an antecedent condition and the facilitation of a consequent condition. The antecedent condition is a state whose initial expression presents a match, in a formal sense, to one of the sentences that are listed in the STR, and the consequent condition is achieved by taking its suggestions seriously, in other words, by following its sequence of equivalents and implicants to some other link in its chain.
Generally speaking, the application of a rule involves the recognition of an antecedent condition as a case that falls under a clause of the rule. This means that the antecedent condition is able to be captured in the form, conceived in the guise, expressed in the manner, grasped in the pattern, or recognized in the shape of one of the sentences in a list of equivalents or a chain of implicants.
A condition is amenable to a rule if any of its conceivable expressions formally match any of the expressions that are enumerated by the rule. Applying the rule then relegates the remaining expressions to the production of a result. Thus, there is the choice of an initial expression that needs to be checked on input for whether it fits the antecedent condition, and there are several types of output that are generated as a consequence, only a few of which are usually needed at any given time.
The Tables of Translation Rules that summarize these translations in full detail have been moved to the Appendices. With those Tables set aside for reference, the discussion can proceed to the remaining Definitions and Proof Schemata.
Value Rule 1

If v, w C B,
then "v = w" is a sentence about <v, w> C B2,
[v = w] is a proposition : B2 -> B,
and the following are identical values in B:

V1a. [ v = w ](v, w)
V1b. [ v <=> w ](v, w)
V1c. (( v , w ))

Value Rule 1

If v, w C B, then the following are equivalent:

V1a. v = w.
V1b. v <=> w.
V1c. (( v , w )).

A rule that allows one to turn equivalent sentences into identical propositions:

(S <=> T) <=> ([S] = [T])

Consider [ v = w ](v, w) and [ v(u) = w(u) ](u).

Value Rule 1

If v, w C B, then the following are identical values in B:

V1a. [ v = w ]
V1b. [ v <=> w ]
V1c. (( v , w ))

Value Rule 1

If f, g : U -> B, and u C U,
then the following are identical values in B:

V1a. [ f(u) = g(u) ]
V1b. [ f(u) <=> g(u) ]
V1c. (( f(u) , g(u) ))

Value Rule 1

If f, g : U -> B, then the following are identical propositions on U:

V1a. [ f = g ]
V1b. [ f <=> g ]
V1c. (( f , g ))$

Evaluation Rule 1

If f, g : U -> B and u C U, then the following are equivalent:

E1a. f(u) = g(u).  :V1a
::
E1b. f(u) <=> g(u).  :V1b
::
E1c. (( f(u) , g(u) )).  :V1c :$1a
::
E1d. (( f , g ))$(u).  :$1b

Evaluation Rule 1

If S, T are sentences about things in the universe U,
f, g are propositions : U -> B,
and u C U,
then the following are equivalent:

E1a. f(u) = g(u).  :V1a
::
E1b. f(u) <=> g(u).  :V1b
::
E1c. (( f(u) , g(u) )).  :V1c :$1a
::
E1d. (( f , g ))$(u).  :$1b
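The family of Value Rules above turns on reading the form (( v , w )) as the equivalence of two boolean values. A small Python sketch, with `equiv` as an illustrative name for that connective, checks that the three forms of Value Rule 1 come to the same value over B:

```python
# Value Rule 1 over B = {0, 1}: for values v, w, the sentence
# "v = w", the equivalence "v <=> w", and the form "(( v , w ))"
# all take the same value.  The name equiv is illustrative.

B = (0, 1)

def equiv(v, w):
    # the text's "(( v , w ))": 1 iff v and w agree in B
    return 1 if v == w else 0

for v in B:
    for w in B:
        # V1a, V1c, V1b respectively, all read as truth values
        assert (v == w) == bool(equiv(v, w)) == (bool(v) == bool(w))
```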
Definition 2

If X, Y c U, then the following are equivalent:

D2a. X = Y.
D2b. u C X <=> u C Y, for all u C U.
Definition 3

If f, g : U -> V, then the following are equivalent:

D3a. f = g.
D3b. f(u) = g(u), for all u C U.
[Table missing from this copy.]
Definition 5

If X c U, then the following are identical subsets of UxB:

D5a. {X}
D5b. {<u, v> C UxB : v = [u C X]}
Given an indexed set of sentences, \(s_j\!\) for \(j \in J,\) it is possible to consider the logical conjunction of the corresponding propositions. Various notations for this concept can be useful in various contexts, a sufficient sample of which are recorded in Definition 6.
[Table missing from this copy: Definition 6 records a sample of equivalent notations, D6a through D6e, for the logical conjunction of an indexed set of propositions, ranging from the form "s_j, for all j C J" to the form "Conj(j C J) s_j".]
Definition 7

If S, T are sentences about things in the universe U, then the following are equivalent:

D7a. S <=> T.
D7b. [S] = [T].
Rule 5

If X, Y c U, then the following are equivalent:

R5a. X = Y.  :D2a
::
R5b. u C X <=> u C Y, for all u C U.  :D2b :D7a
::
R5c. [u C X] = [u C Y], for all u C U.  :D7b :???
::
R5d. {<u, v> C UxB : v = [u C X]} = {<u, v> C UxB : v = [u C Y]}.  :??? :D5b
::
R5e. {X} = {Y}.  :D5a

Rule 6

If f, g : U -> V, then the following are equivalent:

R6a. f = g.  :D3a
::
R6b. f(u) = g(u), for all u C U.  :D3b :D6a
::
R6c. ConjUu (f(u) = g(u)).  :D6e

Rule 7

If P, Q : U -> B, then the following are equivalent:

R7a. P = Q.  :R6a
::
R7b. P(u) = Q(u), for all u C U.  :R6b
::
R7c. ConjUu (P(u) = Q(u)).  :R6c :P1a
::
R7d. ConjUu (P(u) <=> Q(u)).  :P1b
::
R7e. ConjUu (( P(u) , Q(u) )).  :P1c :$1a
::
R7f. ConjUu (( P , Q ))$(u).  :$1b

Rule 8

If S, T are sentences about things in the universe U, then the following are equivalent:

R8a. S <=> T.  :D7a
::
R8b. [S] = [T].  :D7b :R7a
::
R8c. [S](u) = [T](u), for all u C U.  :R7b
::
R8d. ConjUu ( [S](u) = [T](u) ).  :R7c
::
R8e. ConjUu ( [S](u) <=> [T](u) ).  :R7d
::
R8f. ConjUu (( [S](u) , [T](u) )).  :R7e
::
R8g. ConjUu (( [S] , [T] ))$(u).  :R7f

For instance, the observation that expresses the equality of sets in terms of their indicator functions can be formalized according to the pattern in Rule 9, namely, at lines (a, b, c), and these components of Rule 9 can be cited in future uses as "R9a", "R9b", "R9c", respectively. Using Rule 7, annotated as "R7", to adduce a few properties of indicator functions to the account, it is possible to extend Rule 9 by another few steps, referenced as "R9d", "R9e", "R9f", "R9g".

Rule 9

If X, Y c U, then the following are equivalent:

R9a. X = Y.  :R5a
::
R9b. {X} = {Y}.  :R5e :R7a
::
R9c. {X}(u) = {Y}(u), for all u C U.  :R7b
::
R9d. ConjUu ( {X}(u) = {Y}(u) ).  :R7c
::
R9e. ConjUu ( {X}(u) <=> {Y}(u) ).  :R7d
::
R9f. ConjUu (( {X}(u) , {Y}(u) )).  :R7e
::
R9g. ConjUu (( {X} , {Y} ))$(u).  :R7f

Rule 10

If X, Y c U, then the following are equivalent:

R10a. X = Y.  :D2a
::
R10b. u C X <=> u C Y, for all u C U.  :D2b :R8a
::
R10c. [u C X] = [u C Y].  :R8b
::
R10d. For all u C U, [u C X](u) = [u C Y](u).  :R8c
::
R10e. ConjUu ( [u C X](u) = [u C Y](u) ).  :R8d
::
R10f. ConjUu ( [u C X](u) <=> [u C Y](u) ).  :R8e
::
R10g. ConjUu (( [u C X](u) , [u C Y](u) )).  :R8f
::
R10h. ConjUu (( [u C X] , [u C Y] ))$(u).  :R8g

Rule 11

If X c U, then the following are equivalent:

R11a. X = {u C U : S}.  :R5a
::
R11b. {X} = { {u C U : S} }.  :R5e
::
R11c. {X} c UxB : {X} = {<u, v> C UxB : v = [S](u)}.  :R
::
R11d. {X} : U -> B : {X}(u) = [S](u), for all u C U.  :R
::
R11e. {X} = [S].  :R

An application of Rule 11 involves the recognition of an antecedent condition as a case under the Rule, that is, as a condition that matches one of the sentences in the Rule's chain of equivalents, and it requires the relegation of the other expressions to the production of a result. Thus, there is the choice of an initial expression that has to be checked on input for whether it fits the antecedent condition, and there is the choice of three types of output that are generated as a consequence, only one of which is generally needed at any given time. More often than not, though, a rule is applied in only a few of its possible ways. The usual antecedent and the usual consequents for Rule 11 can be distinguished in form and specialized in practice as follows:

a. R11a marks the usual starting place for an application of the Rule, that is, the standard form of antecedent condition that is likely to lead to an invocation of the Rule.

b. R11b records the trivial consequence of applying the spiny braces to both sides of the initial equation.

c. R11c gives a version of the indicator function with {X} c UxB, called its "extensional form".

d. R11d gives a version of the indicator function with {X} : U -> B, called its "functional form".

Applying Rule 9, Rule 8, and the Logical Rules to the special case where S <=> (X = Y), one obtains the following general fact.

Fact 1

If X, Y c U, then the following are equivalent:

F1a. S <=> X = Y.  :R9a
::
F1b. S <=> {X} = {Y}.  :R9b
::
F1c. S <=> {X}(u) = {Y}(u), for all u C U.  :R9c
::
F1d. S <=> ConjUu ( {X}(u) = {Y}(u) ).  :R9d :R8a
::
F1e. [S] = [ ConjUu ( {X}(u) = {Y}(u) ) ].  :R8b :???
::
F1f. [S] = ConjUu [ {X}(u) = {Y}(u) ].  :???
::
F1g. [S] = ConjUu (( {X}(u) , {Y}(u) )).  :$1a
::
F1h. [S] = ConjUu (( {X} , {Y} ))$(u).  :$1b
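Over a finite universe the content of Fact 1's chain, equating the equality of two sets with a conjunction of indicator comparisons over the whole universe, can be checked mechanically. The Python sketch below uses illustrative names (`U`, `A`, `B_`, `ind`) that are not fixed by the text:

```python
# Set equality versus a conjunction of indicator-value comparisons,
# over a finite universe U.  All names here are illustrative.

U = set(range(8))
A  = {u for u in U if u % 2 == 0}
B_ = {u for u in U if u in (0, 2, 4, 6)}   # same set, described differently

def ind(S):
    """Indicator function of S, a map into B = {0, 1}."""
    return lambda u: 1 if u in S else 0

fA, fB = ind(A), ind(B_)

eq_as_sets      = (A == B_)
# the ConjUu step: the hidden universal quantifier made explicit
eq_as_functions = all(fA(u) == fB(u) for u in U)

assert eq_as_sets == eq_as_functions == True
```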
Derived Equivalence Relations
One seeks a method of general application for approaching the individual sign relation, a way to select an aspect of its form, to analyze it with regard to its intrinsic structure, and to classify it in comparison with other sign relations. With respect to a particular sign relation, one approach that presents itself is to examine the relation between signs and interpretants that is given directly by its connotative component and to compare it with the various forms of derived, indirect, mediate, or peripheral relationships that can be found to exist among signs and interpretants by way of secondary considerations or subsequent studies. Of especial interest are the relationships among signs and interpretants that can be obtained by working through the collections of objects that they commonly or severally denote.

A classic way of showing that two sets are equal is to show that every element of the first belongs to the second and that every element of the second belongs to the first. The problem with this strategy is that one can exhaust a considerable amount of time trying to prove that two sets are equal before it occurs to one to look for a counterexample, that is, an element of the first that does not belong to the second or an element of the second that does not belong to the first, in cases where that is precisely what one ought to be seeking. It would be nice if there were a more balanced, impartial, neutral, or nonchalant way to go about this task, one that did not require such an undue commitment to either side, a technique that helps to pinpoint the counterexamples when they exist, and a method that keeps in mind the original relation of "proving that" and "showing that" to probing, testing, and seeing "whether".
A different way of seeing that two sets are equal, or of seeing whether two sets are equal, is based on the following observation:

Two sets are equal as sets
<=> the indicator functions of these sets are equal as functions
<=> the values of these functions are equal on all domain elements.

It is important to notice the hidden quantifier, of a universal kind, that lurks in all three equivalent statements but is only revealed in the last.

In making the next set of definitions and in using the corresponding terminology it is taken for granted that all of the references of signs are relative to a particular sign relation R c OxSxI that either remains to be specified or is already understood. Further, I continue to assume that S = I, in which case this set is called the "syntactic domain" of R. In the following definitions let R c OxSxI, let S = I, and let x, y C S.

Recall the definition of Con(R), the connotative component of R, in the following form:

Con(R) = RSI = {<s, i> C SxI : <o, s, i> C R for some o C O}.

Equivalent expressions for this concept are recorded in Definition 8.

Definition 8

If R c OxSxI, then the following are identical subsets of SxI:

D8a. RSI
D8b. ConR
D8c. Con(R)
D8d. PrSI(R)
D8e. {<s, i> C SxI : <o, s, i> C R for some o C O}

The dyadic relation RIS that constitutes the converse of the connotative relation RSI can be defined directly in the following fashion:

Con(R)^ = RIS = {<i, s> C IxS : <o, s, i> C R for some o C O}.

A few of the many different expressions for this concept are recorded in Definition 9.

Definition 9

If R c OxSxI, then the following are identical subsets of IxS:

D9a. RIS
D9b. RSI^
D9c. ConR^
D9d. Con(R)^
D9e. PrIS(R)
D9f. Conv(Con(R))
D9g. {<i, s> C IxS : <o, s, i> C R for some o C O}

Recall the definition of Den(R), the denotative component of R, in the following form:

Den(R) = ROS = {<o, s> C OxS : <o, s, i> C R for some i C I}.

Equivalent expressions for this concept are recorded in Definition 10.
Definition 10

If R c OxSxI, then the following are identical subsets of OxS:

D10a. ROS
D10b. DenR
D10c. Den(R)
D10d. PrOS(R)
D10e. {<o, s> C OxS : <o, s, i> C R for some i C I}

The dyadic relation RSO that constitutes the converse of the denotative relation ROS can be defined directly in the following fashion:

Den(R)^ = RSO = {<s, o> C SxO : <o, s, i> C R for some i C I}.

A few of the many different expressions for this concept are recorded in Definition 11.

Definition 11

If R c OxSxI, then the following are identical subsets of SxO:

D11a. RSO
D11b. ROS^
D11c. DenR^
D11d. Den(R)^
D11e. PrSO(R)
D11f. Conv(Den(R))
D11g. {<s, o> C SxO : <o, s, i> C R for some i C I}

The "denotation of x in R", written "Den(R, x)", is defined as follows:

Den(R, x) = {o C O : <o, x> C Den(R)}.

In other words:

Den(R, x) = {o C O : <o, x, i> C R for some i C I}.

Equivalent expressions for this concept are recorded in Definition 12.

Definition 12

If R c OxSxI, and x C S, then the following are identical subsets of O:

D12a. ROS.x
D12b. DenR.x
D12c. DenR|x
D12d. DenR(, x)
D12e. Den(R, x)
D12f. Den(R).x
D12g. {o C O : <o, x> C Den(R)}
D12h. {o C O : <o, x, i> C R for some i C I}

Signs are "equiferent" if they refer to all and only the same objects, that is, if they have exactly the same denotations. In other language for the same relation, signs are said to be "denotatively equivalent" or "referentially equivalent", but it is probably best to check whether the extension of this concept over the syntactic domain is really a genuine equivalence relation before jumping to the conclusions that are implied by these latter terms.

To define the "equiference" of signs in terms of their denotations, one says that "x is equiferent to y under R", and writes "x =R y", to mean that Den(R, x) = Den(R, y). Taken in extension, this notion of a relation between signs induces an "equiference relation" on the syntactic domain.
For each sign relation R, this yields a binary relation Der(R) c SxI that is defined as follows:

Der(R) = DerR = {<x, y> C SxI : Den(R, x) = Den(R, y)}.

These definitions and notations are recorded in the following display.

Definition 13

If R c OxSxI, then the following are identical subsets of SxI:

D13a. DerR
D13b. Der(R)
D13c. {<x, y> C SxI : DenR|x = DenR|y}
D13d. {<x, y> C SxI : Den(R, x) = Den(R, y)}

The relation Der(R) is defined and the notation "x =R y" is meaningful in every situation where Den(-,-) makes sense, but it remains to check whether this relation enjoys the properties of an equivalence relation.

1. Reflexive property. Is it true that x =R x for every x C S = I? By definition, x =R x if and only if Den(R, x) = Den(R, x). Thus, the reflexive property holds in any setting where the denotations Den(R, x) are defined for all signs x in the syntactic domain of R.

2. Symmetric property. Does x =R y => y =R x for all x, y C S? In effect, does Den(R, x) = Den(R, y) imply Den(R, y) = Den(R, x) for all signs x and y in the syntactic domain S? Yes, so long as the sets Den(R, x) and Den(R, y) are well-defined, a fact which is already being assumed.

3. Transitive property. Does x =R y & y =R z => x =R z for all x, y, z C S? To belabor the point, does Den(R, x) = Den(R, y) and Den(R, y) = Den(R, z) imply Den(R, x) = Den(R, z) for all x, y, z in S? Yes, again, under the stated conditions.

It should be clear at this point that any question about the equiference of signs reduces to a question about the equality of sets, specifically, the sets that are indexed by these signs. As a result, so long as these sets are well-defined, the issue of whether equiference relations induce equivalence relations on their syntactic domains is almost as trivial as it initially appears. Taken in its set-theoretic extension, a relation of equiference induces a "denotative equivalence relation" (DER) on its syntactic domain S = I.
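The three checks just made, reflexivity, symmetry, and transitivity, can also be run mechanically on a small sign relation. The particular relation `R` below and all names in this Python sketch are illustrative assumptions, not examples taken from the text:

```python
# Checking, on a small triadic sign relation R c O x S x I, that the
# equiference relation Der(R) is reflexive, symmetric, and transitive.
# O, S, and R are illustrative choices.

O = {'a', 'b'}
S = {'x', 'y', 'z'}          # S = I, the syntactic domain
R = {('a', 'x', 'x'), ('a', 'y', 'y'), ('b', 'y', 'y'), ('b', 'z', 'z')}

def Den(R, x):
    """Den(R, x) = {o : <o, x, i> in R for some i}."""
    return {o for (o, s, i) in R if s == x}

Der = {(x, y) for x in S for y in S if Den(R, x) == Den(R, y)}

assert all((x, x) in Der for x in S)                                      # reflexive
assert all((y, x) in Der for (x, y) in Der)                               # symmetric
assert all((x, w) in Der for (x, y) in Der for (z, w) in Der if y == z)   # transitive
```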
This leads to the formation of "denotative equivalence classes" (DEC's), "denotative partitions" (DEP's), and "denotative equations" (DEQ's) on the syntactic domain. But what does it mean for signs to be equiferent? Notice that this is not the same thing as being "semiotically equivalent", in the sense of belonging to a single "semiotic equivalence class" (SEC), falling into the same part of a "semiotic partition" (SEP), or having a "semiotic equation" (SEQ) between them. It is only when very felicitous conditions obtain, establishing a concord between the denotative and the connotative components of a sign relation, that these two ideas coalesce.

In general, there is no necessity that the equiference of signs, that is, their denotational equivalence or their referential equivalence, induces the same equivalence relation on the syntactic domain as that defined by their semiotic equivalence, even though this state of accord seems like an especially desirable situation. This makes it necessary to find a distinctive nomenclature for these structures, for which I adopt the term "denotative equivalence relations" (DER's). In their train they bring the allied structures of "denotative equivalence classes" (DEC's) and "denotative partitions" (DEP's), while the corresponding statements of "denotative equations" (DEQ's) are expressible in the form "x =R y".

The uses of the equal sign for denoting equations or equivalences are recalled and extended in the following ways:

1. If E is an arbitrary equivalence relation, then the equation "x =E y" means that <x, y> C E.

2. If R is a sign relation such that RSI is a SER on S = I, then the semiotic equation "x =R y" means that <x, y> C RSI.

3. If R is a sign relation such that F is its DER on S = I, then the denotative equation "x =R y" means that <x, y> C F, in other words, that Den(R, x) = Den(R, y).

The uses of square brackets for denoting equivalence classes are recalled and extended in the following ways:

1. If E is an arbitrary equivalence relation, then "[x]E" denotes the equivalence class of x under E.

2. If R is a sign relation such that Con(R) is a SER on S = I, then "[x]R" denotes the SEC of x under Con(R).

3. If R is a sign relation such that Der(R) is a DER on S = I, then "[x]R" denotes the DEC of x under Der(R).

By applying the form of Fact 1 to the special case where X = Den(R, x) and Y = Den(R, y), one obtains the following facts.

Fact 2.1

If R c OxSxI, then the following are identical subsets of SxI:

F2.1a. DerR  :D13a
::
F2.1b. Der(R)  :D13b
::
F2.1c. {<x, y> C SxI : Den(R, x) = Den(R, y) }  :D13c :R9a
::
F2.1d. {<x, y> C SxI : {Den(R, x)} = {Den(R, y)} }  :R9b
::
F2.1e. {<x, y> C SxI : for all o C O, {Den(R, x)}(o) = {Den(R, y)}(o) }  :R9c
::
F2.1f. {<x, y> C SxI : Conj(o C O) {Den(R, x)}(o) = {Den(R, y)}(o) }  :R9d
::
F2.1g. {<x, y> C SxI : Conj(o C O) (( {Den(R, x)}(o) , {Den(R, y)}(o) )) }  :R9e
::
F2.1h. {<x, y> C SxI : Conj(o C O) (( {Den(R, x)} , {Den(R, y)} ))$(o) }  :R9f :D12e
::
F2.1i. {<x, y> C SxI : Conj(o C O) (( {ROS.x} , {ROS.y} ))$(o) }  :D12a

Fact 2.2

If R c OxSxI, then the following are equivalent:

F2.2a. DerR = {<x, y> C SxI : Conj(o C O) {Den(R, x)}(o) = {Den(R, y)}(o) }  :R11a
::
F2.2b. {DerR} = { {<x, y> C SxI : Conj(o C O) {Den(R, x)}(o) = {Den(R, y)}(o) } }  :R11b
::
F2.2c. {DerR} c SxIxB : {DerR} = {<x, y, v> C SxIxB : v = [ Conj(o C O) {Den(R, x)}(o) = {Den(R, y)}(o) ] }  :R11c
::
F2.2d. {DerR} = {<x, y, v> C SxIxB : v = Conj(o C O) [ {Den(R, x)}(o) = {Den(R, y)}(o) ] }  :Log
::
F2.2e. {DerR} = {<x, y, v> C SxIxB : v = Conj(o C O) (( {Den(R, x)}(o) , {Den(R, y)}(o) )) }  :Log
::
F2.2f. {DerR} = {<x, y, v> C SxIxB : v = Conj(o C O) (( {Den(R, x)} , {Den(R, y)} ))$(o) }  :$

Fact 2.3

If R c OxSxI, then the following are equivalent:

F2.3a. DerR = {<x, y> C SxI : Conj(o C O) {Den(R, x)}(o) = {Den(R, y)}(o) }  :R11a
::
F2.3b. {DerR} : SxI -> B : {DerR}(x, y) = [ Conj(o C O) {Den(R, x)}(o) = {Den(R, y)}(o) ]  :R11d
::
F2.3c. {DerR}(x, y) = Conj(o C O) [ {Den(R, x)}(o) = {Den(R, y)}(o) ]  :Log
::
F2.3d. {DerR}(x, y) = Conj(o C O) [ {DenR}(o, x) = {DenR}(o, y) ]  :Def
::
F2.3e. {DerR}(x, y) = Conj(o C O) (( {DenR}(o, x) , {DenR}(o, y) ))  :Log :D10b
::
F2.3f. {DerR}(x, y) = Conj(o C O) (( {ROS}(o, x) , {ROS}(o, y) ))  :D10a
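The closed form at the end of this chain, F2.3f, lends itself to direct computation: {DerR}(x, y) is the conjunction over o in O of the equivalence of the values {ROS}(o, x) and {ROS}(o, y). A Python sketch under illustrative assumptions (the relation `R` and all names below are not from the text):

```python
# The closed form F2.3f computed directly on a small sign relation.
# O, S, R, ind, DerR are illustrative names.

O = {'a', 'b'}
S = {'x', 'y', 'z'}
R = {('a', 'x', 'x'), ('a', 'y', 'y'), ('b', 'y', 'y'), ('b', 'z', 'z')}

ROS = {(o, s) for (o, s, i) in R}            # denotative projection of R
ind = lambda rel: (lambda *p: 1 if p in rel else 0)

def DerR(x, y):
    # (( p , q )) in the text's notation: 1 iff p and q agree in B;
    # Conj(o C O) becomes Python's all(...) over the object domain.
    return int(all(ind(ROS)(o, x) == ind(ROS)(o, y) for o in O))

assert DerR('x', 'x') == 1
assert DerR('x', 'y') == 0   # 'x' denotes {'a'} while 'y' denotes {'a', 'b'}
```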
Digression on Derived Relations
A better understanding of derived equivalence relations (DER's) can be achieved by placing their constructions within a more general context, and thus comparing the associated type of derivation operation, namely, the one that takes a triadic relation R into a dyadic relation Der(R), with other types of operations on triadic relations. The proper setting would permit a comparative study of all their constructions from a basic set of projections and a full array of compositions on dyadic relations.

To that end, let the derivation Der(R) be expressed in the following way:

{DerR}(x, y) = Conj(o C O) (( {RSO}(x, o) , {ROS}(o, y) )).

From this, abstract a form of composition, temporarily notated as "P#Q", where P c XxM and Q c MxY are otherwise arbitrary dyadic relations, and where P#Q c XxY is defined as follows:

{P#Q}(x, y) = Conj(m C M) (( {P}(x, m) , {Q}(m, y) )).

Compare this with the usual form of composition, typically notated as "P.Q" and defined as follows:

{P.Q}(x, y) = Disj(m C M) ( {P}(x, m) . {Q}(m, y) ).
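The contrast between the two compositions can be made concrete: P#Q demands agreement of P and Q at every middle term, while P.Q asks only for one linking middle term. A Python sketch, with all relations and names illustrative:

```python
# Conjunctive composition P#Q versus ordinary relational composition
# P.Q, on small dyadic relations.  P, Q, M, chi are illustrative.

M = {1, 2}
P = {('x', 1)}               # P c X x M
Q = {(1, 'u'), (2, 'u')}     # Q c M x Y

chi = lambda rel: (lambda a, b: (a, b) in rel)

def sharp(P, Q, M, x, y):
    # {P#Q}(x, y) = Conj(m C M) (( P(x, m) , Q(m, y) ))
    return all(chi(P)(x, m) == chi(Q)(m, y) for m in M)

def dot(P, Q, M, x, y):
    # {P.Q}(x, y) = Disj(m C M) ( P(x, m) . Q(m, y) )
    return any(chi(P)(x, m) and chi(Q)(m, y) for m in M)

assert dot(P, Q, M, 'x', 'u') == True     # m = 1 links x to u
assert sharp(P, Q, M, 'x', 'u') == False  # m = 2: P(x, 2) = 0 but Q(2, u) = 1
```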
Appendices
Logical Translation Rule 1
Geometric Translation Rule 1
Logical Translation Rule 2
Geometric Translation Rule 2
Document History
| Subject: Inquiry Driven Systems : An Inquiry Into Inquiry
| Contact: Jon Awbrey
| Version: Draft 8.70
| Created: 23 Jun 1996
| Revised: 06 Jan 2002
| Advisor: M.A. Zohdy
| Setting: Oakland University, Rochester, Michigan, USA
| Excerpt: Section 1.3.10 (Recurring Themes)
| Excerpt: Subsections 1.3.10.8 - 1.3.10.13