Related work
This page contains links to work related to Chris Langan's Cognitive-Theoretic Model of the Universe (CTMU).
Contents
- 1 Chu spaces, Linear Logic & Coalgebras
- 2 Categories, Orders & Chaos
- 3 Language, Logic & Information
- 4 Evolution, Ecology & Autopoiesis
- 5 Economics, Game Formats & Decision Theory
- 6 Computational, Knowledge & Systems Engineering
- 7 Meta-Theory, Philosophy & Third Culture
- 8 Science, Technology & Society
- 9 Cosmology, Physics & Phenomenology
- 10 Isotelesis & Mereon
Chu spaces, Linear Logic & Coalgebras
"A Chu space is a transformable matrix whose rows transform forwards while its columns transform backwards.
Chu spaces unify a wide range of mathematical structures, including the following:
- Relational structures such as sets, directed graphs, posets, and small categories.
- Algebraic structures such as groups, rings, fields, modules, vector spaces, lattices, and Boolean algebras.
- Topologized versions of the above, such as topological spaces, compact Hausdorff spaces, locally compact abelian groups, and topological Boolean algebras.
Algebraic structures can be reduced to relational structures by a technique described below. Relational structures constitute a large class in their own right. However when adding topology to relational structures, the topology cannot be incorporated into the relational structure but must continue to use open sets.
Chu spaces offer a uniform way of representing relational and topological structure simultaneously. This is because Chu spaces can represent relational structures via a generalization of topological spaces which allows them to represent topological structure at the same time using the same machinery.
Definition:
Surprisingly this degree of generality can be achieved with a remarkably simple form of structure. A Chu space (X,r,A) consists of just three things: a set X of points, individuals, or subjects, a set A of states or predicates, and a lookup table or matrix r: X×A → K which specifies for every subject x and predicate a the value a(x) of that predicate for that subject. The value of a(x) is given by the matrix r as the entry r(x,a). These matrix entries are drawn from a set K.
K can be as simple as {0,1}, as when representing topological spaces or Boolean algebras. Or it can be as complex as the set of all complex numbers, as when representing Hilbert spaces. Or it can be something in between, such as the set of 16 subsets of {0,1,2,3} when representing topological groups. The full generality of Chu spaces derives from the ability of predicates to take other values than simply true or false (the case K={0,1})." http://chu.stanford.edu/
"The rows present the physical, concrete, conjunctive, or yang aspects of the space, while the columns present the mental, coconcrete, disjunctive, or yin aspects. We may regard rows and columns as characteristic functions of subsets of respectively X and A, where K is taken to consist of the degrees of membership, with K = 2 = {0, 1} giving the ordinary notion of membership, 1 for in and 0 for out."Chu Spaces and their Interpretation as Concurrent Objects
"A Chu space is a binary relation from a set A to an antiset X which is defined as a set which transforms via converse functions." [1]
"The essence of the points and states (dual points) of Chu spaces may be found in the subjects and predicates of linguistics, the individuals and predicates of logic, the positions and momenta of physics, the inbound and outbound arrows at categorical objects, and, most importantly for this talk, the events and states of computational process algebra." [2]
"We give an elementary introduction to Chu spaces viewed as a set of strings all of the same length. This perspective dualizes the alternative view of Chu spaces as generalized topological spaces, and has the advantage of substituting the intuitions of formal language theory for those of topology." Chu Spaces from the Representational Viewpoint, *-Autonomous categories: Quantifiers in Action (Generalized Quantification in Query, Logical and Natural Languages, [3], [4]
"One of the approximation tools we will use is that of the nerve. Various nerve constructions are widely used in mathematics and computer science (including AI). They have also served as a link between continuous and discrete models in other contexts, for instance, relating to economics (even explicitly, cf. Baryshnikov, [3]). They give, at any `resolution' a simplicial complex and behave moderately well under refinement of the resolution. The title of the talks at the workshop contained the word `fractafold'. What are they and how are they relevant to this theme? In this research, we are investigating a class of idealised spaces that could be used as a testbed for any emerging analogue of the tools of differential geometry in the observational context. These spaces should include some well known ones such as the Cantor set and less well known ones such as the solenoids and Menger manifolds, (for which see later in this paper). These example spaces have various common features including reasonably constructive local properties, specified by local iterated function schemes. They are fractal, but not too wildly so!" A geometry of information, I: Nerves, posets and differential forms
"In this second part of our contribution to the workshop, we look in more detail at the Sorkin model, its relationship to constructions in Chu space theory, and then compare it with the Nerve constructions given in the first part."A geometry of information, II: Sorkin models, and biextensional collapses
"The nerve of a category is often used to construct topological versions of moduli spaces. If X is an object of C, its moduli space should somehow encode all objects isomorphic to X and keep track of the various isomorphisms between all of these objects in that category. This can become rather complicated, especially if the objects have many non-identity automorphisms. The nerve provides a combinatorial way of organizing this data. Since simplicial sets have a good homotopy theory, one can ask questions about the meaning of the various homotopy groups πn(N(C)). One hopes that the answers to such questions provide interesting information about the original category C, or about related categories. The notion of nerve is a direct generalization of the classical notion of classifying space of a discrete group"[5], Simplicial Sets, Model Category, Model Structure on Simplicial Sets
"The superficial similarity between the Chu construction and the Hyland Tan double glueing construction G has been observed widely. This paper establishes a more formal mathematical relationship between the two. We show that double glueing on relations subsumes the Chu construction on sets: we present a full monoidal embedding of the category chu(Set,K) of biextensional Chu spaces over K into G(Rel^K), and a full monoidal embedding of the category Chu(Set,K) of Chu spaces over K into IG(Rel^K), where we define IG, the intensional double glueing construction, by substituting multisets for sets in G. We define a biextensional collapse from IG to G which extends the familiar notion on Chu spaces. This yields a new interpretation of the monic specialisation implicit in G as a form of biextensionality."Intensional Double Glueing, Biextensional Collapse, and the Chu Construction, [6], [7], [8], [9]
"This study opens up opportunities for further investigations into recursively defined Chu spaces, as well as constructive models of linear logic." [10]
"A model of concurrency usually consist of a set (domain) whose elements denote concurrent systems, together with some structure. The structure takes the shape of a collection of operators, turning the domain into an algebra (a process algebra), optionally in combination with a collection of predicates, or a collection of morphisms between the elements, making the domain into a category." Classifying Models of Concurrency, Modeling Concurrency with Geometry, Modeling Concurrency with Partial Orders
Algebraic Process Calculi: The First Twenty-Five Years and Beyond
Chu Spaces: Automata with Quantum Aspects
Information Transfer across Chu Spaces
Rational Mechanics and Natural Mathematics
Chu Spaces: Toward New Justification for Fuzzy Heuristics
Dialectica and Chu Construction: Cousins?, Dialectica Space
"We pursue a model-oriented rather than axiomatic approach to the foundations of Quantum Mechanics, with the idea that new models can often suggest new axioms. This approach has often been fruitful in Logic and Theoretical Computer Science. Rather than seeking to construct a simplified toy model, we aim for a `big toy model', in which both quantum and classical systems can be faithfully represented - as well as, possibly, more exotic kinds of systems. To this end, we show how Chu spaces can be used to represent physical systems of various kinds. In particular, we show how quantum systems can be represented as Chu spaces over the unit interval in such a way that the Chu morphisms correspond exactly to the physically meaningful symmetries of the systems - the unitaries and antiunitaries. In this way we obtain a full and faithful functor from the groupoid of Hilbert spaces and their symmetries to Chu spaces. We also consider whether it is possible to use a finite value set rather than the unit interval; we show that three values suffice, while the two standard possibilistic reductions to two values both fail to preserve fullness." Big Toy Models: Representing Physical Systems as Chu Spaces, [11], [12]
Coalgebras, Chu Spaces, and Representations of Physical Systems, Introduction to Coalgebra: Towards Mathematics of States and Observations, Coalgebra and Circularity, [13]
"Linear Logic was introduced by J.-Y. Girard in 1987 and it has attracted much attention from computer scientists, as it is a logical way of coping with resources and resource control. The focus of this technical report will be on proof-theory and computational interpretation of proofs, that is, we will focus on the question of how to interpret proofs as programs and reduction (cut-elimination) as evaluation. We first introduce Classical Logic. This is the fundamental idea of the proofs-as-programs paradigm. Cut-elimination for Classical Logic is highly non-deterministic; it is shown how this can be remedied either by moving to Intuitionistic Logic or to Linear Logic. In the case on Linear Logic we consider Intuitionistic Linear Logic as well as Classical Linear Logic. Furthermore, we take a look at the Girard Translation translating Intuitionistic Logic into Intuitionistic Linear Logic. Also, we give a brief introduction to some concrete models of Intuitionistic Linear Logic." Introduction to Linear Logic
Linear Logic, Ludics, Implicit Complexity, Operator Algebras: Geometry of Interaction
"We consider foundational questions related to the definition of functions by corecursion. This method is especially suited to functions into the greatest fixed point of some monotone operator, and it is most applicable in the context of non-wellfounded sets." On the Foundations of Corecursion
"This entry is about two kinds of circularity: object circularity, where an object is taken to be part of itself in some sense; and definition circularity, where a collection is defined in terms of itself. Instances of these two kinds of circularity are sometimes problematic, and sometimes not. We are primarily interested in object circularity in this entry, especially instances which look problematic when one tries to model them in set theory. But we shall also discuss circular definitions.
The term non-wellfounded set refers to sets which contain themselves as members, and more generally which are part of an infinite sequence of sets each term of which is an element of the preceding set. So they exhibit object circularity in a blatant way. Discussion of such sets is very old in the history of set theory, but non-wellfounded sets are ruled out of Zermelo-Fraenkel set theory (the standard theory) due to the Foundation Axiom (FA). As it happens, there are alternatives to this axiom FA. This entry is especially concerned with one of them, an axiom first formulated by Marco Forti and Furio Honsell in a 1983 paper. It is now standard to call this principle the Anti-Foundation Axiom (AFA), following its treatment in an influential book written by Peter Aczel in 1988.
The attraction of using AFA is that it gives a set of tools for modeling circular phenomena of various sorts. These tools are connected to important circular definitions, as we shall see. We shall also be concerned with situating both the mathematics and the underlying intuitions in a broader picture, one derived from work in coalgebra. Incorporating concepts and results from category theory, coalgebra leads us to concepts such as corecursion and coinduction; these are in a sense duals to the more standard notions of recursion and induction." Non-wellfounded set theory, [14]
"In mathematics, coalgebras or cogebras are structures that are dual (in the sense of reversing arrows) to unital associative algebras. The axioms of unital associative algebras can be formulated in terms of commutative diagrams. Turning all arrows around, one obtains the axioms of coalgebras. Every coalgebra, by (vector space) duality, gives rise to an algebra, but not in general the other way. In finite dimensions, this duality goes in both directions (see below). Coalgebras occur naturally in a number of contexts (for example, universal enveloping algebras and group schemes)."[15], Introduction to Coalgebra
Bisimulation and Coinduction, Learning Lounge, A Brief Introduction to Coalgebra Representation Theory, [16]
Jets and differential linear logic https://arxiv.org/abs/1811.06235
Formal manifolds and synthetic theory of jet bundles http://www.numdam.org/item/CTGDC_1980__21_3_227_0/
Categories, Orders & Chaos
Spekkens's toy theory as a category of processes http://arxiv.org/abs/1108.1978
Non-Locality in Categorical Quantum Mechanics: http://www.cs.ox.ac.uk/people/bob.coecke/BillThesis
Universal Algebra ... A more generalised programme along these lines is carried out by category theory. Given a list of operations and axioms in universal algebra, the corresponding algebras and homomorphisms are the objects and morphisms of a category. Category theory applies to many situations where universal algebra does not, extending the reach of the theorems. Conversely, many theorems that hold in universal algebra do not generalise all the way to category theory. Thus both fields of study are useful.
A more recent development in category theory that generalizes operations is operad theory – an operad is a set of operations, similar to a universal algebra. http://en.wikipedia.org/wiki/Universal_algebra
"There are various hints in the literature as to categorical foundations for model-theory. The type spaces seem fundamental, the models much less so. Now is perhaps the time to give new foundations, with the flexibility of those of algebraic geometry. It now seems to me natural to have distinguished quantifiers for various particularly significant kinds of morphism (proper, étale, flat, finite, etc), thus giving more suggestive quantifier-eliminations. The traditional emphasis on logical generality generally obscures geometrically significant features." ... "I sense that we should be a bit bolder by now. There are many issues of uniformity associated with the Weil Cohomology Theories, and major definability issues relating to Grothendieck’s Standard Conjectures. Model theory (of Henselian fields) has made useful contact with motivic considerations, including Kontsevich’s motivic integration. Maybe it has something useful to say about “algebraic geometry over the one element field”, ultimately a question in definability theory."
Lawvere’s Quantifiers and Sheaves, Actes Congrès. Intern. Math. 1970, pp. 329-334. I wonder what model theorists think of Makkai and Paré (1989), Accessible categories: The foundation of Categorical Model Theory, Contemporary Mathematics, AMS. Category Theory and Model Theory
"Simplicial sets generalize the idea of simplicial complexes: a simplicial set is like a combinatorial space built up out of gluing abstract simplices to each other. Equivalently, it is an object equipped with a rule for how to consistently map the objects of the simplex category into it." Simplical Set
"In mathematics, a directed set (or a directed preorder or a filtered set) is a nonempty set A together with a reflexive and transitive binary relation ≤ (that is, a preorder), with the additional property that every pair of elements has an upper bound. Directed sets are a generalization of nonempty totally ordered sets, that is, all totally ordered sets are directed sets (but not all partially ordered sets). In topology, directed sets are used to define nets, which generalize sequences and unite the various notions of limit used in analysis. Directed sets also give rise to direct limits in abstract algebra and (more generally) category theory." Directed Set
"In mathematical order theory, an ideal is a special subset of a partially ordered set (poset). Although this term historically was derived from the notion of a ring ideal of abstract algebra, it has subsequently been generalized to a different notion. Ideals are of great importance for many constructions in order and lattice theory. ... The dual notion of an ideal, i.e. the concept obtained by reversing all ≤ and exchanging with , is a filter. The terms order ideal and order filter are sometimes used for arbitrary lower or upper sets."Ideal (Order Theory)
"Even though most mathematicians do not accept the constructivist's thesis, that only mathematics done based on constructive methods is sound, constructive methods are increasingly of interest on non-ideological grounds. For example, constructive proofs in analysis may ensure witness extraction, in such a way that working within the constraints of the constructive methods may make finding witnesses to theories easier than using classical methods. Applications for constructive mathematics have also been found in typed lambda calculi, topos theory and categorical logic,which are notable subjects in foundational mathematics and computer science. In algebra, for such entities as toposes and Hopf algebras, the structure supports an internal language that is a constructive theory; working within the constraints of that language is often more intuitive and flexible than working externally by such means as reasoning about the set of possible concrete algebras and their homomorphisms. Physicist Lee Smolin writes in Three Roads to Quantum Gravity that topos theory is "the right form of logic for cosmology" (page 30) and "In its first forms it was called 'intuitionistic logic'" (page 31). "In this kind of logic, the statements an observer can make about the universe are divided into at least three groups: those that we can judge to be true, those that we can judge to be false and those whose truth we cannot decide upon at the present time" (page 28)." Constructivism (mathematics)
"Lattice theory is the study of sets of objects known as lattices. It is an outgrowth of the study of Boolean algebras, and provides a framework for unifying the study of classes or ordered sets in mathematics. The study of lattice theory was given a great boost by a series of papers and subsequent textbook written by Birkhoff (1967)."Lattice Theory, [17], [18], [19]
Foundations of Mathematics, PlanetMath Encyclopedia
Make Category Theory Intuitive!, Conceptual Mathematics, Theory and Applications of Categories, Applied and Computational Category Theory, North American School of Category Theory
Physics, Topology, Logic and Computation: A Rosetta Stone, Categories for the Practising Physicist, Category Theory as the Language of Consciousness, Category Theory & Consciousness, Consciousness (Scholarpedia), Modeling Cognitive Systems with Category Theory, Neural Algebra and Consciousness: Theory of Structural Functionality in Neural Nets, Quantum Approaches to Consciousness (Stanford Encyclopedia), TGD & Consciousness, Neurophenomenology, Category Theory, Space and Time, [20], Categories, Sheaves, Presheaves and Cohomologies for the Theory of Consciousness
Finsler set theory: platonism and circularity, Non-well-founded Set Theory
"Indeed, the first significant work in this style on the applications of category theory to the study of set theory was the monograph Algebraic Set Theory (Cambridge, 1995) by Andre Joyal and Ieke Moerdijk. The second reason is that we believe the locution "algebraic logic" should properly refer to categorical logic rather than just the logic of Boole and his modern proponents, since categorical logic subsumes such lattice theoretic methods and not the other way around. Hence the term "algebraic set theory" rather than "categorical set theory". This is in keeping with the use of "algebraic" to mean, essentially, "functorial" in modern algebraic topology, algebraic geometry, etc." Algebraic Set Theory, Predicative Algebraic Set Theory, [21], [22], [23], [24],
From Partition Logic to Information Theory: Some basic analogies between subset logic and partition logic, Two Problems that Shaped a Century of Set Theory, Discovering Modern Set Theory
A Geometry of Approximation: Rough Set Theory: Logic, Algebra and Topology
Introduction to Lattices and Order, Lattice Computing: Lattice Theory Based Computational Intelligence, Concept Lattices and Co-Covering Graphs
The Mathematics of Boolean Algebra
Differential Algebraic Topology, Stratifolds, [25]
"The remaining chapters present a detailed exposition of one of these trends (the homotopic version of Plateau's problem in terms of stratified multivarifolds) and the Plateau problem in homogeneous symplectic spaces. Dao Trong Thi formulated the new concept of a multivarifold, which is the functional analog of a geometrical stratified surface and enables us to solve Plateau's problem in a homotopy class."Protocomputational Multivarifolds, Heirarchical Stratifolds, Fractal Bubbles, Nested Hyperstructures, Cosmologic Minimalism, Hyperincursive Conspansion
Comparative Smootheology: Workshop on Smooth Structures in Logic, Category Theory, and Physics
Dualizing Object, Isbell Duality, Enriched Stratified systems for the Foundations of Category Theory
Hyperstructure, Applications of Hyperstructure Theory, Hypergraphs, Hypernetworks for Reconstructing the Dynamics of Multilevel Systems
"Elemental perspective:
The generative sciences explore natural phenomena at several levels, including physical, biological and social processes, as emergent processes. They explore how complex natural processes are generated through continuous interactions among elemental entities governed by parsimonious and simple universal rules and parameters.
Scientific and philosophical origins:
The generative sciences originate from the monadistic philosophy of Leibniz. This was further developed by the neural model of Walter Pitts and Warren McCulloch. The development of computers or Turing Machines provided a technical foundation for the growth of the generative sciences. However, the cornerstones of the generative sciences came from the work on cellular automaton theory by John von Neumann, which was based on the Walter Pitts and Warren McCulloch model of the neuron. Cellular automata were mathematical representations of simple entities interacting under common rules and parameters to manifest complex behaviors. The generative sciences were further unified by the cybernetics theories of Norbert Wiener and the information theory of Claude E. Shannon and Warren Weaver in 1948. The mathematician Shannon gave the theory of the bit as a unit of information to make a basic decision, in his paper A mathematical theory of communication (1948). On this was further built the idea of uniting the physical, biological and social sciences into a holistic discipline of Generative Philosophy under the rubric of General Systems Theory, by Bertalanffy, Anatol Rapoport, Ralph Gerard, and Kenneth Boulding. This was further advanced by the works of Stuart Kauffman in the field of self-organization. It has also advanced through the works of Heinz von Foerster, Ernst von Glasersfeld, Gregory Bateson and Humberto Maturana in what came to be called constructivist epistemology or radical constructivism. The most influential advance in the generative sciences came from the development of the cognitive sciences through the theory of generative grammar by the American linguist Noam Chomsky (1957). At the same time the theory of the perceptron was critically analysed by Marvin Minsky and Seymour Papert at MIT. It was also in the early 1950s that Crick and Watson gave the double helix model of DNA, at the same time as psychologists at MIT including Kurt Lewin, Jacob Ludwig Moreno and Fritz Heider laid the foundations for group dynamics research which later developed into social network analysis. In 1996 Joshua M. Epstein and Robert Axtell developed the Sugarscape model in their seminal work Growing Artificial Societies. In their work they expressed the idea of Generative science which would explore and simulate the world through generative processes. Michael Leyton, professor of Cognitive Psychology at Rutgers University, has written an interesting "Generative Geometry."" Generative Sciences
"A variety of modeling frameworks have been proposed and utilized in complex systems studies, including dynamical systems models that describe state transitions on a system of fixed topology, and self-organizing network models that describe topological transformations of a network with little attention paid to dynamical state changes. Earlier network models typically assumed that topological transformations are caused by exogenous factors, such as preferential attachment of new nodes and stochastic or targeted removal of existing nodes. However, many real-world complex systems exhibit both of these two dynamics simultaneously, and they evolve largely autonomously based on the system's own states and topologies. Here we show that, by using the concept of graph rewriting, both state transitions and autonomous topology transformations of complex systems can be seamlessly integrated and represented in a unified computational framework.We call this novel modeling framework "Generative Network Automata (GNA)". In this chapter, we introduce basic concepts of GNA, its working definition, its generality to represent other dynamical systems models, and some of our latest results of extensive computational experiments that exhaustively swept over possible rewriting rules of simple binary-state GNA. The results revealed several distinct types of the GNA dynamics." Generative Network Automata: A Generalized Framework for Modeling Adaptive Network Dynamics Using Graph Rewritings, Unifying Dynamical Systems and Complex Network Theories: A Proposal of "Generative Network Automata"(GNA)
Simplicial Complexes of Graphs, Graphs, Simplicial Complexes, and Beyond: Topological Tools for Multi-Agent Coordination
"The ultimate goal of our work is to show how sheaf theory can be used for studying cooperating robotics scenarios. In this paper we propose a formal definition for systems and define a category of systems. The main idea of the paper is that relationships between systems can be expressed by a suitable Grothendieck topology on the category of systems. We show that states and (parallel) actions can be expressed by sheaves and use this in order to study the behavior of systems in time.
"Sheaf theory was developed in mathematics because of the necessity of studying the relationship between "local" and "global" phenomena...The alternance "local-global" that occurs in this case suggests that a sheaf-theoretical approach to the study of systems of cooperating agents (and in the study of concurrency in general) would be natural...In a series of papers, J. Pfalzgraf develops the idea of "logical fiberings", with the goal of developing a (non-classical) "fibered logical calculus", by means of which one could construct logical controllers for multi-tasking scenerios in a formal way." Towards a Sheaf Semantics for Cooperating Agents Scenarios [26]
"Monge has identified nine major elements of systems as defined by systems theory, namely, that they are isomorphic, hierarchical, interdependent, teleological, nonsummative, self-regulative, equifinite, adaptable, and (in the case of open systems) interactive with the environment. [Monge, 1977]"[27]
Nonlinear Dynamics and Complex Systems Theory- Glossary of Terms
Complex Systems From the Perspective of Category Theory: I. Functioning of the Adjunction Concept.
"We develop a category theoretical scheme for the comprehension of the information structure associated with a complex system, in terms of families of partial or local information carriers. The scheme is based on the existence of a categorical adjunction, that provides a theoretical platform for the descriptive analysis of the complex system as a process of functorial information communication." http://users.uoa.gr/~ezafiris/
Category Theory and Universal Models: Adjoints and Brain Functors
"Since its formal definition over sixty years ago, category theory has been increasingly recognized as having a foundational role in mathematics. It provides the conceptual lens to isolate and characterize the structures with importance and universality in mathematics. The notion of an adjunction (a pair of adjoint functors) has moved to center-stage as the principal lens. The central feature of an adjunction is what might be called "internalization through a universal" based on universal mapping properties. A recently developed "heteromorphic" theory of adjoint functors allows the concepts to be more easily applied empirically. This suggests a conceptual structure, albeit abstract, to model a range of selectionist mechanisms (e.g., in evolution and in the immune system). Closely related to adjoints is the notion of a "brain functor" which abstractly models structures of cognition and action (e.g., the generative grammar view of language)." http://www.ellerman.org/Davids-Stuff/Maths/Math.htm
http://en.wikipedia.org/wiki/Mereology
Charles Muses: Time and Destiny (excerpt) -- A Thinking Allowed https://youtu.be/XiMpUxYSiIM
“Muses also envisioned a mathematical number concept, Musean hypernumbers, that includes hypercomplex number algebras such as complex numbers and split-complex numbers as primitive types. He assigned them levels based on certain arithmetical properties they may possess. While many open questions remain, in particular about defining the relations of these levels, Muses pictured a wide range of applicability for this concept. Some of these are based on properties of magic squares,[1] and even related to religious belief. He believed that these hypernumbers were central to issues of consciousness.” https://en.wikipedia.org/wiki/Charles_Musès
Language, Logic & Information
UNIVERSAL LOGIC: Towards a general theory of logics
Network for Philosophical Logic and Its Applications, natural language semanticists and philosophers of language
Inter-Stratal Tension: a place to review the delicate balance between language and reality, Linguistic and Cognitive Networks, [28]
North American Summer School for Logic, Language, and Information
Center for the Study of Language and Information
Institute for Logic, Language, and Computation
Association for Logic, Language, and Information
Adaptive Predicates in Empty-Start Natural Language Parsing (2001) Q. T. Jackson and C.M. Langan
"In 1985, Robert Rosen defined an anticipatory system as follows: A system containing a predictive model of itself and/or its environment, which allows it to change state at an instant in accord with the model's predictions pertaining to a later instant." Anticipation (Artificial Intelligence), Computing Anticipatory Systems with Incursion and Hyperincursion
"Quine's work in logic gradually became dated in some respects. Techniques he did not teach and discuss include analytic tableaux, recursive functions, and model theory. His treatment of metalogic left something to be desired. For example, Mathematical Logic does not include any proofs of soundness and completeness. Early in his career, the notation of his writings on logic was often idiosyncratic. His later writings nearly always employed the now-dated notation of Principia Mathematica. Set against all this are the simplicity of his preferred method (as exposited in his Methods of Logic) for determining the satisfiability of quantified formulas, the richness of his philosophical and linguistic insights, and the fine prose in which he expressed them.
Most of Quine's original work in formal logic from 1960 onwards was on variants of his predicate functor logic, one of several ways that have been proposed for doing logic without quantifiers." Willard Van Orman Quine
"Mathematical logic, especially proof theory and theory of computation, foundations of mathematics, especially constructive and predicative foundations." Solomon Feferman
"Categorical logic originated with Bill Lawvere's Functorial Semantics of Algebraic Theories (1963), and Elementary Theory of the Category of Sets (1964). Lawvere recognised the Grothendieck topos, introduced in algebraic topology as a generalised space, as a generalisation of the category of sets (Quantifiers and Sheaves (1970)). With Myles Tierney, Lawvere then developed the notion of elementary topos, thus establishing the fruitful field of topos theory, which provides a unified categorical treatment of the syntax and semantics of higher-order predicate logic. The resulting logic is formally intuitionistic. Andre Joyal is credited, in the term Kripke–Joyal semantics, with the observation that the sheaf models for predicate logic, provided by topos theory, generalise Kripke semantics. Joyal and others applied these models to study higher-order concepts such as the real numbers in the intuitionistic setting." ... "In an even broader perspective, one might take category theory to be to the mathematics of the second half of the twentieth century, what measure theory was to the first half. It was Kolmogorov who applied measure theory to probability theory, the first convincing (if not the only) axiomatic approach. Kolmogorov was also a pioneer writer in the early 1920s on the formulation of intuitionistic logic, in a style entirely supported by the later categorical logic approach (again, one of the formulations, not the only one; the realizability concept of Stephen Kleene is also a serious contender here). Another route to categorical logic would therefore have been through Kolmogorov, and this is one way to explain the protean Curry–Howard isomorphism."Categorical Logic
Research Proposal: Quantum Computation and Algebraic Structures in Semantics
Ontological Commitment/Innocence, Neutrosophy, [29], carrara.pdf, [30]
Linguistic Theory and Meta-Theory: for a Science of Texts
Stratificational Grammar (glottopedia)
Biological Information as Game-Theoretic Information
Boudewijn de Bruin (logic of game theoretic explanations)
“A relational first order structure is homogeneous if every isomorphism between finite substructures extends to an automorphism” https://www.birs.ca/workshops/2015/15w5100/report15w5100.pdf
Intuition behind homogeneous models https://math.stackexchange.com/questions/2835270/intuition-behind-homogeneous-models
Evolution, Ecology & Autopoiesis
"Macroscopic collections of simple (and typically nonlinear) interacting units that are endowed with the ability to evolve and adapt to a changing environment." Complex Adaptive System
"Mathematically, problems involving strongly self-referential global/system wide mappings or those that involve coevolutionary contrarian or hostile agents make these impossible to be solved by deductive means of Column I, Table 1. Adaptive/evolutionary methods of Column III, Table 1 have to be used and models beyond optimization and the rational economic man of neoclassical economics are needed. The algorithmic unsolvability of the fixed point mappings of equilibria with adaptive novelty in the Gödelian structure (see, Figure 1), makes self-organization the flip side of CAS and is perhaps the unifying principlefor all complex systems." Computability and Evolutionary Complexity: Markets as Complex Adaptive Systems
Predictions of Intelligent Design
Darwinian and Teleological Explanations: Are They Incompatible?, Evolutionary Game Theory
"Homeotely: The term homeotely signifies that subsystems will direct their behaviour in such a way that it is beneficial for the well-being of the overall system. When applied to the evolutionary process, it states that subsystems will develop in such a way that they are beneficial for the well-being of the overall system. At first glance, this sounds embarrassingly teleological. However, if we recognize the fact that the behaviour as well as the evolution of systems is guided by context-sensitive self-interest, teleology vanishes into thin air. Context-sensitive self-interest is a systemic evolutionary principle: organisms are forced by their selfish genes to seek nothing but their own advantage - but the environment in which they develop, or the system of which they are a subsystem, only allows a limited set of developments and patterns of behaviour, any breach of the rules being punished with elimination. For an animal endowed with choice this harsh law transforms into an ethical principle: since its behaviour is only partly genetically determined, the word sensitive assumes its active meaning, i.e. it refers to conscious reactions to perceived or anticipated effects of behaviour or development on the overall system. Edward Goldsmith, The Way, [31]
Intelligent Design - Introduction
At the Intersection of "Metaphysical Naturalism" and "Intelligent Design"
"She then got to work, pronouncing the death of neo-Darwinism. Echoing Darwin, she said “It was like confessing a murder when I discovered I was not a neo-Darwinist.” But, she quickly added, “I am definitely a Darwinist though. I think we are missing important information about the origins of variation. I differ from the neo-Darwinian bullies on this point.” She then outlined the basis of her theory of the origin of the cell nucleus as a fusion between archaebacteria (thermoplasma) and Eubacteria (Spirochaeta). “We live on a bacterial planet,” she reflected. “The cell is the fundamental unit of life. A minimal cell has DNA, mRNA, tRNA, rRNA, amino acylating enzymes, polymerases, sources of energy and electrons, lipoprotein membranes, and ion channels, all contained within a cell wall, and is an autopoietic (self-regulating feedback) system.”[32]
"My own position on evolution (Vaughan Pratt):
I am skeptical of claims for the ability of random mutation and natural selection to account for the complexity of life. Careful examination of the evidence for any such theory should be encouraged. What I do believe is that the evident variations between individuals of any species, however caused, in combination with natural selection, are primary drivers of speciation, in accord with Darwin. Accounting for the complexity of life is a tremendously more challenging task that random mutation and natural selection cannot possibly accomplish on their own. Darwin never claimed his theory did so, and the vast amount we have learned since Darwin about molecular biology, cell biology, and ecology shows that it would have been wildly presumptuous for him to have done so.
The above is my position statement. The first two sentences reproduce almost verbatim the wording of A Scientific Dissent from Darwinism signed by some 700 scientists. The only replacements are the obvious "I am" for "We are," and more importantly "any such theory" for "Darwinian theory" since Darwin did not claim to account for the complexity of life but for the phenomenon of speciation, whereby the characteristics of a population can change over time: the longer the time the greater the number and extent of possible changes. Speciation can contribute to the complexity of life, as Darwin pointed out with admirable clarity, but unless it is the only such contributor it cannot be said to account for it, and nowhere did Darwin claim to have done so. ... An argument for more intelligent design:
Intelligent Design is the theory that complex structures on this planet such as trees and brains are too complex to have arisen by mere chance juxtaposition of their many components and must therefore have an intelligent cause.
Intelligent design is advanced by its proponents as an alternative to the mechanism of natural selection at the core of Darwin's theory of evolution. Darwin postulated that every species arose from earlier species by natural selection of those individuals of the species best adapted to the prevailing circumstances.
A prerequisite for evolution is that the species which is to evolve must contain some variation between its individuals; if all members of the species were identical, natural selection would have no way of selecting better-adapted individuals.
Where I run into difficulty is with the idea that an intelligent designer would not exploit natural selection. Why laboriously design the universe by hand when the convenient mechanism of natural selection allows one to do the job automatically?
Designing the universe by hand is as inefficient as washing laundry by hand. When the benefits of automation are available the intelligent designer avails himself of them. All He need say is, "Let there be variety." Natural selection is a wonderfully efficient way to design the universe."[33]
"Autopoiesis (from Greek αὐτo- (auto-), meaning "self", and ποίησις (poiesis), meaning "creation, production") literally means "self-creation" and expresses a fundamental dialectic between structure, mechanism and function. The term was introduced in 1972 by Chilean biologists Humberto Maturana and Francisco Varela:
An autopoietic machine is a machine organized (defined as a unity) as a network of processes of production (transformation and destruction) of components which: (i) through their interactions and transformations continuously regenerate and realize the network of processes (relations) that produced them; and (ii) constitute it (the machine) as a concrete unity in space in which they (the components) exist by specifying the topological domain of its realization as such a network.[1]
[...] the space defined by an autopoietic system is self-contained and cannot be described by using dimensions that define another space. When we refer to our interactions with a concrete autopoietic system, however, we project this system on the space of our manipulations and make a description of this projection.[2]" http://en.wikipedia.org/wiki/Autopoiesis
"the process whereby an organization produces itself. An autopoietic organization is an autonomous and self-maintaining unity which contains component-producing processes. The components, through their interaction, generate recursively the same network of processes which produced them. An autopoietic system is operationally closed and structurally state determined with no apparent inputs and outputs. A cell, an organism, and perhaps a corporation are examples of autopoietic systems. See allopoiesis. (F. Varela) Literally, self-production. The property of systems whose components (1) participate recursively in the same network of productions that produced them, and (2) realize the network of productions as a unity in the space in which the components exist (after Varela) (see recursion). Autopoiesis is a process whereby a system produces its own organization and maintains and constitutes itself in a space. E.g., a biological cell, a living organism and to some extend a corporation and a society as a whole. (krippendorff)" http://pespmc1.vub.ac.be/ASC/autopoiesis.html
"The CTMU has a meta-Darwinian message: the universe evolves by hological self-replication and self-selection. Furthermore, because the universe is natural, its self-selection amounts to a cosmic form of natural selection. But by the nature of this selection process, it also bears description as intelligent self-design (the universe is “intelligent” because this is precisely what it must be in order to solve the problem of self-selection, the master-problem in terms of which all lesser problems are necessarily formulated). This is unsurprising, for intelligence itself is a natural phenomenon that could never have emerged in humans and animals were it not already a latent property of the medium of emergence. An object does not displace its medium, but embodies it and thus serves as an expression of its underlying syntactic properties. What is far more surprising, and far more disappointing, is the ideological conflict to which this has led. It seems that one group likes the term “intelligent” but is indifferent or hostile to the term “natural”, while the other likes “natural” but abhors “intelligent”. In some strange way, the whole controversy seems to hinge on terminology."[34]
R. Crumb’s “Trash – What Do We Throw Away?” (1982) https://whosouttherecomics.wordpress.com/2018/09/19/r-crumbs-trash-what-do-we-throw-away-1982/
“Urban Systems Dynamics, Urban Growth and Scaling Laws: The Question of Ergodicity ... This cross-sectional interpretation in terms of the longitudinal trajectory of an individual city assumes that the city system is ergodic. Yet this hypothesis is not consistent with an evolutionary theory of urban systems integrating the spatial distribution of labour and the hierarchical diffusion of innovation.” https://www.researchgate.net/publication/301186944_Urban_Systems_Dynamics_Urban_Growth_and_Scaling_Laws_The_Question_of_Ergodicity
Conclusion: Perspectives on urban theories https://arxiv.org/pdf/1911.02854.pdf
Spatial dynamics of complex urban systems within an evolutionary theory frame https://arxiv.org/pdf/2010.14890.pdf
Empowering Urban Governance through Urban Science: Multi-Scale Dynamics of Urban Systems Worldwide https://www.mdpi.com/2071-1050/12/15/5954/htm
Urban growth and the emergent statistics of cities https://advances.sciencemag.org/content/6/34/eaat8812
Economics, Game Formats & Decision Theory
Mechanism Design and Intentions: http://www.econ.uzh.ch/faculty/netzer/publications/MechanismIntentions.pdf
"When two or more players are engaged in a game with uncertainties, they need to consider what the other players’ beliefs may be, which in turn are influenced by what they think the first player’sideas are. Harsanyi defined type spaces simply as a set in which all possible players-as defined by theirbeliefs- could be found. Later on, more meaningful constructions of this set were performed.The theory of coalgebra, on the other hand, has been created to deal with circular phenomena, soits application to the problem of type spaces is only natural. We show how to apply it and we use themore general framework of category theory to compare the relative strength of previous solutions to the problem of defining type spaces." A Coalgebraic Approach to Type Spaces, Harsanyi Type Spaces and Final Coalgebras Constructed from Satisfied Theories, [35]
Games and information: an introduction to game theory
The Teleological Impulse: Thorstein Veblen, the Philosophy of Science, and Existentialism
"Five conditions of the mixed economy, including full employment, stability, economic growth, efficiency, and equity, that are generally desired by society and pursued by governments through economic policies. The five goals are typically divided into the three that are most important for macroeconomics (the macroeconomic goals of full employment, stability and economic growth) and the two that are most important for microeconomics (the microeconomic goals of efficiency and equity)."[36]
"Mechanism Design: How to Implement Social Goals
The theory of mechanism design can be thought of as the “engineering” side of economic theory. Much theoretical work, of course, focuses on existing economic institutions. The theorist wants to explain or forecast the economic or social outcomes that these institutions generate. But in mechanism design theory the direction of inquiry is reversed. We begin by identifying our desired outcome or social goal. We then ask whether or not an appropriate institution (mechanism) could be designed to attain that goal. If the answer is yes, then we want to know what form that mechanism might take. In this paper, I offer a brief introduction to the part of mechanism design called implementation theory, which, given a social goal, characterizes when we can design a mechanism whose predicted outcomes (i.e., the set of equilibrium outcomes) coincide with the desirable outcomes, according to that goal. I try to keep technicalities to a minimum, and usually confine them to footnotes." http://nobelprize.org/nobel_prizes/economics/laureates/2007/maskin-lecture.html
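A standard textbook illustration of implementation (not taken from the lecture itself): the second-price (Vickrey) auction implements the social goal "allocate the item to the highest-value bidder" in dominant strategies, because truthful bidding is weakly dominant for every bidder. A minimal Python sketch with illustrative numbers:

```python
# Second-price (Vickrey) auction: the winner pays the second-highest bid.

def second_price_auction(bids):
    """bids: dict bidder -> bid. Returns (winner, price paid)."""
    ranked = sorted(bids, key=bids.get, reverse=True)
    return ranked[0], bids[ranked[1]]

def utility(value, own_bid, others):
    bids = dict(others, me=own_bid)
    winner, price = second_price_auction(bids)
    return value - price if winner == 'me' else 0.0

value = 10.0                      # my true value for the item
others = {'a': 7.0, 'b': 12.0}    # rival bids (illustrative)

# Truthful bidding weakly dominates: no other bid ever does better.
for b in [0.0, 5.0, 8.0, 10.0, 11.0, 13.0, 20.0]:
    print(b, utility(value, b, others))
# Bidding my true value 10.0 yields 0 here (a rival values it more, and
# efficiently wins); overbidding to 13.0 or 20.0 wins but pays 12.0,
# yielding negative utility.
```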
Game-Theoretic Foundations of Mechanism Design
Mechanism Design for Dynamic Settings
Dynamic Mechanism Design: Incentive Compatibility, Profit Maximization and Information Disclosure
Evolution and Intelligent Design
Game Theoretic Problems in Network Economics and Mechanism Design Solutions
Multiagent Systems http://www.masfoundations.org/mas.pdf
Computational-Mechanism Design: A Call to Arms
Great Expectations. Part I: On the Customizability of Generalized Expected Utility
Great Expectations. Part II: Generalized Expected Utility as a Universal Decision Rule
On Game Formats and Chu Spaces
"Quantum game theory is an extension of classical game theory to the quantum domain. It differs from classical game theory in three primary ways: Superposed initial states, Quantum entanglement of initial states, Superposition of strategies to be used on the initial states. This theory is based on the physics of information much like quantum cryptography." Quantum Game Theory
"My current research programme is mostly devoted to stochastic analysis on infinite configuration spaces of Riemannian manifolds, and related questions. It is strongly motivated by applications in statistical mechanics, and requires the combination of analytic, probabilistic and algebraic methods." http://maths.york.ac.uk/www/ad557
Computational, Knowledge & Systems Engineering
"Computers today are not only the calculation tools - they are directly (inter)acting in the physical world which itself may be conceived of as the universal computer (Zuse, Fredkin, Wolfram, Chaitin, Lloyd). In expanding its domains from abstract logical symbol manipulation to physical embedded and networked devices, computing goes beyond Church-Turing limit (Copeland, Siegelman, Burgin, Schachter). Computational processes are distributed, reactive, interactive, agent-based and concurrent. The main criterion of success of computation is not its termination, but the adequacy of its response, its speed, generality and flexibility; adaptability, and tolerance to noise, error, faults, and damage. Interactive computing is a generalization of Turing computing, and it calls for new conceptualizations (Goldin, Wegner). In the info-computationalist framework, with computation seen as information processing, natural computation appears as the most suitable paradigm of computation and information semantics requires logical pluralism." [37], Theory and Linguistic Meaning.pdf AN INVITATION TO LANGUAGE AND GAMES, Toggling Operators in Computability Logic
"A network automaton (plural network automata) is a mathematical system consisting of a network of nodes that evolves over time according to predetermined rules. It is similar in concept to a cellular automaton, but much less studied. Stephen Wolfram's book A New Kind of Science, which is primarily concerned with cellular automata, briefly discusses network automata, and suggests (without positive evidence) that the universe might at the very lowest level be a network automaton."Network Automaton
"Many social, economic, biological and legal interactions can be thought of as situations in which actors have private information, public signals that they can give off, and various algorithms for figuring out which signals to emit. Often those algorithms involve introspection – a look at one's own condition and values – and an examination of the behavior of "neighboring" members of society. This article conceptualizes something known as a Global Cellular Automaton Network (GCAN) as a general tool for exploring systems that feature distributed information (with varying degrees of privacy), potentially heterogeneous values, and complex patterns of connection between agents. It does so using the Mathematica programming language and drawing on some of the insights of Stephen Wolfram's book, A New Kind of Science [Wolfram 2002]. The GCAN derives from the somewhat simpler construct of a global cellular automaton (GCA). A GCA is a single cellular automaton in which the evolution of each site is a function not only of the values of the site's neighbors – which is the distinguishing feature of conventional cellular automata – but also of certain global features of the automaton. A GCAN preserves the notion of evolution that depends in some way on global features of the system but further extends from both a cellular automaton and a GCA in at least two ways: (1) there are multiple cellular automata that are networked with each other; (2) each site within each cellular automaton node within the GCAN evolves, based not only on the values of its neighbors and on its own global characteristics but also on certain characteristics of the cellular automata to which it is connected. This extension creates a tool capable of great flexibility in the systematic study of complex systems. Moreover, GCANs can be used to explore "Random Cellular Networks", in which the update rule to be applied by each site on each iteration is determined by application of a node-specific cellular automaton to the signals being emitted by its neighboring nodes. They provide a vehicle for studying composition of cellular automata." Global Cellular Automaton Networks
CogWeb Glossary, Cognitive Science Eprint Archive
The Computational Manifold Approach to Consciousness and Symbolic Processing in the Cerebral Cortex
Novamente (Intelligent Virtual Agents), Research Papers, Mirror Neurons, Mirrorhouses, and the Algebraic Structure of the Self, Quantum Consciousness
How Intelligence Evolved?, Outline for a Theory of Intelligence
Physical Complexity and Cognitive Evolution
Bioenergetics: A Key to Mind and Brain
The Conscious Mind: In Search of a Fundamental Theory, No Matter, Never Mind: proceedings of toward a science of consciousness
Fundamental Principles of Cognitive Biology
Cognition, Categorization and Language: Cognitive Grammar meets Vantage Theory
"There can be no doubt that structural linguistics, which flourished half a century ago on both sides of the Atlantic Ocean, lived up to its name: it was structural because it considered languages to be self-contained entities that had either to be shaped into a rigorous structure, or actually possessed a structure which was real and merely waiting to be discovered."Does Cognitive Linguistics live up to its name?
Hyperstructure in Brain and Cognition, Extended Memory Evolutive Systems in a Hyperstructure Context
Higher Order Boolean Networks as Models of Cell State Dynamics
"Its central idea is to describe a database as a collection of predicates over a finite set of predicate variables, describing constraints on the possible values and combinations of values. The content of the database at any given time is a finite (logical) model of the database, i.e. a set of relations, one per predicate variable, such that all predicates are satisfied. A request for information from the database (a database query) is also a predicate." Relational Model
"Relational algebra, an offshoot of first-order logic (and of algebra of sets), deals with a set of finitary relations (see also relation (database)) which is closed under certain operators. These operators operate on one or more relations to yield a relation."Relational Algebra
"Knowledge engineering (KE) was defined in 1983 by Edward Feigenbaum, and Pamela McCorduck as follows:
KE is an engineering discipline that involves integrating knowledge into computer systems in order to solve complex problems normally requiring a high level of human expertise. At present, it refers to the building, maintaining and development of knowledge-based systems.[2] It has a great deal in common with software engineering, and is used in many computer science domains such as artificial intelligence, including databases, data mining, expert systems, decision support systems and geographic information systems. Knowledge engineering is also related to mathematical logic, as well as strongly involved in cognitive science and socio-cognitive engineering where the knowledge is produced by socio-cognitive aggregates (mainly humans) and is structured according to our understanding of how human reasoning and logic works.
Various activities of KE specific for the development of a knowledge-based system:
- Assessment of the problem
- Development of a knowledge-based system shell/structure
- Acquisition and structuring of the related information, knowledge and specific preferences (IPK model)
- Implementation of the structured knowledge into knowledge bases
- Testing and validation of the inserted knowledge
- Integration and maintenance of the system
- Revision and evaluation of the system
Being still more art than engineering, KE is not as neat as the above list in practice. The phases overlap, the process might be iterative, and many challenges could appear. Recently, meta-knowledge engineering has emerged as a new formal systemic approach to the development of a unified knowledge and intelligence theory." http://en.wikipedia.org/wiki/Knowledge_engineering
"Systems engineering is an interdisciplinary field of engineering that focuses on how complex engineering projects should be designed and managed over the life cycle of the project. Issues such as logistics, the coordination of different teams, and automatic control of machinery become more difficult when dealing with large, complex projects. Systems engineering deals with work-processes and tools to handle such projects, and it overlaps with both technical and human-centered disciplines such as control engineering, industrial engineering, organizational studies, and project management." http://en.wikipedia.org/wiki/Systems_engineering
"...systems engineering, engineering design, psychology and cognition, organization design and innovation, and economics and mathematics to construct a research program into foundational issues that underlie how we design and develop large-scale complex systems such as aircraft, spacecraft, and launch systems." http://vddi.org/ESDW.pdf
"Large aerospace projects are expected to overrun development costs and schedule, often by 100%. Manufacturing cost for military airplanes or spacecraft cost, on average, 50% above what was estimated when the program was first approved. Pe...rformance seldom meets all requirements. The US General Accountability Office would fix this by not allowing new technology into new aircraft, but this seems to throw the baby out with the bathwater. ... The Value-Driven Design Institute is dedicated to scientific research to advance the state of knowledge of Systems Engineering applied to large complex systems. We are particularly concerned with increasing the productivity of engineering design by introducing methods that eliminate cost and weight overruns, improve system performance, and manage technical risk. Areas of research are distributed optimal design, non-deterministic predictive models of component attributes (such as weight, performance, and cost models), system value models, and risk management. We publish our results in the open literature and provide technical training in all the methods we develop." http://vddi.org/vddi-home.htm
"Value-driven design creates an environment that enables and encourages design optimization by providing designers with an objective function and eliminating those constraints which have been expressed as performance requirements. The objective function inputs all the important attributes of the system being designed, and outputs a score. The higher the score, the better the design. Describing an early version of what is now called value-driven design, George Hazelrigg said,
"The purpose of this framework is to enable the assessment of a value for every design option so that options can be rationally compared and a choice taken." At the whole system level, the objective function which performs this assessment of value is called a "value model." The value model distinguishes value-driven design from Multi-Attribute Utility Theory applied to design. Whereas in Multi-Attribute Utility Theory, an objective function is constructed from stakeholder assessments, value-driven design employs economic analysis to build a value model. The basis for the value model is often an expression of profit for a business, but economic value models have also been developed for other organizations, such as government." http://en.wikipedia.org/wiki/Value-driven_design
The best you can do if you're trying to promote some social or economic goal is to try to set up the rules that present the right incentives for people to do the things you want them to do. http://ubiquity.acm.org/article.cfm?id=763912
MS&E 236: Game Theory with Engineering Applications. Strategic interactions among multiple decision makers, emphasizing applications to engineering systems. Topics: efficiency and fairness; collective decision making and cooperative games; static and dynamic noncooperative games; and complete and incomplete information models. Competition: Bertrand, Cournot, and Stackelberg models. Mechanism design: auctions, contracts. Examples from engineering problems. http://tinyurl.com/3m73644
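Of the topics listed, mechanism design admits a particularly short worked example: in a sealed-bid second-price (Vickrey) auction the winner pays the second-highest bid, which makes truthful bidding a dominant strategy. A minimal sketch:

```python
# Mechanism-design illustration: the sealed-bid second-price auction.

def vickrey(bids):
    """bids: dict bidder -> bid. Returns (winner, price paid)."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner, _ = ranked[0]
    price = ranked[1][1] if len(ranked) > 1 else 0.0
    return winner, price

print(vickrey({"alice": 120.0, "bob": 100.0, "carol": 90.0}))
# -> ('alice', 100.0): alice wins but pays bob's bid, so shading her
# bid below her true value could only cost her a profitable win.
```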
Why Service Systems Engineering?
"Service innovation is rapidly becoming a priority for industry, government, and academics. These three stakeholders must work closely together to create the programs and investment needed to develop the service science, management, engineering, and design expertise and knowledge needed for the 21st century"
- James C. Spohrer, Ph.D. Director Service Research IBM http://www.sse.mtu.edu/
Meta-Theory, Philosophy & Third Culture
"Scientists . . . are used to dealing with doubt and uncertainty, he says, an experience the value of which extends beyond the sciences. I believe that to solve any problem that has never been solved before, you have to leave the door to the unknown ajar. You have to permit the possibility that you do not have it ...exactly right."The Meaning of It All, Richard Feynman
"To arrive at the edge of the world's knowledge, seek out the most complex and sophisticated minds, put them in a room together and have them ask eachother the questions they're asking themselves." Edge
"In April of this year, three internationally known theoretical physicists – Nobel Laureate Brian Josephson, Jack Sarfatti and F. David Peat – were excluded from a major international physics conference that Sarfatti initiated (the “de Broglie-Bohm,” “Bohm,” or “Towler Institute” conference in Tuscany) because of their interests in areas considered to be outside of “mainstream” quantum physics thought. Recently, Sarfatti agreed to an interview with the PRAXIS Society for Human Integrity and spoke about the subject of integrity and the climate of exclusivity and GroupThink that has developed within the international physics community. A five minute compilation of the Sarfatti interview can be seen at: http://www.youtube.com/watch?v=-VHDC7jVt3g" Jack Sarfatti and the Shadows on the Wall
"METATHEORY a theory the subject matter of which is another theory. A finding proved in the former that deals with the latter is known as a metatheorem. The most notable example of a metatheory was provided by David Hilbert, a German mathematician, who in 1905 set out to construct an elementary proof of the consistency of mathematics. For this purpose he needed a theory that studies mathematics and has mathematical proofs as the objects to be investigated. Although theorems proved in 1931 by Kurt Gödel, a Moravian–U.S. mathematical logician, made it unlikely that Hilbert’s program could succeed, his metamathematics became the forerunner of much fruitful research. From the late 1920s Rudolf Carnap, a leading philosopher of science and of language, extended this inquiry, under the headings metalogic and logical syntax, to the study of formalized languages in general. In discussing a formalized language it is usually necessary to employ a second, more powerful language. The former is then known as the object language, whereas the second is its metalanguage."Metatheory, Wikipedia, Metacognition: Knowing about Knowing, Dynamical Cognitive Science, Distributed Cognition and the Will: Individual Volition and Social Context
Douglas Hofstadter - Analogy as Core, Core as Analogy
"Keith Holyoak and Paul Thagard (1997) developed their multiconstraint theory within structure mapping theory. They defend that the "coherence" of an analogy depends on structural consistency, semantic similarity and purpose. Structural consistency is maximal when the analogy is an isomorphism, although lower levels are admitted. Similarity demands that the mapping connects similar elements and relations of source and target, at any level of abstraction. It is maximal when there are identical relations and when connected elements have many identical attributes. An analogy achieves its purpose insofar as it helps solve the problem at hand. The multiconstraint theory faces some difficulties when there are multiple sources, but these can be overcome. Hummel and Holyoak (2005) recast the multiconstraint theory within a neural network architecture. A problem for the multiconstraint theory arises from its concept of similarity, which, in this respect, is not obviously different from analogy itself. Computer applications demand that there are some identical attributes or relations at some level of abstraction. Human analogy does not, or at least not apparently.
Mark T. Keane and Brayshaw (1988) developed their Incremental Analogy Machine (IAM) to include working memory constraints as well as structural, semantic and pragmatic constraints, so that a subset of the base analog is selected and mapping from base to target occurs in a serial manner. Empirical evidence shows that human analogical mapping performance is influenced by information presentation order.
High-level perception: Douglas Hofstadter and his team challenged the shared structure theory and mostly its applications in computer science. They argue that there is no line between perception, including high-level perception, and analogical thought. In fact, analogy occurs not only after, but also before and at the same time as high-level perception. In high-level perception, humans make representations by selecting relevant information from low-level stimuli. Perception is necessary for analogy, but analogy is also necessary for high-level perception. Chalmers et al. conclude that analogy is high-level perception. Forbus et al. (1998) claim that this is only a metaphor. It has been argued (Morrison and Dietrich 1995) that Hofstadter's and Gentner's groups do not defend opposite views, but are instead dealing with different aspects of analogy."Analogy
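The notion of structural consistency above can be made concrete: a candidate analogy is a mapping from source objects to target objects, and it is structurally consistent to the extent that it carries source relations onto target relations (maximally so when it is an isomorphism). The sketch below checks the one-directional half of that condition on the classic solar-system/atom analogy; the encoding is a simplification for illustration, not a reimplementation of any of the cited systems.

```python
# Minimal check of "structural consistency": does a candidate mapping
# carry every relation instance of the source analog onto a relation
# instance of the target? (Full isomorphism would also require the
# converse direction and bijectivity; this is the simple half.)

def structurally_consistent(source_rels, target_rels, mapping):
    """source_rels / target_rels: sets of (relation, arg1, arg2) triples."""
    for (rel, a, b) in source_rels:
        if (rel, mapping[a], mapping[b]) not in target_rels:
            return False
    return True

# Solar-system / atom analogy, encoded as binary relations.
solar = {("attracts", "sun", "planet"), ("revolves", "planet", "sun")}
atom = {("attracts", "nucleus", "electron"),
        ("revolves", "electron", "nucleus")}
print(structurally_consistent(solar, atom,
                              {"sun": "nucleus", "planet": "electron"}))
# -> True
```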
Imagining the Tenth Dimension: A Book by Rob Bryanton
The Phaneron: Philosophy. Logic. Semiotics. Singularity.
"Ontologies cross over philosophy, natural language and computation. Spurred on by the hype surrounding the "Semantic Web", our recent examinations of ontologies has been dominated by the computational domain. Most references to "ontologies" found on the web today describe ontologies from the narrow focus on machine-computation and first-order logic. It is almost intuitively believed that these computational ontologies will take computing further toward mimicking human behavior." Ontologies and Semantics, State of the Art on Ontology Alignment
"We classify the works presented here in three broad categories: Algebraic approaches, which comprise the works of Bench-Capon and Malcolm on ontology morphisms, and that of Jannink and colleagues on an ontology composition algebra; Informationflow- based approaches, which include the works of Kent on the Information Flow Framework, that of Schorlemmer on duality in knowledge sharing, the IF-Map method of Kalfoglou and Schorlemmer based on information-flow theory and populated ontologies, the work of Priss on Peircean sign triads, and FCA-Merge, based on formal concept analysis and lattice exploration; and Translation frameworks, with the formative work of Gr¨uninger on the TOVE project. .. Very close in spirit and in the mathematical foundations of IFF, Schorlemmer studied the intrinsic duality of channel-theoretic constructions, and gave a precise formalisation to the notions of knowledge sharing scenario and knowledge sharing system. He used the categorical constructions of Chu spaces (Gupta 1994; Barr 1996; Pratt 1995) in order to precisely pin down some of the reasons why ontologies turn out to be insufficient in certain knowledge sharing scenarios (Corrˆea da Silva et al. 2002). His central argument is that formal analysis of knowledge sharing and ontology mapping has to take a duality between syntactic types (concept names, logical sentences, logical sequents) and particular situations (instances, models, semantics of inference rules) into account. Although no explicit definition of ontology mapping is given, there is an implicit one within the definition of knowledge sharing scenario, namely as a Chu transform." Ontology Mapping: The State of the Art, [38]
"Marco Schorlemmer’s research interests lie in tackling challenging engineering problems faced by software and knowledge engineers today by means of rigorous mathematical techniques from theoretical computer science. He has published over forty papers in specialised journals and international workshop and conference proceedings in the fields of formal specification and automated theorem proving, diagrammatic representation and reasoning, distributed knowledge coordination, and semantic interoperability of ontologies." [39]
Ontology, Society, and Ontotheology
"In 1959 C.P. Snow published a book titled The Two Cultures. On the one hand, there were the literary intellectuals; on the other, the scientists. He noted with incredulity that during the 1930s the literary intellectuals, while no one was looking, took to referring to themselves as "the intellectuals," as though there were no others. This new definition by the "men of letters" excluded scientists such as the astronomer Edwin Hubble, the mathematician John von Neumann, the cyberneticist Norbert Wiener, and the physicists Albert Einstein, Niels Bohr, and Werner Heisenberg." The Third Culture
Book review: Good and Real: Demystifying Paradoxes from Physics to Ethics by Gary Drescher.
Superrationality or Renormalized Reasoning
Exploring the role of mathematics in the scientific study of consciousness. https://youtube.com/channel/UC7Eq7alQ9gJgAVhVS3IcvQw/playlists
A (very) Brief History of Charles Ehresmann https://youtu.be/5LMK8-xrsVM
Science, Technology & Society
Quantrek - Engineering the Best Possible Future Through Frontier Science
"A Shock Level measures the high-tech concepts you can contemplate without being impressed, frightened, blindly enthusiastic - without exhibiting future shock. Shock Level Zero or SL0, for example, is modern technology and the modern-day world, SL1 is virtual reality or an ecommerce-based economy, SL2 is interstellar travel, medical immortality or genetic engineering, SL3 is nanotech or human-equivalent AI, and SL4 is the Singularity. The classification is useful because it helps measure what your audience is ready for; for example, going two Shock Levels higher will cause people to be shocked, but being seriously frightened takes three Shock Levels. Obviously this is just a loose rule of thumb! Also, I find that I often want to refer to groups by shock level; for example, "This argument works best between SL1 and SL2". (This does not mean that people with different Shock Levels are necessarily divided into opposing social factions. It's not an "Us and Them" thing.)
SL0: The legendary average person is comfortable with modern technology - not so much the frontiers of modern technology, but the technology used in everyday life. Most people, TV anchors, journalists, politicians.
SL1: Virtual reality, living to be a hundred, "The Road Ahead", "To Renew America", "Future Shock", the frontiers of modern technology as seen by Wired magazine. Scientists, novelty-seekers, early-adopters, programmers, technophiles.
SL2: Medical immortality, interplanetary exploration, major genetic engineering, and new ("alien") cultures. The average SF fan.
SL3: Nanotechnology, human-equivalent AI, minor intelligence enhancement, uploading, total body revision, intergalactic exploration. Extropians and transhumanists.
SL4: The Singularity, Jupiter Brains, Powers, complete mental revision, ultraintelligence, posthumanity, Alpha-Point computing, Apotheosis, the total evaporation of "life as we know it." Singularitarians and not much else.
If there's a Shock Level Five, I'm not sure I want to know about it!" Future Shock Levels
Cosmology, Physics & Phenomenology
"Physics, as conceived by most of us, is totally dependent on the assumption that causes are physically linked to their effects. Nothing is thought to happen without adjacency and contact, where contact is matter-to-matter, field-to-field, or matter-to-field. Adjacency is totally determined by the metric, or set of spatial relationships, on which it is defined. So physics is inherently metrical in nature, and phenomena which violate the metrical adjacency criterion totally disable the associated physics. In this event, physics must be extended. G is the exclusive syntax of metametrical theoretical extension, where the "metametric" is an arbitrary relational extension of the limiting physical subtheory that we call the "physical metric". Because inductive criteria forbid us to rule out nonlocal correlations and mechanisms, G is not a theoretical "option", but a fully justified metaphysics. All those who care about science and understanding can breathe a sigh of relief that this metaphysics has finally been discovered, precisely defined, and explicated as the CTMU.
Physics has long fixated upon the geometrical properties of space in its deepest analyses of time and causality. The structure of matter, on the other hand, has been seen in terms of algebraic symmetries of components. But physics is process, and petroglyphic mathematical entities can yield only a limited amount of insight on their own. It is time to realize that dynamical processes are computed, and that the mysterious power granted the observer in relativity and quantum theory resides entirely in the computative essence of observation." http://www.megasociety.org/noesis/46/index.html
"Abramsky a0910, a0910 [physical systems as Chu spaces]; > s.a. hilbert space; operator theory." Formulations of Quantum Physics
"I propose this new Holy Trinity.
Developer (creator, that which creates)
Code (nature, that which is)
User (consciousness, that which experiences)
We are the playwright, the play & the player
The designer, experience & experiencer
and finally...
god, cosmos, & being" Space Collective
"In his new book, Stapp insists that the "causal closure of the physical", in particular concerning quantum theory, is an untenable myth. He elaborates on ideas of Bohr, von Neumann, Heisenberg and, from a philosophical point of view, James and Whitehead to sketch a complex picture in which the physical and the mental are emphatically conditioned by each other. Stapp's wide-ranging proposal offers stimulating reading, a strong sense of conceptual coherence and intuitive appeal, and empirical predictions that deserve to be refined and tested."[40] Henry Stapp
Resource Guide for Physics and Whitehead
"Quantum Logical Causality, Category Theory, and the Philosophy of Whitehead: Connecting Zafiris’ Category Theoretic Models of Quantum Spacetime and the Logical-Causal Formalism of Quantum Relational Realism"
Recent work in the natural sciences—most notably in the areas of theoretical physics and evolutionary biology—has demonstrated that the lines separating philosophy and science have all but vanished with respect to current explorations of ‘fundamental’ questions (e.g., string theory, multiverse cosmologies, complexity-emergence theories, the nature of mind, etc.). The centuries-old breakdown of ‘natural philosophy’ into the divorced partners ‘philosophy’ and ‘science,’ therefore, must be rigorously reexamined. To that end, much of today’s most groundbreaking scholarship in the natural sciences has begun to include explicit appeals to interdisciplinary collaboration among the fields of applied natural sciences, mathematics and philosophy. This workshop will be dedicated to the question of how a philosophical-metaphysical theory can be fruitfully applied to basic conceptualizations in the natural sciences.
More narrowly, we will explore the process oriented metaphysical scheme developed by philosopher and mathematician Alfred North Whitehead (1861-1947) and Michael Epperson’s application of this scheme to recent work in quantum mechanics, and the relation of these to Elias Zafiris’s category theoretic model of quantum event structures." Workshop.pdf Quantum Logical Causality, Category Theory, and the Metaphysics of Alfred North Whitehead
"S. Majid: On the Relationship between Mathematics and Physics:
In ref. [7], S. Majid presents the following thesis: "(roughly speaking) physics polarises down the middle into two parts, one which represents the other, but that the latter equally represents the former, i.e. the two should be treated on an equal footing. The starting point is that Nature after all does not know or care what mathematics is already in textbooks. Therefore the quest for the ultimate theory may well entail, probably does entail, inventing entirely new mathematics in the process. In other words, at least at some intuitive level, a theoretical physicist also has to be a pure mathematician. Then one can phrase the question 'what is the ultimate theory of physics?' in the form 'in the tableau of all mathematical concepts past present and future, is there some constrained surface or subset which is called physics?' Is there an equation for physics itself as a subset of mathematics? I believe there is and if it were to be found it would be called the ultimate theory of physics. Moreover, I believe that it can be found and that it has a lot to do with what is different about the way a physicist looks at the world compared to a mathematician... We can then try to elevate the idea to a more general principle of representation-theoretic self-duality, that a fundamental theory of physics is incomplete unless such a role-reversal is possible. We can go further and hope to fully determine the (supposed) structure of fundamental laws of nature among all mathematical structures by this self-duality condition. Such duality considerations are certainly evident in some form in the context of quantum theory and gravity. The situation is summarised to the left in the following diagram. For example, Lie groups provide the simplest examples of Riemannian geometry, while the representations of similar Lie groups provide the quantum numbers of elementary particles in quantum theory. Thus, both quantum theory and non-Euclidean geometry are needed for a self-dual picture. Hopf algebras (quantum groups) precisely serve to unify these mutually dual structures." IHES: On the Fusion of Mathematics and Theoretical Physics
"We take this need for a self-dual overall picture as a fundamental postulate for physics, which we called the principle of representation-theoretic self-duality:
(Postulate) a fundamental theory of physics is incomplete unless self-dual in the sense that such a role-reversal is possible. If a phenomenon is physically possible then so is its observer-observed reversed one.
One can also say this more dynamically: as physics improves its structures tend to become self-dual in this sense. This has in my view the same status as the second law of thermodynamics: it happens tautologically because of the way we think about things." Algebraic Approach to Quantum Gravity: Relative Realism
"Next week I'm attending a symposium at the Royal Society on the origin of time and the universe, in honour of Michael Heller's 2008 Templeton Prize. There'll be talks by Michael himself, John Barrow, Andreas Doring, Paul Tod, and Shahn Majid. Majid proposed some years ago that the mathematical concept of self-duality can be used to provide an ultimate explanation of the universe in terms of the universe itself; a type of self-explanation. Michael Heller wrote a lovely paper on Majid's ideas, (Algebraic self-daulity as the 'ultimate explanation', Foundations of Science, Vol. 9, pp369-385) in 2004, and I've been reviewing this ahead of next week's symposium.
Majid illustrates his idea with the notion of a self-dual bicrossproduct Hopf algebra, (although he doesn't believe that this specific mathematical structure is a candidate for a theory of everything). To understand what this is, we first need to understand what a Hopf algebra is."A Self-Explaining Universe?, Bicrossproduct Hopf Algebras
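For reference, the standard axioms the post goes on to introduce can be stated compactly; the summary below is textbook material rather than a quotation from the post.

```latex
% A Hopf algebra over a field k is a tuple (H, m, \eta, \Delta, \varepsilon, S):
\begin{align*}
  &m : H \otimes H \to H, \quad \eta : k \to H
    && \text{(associative product, unit)} \\
  &\Delta : H \to H \otimes H, \quad \varepsilon : H \to k
    && \text{(coproduct, counit)} \\
  &(\Delta \otimes \mathrm{id}) \circ \Delta
    = (\mathrm{id} \otimes \Delta) \circ \Delta
    && \text{(coassociativity)} \\
  &m \circ (S \otimes \mathrm{id}) \circ \Delta
    = \eta \circ \varepsilon
    = m \circ (\mathrm{id} \otimes S) \circ \Delta
    && \text{(antipode } S\text{)}
\end{align*}
% For finite-dimensional H, the dual space H^* is again a Hopf algebra
% with product and coproduct interchanged; H is self-dual when
% H \cong H^* as Hopf algebras -- the role-reversal Majid exploits.
```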
"It is proposed that the physical universe is an instance of a mathematical structure which possesses a dual structure, and that this dual structure is the collection of all possible knowledge of the physical universe. In turn, the physical universe is then the dual space of the latter." The Duality of the Universe
Some Remarks on Turing and Spencer-Brown
How close are we to a fundamental theory?
What is the Universal Nilpotent Computational Rewrite System?, [41]
From Zero to Infinity: The Foundations of Physics
Closer to Truth (Cosmos. Consciousness. God.)
The Laws of Physics and Emergence
Reflections on a Self-Representing Universe
"The concept of measure is intimately involved with the notion of number. Modeling, a sophisticated form of abstract description, using mathematics and computation, both tied to the concept of number, and their advantages and disadvantages are exquisitely detailed by Robert Rosen in Life Itself, Anticipatory Systems, and Fundamentals of Measurement. One would have hoped that mathematics or computer simulations would reduce the need for word descriptions in scientific models. Unfortunately for scientific modeling, one cannot do as David Hilbert or Alonzo Church proposed: divorce semantics (e.g., symbolic words: referents to objects in reality) from syntax (e.g., symbolic numbers: referents to a part of a formal system of computation or entailment). One cannot do this, even in mathematics without things becoming trivial (ala Kurt Godel). It suffices to say that number theory (e.g., calculus), category theory, hypersets, and cellular automata, to mention few, all have their limited uses. The integration between all of these formalisms will be necessary plus rigorous attachment of words and numbers to show the depth and shallowness of the formal models. These rigorous attachments of words are ambiguous to a precise degree without the surrounding contexts. Relating precisely with these ambiguous words to these simple models will constitute an integration of a reasonable set of formalisms to help characterize reality." Existence Itself: Towards the Phenomenology of Massive Dissipative/Replicative Structures
Truth without Satisfaction https://www.jstor.org/stable/30226315
Neil Turok on the Theory of Cosmological Inflation https://youtu.be/DxetKz61evI
“The Steinhardt–Turok model
In this cyclic model, two parallel orbifold planes or M-branes collide periodically in a higher-dimensional space. The visible four-dimensional universe lies on one of these branes. The collisions correspond to a reversal from contraction to expansion, or a Big Crunch followed immediately by a Big Bang. The matter and radiation we see today were generated during the most recent collision in a pattern dictated by quantum fluctuations created before the branes. After billions of years the universe reached the state we observe today; after additional billions of years it will ultimately begin to contract again. Dark energy corresponds to a force between the branes, and serves the crucial role of solving the monopole, horizon, and flatness problems. Moreover, the cycles can continue indefinitely into the past and the future, and the solution is an attractor, so it can provide a complete history of the universe.” https://en.wikipedia.org/wiki/Cyclic_model