Over the past fifty years, the fabric of our society has been radically transformed by successful logic-based applications. In today’s world, logic chips (i.e., CPUs) and the logic-grounded software that controls them support nearly all socio-economic infrastructure, from banking to defense. These global computing applications are the major arena where logic as a body of knowledge meets, and is tested against, reality. Classical first-order logic (FOL) appears to work perfectly for truth-functional applications such as computer chip design, where propositions are guaranteed to have a truth value (i.e., a positive XOR negative charge). However, inference problems characteristic of widely used logic-grounded software technologies [Thomsen 2002], and representation problems characteristic of logic-based research into natural language understanding [Berant, Chou, Frostig et al. 2013], suggest that there are foundational areas in logic that are not yet completely understood.
What follows is a rough introduction to my R&D activities in pure and applied logic. It has a decidedly Wittgensteinian bent.
Research into type-logical semantics and its application to real-world problems of inference and knowledge management
In the paper Improving Semantic Software with Tractarian Logic, we focus on the problem of translating information from a natural form (e.g., token sequences for natural languages, databases, or spreadsheets) into an explicitly logical form (e.g., first-order logic, FOL). We outline the real-world problems that occur in these kinds of semantic software applications owing to specific characteristics of the FOL on which they are based. We then describe how Wittgenstein, in the Tractatus and his lectures from the early 1930s, advocates logical approaches relevant to semantic software that differ in significant ways from what became absorbed into consensus FOL. In particular, we argue that Wittgenstein’s views on logic provide practical guidance for the mapping of sentences into putative propositions, for differentiating classically behaved propositions from other well-formed formulas, and for the role of logic in natural language processing.
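As a toy illustration of the sentence-to-proposition mapping problem just described (this sketch is my own hedged example, not the paper's method; the controlled-English pattern and the predicate names it emits are assumptions for illustration only), consider translating one quantified sentence form into FOL-style notation:

```python
# Hypothetical sketch: map the controlled-English pattern
# "Every <Noun> <Verb>s" onto a universally quantified FOL-style string.
# Real semantic software must handle vastly more patterns and ambiguity;
# this only illustrates the shape of the translation problem.

def to_fol(sentence: str) -> str:
    """Translate 'Every <N> <V>s' into 'forall x. N(x) -> V(x)'."""
    words = sentence.rstrip(".").split()
    if len(words) == 3 and words[0].lower() == "every":
        noun = words[1].capitalize()
        verb = words[2].rstrip("s").capitalize()  # crude de-inflection
        return f"forall x. {noun}(x) -> {verb}(x)"
    raise ValueError(f"pattern not recognized: {sentence!r}")

print(to_fol("Every dog barks"))  # forall x. Dog(x) -> Bark(x)
```

Even this trivial case shows where the difficulties begin: deciding which surface strings count as putative propositions at all, and which predicate symbols they should introduce.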
In the paper Triggering a Copernican Shift in Logic through Sequenced Evaluations, a finalist in a paper competition on the future of logic, we evaluate the performance of classical first-order logic across a variety of logic-consuming information technologies. For those areas where classical logic fares worst, we identify specific changes to first-order logic that would improve both logic as an abstract discipline, judged against its own criteria (e.g., freedom from paradox), and the performance of those information technologies that most depend on logic.
2013-2016: Development of a smart information grid
Supported by US government R&D grants, I built software that extracts semantics from various surface media and fuses it into an extensible spatio-temporal event representation that supports logical and computational reasoning.
2012: Rough draft (core dump) of a new paradigm and a formal model for logic that attempts to extend classical first-order logic (FOL)
♦ How the Great War Impacted the Roots of Modern Logic: a historical introduction to the unfinished research program of Bertrand Russell and Ludwig Wittgenstein, and why it is relevant today
♦ A Critique of the Consensus Paradigm in Logic: an introduction to the consensus paradigm and a description of its most crippling flaws
♦ A New Paradigm for Logic: describes cognition and awareness in terms of multiple levels of embedded processes, pointing the way to a new view of the mind-body distinction and a new grounding for logic
♦ Knowing the World is Language: a draft theory of logic based on a new paradigm of cognitive processing that also attempts to provide a foundation for language and mathematics
2007-2011: Start-ups and consulting: prototyping specific type-engine-based applications
⇒ How to extract the dimensional model implicit within a collection of spreadsheets
⇒ How to generate image signatures without the use of pre-defined ontologies in a way that would make them comprehensible to humans
⇒ How to extract testable propositions from text
⇒ How to parse text into a logical form that immediately integrates with a knowledge store
⇒ How to represent disparate information in a semantically unified fashion
⇒ How to infer across quantitative and qualitative information in the presence of uncertainty and inconsistency
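The first question above, recovering the dimensional model implicit in spreadsheet data, can be given a minimal hedged sketch (my own illustration, not the prototype's actual algorithm; the column names and the repetition/numeric heuristics are assumptions): columns whose values repeat behave like candidate dimensions, while purely numeric columns behave like candidate measures.

```python
# Hypothetical sketch: classify spreadsheet-like columns into rough
# dimensional-model roles. Heuristics only -- a real extractor would
# use type logic, headers, and cross-sheet structure.

def classify_columns(table: dict[str, list]) -> dict[str, str]:
    roles = {}
    for name, values in table.items():
        if all(isinstance(v, (int, float)) for v in values):
            roles[name] = "measure"        # numeric -> candidate measure
        elif len(set(values)) < len(values):
            roles[name] = "dimension"      # repeated labels -> candidate dimension
        else:
            roles[name] = "attribute"      # unique labels -> descriptive attribute
    return roles

table = {
    "Region":  ["East", "West", "East", "West"],
    "Product": ["A", "B", "A", "B"],
    "Sales":   [100, 250, 90, 310],
}
print(classify_columns(table))
# {'Region': 'dimension', 'Product': 'dimension', 'Sales': 'measure'}
```

The point of the sketch is only that dimensional structure is latent in ordinary tabular data and can be surfaced mechanically, which is what the prototypes in this period explored at much greater depth.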
♦ Overview of the LC Logical Typing System
♦ LC Foundations of Abstract Science
♦ Logical implication for propositional variables in LC Type Logic
♦ Logical implication for predicate variables in LC Type Logic
♦ Patent filing (#20100169758): Extracting Semantics from Data
♦ A Critique of the Consensus Paradigm in Logic
At a more applied level:
♦ Spreadsheet interpretation
♦ Image interpretation
2003-2007: Chief scientist at a large corporation, building type-logical engine components and applications for business
⇒ Computational models for organizational performance management
⇒ Abstract definitional layers that support federated heterogeneous query processing
⇒ Next-generation Type Engines
♦ Organizational performance management: grounding papers
♦ Enterprise risk management
♦ Patent (#7,797,320) on a method for computing semi-orthogonal partial bases in N dimensional categorical data sets
♦ Patent (#7,631,005) on a method for defining nested irregular views
1994-2003: Period of application and synthesis
⇒ What are the core theoretical problems with canonical logic?
⇒ What are the core theoretical problems with canonical mathematics?
⇒ How do the foundations of math and logic relate to software?
⇒ First cut of a foundation for abstract science
⇒ What are the key functional characteristics of Type Engines as compared with current relational or multidimensional database architectures?
♦ Relating LC to the Canonical foundations of Mathematics
♦ Relating LC to Canonical Logic
♦ Overview of the relation between LC and Canonical Logic
♦ OLAP Solutions: Building Multidimensional Information Systems, 2nd edition
♦ Microsoft OLAP Solutions
♦ OLAP Solutions, 1st edition
♦ Synthesizing knowledge from large data sets
1989-1994: First cut at an alternative foundation, first cut at mapping it to canonical beliefs/approaches, first application to software
⇒ What is the core process of meaning creation?
⇒ What foundational problems (aka paradoxes) does this process/approach resolve?
⇒ What practical problems can be solved by software that follows this approach?
⇒ What are the criteria for meaningfulness as relates to any kind of expression?
⇒ What is the smallest number of most irreducible dimensions that can be used to relate any expression with any other expression?
⇒ How do these criteria for meaningfulness map to those of Wittgenstein in the Tractatus?
♦ Kirchberg, Austria 1993: Foundations of Logic
“Subsuming Multi-Valued Logics in a Two-Tiered Bivalent Approach”
presented at the Wittgenstein Symposium on the Foundations of Logic
♦ Kirchberg, Austria 1992: Foundations of Mathematics
Peer-reviewed paper and publication
“A Functional Basis for Tractarian Number Theory”
presented at the Wittgenstein Symposium on the Foundations of Mathematics,
Austria 1992; published in Wittgenstein’s Philosophy of Mathematics,
Verlag Hölder-Pichler-Tempsky, Wien 1993
♦ Kirchberg, Austria 1989: Foundations of Language
Peer-reviewed paper and publication
“A Tractarian Approach to Information Modeling”
presented at the Wittgenstein Symposium on the Foundations of Language,
Austria 1989; published in Wittgenstein: Towards a Re-evaluation,
Verlag Hölder-Pichler-Tempsky, Wien 1990
♦ FreeThink (user manual and docs)
♦ Adding to Space and Time
♦ Relating Axiomatic systems to Empirical Knowledge
♦ Patent for multi-level hierarchies