
LbR and Robust Reasoning (Solidification Loop);
LbR and Introspection (Solidification Loop);
LbR and Knowledge Acquisition Over Text with
Diagrammatic Content (Know. Acquisition Loop)
Slides for Noah’s Prospective LbR Program
run “Slide Show” for hyperlinks
version of July 8th, 2005
Selmer Bringsjord
Rensselaer AI & Reasoning Lab
Department of Cognitive Science (Chair)
Department of Computer Science
Rensselaer Polytechnic Institute (RPI)
Troy NY 12180 USA
[email protected]
LbR and Robust Reasoning
• Arguably the best machine reasoning system in the world today: Athena, invented by Konstantine Arkoudas
– seamlessly integrated with standard first-order ATPs (e.g., Vampire, SPASS,
Otter, and whatever is next in this ever-progressing game)
– fully programmable (& allows recursive datatypes, polymorphism, etc.)
– human control adjustable from zero to full
– includes cutting-edge explanation generation in straight English as a problem
solving technique
– “natural” format used, not simple chaining or resolution: modeled on what is human-readable (see the pdf available @ http://www.mm.rpi.edu/SELPAP/ENCYCCOGSCI/bringsjord.encyccogsci.html for an overview of natural-style reasoning)
• natural format allows representation and reasoning over uncertain information
– allows ideal mix of efficiency and expressivity via sub-sorted multi-sorted logic (e.g., modal logics can be efficiently encoded in MSLsub; see the sketch after this list)
– enables model generation abduction
– integrated with model builders
– already includes more than the features currently being sought for the ISO &
ARDA standard, Common Logic!
– proved sound and has fully formal semantics, and falls within the class of well-founded Denotational Proof Languages
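• To make the MSL encoding point concrete, here is a minimal sketch, in Python, of the standard translation of modal formulas into first-order logic over an explicit World sort; all names here (standard_translation, R, World) are our illustrative assumptions, not Athena syntax.

  # Sketch only: standard translation of propositional modal logic
  # into first-order logic over an explicit World sort. Box(phi) at
  # world w becomes: forall v:World . R(w,v) -> phi at v.
  # All names are illustrative assumptions, not Athena syntax.
  def standard_translation(formula, world, fresh=None):
      fresh = fresh or iter("vuts")
      op = formula[0]
      if op == "atom":                           # ("atom", "P")
          return f"{formula[1]}({world})"
      if op == "not":                            # ("not", phi)
          return f"~({standard_translation(formula[1], world, fresh)})"
      if op == "and":                            # ("and", phi, psi)
          left = standard_translation(formula[1], world, fresh)
          right = standard_translation(formula[2], world, fresh)
          return f"({left} & {right})"
      if op == "box":                            # ("box", phi)
          v = next(fresh)
          body = standard_translation(formula[1], v, fresh)
          return f"(forall {v}:World . R({world},{v}) -> {body})"
      raise ValueError(f"unknown operator: {op}")

  # Example: translate [](P & ~Q), evaluated at world w
  print(standard_translation(("box", ("and", ("atom", "P"),
                                             ("not", ("atom", "Q")))), "w"))
  # -> (forall v:World . R(w,v) -> (P(v) & ~(Q(v))))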
Robust Reasoning & Explanation
Generation in English
• Using Athena, we can generate fluid English from proofs
and arguments, including partial and defective proofs
and arguments.
• A problem-solving technique that we can therefore support in the Solidification Loop is to have the system generate English to communicate the state of its knowledge, and have humans judge on the basis of this English what adjustments and additions need to be made to the system’s knowledge (a toy sketch follows below).
• Here is a demo and explanation of our seminal
approach/R&D in Natural Language Generation:
– http://www.cs.rpi.edu/~khemls/marmml/nlg_demo_draft2.mpg
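• As a toy illustration of the idea (not the RAIR Lab’s actual NLG pipeline), the Python sketch below verbalizes proof steps from rule-indexed English templates; all rule names, templates, and the sample proof are our own assumptions.

  # Toy sketch: verbalize each proof step from a rule-indexed English
  # template. Purely illustrative; rule names, templates, and the
  # proof itself are hypothetical, not the actual Athena/NLG system.
  TEMPLATES = {
      "assume": "Suppose that {c}.",
      "modus-ponens": ("Since {p0}, and {p0} implies {c}, "
                       "it follows that {c}."),
  }

  def verbalize(step):
      rule, conclusion, premises = step
      slots = {"c": conclusion}
      slots.update({f"p{i}": p for i, p in enumerate(premises)})
      return TEMPLATES[rule].format(**slots)

  proof = [
      ("assume", "Tweety is a bird", []),
      ("modus-ponens", "Tweety flies", ["Tweety is a bird"]),
  ]
  print(" ".join(verbalize(s) for s in proof))
  # -> Suppose that Tweety is a bird. Since Tweety is a bird, and
  #    Tweety is a bird implies Tweety flies, it follows that Tweety flies.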
LbR and Introspection
1. Abductive Introspection Based on
Natural Deduction/Argumentation
• When reasoning is based not on flat, uninformative
forms of reasoning like resolution, but rather on natural
deduction and argumentation, the technique of goal
analysis can be used.
• Goal analysis allows gaps in reasoning to be readily
identified, and the required “shape” of these gaps to be
pinned down.
• Once these gaps have been identified, abductive hypotheses can be ventured as to how to fill them so as to complete the reasoning in question.
• These hypotheses can then be supplied to the Knowledge Acquisition loop (see the sketch below).
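• A minimal Python sketch of goal analysis, assuming Horn-style rules and illustrative predicate names (this is not Athena’s actual machinery): subgoals with no support are reported as gaps whose shape seeds abductive hypotheses.

  # Sketch of goal analysis over Horn-style rules: walk backward from
  # the goal; any subgoal with neither a fact nor a rule behind it is
  # a gap whose exact shape becomes an abductive hypothesis for the
  # Knowledge Acquisition loop. Rules, facts, and names hypothetical.
  RULES = {"flies(tweety)": ["bird(tweety)", "not_penguin(tweety)"]}
  FACTS = {"bird(tweety)"}

  def find_gaps(goal, gaps):
      if goal in FACTS:
          return
      if goal in RULES:
          for subgoal in RULES[goal]:
              find_gaps(subgoal, gaps)
      else:
          gaps.append(goal)  # a missing premise of exactly this shape

  gaps = []
  find_gaps("flies(tweety)", gaps)
  print("Hypotheses to venture:", gaps)
  # -> Hypotheses to venture: ['not_penguin(tweety)']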
2. A New Form of Abduction: Model
Generation Abduction
• Suppose that the system is trying to establish some proposition A on the basis of its background knowledge and a particular theory C, but only has some of the knowledge it takes to substantiate A.
• Then the following form of abduction can be run in semi-automated fashion using Athena as a problem-solving technique, because refinement of C can be carried out by the Know. Acquisition loop; i.e., the information said to be possibly missing can be sought by that loop.
• Please note that model finding is an automated process under Athena; it is a way of building a possible scenario (sketched below).
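• A minimal Python sketch of model generation abduction, propositional and brute force purely for illustration (not Athena’s actual model builders): when the goal does not follow, each model of the knowledge base in which the goal fails is a possible scenario pointing at missing information. Atoms and clauses are hypothetical.

  # Sketch of model generation abduction: if the goal does not follow,
  # enumerate truth assignments to find models of the knowledge base
  # in which the goal fails; each such countermodel is a possible
  # scenario whose details show what information is worth seeking.
  from itertools import product

  ATOMS = ["bird", "penguin", "flies"]
  # Encodes: bird & ~penguin -> flies
  KB = [lambda m: (not m["bird"]) or m["penguin"] or m["flies"]]
  goal = lambda m: m["flies"]

  def countermodels():
      for values in product([True, False], repeat=len(ATOMS)):
          m = dict(zip(ATOMS, values))
          if all(clause(m) for clause in KB) and not goal(m):
              yield m

  for m in countermodels():
      print("Scenario in which the goal fails:", m)
  # Each printed scenario (e.g., penguin=True) marks a fact whose
  # status the Knowledge Acquisition loop should determine.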
3. MSL-Encoded Epistemic Logic as
a Framework for Introspection
• The Challenge:
– Provide a rigorous framework for managing when and how reasoning fails, for determining from the failure what knowledge is missing, and for driving the acquisition of that missing knowledge. And do all this rapidly.
• That is, a three-part need:
– The envisaged framework must know what it knows, and know what it
doesn’t know (relative to its goals).
– The framework must be able to arrive at a state in which it does have
the knowledge it needs.
– This must be done with unprecedented speed (such metareasoning has been notoriously slow in the past).
• The solution: use an extension of techniques already validated for AFRL by Bringsjord and Arkoudas, as explained in this paper.
• In addition, use techniques for proof and argument diagnosis, already in use in Bringsjord’s ARDA-sponsored Slate system (a toy sketch of such introspective classification follows below).
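• A minimal Python sketch of the introspective classification itself, with hypothetical names and data (not the Slate system or the AFRL-validated techniques): relative to a goal, each needed premise is classified as known, refuted, or unknown, and unknowns are routed to the Knowledge Acquisition loop.

  # Sketch of the three-part need: relative to a goal, classify each
  # needed premise as known, refuted, or unknown; unknowns are handed
  # to the Knowledge Acquisition loop. Names and data hypothetical.
  KNOWN = {"bird(tweety)": True, "penguin(tweety)": False}
  NEEDED = {"flies(tweety)": ["bird(tweety)", "fit_to_fly(tweety)"]}

  def introspect(goal):
      status = {}
      for premise in NEEDED.get(goal, []):
          if premise in KNOWN:
              status[premise] = "known" if KNOWN[premise] else "refuted"
          else:
              status[premise] = "unknown: send to Knowledge Acquisition"
      return status

  print(introspect("flies(tweety)"))
  # -> {'bird(tweety)': 'known',
  #     'fit_to_fly(tweety)': 'unknown: send to Knowledge Acquisition'}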
Knowledge Acquisition & LbR Text
Containing Diagrams & Pictures
• Arkoudas and Bringsjord have created a new Denotational Proof Language, tentatively called DNDL, for the express purpose of machine reading of text containing diagrams.
• Bringsjord’s RAIR Lab also has a team working on image processing, where the images are specifically those found in mathematical text that is to be machine read. This team now has working systems that read diagram-rich text directly.
• Because it is not clear that the texts to be read in this LbR program will have diagrams that are to be machine read in earnest, we do not include hyperlinks to technical content here.