Implement Inference Agent #232

andre-senna commented Jan 30, 2025

The Inference Agent is a server in the DAS architecture (like the Link Creation Agent and the Query Agent). All of these agents are supposed to run in the same DASNode network.

The Inference Agent processes inference requests. There will be several different types of inference requests, but for now we'll focus only on three of them:

  • PROOF_OF_IMPLICATION_OR_EQUIVALENCE
  • PROOF_OF_IMPLICATION
  • PROOF_OF_EQUIVALENCE

Each request type should be implemented as a command which expects potentially different parameters, although for the three commands we're going to implement, the list of parameters is the same:

PROOF_OF_IMPLICATION_OR_EQUIVALENCE <handle1> <handle2> <max proof length>
PROOF_OF_IMPLICATION <handle1> <handle2> <max proof length>
PROOF_OF_EQUIVALENCE <handle1> <handle2> <max proof length>
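
As a purely illustrative sketch (the type and function names below are hypothetical, not part of the DAS code base), the agent could parse such a command into its parameters like this:

#include <sstream>
#include <stdexcept>
#include <string>

// Hypothetical parsed form of an inference request.
struct InferenceRequest {
    std::string command;           // e.g. "PROOF_OF_IMPLICATION"
    std::string handle1;           // handle of the first atom
    std::string handle2;           // handle of the second atom
    unsigned int max_proof_length; // upper bound on the proof chain length
};

// Splits "PROOF_OF_IMPLICATION <handle1> <handle2> <max proof length>" into its
// parts. Real requests will probably arrive as token lists over the DASNode
// network rather than as a single string.
InferenceRequest parse_inference_request(const std::string& line) {
    std::istringstream in(line);
    InferenceRequest request;
    if (!(in >> request.command >> request.handle1 >> request.handle2 >> request.max_proof_length)) {
        throw std::runtime_error("Malformed inference request: " + line);
    }
    return request;
}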

Inference requests are processed mostly in the same way:

  1. Send a request to the Link Creation Agent to create reasoning links, passing a list of [QUERY_LINK_CREATION, PROCESSOR/CREATION_TEMPLATE] pairs
  2. Start the Distributed Inference Control algorithm, passing QUERY_INFERENCE and FITNESS_FUNCTION
  3. Wait for the inference to finish
  4. Send the results back to the caller

However, different types of inference will use different queries, processors, and fitness functions in steps 1 and 2.
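
For concreteness, here is a minimal sketch of that four-step flow, assuming hypothetical client interfaces for the Link Creation Agent and the Distributed Inference Control algorithm (the actual classes and method signatures in the DAS code base will likely differ):

#include <string>
#include <utility>
#include <vector>

// Hypothetical stand-ins for the real DAS clients; illustrative only.
struct LinkCreationClient {
    // Sends one [query, processor-or-creation-template] pair to the Link Creation Agent.
    void request(const std::vector<std::string>& /*query_tokens*/,
                 const std::vector<std::string>& /*processor_or_template*/) {
        // Network call omitted in this sketch.
    }
};

struct InferenceControlClient {
    // Starts the Distributed Inference Control algorithm and blocks until it finishes.
    std::vector<std::string> run(const std::vector<std::string>& /*query_inference*/,
                                 const std::string& /*fitness_function_tag*/) {
        return {};  // Proof handles would be returned here.
    }
};

std::vector<std::string> process_inference_request(
    LinkCreationClient& link_creation,
    InferenceControlClient& inference_control,
    const std::vector<std::pair<std::vector<std::string>,
                                std::vector<std::string>>>& link_creation_pairs,
    const std::vector<std::string>& query_inference,
    const std::string& fitness_function_tag) {
    // 1. Ask the Link Creation Agent to create the reasoning links.
    for (const auto& pair : link_creation_pairs) {
        link_creation.request(pair.first, pair.second);
    }
    // 2-3. Start the inference control algorithm and wait for it to finish.
    std::vector<std::string> proof = inference_control.run(query_inference, fitness_function_tag);
    // 4. Return the results so the agent can send them back to the caller.
    return proof;
}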

Since both the Link Creation Agent and the Inference Control algorithm will be part of the DAS architecture, we can just use shared identifiers to indicate which link processor or which fitness function is supposed to be used. I suggest using string enumerations shared in src/cpp/commons.
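
One possible shape for that shared enumeration, assuming a plain header of string constants somewhere under src/cpp/commons (file layout and identifier names are just a suggestion):

// Hypothetical header under src/cpp/commons; the identifiers mirror the tags
// used in this card, but names and location are open for discussion.
#pragma once

namespace commons {

// Link processors implemented by the Link Creation Agent.
inline constexpr const char* IMPLICATION_DEDUCTION = "IMPLICATION_DEDUCTION";
inline constexpr const char* EQUIVALENCE_DEDUCTION = "EQUIVALENCE_DEDUCTION";

// Fitness functions implemented by the Distributed Inference Control algorithm.
inline constexpr const char* IMPLICATION_OR_EQUIVALENCE_DEDUCTION =
    "IMPLICATION_OR_EQUIVALENCE_DEDUCTION";

}  // namespace commons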

Processors will be implemented in the Link Creation Agent while fitness functions will be implemented in the Distributed Inference Control algorithm.

The sequence diagram below depicts how inference requests are supposed to unfold.

[Sequence diagram: how an inference request unfolds across the agents]

PROOF_OF_IMPLICATION_OR_EQUIVALENCE

Link Creation request

The Inference Agent actually issues multiple requests to the Link Creation Agent.

(1) Requests to create SATISFYING_SET and PATTERNS

Query:

LINK_TEMPLATE Expression 3
    NODE Symbol EVALUATION
    LINK_TEMPLATE Expression 2
        NODE Symbol PREDICATE
        VARIABLE P
    LINK_TEMPLATE Expression 2
        NODE Symbol CONCEPT
        VARIABLE C

Creation template:

LIST 2
    LINK_CREATE Expression 2 1
        NODE Symbol SATISFYING_SET
        P
        CUSTOM_FIELD truth_value 2
            strength 1.0
            confidence 1.0
    LINK_CREATE Expression 2 1
        NODE Symbol PATTERNS
        C
        CUSTOM_FIELD truth_value 2
            strength 1.0
            confidence 1.0
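
If the Link Creation Agent ends up taking flat token lists, the query and creation template above could be packed like this (a sketch only; the actual request API may be different):

#include <string>
#include <vector>

// Tokenized form of the query above; illustrative only.
std::vector<std::string> satisfying_set_and_patterns_query() {
    return {"LINK_TEMPLATE", "Expression", "3",
            "NODE", "Symbol", "EVALUATION",
            "LINK_TEMPLATE", "Expression", "2",
            "NODE", "Symbol", "PREDICATE",
            "VARIABLE", "P",
            "LINK_TEMPLATE", "Expression", "2",
            "NODE", "Symbol", "CONCEPT",
            "VARIABLE", "C"};
}

// Tokenized form of the creation template above; illustrative only.
std::vector<std::string> satisfying_set_and_patterns_template() {
    return {"LIST", "2",
            "LINK_CREATE", "Expression", "2", "1",
            "NODE", "Symbol", "SATISFYING_SET",
            "P",
            "CUSTOM_FIELD", "truth_value", "2",
            "strength", "1.0",
            "confidence", "1.0",
            "LINK_CREATE", "Expression", "2", "1",
            "NODE", "Symbol", "PATTERNS",
            "C",
            "CUSTOM_FIELD", "truth_value", "2",
            "strength", "1.0",
            "confidence", "1.0"};
}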

(2) Requests to create IMPLICATION

Query:

AND 3
    LINK_TEMPLATE Expression 2
        NODE Symbol SATISFYING_SET
        VARIABLE P1
    LINK_TEMPLATE Expression 2
        NODE Symbol SATISFYING_SET
        VARIABLE P2
    NOT
        OR
            LINK_TEMPLATE Expression 3
                NODE Symbol IMPLICATION
                LINK_TEMPLATE Expression 3
                    NODE Symbol EVALUATION
                    VARIABLE P1
                    VARIABLE C
                LINK_TEMPLATE Expression 3
                    NODE Symbol EVALUATION
                    VARIABLE P2
                    VARIABLE C        
            LINK_TEMPLATE Expression 3
                NODE Symbol IMPLICATION
                LINK_TEMPLATE Expression 3
                    NODE Symbol EVALUATION
                    VARIABLE P2
                    VARIABLE C
                LINK_TEMPLATE Expression 3
                    NODE Symbol EVALUATION
                    VARIABLE P1
                    VARIABLE C

Processor: IMPLICATION_DEDUCTION

(3) Requests to create EQUIVALENCE

Query:

AND 3
    LINK_TEMPLATE Expression 2
        NODE Symbol PATTERNS
        VARIABLE C1
    LINK_TEMPLATE Expression 2
        NODE Symbol PATTERNS
        VARIABLE C2
    NOT
        OR
            LINK_TEMPLATE Expression 3
                NODE Symbol EQUIVALENCE
                LINK_TEMPLATE Expression 3
                    NODE Symbol EVALUATION
                    VARIABLE P
                    VARIABLE C1
                LINK_TEMPLATE Expression 3
                    NODE Symbol EVALUATION
                    VARIABLE P
                    VARIABLE C2        
            LINK_TEMPLATE Expression 3
                NODE Symbol EQUIVALENCE
                LINK_TEMPLATE Expression 3
                    NODE Symbol EVALUATION
                    VARIABLE P
                    VARIABLE C2
                LINK_TEMPLATE Expression 3
                    NODE Symbol EVALUATION
                    VARIABLE P
                    VARIABLE C1

Processor: EQUIVALENCE_DEDUCTION

Distributed Inference Control Algorithm parameters

This is the query passed to the distributed inference control algorithm:

CHAIN
    <handle 1>
    <handle 2>
    <max proof length>
    OR
        LINK_TEMPLATE Expression 3
            NODE Symbol IMPLICATION
            _FIRST_
            _SECOND_
        LINK_TEMPLATE Expression 3
            NODE Symbol EQUIVALENCE
            _FIRST_
            _SECOND_

The tag for the fitness function is "IMPLICATION_OR_EQUIVALENCE_DEDUCTION".
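
A sketch of how that CHAIN request could be assembled for a given pair of handles (the token layout follows this card; everything else, including the function name, is hypothetical):

#include <string>
#include <vector>

// Builds the CHAIN query above for the given handles; illustrative only.
std::vector<std::string> build_implication_or_equivalence_chain(
    const std::string& handle1,
    const std::string& handle2,
    unsigned int max_proof_length) {
    return {"CHAIN",
            handle1,
            handle2,
            std::to_string(max_proof_length),
            "OR",
            "LINK_TEMPLATE", "Expression", "3",
            "NODE", "Symbol", "IMPLICATION",
            "_FIRST_", "_SECOND_",
            "LINK_TEMPLATE", "Expression", "3",
            "NODE", "Symbol", "EQUIVALENCE",
            "_FIRST_", "_SECOND_"};
}

The fitness function would then be referenced through its shared tag (e.g. commons::IMPLICATION_OR_EQUIVALENCE_DEDUCTION from the sketch above).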

PROOF_OF_IMPLICATION and PROOF_OF_EQUIVALENCE

These types of inference requests will be detailed in another card.
