
Conversation


@omereliy omereliy commented Jun 2, 2025


  • negative preconditions
  • added type hierarchy
  • model type checking
  • aligning action/fluent lifted/grounded/parameter-bound classes to their formal definitions
  • generator now allows observing static fluents.

@haz haz marked this pull request as draft June 2, 2025 13:59

haz commented Jun 2, 2025

Thanks, @omereliy ! Can you start by explaining the shift to macq/core? We have parsed actions for generation, and then learned actions (potentially lifted) on the extraction side -- what's the need for yet another action spec?


omereliy commented Jun 2, 2025

The suggestion is ultimately to remove all fluent/action-related classes that are not part of the core package, and to adapt all other packages to use the core classes.

The LearnedLiftedFluent behaves as a parameter-bound fluent but is hashed as a lifted fluent, which highlights the need for a dedicated parameter-bound fluent class. The LearnedLiftedAction class stores sets of LearnedLiftedFluent as its preconditions and effects, which I find error-prone since they should be parameter-bound fluents (some fluents can "disappear" when stored in a set).
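The "disappearing fluent" concern could be illustrated with a hypothetical sketch (these are not MACQ's actual classes): if a lifted fluent hashes and compares only on its name, two differently-bound occurrences of the same predicate collapse when stored in a set, while a parameter-bound fluent that includes its binding in hash/equality keeps them distinct.

```python
from dataclasses import dataclass

class LiftedFluent:
    """Hypothetical lifted fluent: hashes/compares on name only."""
    def __init__(self, name, params):
        self.name = name
        self.params = params  # parameter binding, ignored by hash/eq

    def __hash__(self):
        return hash(self.name)

    def __eq__(self, other):
        return isinstance(other, LiftedFluent) and self.name == other.name

@dataclass(frozen=True)
class ParamBoundFluent:
    """Hypothetical parameter-bound fluent: hashes on name *and* binding."""
    name: str
    params: tuple

# Two distinct preconditions of a lifted action, e.g. on(?x, ?y) vs on(?y, ?x):
pre1, pre2 = LiftedFluent("on", ("?x", "?y")), LiftedFluent("on", ("?y", "?x"))
print(len({pre1, pre2}))  # 1 -- one precondition "disappears"

b1, b2 = ParamBoundFluent("on", ("?x", "?y")), ParamBoundFluent("on", ("?y", "?x"))
print(len({b1, b2}))  # 2 -- both bindings preserved
```

The frozen dataclass gets field-wise `__eq__`/`__hash__` for free, which is exactly the behavior a parameter-bound fluent needs when stored in precondition/effect sets.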

The trace action/fluent classes can be the same classes as the learned types; I don't see the need for redundant classes.

The split allows easier handling of PDDL generation from the model.

This will eventually allow applying a learned action to a state (if the fluents share the same name), ultimately enabling trace generation, grounding/lifting, and binding of models, actions, and fluents. Bottom line: it makes evaluating the model and debugging the process easier.

Future work can include a package for model evaluation.


omereliy commented Jun 2, 2025

The lack of a type hierarchy can cause a lot of problems when trying to generate a PDDL with the tarski package (a "multiple definitions of a variable" error; specifically, on fluents when learning typed models with a hierarchy tree of depth > 2 whose root is the 'object' type).
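A minimal sketch of what such a type hierarchy could look like (hypothetical names, not tied to MACQ's or tarski's API), covering the depth > 2 case rooted at 'object':

```python
class PDDLType:
    """Hypothetical PDDL type node with a single parent link."""
    def __init__(self, name, parent=None):
        self.name = name
        self.parent = parent  # None only for the root 'object' type

    def is_subtype_of(self, other):
        """Walk up the parent chain looking for `other`."""
        t = self
        while t is not None:
            if t.name == other.name:
                return True
            t = t.parent
        return False

obj = PDDLType("object")
locatable = PDDLType("locatable", obj)
truck = PDDLType("truck", locatable)   # depth > 2 below the root

print(truck.is_subtype_of(obj))   # True
print(obj.is_subtype_of(truck))   # False
```

Keeping the full chain explicit lets a PDDL generator emit each type declaration exactly once under its direct parent, avoiding the duplicate-definition problem described above.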


haz commented Jun 2, 2025

OK, scrapping the previous separation is a fairly major shift we should think on.

@e-cal / @beckydvn / @TathagataChakraborti : I know it's been ages. Like, really long ago. But can you remember the original philosophy behind having separate action/fluent classes from the trace side -vs- learned representation side? I recall us debating it, but I don't recall what went into the final decision.

Other points, @omereliy :

  • We should be shifting away from tarski, and so I'm not so interested in changes that are needed just for tarski's sake.
  • Having type hierarchy makes sense, but I don't think there's much out there in the way of extraction methods that learn the hierarchy, right?
  • Not sure I understand what you're saying here:

The LearnedLiftedFluent behaves as a parameter-bound fluent but is hashed as a lifted fluent, which highlights the need for a dedicated parameter-bound fluent class. The LearnedLiftedAction class stores sets of LearnedLiftedFluent as its preconditions and effects, which I find error-prone since they should be parameter-bound fluents (some fluents can "disappear" when stored in a set).

By far the biggest argument to unify everything is so that we pave the way for metrics. But I'm hoping to get a take from the original co-authors to be sure that's a safe move to make.

@TathagataChakraborti

I know it's been ages. Like, really long ago. But can you remember the original philosophy behind having separate action/fluent classes from the trace side -vs- learned representation side?

I think one of the fundamental ideas about model learning we wanted to preserve is that the model you choose to learn can have different properties than the underlying (unknown) model that produced the traces, i.e., there can be multiple interpretations of the same traces under different sets of assumptions about model properties (e.g., is it noise in the observations or stochasticity of the behavior?).

But I am not sure the classes need to be different to capture this.

cause a lot of problems when trying to generate a PDDL with the tarski package (multiple definitions of a variable error)

Can confirm this 😬 I had to make an emergency fork recently just to make something run. 🫣


haz commented Jun 4, 2025

Thanks, @TathagataChakraborti !

But I am not sure the classes need to be different to capture this.

Definitely needs to be flexible enough to allow for differing input/output formats (just imagine all the lifted source PDDL with grounded actions learned), but as you point out, it's the class separation that remains a conundrum. @e-cal / @beckydvn ?


e-cal commented Jun 4, 2025

I think separating the classes just better fits the mental (and literal) model of separation between actual and learned actions & fluents. The data models end up being pretty different since actual is a 1-1 representation of the original pddl while learned is whatever the extraction technique manages to pull out, so there isn't a lot that can be cleanly shared between the classes.

But there's no technical reason as far as I can recall, and having them all inherit from a parent class seems like a more Java-flavored / object-oriented way of designing the internal API vs a more procedural one.

The only issue I could foresee is the classic one with deep object-oriented inheritance structures, where you inevitably need to add some behavior or property to a child class that doesn't quite suit the existing inheritance scheme, which then requires a bunch of refactoring and more sub-classing.

My personal preference is to keep things more isolated and use data models over big OO class-inheritance chains, but since I haven't worked on the codebase in a while, I can't speak to whether changing it would help or hurt the dev experience.
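The trade-off under discussion could be sketched roughly as follows (hypothetical classes, not MACQ's actual ones): separate flat data models per side versus one shared class used on both the trace and learned sides. The main payoff of unification is that cross-side comparison, e.g. for metrics, becomes a plain set operation.

```python
from dataclasses import dataclass

# Option A: separate, flat data models (the current separation)
@dataclass(frozen=True)
class TraceFluent:          # 1-1 with the source PDDL
    name: str
    objects: tuple

@dataclass(frozen=True)
class LearnedFluent:        # whatever the extraction technique recovers
    name: str
    params: tuple

# Option B: one shared class on both sides (the proposed unification)
@dataclass(frozen=True)
class Fluent:
    name: str
    args: tuple             # objects when grounded, variables when lifted

# With a shared class, comparing a reference model to a learned one is trivial:
true_model = {Fluent("at", ("?t", "?l"))}
learned = {Fluent("at", ("?t", "?l")), Fluent("in", ("?p", "?t"))}
print(len(true_model & learned))  # 1 -- shared fluents, directly comparable
```

Under Option A the same comparison needs an explicit conversion step between `TraceFluent` and `LearnedFluent`, which is exactly the friction the metrics argument is about.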
