Hybrid Elimination Improvements #1575

Merged
merged 17 commits into develop from hybrid-tablefactor-2
Jul 19, 2023

Conversation

@varunagrawal (Collaborator) commented on Jul 12, 2023:

  • Override apply with UnaryAssignment for DecisionTreeFactor (a usage sketch follows this list).
  • Improve discrete elimination by eliminating the joint distribution and re-segregating it back into the original conditionals.
  • Make the hybrid code use the common parent class DiscreteFactor where applicable.
  • Update the eliminate method to check the factors, as mentioned in @dellaert's TODO.
  • Templatize the methods in Switching.h to remove duplication.
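
A minimal usage sketch of the first item (hypothetical: it assumes the new `DecisionTreeFactor::apply` overload mirrors the existing `DecisionTree::apply(UnaryAssignment)`, which passes each leaf's assignment to the functor along with its value):

```cpp
// Hypothetical usage of a DecisionTreeFactor::apply(UnaryAssignment) overload,
// assuming it mirrors DecisionTree<Key, double>::apply(UnaryAssignment).
#include <gtsam/discrete/DecisionTreeFactor.h>

using namespace gtsam;

int main() {
  DiscreteKey A(0, 2), B(1, 2);
  DecisionTreeFactor f(A & B, "1 2 3 4");

  // Unlike the plain Unary op, the UnaryAssignment functor also receives the
  // discrete assignment of each leaf, so the new value can depend on *which*
  // leaf is being visited, not just on its current value.
  auto zeroOutA1 = [](const Assignment<Key>& assignment, const double& value) {
    return assignment.at(0) == 1 ? 0.0 : value;  // zero every leaf with A == 1
  };
  DecisionTreeFactor g = f.apply(zeroOutA1);
  g.print("masked factor");
  return 0;
}
```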

@varunagrawal self-assigned this Jul 12, 2023
@varunagrawal (Collaborator, Author) commented:

I changed the way we prune the discrete probabilities by pruning the joint distribution rather than the conditionals. This gives a 3x speedup. Maybe we should be pruning before discrete-only elimination.

Before

Number of timesteps: K = 16

-Total: 0 CPU (0 times, 0 wall, 0.65 children, min: 0 max: 0)
|   -SmootherEstimation: 3.65 CPU (1 times, 3.70242 wall, 0.65 children, min: 0 max: 0)
|   |   -SmootherUpdate: 3.63 CPU (15 times, 3.67252 wall, 0.65 children, min: 0 max: 0)
|   |   |   -assembleGraphTree: 0.01 CPU (30 times, 0.00976 wall, 0.01 children, min: 0 max: 0)
|   |   |   -hybrid continuous eliminate: 0 CPU (168 times, 0.006431 wall, 0 children, min: 0 max: 0)
|   |   |   -HybridBayesNet PruneDiscreteConditionals: 0.64 CPU (15 times, 0.640958 wall, 0.64 children, min: 0 max: 0)
|   |   |   -HybridBayesNet UpdateDiscreteConditionals: 2.8 CPU (15 times, 2.84436 wall, 0 children, min: 0 max: 0)
|   |   |   |   -HybridBayesNet MakeConditional: 0 CPU (120 times, 0.001385 wall, 0 children, min: 0 max: 0)
|   |   |   -HybridBayesNet PruneMixtures: 0 CPU (15 times, 0.009233 wall, 0 children, min: 0 max: 0)

After

Number of timesteps: K = 16

-Total: 0 CPU (0 times, 0 wall, 0.71 children, min: 0 max: 0)
|   -SmootherEstimation: 0.91 CPU (1 times, 0.904348 wall, 0.71 children, min: 0 max: 0)
|   |   -SmootherUpdate: 0.88 CPU (15 times, 0.882961 wall, 0.71 children, min: 0 max: 0)
|   |   |   -assembleGraphTree: 0 CPU (30 times, 0.009388 wall, 0 children, min: 0 max: 0)
|   |   |   -hybrid continuous eliminate: 0 CPU (168 times, 0.004795 wall, 0 children, min: 0 max: 0)
|   |   |   -HybridBayesNet PruneDiscreteConditionals: 0.64 CPU (15 times, 0.632673 wall, 0.64 children, min: 0 max: 0)
|   |   |   -HybridBayesNet UpdateDiscreteConditionals: 0.07 CPU (15 times, 0.071286 wall, 0.07 children, min: 0 max: 0)
|   |   |   -HybridBayesNet PruneMixtures: 0 CPU (15 times, 0.00871 wall, 0 children, min: 0 max: 0)
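
To make the joint-pruning idea above concrete, here is a hedged sketch using only the public discrete API (illustrative: the keys, tables, and maxNrLeaves value are made up, the PR's actual HybridBayesNet code differs, and re-segregating the pruned joint back into the original conditional structure is elided):

```cpp
// Sketch: prune the joint discrete distribution once instead of pruning each
// conditional separately.
#include <gtsam/discrete/DecisionTreeFactor.h>
#include <gtsam/discrete/DiscreteConditional.h>

using namespace gtsam;

int main() {
  DiscreteKey M0(0, 2), M1(1, 2);

  // Two discrete conditionals, e.g. the discrete part of a hybrid Bayes net:
  // P(M1 | M0) and P(M0).
  DiscreteConditional pM1givenM0(M1 | M0 = "4/6 3/7");
  DiscreteConditional pM0(M0 % "5/5");

  // Multiply them into the joint P(M0, M1), prune it once, and rebuild a
  // conditional from the pruned joint.
  DecisionTreeFactor joint = pM1givenM0 * pM0;
  DecisionTreeFactor prunedJoint = joint.prune(2);        // keep 2 largest leaves
  DiscreteConditional prunedConditional(2, prunedJoint);  // both keys frontal
  prunedConditional.print("pruned joint P(M0, M1)");
  return 0;
}
```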

@dellaert (Member) commented:

> I changed the way we prune the discrete probabilities by pruning the joint distribution rather than the conditionals. This gives a 3x speedup. Maybe we should be pruning before discrete-only elimination.

Does this PR make both these changes? I'd prefer to review a PR that just does the tablefactor and shows the speedup...

@varunagrawal changed the title from "TableFactor for Hybrid Elimination" to "Hybrid Elimination Improvements" on Jul 16, 2023
@varunagrawal (Collaborator, Author) commented:

> I changed the way we prune the discrete probabilities by pruning the joint distribution rather than the conditionals. This gives a 3x speedup. Maybe we should be pruning before discrete-only elimination.
>
> Does this PR make both these changes? I'd prefer to review a PR that just does the tablefactor and shows the speedup...

This PR only updates the discrete elimination to prune the joint distribution. TableFactor is not yet used, and will come in a subsequent PR. :)

Updated PR description to reflect the changes happening here.

@dellaert (Member) commented:

I'd still like to request splitting this into TableFactor-related changes and the rest, and PR-ing straight to develop. Otherwise the base branch will become an un-reviewable kitchen-sink PR. P.S. CI seems to fail, so splitting might help there as well.

@varunagrawal (Collaborator, Author) commented:

In that case I'm going to have to do some cherry picking and force pushing.

@varunagrawal (Collaborator, Author) commented:

> I'd still like to request splitting this into TableFactor-related changes and the rest, and PR-ing straight to develop. Otherwise the base branch will become an un-reviewable kitchen-sink PR. P.S. CI seems to fail, so splitting might help there as well.

It won't be a kitchen sink PR if we merge in the parent PRs first.

Base automatically changed from hybrid-tablefactor to develop July 17, 2023 15:54
@varunagrawal (Collaborator, Author) commented:

@dellaert I split the changes to TableFactor into #1580.

@dellaert (Member) left a comment:

Some comments. This PR changed too many things at once to be sure of anything, though.

@@ -299,6 +299,42 @@ namespace gtsam {
/// Return the number of leaves in the tree.
size_t nrLeaves() const;

/**
* @brief This is a convenience function which returns the total number of
@dellaert (Member):

Spelling. And why are we adding it? And why is the implementation recursive?
I would just as soon delete it unless it has a purpose.

@varunagrawal (Collaborator, Author):

I'm not sure which word is misspelled. This is a convenience method; its purpose is to help with testing and verifying correctness.

@@ -231,7 +231,7 @@ TEST(HybridBayesNet, Pruning) {
auto prunedTree = prunedBayesNet.evaluate(delta.continuous());

// Regression test on pruned logProbability tree
-  std::vector<double> pruned_leaves = {0.0, 20.346113, 0.0, 19.738098};
+  std::vector<double> pruned_leaves = {0.0, 32.713418, 0.0, 31.735823};
@dellaert (Member):

What's happening here? Why are the regression values changing?

@varunagrawal (Collaborator, Author):

The decision tree is normalizing the values based on the introduced zeros from pruning. Since I changed the way we're pruning (on the joint rather than the conditionals), the normalizing factor has changed.
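
A tiny numeric illustration of that effect (made-up values, not the ones in this test): zeroing leaves and then renormalizing rescales every surviving leaf by a common factor, which is what moves the regression values.

```cpp
// Illustrative only: pruning zeroes some leaves, then the survivors are
// rescaled by a common constant so the distribution still sums to 1.
#include <cassert>
#include <vector>

int main() {
  // Original discrete probabilities over 4 assignments.
  std::vector<double> leaves = {0.1, 0.4, 0.2, 0.3};

  // Prune: keep only the 2 largest leaves, zero the rest.
  std::vector<double> pruned = {0.0, leaves[1], 0.0, leaves[3]};

  // Renormalize so the survivors sum to 1; each is scaled by 1 / 0.7.
  double survivorSum = pruned[1] + pruned[3];
  for (double& p : pruned) p /= survivorSum;

  // Every surviving leaf changed by the same multiplicative factor,
  // which is why values tied to the normalization constant change too.
  assert(pruned[1] > 0.571 && pruned[1] < 0.572);  // 0.4 / 0.7 ≈ 0.5714
  return 0;
}
```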

double density = exp(logProbability);
- EXPECT_DOUBLES_EQUAL(density, actualTree(discrete_values), 1e-9);
+ EXPECT_DOUBLES_EQUAL(density,
@dellaert (Member):

This was not a regression before, but suddenly there is an arbitrary multiplicative factor here?

@varunagrawal (Collaborator, Author):

That is the multiplicative factor accounting for the different normalization constant.

@@ -63,8 +63,8 @@ TEST(MixtureFactor, Printing) {
R"(Hybrid [x1 x2; 1]
MixtureFactor
Choice(1)
-  0 Leaf Nonlinear factor on 2 keys
-  1 Leaf Nonlinear factor on 2 keys
+  0 Leaf [1]Nonlinear factor on 2 keys
@dellaert (Member):

spacing is weird in this case.

@varunagrawal (Collaborator, Author):

Done

@dellaert (Member) left a comment:

OK, merge this at will :-)

@varunagrawal merged commit ba7c077 into develop on Jul 19, 2023
26 checks passed
@varunagrawal deleted the hybrid-tablefactor-2 branch on July 19, 2023 at 10:42