Associate odom orientation with image data (Feature Detection) #44
Labels
- enhancement: Upgrade(s) to an existing feature/code-snippet
- Estimation: Estimation/Prediction/Filtering related
- Future: This will not be worked on for now
Milestone
Time estimate: 15 hours
Currently the plan is to use the Hough Transform (HT) as the base detector for the gate and the pole.
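For reference, a minimal sketch of what the base detection stage might look like with OpenCV's probabilistic Hough transform; the file name and all tuning parameters here are placeholders, not the node's actual configuration:

```python
import cv2
import numpy as np

# Placeholder input; in the node this would come from the camera topic.
frame = cv2.imread("frame.png", cv2.IMREAD_GRAYSCALE)
edges = cv2.Canny(frame, 50, 150)

# Probabilistic Hough transform; thresholds/lengths are illustrative only.
lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180,
                        threshold=80, minLineLength=40, maxLineGap=5)

# HoughLinesP returns shape (N, 1, 4); flatten to (N, 4) segments.
segments = lines.reshape(-1, 4) if lines is not None else np.empty((0, 4))
```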
To use it as a base filter, we need to discard non-vertically and non-horizontally oriented outputs. The current method is to look at their orientations in the camera frame. Ideally, we would correlate feature orientations with the world frame: use the absolute drone pose to filter out lines that are not aligned with local gravity in the actual world (instead of relative to the drone). A sketch of such a filter is given below.
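A minimal sketch of the roll-compensated orientation filter, assuming HoughLinesP-style segments and treating image-plane rotation as approximately equal to camera roll; `filter_gravity_aligned` and its tolerance parameter are hypothetical names, not existing code:

```python
import numpy as np

def filter_gravity_aligned(segments, drone_roll_rad, tol_deg=10.0):
    """Keep line segments that are vertical or horizontal in the world
    frame, compensating for the drone's roll about the camera axis.

    segments: array-like of [x1, y1, x2, y2] rows (flattened HoughLinesP
    output). The sign of the roll correction depends on the camera
    mounting convention and would need to be verified on the drone.
    """
    kept = []
    for x1, y1, x2, y2 in segments:
        # Segment angle in the image frame, folded into [0, 180) degrees.
        angle = np.degrees(np.arctan2(y2 - y1, x2 - x1)) % 180.0
        # Remove the drone's roll to approximate the world-frame angle.
        world_angle = (angle - np.degrees(drone_roll_rad)) % 180.0
        # Angular distance to the nearest vertical/horizontal direction.
        off_axis = min(world_angle % 90.0, 90.0 - (world_angle % 90.0))
        if off_axis <= tol_deg:
            kept.append((x1, y1, x2, y2))
    return kept
```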
- Create an odometry subscriber in the feature detection node.
- Write an odometry processing module in the Feature Detection package that takes the drone pose in the world frame as input and outputs the drone's orientation about the roll axis (in degrees/radians); see the sketch below.
- Integrate this module with the Hough Processing function, so that the threshold for orientation-based filtering can be chosen dynamically.
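A minimal sketch of the odometry side, assuming ROS 1 (rospy) and a nav_msgs/Odometry source; the topic name `/odom` and the class name `OdomRollTracker` are placeholders:

```python
import math
import rospy
from nav_msgs.msg import Odometry

class OdomRollTracker:
    """Subscribes to odometry and exposes the latest roll angle (radians)."""

    def __init__(self, topic="/odom"):  # topic name is a placeholder
        self.roll = 0.0
        rospy.Subscriber(topic, Odometry, self._odom_cb, queue_size=1)

    def _odom_cb(self, msg):
        q = msg.pose.pose.orientation
        # Roll from quaternion (x, y, z, w), standard ZYX Euler convention.
        sinr_cosp = 2.0 * (q.w * q.x + q.y * q.z)
        cosr_cosp = 1.0 - 2.0 * (q.x * q.x + q.y * q.y)
        self.roll = math.atan2(sinr_cosp, cosr_cosp)
```

The Hough Processing function could then call something like `filter_gravity_aligned(segments, tracker.roll, tol_deg=...)`, with the tolerance exposed as the dynamically chosen threshold.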