Add wiki figures to website, minor changes associated with code cleanup
shalabymhd committed Nov 22, 2024
1 parent b205dce commit 4d44f76
Showing 17 changed files with 22 additions and 23 deletions.
Binary file added assets/anchor_constellation.jpg
Binary file added assets/apriltag_det.png
Binary file added assets/banner_image.jpg
Binary file added assets/decar_logo.png
Binary file added assets/ifo.jpg
Binary file added assets/lazy_classifier_results.png
Binary file added assets/setup.jpg
Binary file added assets/table.jpg
Binary file added assets/trajectories.jpg
11 changes: 6 additions & 5 deletions docs/data.md
@@ -7,22 +7,23 @@ usemathjax: true
# Experimental Setup
## Uvify IFO-S
Each Uvify IFO-S quadcopter is equipped with an IMU, a front-facing Intel RealSense D435i camera, an integrated downward-facing camera, and two UWB transceivers, as depicted below. The onboard flight computer is an NVIDIA Jetson Nano running the PX4 autopilot software.
![](https://github.com/ndahdah/miluv_wiki/blob/main/doc/_static/ifo.jpg)
![](https://decargroup.github.io/miluv/assets/ifo.jpg)

## Flight Arena
The UAVs operated within an approximately 4 m x 4 m x 3 m subsection of an enclosed flight arena. The arena comprises 12 motion capture cameras and six anchors equipped with UWB transceivers. The experimental setup is illustrated below.
![](https://github.com/ndahdah/miluv_wiki/blob/main/doc/_static/setup.jpg)
![](https://decargroup.github.io/miluv/assets/setup.png)

The dataset includes experiments with three different anchor constellations, shown below.
![](https://github.com/ndahdah/miluv_wiki/blob/main/doc/_static/anchor_constellation.jpg)
![](https://decargroup.github.io/miluv/assets/anchor_constellation.jpg)

The primary constellation consists of anchors at varying heights, evenly spaced around the UAVs' operating area. The second constellation places the anchors at the same positions as the primary constellation, but with every transceiver at the same height. The third constellation consists of three clusters of two anchors each, at varied heights. The location of each anchor is determined using the motion capture cameras. For experiments with obstacles, wood, plastic, and foam were placed in front of the UWB tags to disrupt the line of sight to the UAVs' UWB transceivers.

# Summary of Experiments
![](https://github.com/ndahdah/miluv_wiki/blob/main/doc/_static/table.jpg)

![](https://decargroup.github.io/miluv/assets/table.jpg)

# Trajectories
![](https://github.com/ndahdah/miluv_wiki/blob/main/doc/_static/trajectories.jpg)
![](https://decargroup.github.io/miluv/assets/trajectories.jpg)

<!---
# Trajectory Videos
2 changes: 1 addition & 1 deletion docs/examples/apriltag.md
@@ -13,7 +13,7 @@ $ python examples/ex_detect_apriltags.py
in the repository's root directory.

Note that users can safely ignore the following warning: "warning: count < 8 :(". Once the first image appears on-screen, press any key other than 'escape' to advance the processed image stream forward in time, or press 'escape' to end the image stream. A sample image from this example is shown below.
![](https://github.com/ndahdah/miluv_wiki/blob/main/doc/_static/apriltag_det.png)
![](https://decargroup.github.io/miluv/assets/apriltag_det.png)
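As a minimal, hypothetical sketch of this keypress-driven viewer (assuming OpenCV is used for display; the frames below are placeholders, not the example's actual code or data):

```py
import cv2
import numpy as np

# Placeholder frames standing in for the example's processed image stream.
processed_frames = [np.zeros((480, 640, 3), dtype=np.uint8) for _ in range(3)]

for frame in processed_frames:
    cv2.imshow("AprilTag detections", frame)
    # 'escape' (key code 27) ends the stream; any other key advances to the next image.
    if cv2.waitKey(0) == 27:
        break
cv2.destroyAllWindows()
```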

## More details
This example demonstrates how to load images from the MILUV dataset, as shown below,
8 changes: 4 additions & 4 deletions docs/examples/ekf/se23_one_robot.md
@@ -28,14 +28,14 @@ We follow the same notation convention mentioned in the paper and assume the sam

## Importing Libraries and MILUV Utilities

We start by importing the necessary libraries and utilities for this example as in the VINS example, with the only change being the EKF model we are using and the fact that we do not need to import `miluv.utils` separately as we do not need to process VINS data.
We start by importing the necessary libraries and utilities for this example as in the VINS example, with the only change being the EKF model we are using.

```py
import numpy as np
import pandas as pd

from miluv.data import DataLoader
import utils.liegroups as liegroups
import miluv.utils as utils
import examples.ekfutils.imu_one_robot_models as model
import examples.ekfutils.common as common
```
@@ -68,10 +68,10 @@ accel: pd.DataFrame = imu_at_query_timestamps["imu_px4"][["timestamp", "linear_a
gyro: pd.DataFrame = imu_at_query_timestamps["imu_px4"][["timestamp", "angular_velocity.x", "angular_velocity.y", "angular_velocity.z"]]
```

To be able to evaluate our EKF, we extract the ground truth pose data at these timestamps. The `DataLoader` class provides interpolated splines for the ground truth pose data, and the reason for that is that we can query the ground truth data at any timestamp and call the `derivative` method to get higher-order derivatives of the pose data. For example, here we use the first derivative of the pose data to get the linear velocity data, which is necessary to evaluate our $SE_2(3)$ EKF. We use a helper function from the `liegroups` module to convert the mocap pose data and its derivatives to a list of $SE_2(3)$ poses.
To evaluate our EKF, we extract the ground truth pose data at these timestamps. The `DataLoader` class provides interpolated splines for the ground truth pose data, which lets us query the ground truth at any timestamp and call the `derivative` method to obtain higher-order derivatives of the pose data. For example, here we use the first derivative of the pose data to obtain the linear velocity, which is necessary to evaluate our $SE_2(3)$ EKF. We use a helper function from the `utils` module to convert the mocap pose data and its derivatives to a list of $SE_2(3)$ poses.

```py
gt_se23 = liegroups.get_se23_poses(
gt_se23 = utils.get_se23_poses(
data["mocap_quat"](query_timestamps), data["mocap_pos"].derivative(nu=1)(query_timestamps), data["mocap_pos"](query_timestamps)
)
```
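The snippet below is a conceptual sketch of this spline-query pattern using SciPy, with made-up timestamps and positions; it only illustrates the idea of evaluating a fitted spline and its first derivative at arbitrary query timestamps, and is not MILUV's internal implementation.

```py
import numpy as np
from scipy.interpolate import make_interp_spline

# Hypothetical mocap timestamps and 3D positions, used only to illustrate the pattern.
t = np.linspace(0.0, 10.0, 200)
pos = np.column_stack([np.sin(t), np.cos(t), 0.1 * t])

spline = make_interp_spline(t, pos, k=3)

query_timestamps = np.linspace(0.0, 10.0, 50)
pos_at_query = spline(query_timestamps)                   # analogous to data["mocap_pos"](query_timestamps)
vel_at_query = spline.derivative(nu=1)(query_timestamps)  # analogous to .derivative(nu=1)(query_timestamps)
```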
4 changes: 2 additions & 2 deletions docs/examples/ekf/se23_three_robot.md
@@ -20,7 +20,7 @@ import numpy as np
import pandas as pd

from miluv.data import DataLoader
import utils.liegroups as liegroups
import miluv.utils as utils
import examples.ekfutils.imu_three_robots_models as model
import examples.ekfutils.common as common
```
@@ -67,7 +67,7 @@ Lastly, we extract the ground truth poses and biases for all three robots at the

```py
gt_se23 = {
robot: liegroups.get_se23_poses(
robot: utils.get_se23_poses(
data[robot]["mocap_quat"](query_timestamps), data[robot]["mocap_pos"].derivative(nu=1)(query_timestamps), data[robot]["mocap_pos"](query_timestamps)
)
for robot in data.keys()
7 changes: 3 additions & 4 deletions docs/examples/ekf/se3_one_robot.md
@@ -42,10 +42,9 @@ We then import the `DataLoader` class from the `miluv` package, which provides a
from miluv.data import DataLoader
```

We also import the `liegroups` and `utils` libraries from the `miluv` package, which provide utilities for Lie groups that accompany and other helper functions.
We also import the `utils` module from the `miluv` package, which provides Lie group utilities and other helper functions.

```py
import utils.liegroups as liegroups
import miluv.utils as utils
```

@@ -96,10 +95,10 @@ gyro: pd.DataFrame = imu_at_query_timestamps["imu_px4"][["timestamp", "angular_v
vins_at_query_timestamps = utils.zero_order_hold(query_timestamps, vins)
```
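Since VINS measurements arrive at their own rate, a zero-order hold simply carries each measurement forward until the next one arrives, so every query timestamp is matched with the most recent measurement at or before it. The helper below is a conceptual sketch of that idea, not the actual `utils.zero_order_hold` implementation, and it assumes a `timestamp` column.

```py
import numpy as np
import pandas as pd

def zero_order_hold_sketch(query_timestamps, df: pd.DataFrame, time_col: str = "timestamp") -> pd.DataFrame:
    # For each query time, take the most recent row at or before it (clamped to the first row).
    df = df.sort_values(time_col)
    idx = np.searchsorted(df[time_col].to_numpy(), np.asarray(query_timestamps), side="right") - 1
    idx = np.clip(idx, 0, len(df) - 1)
    return df.iloc[idx].reset_index(drop=True)
```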

To be able to evaluate our EKF, we extract the ground truth pose data at these timestamps. The `DataLoader` class provides interpolated splines for the ground truth pose data, which we can use to get the ground truth poses at the query timestamps. We use a helper function from the `liegroups` module to convert the mocap pose data to a list of $SE(3)$ poses.
To be able to evaluate our EKF, we extract the ground truth pose data at these timestamps. The `DataLoader` class provides interpolated splines for the ground truth pose data, which we can use to get the ground truth poses at the query timestamps. We use a helper function from the `utils` module to convert the mocap pose data to a list of $SE(3)$ poses.

```py
gt_se3 = liegroups.get_se3_poses(
gt_se3 = utils.get_se3_poses(
data["mocap_quat"](query_timestamps), data["mocap_pos"](query_timestamps)
)
```
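For context, each $SE(3)$ pose is a 4x4 homogeneous transformation assembled from the mocap attitude and position. The helper below is an illustrative stand-in for that conversion, not the actual `utils.get_se3_poses`, and the (x, y, z, w) quaternion ordering is an assumption rather than MILUV's documented convention.

```py
import numpy as np
from scipy.spatial.transform import Rotation

def se3_from_quat_pos(quat_xyzw, pos):
    # Illustrative only: build a 4x4 homogeneous transform from a quaternion and a position.
    T = np.eye(4)
    T[:3, :3] = Rotation.from_quat(quat_xyzw).as_matrix()
    T[:3, 3] = pos
    return T
```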
3 changes: 1 addition & 2 deletions docs/examples/ekf/se3_three_robot.md
@@ -28,7 +28,6 @@ import numpy as np
import pandas as pd

from miluv.data import DataLoader
import utils.liegroups as liegroups
import miluv.utils as utils
import examples.ekfutils.vins_one_robot_models as model
import examples.ekfutils.common as common
@@ -94,7 +93,7 @@ Then, the ground truth data,

```py
gt_se3 = {
robot: liegroups.get_se3_poses(data[robot]["mocap_quat"](query_timestamps), data[robot]["mocap_pos"](query_timestamps))
robot: utils.get_se3_poses(data[robot]["mocap_quat"](query_timestamps), data[robot]["mocap_pos"](query_timestamps))
for robot in data.keys()
}
```
2 changes: 1 addition & 1 deletion docs/examples/losclassification.md
@@ -13,7 +13,7 @@ $ python examples/ex_los_nlos_classification.py
in the repository's root directory.

The output from this example is shown below.
![](https://github.com/ndahdah/miluv_wiki/blob/main/doc/_static/lazy_classifier_results.png)
![](https://decargroup.github.io/miluv/assets/lazy_classifier_results.png)

## More details
This example demonstrates how to use CIR data and how to set up MILUV data for machine learning purposes as shown below,
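As a rough, hypothetical sketch of this kind of setup (the feature columns, labels, and classifier below are invented for illustration and are not the MILUV CIR schema or the example's actual code):

```py
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Hypothetical CIR-derived features and LOS/NLOS labels, for illustration only.
rng = np.random.default_rng(0)
cir_features = pd.DataFrame({
    "first_path_power": rng.normal(size=200),
    "total_power": rng.normal(size=200),
    "rise_time": rng.normal(size=200),
})
los_labels = rng.integers(0, 2, size=200)

X_train, X_test, y_train, y_test = train_test_split(
    cir_features, los_labels, test_size=0.25, random_state=0
)
clf = RandomForestClassifier(random_state=0).fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```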
8 changes: 4 additions & 4 deletions docs/examples/visualizeimu.md
@@ -74,7 +74,7 @@ axs[2].plot(time, gt_gyro[2, :], label="Ground Truth")
```

<p align="center">
<img src="https://decargroup.github.io/miluv/assets/imu/gyro.png" alt="gyro" width="400" class="center"/>
<img src="https://decargroup.github.io/miluv/assets/imu/gyro.png" alt="gyro" width="600" class="center"/>
</p>
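A minimal sketch of the three-axis plotting pattern behind these figures is shown below, using synthetic stand-ins for `time`, the gyroscope measurements, and the ground truth; the real example builds these arrays from the MILUV data.

```py
import numpy as np
import matplotlib.pyplot as plt

# Synthetic stand-ins for illustration only.
time = np.linspace(0.0, 10.0, 500)
gt_gyro = np.vstack([np.sin(time), np.cos(time), 0.1 * time])
gyro = gt_gyro + 0.05 * np.random.randn(3, 500)

fig, axs = plt.subplots(3, 1, sharex=True)
for i, axis in enumerate("xyz"):
    axs[i].plot(time, gyro[i, :], label="Measured")
    axs[i].plot(time, gt_gyro[i, :], label="Ground Truth")
    axs[i].set_ylabel(f"gyro {axis} [rad/s]")
axs[0].legend()
axs[2].set_xlabel("time [s]")
plt.show()
```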

We then plot the measurement error and the computed ground truth bias for the gyroscope measurements.
@@ -92,7 +92,7 @@ axs[2].plot(time, imu_px4["gyro_bias.z"], label="IMU Bias")
```

<p align="center">
<img src="https://decargroup.github.io/miluv/assets/imu/gyro_bias.png" alt="gyro_bias" width="400" class="center"/>
<img src="https://decargroup.github.io/miluv/assets/imu/gyro_bias.png" alt="gyro_bias" width="600" class="center"/>
</p>

Similarly, we plot the accelerometer measurements, the ground truth, the measurement error, and the computed ground truth bias as follows.
@@ -110,7 +110,7 @@ axs[2].plot(time, gt_accelerometer[2, :], label="Ground Truth")
```

<p align="center">
<img src="https://decargroup.github.io/miluv/assets/imu/accel.png" alt="accel" width="400" class="center"/>
<img src="https://decargroup.github.io/miluv/assets/imu/accel.png" alt="accel" width="600" class="center"/>
</p>

```py
@@ -126,5 +126,5 @@ axs[2].plot(time, imu_px4["accel_bias.z"], label="IMU Bias")
```

<p align="center">
<img src="https://decargroup.github.io/miluv/assets/imu/accel_bias.png" alt="accel_bias" width="400" class="center"/>
<img src="https://decargroup.github.io/miluv/assets/imu/accel_bias.png" alt="accel_bias" width="600" class="center"/>
</p>
