This repository has been archived by the owner on Jul 16, 2024. It is now read-only.

Commit 8f3f26d
Merge branch 'master' into patch-1
mdurrani808 authored Jan 1, 2024
2 parents 7019e39 + 85729b9
Showing 16 changed files with 356 additions and 87 deletions.
65 changes: 62 additions & 3 deletions source/docs/contributing/photonvision/build-instructions.rst
@@ -93,6 +93,32 @@ Running the following command under the root directory will build the jar under

``gradlew shadowJar``

Build and Run the Source on a Raspberry Pi Coprocessor
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

As a convenience, the build includes a built-in ``deploy`` command, which builds, deploys, and starts the current source code on a coprocessor.

An architecture override is required to specify the deploy target's architecture.

.. tab-set::

   .. tab-item:: Linux

      ``./gradlew clean``
      ``./gradlew deploy -PArchOverride=linuxarm64``

   .. tab-item:: macOS

      ``./gradlew clean``
      ``./gradlew deploy -PArchOverride=linuxarm64``

   .. tab-item:: Windows (cmd)

      ``gradlew clean``
      ``gradlew deploy -PArchOverride=linuxarm64``

The ``deploy`` command is tested against Raspberry Pi coprocessors. Other similar coprocessors may work too.

Using PhotonLib Builds
~~~~~~~~~~~~~~~~~~~~~~

@@ -139,10 +165,10 @@ After adding the generated vendordep to your project, add the following to your
}
- Debugging a local PhotonVision build
- ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ Debugging PhotonVision Running Locally
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

- One way is by running the program using gradle with the :code:`--debug-jvm` flag. Run the program with :code:`./gradlew run --debug-jvm`, and attach to it with VSCode by adding the following to launch.json. Note args can be passed with :code:`--args="foobar"`.
+ One way is by running the program using Gradle with the :code:`--debug-jvm` flag. Run the program with :code:`./gradlew run --debug-jvm`, and attach to it with VSCode by adding the following to :code:`launch.json`. Note that args can be passed with :code:`--args="foobar"`.

.. code-block::

@@ -165,6 +191,39 @@ One way is by running the program using gradle with the :code:`--debug-jvm` flag
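
For reference, a minimal :code:`launch.json` attach configuration for this local workflow might look like the following sketch. This assumes Gradle's default JDWP debug port of 5005; the configuration name is arbitrary.

.. code-block:: json

   {
       "version": "0.2.0",
       "configurations": [
           {
               "type": "java",
               "name": "Attach to local PhotonVision",
               "request": "attach",
               "hostName": "localhost",
               "port": 5005
           }
       ]
   }
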
PhotonVision can also be run using the Gradle tasks plugin with :code:`"args": "--debug-jvm"` added to :code:`launch.json`.


Debugging PhotonVision Running on a Coprocessor
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Set up a VSCode configuration in :code:`launch.json`:

.. code-block::

   {
       // Use IntelliSense to learn about possible attributes.
       // Hover to view descriptions of existing attributes.
       // For more information, visit: https://go.microsoft.com/fwlink/?linkid=830387
       "version": "0.2.0",
       "configurations": [
           {
               "type": "java",
               "name": "Attach to CoProcessor",
               "request": "attach",
               "hostName": "photonvision.local",
               "port": "5801",
               "projectName": "photon-core"
           }
       ]
   }
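
Note that :code:`hostName` and :code:`port` here must match the address the JVM's debug agent listens on, as set by the :code:`-agentlib` argument below.
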
Stop any existing instance of PhotonVision.

Launch the program with the following additional argument to the JVM: :code:`java -agentlib:jdwp=transport=dt_socket,server=y,suspend=y,address=photonvision.local:5801`
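
Put together, the full launch command on the coprocessor might look like the following sketch (the jar name :code:`photonvision.jar` is an assumption; substitute your actual jar):

.. code-block:: bash

   # JVM options must come before -jar; the jar name here is hypothetical.
   java -agentlib:jdwp=transport=dt_socket,server=y,suspend=y,address=photonvision.local:5801 -jar photonvision.jar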

Once the program says it is listening on port 5801, launch the debug configuration in VSCode.

The program will wait for the VSCode debugger to attach before proceeding.

Running examples
~~~~~~~~~~~~~~~~

19 changes: 0 additions & 19 deletions source/docs/examples/apriltag.rst

This file was deleted.

1 change: 0 additions & 1 deletion source/docs/examples/index.rst
@@ -7,6 +7,5 @@ Code Examples
aimingatatarget
gettinginrangeofthetarget
aimandrange
- apriltag
simaimandrange
simposeest
2 changes: 0 additions & 2 deletions source/docs/getting-started/april-tags.rst
@@ -27,5 +27,3 @@ Getting Started With AprilTags
3. Read the page on :ref:`Robot Integration Strategies with AprilTags<docs/integration/aprilTagStrategies:Simple Strategies>` for different approaches to using the data you get from AprilTags. These range from simply turning to the goal and getting the pose of the target, all the way to real-time, latency-compensated pose estimation.

4. Read the :ref:`PhotonLib documentation<docs/programming/photonlib/getting-target-data:Getting AprilTag Data From A Target>` on how to use AprilTag data in your code.

- 5. Read the :ref:`example code<docs/examples/apriltag:Knowledge and Equipment Needed>` on a fully featured example on different ways to use AprilTags.
2 changes: 1 addition & 1 deletion source/docs/getting-started/installation/networking.rst
@@ -7,7 +7,7 @@ Physical Networking

After imaging your coprocessor, run an ethernet cable from your coprocessor to a router/radio and power on your coprocessor by plugging it into the wall. Then connect whatever device you're using to view the webdashboard to the same network and navigate to photonvision.local:5800.

- PhotonVision *STRONGLY* recommends the usage of a network switch on your robot. This is because the second radio port on the current FRC radios is known to be buggy and cause frequent connection issues that are detrimental during competition. More information can be found in this `ChiefDelphi thread <https://www.chiefdelphi.com/t/why-you-probably-shouldnt-use-the-second-port-on-your-openmesh-om5p-radio-and-embrace-using-an-ethernet-switch-instead/406374>`_ and an in-depth guide on how to install a network switch can be found `on FRC 900's website <https://team900.org/blog/ZebraSwitch/>`_.
+ PhotonVision *STRONGLY* recommends the usage of a network switch on your robot. This is because the second radio port on the current FRC radios is known to be buggy and cause frequent connection issues that are detrimental during competition. An in-depth guide on how to install a network switch can be found `on FRC 900's website <https://team900.org/blog/ZebraSwitch/>`_.



4 changes: 2 additions & 2 deletions source/docs/getting-started/installation/wiring.rst
@@ -28,9 +28,9 @@ Coprocessor without Passive POE
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
1a. Option 1: Get a micro USB (may be USB-C if using a newer Pi) to USB-A cable and plug the USB A side into a regulator like `this <https://www.amazon.com/KNACRO-Voltage-Regulator-Converter-Module/dp/B01HM12N2C/ref=sr_1_2>`_. Then, wire the regulator into your PDP/PDB and the Micro USB / USB C into your coprocessor.

- 1b. Option 2: Use a USB power bank to power your coprocessor. There are rules that regulate the usage of power banks so ensure that you aren't breaking them, more information can be found `here <https://www.chiefdelphi.com/t/limelight-powered-by-external-battery/390710>`_.
+ 1b. Option 2: Use a USB power bank to power your coprocessor. Refer to this year's robot rulebook on legal implementations of this.

- 2. Run an ethernet cable from your Pi to your network switch / radio (we *STRONGLY* recommend the usage of a network switch, see the networking section for more info.)
+ 1. Run an ethernet cable from your Pi to your network switch / radio (we *STRONGLY* recommend the usage of a network switch, see the networking section for more info.)


------------------------------------------------------------
2 changes: 1 addition & 1 deletion source/docs/hardware/picamconfig.rst
@@ -52,4 +52,4 @@ Save the file, close the editor, and eject the drive. The boot configuration sho
Additional Information
----------------------

- See `the libcamera documentation <https://github.com/raspberrypi/documentation/blob/develop/documentation/asciidoc/computers/camera/libcamera_apps_getting_started.adoc>`_ for more details on configuring cameras.
+ See `the libcamera documentation <https://github.com/raspberrypi/documentation/blob/develop/documentation/asciidoc/computers/camera/rpicam_apps_getting_started.adoc>`_ for more details on configuring cameras.
8 changes: 4 additions & 4 deletions source/docs/hardware/supportedhardware.rst
@@ -20,13 +20,13 @@ Pi cameras are always recommended over USB cameras as they have lower latency an

.. note:: Note that there are many CSI based OV9281 cameras but this is the only one that has been tested by the development team.

- * `Arducam USB OV9281 Global Shutter Camera <https://www.amazon.com/Arducam-Distortion-Microphones-Computer-Raspberry/dp/B096M5DKY6>`_ (AprilTag Tracking)
+ * Arducam USB OV9281 Global Shutter Camera (AprilTag Tracking)

- * `720p ELP Camera <https://www.amazon.com/SVPRO-Camera-Module-100degree-Distortion/dp/B07C1KYBYC>`_ (Retroreflective Target Tracking)
+ * 720p ELP Camera (Retroreflective Target Tracking)

- * `Microsoft LifeCam HD-3000 <https://www.andymark.com/products/microsoft-lifecam-hd-3000-camera>`_ (Driver Camera)
+ * Microsoft LifeCam HD-3000 (Driver Camera)

- * `720p Fisheye ELP Camera <https://www.amazon.com/ELP-Camera-170degree-Megapixel-Security/dp/B00VTINRMK/>`_ (Driver Camera)
+ * 720p Fisheye ELP Camera (Driver Camera)

.. note:: If you test a camera and find that it works with PhotonVision, we encourage you to submit that camera to the performance matrix below.

(Three changed files, likely diagram assets, could not be displayed.)
1 change: 1 addition & 0 deletions source/docs/programming/photonlib/index.rst
@@ -11,4 +11,5 @@ PhotonLib: Robot Code Interface
driver-mode-pipeline-index
controlling-led
simulation
+ simulation-deprecated
hardware-in-the-loop-sim
4 changes: 2 additions & 2 deletions source/docs/programming/photonlib/robot-pose-estimator.rst
@@ -80,7 +80,7 @@ The PhotonPoseEstimator has a constructor that takes an ``AprilTagFieldLayout``

Using a ``PhotonPoseEstimator``
-------------------------------
- Calling ``update()`` on your ``PhotonPoseEstimator`` will return an ``EstimatedRobotPose``, which includes a ``Pose3d`` of the latest estimated pose (using the selected strategy) along with a ``double`` of the timestamp when the robot pose was estimated. You should be updating your `drivetrain pose estimator <https://docs.wpilib.org/en/latest/docs/software/advanced-controls/state-space/state-space-pose-estimators.html>`_ with the result from the ``PhotonPoseEstimator`` every loop using ``addVisionMeasurement()``. See our `code example <https://github.com/PhotonVision/photonvision/tree/master/photonlib-java-examples/apriltagExample>`_ for more.
+ Calling ``update()`` on your ``PhotonPoseEstimator`` will return an ``EstimatedRobotPose``, which includes a ``Pose3d`` of the latest estimated pose (using the selected strategy) along with a ``double`` of the timestamp when the robot pose was estimated. You should be updating your `drivetrain pose estimator <https://docs.wpilib.org/en/latest/docs/software/advanced-controls/state-space/state-space-pose-estimators.html>`_ with the result from the ``PhotonPoseEstimator`` every loop using ``addVisionMeasurement()``.

.. tab-set-code::

   .. rli:: https://raw.githubusercontent.com/PhotonVision/photonvision/357d8a518a93f7a1f8084a79449249e613b605a7/photonlib-java-examples/apriltagExample/src/main/java/frc/robot/PhotonCameraWrapper.java
@@ -102,7 +102,7 @@ Calling ``update()`` on your ``PhotonPoseEstimator`` will return an ``EstimatedR
}
}

- You should be updating your `drivetrain pose estimator <https://docs.wpilib.org/en/latest/docs/software/advanced-controls/state-space/state-space-pose-estimators.html>`_ with the result from the ``RobotPoseEstimator`` every loop using ``addVisionMeasurement()``. See our :ref:`code example <docs/examples/apriltag:knowledge and equipment needed>` for more.
+ You should be updating your `drivetrain pose estimator <https://docs.wpilib.org/en/latest/docs/software/advanced-controls/state-space/state-space-pose-estimators.html>`_ with the result from the ``RobotPoseEstimator`` every loop using ``addVisionMeasurement()``.
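
As a sketch of what this periodic update might look like (assuming the ``Optional``-returning ``update()`` of 2023-era PhotonLib, a ``PhotonPoseEstimator`` field named ``photonPoseEstimator``, and a WPILib drivetrain pose estimator field named ``poseEstimator``; both field names are hypothetical):

.. code-block:: java

   // Called every loop, e.g. from the drivetrain's periodic() method.
   Optional<EstimatedRobotPose> visionEst = photonPoseEstimator.update();
   visionEst.ifPresent(est ->
       // Fuse the vision estimate into the drivetrain pose estimator,
       // tagged with its capture timestamp for latency compensation.
       poseEstimator.addVisionMeasurement(
           est.estimatedPose.toPose2d(), est.timestampSeconds));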

Additional ``PhotonPoseEstimator`` Methods
------------------------------------------
94 changes: 94 additions & 0 deletions source/docs/programming/photonlib/simulation-deprecated.rst
@@ -0,0 +1,94 @@
Simulation Support in PhotonLib (Deprecated)
============================================

.. attention:: This page details the pre-2024 simulation support. For current Java simulation support, see :doc:`/docs/programming/photonlib/simulation`.

What Is Supported?
------------------

PhotonLib supports simulation of a camera and coprocessor running PhotonVision moving about a field on a robot.

You can use this to help validate your robot code's behavior in simulation without needing a physical robot.

Simulation Vision World Model
-----------------------------

Sim-specific classes are provided to model sending one frame of a camera image through PhotonVision. Based on what targets are visible, results are published to NetworkTables.

While processing, the given robot ``Pose2d`` is used to analyze which targets should be in view, and determine where they would have shown up in the camera image.

Targets are considered in view if:

1) Their centroid is within the field of view of the camera.
2) The camera is not in driver mode.
3) The target's in-image pixel size is greater than ``minTargetArea``.
4) The distance from the camera to the target is less than ``maxLEDRange``.

.. warning:: Not all NetworkTables objects are updated in simulation. The interaction through PhotonLib remains the same. Actual camera images are also not simulated.

Latency of processing is not yet modeled.

.. image:: diagrams/SimArchitecture-deprecated.drawio.svg
   :alt: A diagram comparing the architecture of a real PhotonVision process to a simulated one.

Simulated Vision System
-----------------------

A ``SimVisionSystem`` represents the camera and coprocessor running PhotonVision moving around on the field.

It requires a number of pieces of configuration to accurately simulate your physical setup. Match them to your configuration in PhotonVision, and to your robot's physical dimensions.

.. tab-set-code::

   .. rli:: https://raw.githubusercontent.com/PhotonVision/photonvision/80e16ece87c735e30755dea271a56a2ce217b588/photonlib-java-examples/simaimandrange/src/main/java/frc/robot/sim/DrivetrainSim.java
      :language: java
      :lines: 73-93

After declaring the system, you should create and add one ``SimVisionTarget`` per target you are attempting to detect.

.. tab-set-code::

   .. rli:: https://raw.githubusercontent.com/PhotonVision/photonvision/80e16ece87c735e30755dea271a56a2ce217b588/photonlib-java-examples/simaimandrange/src/main/java/frc/robot/sim/DrivetrainSim.java
      :language: java
      :lines: 95-111

Finally, while running the simulation, process simulated camera frames by providing the robot's pose to the system.

.. tab-set-code::

   .. rli:: https://raw.githubusercontent.com/PhotonVision/photonvision/80e16ece87c735e30755dea271a56a2ce217b588/photonlib-java-examples/simaimandrange/src/main/java/frc/robot/sim/DrivetrainSim.java
      :language: java
      :lines: 138-139
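
Concretely, this amounts to a call like the following (a sketch, assuming a ``SimVisionSystem`` field named ``simVision`` and a drivetrain simulation object exposing ``getPose()``; both names are hypothetical):

.. code-block:: java

   // Feed the current simulated robot pose to the vision system each sim step.
   simVision.processFrame(drivetrainSim.getPose());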

This will cause most NetworkTables fields to update properly, representing any targets that are in view of the robot.

Robot software which uses PhotonLib to interact with a camera running PhotonVision should work the same as though a real camera was hooked up and active.

Raw-Data Approach
-----------------

Users may wish to directly provide target information based on an existing detailed simulation.

A ``SimPhotonCamera`` can be created for this purpose. It provides an interface where the user can supply target data via a list of ``PhotonTrackedTarget`` objects.

.. tab-set-code::

   .. code-block:: java

      @Override
      public void simulationInit() {
          // ...
          cam = new SimPhotonCamera("MyCamera");
          // ...
      }

      @Override
      public void simulationPeriodic() {
          // ...
          ArrayList<PhotonTrackedTarget> visibleTgtList = new ArrayList<PhotonTrackedTarget>();
          visibleTgtList.add(new PhotonTrackedTarget(yawDegrees, pitchDegrees, area, skew, camToTargetTrans)); // Repeat for each target that you see
          cam.submitProcessedFrame(0.0, visibleTgtList);
          // ...
      }

Note that while this approach requires less code and configuration to get basic data into the simulation, it requires the user to implement much more code on their end to calculate the relative positions of the robot and target. If you already have this, the raw interface may be helpful. However, if you don't, you'll likely want to look at the Simulated Vision System first.
(The remaining changed files failed to load.)
