
Spot Command References

Kaiyu Zheng edited this page Sep 11, 2022 · 17 revisions

This single Wiki page collects, in one place, the different things you can do with Spot. Much of it comes from spot/docs; the hope is that you can look here instead of digging through the docs.

Publish Robot State (TF and URDF)

roslaunch rbd_spot_robot state_publisher.launch

GraphNav Mapping

This command (1) starts the record-map command-line interface from the Spot SDK examples, (2) saves the map to the conventional location rbd_spot_perception/maps/bosdyn, and (3) asks whether you'd like to visualize the map. To run it:

rosrun rbd_spot_perception graphnav_mapper.sh <map_name>

Typical procedure:

  1. Press 1 to "Start recording a map"

  2. Use the controller to drive the robot around. Driving the robot in loops is preferred.

  3. Press 2 to "Stop recording a map"

  4. Press 9 to "Automatically find and close loops" (IMPORTANT: this performs loop closure)

  5. Press a to "Optimize the map's anchoring"

  6. Press 5 to save the map (it will be saved into a folder called 'downloaded_graph'). DO THIS, otherwise the map is not saved.
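After step 6, you can sanity-check that the map actually landed on disk. A saved GraphNav map (per the layout used by the Spot SDK's graph_nav_command_line example; treat this layout as an assumption) contains a serialized `graph` file plus `waypoint_snapshots/` and `edge_snapshots/` directories. A minimal stdlib sketch of that check (the helper name is hypothetical):

```python
import os

# Expected layout of a saved GraphNav map (assumption, based on the
# Spot SDK's graph_nav_command_line example):
#   <map_dir>/graph                  -- the serialized graph
#   <map_dir>/waypoint_snapshots/    -- sensor data per waypoint
#   <map_dir>/edge_snapshots/        -- sensor data per edge
def missing_map_parts(map_dir):
    """Return the list of expected map pieces missing from map_dir."""
    expected = ["graph", "waypoint_snapshots", "edge_snapshots"]
    return [name for name in expected
            if not os.path.exists(os.path.join(map_dir, name))]

# Example (path is illustrative; use your actual map directory):
# print(missing_map_parts("rbd_spot_perception/maps/bosdyn/my_map/downloaded_graph"))
```

An empty result means all three pieces are present.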

Caveat: the fixed frame of a GraphNav map is named graphnav_map. It is the same frame as the "seed frame" in Spot SDK terminology.

GraphNav Localization

The launch file below:

  • publishes the map as point cloud;
  • publishes waypoints;
  • publishes robot state as TF transforms with URDF;
  • starts a graphnav pose streamer for body localization.

roslaunch rbd_spot_perception graphnav_map_publisher_with_localization.launch map_name:=<map_name>

For visualization, run:

roslaunch rbd_spot_perception view_graphnav_point_cloud_with_localization.launch

GraphNav Navigation

rosrun rbd_spot_action graphnav_nav.py [options]

Run it with -h to see the options. Note that the first time you run this, you need to make sure a GraphNav graph is uploaded to the robot; you do that by specifying the --map-name parameter. For example: rosrun rbd_spot_action graphnav_nav.py [options] --map-name lab121_no_lidar, where lab121_no_lidar is a map name and a directory of the same name exists under rbd_spot_perception/maps.

There are a few ways to run this script:

  • List waypoints: rosrun rbd_spot_action graphnav_nav.py --list
  • Print robot current pose in seed frame: rosrun rbd_spot_action graphnav_nav.py (no argument)
  • Navigate to waypoint: rosrun rbd_spot_action graphnav_nav.py --waypoint tb --take (Here, --take means to take over the lease for the duration of navigation execution, and tb is a shortcode for a waypoint).
  • Navigate to pose: rosrun rbd_spot_action graphnav_nav.py --pose 0.48 -2.61 1.57 --take (Here, the pose is specified by x y yaw where yaw is in radians)
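For the --pose option, the yaw given on the command line must eventually become a rotation (e.g. a quaternion) in whatever pose message is sent to GraphNav. A minimal sketch of that conversion (illustrative only; see the script itself for how the actual request is built):

```python
import math

def yaw_to_quaternion(yaw):
    """Convert a yaw angle (radians, rotation about +z) to a
    quaternion in (w, x, y, z) order."""
    half = yaw / 2.0
    return (math.cos(half), 0.0, 0.0, math.sin(half))

# Example: the sample goal `--pose 0.48 -2.61 1.57` from above.
x, y, yaw = 0.48, -2.61, 1.57
w, qx, qy, qz = yaw_to_quaternion(yaw)
```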

Note that the pose is in the seed frame, which is the same as the graphnav_map frame. The axes of the seed frame differ in direction from Spot's own frames; you can visualize the frames in RViz to see this.

Take a look at the implementation of the above script to see how the navigation request is made.
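On the --waypoint shortcode (like tb above): one plausible way such codes resolve is by unique matching against the waypoint annotation names in the uploaded graph. A hypothetical stdlib sketch of that idea (the script's actual resolution logic may differ):

```python
def resolve_shortcode(code, waypoints):
    """Resolve a short code against waypoint annotation names.

    waypoints: dict mapping waypoint id -> annotation name.
    Returns the unique matching waypoint id, or None if the code
    matches no waypoint or more than one (ambiguous).
    """
    matches = [wp_id for wp_id, name in waypoints.items() if code in name]
    return matches[0] if len(matches) == 1 else None

# Example with made-up waypoint ids and names:
wps = {"wp-001": "table_b", "wp-002": "door_front", "wp-003": "door_back"}
```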

View Map

Once you add the following line to your .bashrc:

alias viewmap='python ~/repo/robotdev/spot/spot-sdk/python/examples/graph_nav_view_map/view_map.py'

you can view a map with:

viewmap <path/to/map/directory>

Convenient aliases in .bashrc

alias viewmap='python ~/repo/robotdev/spot/spot-sdk/python/examples/graph_nav_view_map/view_map.py'
alias gnavmap='cd ~/repo/robotdev/spot/spot-sdk/python/examples/graph_nav_command_line'
alias dospot="cd ~/repo/robotdev && source setup_spot.bash"

Object Segmentation (using Mask RCNN)

First, start streaming the segmentation. In this case, we'd like to use the gripper (hand) camera:

./stream_segmentation.py --camera hand --pub

When it's running, it publishes both the segmentation result image and a point cloud of the detected objects. You can inspect the result image with rosrun rqt_image_view rqt_image_view.
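Conceptually, a per-object point cloud like this can be obtained by back-projecting the depth pixels that fall inside each segmentation mask through a pinhole camera model. A minimal stdlib sketch of that idea (illustrative; not the actual code of stream_segmentation.py):

```python
def backproject(mask, depth, fx, fy, cx, cy):
    """Back-project masked depth pixels to 3D points in the camera frame.

    mask, depth: 2D lists of equal shape (mask is boolean, depth in meters);
    fx, fy, cx, cy: pinhole intrinsics (focal lengths and principal point).
    Returns a list of (X, Y, Z) tuples, one per masked pixel with valid depth.
    """
    points = []
    for v, row in enumerate(depth):
        for u, z in enumerate(row):
            if mask[v][u] and z > 0:
                # Standard pinhole back-projection.
                points.append(((u - cx) * z / fx, (v - cy) * z / fy, z))
    return points
```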

Fiducial marker detection and streaming

rosrun rbd_spot_perception stream_fiducial_markers.py

See video: https://www.youtube.com/watch?v=3irnPSBwQ6c&feature=emb_title