o3de / ROSConDemo

A robotic fruit picking demo project for O3DE with ROS 2 Gem

AWS Live Demo - Document Script (demo steps + talking points) #3

Closed: forhalle closed this issue 1 year ago

forhalle commented 2 years ago

To support manually executing a live demo inside the ROSCon booths, document each step of the user story to be told through the demo, including the estimated amount of time dedicated to each step.

For example:

- 0:00 - 2:00: Import robot into software
- 2:01 - 2:30: Insert image recognition software
- 2:31 - 5:00: Manually drive robot, identify fruit, move the robot arm to pick the fruit, and place fruit in the vehicle's container
- etc.

Acceptance Criteria:

forhalle commented 2 years ago

Notes from today's meeting:

forhalle commented 2 years ago

Notes from today's meeting:

adamdbrw commented 2 years ago

Longer engagement (probably needs to be shortened):

  1. Initial state. The demo operator has to reset to this after each user engagement. Make it easy to reset to initial state.
    1. The scene is ready with the default camera view (facing one of the rows as seen from the dirt road), no robot yet.
    2. The simulation is not running - we are in the Editor mode.
    3. Robot prefab is not loaded / URDF is not imported.
  2. Import the robot using URDF Importer. Improve UX for the Importer. (1 minute)
    1. Select the Importer from the menu, open the tool.
    2. Select the URDF. By default, suggest the correct apple-robot file.
    3. The robot by default should appear in our view oriented towards the apple row entrance. Default spawning point(s).
  3. Add required simulation components (perhaps as prefabs to save time) (2-3 minutes). Optimize the flow here.
    1. Talk about what needs to be added (since it is not a part of URDF)
    2. If it takes too much time at any point, instead go ahead and select a ready prefab (apple-kraken_ready.prefab) and summarize how it was created from the imported one.
    3. Follow up with adding other items. Measure how much we can do in short time, focus on adding the most interesting parts and fast forward to the ready one. Determine steps here (add vehicle control, wheel controllers, sensors, manipulator control components).
  4. Run the simulation and the ROS 2 stack (launch file, including RViz2) (0.5 minute). Create the launch file (see the sketch after this list).
    1. Explain what is happening with the ROS 2 stack.
  5. Set the navigation goal to the nearest apple tree using RViz2. Prepare rviz2 config and map.
    1. Watch the robot go, explain some things as it travels. This should only take up to 30 seconds.
  6. Give the user manual control over the manipulator, explain the controls, and let them pick apples. Determine controls, figure out camera, bring gamepad(s)? (1.5 minutes)
    1. We can gamify it - make the picked apple count appear in the simulation, with limited time. Implement counting and scoring.
      1. (Optionally) display the best human score
  7. Have the user start the script of automated picking (1 minute)
    1. The robot moves to the next tree.
    2. The scripted picking starts (it should be substantially faster than the user)
    3. Explain what is happening and that we are using ground truth here, but this could also be plugged into a ROS 2 detection package.
  8. Add more robots - the user can use the spawner to scale up. Set fixed spawning points, determine the limit for performance, figure out how to run ROS 2 stacks for each. (1 minute)
  9. Ask for feedback, what would be useful to simulate, what the user would like to see if they were to use such a tool. (1 minute)
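
For the launch file in item 4, here is a minimal sketch of what a ROS 2 Python launch file bringing up RViz2 alongside a navigation stack could look like. The launch paths and the `.rviz` config below are placeholders for illustration, not files that exist in this repo:

```python
# Minimal launch file sketch (item 4). All paths below are placeholders,
# not actual files from this project.
from launch import LaunchDescription
from launch.actions import IncludeLaunchDescription
from launch.launch_description_sources import PythonLaunchDescriptionSource
from launch_ros.actions import Node


def generate_launch_description():
    return LaunchDescription([
        # Bring up the navigation stack, e.g. by reusing a Nav2 bringup launch file.
        IncludeLaunchDescription(
            PythonLaunchDescriptionSource(
                '/path/to/nav2_bringup/launch/navigation_launch.py'),  # placeholder path
        ),
        # Start RViz2 with a config prepared for the demo.
        Node(
            package='rviz2',
            executable='rviz2',
            name='rviz2',
            arguments=['-d', '/path/to/apple_kraken.rviz'],  # placeholder config
        ),
    ])
```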

Shorter engagement (custom time)

Run the scripted picking in the background. Allow the user to take over at any point (and restore automation when they are finished). The user would manually control the robot, both the manipulator and the mobile base. We need a robust apple picking script for this. We can keep counts of apples (picked manually by ROSCon visitors, and picked automatically throughout the conference) - see the counter sketch below.
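
To make the apple-count idea concrete, here is a minimal rclpy sketch of a counter node. The `/apple_picked` and `/apple_count` topic names are hypothetical, chosen only for illustration:

```python
# Sketch of a picked-apple counter; the topic names are hypothetical.
import rclpy
from rclpy.node import Node
from std_msgs.msg import Empty, Int32


class AppleCounter(Node):
    def __init__(self):
        super().__init__('apple_counter')
        self.count = 0
        # Increment on every picking event and republish the running total.
        self.sub = self.create_subscription(
            Empty, '/apple_picked', self.on_picked, 10)
        self.pub = self.create_publisher(Int32, '/apple_count', 10)

    def on_picked(self, _msg):
        self.count += 1
        self.pub.publish(Int32(data=self.count))


def main():
    rclpy.init()
    rclpy.spin(AppleCounter())


if __name__ == '__main__':
    main()
```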

These are ideas that could use plenty of brainstorming, please comment and contribute!

Notes:

adamdbrw commented 2 years ago

Since we likely would not be able to get real yummy apples (@forhalle already checked), I suppose we could go for some apple-themed gadgets. My quick search (Google, I really want a fruit apple, not Steve's Apple) yielded this: https://www.bitsandpieces.com/product/shiny-3d-apple-puzzle

Would it be possible to get 20-30 like this, but with the O3DE logo? It does not have to be puzzles, but some themed gadget as a prize for engaged users would be great. Another idea would be branded t-shirts with "Certified Apple-Kraken operator", apples, and the O3DE logo. Help me, I am not good at this :)

forhalle commented 2 years ago

You have great ideas, @adamdbrw. We're excited to talk with you about this at our upcoming meeting. In the meantime, here is a cut/paste of the conversation I had with the venue about the real apples for your reference:

apple conversation.pdf

adamdbrw commented 1 year ago

I believe we discussed that we will likely have two versions (one for each booth), variants of the longer and shorter engagement as mentioned in this comment: https://github.com/aws-lumberyard/ROSConDemo/issues/3#issuecomment-1234459101. Details are still TBD.

forhalle commented 1 year ago

Notes from today's meeting:

forhalle commented 1 year ago

@adamdbrw - After much discussion, we have decided to omit the gamification requirement you previously (above) suggested from the apple-picking simulation, as it is not directly relevant to demonstrating O3DE for simulation, and we do not have resourcing available. We will instead focus on our stretch goal of simulating multiple robots through RoboMaker integration with AWS services. We can talk about this more at our meeting this week.

@spham-amzn has agreed to add some detail to this ticket regarding the final script. In the meantime, however, I'd expect the script to borrow the below items from your original script above:

"0" Initial state. Robot is already imported and simulation components are already set up....

"3" Run the simulation and the ROS 2 stack (launch file, including RViz2) (0.5 minute). Create the launch file. "4" Explain what is happening with ROS2 stack. "5" Set the navigation goal to the nearest apple tree using RViz2. Prepare rviz2 config and map. "i" Watch the robot go, explain some things as it travels. This should only take up to 30 seconds.

"8" Add more robots - the user can use the spawner to scale up. Set fixed spawning points, determine limit for performance, figure out how to run ROS 2 stacks for each.(1 minute) "9" Ask for feedback, what would be useful to simulate, what the user would like to see if they were to use such a tool. (1 minute)

forhalle commented 1 year ago

@spham-amzn - Here's a starting point: https://github.com/aws-lumberyard/ROSConDemo/wiki/Demo-Walkthrough

adamdbrw commented 1 year ago

@forhalle @spham-amzn I understand the reasoning and we will refocus on the new goal. I guess the most important part for our work plan is whether manipulation is in or out of the demo scope (I cannot determine that from the comment). It is a big item that we can handle in several different ways:

  1. No manipulation. The robot will not be able to pick apples. Two options here:
    1. Remove the manipulator frame and the manipulator altogether
    2. Add visual elements and polish the looks. We could say this is a work in progress.
  2. Implement scripts for manipulator handling, but do not implement any components. This aims for a better result, since the scene is an apple orchard and there would not be an obvious thing missing. But there are picking challenges to solve.
    1. Do not implement orchestration; instead use simple input control. Kraken can only pick apples manually (see the sketch below).
    2. Implement picking orchestration using these scripts - Kraken can automatically pick apples. Optionally: also implement manual control (not a big item).
  3. Implement basic components and use them for scripting. We can claim support for the manipulation feature in the Gem (first version). Users interested in manipulation might be more attracted. On top of the point 2 items, we add components and use them in the model.

I guess the main question is what we would want robots to do except be there and move around the orchard. I believe we certainly can do better than 1.i. I would suggest going with these points as stretch goals in this order: 1.ii -> 2.i -> 2.ii -> 3. We need to ensure other goals are reached (e.g. we have a stable simulation and a working live demo including all the points you mentioned, such as navigation and scaling up).

I suppose we could look at manipulation in the following way: With 1.ii being minimal goal (no manipulation, but looks are there), 2.ii being our target, and 3 as a stretch goal.

Let me know what you think.
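
As a rough illustration of option 2.i (simple input control, no dedicated components), here is a sketch that publishes joint commands, assuming the manipulator exposes a standard `JointTrajectory` command topic. The topic and joint names are hypothetical:

```python
# Sketch of manual manipulator control (option 2.i). The command topic and
# joint names are hypothetical, not the Gem's confirmed interface.
import rclpy
from rclpy.node import Node
from trajectory_msgs.msg import JointTrajectory, JointTrajectoryPoint
from builtin_interfaces.msg import Duration


class ManualArmControl(Node):
    def __init__(self):
        super().__init__('manual_arm_control')
        self.pub = self.create_publisher(
            JointTrajectory, '/manipulator/joint_trajectory', 10)

    def send_pose(self, positions, seconds=1):
        # Command the arm to reach the given joint positions within `seconds`.
        msg = JointTrajectory()
        msg.joint_names = ['arm_joint_1', 'arm_joint_2', 'gripper_joint']  # hypothetical
        point = JointTrajectoryPoint()
        point.positions = positions
        point.time_from_start = Duration(sec=seconds)
        msg.points = [point]
        self.pub.publish(msg)
```

Input handling (gamepad or keyboard) would then map button presses to `send_pose` calls; that mapping is the part we would still need to design.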

forhalle commented 1 year ago

Hi @adamdbrw - We agree with the prioritization you mention above. Really hoping we can get to 2.ii.

spham-amzn commented 1 year ago

First draft of the script for the AWS demonstration:

  1. Initiate the ROS 2 Humble environment

  2. Launch O3DE / Apple Orchard Level on Desktop

  3. Spawn a robot onto the Orchard Level

  4. Launch Navigation stack for new robot

  5. Set a navigation goal for the robot to an apple tree (see the sketch after this list)

  6. Watch the robot navigate to the tree

  7. When the robot arrives at the tree, initiate the manipulator script to start apple picking

  8. Explain during this process the sensors involved

  9. While collecting apples, spawn another robot on the apple orchard

  10. Repeat steps 5-7

  11. Ask for questions/feedback
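
For steps 5-6, here is a sketch of sending the navigation goal programmatically through the standard Nav2 `NavigateToPose` action (the same mechanism RViz2's goal tool uses); the apple-tree coordinates are placeholders:

```python
# Sketch of setting a navigation goal via the standard Nav2 action interface.
# The goal coordinates below are placeholders for an apple tree's position.
import rclpy
from rclpy.action import ActionClient
from rclpy.node import Node
from nav2_msgs.action import NavigateToPose


def main():
    rclpy.init()
    node = Node('goal_sender')
    client = ActionClient(node, NavigateToPose, 'navigate_to_pose')
    client.wait_for_server()
    goal = NavigateToPose.Goal()
    goal.pose.header.frame_id = 'map'
    goal.pose.pose.position.x = 10.0  # placeholder apple-tree coordinates
    goal.pose.pose.position.y = 4.0
    goal.pose.pose.orientation.w = 1.0
    client.send_goal_async(goal)
    rclpy.spin(node)


if __name__ == '__main__':
    main()
```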

SLeibrick commented 1 year ago

I think the above script aligns with what we discussed, with the addition of having buttons to 'start' the demo.

SLeibrick commented 1 year ago
  1. Spawn robot 1 from the console using a service call (see the sketch below); we look from the camera viewpoint of that robot
  2. We direct the robot to go to a particular apple tree using the console screen by running the navigation stack and then using RViz2 to set the goal
  3. When we are in position and immobile, click a button to begin and trigger the apple picking. Starting pose and camera perspective begin from a specific default location.
  4. Invite the user to move the robot to a new tree and repeat apple picking
  5. While the robot is moving, talk about scaling up for multi-robot simulation and using RoboMaker to spawn different instances
  6. Spawn a second robot, automatically switch to the camera view showing the new robot in the scene. Spawning points are defined in the scene for multiple robots. Some scripted behaviors with specific predetermined goals.

Additional: text overlay displays status of robot
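
A sketch of step 1 (spawning a robot through a service call), assuming the simulation exposes a `SpawnEntity`-style service as in Gazebo; the service name, request fields, and robot name here are assumptions, not the Gem's confirmed interface:

```python
# Sketch of spawning a robot via a service call. The /spawn_entity service
# and the SpawnEntity type are assumptions borrowed from Gazebo's interface.
import rclpy
from rclpy.node import Node
from gazebo_msgs.srv import SpawnEntity


def main():
    rclpy.init()
    node = Node('robot_spawner')
    client = node.create_client(SpawnEntity, '/spawn_entity')
    client.wait_for_service()
    request = SpawnEntity.Request()
    request.name = 'apple_kraken_1'             # hypothetical robot name
    request.robot_namespace = 'apple_kraken_1'  # namespace for its ROS 2 stack
    future = client.call_async(request)
    rclpy.spin_until_future_complete(node, future)
    node.get_logger().info(f'Spawn result: {future.result().status_message}')


if __name__ == '__main__':
    main()
```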

spham-amzn commented 1 year ago

Comments:

In addition, after conversations with the RoboMaker team: RoboMaker is designed to scale up simulations, but with one robot application per simulation application at a time. It is not designed to spin up multiple robot applications that interact with a single simulation, so using RoboMaker to highlight that type of scalability isn't appropriate. We can still spawn additional robots and navigation stacks in the same robot application in RoboMaker, but not in a scalable way.

SLeibrick commented 1 year ago

https://github.com/aws-lumberyard/ROSConDemo/wiki/Demo-Script