Closed: forhalle closed this issue 1 year ago.
Notes from today's meeting:
Longer engagement (probably needs to be shortened):
Make it easy to reset to initial state.
Improve UX for the Importer. (1 minute)
By default, suggest the correct apple-robot file.
Default spawning point(s).
Optimize the flow here.
Measure how much we can do in a short time; focus on adding the most interesting parts and fast-forward to the ready state. Determine the steps here (add vehicle control, wheel controllers, sensors, manipulator control components).
Create the launch file.
Prepare rviz2 config and map.
Determine controls, figure out the camera, bring gamepad(s)? (1.5 minutes)
Implement counting, scoring.
Set fixed spawning points, determine the limit for performance, figure out how to run ROS 2 stacks for each. (1 minute)
Shorter engagement (custom time)
Run the scripted picking in the background. Allow the user to take over at any point (and restore automation when they are finished). The user would manually control the robot, both the manipulator and the mobile base. We need a robust apple-picking script for this.
We can have counts of apples (manually picked by ROSCon visitors, automatically picked throughout the conference).
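The counting/scoring idea above could be as simple as two tallies. A minimal sketch in plain Python, purely illustrative (class and field names are my own, not from the demo code):

```python
# Sketch of the apple-count scoreboard idea: separate tallies for apples
# picked manually by booth visitors vs. automatically by the picking script.
# All names here are hypothetical placeholders.

class AppleScoreboard:
    def __init__(self):
        self.manual = 0     # picked by ROSCon visitors driving the robot
        self.automatic = 0  # picked by the background automation script

    def record_pick(self, by_visitor: bool) -> None:
        """Record one picked apple, attributed to a visitor or to automation."""
        if by_visitor:
            self.manual += 1
        else:
            self.automatic += 1

    @property
    def total(self) -> int:
        return self.manual + self.automatic

    def summary(self) -> str:
        """One-line status suitable for an on-screen overlay at the booth."""
        return f"Visitors: {self.manual} | Automation: {self.automatic} | Total: {self.total}"
```

A display loop at the booth could simply re-render `summary()` whenever a pick event arrives.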
These are ideas that could use plenty of brainstorming, please comment and contribute!
Notes:
Since we likely would not be able to get real yummy apples (@forhalle already checked), I suppose we could go for some apple-themed gadgets. My quick search (Google, I really want a fruit apple, not Steve's Apple) yielded this: https://www.bitsandpieces.com/product/shiny-3d-apple-puzzle
Would it be possible to get 20-30 like this, but with the O3DE logo? It does not have to be puzzles; some themed gadget as a prize for engaged users would be great. Another idea would be branded t-shirts with "Certified Apple-Kraken operator", apples, and the O3DE logo. Help me, I am not good at this :)
You have great ideas, @adamdbrw. We're excited to talk with you about this at our upcoming meeting. In the meantime, here is a cut/paste of the conversation I had with the venue about the real apples for your reference:
I believe we discussed that we will likely have two versions (one for each booth), variants of the longer and shorter engagement mentioned in this comment: https://github.com/aws-lumberyard/ROSConDemo/issues/3#issuecomment-1234459101 Details are still TBD.
Notes from today's meeting:
@adamdbrw - After much discussion, we have decided to omit the gamification requirement you previously (above) suggested from the apple-picking simulation, as it is not directly relevant to demonstrating O3DE for simulation, and we do not have resourcing available. We will instead focus on our stretch goal of simulating multiple robots through RoboMaker integration with AWS services. We can talk about this more at our meeting this week.
@spham-amzn has agreed to add some detail to this ticket regarding the final script. In the meantime, however, I'd expect the script to borrow the below items from your original script above:
"0" Initial state. Robot is already imported and simulation components are already set up....
"3" Run the simulation and the ROS 2 stack (launch file, including RViz2) (0.5 minute). Create the launch file.
"4" Explain what is happening with ROS2 stack.
"5" Set the navigation goal to the nearest apple tree using RViz2. Prepare rviz2 config and map.
"i" Watch the robot go, explain some things as it travels. This should only take up to 30 seconds.
"8" Add more robots - the user can use the spawner to scale up. Set fixed spawning points, determine limit for performance, figure out how to run ROS 2 stacks for each.
(1 minute)
"9" Ask for feedback, what would be useful to simulate, what the user would like to see if they were to use such a tool. (1 minute)
@spham-amzn - Here's a starting point: https://github.com/aws-lumberyard/ROSConDemo/wiki/Demo-Walkthrough
@forhalle @spham-amzn I understand the reasoning and we will refocus on the new goal. I guess the most important part for our work plan is whether manipulation is in or out of the demo scope (I cannot determine that from the comment). It is a big item that we can handle in several different ways:
I guess the main question is what we would want the robots to do besides being there and moving around the orchard. I believe we can certainly do better than 1.i. I would suggest going with these points as stretch goals, in this order: 1.ii -> 2.i -> 2.ii -> 3. We need to ensure other goals are reached first (e.g. a stable simulation and a working live demo including all the points you mentioned, such as navigation and scaling up).
I suppose we could look at manipulation in the following way: 1.ii is the minimal goal (no manipulation, but the looks are there), 2.ii is our target, and 3 is a stretch goal.
Let me know what you think.
Hi @adamdbrw - We agree with the prioritization you mention above. Really hoping we can get to 2.ii.
First draft of the script for the AWS demonstration:
1. Initiate the ros2/humble environment
2. Launch O3DE / Apple Orchard Level on Desktop
3. Spawn a robot onto the Orchard Level
4. Launch the Navigation stack for the new robot
5. Set a navigation goal for the robot to an apple tree
6. Watch the robot navigate to the tree
7. When the robot arrives at the tree, initiate the manipulator script to start apple picking
8. Explain during this process the sensors involved
9. While collecting apples, spawn another robot on the apple orchard
10. Repeat steps 5-7
11. Ask for questions/feedback
I think the above script aligns with what we discussed, with the addition of having buttons to 'start' the demo.
Additional: text overlay displays status of robot
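The "spawn a robot" steps above, combined with the earlier note about fixed spawning points and per-robot ROS 2 stacks, could be tracked with some simple bookkeeping: each new robot gets one of a fixed list of spawn poses and a unique namespace, up to a performance cap. A plain-Python sketch (all values hypothetical; the real limit would be measured on demo hardware):

```python
# Sketch of fixed spawn-point bookkeeping for multiple robots. The poses,
# the namespace scheme, and the cap of 4 are illustrative assumptions only.

MAX_ROBOTS = 4  # placeholder performance limit, to be measured in practice

# Fixed spawn poses in the orchard as (x, y, yaw) tuples
SPAWN_POINTS = [
    (0.0, 0.0, 0.0),
    (5.0, 0.0, 1.57),
    (0.0, 5.0, 3.14),
    (5.0, 5.0, -1.57),
]


class RobotSpawner:
    """Hands out spawn poses and unique namespaces, one per robot."""

    def __init__(self):
        self._spawned = []

    def spawn(self):
        if len(self._spawned) >= MAX_ROBOTS:
            raise RuntimeError("performance limit reached; no free spawn points")
        index = len(self._spawned)
        # A unique namespace lets each robot run its own navigation stack
        # without topic or TF collisions.
        namespace = f"apple_kraken_{index}"
        pose = SPAWN_POINTS[index]
        self._spawned.append(namespace)
        return namespace, pose
```

In the actual demo the returned namespace would be passed to the robot's launch invocation so its navigation stack stays isolated from the others.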
Comments:
In addition, after conversations with the RoboMaker team: RoboMaker is designed to scale up simulations, but with one robot app per simulation app at a time. It is not designed to spin up multiple robot applications that interact with a single simulation, so using RoboMaker to highlight that type of scalability isn't appropriate. We can still spawn additional robots and navigation stacks within the same robot application in RoboMaker, but not in a scalable way.
To support manually executing a live demo inside the ROSCon booths, document each step of the user story to be told through the demo, including the estimated amount of time dedicated to each step.
For example:
0:00 - 2:00: Import robot into software
2:01 - 2:30: Insert image recognition software
2:31 - 5:00: Manually drive robot, identify fruit, move the robot arm to pick the fruit, and place fruit in the vehicle's container
etc.
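While drafting the script, the time ranges could be generated from a list of (step, duration) pairs so the timeline stays consistent when steps are reordered or retimed. A small plain-Python sketch (the step names and durations below just mirror the example in this ticket):

```python
# Turns per-step durations (in seconds) into "M:SS - M:SS" lines like the
# example above. Helper names are my own, not part of any existing tooling.

def fmt(seconds: int) -> str:
    """Format a second count as M:SS."""
    return f"{seconds // 60}:{seconds % 60:02d}"


def build_schedule(steps):
    """steps: list of (description, duration_seconds) -> formatted lines."""
    lines = []
    start = 0
    for description, duration in steps:
        end = start + duration
        lines.append(f"{fmt(start)} - {fmt(end)}: {description}")
        start = end + 1  # next range starts one second later, as in "2:01"
    return lines
```

For instance, `build_schedule([("Import robot into software", 120), ("Insert image recognition software", 29)])` reproduces the "0:00 - 2:00" and "2:01 - 2:30" ranges from the example.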
Acceptance Criteria: