bulletphysics / bullet3

Bullet Physics SDK: real-time collision detection and multi-physics simulation for VR, games, visual effects, robotics, machine learning etc.
http://bulletphysics.org

How long before we should expect Bullet 3? In time for Blender 2.8? #632

Closed. BluePrintRandom closed this issue 5 years ago.

BluePrintRandom commented 8 years ago

I would love to have Bullet 3 inside the BGE in Blender 2.8...

How far off is it?

BluePrintRandom commented 8 years ago

Bump_Bump

benelot commented 8 years ago

If you mean the GPU physics pipeline, I think the priorities have changed. It is in an experimental stage and you can test it.

BluePrintRandom commented 8 years ago

If we could have GPU-based A* + GPU-based AI + GPU physics ragdoll simulations + GPU armature skinning...

you could have huge mobs of enemies without any drain from CPU -> GPU communication.

(Think a herd of zombies.)

Is this possible with Bullet 3 + OpenCL?

'Bullet walking ragdoll crowd simulations'?

benelot commented 8 years ago

Well, first of all, your GPU has to switch between each of these usages, so either you have lots of GPU cards and use them all at the same time, or it would not work. Second, anything with dependencies inside the calculation causes large issues on a GPU. A* might be impossible to do fully in parallel, except that you can expand the search one step further everywhere at the same time. GPU-based AI sounds unnecessary given the current state of game AI; once we have trained neural networks that play, we can talk about this again. The GPU physics works, except for large piles of objects (so-called islands), which again cause dependencies.

All in all: we are not there yet, from my point of view.
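
To make the A* point concrete: the only step that parallelizes cleanly is expanding the whole current frontier at once (level-synchronous search), which is really breadth-first search rather than the strictly ordered best-first expansion of sequential A*. A minimal sketch in plain Python, for illustration only (this is not Bullet or OpenCL code; the grid and function names are made up):

```python
# Minimal sketch of level-synchronous (frontier-at-a-time) search on a grid.
# Each iteration expands every node in the current frontier "at the same time",
# which is the only part that maps naturally onto a GPU; the strict best-first
# ordering of sequential A* is lost.

def frontier_search(start, goal, passable, width, height):
    dist = {start: 0}
    frontier = {start}
    step = 0
    while frontier and goal not in dist:
        step += 1
        next_frontier = set()
        # On a GPU, this inner loop would be one thread per frontier node.
        for (x, y) in frontier:
            for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
                n = (nx, ny)
                if 0 <= nx < width and 0 <= ny < height \
                        and passable(n) and n not in dist:
                    dist[n] = step
                    next_frontier.add(n)
        frontier = next_frontier
    return dist.get(goal)  # None if unreachable

if __name__ == "__main__":
    # 10x10 open grid; the corner-to-corner distance comes out to 18 steps.
    print(frontier_search((0, 0), (9, 9), lambda n: True, 10, 10))
```

Reintroducing the priority ordering of A* would reintroduce exactly the kind of dependency the comment above warns about.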

BluePrintRandom commented 8 years ago

The idea behind a GPU AI was to avoid cycling data between GPU and CPU, and to go in an order where the GPU already had the info it depended on from the previous calculation,

like:

- physics
- AI
- apply forces from the AI for the next frame
- skin ragdolls and draw the world

but these would all need to complete before it's time to draw the next frame...

(a fully GPU-bound game loop)

But I guess a player would have no ability to input anything...
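
For reference, the loop being proposed chains every stage's output into the next stage without leaving the GPU, which is why player input becomes the sticking point. A rough, runnable sketch of that ordering (every stage function below is a hypothetical stand-in for a GPU kernel launch, not a Bullet or OpenCL API):

```python
# Sketch of the fully GPU-bound frame described above. Each stage function is a
# hypothetical stand-in for a GPU kernel launch (not a Bullet or OpenCL API);
# the stubs only print their step so the dependency chain is visible.

def upload_input(state):
    print("upload player input (the one unavoidable CPU -> GPU copy)")

def step_physics(state):
    print("step physics / ragdolls on GPU-resident bodies")

def run_ai(state):
    print("run AI against the physics results")

def apply_ai_forces(state):
    print("apply AI forces for the next physics step")

def skin_ragdolls(state):
    print("skin armatures from the ragdoll poses")

def draw_world(state):
    print("draw the world from GPU-resident buffers")

def run_frame(state):
    # Every stage depends on the previous one, and all of them must finish
    # before the next frame can be drawn.
    for stage in (upload_input, step_physics, run_ai,
                  apply_ai_forces, skin_ragdolls, draw_world):
        stage(state)

if __name__ == "__main__":
    run_frame(state={})
```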

erwincoumans commented 5 years ago

There hasn't been any active OpenCL development in years, let's close this.

tlf30 commented 5 years ago

@erwincoumans is there a reason OpenCL support has basically been dropped? It was a feature that I believe many developers were looking forward to.

erwincoumans commented 5 years ago

Sure, there are no contributions to OpenCL support. My focus is on robotics and reinforcement learning these days, in the form of PyBullet. Since it is unlikely anyone is going to have the time and resources for OpenCL, let's close this issue for now. If a developer shows up we can re-open it.
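
For anyone finding this issue today, the actively maintained path is the CPU-side simulation exposed through PyBullet rather than the OpenCL pipeline. A minimal sketch, assuming `pip install pybullet` (the URDF files shown ship with the bundled pybullet_data package):

```python
# Minimal PyBullet example: CPU rigid-body simulation, no OpenCL involved.
import pybullet as p
import pybullet_data

p.connect(p.DIRECT)                    # headless; use p.GUI for a window
p.setAdditionalSearchPath(pybullet_data.getDataPath())
p.setGravity(0, 0, -9.81)

p.loadURDF("plane.urdf")
robot = p.loadURDF("r2d2.urdf", basePosition=[0, 0, 1])

for _ in range(240):                   # default timestep is 1/240 s, so ~1 s
    p.stepSimulation()

print(p.getBasePositionAndOrientation(robot))
p.disconnect()
```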