Closed DaebangStn closed 1 week ago
Hi DaebangStn, I think the remote viewer feature would fit your use case best here. It works as follows:
1. Configure the viewer to be a server:

```python
from aitviewer.viewer import Viewer
from aitviewer.configuration import CONFIG as VC

# Enable the built-in server before creating the viewer.
VC.update_conf({"server_enabled": True})

viewer = Viewer(size=(900, 600))
viewer.run()
```
The viewer now runs a server thread internally and listens for incoming connections. Clients can connect and send data, which is then visualized in the viewer. Connections can come from localhost or from a remote machine (e.g. the one where your neural network code is executing).
2. On the client side, connect to the server. Establish a connection to the viewer (the server):

```python
from aitviewer.remote.viewer import RemoteViewer

viz = RemoteViewer("localhost")  # Or the actual IP if the viewer runs on a remote machine.
```
3. Send data to be visualized. Now you instantiate renderables with the prefix `Remote` instead of the usual ones:

```python
import numpy as np

from aitviewer.remote.renderables.meshes import RemoteMeshes

smpl_output = ...  # smpl model output
m1 = RemoteMeshes(
    viz,
    smpl_output.vertices.detach().cpu().numpy()[0],
    smpl_model.faces.astype(np.int64),
    name=f"frame {frame_id}",
)
```
You can find all renderables that we support with a respective "remote" variant here: https://github.com/eth-ait/aitviewer/tree/main/aitviewer/remote/renderables
This should already create the mesh in the viewer. As long as you keep a reference to `m1`, you can choose to update the mesh in two ways:

```python
# Update the first frame with new data (use -1 to update the last frame of the sequence).
m1.update_frames(new_vertices, [0])

# Or append new frames, thereby growing the sequence.
m1.add_frames(new_vertices)
```
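For an autoregressive generator, the append pattern above boils down to a simple loop. The following is a runnable sketch using a stand-in `FrameBuffer` class (a hypothetical name, not part of aitviewer) that mimics the `add_frames`/`update_frames` semantics, so you can see the flow without a running viewer:

```python
import numpy as np

class FrameBuffer:
    """Hypothetical stand-in mimicking a remote renderable's frame API."""

    def __init__(self, first_frame):
        self.frames = [np.asarray(first_frame)]

    def add_frames(self, frame):
        # Append a new frame, growing the sequence.
        self.frames.append(np.asarray(frame))

    def update_frames(self, frame, frame_ids):
        # Overwrite existing frames in place (negative indices address the end).
        for i in frame_ids:
            self.frames[i] = np.asarray(frame)

def generate_next(prev):
    # Dummy autoregressive step: nudge every vertex a little.
    return prev + 0.01

m1 = FrameBuffer(np.zeros((6890, 3)))  # 6890 vertices, as in SMPL
for _ in range(10):
    m1.add_frames(generate_next(m1.frames[-1]))

print(len(m1.frames))  # 11: the initial frame plus 10 generated ones
```

With the real `RemoteMeshes`, the loop body would call `m1.add_frames(...)` on the remote renderable instead, and the viewer's sequence grows as the network produces frames.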
To avoid keeping the reference to `m1` around explicitly, you might find a small helper class useful:

```python
class DebugHelper(object):
    def __init__(self, host="localhost"):
        self.viz = RemoteViewer(host)
        self.meshes = dict()
        self.seq_lengths = dict()

    def add_new_mesh(self, name, remote_mesh):
        self.meshes[name] = remote_mesh
        self.seq_lengths[name] = 1

    def update_frames(self, name, vertices):
        self.meshes[name].update_frames(vertices, [-1])

    def add_frames(self, name, vertices):
        self.meshes[name].add_frames(vertices)
        self.seq_lengths[name] += 1
```
which you would use instead of the remote viewer directly, i.e.:

```python
viz = DebugHelper()
m1 = RemoteMeshes(
    viz.viz,
    smpl_output.vertices.detach().cpu().numpy()[0],
    smpl_model.faces.astype(np.int64),
    name=f"frame {frame_id}",
)
viz.add_new_mesh("m1", m1)
viz.update_frames("m1", smpl_output.vertices.detach().cpu().numpy()[0])
```
Note that you don't have to close the viewer/server at any point; you can just leave it running in the background while one or several clients connect to display data.
You can achieve infinite sequences without the remote viewer, too, but I found this pattern quite useful. Without the remote viewer, you could just create a `SMPLSequence` with a single frame and then use the `update_frames` or `add_frames` methods (this is also what the remote variants use in the background).
Finally, you could use a `Streamable`, which is meant to display live output, for example from a webcam. To display a couple of rigid bodies, it could look like this:
```python
import threading

import numpy as np


class RBDataStream(Streamable):
    def __init__(self, n_sensors, **kwargs):
        super(RBDataStream, self).__init__(**kwargs)
        # One lock shared by the producer thread and the render loop.
        self.lock = threading.Lock()
        self.rb_ori = np.zeros((n_sensors, 3, 3))
        self.rb_ori[:] = np.diag([1, 1, 1])
        self.rb_pos = np.zeros((n_sensors, 3))
        self.rbs = RigidBodies(self.rb_pos, self.rb_ori)
        self.add(self.rbs)

    def parse_data(self, rb_id, rb_pos, rb_ori):
        """This function is called from an outside process/thread and delivers new data.
        It may contain more logic to sanitize the inputs before updating the renderables."""
        with self.lock:
            self._update_rigid_body(rb_id, rb_pos, rb_ori)

    def _update_rigid_body(self, rb_id, rb_pos, rb_ori):
        self.rbs.rb_pos[:, rb_id] = rb_pos
        self.rbs.rb_ori[:, rb_id] = rb_ori

    def capture(self):
        self.redraw()
```
You would use this class like this:

```python
from aitviewer.viewer import Viewer

viewer = Viewer()
rbs = RBDataStream(n_sensors=2)
...  # Here you will have to make sure that you start a thread that sends data to the stream by invoking rbs.parse_data.
viewer.scene.add(rbs)
```
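The threading concern in `parse_data` can be exercised without aitviewer at all. Below is a self-contained sketch (stand-in class and names, stdlib only, no aitviewer dependency) of the same pattern: a producer thread pushes per-sensor updates through `parse_data`, and a single shared lock guards the buffers the render loop would read:

```python
import threading
import time

class SensorStream:
    """Hypothetical stand-in for a Streamable: holds per-sensor state that
    a producer thread updates and a render loop reads."""

    def __init__(self, n_sensors):
        self.lock = threading.Lock()  # one lock shared by producer and consumer
        self.rb_pos = [[0.0, 0.0, 0.0] for _ in range(n_sensors)]

    def parse_data(self, rb_id, pos):
        # Called from the producer thread.
        with self.lock:
            self.rb_pos[rb_id] = list(pos)

    def capture(self):
        # Called from the render loop: take a consistent snapshot.
        with self.lock:
            return [p[:] for p in self.rb_pos]

stream = SensorStream(n_sensors=2)

def producer():
    # Alternate updates between the two sensors.
    for step in range(6):
        stream.parse_data(step % 2, [float(step), 0.0, 0.0])
        time.sleep(0.001)

t = threading.Thread(target=producer)
t.start()
t.join()
print(stream.capture())  # [[4.0, 0.0, 0.0], [5.0, 0.0, 0.0]]
```

The key design point is that the lock is created once in `__init__` and reused on every access; creating a fresh `threading.Lock()` inside each call would provide no mutual exclusion at all.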
Thank you for the kind suggestion! I really appreciate it. I'll probably try the RemoteViewer!
Before opening this issue, I'd really like to thank you for maintaining the most wonderful project I've ever seen. The code is kindly commented and the tutorials are nice!
I am developing some motion-generation projects and I need some help with the design pattern.
I need to render an indefinite number of frames. However, all of the Node classes (renderables), especially Skeleton and SMPLSequence, require me to fix the number of frames before instantiation. But I am generating motion in an autoregressive manner, meaning the network receives the previous frame and outputs the next one, so it can generate an infinite number of frames.
I considered generating 1000 frames before the test and playing back that data, but if I want to check further, re-generating frames is too cumbersome because I need to check my output many times.
You already made a nice suggestion (https://github.com/eth-ait/aitviewer/issues/14#issuecomment-1281978482).
Since I also need an interactive GUI in the Viewer class, updating a length-1 sequence in Viewer.run() seems plausible.
But before actually implementing it, could you give me any suggestions about the design pattern I need?
Best regards,