How can the output data be converted to bvh format? #3
Comments
Hi, I tried many times to visualize the results in bvh format but failed. I feel it is more complicated than I supposed, so I would be glad if you could figure it out. As far as I know, bvh uses the Euler-angle rotation representation. Use the following code:

```python
from common.skeleton import Skeleton

example_data = np.load(os.path.join(data_dir, example_id + '.npy'))
face_joint_indx = [2, 1, 17, 16]
```

Then the quat_params are the local rotations in quaternion representation. Note that the root rotations are with respect to the Z+ direction. Hope it helps.
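The snippet above never shows how quat_params is actually computed. Below is a minimal sketch of the missing steps, assuming the Skeleton class in common/skeleton.py exposes an inverse_kinematics_np method and that t2m_raw_offsets and t2m_kinematic_chain live in paramUtil, as in the public HumanML3D/text-to-motion code; data_dir and example_id are placeholders.

```python
# Hedged sketch: the Skeleton / inverse_kinematics_np usage is an assumption
# based on the repo's common/skeleton.py; data_dir and example_id are placeholders.
import os
import numpy as np
import torch

from common.skeleton import Skeleton
from paramUtil import t2m_raw_offsets, t2m_kinematic_chain  # path may differ, e.g. utils/paramUtil.py

data_dir = 'joints/'   # placeholder: folder of (frames, joints, 3) position .npy files
example_id = '000021'  # placeholder file name

example_data = np.load(os.path.join(data_dir, example_id + '.npy'))  # global joint positions
face_joint_indx = [2, 1, 17, 16]  # hip/shoulder indices used to fix the facing direction

n_raw_offsets = torch.from_numpy(t2m_raw_offsets)
skel = Skeleton(n_raw_offsets, t2m_kinematic_chain, 'cpu')

# per-frame, per-joint local rotations as quaternions ((w, x, y, z) assumed)
quat_params = skel.inverse_kinematics_np(example_data, face_joint_indx)
```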
Thank you very much for the reply! Quick question on the example data: is it the global or local positions of the joints? What would the first entry look like?
Hi, the example data uses global positions of joints. It is in the same format as what you obtain through the recover_from_ric method. You may also need to add these two lines:

```python
n_raw_offsets = torch.from_numpy(t2m_raw_offsets)
```

So this quat_params gives you the local rotations of each bone with respect to the pre-defined offsets.
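To actually fill BVH rotation channels, the quaternions still need to be converted to Euler angles. Here is a standalone sketch using SciPy (not code from this repo); it assumes quat_params stores quaternions in (w, x, y, z) order, and the Euler order passed in must match the CHANNELS declaration of the target bvh file.

```python
# Standalone helper (assumption: quaternions are (w, x, y, z); SciPy wants (x, y, z, w)).
import numpy as np
from scipy.spatial.transform import Rotation as R

def quats_to_bvh_euler(quat_params, order='ZYX'):
    """quat_params: (frames, joints, 4) local rotations as (w, x, y, z) quaternions.
    Returns Euler angles in degrees, shape (frames, joints, 3), using the given
    intrinsic rotation order -- it must match the bvh file's CHANNELS order."""
    frames, joints, _ = quat_params.shape
    q = quat_params.reshape(-1, 4)
    q_xyzw = q[:, [1, 2, 3, 0]]  # reorder to SciPy's (x, y, z, w) convention
    eul = R.from_quat(q_xyzw).as_euler(order, degrees=True)
    return eul.reshape(frames, joints, 3)
```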
Hello! Has this problem been solved? Can you share the method with me?
Converting the SMPL pose to BVH was easy, and I got that done yesterday.
Hello! Could you share the method with me? Thank you very much!
Hi, if possible, could you please send the code or a GitHub repo? I am very interested. Thanks.
For those who are interested in converting the positions into bvh format, this script can help a lot: https://github.com/DeepMotionEditing/deep-motion-editing/blob/master/utils/InverseKinematics.py. What you need to do is: 1) find an SMPL template bvh file that contains the offset information; 2) use their bvh loader to load the bvh file into an Animation object; 3) pass the animation object and the positions into this inverse kinematics module to get the rotations; 4) write the animation into a bvh file. I have succeeded with other data in a similar way. You may contact me if you have questions.
The BasicInverseKinematics should suffice already. For the initial rotations at the beginning of the optimization, you could just use the identity transformation.
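Below is a hedged sketch of the recipe from the two comments above, using the deep-motion-editing utilities (BVH.py, Animation.py, Quaternions.py, InverseKinematics.py). The file names are placeholders and the call signatures are written from memory, so check them against that repo's utils before relying on this.

```python
# Sketch under assumptions: placeholder paths, and BVH.load / BVH.save /
# Animation / Quaternions.id / BasicInverseKinematics signatures as recalled
# from deep-motion-editing's utils -- verify against the repo.
import sys
sys.path.append('deep-motion-editing/utils')

import numpy as np
import BVH
from Animation import Animation
from Quaternions import Quaternions
from InverseKinematics import BasicInverseKinematics

positions = np.load('result_joints.npy')  # (frames, joints, 3) global joint positions

# 1) a template bvh whose skeleton matches the joint count/order of `positions`
template, names, frametime = BVH.load('smpl_template.bvh')

# 2) build an Animation with one frame per target frame; identity rotations
#    are fine as the initial guess, as noted above
n_frames, n_joints = positions.shape[:2]
rotations = Quaternions.id((n_frames, n_joints))
anim = Animation(rotations, positions.copy(), template.orients, template.offsets, template.parents)

# 3) solve for joint rotations that reproduce the target positions
anim = BasicInverseKinematics(anim, positions, iterations=10, silent=True)()

# 4) write the result out as bvh
BVH.save('result.bvh', anim, names, frametime)
```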
Hi, I am looking for a solution for converting the result npy files to bvh files. Could anyone give me some detailed guidance and sample code? I tried many things but finally failed.
Hi, I have received a lot of comments that our current rotation representation does not seem compatible with other 3D software such as Blender, and I kind of get the reason. In the IK/FK in skeleton.py, for the i-th bone we calculate the rotation of the bone itself, while in bvh we should actually use the rotation of its parent instead. Therefore, in line 91, you could try to use its parent bone instead of the bone itself. I am not sure if it works. Here I attach the code of our FK and the bvh FK; you can see the difference in how the global positions are obtained.

Our FK:

```python
for i in range(1, len(chain)):
    R = qmul(R, quat_params[:, chain[i]])
    offset_vec = offsets[:, chain[i]]
    joints[:, chain[i]] = qrot(R, offset_vec) + joints[:, chain[i-1]]
```

BVH FK:

```python
for i in range(1, len(self.parents)):
    global_quats[:, i] = qmul(global_quats[:, self.parents[i]], local_quats[:, i])
    global_pos[:, i] = qrot(global_quats[:, self.parents[i]], offsets[:, i]) + global_pos[:, self.parents[i]]
```

Hope this helps. I do not have time to validate this idea, but if anyone figures it out, in this way or any other, I would appreciate it very much if you could let me know. If it does not work, note that the recent work ReMoDiffuse managed to use the rotation representation in their demo; you may refer to them. BTW: I have updated the quaternion_euler_cont6d functions in quaternion.py, which should be safe to use.
Is there any way to output the resulting animation as an fbx file, or do you know of a way that the npy can be used in 3D software (e.g., Blender)?
To get further on this, I tried a modification: rather than using FK/IK, I fetch the internal cont6d rotations and convert them directly to per-bone quaternions with respect to the parent rotation. However, this produces weird rotations when I assume that each rotation is a per-frame, per-bone local rotation (with shape (frame_count, bone_count, 4)). Could you help me with this?
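For anyone following the cont6d route, here is a standalone sketch (not repo code) of the usual 6D-continuous-rotation-to-quaternion conversion: the 6D vector is assumed to hold the first two columns of the rotation matrix, which are re-orthonormalized by Gram-Schmidt before extracting a quaternion. Whether those quaternions count as BVH-style local rotations still depends on the FK convention discussed above.

```python
# Assumption: each 6D vector packs the first two columns of a rotation matrix.
import numpy as np
from scipy.spatial.transform import Rotation as R

def cont6d_to_quat(cont6d):
    """cont6d: (..., 6) -> quaternions (..., 4) in (w, x, y, z) order."""
    a1, a2 = cont6d[..., :3], cont6d[..., 3:]
    b1 = a1 / np.linalg.norm(a1, axis=-1, keepdims=True)
    b2 = a2 - np.sum(b1 * a2, axis=-1, keepdims=True) * b1  # Gram-Schmidt step
    b2 = b2 / np.linalg.norm(b2, axis=-1, keepdims=True)
    b3 = np.cross(b1, b2)
    mats = np.stack([b1, b2, b3], axis=-1)                   # b1, b2, b3 as matrix columns
    q_xyzw = R.from_matrix(mats.reshape(-1, 3, 3)).as_quat()
    q_wxyz = q_xyzw[:, [3, 0, 1, 2]]
    return q_wxyz.reshape(cont6d.shape[:-1] + (4,))
```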
I'm trying to convert the rotations and joint positions into bvh format so that I can do better visualization. I can see that there is an IK method in the motion_process.py file which might help me get the local rotation information, but it turns out not to be correct; I fed in the joint positions returned by the recover_from_ric method.
I would appreciate any help or hints.