[use IK in record_demos.py] Instead of using teleoperation, IK is used to collect data #3112
-
arm_fail.mp4
-
Thank you for posting this, and adding the video. I'll move this to the Discussions section for the team and others to follow up.
-
Following up: to modify your remote-control data collection script so that inverse kinematics (IK) drives the robot arm (replacing keyboard or VR teleoperation), it is crucial to use the `DifferentialIKController` correctly, perform the right frame conversions, and select the correct end-effector link. Here are the key points to check and the recommended modifications.

1. Use the Correct End Effector

For the Franka arm in Isaac Lab, the proper end-effector link is usually `panda_hand`. Example:

```python
robot_entity_cfg = SceneEntityCfg(
    "robot",
    joint_names=["panda_joint.*"],
    body_names=["panda_hand"],
)
robot_entity_cfg.resolve(env.scene)
```

2. Consistent Frame Transformations

All positions and orientations passed to the IK controller must be expressed in the robot's base frame, not the world frame. When tracking a target (e.g., a cube), convert its world pose into the robot base frame using `subtract_frame_transforms` (a minimal sketch of this conversion follows section 3 below).

3. Obtain the Right Jacobian Index

Match the example logic:

```python
if env.scene["robot"].is_fixed_base:
    ee_jacobi_idx = robot_entity_cfg.body_ids[0] - 1
else:
    ee_jacobi_idx = robot_entity_cfg.body_ids[0]
```

For a fixed-base robot the root link is excluded from the Jacobian, which shifts the body indices down by one. This matches the controller's expectation for the Franka robot, a fixed-base arm.
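To make point 2 concrete, here is a minimal sketch of the world-to-base conversion for a target object. The asset name `"cube"` and the idea of taking the grasp orientation straight from the cube's own pose are illustrative assumptions, not code from the original script:

```python
from isaaclab.utils.math import subtract_frame_transforms  # import path may differ across Isaac Lab versions

# world-frame poses, shape [num_envs, 7] as (x, y, z, qw, qx, qy, qz)
root_pose_w = robot.data.root_state_w[:, 0:7]
cube_pose_w = env.scene["cube"].data.root_state_w[:, 0:7]  # "cube" is an assumed asset name

# express the cube pose in the robot base frame before commanding the IK controller
ik_target_pos_b, ik_target_quat_b = subtract_frame_transforms(
    root_pose_w[:, 0:3], root_pose_w[:, 3:7],
    cube_pose_w[:, 0:3], cube_pose_w[:, 3:7],
)
```

The same call works for any world-frame waypoint, so a scripted pick sequence can be expressed as a list of world poses converted this way at each step.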
4. Correct Data Extraction for IK

Compute the current end-effector pose in the robot base frame before every controller call. Example for the conversion:

```python
ee_pos_b, ee_quat_b = subtract_frame_transforms(
    root_pose_w[:, 0:3], root_pose_w[:, 3:7], ee_pose_w[:, 0:3], ee_pose_w[:, 3:7]
)
```

5. Controller Setup and Usage

Follow the official IK tutorial and match its buffer setup. Here is a safe and reproducible structure that closely matches the working script:

```python
# Controller configuration (run once during setup)
diff_ik_cfg = DifferentialIKControllerCfg(command_type="pose", use_relative_mode=False, ik_method="dls")
diff_ik_controller = DifferentialIKController(diff_ik_cfg, num_envs=env.num_envs, device=env.device)

# Per-step updates (in your control loop)
# Get current quantities
jacobian = robot.root_physx_view.get_jacobians()[:, ee_jacobi_idx, :, robot_entity_cfg.joint_ids]
ee_pose_w = robot.data.body_state_w[:, robot_entity_cfg.body_ids[0], 0:7]
root_pose_w = robot.data.root_state_w[:, 0:7]
joint_pos = robot.data.joint_pos[:, robot_entity_cfg.joint_ids]

# Compute the end-effector pose in the robot base frame
ee_pos_b, ee_quat_b = subtract_frame_transforms(
    root_pose_w[:, 0:3], root_pose_w[:, 3:7], ee_pose_w[:, 0:3], ee_pose_w[:, 3:7]
)

# Compute IK (the target pose comes from your application)
ik_target_pos_b, ik_target_quat_b = ...  # convert the target (cube) from world to base frame here, as in the sketch after section 3
ik_commands = torch.cat([ik_target_pos_b, ik_target_quat_b], dim=1)
diff_ik_controller.set_command(ik_commands)
joint_pos_des = diff_ik_controller.compute(ee_pos_b, ee_quat_b, jacobian, joint_pos)
robot.set_joint_position_target(joint_pos_des, joint_ids=robot_entity_cfg.joint_ids)
```

6. Common Issues (and Fixes)

Most "arm does not move as expected" cases come down to the points above: targets passed in the world frame instead of the base frame (point 2), a wrong Jacobian index, such as a missing `- 1` offset for the fixed-base arm (point 3), or a wrong end-effector body name (point 1).
7. Full Example Reference

The Isaac Lab differential-IK controller tutorial is the closest working reference; the structure above follows it step by step. A minimal sketch that bundles the per-step pieces into one helper is given below.

Summary of Key Edits for Your Code:

- Resolve the robot entity with `joint_names=["panda_joint.*"]` and `body_names=["panda_hand"]`.
- Convert every pose (end effector and target) from the world frame into the robot base frame with `subtract_frame_transforms`.
- Use the correct Jacobian index: `body_ids[0] - 1` for the fixed-base Franka.
- Call `set_command` before `compute` every step, and apply the result with `set_joint_position_target`.
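Bundling the per-step logic keeps the data collection script readable. This is a sketch only, under the assumptions above (fixed-base Franka, `panda_hand` end effector); the helper name, argument layout, and import paths are mine and may need adjusting to your Isaac Lab version:

```python
import torch

from isaaclab.controllers import DifferentialIKController  # older releases: omni.isaac.lab.controllers
from isaaclab.utils.math import subtract_frame_transforms


def ik_step(controller: DifferentialIKController, robot, robot_entity_cfg,
            ee_jacobi_idx: int, target_pose_w: torch.Tensor) -> torch.Tensor:
    """Return joint position targets driving the end effector toward a world-frame target pose.

    target_pose_w: [num_envs, 7] tensor (x, y, z, qw, qx, qy, qz) in the world frame.
    """
    # current quantities, as in section 5
    jacobian = robot.root_physx_view.get_jacobians()[:, ee_jacobi_idx, :, robot_entity_cfg.joint_ids]
    ee_pose_w = robot.data.body_state_w[:, robot_entity_cfg.body_ids[0], 0:7]
    root_pose_w = robot.data.root_state_w[:, 0:7]
    joint_pos = robot.data.joint_pos[:, robot_entity_cfg.joint_ids]

    # express both the end effector and the target in the robot base frame (sections 2 and 4)
    ee_pos_b, ee_quat_b = subtract_frame_transforms(
        root_pose_w[:, 0:3], root_pose_w[:, 3:7], ee_pose_w[:, 0:3], ee_pose_w[:, 3:7]
    )
    target_pos_b, target_quat_b = subtract_frame_transforms(
        root_pose_w[:, 0:3], root_pose_w[:, 3:7], target_pose_w[:, 0:3], target_pose_w[:, 3:7]
    )

    # one differential-IK update toward the base-frame target
    controller.set_command(torch.cat([target_pos_b, target_quat_b], dim=1))
    return controller.compute(ee_pos_b, ee_quat_b, jacobian, joint_pos)
```

Each iteration of the collection loop would call this helper and apply the result with `robot.set_joint_position_target(...)` (or pack it into the env action), then step the simulation as the script already does.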
-
Has anyone modified the remote-control data collection script so that IK drives the robot arm during data collection, instead of the keyboard or VR?
My modifications are as follows, but the robot's movement is consistently not as expected.