Ragdoll x Human IK

Hi all,

Got a question about using Ragdoll with Maya’s Human IK, so I put this together. Overall, it’s quite tedious, as Human IK is its own little mini-system inside of Maya and doesn’t always play by the rules. The gist of the workaround is to assign markers to locators that follow the Human IK character, and then retarget the simulation onto the Human IK end effectors.



In the video, I mention a script to automate one of the tedious processes, here it is.

  1. Select all assigned controls
  2. Run the script below in a Python tab of your Script Editor
from maya import cmds
from ragdoll.vendor import cmdx

with cmdx.DagModifier() as mod:
    # Reuse the locator group if it already exists
    try:
        group = cmdx.encode("locators_grp")
    except cmdx.ExistError:
        group = mod.create_node("transform", name="locators_grp")

    for ik in cmdx.sl():
        marker = ik.output(type="rdMarker")

        # Skip any selected control without an assigned marker
        if not marker:
            continue

        name = ik.name() + "_loc"
        locator = mod.create_node("transform", name, parent=group)
        shape = mod.create_node("locator", name + "Shape", parent=locator)

        # Have the locator follow the Human IK control,
        # and feed its world matrix into the marker
        cmds.parentConstraint(str(ik), str(locator), maintainOffset=False)
        mod.connect(locator["worldMatrix"][0], marker["inputMatrix"])

Interesting! I found a way by defining the Ragdoll manikin as a HIK character and setting its source to the animation rig. Then, to get the sim back onto the original rig after baking, swap the animation source in the HIK window.


Would it be possible to demonstrate this?

Of course! I can record a demo tonight :)


I found a way to keep it all in one rig for more of an anim workflow.

This other method works well with mocap.


Thanks for this! :heart:
