Puppeteering: Recording Animation with VR Trackers

Hello, Marcus.

I really appreciate that you made this incredible software. I'm certain this plugin will become an industry standard for a multitude of use cases in the near future.

I am curious about the development direction of Ragdoll Dynamics.

I saw a recent forum post about mocap:

https://forums.ragdolldynamics.com/t/mocap-to-ragdoll-simulation-in-live/1195

It was amazing!! That is my dream: full real-time animation.

I'd like to try that right away, but unfortunately motion capture equipment is quite expensive.

Not only the equipment, but also the software.

Even the software that streams that tracker data into Maya isn't sold as a permanent license; it comes with per-dongle licensing and annual fees of more than a few thousand dollars.

I acknowledge the mocap developers' hard work, but this is a price that individual animators simply cannot afford.

So I'm looking into another solution: VR tracking hardware.

He's actually using VR controllers and VR trackers in that mocap Ragdoll post.

A VR controller costs close to 300 dollars,

and VR trackers can provide up to 7 points of tracking for about 1,100 dollars, including the Vive Base Station 2.0 and Vive Tracker 3.0.

(I am Korean, so these are rough conversions from Korean prices; I don't know the exchange rate exactly.)

It's still not cheap, but in my opinion this setup is comparable in accuracy to existing motion capture solutions and is reasonable in price.

But software is still a problem.

As far as I know, it is impossible to connect that motion capture data directly to Maya without the $10,000 software provided by mocap companies.

However, there is software that supports VR controllers and trackers natively: game engines.

I wonder if you've seen this video.

If you haven't, I hope you will.

This is my suggestion, and I think it would fit nicely into the future of this plugin.

https://youtu.be/r1fHOS4XaeE?si=B980lNIk_oSfPLvK&t=2074

The relevant part is around 34 minutes 34 seconds.

At first glance, it seems structurally impossible to perform puppeteering with a normal mouse, since a mouse only provides two axes of input at a time.

I remember what you said when I had a short conversation with you on Discord.

I totally agree with you. It’s rare to find someone who acknowledges and understands the value of live animation this early.

I think your plugin can change many studios and industries. It can become the foundation for indie animators and one-person animation directors.

I really appreciate you bringing recording into Maya through Live mode.

If I understand correctly, you created a separate timeline buffer to distinguish the simulation timeline from the recording timeline, which also impressed me very much.

Combining hand-keyed animation with the Ragdoll simulation is also a really groundbreaking idea.

I'd like to ask whether you have any plans to extend this feature to read SteamVR controllers or the 6-DoF data of Vive trackers.

Of course, I know this would be a really difficult challenge. In fact, I think Autodesk should adopt this plug-in as a new official system, rather than leaving it to be extended piecemeal as a third-party plug-in.

However, wouldn’t someone like you, a great software developer, be able to do even this?

Or, if I can help you, I'd like to try developing it on top of your plug-in.
I have an Oculus Quest 3, a Vive Base Station 2.0, Vive trackers, and Tundra trackers.

Once I can read the tracker data, it should be really easy to drive a Maya locator with it. I want to connect that to your simulation timeline in real time and record it.

I'm a student who can handle Python and a little C.
I've been studying how to get this tracking data into Maya lately.
Maya has no OpenXR API, of course, but… do you think this mocap-driven Ragdoll simulation recording in Maya is impossible?
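
To show where I am, here is a minimal sketch of the Maya side. The read_tracker_pose() function is just a placeholder for whatever actually reads the hardware (OpenVR bindings, a UDP stream from a game engine, etc.), since I haven't solved that part yet:

```python
# Minimal sketch: drive a Maya locator from 6-DoF tracker data and key it
# on the current frame, so the motion can be recorded as it plays.
import maya.cmds as cmds
from PySide2 import QtCore

def read_tracker_pose():
    # Placeholder: replace with a real tracker read (OpenVR, UDP, etc.).
    # Should return ((tx, ty, tz), (rx, ry, rz)) in world space.
    return (0.0, 0.0, 0.0), (0.0, 0.0, 0.0)

locator = cmds.spaceLocator(name="vive_tracker_1")[0]

def update():
    translate, rotate = read_tracker_pose()
    cmds.xform(locator, worldSpace=True,
               translation=translate, rotation=rotate)
    # Key the locator so the live motion is recorded as animation.
    cmds.setKeyframe(locator, attribute=["translate", "rotate"])

# Poll at roughly 60 Hz without blocking Maya's UI thread.
timer = QtCore.QTimer()
timer.timeout.connect(update)
timer.start(16)
```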


Thanks for this, excellent notes, yes you are definitely on the right track. I’ll prepare a longer response to you in a moment, while I sort a few things out on this end.


I watched this, and yes. This will be really trivial with Ragdoll soon, even with a mouse. This is effectively the state-of-the-art in digital puppetry today. This is as far as anyone has come. And it isn’t very far at all! We have prototypes that already go much further, with multi-track recording and multi-touch input, but that’s barely scratching the surface. There’s tons of low hanging fruit here and this is the direction I would like to explore with Ragdoll.

You'd be surprised! But yes, you can get much further with more hardware. VR controllers are an obvious (and a little boring) option; we've got a few more things in mind that I expect will be more fun and controllable.

Your drawings

Yes! This is an excellent idea. Still limited to those with 3d printers, or those willing to send off their models for printing, and those who either make or buy a 3d model. But, very trivial and intuitive with endless potential. With camera-based tracking, this kind of thing will become more accessible over time as well.

Yes, I'd like to expose as much hardware as possible for this style of animation; VR controllers are a no-brainer.

Cool, once we do beta testing, I’ll be sure to ping you for testing.

Ok, in that case you might be able to test a little earlier, as we’re launching a Python API for the underlying engine shortly where you could mess about with this stuff in the same way we do for the plug-in(s) themselves.

Yep!


Thank you so much for replying like this. I was really touched as I read it.

I was amazed that you were already planning this. It also makes me excited to hear that I will be able to experience the new features soon.

If I can test early, I'll help in any way I can. I really love this kind of experience.

If you need a video recording of my testing, I'll be happy to make one and send it to you. I have experience editing YouTube videos.

You are the one who makes my dream a reality.
I’m happy just to communicate with you.
Thank you very much for your development.

Also, your comment makes me really nervous and excited.

What other ideas do you have?
Beyond simply expanding the hardware, I think you must have come up with a new approach.

Although I have nothing comparable to your achievements, I sympathize with and have thought a lot about the direction you are pursuing.

It’s time to rip up the traditional animation method.

If my guess is correct, I think you have developed a new transform tool.

To improve manipulation with a normal mouse, you need a software solution.

The reason I, an ordinary student, learned C and studied the Maya API is that there was one topic I was personally researching for development.

It's ephemeral rigging.

Just as there are dedicated tools for modeling, I think there should be tools dedicated to animating.

Right now, for example, I click on objects one by one and only use W (Move Tool), E (Rotate Tool), and R (Scale Tool).

However, there are too many controllers these days!

If I make IK and FK controllers, I also need to make another controller that switches between IK and FK modes.

If I want a space-switching feature, I have to build one, two, three new spaces beforehand that may or may not ever be used.

And then, to cover every combination of those cases (3 × 2 × 1 = 6 of them), six more new switch nodes must be created to fill the gaps again.

The way out of this terrible task is, like ephemeral rigging, to leave the hierarchy alone and put the functionality into the tool instead, such as a new IK move tool or FK move tool.

https://youtu.be/J48b0GKI4RM?si=mTj2y9IdmlVVfVzF
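
As a toy example of what I mean by putting the behavior in the tool instead of the rig hierarchy, here is a sketch in maya.cmds where the "space switch" is just an argument to the tool call, so no extra controllers or switch nodes ever need to exist (the joint name is only a placeholder):

```python
# Toy "ephemeral" FK rotate: the space switch lives in the tool invocation,
# not in the rig hierarchy, so the rig needs no switch controllers or nodes.
import maya.cmds as cmds

def fk_rotate(joint, rx, ry, rz, space="world"):
    # Choose the rotation space per call instead of per pre-built controller.
    cmds.rotate(rx, ry, rz, joint, relative=True,
                worldSpace=(space == "world"),
                objectSpace=(space == "object"))

# Same joint, different spaces, zero changes to the rig:
fk_rotate("elbow_jnt", 0, 15, 0, space="world")
fk_rotate("elbow_jnt", 0, 15, 0, space="object")
```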

I think you've had this idea too, because I noticed that your plug-in has a new animation tool that pulls the ragdoll in IK mode and another that rotates it in FK mode.

Did I get it right?

I think there should be a lot more transform tools in Maya than there are now.

The first one I came up with, with some confidence, is a tool that selects controllers and moves them at the same time.

It’s like a brush tool.

Say I have to animate a MetaHuman face control rig. There are so many controllers on the face, and I wish I had a new Animate Brush tool that only changes the transform nodes' translate values instead of transforming any shape node.

Everyone thinks only of mocap solutions, but I believe the Ragdoll plug-in is moving towards proposing a new keyframe animation workflow, not just combining simulation and animation.

Facial animation has so far involved too many controllers to manage, making it difficult to edit without resorting to facial motion capture.

This is what I'm trying to make… It leaves a lot to be desired, but here's the idea.
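
Something like this, as a rough sketch in maya.cmds (the controller naming pattern, radius, and falloff are just placeholder choices):

```python
# Crude sketch of the "Animate Brush": move every face controller within a
# radius of the brush point, with soft falloff, touching only the transform
# nodes' translate values (never any shape node).
import maya.cmds as cmds

def brush_move(center, delta, radius=2.0, pattern="face_ctrl_*"):
    for ctrl in cmds.ls(pattern, type="transform"):
        pos = cmds.xform(ctrl, query=True, worldSpace=True, translation=True)
        dist = sum((p - c) ** 2 for p, c in zip(pos, center)) ** 0.5
        if dist > radius:
            continue
        weight = 1.0 - dist / radius  # linear falloff toward the brush edge
        cmds.move(delta[0] * weight, delta[1] * weight, delta[2] * weight,
                  ctrl, relative=True, worldSpace=True)

# Example: nudge every controller near the brow up by half a unit.
brush_move(center=(0.0, 160.0, 10.0), delta=(0.0, 0.5, 0.0))
```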

I think this new controller brush tool would bring real efficiency to keyframe animation.

Similarly, I think it's time for a new Rotate Tool and Move Tool.

In particular, I have an idea that I want to convey to you about the new Rotate Tool…

I think your plugin's separate timeline could become a new container that stores nonlinear functions as alternatives to the linear interpolation between keyframes in Maya's animation graph, not just a separate time for the Ragdoll simulation.
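
For instance, here is a small sketch of what I mean by a nonlinear alternative to linear interpolation between two keys (smoothstep is just one example of a function such a container could store):

```python
# Sketch: evaluate a nonlinear function between two keyframes instead of
# Maya's default linear interpolation between them.
def smoothstep(t):
    return t * t * (3.0 - 2.0 * t)  # eases in and out of each key

def evaluate(key_a, key_b, frame, interp=smoothstep):
    (frame_a, value_a), (frame_b, value_b) = key_a, key_b
    t = (frame - frame_a) / float(frame_b - frame_a)  # normalize to 0..1
    return value_a + (value_b - value_a) * interp(t)

# At the midpoint both give 5.0, but smoothstep arrives at each key with
# zero velocity instead of a sharp linear corner.
print(evaluate((0, 0.0), (10, 10.0), 5.0))   # 5.0
print(evaluate((0, 0.0), (10, 10.0), 2.5))   # 1.5625 (linear would be 2.5)
```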

I've also thought about how to further subdivide keyframes and add a concept of hierarchical structure to the keyframe class.

But I know this is getting a little off topic, and I'm afraid my excitement may have made me rude, so I'll end here.

I really appreciate you listening to my curiosity.

I'm so excited to be welcoming this new kind of animation with you, and I'm grateful to have met you.