technical


set up characters in unreal for live mocap retargeting

livelink


play takes


there are a lot of steps:

1.  first i make a super basic stickman character that's the same size as ruth, using the same skeleton as unreal's default quinn. this is the skeleton that i will send mocap to. this character has an ik rig and two animation blueprints: one that i can send mocap to via a livelink pose node, and a second that plays back some prerecorded ruth takes, including a rom -- very useful for testing.



 




ik retargeter
2.  for each custom character i'm going to make an ik rig and an ik retargeter. inside the ik retargeter, i set the source to my ik_stickman and the target to the customcharacter ik rig -- there is a bunch of fiddling that can be done here to get a good retarget -- what really helps is that under the ik retargeter asset browser i can preview the prerecorded ruth takes.




retargeting animation blueprint
3.  we also need to create an animation blueprint for each character that uses a retarget pose from mesh node; under the node details, set the ik retargeter asset to the retargeter we previously created.




hide the source mesh


tick pose
4.  nearly there! now create a character blueprint. inside, set the mesh to be the stickman. under its details, search for visible and untick all the visibility settings to hide it in the level. now search for tick and set visibility based anim tick option to always tick pose and refresh bones. this will ensure the stickman mesh always updates with its animation even though it is invisible.

add another skeletal mesh component, set its skeletal mesh to your new character, and set its animation class to the animation blueprint we just made in step 3.




5.  finally, drop the character blueprint into the level. you can set the animation class of the first mesh (the stickman) to either your prerecorded takes anim bp or your livelink mocap anim bp. when the level plays, the stickman animation gets retargeted live to your character mesh -- see the scripted sketch below.
(this technique also works for radical mocap - they have a custom character that you tweak and their own character ik rig that you retarget to yours, pretty much the exact same method)
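
if you'd rather script steps 4 and 5 than click through the details panel, here is a rough unreal python sketch. it's purely illustrative: the asset paths are hypothetical, and it assumes the character blueprint instance is selected in the level with the stickman component listed first.

import unreal

# hypothetical paths - swap in your own anim blueprints
RETARGET_ANIM_CLASS = "/Game/Characters/ABP_Retarget_MyCharacter.ABP_Retarget_MyCharacter_C"
LIVELINK_ANIM_CLASS = "/Game/Characters/ABP_Stickman_LiveLink.ABP_Stickman_LiveLink_C"

# assumes the character blueprint instance is selected in the level
actor = unreal.EditorLevelLibrary.get_selected_level_actors()[0]
meshes = actor.get_components_by_class(unreal.SkeletalMeshComponent)
stickman, custom = meshes[0], meshes[1]  # assumes the stickman component comes first

# step 4: hide the stickman but keep its pose evaluating while invisible
stickman.set_editor_property("visible", False)
stickman.set_editor_property(
    "visibility_based_anim_tick_option",
    unreal.VisibilityBasedAnimTickOption.ALWAYS_TICK_POSE_AND_REFRESH_BONES,
)

# step 4 continued: drive the custom character mesh with the retargeting anim blueprint
custom.set_anim_instance_class(unreal.load_class(None, RETARGET_ANIM_CLASS))

# step 5: pick which anim bp drives the stickman (livelink mocap or prerecorded takes)
stickman.set_anim_instance_class(unreal.load_class(None, LIVELINK_ANIM_CLASS))
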




openpose -> osc in unreal







run app-win.exe

open a web browser and point it to localhost:8000. this is where you should see a feed from your webcam overlaid with a dots-and-sticks skeleton.
 

osc data for the 17 skeleton points streams on port 9129
as an osc message looking like this: RECEIVE    | ENDPOINT([::ffff:127.0.0.1]:41234) ADDRESS(/pose0) FLOAT(144.01819) FLOAT(72.29978) FLOAT(177.62071) FLOAT(21.599257) FLOAT(148.32224) FLOAT(26.75776) FLOAT(305.67664) FLOAT(32.62451) FLOAT(161.46097) FLOAT(61.923046) FLOAT(443.73434) FLOAT(259.66837) FLOAT(352.28235) FLOAT(269.90558) FLOAT(381.73785) FLOAT(583.20636) FLOAT(266.5818) FLOAT(500.99966) FLOAT(238.74225) FLOAT(514.54865) FLOAT(42.904594) FLOAT(439.00473) FLOAT(434.8785) FLOAT(551.3904) FLOAT(347.69272) FLOAT(510.22787) FLOAT(331.41455) FLOAT(560.9484) FLOAT(260.55704) FLOAT(548.9178) FLOAT(305.8865) FLOAT(551.969) FLOAT(238.51305) FLOAT(502.90475)

below is a table of each skeletal joint against its horizontal and vertical position indices in the message:


joint name        index horizontal   index vertical
left ankle        0                  1
left ear          2                  3
left elbow        4                  5
left eye          6                  7
left hip          8                  9
left knee         10                 11
left shoulder     12                 13
left wrist        14                 15
nose              16                 17
right ankle       18                 19
right ear         20                 21
right elbow       22                 23
right eye         24                 25
right hip         26                 27
right knee        28                 29
right shoulder    30                 31
right wrist       32                 33
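
a minimal python-osc listener (the python-osc package is my own addition, not part of the app) that grabs /pose0 on port 9129 and prints each joint using the index order from the table -- handy for sanity-checking the stream outside unreal:

from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

# joint order follows the table above
JOINTS = ["left ankle", "left ear", "left elbow", "left eye", "left hip",
          "left knee", "left shoulder", "left wrist", "nose", "right ankle",
          "right ear", "right elbow", "right eye", "right hip", "right knee",
          "right shoulder", "right wrist"]

def on_pose(address, *values):
    # 34 floats arrive as x0, y0, x1, y1, ... per the index table
    pairs = list(zip(values[0::2], values[1::2]))
    for name, (x, y) in zip(JOINTS, pairs):
        print(f"{name}: x={x:.1f} y={y:.1f}")

dispatcher = Dispatcher()
dispatcher.map("/pose0", on_pose)
BlockingOSCUDPServer(("0.0.0.0", 9129), dispatcher).serve_forever()
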

one thing to note: the web browser window with the video feed must be the front tab and the browser must not be minimised, otherwise it stops streaming.

second thing to note: in unreal 5.5, when setting it up to receive osc like this, the clientipaddress in the createoscserver node must be set to 0.0.0.0. (the unreal docs state that for localhost you can use 0 or leave it blank, but this won't work in 5.5)
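
a quick way to check the server is actually listening is to fire a dummy message at it from outside unreal -- a tiny python-osc sketch (the package and test values are assumptions of mine):

from pythonosc.udp_client import SimpleUDPClient

# send one dummy /pose0 message (34 zeros) to the unreal osc server on port 9129
client = SimpleUDPClient("127.0.0.1", 9129)
client.send_message("/pose0", [0.0] * 34)
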

last thing to note: incoming osc horizontal and vertical values seem to be in the ranges 0-640 and 0-480 -- i assume because of the resolution of my web camera feed. in unreal, x = horizontal and y = vertical.
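
a small sketch of normalising those raw pixel values before mapping them onto anything in unreal, assuming the 640x480 range holds for your camera:

# assumed webcam resolution; adjust to whatever ranges your feed reports
CAM_W, CAM_H = 640.0, 480.0

def normalise(x_px, y_px, flip_y=True):
    # scale pixel coordinates to 0-1; optionally flip vertical so y increases upwards
    x = x_px / CAM_W
    y = y_px / CAM_H
    return x, (1.0 - y) if flip_y else y
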



set dmx light colour automatically



we want to try setting the colours of dmx-controllable theatre lighting automatically. mark coniglio from troikatronix creates an isadora patch for us to do just that: incoming video is sampled in a small box and the rgb values are sent over artnet on the local network to a dmx light that's plugged in via its ethernet port. in the image you can see i'm holding a roll of green tape in the centre of the screen; this colour is sampled from the video stream and sent both to colour a shape box on top of the video stream and out as dmx values.
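
the isadora patch does all of this visually, but for reference here is a rough python sketch of the same sample-and-send idea: opencv grabs the webcam frame, a small centre box is averaged, and the rgb goes out as a raw artdmx packet. the light's ip address, universe and channel layout are all assumptions.

import socket, struct
import cv2  # opencv-python; used to grab webcam frames and sample a colour

ARTNET_IP = "192.168.1.50"   # hypothetical - the light's ip on the local network
UNIVERSE, RGB_START = 0, 0   # assumptions: universe 0, rgb on channels 1-3

def artdmx_packet(universe, data):
    # minimal ArtDmx packet: "Art-Net\0", opcode 0x5000 (little endian),
    # protocol 14, sequence/physical 0, universe, data length, dmx data
    return (b"Art-Net\x00" + struct.pack("<H", 0x5000) + struct.pack(">H", 14)
            + bytes([0, 0, universe & 0xFF, universe >> 8])
            + struct.pack(">H", len(data)) + data)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    h, w = frame.shape[:2]
    box = frame[h//2-20:h//2+20, w//2-20:w//2+20]      # small box in the centre
    b, g, r = (int(c) for c in box.mean(axis=(0, 1)))  # opencv frames are bgr
    dmx = bytearray(512)
    dmx[RGB_START:RGB_START+3] = bytes([r, g, b])
    sock.sendto(artdmx_packet(UNIVERSE, bytes(dmx)), (ARTNET_IP, 6454))
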

eventually, for the performance, we create a multi-scene isadora patch that takes live video in from the ai camera and adds some show controls - a fade to black, and another fader that blends in the colour square, which is scaled up to tint the entire image. another fader controls a blur effect.




tech setup at trafó







1. 
pc 1 runs unreal, showing different scenes, with its audio going into the mixer.
the unreal window is captured and runs thru' isadora, which i'm using for show control.

2.
a second pc (pc2) is connected to our ai camera. this video feed also runs thru' isadora, which also does show control for the second projector.

3. 
after the end of the show, we alt-tab across to unreal and use radical to get mocap from the video feed, animating avatars in realtime for the audience.

4. 
a laptop is used to play music for the show, connected via the mixer into the venue pa system. a second channel mixes in Unreal audio for one of the scenes. a third channel in the mixer is connected to an ipad for the musician to play live after the performance.

tech setup optimised




performative installation version



performative installation version with captury






player, performer installation version with third person viewpoint



a 2-person installation version. 'expanding the creative possibilities for contemporary dance performances by transposing to a different domain -- into a game world.'

a 'player' uses a game controller to navigate through a series of environments from the nino project, featuring motion capture recorded from our trafó performers. the environments are displayed on a large monitor, e.g. 55" 4k or above.

a second audience member acts as a 'performer' and is captured by a small camera connected via a pc to the internet; this converts their movement into motion capture, which animates an avatar shown onscreen on a large portrait monitor. the two pcs are linked so that the player sees the 'performer's' movement.



further links


gibson/martelli technical blog
www.spaceplace.gibsonmartelli.com
artists website
https://gibsonmartelli.com/
isadora performance software
https://troikatronix.com/
modina 
https://modina.eu/
mechanoid muse
https://radiodrive.co/
skinner releasing network
https://skinnerreleasingnetwork.org/