technical
set up characters in unreal for live mocap retargeting
1. first i make a super basic stickman character that's the same size as ruth, using the same skeleton as unreal's default quinn. this is the skeleton that i will send mocap to. this character has an ik rig and two animation blueprints: one that i can send mocap to via a livelink pose node, and a second that plays back some prerecorded ruth takes, including a rom (range of motion), very useful for testing.
add another skeletal mesh and set its skeletal mesh to your new character, then set its animation class to the animation blueprint we just made in step 3.
(This technique also works for radical mocap - they have a custom character that you tweak and their own character IK rig that you retarget to yours, pretty much the exact same method)
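if you'd rather script the asset creation than click through the editor, here's a minimal sketch using unreal's editor python api. it only creates the two animation blueprint assets - the graphs (livelink pose node, take playback) still get wired up by hand - and the names and paths are placeholders, so adjust them to your project:

```python
import unreal

# load the skeleton shared with unreal's default quinn
# (path is a placeholder - point it at your own skeleton asset)
skeleton = unreal.load_asset("/Game/Characters/Mannequins/Meshes/SK_Mannequin_Skeleton")

# factory that creates animation blueprints targeting that skeleton
factory = unreal.AnimBlueprintFactory()
factory.set_editor_property("target_skeleton", skeleton)

asset_tools = unreal.AssetToolsHelpers.get_asset_tools()

# one anim blueprint for live mocap (the livelink pose node goes inside it)
# and one for playing back the prerecorded takes
for name in ("ABP_Stickman_LiveLink", "ABP_Stickman_PlayTakes"):
    asset_tools.create_asset(name, "/Game/Stickman", unreal.AnimBlueprint, factory)
```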
openpose -> osc in unreal
open a web browser and point it at localhost:8000. this is where you should see a feed from your webcam overlaid with a dots-and-sticks skeleton.
osc data for the 17 skeleton points is streamed on port 9129
as an osc message looking like this: RECEIVE | ENDPOINT([::ffff:127.0.0.1]:41234) ADDRESS(/pose0) FLOAT(144.01819) FLOAT(72.29978) FLOAT(177.62071) FLOAT(21.599257) FLOAT(148.32224) FLOAT(26.75776) FLOAT(305.67664) FLOAT(32.62451) FLOAT(161.46097) FLOAT(61.923046) FLOAT(443.73434) FLOAT(259.66837) FLOAT(352.28235) FLOAT(269.90558) FLOAT(381.73785) FLOAT(583.20636) FLOAT(266.5818) FLOAT(500.99966) FLOAT(238.74225) FLOAT(514.54865) FLOAT(42.904594) FLOAT(439.00473) FLOAT(434.8785) FLOAT(551.3904) FLOAT(347.69272) FLOAT(510.22787) FLOAT(331.41455) FLOAT(560.9484) FLOAT(260.55704) FLOAT(548.9178) FLOAT(305.8865) FLOAT(551.969) FLOAT(238.51305) FLOAT(502.90475)
below is a table of each skeletal joint against the index of its horizontal and vertical position in the message - the 17 points come through in the standard coco keypoint order:

joint             horizontal   vertical
nose              0            1
left eye          2            3
right eye         4            5
left ear          6            7
right ear         8            9
left shoulder     10           11
right shoulder    12           13
left elbow        14           15
right elbow       16           17
left wrist        18           19
right wrist       20           21
left hip          22           23
right hip         24           25
left knee         26           27
right knee        28           29
left ankle        30           31
right ankle       32           33
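to sanity-check the stream outside unreal, a tiny receiver helps. this is a sketch using the python-osc package (not part of the show setup), listening on the same port 9129 and splitting the 34 floats back into named joints:

```python
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

# joint names in the order they arrive in the /pose0 message
JOINTS = [
    "nose", "left eye", "right eye", "left ear", "right ear",
    "left shoulder", "right shoulder", "left elbow", "right elbow",
    "left wrist", "right wrist", "left hip", "right hip",
    "left knee", "right knee", "left ankle", "right ankle",
]

def on_pose(address, *floats):
    # 34 floats = 17 joints x (horizontal, vertical)
    for name, x, y in zip(JOINTS, floats[0::2], floats[1::2]):
        print(f"{name}: x={x:.1f} y={y:.1f}")

dispatcher = Dispatcher()
dispatcher.map("/pose0", on_pose)

# listen on all interfaces, same port the tracker sends to
BlockingOSCUDPServer(("0.0.0.0", 9129), dispatcher).serve_forever()
```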
second thing to note: in unreal 5.5, when setting it up to receive osc like this, the receive ip address in the createoscserver node must be set to 0.0.0.0. (the unreal docs state that for localhost you can use 0 or leave it blank, but this won't work in 5.5.)
last thing to note: incoming osc horizontal and vertical values seem to be in the ranges 0-640 and 0-480 -- I assume because of the resolution of my webcam feed. in unreal, x = horizontal and y = vertical.
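to test the unreal side without running the tracker at all, you can fake a pose message from python and normalise the pixel ranges while you're at it - again a python-osc sketch, assuming unreal's osc server is listening on 9129 and the camera really is 640x480:

```python
from pythonosc.udp_client import SimpleUDPClient

CAM_W, CAM_H = 640.0, 480.0  # assumed webcam resolution

def normalise(x_px, y_px):
    # map pixel coords to 0..1 so unreal doesn't depend on camera resolution
    return x_px / CAM_W, y_px / CAM_H

# fake all 17 joints sitting dead centre of frame (34 floats)
x, y = normalise(320.0, 240.0)
SimpleUDPClient("127.0.0.1", 9129).send_message("/pose0", [x, y] * 17)
```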
set dmx light colour automatically
eventually for the performance we create a multi-scene isadora patch that takes live video in from the ai camera and adds some show controls - a fade to black and another fader that blends in the colour square, which has been scaled to tint the entire image. another fader controls a blur effect.
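the dmx part can be done outside isadora too. here's a rough sketch (not the actual show patch) that grabs the average colour of the camera frame with opencv and pushes it to an rgb fixture over sacn using the python sacn package - universe 1 and channels 1-3 are assumptions, match them to however your fixture is patched:

```python
import cv2
import sacn

sender = sacn.sACNsender()
sender.start()
sender.activate_output(1)   # universe 1 - an assumption, match your patch
sender[1].multicast = True

cap = cv2.VideoCapture(0)   # the camera feed
try:
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        # opencv frames are bgr; average the whole frame to one colour
        b, g, r = frame.mean(axis=(0, 1))
        # assume the fixture's first three channels are r, g, b
        sender[1].dmx_data = (int(r), int(g), int(b))
finally:
    cap.release()
    sender.stop()
```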
tech setup at trafó
1. pc 1 runs unreal, showing different scenes, with its audio going into the mixer. the unreal window is captured and runs thru' isadora, which i'm using for show control.
2. a second pc (pc2) is connected to our ai camera. this video feed also runs thru' isadora, which is also doing show control for the second projector.
3. after the end of the show, we alt-tab to unreal and use radical to get mocap from the video feed to animate avatars in realtime for the audience.
4. a laptop is used to play music for the show, connected via the mixer into the venue pa system. a second channel mixes in unreal audio for one of the scenes. a third channel in the mixer is connected to an ipad for the musician to play live after the performance.
tech setup optimised
performative installation version
performative installation version with captury
player, performer installation version with third person viewpoint
a 'player' uses a game controller to navigate through a series of environments from the nino project, featuring motion capture from our trafó performers. the environments are displayed on a large monitor, e.g. 55" 4k or above.
a second audience member acts as a 'performer' and is filmed by a small camera connected via pc to the internet; this converts their movement into motion capture, which animates an avatar seen onscreen on a large portrait monitor. the two pcs are linked so that the player sees the performer's movement.
further links
gibson/martelli technical blog
www.spaceplace.gibsonmartelli.com
artists website
https://gibsonmartelli.com/
isadora performance software
https://troikatronix.com/
modina
https://modina.eu/
mechanoid muse
https://radiodrive.co/
skinner releasing network
https://skinnerreleasingnetwork.org/