iPhone Facial Capture with Unreal Engine | Unreal Fest Online 2020

>> Ryan: Hello! My name is Ryan Mayeda, and I'm on the Product Team for Unreal Engine, where my focus is on virtual production. So as you might guess, that's what we're here to talk about today. Specifically, we're going to cover real-time facial capture with iPhone using our new app, Live Link Face. To do that, I'm going to enlist the help of a familiar face, someone you might recognize from the UE5 reveal. She's going to act as my avatar, and we're going to do the presentation together. We'll mainly be taking a tour of the iOS app, leaving other deeper topics, like the animation and rigging setup, as well as timecode and hardcore stage configuration, for a future talk. We'll touch on them a bit, but these are definitely areas we want to cover later as folks get further into the facial capture pipeline.

All right, let's get started. I'll go ahead and launch Live Link Face. And you can see that she comes alive right away with my face driving hers through the app. Check it out. Nailed it. This may go without saying, but I'm going to say it anyway. One thing to call out is that the app uses the iPhone front-facing camera. So my video is mirrored, which in turn means that our head turns will look reversed onscreen since she's actually copying me. If I turn my head to the right, she's turning her head to her right. And I think you get the idea.

Next, I'll put my marketing hat on and make sure to be clear: the app is out now. And hopefully, everyone's already downloaded it. If you haven't, grab it from the Apple App Store ASAP. It's free. The only real requirement is that your iPhone has a TrueDepth camera, which is needed for the ARKit face tracking. The basic rule of thumb is that if your iPhone doesn't have a Home button, you should be all good.

Our big goal for the app was to take the Face AR Sample that tons of people have already had good success with, including us at Epic, and productize the iOS portion so teams don't have to build and deploy the app themselves. This isn't necessarily that straightforward to do, and we found that a lot of our customers don't have that expertise in-house. Wherever possible, we want to empower people to focus on creating the characters, the performances, and the content in general. And that's what led us down this path. The Face AR Sample is still very much with us, though. And our old friend, the kite boy, is still a great example asset to start with, even though he's sitting this presentation out.

Let's get into the app. But first, a bit of housekeeping. In order to use the features we're showing today, there are a few plugins you need to enable. First up is Live Link. It might be self-evident — Live Link is in the name of the app itself — but the facial capture stream is coming in over the Live Link protocol, so you're, of course, going to need that plugin on. Next up is the Live Link Curve Debug UI. Technically this isn't required for the system to work, but this plugin is super useful, especially if you're doing the animation and character setup. It really lets you diagnose the data coming in, figure out how it's being applied to your character, look at the blendshapes, that sort of thing. So it's super helpful, and we highly recommend having it on. ARKit and ARKit Face Support — again, pretty self-explanatory since we are using the ARKit face tracking. You're going to need those on for sure. Take Recorder is what we're going to use to record the Live Link stream in Unreal Engine and get it into Sequencer or work with it as animation. And last but not least is the Timed Data Monitor. This is a new plugin you may not be familiar with. It's in 4.25. It's super cool, and it's related to something I was alluding to earlier, where the full timecode workflow is really its own demo. But in this case, we're going to use this new plugin to visualize the timecode coming in from the iPhone into Unreal Engine.
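For reference, plugins are normally enabled from Edit > Plugins in the editor, but a quick way to double-check a project is a few lines of Python that read the .uproject file and report which of these plugins are listed. This is only a sketch: the internal plugin names below are best guesses at how these plugins are registered (confirm them in the Plugins browser), the project filename is hypothetical, and plugins that are enabled by default may not appear in the .uproject at all, so treat a "not listed" result as a prompt to check in the editor rather than a definitive answer.

```python
import json

# Best-guess internal names for the plugins mentioned above; verify them
# in Edit > Plugins inside the Unreal Editor.
REQUIRED = [
    "LiveLink",
    "LiveLinkCurveDebugUI",
    "AppleARKit",
    "AppleARKitFaceSupport",
    "TakeRecorder",
    "TimedDataMonitor",
]

# Hypothetical project file path, used only for illustration.
with open("MyFacialCaptureProject.uproject") as f:
    project = json.load(f)

# Collect the plugins the project file explicitly enables.
enabled = {p["Name"] for p in project.get("Plugins", []) if p.get("Enabled", False)}

for name in REQUIRED:
    status = "enabled" if name in enabled else "not listed (check Edit > Plugins)"
    print(f"{name}: {status}")
```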
All right, let's jump into the main screen. We have kind of a social-style UX, very much inspired by other apps that use face tracking. We felt like this would make it both familiar and fun to people as they were starting out, even though we don't have any filters. You can do those in Unreal.

In terms of the main screen, the most important thing to notice first is the big green "LIVE" at the top. Green means you're on air, or streaming data, right? So my performance is being sent over to her in the Engine. If we tap this, we can pause the stream, so now, no matter what I do, it's not going to carry over into Unreal Engine. This can be a handy thing to do if you need to go offline for whatever reason, take a break, whatever. It's just a tap away. Let's go back to live for presentation purposes.

Next up, in the top right, are some related features, but a little more oriented around battery life. So right now, we have both video on, right? You can see me. We have the face tracking on, and it's being sent over to her. If I tap this once, we're going to toggle the video off but leave the tracking on. The video tends to use up a little bit more battery, so it's something you may want to turn off during a shoot while you keep the capture going. In addition, we found that some performers maybe don't like to see their face when they're acting, right? It can be distracting. So if that's the case, you can toggle the video off, and then they can just focus on their lines, et cetera. The last option here is both the video off and the face tracking off, right? So this is super battery saver mode. It might be something you want to toggle in between takes, in between setups, or if you're taking a break but keeping the actors in costume or keeping everything rigged up — that's an option there to help prolong the life of the phone for the shoot. Let's turn everything back on for our presentation.

The next thing we'll mention is timecode. No matter what, you're always going to see a readout of the current timecode right there underneath the LIVE button. We're going to get into this in more depth shortly as we dive into the settings. But at minimum, you can always see that you're getting timecode, what it is, and a little icon of where it's coming from.

Down in the bottom right is face detection. This is green when the app is detecting a face. Obviously, it's detecting mine right now. But if it loses the face for whatever reason, then that's going to go gray. We can demonstrate this by being safe and masking up here. So if I put my mask on, the app no longer recognizes my face, or a face, and now we're offline. Even though we are technically still streaming, the app isn't finding a face to actually stream. If I take this off, we'll be back again — it picks up very quickly. But just a note, maybe not super quarantine friendly.

Another thing to mention, down at the bottom middle, are slate and take.
So we're going to come back to this later when we do some recording. But I just wanted to call out quickly that you can tap this and modify the slate and take by hand directly in the app. Obviously, in general, it's best practice to keep this aligned and paired up with what you have in Take Recorder. But there is the option here to set it yourself in the app directly. Also down here is the Live Link Subject Name, and that's a good segue into the full settings menu, which is up at the top left. So let's jump in there next.

We have a lot of settings. We're trying to keep this as user friendly as possible, but we also want to make sure we're offering a robust set of features that help people ranging from entry-level folks, who may not touch these very much, to professional stages that are going to really get into the nitty-gritty.

The first thing we'll hit is Live Link and the Subject Name, which you just saw exposed at the bottom of the main screen. By default, the subject name is the name of the device, the name of the phone. You can see my phone is Pandaaaaaaaa with eight As. You can override it if you want. And here let's quickly jump out and just show what this looks like in the Engine. I'll go out of full screen and pop over to the Live Link panel. We have a source, which is the ARKit face tracking. Down here is our subject name, which is Pandaaaaaaaa, matching exactly what we have up here in the app for the subject name. And that's how you make that tie. Pop back into full screen.

Next up is network targets. Our main workflow idea for the app is to multicast the data stream of the faces to all machines in a potential multi-user session. The goal here is to minimize latency. We've really designed the app to excel in a sort of collaborative virtual production with multiple machines. Maybe you have one that's recording, one that's doing VCam, one that's focused on stage operations, and maybe you're scouting, right? So each of those machines is getting the facial tracking data as fast as possible, and then we synchronize with timecode.

The network targets are based on an IP/port combination. In this case, I'm a single-user, single-machine setup, just with my home box here. You can find your IP really easily. Just open a command prompt and hit ipconfig, right? I'm sending over Wi-Fi, so I'm going to look at my wireless adapter. My IP address here is 192.168.1.17 on the home network, and that's what I've entered in as the target. The default port is 11111. You probably only need to change it if your IT team tells you to. And if you want to enter into a multi-user session — you have multiple machines — you're just going to add a target and punch in another IP address here. Then cancel out of that and close the settings.
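As an aside, if you'd rather not dig through ipconfig output, a few lines of Python will report the address your machine would use on the local network; this is the address you'd type into the app's target list. This is just a convenience sketch, not part of the app or the Engine.

```python
import socket

# "Connecting" a UDP socket doesn't send any packets, but it forces the OS
# to pick the local interface (and IP) it would route through.
s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
try:
    s.connect(("8.8.8.8", 80))   # any routable address works here
    print("Enter this IP as a Live Link target:", s.getsockname()[0])
finally:
    s.close()
```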
Next up is the Live Link protocol. In Unreal 4.25, we made some improvements here and added support for fractional frames, which improves accuracy and helps avoid the possibility of duplicate frames. But this is also something that you need to be aware of for backwards compatibility purposes. You can use the app with older versions of Unreal, right? Right now, I'm using 4.25, but if I wanted to use an earlier project, like maybe a 4.24 project, then I can go in here and change the protocol. The important thing to point out here is that it's very specific: you need an exact match between the versions for the selected protocol. Otherwise, the stream won't work. So because I'm in a 4.25 project, if I change it to 4.24 here and let that setting take hold, I lose the stream. Everything looks like it's set up correctly, but I'm not getting any facial tracking data. So that's something to keep an eye on. If you are using different versions of Unreal, this is something you're going to need to make sure has an exact match. So let's switch ourselves back to 4.25, get the stream back — here we go — and then close out our Live Link settings talk with the last item, which is Live at Startup. This one's pretty straightforward. You noticed that when we launched the app, we started streaming right away, and that's based on this toggle. If you don't want that to happen — you want it to be, like, a two-step process where you open the app and then opt in to streaming — then just toggle that off.

The next setting to go through is timecode. This is a big feature from the pro perspective. We have three different options for timecode, and they build in complexity and sophistication. First up is the System Timer. This comes from the clock of the phone itself, so it may look like a non-standard, weird timecode. In this case, there's like a 94 in it. That's not something that you frequently see on a stage. And this is because the system timer is based on how long the phone has been on. I haven't rebooted my phone for a long time — that's why it says 94 hours.

The other two options are a little bit more of that classic, expected timecode format. Those are NTP and Tentacle Sync. We'll go through NTP first, right? NTP lets you synchronize timecode with a time server. The default we provide is the Apple one, the standard one, the same thing that your phone uses to figure out what time it is based on what Apple thinks it is. We've seen some productions rely on using their own NTP server on stage, and they use this to synchronize all the devices. So this can be a really easy way to get time-of-day timecode. Note that right now in LA, it's about 8:20, and that's what you see here.

Last up is Tentacle Sync. This is definitely the most pro option. It enables the app to synchronize with a master clock hardware device on stage, and the way we do that is with this device called a Tentacle Sync that we've done an integration with. This is what a Tentacle Sync looks like. It's a little lightweight Bluetooth guy, and it connects to a master clock. So I've got one here, using the UltraSync. You can kind of see this thing here, right? I know it's mirrored, but the timecode on this guy should match what you see there, right? 4:21:27. The way it works is that the master clock drives the Tentacle Sync, and then the Tentacle Sync, through Bluetooth, tells the iPhone what the timecode should be. The Tentacle device is really cool. We actually use this on our own shoots. And one thing to note is that if you have a shoot with multiple performers and multiple iPhones, you should be able to use the same Tentacle Sync to drive the timecode on all of them. You don't need a one-to-one relationship between Tentacle Syncs and phones. So that's pretty cool.

The last thing I want to show here on timecode is — popping back out into the editor — the Timed Data Monitor, which I mentioned before. This is the new plugin we were talking about, and this is just to show that the timecode is indeed coming through, and this is what we're getting. So nothing up our sleeves — timecode coming through.
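As a quick aside on the NTP option: if you want to sanity-check what a time server hands back before pointing the app at it, a few lines of Python will do it. This is only an illustrative sketch; it uses the third-party ntplib package (pip install ntplib), and time.apple.com is the Apple time server the app is described as using by default.

```python
import ntplib                      # third-party: pip install ntplib
from datetime import datetime, timezone

client = ntplib.NTPClient()
response = client.request("time.apple.com", version=3)

# tx_time is the server's transmit timestamp (seconds since the Unix epoch);
# time-of-day timecode is derived from a clock like this one.
utc = datetime.fromtimestamp(response.tx_time, tz=timezone.utc)
print("NTP server time (UTC):", utc.isoformat())
print("Offset vs. this machine's clock (s):", round(response.offset, 3))
```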
What you see on the phone is what you get into Unreal Engine. I'm actually going to switch this back to NTP. I always feel a little bit funny when it's not time-of-day timecode. It just looks strange to me. All right, back to the settings.

OSC is up next. We're actually going to park that and come back to it. OSC is a remote control set of features that lets you control the app externally. But that's going to be our grand finale, so look out for that at the end.

Next up is Stream Head Rotation. This is something that you'll want to turn off when you're in the mocap suit. If I turn this off here — so now she's very still. None of the head rotation I had before is coming through. When you're in a mocap suit, or you have a body mocap solution available to you, you're going to want to get the head rotation from the actual body mocap. If you have them both on, then the data you get from ARKit is going to fight with the data from the body mocap solution, and you're probably going to get bad results. So if you're combining the app with, say, a head rig or something like that, you're obviously going to want to turn this off. But since I'm at my desk, I'm kind of more streamer style, so I'm going to leave that on. It gives a little bit more life in this scenario. You can also see how much I bob my head when I talk, which is kind of embarrassing, but I will leave that in.

Next up are some display options. First up here is the Preview Mesh. If you turn this on, you're going to get a little bit more direct feedback on what ARKit is seeing, right? It sees my face, and you can see the exact placement of it. It's a more obvious sign that we have data coming in. It also looks a little bit creepy, so I didn't want to do the whole presentation this way.

The Record Button option lets you toggle the record button on and off. This is something we'll come back to later when we hit recording. But the idea here is that if you were controlling the app externally, then you don't want the actors or whoever accidentally bumping the record button, so you can just take it off entirely. Let's leave it on for now.

The Blendshape Data display is more for debug. If I turn this on, I'm going to see a bunch of data here. And it's actually a good way to show — hopefully it comes through — that down in the bottom right is the head yaw, head pitch, and head roll. If I move my head around, you can see those numbers moving. But if I go back and turn head rotation off — right, so now you can see those are all zeroed out. No matter what I do, they're zeroed out. That means the app is not sending that data over to Unreal Engine. So let's turn this back on and turn blendshape data off.

Yeah, so Take Recorder — this setting is the kind of format that's displayed. There are two options here: Slate & Take and Filename. The default is Slate & Take. It's the more editorial form; in this case, we're slate 49D, take 1. But you can also change it to the filename. This is more of the pipeline-centric view. Some people really don't want spaces, so there's an underscore for you. This reflects the name of the file that's going to be later extracted from the phone. But I like the editorial view, so we'll leave it as that.

And then the overlay — we've had it on the whole time. It's all the stuff around here: "LIVE," the different toggles.
But you can also set it to fade out. So if you don't want to see that — if you wait a couple of seconds, then that overlay is going to go away. Kind of a better example of this is when you have the video off, right? This is maybe the most likely scenario that the performer is going to see. We have the video off, we have the overlay off, but we still let them see what time it is and also the battery. Oftentimes the performer is the one who will catch that the battery is almost dead, kind of like my phone here. But I'm going to leave that on because it's easier and better for the demo.

And the last thing to mention is the reference video. We do record reference video every time you record, and there is an option to adjust the quality and file size. We're going to show that stuff right now, because we're actually going to record a take. So with that, let's record a take. Let's turn the video back on. All right.

We're going to record a take in Take Recorder and from the iPhone manually. So the first thing we're going to do here is switch over to Take Recorder. We've already set our slate and take to be the same thing; we always want consistency between the two. In this case, we're just going to record a Live Link track for us here. So now I'm going to hit Record in Take Recorder, and I'm going to hit Record on the phone. And here's our take. OK, this is a demo recording that I've kicked off manually in both Take Recorder and on the iPhone, and yeah. We'll cut.

Now let's take a look at it in Sequencer. I didn't end up recording audio in Sequencer, but it is recorded in the reference video. It's not really set up to record audio on my computer, but let's just play this back. You can see the action going here. Perhaps I should have recorded a shorter take, but my words at the end also came through.

Now we'll pop onto the phone to take a look at it over there. Live Link Face has this take browser feature as well. In the bottom left, you can pop into here. This is showing all the takes I've already recorded. Not surprisingly, they're all of me. You can browse them by all takes, by slate, or by date, right? You can also do a search. We know we were slate 49D, so if I type "D," I can jump over to mine. And here's the take we just did. The video, as I just mentioned, has audio in it, but it also has timecode embedded. And the goal here is to take the reference video and be able to line it up to the recorded capture. So let's play it just as an example.

[VIDEO PLAYBACK]
- OK, this is a demo recording that I've kicked off manually in both Take Recorder and on the iPhone, and yeah.
[END PLAYBACK]

>> Ryan: All right, it's beautiful. Another thing to note here is that you can transfer the takes off of the phone using the iOS Activity View. If I hit the blue arrow up here, it's going to bring up a bunch of different options that will zip up the data and potentially send it over. AirDrop is generally the best and the fastest if you have a Mac. But we also have some other options. You can do Google Drive, Box — there's a bunch of different options that let you upload the data as well as just move it over AirDrop. But AirDrop is generally the best.

Next up, we're going to take a look at an example take on disk. We'll kind of cheat and look at a take that I've already moved over, right? For each take, you're going to end up with a series of files. There are two main ones, really. First is a video, and then the second is a CSV.

The first thing we'll show is that video. I actually have it open already. This is an example of the reference video you get. I'll play it through here.

[VIDEO PLAYBACK]
- Here's an example recording to show that it has both timecode and audio.
[END PLAYBACK]

>> Ryan: So I'm going to disable the — let's mute me. I don't think we need to hear that again. But it's also meant to show that we are getting timecode in the media that comes in. The QuickTime itself uses a JPEG codec, so there's no temporal compression; it's frame-accurate. And again, the aim here is to take the reference video and let you line it up to the recorded capture for reference. So it's something that your animators may want to use, or just something to check that things are working the way you expected.

The other thing that you get out of the take data is a CSV. The CSV here shows all of the blendshape data that's recorded. So it's kind of like a raw recording, or a backup, for what you might get in Take Recorder. Let's make this a little bit bigger. Right, so for each timecode and fractional frame, which you can see here, you're getting basically all of the raw blendshape data. This is all the same data that you saw in the overlay earlier. And the idea here is that this is just the reference, the backup. We don't necessarily have a native importer just yet, but it is possible to take this data and write a Python script with the Unreal Python API to import it. This will let you basically create a Live Link track as if it was recorded by Take Recorder. In the future, we'd like to have a native importer, but for now we're just shipping the iOS app, with more functionality to come later. We've also seen some people start to automate some of their pipeline tasks with the CSV. So your mileage may vary, and it'll be interesting to see what people build.
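To make that concrete, here is a minimal sketch of reading the per-frame blendshape values out of the take CSV with plain Python. The layout is assumed from the description above (a timecode column followed by one column per blendshape curve) and the filename is hypothetical; check an actual exported file for the exact headers. Pushing the values into Unreal as a Live Link track would then go through the Unreal Python API, which isn't shown here.

```python
import csv

# Hypothetical path to a take CSV exported from Live Link Face.
TAKE_CSV = "6D_1.csv"

with open(TAKE_CSV, newline="") as f:
    reader = csv.reader(f)
    header = next(reader)          # assumed: timecode column, then one column per curve
    curve_names = header[1:]

    for row in reader:
        timecode = row[0]
        # One float per blendshape curve for this (fractional) frame.
        values = dict(zip(curve_names, map(float, row[1:])))
        # Print a few curves per frame just to verify the data looks sane.
        print(timecode, {name: values[name] for name in curve_names[:3]})
```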
So now we've reached the grand finale. I'll close out the CSV, pop back into the editor, and go full screen. We're going to close with a demo of how Live Link Face can be remote controlled by an external application using OSC. To do that, we'll revisit our OSC settings. It's already enabled. OSC stands for Open Sound Control. It's a common messaging interface popular in virtual production and, honestly, fairly widely used across a whole bunch of different industries. It uses the same IP address and port-style setup as Live Link for network communication. The Listener section here has to do with how the iPhone app listens for commands from the OSC server and then acts on them. We conveniently display the iPhone's IP address here because you're probably going to have to enter it into the control application.

And this is a two-way setup. We can also go the opposite direction and send messages out. To do that, you're going to need to set up an IP and port for the target. This could be useful for more of a one-man-band-type scenario where the performer wants to initiate something — maybe the performer wants to hit Record on the phone, and then that triggers recording on a whole bunch of other machines outside. Both directions are supported. But in this case, we're going to demo it with a separate remote control app talking to the iPhone, telling it to record, and using the right naming convention.
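For a rough idea of what that OSC traffic looks like, here is a small Python sketch using the third-party python-osc package. The address patterns ("/Slate", "/Take", "/RecordStart") are assumptions for illustration only — confirm the exact command set in the Live Link Face documentation for your app version — and the IP and port are placeholders to be read off the phone's OSC settings. The Switchboard demo that follows is doing the same kind of thing under the hood: sending OSC messages to the phone's listener address rather than touching the phone directly.

```python
from pythonosc.udp_client import SimpleUDPClient   # third-party: pip install python-osc

# Placeholders: the IP shown in the app's OSC settings and the port it listens on.
PHONE_IP = "192.168.1.50"
PHONE_PORT = 8000

client = SimpleUDPClient(PHONE_IP, PHONE_PORT)

# Set the slate and take, then start a recording. The addresses below are
# illustrative guesses; verify the real command names against the app docs.
client.send_message("/Slate", "6D")
client.send_message("/Take", 1)
client.send_message("/RecordStart", ["6D", 1])
```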
So the last thing we'll toggle is the Record button. We're going to turn the Record button off just to demonstrate how we record a take from a remote app without interacting with the phone directly at all.

Let's bring up our remote control app, called Switchboard. Switchboard is a PySide app. It was developed by the Fortnite Shorts team, so if you saw the SIGGRAPH talk they did last summer, they demoed an early version of it, and we've kept working on it since. This is the current state, kind of a preview of something that we hope to share and release as part of 4.26. But in the meantime, it's the best way to visually present OSC in action. It's a little more production-y compared to me just typing in OSC messages at a command line, so we thought this would show a little better.

A quick thing to note here is the IP address. This is, again, the IP address that I got out of the iPhone app — plugged it in here, and this is where the messages are going to be sent. For a future demo, we'll probably start to touch on other things here, like the multi-user aspect of Switchboard, but that's for another day. The main thing I want to show here is just changing the slate and take, right? So this is 49D, take 2 — the take after the one we just recorded earlier. I'm just going to go ahead and change it to something different, right? So 6D, take 1. And you can see that it immediately updates the iPhone, and I haven't touched the iPhone at all. Next, I'll just go ahead and record, right? So now it's going. This is a take that we initiated from Switchboard. It's recording me, and we're going to take a look at it right now.

All right, so now we'll pop back onto the phone, jump into the take browser here, and scroll down. We can see my brand-new take, slate 6D, take 1. And here we go.

[VIDEO PLAYBACK]
- This is a take that we initiated from Switchboard. It's recording me, and we're going to take a look at it right now.
[END PLAYBACK]

>> Ryan: So, I didn't think I could top my earlier take, but I somehow did. And in any case, this concludes the presentation. As a reminder, the app is available and free in the Apple App Store. With this app, we're really aiming to make facial capture easier and more accessible to creators going forward. And we're super, super, super excited to see what you do with it. Thank you so much for watching the presentation. I really appreciate it. We have a Q&A up next, and I'm looking forward to interacting with you directly online.
