I'm not the only one who notices issues with the lip sync in the cutscenes, right? Is it possible to fix that? It's no big deal, I guess, but it's something you can't help but notice. Well, that's my opinion; what do you guys think?
This is a Japanese game. The lips are sync'd to the Japanese audio. Therefore, it will look awkward when they're not speaking Japanese.
Yeah, but there are Japanese games that do have English lip sync when they made the English audio, so it is possible. Three games off the top of my head would be FFX, the Mario series (whenever there is talking), and Dark Chronicle (one of my favorites).
Ask anyone who has to translate a script from one language to another while the video stays the same. Not only do they have to translate the idea of what's being said, but they also have to find the right words to line up with the lip movements on film. Sometimes there just aren't any, and you either let the lips not sync correctly or drastically change what's being said.
I think they reanimated the lips and synced them to English when those games were released in English-dominant countries.
These are a bit different: they either match the dialogue to the lip flaps, which are much more simplistic than XIV's (usually just bobbing up and down, so you only have to match the "rhythm," which is also easier to alter), or they completely redo the game for each version.
However, you simply CAN'T redo the scenes for each and every region, since this is an MMO. A dialogue-heavy one, no less. Each version is accessing the same game files, just with a different text overlay. Changing it would involve releasing, at a minimum, 4 different versions of each patch and would require each scene to be worked on 4 different times. It's like reshooting a movie each time, except the actors have to be physically controlled by the director. It's incredibly tedious.
On a single-player game, this is feasible because you have the time to do so. For an MMO, you have very strict time tables and must keep a constant stream of content to remain competitive.
lol, nope, that would be way too expensive to do, because not only would you have to reanimate the mouth, but also the entire lower jaw.
The English scripts for perfectly synced voice-overs are usually fairly different from direct translations. A good example of this is many anime: if you turn on English subtitles and English audio, you'll start to get really confused when what's being said doesn't match the subtitles except for maybe one or two words.
Indeed. It usually goes: Meaning > Timing > Literal translation.
There's a reason it's called "Localization" and not just "Translation."
That said, the one thing I do believe is that they'd have an easier time matching mouth flaps if they didn't fill the script with so many squeenixisms. >_>
X? Uh, no.
XIII I believe had done so, though. In this modern day and age, it should be feasible to go through and adjust lip-syncing, but it's a tedious task that requires time. Time, time, time. Which is one thing MMOs don't have very much of when it comes to the production pipeline, at least in regards to when content is going through its localisation stages.
Since the game uses generic "lip flaps" (as kyuven put it) instead of having the mouth movements match any one particular language, it's never bothered me that much.
Gets even worse these days when people want more and more detail. When the mouths only had a few points of articulation, going back and adjusting it was feasible (just not practical since barely anyone cared).
These days they're talking about using motion capture for mouth movements. Which...yeah isn't going to create the best environment for redoing lip movements to meet the script.
Again, this is mostly attributable to limitations with the characters' mouths.
The models are frighteningly close to the uncanny valley, so it's really hard to make them look natural. Again, because mouth articulation is an incredibly detailed process.
Unless I'm lipsyncing for my life I can deal with it. I'm usually noticing dodgy graphics issues during cutscenes anyway. XD
I would rather have the main story line voice work completed.
Everyone out of the way, animator coming through. Let's clear up some misconceptions.
1: This game isn't lipsynced even in Japanese aside from certain scenes (such as the ending cutscenes in the 2.55 patch). However, the mouth movements stop closer to when the talking stops in Japanese, while in English characters may talk without the mouth moving and vice versa.
2: FFX wasn't lipsynced in English, however the actors tried to match the Japanese mouth movements with the English translation. This is what led to some of the dialogue coming off as strange - Yuna's actress for example is often ignorantly called bad because some of her scenes seem awkward, but she was really just trying to match Yuna's mouth flaps.
3: Most AAA Square Enix games starting with Kingdom Hearts are lipsynced to English though, for example FFXIII. If the budget is lower, English lipsync is one of the first things to get cut, as was the case with KH Re: Chain of Memories (there was also another complication in that case, though).
4: There are two primary ways to lipsync. You can animate mouth flaps and have actors try to roughly match them, or record voices first and animate to the voices. The latter is more expensive and professional.
5: However, there are multiple degrees of quality in the lipsync of 3D models - it is NOT true that animators have to painstakingly animate the whole jaw by hand in every lipsynced cutscene, like some people have said. This is an almost entirely procedural process where animators only have to move some simple sliders around for each movement - how automated it is depends on the game. The FF13 series, for example, had almost completely automated lipsync, so every random NPC was lipsynced at all times. However, the more automated the process, the more mechanical and lifeless the character looks. The most time-consuming facial animation in FF14 is seen in the Hildibrand quests and the aforementioned ending sequence - these are some of the few places where they actually animated things by hand.
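For anyone curious what those "simple sliders" look like in practice, here's a minimal Python sketch of slider-based lip sync: keyframing mouth-shape weights from phoneme timings. Every name here (the phoneme codes, the slider labels) is made up for illustration; this shows the general idea, not SE's actual pipeline.

    # Each viseme is a slider: 0.0 = neutral, 1.0 = fully formed mouth shape.
    # Hypothetical mapping; real rigs define their own shape sets.
    VISEME_FOR_PHONEME = {
        "AA": "jaw_open",   # as in "father"
        "IY": "wide",       # as in "see"
        "UW": "pucker",     # as in "boot"
        "M":  "closed",     # lips pressed together
        "F":  "lip_bite",   # lower lip against upper teeth
    }

    def keyframes_from_phonemes(phonemes):
        """phonemes: list of (phoneme, start_sec, end_sec) tuples.
        Returns per-slider keyframes as (time, weight) pairs."""
        tracks = {}
        for phoneme, start, end in phonemes:
            slider = VISEME_FOR_PHONEME.get(phoneme, "jaw_open")
            track = tracks.setdefault(slider, [])
            # Ramp the slider up at the phoneme onset and back down at its
            # end; the engine interpolates between keyframes.
            track.append((start, 0.0))
            track.append((start + (end - start) * 0.3, 1.0))
            track.append((end, 0.0))
        return tracks

    # Example: the word "me" -> M then IY
    print(keyframes_from_phonemes([("M", 0.00, 0.12), ("IY", 0.12, 0.35)]))

Whether a human or an automated tool drives those weights is what separates the hand-animated Hildibrand scenes from the fully automated FF13-style approach.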
The mouth movements in this game are just random: different types of mouth movements that don't sync up to any audio. This is also true for your character whenever you type something in a linkshell, etc. You say something like "hi!" and your chara makes about 5-6 extra syllables. lol
So really, in the voiced cutscenes, they're moving their mouths much in the same way they are in the un-voiced ones...randomly generated movements.
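Going only by the behaviour described above (typing "hi!" produces several extra syllables), a "random flap" generator could be as simple as the sketch below. Purely a guess for illustration; none of these names come from the game.

    import random

    MOUTH_POSES = ["closed", "half_open", "open", "wide"]

    def random_flaps(duration_sec, flaps_per_sec=6.0, seed=None):
        """Pick an arbitrary mouth pose per 'syllable' slot, ignoring what
        is actually being said or played."""
        rng = random.Random(seed)
        count = int(duration_sec * flaps_per_sec)
        timeline = []
        t = 0.0
        for _ in range(count):
            timeline.append((round(t, 2), rng.choice(MOUTH_POSES)))
            t += 1.0 / flaps_per_sec
        timeline.append((round(t, 2), "closed"))  # settle shut at the end
        return timeline

    # One second of "dialogue" gets ~6 arbitrary mouth shapes:
    print(random_flaps(1.0, seed=42))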
I don't work with SE, but being familiar with the animation industry, I can almost certainly say that this was done for simplicity's sake. This isn't a console release, like FFXIII, where the English localization actually went back into the motion capture studio to re-record mouth movements to match the English audio. This is an MMO, and as such, they cut corners where they can.
If they did as requested and synced the mouth movements to the audio, which one would it be? Remember, everyone in the world uses the same game files, just different "text" based on the language of choice. If they sync to JP, the English, French, and German lips would still be off, and so on for the others.
Syncing all 4 languages would mean much more development time...heck, just doing motion capture for ONE language would be time-consuming enough. They would have to go in and actually change the game files...meaning each language would need to have its own client...or have extra files in there somewhere. It's not as easy as just selecting the audio option in your settings. Pretty sure this is why they left it the way it is currently...with random mouth movements that don't match up to any language.
Just a little tidbit: in version 1.0, the only selectable audio was English, with the subtitle text changing according to client language (even for the Japanese, imagine that!). Since only English was recorded, the lip movements DID match their speech. :D It's most visible in the starting cutscenes for the three cities (Gridania's has a few closeups of Yda when she's talking). Take a look on youtube if you're bored.
On the subject of actually animating lip movements themselves, this is a fairly simple process for a big company like SE. They need only set up their motion capture studio for facial capture, clean up the data, and apply it to the 3D models. If done right, there are hardly any manual adjustments that need to be made.
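As a small illustration of the "clean up the data" step: raw capture curves come in jittery, and a centered moving average is one simple way to smooth them before they drive the facial rig. A generic sketch, not SE's tooling:

    def smooth(track, window=3):
        """track: per-frame marker values (e.g. jaw height).
        Returns a copy smoothed with a centered moving average."""
        half = window // 2
        out = []
        for i in range(len(track)):
            start = max(0, i - half)
            end = min(len(track), i + half + 1)
            out.append(sum(track[start:end]) / (end - start))
        return out

    # Noisy capture of a mouth gradually opening:
    raw = [0.0, 0.3, 0.1, 0.4, 0.2, 0.6, 0.5, 0.8]
    print([round(v, 2) for v in smooth(raw)])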
Uh... You guys should've noticed this, but in most cutscenes, they use pre-baked animations and blend them together. Most animations are actually taken straight from the emote list. I wouldn't be surprised if they had pre-baked lip flaps and slapped them in there just the same. Why? Budget reasons. It would cost them a lot more money (or time) to use fresh animation for every cutscene.
Pre-baking is basically reusing animation keyframes that have been used in other shots, or pulling from a pool of pre-existing animation takes (motion capture data, or hand-keyed) like those from the emote list or character attack moves in battle.
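For a concrete (if toy) picture of that blending, here's a sketch that crossfades the tail of one stored clip into the head of another. Frames are simplified to single floats; real clips hold full skeleton poses, and the clip names are invented:

    def crossfade(clip_a, clip_b, blend_frames):
        """Linearly blend the last blend_frames of clip_a into the first
        blend_frames of clip_b, returning one continuous clip."""
        out = clip_a[:-blend_frames]
        for i in range(blend_frames):
            w = (i + 1) / blend_frames  # blend weight ramps 0 -> 1
            a = clip_a[len(clip_a) - blend_frames + i]
            b = clip_b[i]
            out.append(a * (1 - w) + b * w)
        out.extend(clip_b[blend_frames:])
        return out

    # Stitch a stock "nod" emote into a stock "talk" loop:
    nod = [0.0, 0.2, 0.5, 0.2, 0.0]
    talk = [0.1, 0.4, 0.1, 0.4, 0.1]
    print(crossfade(nod, talk, blend_frames=2))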
Of course, there are a few instances where they animate a scene specifically from scratch. Expect those to have proper lip sync to the Japanese audio.
They're using a pretty basic lip-syncing algorithm that just moves the lips in a semi-random manner whenever audio is played. This is much more budget-efficient than reanimating the lips for every language. You can notice it in scenes where the character makes a scream or a laugh and they didn't use the pre-baked animation. Example: Grynewaht's scream. Even the English scene has the same issue.
Is it possible to have a lip-syncing algorithm that works perfectly for any language? Yes; Soul Calibur 2 has had one since 2002.
English voice
Japanese voice
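A language-agnostic algorithm like the one credited to Soul Calibur 2 above can be as simple as opening the jaw in proportion to how loud the voice track is at each frame, since loudness follows speech regardless of language. Here's a minimal sketch of that idea; the frame rate and scaling factor are arbitrary, and this is a guess at the general technique, not that game's actual code:

    import math

    def jaw_openness(samples, sample_rate, fps=30):
        """samples: mono audio as floats in [-1, 1].
        Returns one jaw value per video frame, 0 (shut) to 1 (wide open)."""
        per_frame = sample_rate // fps
        values = []
        for start in range(0, len(samples), per_frame):
            window = samples[start:start + per_frame]
            # RMS loudness of this frame's slice of audio
            rms = math.sqrt(sum(s * s for s in window) / len(window))
            values.append(min(1.0, rms * 4.0))  # scale and clamp
        return values

    # Fake half a second of audio: silence, then a loud burst (a "scream"):
    audio = [0.0] * 8000 + [math.sin(i * 0.3) * 0.8 for i in range(3050)]
    print([round(v, 2) for v in jaw_openness(audio, sample_rate=22050)])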