TLDR:
Your question was "what if this information is breached?"
The answer is "the grown ups in the biometrics space assume it is by default."
An intrinsic part of their security model is an additional check, every time authentication happens, to distinguish real people from replays and replicas. As long as replays and replicas are rejected, the secrecy of the actual face, fingerprint, etc. isn't part of the security model.
For background, I worked for 3 years as a developer in the research team of a biometrics startup. The industry definitely has its fair share of crackpots and mavericks, and I got to hear all sorts of unlikely claims and philosophically dubious standards for measuring their effectiveness. As I mentioned in a comment, if your biometric security system relies on the face (or fingerprint or whatever) being secret, you're one of the quacks. Likewise, if you're dismissing biometrics because they're not kept secret, you're wasting your time arguing with quacks.
However, there are grown ups involved too. The major players do know what they're doing, and are suitably dismissive of the quacks. By major players, I mean companies like Apple and government agencies like NIST. On top of that, almost everyone uses biometrics; they just don't use state-of-the-art tech for it.
Here's how it works.
You want to start a new job, and before anything else happens you're asked for some sort of government ID with a photo. Why the photo? Because they want to check that you (the human) match the ID (the photo). Keep this distinction in mind: even though most face recognition systems can match two photos of faces, biometrics is specifically about matching a human. HR or IT security or whoever it is has to check two things: you look like the photo, and you're a human.
Likewise, every non-quack biometric authentication system has to check these two things. There will be a matcher, and there will be a presentation attack detection system (PADS). The matcher confirms that you look like the stored photo (or stored mathematical representation, in whatever sense), and the PADS is responsible for checking that you're not just a photograph. For example, the iPhone's Face ID uses an infrared dot projector and directly measures the 3D structure of your face, as well as using the camera to check you look like you. Other PAD systems measure other properties: perhaps motion, temperature, heartbeat, electrical capacitance, or some combination. The goal is to identify properties that humans have by default but that are hard, expensive work to forge.
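To make the split concrete, here's a minimal sketch of how the two checks combine. The function names and thresholds are made up for illustration, not any vendor's API; the point is that the PAD gate runs regardless of how good the match is, because a perfect match against a photograph is exactly the attack it exists to stop.

```python
# Hypothetical sketch: names and thresholds are illustrative only.

def pad_score(capture) -> float:
    """Likelihood the capture came from a live human (depth, motion, IR, ...)."""
    raise NotImplementedError  # depends entirely on the sensor hardware

def match_score(capture, enrolled_template) -> float:
    """Similarity between the capture and the enrolled template."""
    raise NotImplementedError

def authenticate(capture, enrolled_template,
                 pad_threshold: float = 0.9,
                 match_threshold: float = 0.8) -> bool:
    # Reject replays/replicas first: a photo of you matches you perfectly,
    # so the matcher alone can never catch this attack.
    if pad_score(capture) < pad_threshold:
        return False
    # Only then ask whether the live human looks like the enrolled person.
    return match_score(capture, enrolled_template) >= match_threshold
```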
If you use, say, a banking app that uses Face ID, it doesn't forward your face to the bank for checking. That would be fairly pointless: all the bank could verify is that someone has a picture of your face. In fact, Apple won't even let the app access that data; the biometric data never leaves the phone. Instead the phone verifies the person, and then sends a suitable message to the bank to the effect of "I, Josiah's phone, confirm that I have just seen a person, and the person looks like Josiah." (Probably with an additional "And I'm signing this message with my private key." for good measure.)
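The shape of that exchange can be sketched too. This is not Apple's actual protocol (the real thing looks more like WebAuthn/passkey-style attestation with keys held in secure hardware); the field names below are invented for illustration.

```python
# Illustrative only: the device signs an assertion instead of sending biometric data.
# Requires the 'cryptography' package; on a real device the private key would
# never leave secure hardware.
import json
import time
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

device_key = Ed25519PrivateKey.generate()  # registered with the bank at enrolment

def make_assertion(user_id: str, bank_challenge: bytes) -> dict:
    """Built only after the on-device matcher and PAD have both passed."""
    payload = json.dumps({
        "user": user_id,
        "challenge": bank_challenge.hex(),  # bank-supplied nonce, stops replays
        "verified_at": int(time.time()),
    }).encode()
    return {"payload": payload, "signature": device_key.sign(payload)}

# The bank checks the signature against the device's registered public key:
# device_key.public_key().verify(assertion["signature"], assertion["payload"])
# At no point does it see, or need, the face itself.
```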
In terms of performance, matcher software has made incredible progress in the last few years. For example, during my time in the industry, the state of the art in face matchers got about a thousand times better (as measured on NIST's FRVT benchmark). They're far better than that HR official who checked your passport and set you up with your company account in the first place. In fact, they're at the level of performance where they could successfully distinguish many people from every other human being on the planet. That's really impressive for identification, but it's still not the antidote to malicious spoofing.
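For a sense of what "matching" means mechanically: modern matchers reduce a face to an embedding vector, compare embeddings with a similarity score, and pick an accept threshold calibrated on benchmarks like FRVT to hit a target false match rate. The sketch below is purely illustrative; the numbers are not from any real system.

```python
import numpy as np

def match_score(probe: np.ndarray, enrolled: np.ndarray) -> float:
    """Cosine similarity between two face embeddings."""
    return float(np.dot(probe, enrolled) /
                 (np.linalg.norm(probe) * np.linalg.norm(enrolled)))

# The operating threshold is chosen empirically, e.g. the lowest score at which
# impostor pairs in a benchmark are accepted no more than about 1 time in 1,000,000.
MATCH_THRESHOLD = 0.6  # illustrative value only
```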
PAD systems also continue to improve. This is more of a mixed bag, because their performance depends so much on what hardware they're using; Apple's fancy IR projector will beat camera-only systems that rely on, say, asking the person to blink. Generally, PAD systems are still the weakest link, but a strong PAD system still moves the typical attack overhead from "pull up their Facebook profile picture and take a snapshot" to "gather a team of experts and set them a multi-week 3D fabrication project." On top of that, of course, you need access to the validation system: if we're assuming a setup like "you log in by doing Face ID on your phone," then you need their phone. Now, that's perhaps still faster/cheaper than breaking the password of the sort of person who spends time on security.stackexchange, but it's a whole lot slower/less scalable than just trying "qwerty" as the password for each of the employees in the company you want to break into.
To sum up, your question was "what if this information is breached?"
The answer is "the grown ups in the biometrics space assume it is by default."
That's what the PADS is for. It doesn't have to be a high-tech PADS. In some settings, a human monitoring the camera stations, watching for charlatans holding a printout to the camera, is a reasonable PADS. If you don't have a PADS, if just knowing what someone looks like means you can impersonate them, then you don't have a biometric authentication system. You just have a moronic password system where everyone's password is tattooed onto their forehead. But if you do have a good PADS, you have a system that can offer a good level of security at an excellent level of convenience, even for the sort of person who asks why they can't leave the password field blank.
I would be remiss if I did not clarify that biometrics is not solely an authentication technology, and other uses do not always require a PADS. When the police match fingerprints from a crime scene against a database, they don't check that the prints are attached to a human. When a casino uses face recognition to look for known card counters, they take for granted that no-one is trying to impersonate a counter. For these sorts of uses, it comes down entirely to matcher performance. It is strictly for authentication that the PADS is key.
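For completeness, those use cases are 1:N identification (search a whole gallery for the closest match) rather than the 1:1 check described above, and they stand or fall on the matcher alone. A hedged sketch, reusing the illustrative `match_score` from earlier:

```python
# Illustrative 1:N identification: no PAD step, just the best gallery match.
def identify(probe, gallery: dict, threshold: float):
    """Return the identity of the closest gallery template above threshold, else None."""
    best_id, best_score = None, threshold
    for identity, template in gallery.items():
        score = match_score(probe, template)  # same illustrative matcher as above
        if score > best_score:
            best_id, best_score = identity, score
    return best_id
```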