Besides all the other advice, here's a basic procedure you can start from:
Put a piece of styrofoam on the stage, point the cameras at the foam, do a white balance, then check each camera against the others with a wipe on your mixer to confirm the white looks the same (cam 1 left / cam 2 right, and so on).
I believe "OnLocation" has a waveform monitor, so you can check the exposure there too.
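The wipe check above can also be done numerically if you have frame grabs from each camera. Here's a hypothetical sketch (not any vendor's tool; the frames below are synthetic NumPy arrays standing in for real captures) that averages the R, G, B channels over a center patch where the styrofoam sits, then reports the worst per-channel mismatch between two cameras:

```python
import numpy as np

def channel_means(frame, patch=50):
    """Average R, G, B over a patch in the center of the frame,
    where the styrofoam target should be."""
    h, w, _ = frame.shape
    cy, cx = h // 2, w // 2
    region = frame[cy - patch:cy + patch, cx - patch:cx + patch]
    return region.reshape(-1, 3).mean(axis=0)

def wb_mismatch(frame_a, frame_b):
    """Largest per-channel difference between the two cameras' whites.
    Near zero means the wipe on the mixer should look seamless."""
    return float(np.abs(channel_means(frame_a) - channel_means(frame_b)).max())

# Two synthetic frames: cam 1 neutral white, cam 2 slightly warm.
cam1 = np.full((480, 640, 3), 235.0)
cam2 = np.full((480, 640, 3), 235.0)
cam2[..., 0] += 8.0  # red channel lifted: cam 2 reads too warm

print(wb_mismatch(cam1, cam1))  # 0.0 -- perfect match
print(wb_mismatch(cam1, cam2))  # 8.0 -- visible warm cast on cam 2
```

This is just the same idea as the wipe, expressed as numbers: if the means differ, re-balance the camera that's off.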
"As close as possible" is the operating term here.
Understand that unless you're shooting something with relatively flat "high key" lighting, there's an excellent chance that even identical cameras, set up identically, will produce different (sometimes VERY different) pictures.
Think about it like this.
Let's say you're videotaping a band on stage.
A camera at the FOH position will likely see a well lit array of faces against whatever background the set designer specified. Move that camera to the stage right wings and it's likely shooting across the stage into the darkened wings stage left.
Not only will the background be different, but the light hitting the performers' faces from the front will read as SIDELIGHT to a camera in the stage right wings. And the far half of every face might be in shadow.
So everything about those two shots will be different: the exposure, the color balance, the distance from subject to camera. It might be VERY difficult to match them.
If I were shading the cameras, I might choose to brighten the wing camera to add some detail to the large areas of shadow in that shot, but that would make its signal stronger than the front stage shot. Then 10 seconds later, that camera op might swing to the shiny cymbals and need to iris down to compensate.
Just saying that sometimes, no matter what you do, you can't expect two camera shots to stay precisely matched. You've got to treat them as individual signal sources, judge what you see on the monitors, and decide on the best compromise.