I use a Panasonic DVX100a, a G5, a ViewSonic A75f CRT, and Final Cut for shooting and editing wedding videos. I know it's ghetto, but I have been using a 14-inch consumer Panasonic television (through a Sony digital/analog converter) as a second monitor to see what the image might look like on a TV. (I know this is probably a TERRIBLY unstable way to gauge what my product will look like when my customers pop in their finished DVDs and watch them on their TVs or computers.)
I know it might not seem like it, but my image output is really dear to me, and I would love to find an economical way to help myself (and my customers) by delivering the best product I can. This has become more crucial as my photography skills move to more nuanced levels!
Is my ghetto monitor hurting me more than helping me? Am I ridiculous for calibrating with the color-bar technique? Are there solutions or tips in the ballpark of a "few" hundred dollars or LESS that would translate into more consistent image quality?
It just breaks my heart to know that I spend so much time properly exposing clips, and sometimes color-grading them, only to see the image pick up a nasty orange tint or harsh contrast on some TV set. Thanks!
[james klatt] "It just breaks my heart to know that I spend so much time properly exposing clips and sometimes color-grading to see it get some nasty orange tint or super contrast on some TV set. Thanks!"
You have to get over that; welcome to the real world of TV. All you can do is make sure what you turn out is the best it can be. If they watch it on a crappy TV and it looks bad as a result, that's their problem; you can take pride in knowing you gave them a quality product.
I have always preferred a TV monitor with a means to turn off filtering, such as comb filters, color filters, etc. I want my monitor to show me what I have, not what it can do to make the picture look good. Then I can tweak and know I'm getting the best I can.
Always remember, mediocre video can look great on an expensive Sony with skin tone correction, etc.
Personally, I prefer Ikegami monitors because they gave me the ability to turn off the filtering so I could see what I had, but I don't currently have one because this company won't spend $10,000-$20,000 on a monitor.
The moral of this story is that either an expensive or an inexpensive monitor can give you what you want; the inexpensive one just takes more time to set up, since it has fewer setup controls.
I probably didn't actually answer your question, but gave you food for thought.
[james klatt] "... Is my ghetto monitor hurting me more than helping me? Am I ridiculous in that I calibrate with the color bar technique? ..."
In addition to Charlie's excellent comments, a CRT TV can be useful if you keep it at least semi-calibrated using SMPTE color bars. I say semi-calibrated because most TVs don't have easily-accessible controls required for proper calibration, and as Charlie says you usually can't turn off most TVs' picture "enhancement" features.
You can use a piece of full CTB blue gel instead of a blue-only switch (since your TV doesn't have one) when performing the calibration. This technique is described on the following website:
All the best,
Consider buying a used monitor.
You might consider a 14" Sony PVM-series monitor; these units can be picked up used for anywhere between $400 and $700.
A quality monitor will be a good investment if you really desire to do critical quality work for your clients.
1- Used monitors: I recommend against this. A used monitor can pick up a number of problems on top of phosphor wear.
Phosphor wear: As the phosphors in the monitor get hit by electrons, they undergo changes and become dimmer. This means the monitor gets dimmer (OK), but the gamma response/curve for R, G, and B will also change (color shifts).
A used monitor can be ok if it has low hours and you run some test patterns through it to check for problems (convergence, focus, burns, etc.). Really high-end monitors like BVM-series Sonys and Ikegamis accept calibration probes to compensate for phosphor wear.
Monitor calibration: A CTB gel won't give very good results for calibrating to color bars. A true blue-only gel (like the specific Wratten gel they mention) may be better, but I haven't tried it. The blue-gel trick will still be slightly off compared to a blue-only gun or automatic calibration.
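For anyone wondering why the blue-gel/blue-only trick works at all, here's a small sketch (not from the posts above, just an illustration) of the blue-channel math in standard 75% SMPTE bars. The four bars that contain blue all carry the same blue-channel level, so viewed through a blue-only gun or filter they should appear equally bright once chroma and hue are set correctly; a CTB gel leaks some red and green, which is why it's only approximate:

```python
# Illustrative sketch: blue-channel levels in 75% SMPTE color bars.
# In 8-bit RGB, 75% amplitude is roughly 191 (75% of 255).
BARS_75 = {
    "white":   (191, 191, 191),
    "yellow":  (191, 191, 0),
    "cyan":    (0, 191, 191),
    "green":   (0, 191, 0),
    "magenta": (191, 0, 191),
    "red":     (191, 0, 0),
    "blue":    (0, 0, 191),
}

# Looking only at the blue channel, four bars light up at the SAME level
# and three go black -- that's the pattern you match against the flag
# patches when calibrating in blue-only mode.
blue_levels = {name: rgb[2] for name, rgb in BARS_75.items()}
lit = [name for name, b in blue_levels.items() if b > 0]
print(lit)  # ['white', 'cyan', 'magenta', 'blue']
```

If the four lit bars don't look equally bright through the filter, chroma (saturation) and hue are off; brightness and contrast are then set separately using the PLUGE patches.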
2- From a practical standpoint, it probably makes a lot of sense to get a small broadcast monitor designed for field use (ikegami, sony, JVC). Look for blue gun, 16:9, underscan. The monitor could be used in the studio and you know you are getting a fairly accurate picture.
As well, it can be useful in the field for checking exposure and focus and whether the boom mic is in the shot (that's what underscan is for). A field monitor can be very helpful.
A field monitor runs about $600 USD plus shipping (expensive). A battery will cost extra.
Avoid LCD, stick with CRT.
3- Studio setup is just as important as having a broadcast monitor. Things to look for:
A- 7.5 IRE setup. For FCP-specific info, check out http://www.kenstone.net/fcp_homepage/video_levels_nattress.html
Unfortunately the 7.5IRE setup/pedestal issue is VERY confusing and there's lots of misinformation about it.
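To make the setup/pedestal issue a bit more concrete, here's a hedged sketch of the underlying level math (my illustration, assuming standard Rec. 601 8-bit levels where black is code 16 and white is code 235). Digital video carries no setup; the 7.5 IRE pedestal is added (or not) when the signal is converted to analog NTSC, and a mismatch between the two conventions is a classic cause of washed-out or crushed blacks:

```python
# Sketch: mapping 8-bit digital luma codes (Rec. 601: black=16, white=235)
# to analog NTSC IRE units, with and without 7.5 IRE setup.
def code_to_ire(code, setup=True):
    """Convert an 8-bit luma code value to IRE.

    With setup, digital black (16) lands at 7.5 IRE; without it,
    black lands at 0 IRE. White (235) maps to 100 IRE either way.
    """
    black, white = 16, 235
    bottom = 7.5 if setup else 0.0
    return bottom + (code - black) * (100.0 - bottom) / (white - black)

print(code_to_ire(16))          # 7.5   -> black with setup
print(code_to_ire(16, False))   # 0.0   -> black without setup
print(code_to_ire(235))         # 100.0 -> white either way
```

The practical upshot: if your DA converter adds setup but your monitor (or your FCP levels) assume none, blacks sit 7.5 IRE too high and look milky; the reverse crushes shadow detail.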
B- All light sources (including computer monitors) in the room have the same color temperature. *This may be a little difficult to achieve.
Ideally, the color temperature would be 6500K, but that may not be such a big deal.
C- The line of sight around the monitor should be a neutral gray.
D- Try to minimize glare on the monitor. Turn it off and look at it to check.