
LM-2140W basic questions

COW Forums : Flanders Scientific

Craig Alan
LM-2140W basic questions
on Jun 28, 2012 at 8:45:56 am

Just got this monitor.

Been reading up here.

Bram said:

Here are some basic things to keep in mind:

1. You can always reset the monitor to default values by going to the system menu and loading the default profile. This will help to ensure that no user adjustments are impacting the calibrated settings of the monitor.

Just to be certain: does this mean the monitor can be permanently defaulted to a proper calibration, never needing to be recalibrated? Not affected by temperature changes or students fooling with settings? I can always reset to the default and be good to go?

I did read the article online about the luminance level changing with age. How many hours of use are we talking about before this makes a difference?

MacPro4,1 2.66GHz 8-core, 12GB of RAM. GPU: Nvidia GeForce GT120 with 512MB VRAM. OS X 10.6.x; Camcorders: Panasonic AG-HPX170, Sony Z7U, Canon HV30/40, Sony VX2000/PD170; FCP 6 certified; write professionally for a variety of media; teach video production in L.A.


Return to posts index

Bram Desmet
Re: LM-2140W basic questions
on Jun 28, 2012 at 11:33:35 am

All monitors will drift over time, but some are certainly more stable than others. The LM-2140W is one of the most stable monitors we've ever offered. The White Edge Lit LED backlight does not exhibit the same level/speed of luminance drop off you would find on monitors with a CCFL backlight. This means that certainly for the first 1 to 2 years (depending on use) you can essentially reset to default at any time and be within very tight tolerances of original factory settings.

Bram Desmet
FSI (Flanders Scientific, Inc.)
http://www.FlandersScientific.com



Bret Williams
Re: LM-2140W basic questions
on Jul 7, 2012 at 4:15:08 am

Hi Bram, I was thinking of getting the same monitor. I'm currently using the Intensity Extreme from Blackmagic with a consumer monitor that has HDMI input. I see that none of your monitors have HDMI input. Is that in the works, or is there some technical reason? That said, if I used the 2140 with the component out of the Intensity Extreme, is it calibrated at the factory for component as well as SDI?




Bram Desmet
Re: LM-2140W basic questions
on Jul 7, 2012 at 4:43:33 am

If you want to use the HDMI output you can simply use an inexpensive HDMI to DVI cable and plug directly into the monitor. The video portion of the signal is identical, so this is really just a connector adaptation and not a conversion of the signal, which is why the cables are not very expensive. We opt for a DVI connector simply because the licensing costs for HDMI are high and the vast majority of our customers do not need or use this input. That being said, I would suggest using the Analog Component output from your card instead. As a general rule for any professional monitoring application, I suggest using Serial Digital if available, then Analog Component if not, and I would only use HDMI if neither of the first two is available.

Bram Desmet
FSI (Flanders Scientific, Inc.)
http://www.FlandersScientific.com



Bret Williams
Re: LM-2140W basic questions
on Jul 7, 2012 at 4:46:30 am

Thanks Bram. So it is factory calibrated for the component input as well then, correct?



Bram Desmet
Re: LM-2140W basic questions
on Jul 7, 2012 at 5:03:16 am

Yes, the factory calibration is universal to all inputs.

Bram Desmet
FSI (Flanders Scientific, Inc.)
http://www.FlandersScientific.com




Craig Alan
Re: LM-2140W basic questions
on Jul 7, 2012 at 5:57:00 am

Bram,
I thought that the video output of HDMI and SDI were identical. I thought the only differences were: SDI uses secure BNC connections; SDI cables can be much longer and cheaper than HDMI cables; SDI can carry timecode; HDMI carries digital audio as well as video.

I much prefer using BNC connections, and HDMI is the most fragile/insecure connection of any cable. I'm not usually connecting my audio to the same gear that I am connecting my video, so the advantage of HDMI is actually an extra inconvenience -- i.e., a menu setting to output analog audio rather than through HDMI.

I thought Analog Component had a bunch of different standards.

So unless my assumptions are wrong: I understand why Flanders would choose DVI over HDMI, but I'm confused why you would recommend Analog Component over HDMI. Wouldn't that depend on the source?

I use component over HDMI when monitoring during a shoot; but in post, wouldn't an HDMI to SDI cable be more accurate than analog?




Bram Desmet
Re: LM-2140W basic questions
on Jul 7, 2012 at 11:59:57 am

There is no such thing as an HDMI to SDI cable. You would be using a converter, not just a cable, at that point that is actually converting your HDMI signal to SDI. An HDMI to DVI cable does not convert the video signal. With such a cable you will lose audio (DVI does not support audio), but the video portion remains the same and you are essentially just 'adapting' to a different connector type and not converting the signal.

SDI is almost always preferable.

If SDI is not available from your source then the two next best options are Analog Component or HDMI. Opinions may vary on what is best, but I would challenge the assumption that is sometimes made that HDMI must be better because it is digital. I'm not sure I understand your point about analog component having a bunch of different standards. Digital component does as well. Whether being fed over HDMI, SDI, or Analog Component, you can have a multitude of formats and may be dealing with YCbCr (YPbPr) or RGB signals.

The biggest inconvenience with Analog Component vs HDMI is that you have to deal with 3 cables instead of just 1. With this in mind, if you have a camera without SDI out that has analog component and HDMI outputs, I would completely understand using the HDMI output for on-set monitoring (that is what I would probably do as well in this case). However, as soon as you move me to an edit suite, or any environment where the extra cables are not a big concern, I would opt for Analog Component instead. I could go on for hours about all the reasons I hate HDMI for professional use, but one of the simplest justifications for using Analog Component over HDMI with specific respect to our monitors is that the DVI input is going to be just 8-bit. If your source is 8-bit then it's no big deal, but you don't have this limitation over Analog Component, so right there you have one advantage.
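To make the 8-bit limitation concrete, here's a minimal Python sketch (illustrative numbers only, not FSI code) showing how truncating a 10-bit signal to 8 bits coarsens a gradient:

```python
# A 10-bit ramp has 1024 code values; truncating to 8 bits leaves 256,
# so a smooth gradient gets 4x coarser steps (visible as banding).
ramp10 = list(range(1024))          # 10-bit code values 0..1023
ramp8 = [v >> 2 for v in ramp10]    # naive truncation to 8 bits

print(len(set(ramp10)))  # 1024 distinct levels
print(len(set(ramp8)))   # 256 distinct levels
```

The same source material therefore has four times fewer distinct shades available once it crosses an 8-bit-only link.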

The other big pain with HDMI (and DVI) is that they are essentially bi-directional signals. With SDI and Analog Component your card basically spits out a format and the monitor either accepts it or it does not. With HDMI and DVI the sending device almost always reads information from the monitor through something called the EDID protocol and adapts the signal to (theoretically) work best for the monitor. Even in the consumer world, where this is supposed to make life easy, it more often than not causes all sorts of headaches. In my experience the problems become even more frustrating when you start trying to use HDMI/DVI in professional applications. Just from a conceptual standpoint, I think having the source send out a signal as it intends, according to defined industry protocols, and then relying on the monitor to show this correctly, is more appropriate than having the source adapt to suit the monitor.

Now take all of this advice with a grain of salt. With your particular I/O device these issues may be minimal to non-existent, which is why I said 'as a general rule' when listing my connection type preferences (SDI, then Analog Component, then HDMI). Most of my issues with HDMI/DVI become much more prevalent when connecting directly from computer to monitor with no I/O device. A professional I/O device with HDMI out will mitigate many of these concerns. You still have the crappy connector, etc. to contend with, but I digress...

I guess I'm more or less sticking up for good old Analog Component more than ripping on HDMI. Yes, I believe SDI is better, but let's not forget that it was essentially designed to be a digitized version (YCbCr or RGB) of Analog Component (YPbPr or RGB), in addition to supporting all kinds of useful ancillary data. A high quality Analog Component signal will be (should be, anyway) mostly indistinguishable from its digital (SDI) counterpart. On many output devices (ranging from cameras to I/O boxes and everything in between) the Analog Component and Digital Component (SDI) signals actually come from the same exact chip before they hit either a Digital to Analog or Analog to Digital chip, depending on how the signal originates to start with... Any differences in the quality of the SDI and Analog Component outputs are usually attributable to the quality of these DtoA or AtoD chips. This is another reason I would emphasize my 'as a general rule' disclaimer, because some of the Digital to Analog or Analog to Digital conversion occurring in various devices really stinks and leads to poor quality output.

To boil it all down to something simpler than my above rambling, let me say that if you have identical quality HDMI, SDI, and Analog Component signals they will look virtually identical on our monitors. There may be some very subtle differences caused by the various processing hardware on our monitors, but essentially any differences in quality would be attributable to differences in the output device over these various connections. So your experiences may vary substantially depending on your output device, but all other things being equal the monitors are perfectly happy with any of these signal types and will show them identically, provided they are more or less identical in quality when sent to the monitor.

Bram Desmet
FSI (Flanders Scientific, Inc.)
http://www.FlandersScientific.com



Craig Alan
Re: LM-2140W basic questions
on Jul 7, 2012 at 5:18:45 pm

[Bram Desmet] "There is no such thing as an HDMI to SDI cable"

Sorry bout that. I meant DVI. I use AJA converters. But you answered the question about Analog Component vs. HDMI (or HDMI to DVI). Thank you.

Many cams have component out with a single D-shaped plug into the cam that then splits into the three component RCA plugs. The RCAs can be adapted to BNC. So it actually is a more secure way to do this in the field than having to deal with HDMI, which can slip out of either end.

Is that D shaped connection compromising the component signal in any way?

[Bram Desmet] "On many output devices (ranging from cameras to I/O boxes and everything in between) the Analog Component and Digital Component (SDI) signals actually come from the same exact chip before they either hit a Digital to Analog or Analog to Digital chip depending on how the signal originates to start with... Any differences in the quality of the SDI and Analog Component outputs is usually attributable to the quality of these DtoA or AtoD chips. "

So it would be possible to ask tech support for a camera, let's say, which type of conversion is taking place? I would assume in a prosumer-level camcorder the better signal would be the original signal, before needing to be converted? (I'm talking under $7,000 for the cam.)





Bram Desmet
Re: LM-2140W basic questions
on Jul 7, 2012 at 6:06:58 pm

No, the D-shaped connector does not really compromise the signal. In theory you could have signal degradation sooner if running very long cable lengths, but for all intents and purposes there is no real sacrifice there. And you are absolutely right, analog component over BNC connectors is a great choice, which is why our monitors use BNC, not RCA, style connectors for these inputs as well.

I think perhaps I'm overcomplicating this for you. If the camera has serial digital output you can pretty much be guaranteed that is your best bet. If it only has HDMI and Analog Component you may want to test both to see if there is a significant quality difference, but more often than not they will be very similar. HDMI just tends to be more finicky with respect to compatibility with various monitors, whereas component out to a monitor with component inputs tends to just work, with less fussing.

However, when we are not talking about cameras, but rather I/O devices for NLEs, etc., then I would suggest avoiding HDMI when and where possible. Unless you are just getting poor quality from your analog component output for some reason (like a poor quality D to A conversion) I would always opt for that over HDMI. Of course, with respect to such I/O devices for editing and color correction this is all quickly becoming a moot point, as equipment with Serial Digital output is quite affordable and becoming much more ubiquitous. If you have SDI, use that. If HDMI is not giving you any headaches then by all means that is also feasible, but if you ask around I think the general consensus you'll get is that HDMI is just a pain to use in professional environments, whereas the biggest headache with analog component is just dealing with some extra cables.

As with anything your mileage and personal experiences may vary, but I've never heard anyone say Analog component is the devil, whereas with HDMI that is often some of the kinder feedback you will hear.

Bram Desmet
FSI (Flanders Scientific, Inc.)
http://www.FlandersScientific.com



Bret Williams
Re: LM-2140W basic questions
on Jul 7, 2012 at 7:12:58 pm

This has been a great thread. With my Intensity, component and HDMI are identical. But I always thought component would be the lesser choice because I assumed it had to be converted to analog, then digitized again to be displayed by the LCD.

But it sounds like you lose 10-bit by going HDMI?



Bram Desmet
Re: LM-2140W basic questions
on Jul 7, 2012 at 7:24:33 pm

Correct, the DVI input to the monitor is limited to 8-bit, so with 8-bit sources it's no big deal, but with 10-bit source footage you may get better results with analog component.

Bram Desmet
FSI (Flanders Scientific, Inc.)
http://www.FlandersScientific.com




Bret Williams
Re: LM-2140W basic questions
on Jul 9, 2012 at 2:36:15 pm

Hi Bram. I'm looking at the website for ordering the LM-2140W. I notice the cheapest shipping is FedEx Home $40. I also notice you have instore pickup which I assume is up there in Suwanee. I'm in Roswell, GA. Long story short, what's the est delivery time for the FedEx Home to Roswell?



Bram Desmet
Re: LM-2140W basic questions
on Jul 9, 2012 at 3:41:32 pm

FedEx Home Delivery from Suwanee to Roswell is almost always just 1 day. You are also more than welcome to pick the unit up in Suwanee, GA. If you plan to pick up this week just give the office (678.835.4934) a bit of a heads up as we have new flooring being installed so prep time for package pickups may be a bit longer than usual. If you give them at least 2 hours notice that should be fine.

Bram Desmet
FSI (Flanders Scientific, Inc.)
http://www.FlandersScientific.com



Bret Williams
Re: LM-2140W basic questions
on Jul 20, 2012 at 2:39:27 pm

Hey Bram, got my LM2140 yesterday. Waiting on some BNC adapters to come so I can hook it up component, but in the meantime I hooked it up via HDMI while the old monitor was simultaneously hooked up component (via RCA plugs on the Extreme). So I discovered another reason not to use HDMI. The lag. With FCP the component and iMac screen are dead-on in sync, and the audio is too. With HDMI, it's a frame or two of lag, making all the audio look out of sync. Many on the COW post about having to run HDMI to a monitor, and then the audio from the monitor.

BTW - I have a high pitch annoying buzz coming from the back of the 2140 audible only in a quiet room. Is that normal? Anyone else have that?




Bram Desmet
Re: LM-2140W basic questions
on Jul 20, 2012 at 2:55:40 pm

The White LED backlight on the LM-2140W can make an audible buzz when set at the default setting of 35fL. Most people cannot hear this, especially in a normal working environment with other equipment. (No one over 40 in our office can hear it at all.) If you increase the backlight setting slightly (System Menu->Backlight) you will find that this slight buzz goes away.

The audio lag may have nothing to do with using HDMI. All LCD monitors, especially those actually showing your frame rate to you natively, have some degree of video processing delay. Most people use an offset in their video editing software to account for this, but you can also put the monitor into FAST processing mode to reduce the video processing latency. This pulls processing power away from ancillary features and menu display, so those will respond more slowly; most people prefer using normal mode. You can find many more details on this in our PDF user manual and user manual video at http://www.FlandersScientific.com. This is another reason so many people prefer to spend the extra money for a serial digital I/O device: you can embed the audio in the video stream and let the monitor de-embed it to its internal speakers, or even to a mixer or external speakers via the audio out on the back of the monitor. The advantage is that the monitor will then delay the audio by the exact amount of video processing time, so you will always have perfect lip sync.
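For anyone wanting to dial in that NLE offset by hand, here's a rough sketch of the frames-to-milliseconds arithmetic (the function name and numbers are mine, purely illustrative; measure your own monitor's actual delay):

```python
# Hypothetical helper: convert a monitor's video processing delay,
# measured in frames, into the audio offset to enter in an NLE.
def audio_offset_ms(delay_frames: float, fps: float) -> float:
    return delay_frames * 1000.0 / fps

# e.g. a 2-frame delay at 29.97 fps (30000/1001) is about 66.7 ms
print(round(audio_offset_ms(2, 30000 / 1001), 1))  # 66.7
# and a 1-frame delay at 25 fps is exactly 40 ms
print(audio_offset_ms(1, 25))  # 40.0
```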

Bram Desmet
FSI (Flanders Scientific, Inc.)
http://www.FlandersScientific.com



Bret Williams
Re: LM-2140W basic questions
on Jul 20, 2012 at 3:17:42 pm

I'm 42 and I can hear the backlight fairly well until 51, when it disappears. But I also hear the high-pitched whine of all the other monitors in the room. Used to that.

I didn't want to change any of the factory settings. Bringing the backlight up to 51 is noticeably brighter (but probably more in line with the normal computer/TV brightness I'm accustomed to). Is that a bad idea?



Bram Desmet
Re: LM-2140W basic questions
on Jul 20, 2012 at 3:31:02 pm

Your hearing is good! I hear the buzz too, but I'm closer to 30 than 40. That being said I only hear it if I have it in a very quiet room. As soon as I have it in a room with other equipment, especially my laptop with its fan it is pretty quickly drowned out.

The monitor ships with the peak white luminance set to 35fL, much dimmer than its maximum capability. This dimming is what causes the slight buzz. At 51 your peak white luminance will be closer to 50fL. Brighter than I would suggest for a dark color grading or editing suite, but actually quite well suited for slightly brighter working environments. For field use people commonly have this set anywhere from 50 to 70. It depends largely on your working environment. Generally I would suggest keeping it at default if you have a 'typical' dark editing environment.

Bram Desmet
FSI (Flanders Scientific, Inc.)
http://www.FlandersScientific.com



Bret Williams
Re: LM-2140W basic questions
on Jul 22, 2012 at 5:50:25 am

Interesting. A few things I've found and maybe you can tell me if they're normal (or anyone else with a 2140W).

Above 50 there is no noticeable difference in brightness; I can take it all the way up to 100 and it remains exactly the same. Is that normal? There's a decent difference between 35 and 50, but at 51 the buzz goes away, and any further increase does nothing.

Also, I've hooked it up via HDMI (w/ DVI adapter) and component. Both look exactly the same. (However, DVI did default to RGB. Had to switch that.) But the first thing I notice is the monitor is soft. Noticeably. Fine details are considerably less discernible than on any of the other 3 monitors in use: 1) the $250 Vizio it's replacing, 2) my iMac monitor, and 3) a cheap Acer second monitor. Playing 1080p at a 1:1 pixel ratio on any of those monitors results in a sharper image without having to tweak anything. I can crank the sharpness up to 15 and get it closer, but that makes the overall image too sharp and contrasty. Yet small text and fine details are still a bit less sharp than the others. 2 of the other 3 monitors are matte finish. I figured for $2,500 this thing should be the sharpest of the bunch. Is this normal? Is it some sort of super-duper matte finish that's blurring things?



Bram Desmet
Re: LM-2140W basic questions
on Jul 22, 2012 at 11:53:17 am

[Bret Williams] "Above 50 there is no noticeable difference in brightness"
That is not true. At 100 the peak white luminance of the monitor will be approximately 20 to 25fL brighter than at 50. For a variety of reasons you may not notice this difference as easily at higher backlight settings, but as verified by any of the $12,000 to $30,000 measuring devices at our facility there is most certainly a luminance difference at 50 vs 100. Keep in mind the following:
1. The backlight adjustment is not perfectly linear. As you approach the top half of the adjustment range the steps in the backlight adjustment will result in less and less of an increase in peak white luminance (most monitors behave that way by the way).
2. Even if the backlight adjustment on a monitor were linear your eye would have a much harder time noticing a 5fL difference when comparing 60 and 65fL than comparing 30 and 35fL.
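The second point is essentially Weber's law: the eye responds to relative, not absolute, luminance changes. A tiny illustrative sketch (the numbers are mine, not FSI measurements):

```python
# Weber contrast: the fractional change is what the eye notices,
# so the same absolute fL step shrinks perceptually as luminance rises.
def weber_contrast(low_fl: float, high_fl: float) -> float:
    return (high_fl - low_fl) / low_fl

print(round(weber_contrast(30, 35), 3))  # 0.167 -> ~17% jump, easy to see
print(round(weber_contrast(60, 65), 3))  # 0.083 -> ~8% jump, harder to see
```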

[Bret Williams] "I figured for $2500 this thing should be the sharpest of the bunch. "
Wrong! A professional monitor should not, and in our case does not, artificially sharpen or enhance your image. If you want pretty, not right, that is what consumer devices are for... The whole point of getting a professional reference monitor is so that you see the image as it really is, and not an image that has been manipulated, cleaned up, and sharpened. When people first buy a professional monitor, whether our brand or another, a very common piece of feedback is 'my footage looks noisier, softer, uglier, etc. than on my consumer devices,' but that is because the job of the professional monitor is not to make your content look pretty, it is to show you the image as it really is. We have a 'consumer' type mode on our monitors called Noise Reduction that you can switch to that emulates some of the basic things a consumer device does (noise reduction filter, additional sharpening, de-interlacing frame buffer with progressive output for interlaced material, etc.), but we never suggest operating in this mode for day-to-day editing because it WILL NOT show you the material as it really is. Even in this mode you may be looking at considerably less enhancement than the average consumer TV applies.

Bram Desmet
FSI (Flanders Scientific, Inc.)
http://www.FlandersScientific.com



Bret Williams
Re: LM-2140W basic questions
on Jul 22, 2012 at 2:46:23 pm

Sorry, I guess I should've been more specific. I wasn't really referring to footage, on which I would tend to agree, and in general I would tend to agree about the Vizio. But what I'm really referring to is text graphics: ProRes or uncompressed, via DVI or Component. Where consumer monitors seem to artificially crank the contrast, color, and sharpness to make them pop, they generally lose fine details as they're blooming like crazy, in my experience. Granted, I've more experience with the SD world, where my PVM-20 was immensely better than my Sony TV. No comparison.

But I figured uncompressed text, or say a square in Photoshop, should be a pixel-for-pixel expression. Not clearer on the computer LCD? Your scopes and such are crystal clear on the 2140, but if I created that exact graphic in PS, made it an uncompressed TIFF, and displayed it via component or DVI to the monitor, it would be soft. That's kinda what I'm getting at.

Anyway maybe I'm making too much of something. I guess I can take a look at some at the next cutters meeting or bring mine by for a comparison sometime. I'm just in Roswell. But busy as can be through the week.

All in all, fantastic monitor feature for feature. I'm loving all the scopes, pixel for pixel mode for truly seeing 720p footage or SD footage for what it is, and all the different title/action safe modes. I see it has quite a few preloaded for different network channels. Do they provide these templates for you to upload into the system?



Bram Desmet
Re: LM-2140W basic questions
on Jul 22, 2012 at 5:02:46 pm

Text and graphics are not immune from the same issues that impact standard 'footage'. Even the most basic of things like chroma subsampling will typically result in a perceptual difference vs the computer monitor.
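As a rough illustration of why subsampling softens hard edges, here's a toy Python sketch with made-up sample values (not the monitor's actual processing): 4:2:2 keeps one chroma sample per two pixels, so a sharp color transition gets averaged.

```python
# Toy example: Cb (blue-difference chroma) samples across a sharp colored edge.
cb_full = [90, 90, 90, 240, 240, 240]   # 4:4:4 -> one chroma sample per pixel

# 4:2:2 subsampling: keep one (averaged) chroma sample per pixel pair.
cb_422 = []
for i in range(0, len(cb_full), 2):
    avg = (cb_full[i] + cb_full[i + 1]) // 2
    cb_422 += [avg, avg]                # both pixels in the pair share it

print(cb_422)  # [90, 90, 165, 165, 240, 240] -> the edge is now smeared
```

The once-crisp 90-to-240 transition now passes through an intermediate 165, which reads as softness on fine colored text.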

Bram Desmet
FSI (Flanders Scientific, Inc.)
http://www.FlandersScientific.com



Bram Desmet
Re: LM-2140W basic questions
on Jul 22, 2012 at 7:29:52 pm

Most of the generic markers (4:3, 2.35:1, 2.4:1, etc.) have always been a part of the firmware. In addition to these, we introduced a custom marker program early last year. This allows customers to send us a marker template in .bmp format, which we then roll into the firmware for all new units that ship out and as a free firmware update for existing FSI customers. Some of these user-generated markers have come from networks directly; others have come from users that do work for these networks in a contracted/freelance capacity. In either case it is important to note that these are user-created markers, so you should always verify that the marker you plan on using matches up with your particular application or requirements. If you need any custom markers now or in the future you can check out this link for more details and we'll be happy to have those added for you: http://www.flandersscientific.com/index/Custom_Markers

Bram Desmet
FSI (Flanders Scientific, Inc.)
http://www.FlandersScientific.com



Bret Williams
Re: LM-2140W basic questions
on Jul 22, 2012 at 11:58:02 pm

Another question. I can't seem to get any audio meters. I've gone through the manual and have it set up right. For example, I have Function 2 set to Profile 2 for scopes and VU. Profile 2 for scopes and VU is set to show a VU audio meter. No matter what I do, it shows a waveform instead. I can set both windows to off and get nothing, so I've got the right preset/function. Is this a bug in the firmware? Do I need to update? I'm getting audio, because I can hear it on the little speakers. I've tried with audio lockout on and off.



Bram Desmet
Re: LM-2140W basic questions
on Jul 23, 2012 at 12:31:32 am

Aren't you monitoring via analog component and/or HDMI? VU meters on the monitor are for audio embedded in SDI, so if you are not feeding the monitor SDI you will not have audio level meters (see the manual for more info). The monitor will automatically deactivate VU meter windows on inputs other than SDI, as they wouldn't show anything anyway. As I think I mentioned in an earlier post, SDI-equipped I/O cards have many benefits, and they have become so affordable of late that I think they really are a worthwhile consideration for anyone making their living in professional video. Now if you already have a card without SDI that is one thing, but for others that run across this post and still need to buy a card, I would suggest spending the small premium for the SDI-equipped card or I/O box. With many of these newer cards you could even output 4:4:4 3Gbps SDI instead of the 4:2:2 output you are limited to with your current card. Also, if you could direct basic operation questions to support@flandersscientific.com that would be very helpful; our support folks' access to the COW is much more limited on the weekend than standard email.

Bram Desmet
FSI (Flanders Scientific, Inc.)
http://www.FlandersScientific.com



Bret Williams
Re: LM-2140W basic questions
on Jul 23, 2012 at 2:53:19 pm

Thanks Bram. That makes sense. I wondered how it could be properly calibrated for audio if it were an analog signal. Didn't find that in the manual. Odd that it throws a waveform up there though.

I'm actually eyeballing the yet to be released Ultrastudio Express. Analog, SDI, HDMI, all for $495. And of course DaVinci support. Now if we could get autodesk smoke support from something other than AJA that'd be great.

I just kept up the thread here on the Cow because it all becomes part of the world wide base of knowledge on the subject, searchable by google for others. :) And I'm not really in a rush. All good stuff. See ya at the cutters mtg if you're there.



Bram Desmet
Re: LM-2140W basic questions
on Jul 24, 2012 at 7:22:42 pm

UltraStudio Express looks promising. We have one UltraStudio 3D here at our office and regularly use it at trade shows and to demo 3D content. We've been pretty happy with it, and other than running incredibly hot (too hot to touch for any length of time) it has served us well. That being said, we also have 3 AJA IoXTs and use those on a day-to-day basis. Provided that you don't need DaVinci Resolve support, the IoXT has at least two big advantages that we have seen over the UltraStudio 3D:

1. Runs much cooler. It never gets too hot to touch.
2. Has 2 thunderbolt ports so you are not limited to using it as an end of chain device. This is the biggest advantage for us and why we use the AJA IoXTs on a daily basis and only use the UltraStudio for very specific tasks/demos.

Of course if you need DaVinci Resolve support this is all a moot point, but for those not using Resolve I can't say enough about how well the AJA IoXT has worked for us. Really looking forward to their T-Taps too as the ultimate portable I/O device.

Bram Desmet
FSI (Flanders Scientific, Inc.)
http://www.FlandersScientific.com



Bret Williams
Re: LM-2140W basic questions
on Jul 24, 2012 at 8:28:31 pm

With my iMac I have 2 Thunderbolt ports, so I'm not running into the issues MacBook Pro users have. I like the IoXT, but it's just too expensive. The Extreme doesn't get hot, and works with all my apps with the same driver -- CS6, FCP X, FCP legacy, and Avid too, if I had that.



Michael Heldman
Re: LM-2140W basic questions
on Jul 26, 2012 at 3:26:05 pm

Re: the question of HDMI vs. SDI. Here is another consideration: if I'm not mistaken, HDMI does not support 23.976 video.

Michael




Bret Williams
Re: LM-2140W basic questions
on Jul 26, 2012 at 4:16:43 pm

HDMI does support 24p. That's a big part of the whole 24p / LCD Blu-ray appeal.
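For reference, the "23.976" rate is just film-rate 24p slowed by the NTSC 1000/1001 factor; a quick sketch of the arithmetic:

```python
from fractions import Fraction

# NTSC-compatible film rate: 24 fps slowed by the 1000/1001 pulldown factor.
ntsc_film = Fraction(24, 1) * Fraction(1000, 1001)  # == 24000/1001

print(ntsc_film)                   # 24000/1001
print(round(float(ntsc_film), 3))  # 23.976
```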



Bram Desmet
Re: LM-2140W basic questions
on Jul 26, 2012 at 4:22:21 pm

I think what you have in mind is PsF, but as Bret pointed out 24P is not an issue.

Bram Desmet
FSI (Flanders Scientific, Inc.)
http://www.FlandersScientific.com



Bret Williams
Re: LM-2140W basic questions
on Jul 26, 2012 at 4:41:27 pm

And for anyone reading this lengthy thread, I did fix the sharpness issue. I wasn't nuts. A different HDMI to DVI adapter cable worked and gave the extra sharpness I had expected, matching the computer and Vizio. However, the component is still soft by comparison. It had always looked the same on the Vizio. Perhaps I need a different BNC to RCA adapter for that as well.

But it's looking like an SDI Blackmagic unit is the way to go, both for image and audio. I'll be able to get audio meters on the screen AND I'll be able to loop the audio through to be in sync with the video. Right now, audio to the monitor lags by a couple of frames compared to the audio out of the Intensity Extreme. I used to run HDMI through my Vizio and use the audio out to power the speakers, but the FSI has only DVI in (no audio). You can still loop audio through, but it doesn't delay it, so it looks like it's time to grow up and get some more big boy gear.


Bram Desmet
Re: LM-2140W basic questions
on Aug 3, 2012 at 12:28:43 pm

Hello to anyone coming across this thread.

Let me start by saying thank you to Bret for taking the time to try different setups, etc., which spurred me to do some fresh side-by-side comparison testing of my own with our AJA IoXT feeding the monitor's analog component and SDI inputs, as well as testing a BMD UltraStudio 3D with SDI output.

Long story short, I think I blundered in the order of my list of preferred connection setups. My overzealous dislike of HDMI caused me to suggest using analog component out of a professional I/O device over using HDMI out of the same device. I was wrong: the HDMI output will provide you with better quality and a sharper image than analog component.

So, my updated preferred connection list:
1. SDI. No doubt about this one, it is the best option if available.
2. HDMI out from professional I/O device (Intensity Extreme, etc.) if SDI is not available.
3. Analog Component out from I/O device if SDI, HDMI, or other Digital Output is not available.
4. HDMI out directly from computer if no professional I/O device is available, though I still would advise against this if at all possible.
5. Composite out of I/O device.

Most of the problems with HDMI that I mentioned earlier in this thread relate to connecting directly from your computer's graphics card output and don't really apply to a professional I/O device with HDMI output. Though SDI is still preferable if you have it available, some cards only offer HDMI and analog component output. In such cases I can confirm that you will get better quality output to our monitors using HDMI.

Honestly, I should have known better. When you are using analog component in the type of setup being discussed in this thread, the signal has to go through a D-to-A conversion in the output device and then an A-to-D conversion in the monitor. It shouldn't really be a surprise that going through two such conversions in the signal chain gives you poorer results than an all-digital signal path.

I will say this, though: I ended up testing some other devices with both digital and analog component output, and the AJA IoXT gave us the most similar image between digital and analog signal chains. So similar that the slightly sharper image in the SDI signal vs. the analog component signal may have more to do with the monitor than with the AJA output device. On most footage there was no easily discernible difference when using the IoXT. Sharp graphics overlays and text are where you could most easily see the somewhat sharper image over SDI vs. component. Of the variety of other devices I tested, some clearly had poorer quality component output vs. SDI and/or HDMI output, even when viewing standard footage with no graphics or text.

With SDI output devices becoming so affordable as of late, I suppose a lot of this is a moot point for people buying new cards, as you will almost certainly have, and want to use, the SDI output and maintain an all-digital signal path. However, if you buy or already own a card that only has HDMI and analog component out, then using HDMI is probably best.

Bram Desmet
FSI (Flanders Scientific, Inc.)
http://www.FlandersScientific.com


Elliott Balsley
Re: LM-2140W basic questions
on Oct 31, 2012 at 3:50:45 am

I just stumbled upon this thread. Glad I read the whole thing, or I might have been confused by the misinformation earlier about analog vs. HDMI. Bram, your latest explanation makes more sense to me, with the A/D conversions losing quality. But is it still true that the DVI input is limited to 8 bit? If that's the case, then analog component might still be preferable when color accuracy is more important than sharpness (as is usually the case in a grading suite).


Bram Desmet
Re: LM-2140W basic questions
on Oct 31, 2012 at 12:14:36 pm

Correct: HDMI will give you more sharpness, but analog component will give you more color depth. However, the real answer to your question is that you should, if at all possible, use Serial Digital, as that will give you the best of both worlds. SDI I/O cards/boxes have come down so sharply in price that, unless your computer just does not support them at all, it is probably the single best investment you can make in your signal pipeline.

Bram Desmet
FSI (Flanders Scientific, Inc.)
http://www.FlandersScientific.com

