I don't understand the difference between media servers and standalone playback
I'm very interested in live show visual FX such as projection mapping and projected set design for live concerts and shows. Recently I stumbled upon a real-time rendering software package called Notch, which is made for content creation and live playback, but it seems geared toward big-time show producers and artists. I'm having trouble deciphering some of the lingo in their software description. For one thing, I need to understand the difference between a media server and standalone playback, as the software advertises that it supports both. Would anyone explain the difference to me? Does a media server take the place of a computer, for example? I only know about outputting to a projector from my laptop, and I can't seem to wrap my head around the idea of a media server. I would appreciate it if somebody could dumb things down for me and explain what it's used for, and perhaps have a look at the site:
"As well as media servers, Notch Playback enables the use of executables exported from Builder Pro. Invaluable for interactive installations or kiosks."
But Builder Pro seems like an entirely different product from Playback, which is part of why I'm confused. They license Builder and Playback separately... I plan to write an email to the company to clarify this bit, but if anyone can break down the media server vs. standalone playback issue for me, I'd really appreciate it!
[Hilary Tsai] "For one thing, I need to understand the difference between a media server and standalone playback, as the software advertises that it supports both."
Think about the way that web servers serve web pages, or file servers serve files. A client connects to a server, makes a request, and the server fulfills it.
Your browser says, "Hey, CreativeCOW, I'd like the Live & Stage events forum." The COW's server responds, "Ok, here you go." You say, "Great, thanks. Can you show me posts by thread instead of by topic?" The server responds, "Sure thing, here you go."
A media server does the same thing with live requests for media. A connected client makes media-related requests, and the media server fulfills them. Media servers don't just allow you to launch playback for media assets; they allow you to manipulate many of their properties in real time. "Hey media server, can you start playback on this media file?" "Great, now can you make it partially transparent?" "Cool, can you blend it additively?" "Let's scale it down to 50% over the next 10 seconds."
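To make that conversation concrete, here's a toy sketch of that request/response model in Python. The `MediaServer` class and its commands (`play`, `set_opacity`, and so on) are invented purely for illustration; real media servers expose this kind of control through their own cueing interfaces and network protocols, not through an API like this one.

```python
class MediaServer:
    """Toy media server that tracks playback state per named layer.

    Purely illustrative: real servers handle decoding, compositing,
    and output, and animate property changes over time.
    """

    def __init__(self):
        self.layers = {}

    def play(self, layer, media_file):
        # "Hey media server, can you start playback on this media file?"
        self.layers[layer] = {
            "media": media_file,
            "opacity": 1.0,
            "blend": "normal",
            "scale": 1.0,
        }

    def set_opacity(self, layer, opacity):
        # "Can you make it partially transparent?" (0.0 to 1.0)
        self.layers[layer]["opacity"] = opacity

    def set_blend_mode(self, layer, mode):
        # "Can you blend it additively?" (e.g. "normal", "additive")
        self.layers[layer]["blend"] = mode

    def scale_over(self, layer, target_scale, seconds):
        # "Scale it down to 50% over the next 10 seconds."
        # A real server would animate this; here we just store the end state.
        self.layers[layer]["scale"] = target_scale


server = MediaServer()
server.play("layer1", "intro_loop.mov")
server.set_opacity("layer1", 0.5)
server.set_blend_mode("layer1", "additive")
server.scale_over("layer1", 0.5, 10)
```

The point of the sketch is just the shape of the interaction: the client issues short requests, and the server owns the media state and fulfills them in real time.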
These are simplified examples; advanced media servers can think in 3D, they can work in networks to support massive, synchronous pixel spaces, they can handle projection blending and warping, they can handle multiple independent timelines with independent cueing, they can use live inputs (from systems like BlackTrax), etc.
Notch is a real-time, generative visual system, and it's capable of being embedded in a few specific media servers (disguise, Green Hippo, Avolites Ai, and 7thSense). That means that rather than driving a physical output directly the way your laptop does, Notch Blocks can be used like other media elements in the server, and their properties can be manipulated in real time with the media server's tools.
For Notch specifically, standalone playback lets you run Notch Blocks without embedding the system in a media server.
Designer & Mad Scientist at Keen Live [link]
Motion Graphics, Widescreen Events, Presentation Design, and Consulting
@keenlive [twitter] | RenderBreak [blog] | Profile [LinkedIn]