online/offline editing

  • Thread starter: <DummyHead>
Status
Not open for further replies.

<DummyHead> (Guest)
What's the difference between online and offline editing? :P
 
Offline: Offline is a workflow process where the primary goal is producing a list of edit decisions. The type of equipment or the price of the equipment is totally irrelevant to the definition. You may end up with a program videotape for client approval, but the primary purpose is not to create a program for distribution, but to make creative decisions and reduce the amount of time spent in an online environment.

Online: Online is a process where the final result is an edited program ready for distribution. This process can be a VHS-to-VHS edit or a sophisticated, composited piece created in a very expensive edit suite. It does not matter. The result of an online edit is a completed program, ready for viewing or distribution.
 
Offline is generally used as a rough cut.

Online is a term which used to mean in a room with all the bells and whistles hooked up: switcher, Chyron, ESS, DVE, etc. Now it doesn't mean much since online rooms are mostly a thing of the past.
 
Another popular use for the terms in these days of NLEs - offline is a low-resolution edit of the final product. It's done at lower res to take advantage of storage space and faster render-free editing. It is then either "up-res'd" or the EDL is taken to a higher end machine for "finishing".
Online is considered editing at full-res - actually editing the final product.
 
As an Offline/Online editor, I'd say that mostly it is used in the NLE context that Videodoc mentioned now.

However, all three descriptions above are accurate. Broadly, online is considered the final edit and compile before program layback. The specifics of what that entails will vary depending on the production and production company.
 
Maybe putting all this into a more historical context would help.

The terms online and offline originated in tape-to-tape editing, where, as Icarus mentioned, you had some edit rooms that had all the bells and whistles and other booths that were basic cuts, maybe some dissolves. The idea was that you put together a rough cut in the simpler offline room, without all the fancy graphics and transitions, because time rented in those rooms was cheaper. Offline rooms were cheaper to equip, so a facility could afford to have more of them and let you waste more time in them. Often the offline room was restricted to simple cuts, without even having dissolves available.

Once you had assembled your offline edit and were happy with what you had, you would take your materials to the online booth. Depending on the facility, the online edit room could simply be the only one that had dissolves, or it could be as complicated as the control room that runs a television newscast, with a huge switcher, lots of sources and plenty of effects and graphics. Because the equipment in these rooms was more expensive, you would have fewer of these, so you couldn't waste any time in them. They also often required more people to operate the various machinery, where an offline room was usually just one guy.

In terms of workflow, the offline room was where you made most of your creative decisions. You could goof around in there for hours if the project allowed it, trying different cuts until you had what you wanted. The online room wasn't set up for that kind of creative decision-making; it was assumed you had figured all that out before you got there, and the online editor would expect to blow straight through it as fast as possible without any experimentation.

When nonlinear editors came along, the terms online and offline were adapted to the new technology, as Videodoc and Sycophant mentioned. Back when hard drives, RAM and processing power were expensive, it was not cost effective to do your rough cut at full resolution with all the graphics included. Thus, a post facility might have had a number of slower machines with limited storage on which you only had low resolutions available. You would make all your creative decisions in there, save your edit decision list (EDL) to a disc, and then move to the online NLE to redigitize your footage at a higher resolution and add all the effects that needed more processing power to be rendered.
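The edit decision list mentioned above was typically just a plain-text list of edit events that the online system could conform against. As a rough illustration, here is what a few events might look like in the widely used CMX 3600 style (the title, reel names, and timecodes below are made up for this example):

```
TITLE: ROUGH CUT 01
FCM: NON-DROP FRAME

001  TAPE01  V  C        01:00:10:00 01:00:14:00 00:00:00:00 00:00:04:00
002  TAPE01  V  C        01:02:30:00 01:02:36:00 00:00:04:00 00:00:10:00
003  TAPE02  V  D  030   01:10:00:00 01:10:05:00 00:00:10:00 00:00:15:00
```

Each event lists the source reel, the track (V for video), the transition (C for a cut, D for a dissolve with its length in frames), then the source in/out and record in/out timecodes. That small file is all the offline session needed to hand off to the online suite.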

Even in smaller companies that had only one NLE for all their work, the online/offline workflow process would be used. You would do an offline rough cut at a low resolution, just for speed and space, then remove everything from your drives and start again from your EDL to make an online edit at full resolution. This is actually the way I was originally taught to use NLE systems, because at the time big hard drives were just way too expensive to work at full resolution all the time.

Another variation on this kind of workflow is switching between tape and NLE. When NLEs were really expensive and scarce, sometimes you would see people do an offline rough cut on a tape-to-tape system, then dump that cut into an NLE to add the effects and graphics in the online edit. I've even seen the practice go the other way, where an editor would make all his creative decisions in an offline edit on a poorly equipped Avid, then go into an online tape-to-tape room to add all the effects and transitions. You might wonder why the hell anyone would do it that way; the answer is that they already had the very expensive online tape room but couldn't afford a full online Avid system, yet they wanted the creative freedom afforded by the nonlinear editor's ability to shift things around quickly and easily. The workflow simply evolved to their needs and their wallets.

Now all that's changed. Computers are cheaper. Storage is cheaper. The editing packages themselves are cheaper, as are the supporting software packages for graphics and effects. Because you can now do the same work on a $10,000 FCP or Avid DV system that you once needed $100,000 of computer and software to accomplish, the line between online and offline is blurring. Eventually it will become almost meaningless. That probably explains why you don't know what they mean now.

But it's still good that you asked, because old habits die hard and there will still be people who insist on structuring their workflow around the offline/online model. When you encounter them, you'll know what they're thinking.
 
Originally posted by Shaky & Blue:
Now all that's changed. ...Eventually it will become almost meaningless. That probably explains why you don't know what they mean now.
Actually, offline editing is about to become more popular than ever. The cost factor involved in feeding video hasn't gone away, nor is the technology available for studio-quality edits in the field, not unless you pack a dozen cases full of edit gear. Another advantage of off-line proxy editing or browsing: it's much faster than a real-time transfer.

Reality shows have started taking advantage of off-line editing, using 1.5 Mbps proxies coming out of IMX e-VTRs and now XDcam. NBC has, for a couple of years, been experimenting with the IMX e-VTR and now XDcam, using the low-res (1.5 Mbps) proxies via MXF. CBS made it clear that one reason they chose XDcam for acquisition and IMX tape for archive was the proxy off-line advantage. The BBC may be the next giant to adopt this new technology. Of course XDcam will do this with DV as well, and some local shops may choose that route, but obviously the networks are taking advantage of the IMX-50 studio profile.
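To see why those proxy bitrates matter, here is a back-of-the-envelope sketch in plain Python. The 1.5 Mbps and 50 Mbps figures are the ones quoted in this thread; real files would add container and audio overhead on top of these raw video numbers:

```python
# Compare storage/transfer cost of a 1.5 Mbps proxy vs. a 50 Mbps
# full-resolution IMX stream. Figures are illustrative only.

def gigabytes_per_hour(mbps: float) -> float:
    """Storage needed for one hour of video at the given bitrate (megabits/s)."""
    bits = mbps * 1_000_000 * 3600      # total bits in one hour
    return bits / 8 / 1_000_000_000     # bits -> bytes -> decimal gigabytes

proxy_gb = gigabytes_per_hour(1.5)     # ~0.675 GB per hour of footage
full_gb = gigabytes_per_hour(50)       # ~22.5 GB per hour of footage

print(f"Proxy:    {proxy_gb:.3f} GB/hour")
print(f"Full-res: {full_gb:.1f} GB/hour")
print(f"Ratio:    {full_gb / proxy_gb:.0f}x")
```

The same ratio applies to transfer time over a fixed link, which is why browsing and cutting against proxies, then pulling only the needed full-res material, is so much faster than moving everything in real time.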

From NBC's David Mazza, Senior Vice President of NBC Engineering, Olympics Division:
We added an additional 50 E-VTR MXF-capable plug-ins for our legacy 1/2 inch Betacam decks. While feeds are being recorded onto IMX video cassettes, they are at the same time being transferred to a file server, creating an MXF MPEG-2 hi-res file at 50 Mbit/s and an MXF MPEG-4 proxy file at 2 Mbit/s. All three copies - the video cassette, the hi-res video file and the browse proxy - are automatically registered with Media Archive. Keyframes are being extracted instantly from the browse proxy, and keyframes and browse proxies are made available instantly for retrieval, browsing and EDL creation.

From Frank M. Governale, CBS News Vice President of Operations:
We've embraced MPEG from ingest to playout, so we'd like to keep everything on that platform. We're going to have the production system with the video servers and editing systems, as well as the low-rez proxy. I'm standardizing; my infrastructure stays the same. It'll all be searchable for anybody from anywhere, obviously within our LAN.


From CNN Technology Senior Vice President Gordon Castle:
"MXF is a key part of our integration strategy; we intend later this year to integrate our playback, archive and editing systems together on an MPEG-based MXF framework infrastructure."

The possibilities of off-line proxies are almost endless. An editor in New York can control an e-VTR in Moscow while viewing the proxies, make his EDL, and request the e-VTR to send only the full-res 4:2:2 IMX material that's needed. That kind of off-line stuff.

Now watch "at the plate" come into this thread like an immature child and slam me for telling the truth. "M---X---F", at the plate, the thingy you never heard of in your small-minded world, eh? :D

[ December 04, 2004, 04:31 PM: Message edited by: Ivan ]
 