Apace vStor used by Digital FX for collaborative 3D animation workflow and real-time editing on National Geographic projects


By Michele Hope


Most digital content creation (DCC) professionals don’t think much about their company’s underlying storage system unless it starts to impact their creative efforts, such as being stuck waiting for a file to be copied or rendered, or having to wait before moving on to the next task in an already insane work schedule.

However, according to the production houses and studios interviewed for this article, the storage systems supporting their artists, animators, editors, and producers play a vital role in the success of each finished product.

Change is a constant in today’s studios as they struggle with tight deadlines, the need to juggle multiple projects at once, and the prospect of overhauling their server, storage, and networking architectures to make way for the growing wave of high-definition (HD) work.

In our latest look at the state of storage technology in fast-paced studio environments, we found everything from “storage-on-the-go” systems used for real-time editing in the field to elaborate enterprise-class storage installations that mirror those of many Fortune 500 companies. These high-powered implementations can include hundreds of terabytes of data and “tiered-storage” architectures that help move archival and backup data onto lower-cost storage systems. We also found facilities that use homegrown virtualization software to help users access specific files without knowing the physical device where each file actually resides.

Driving these storage strategies is the goal of making work-in-progress instantly available, shareable, and reusable from a central storage repository. It’s also about producing quality content as efficiently as possible.


The need for speed


“Any time your user data becomes centralized and available from more locations, it can be worked on in a more cost-effective manner,” says Matthew Schneider, director of technology at PostWorks, New York, a film and HD post facility that has been involved in a variety of independent feature films and TV shows.

“Sooner or later, storage will become part of the lifeblood that makes it all happen,” says Schneider. “Whether or not it’s interesting to you, it’s something you’ll be forced to learn how to do. Storage is at least half the equation, if not more.”

Schneider’s storage infrastructure includes a variety of Avid workstations connected to Avid’s Unity shared storage systems via 2Gbps Celerity Fibre Channel host bus adapters (HBAs) from Atto Technology (which also sells high-speed 4Gbps Fibre Channel adapters). PostWorks’ total Unity storage capacity exceeds 30TB.


Matthew Schneider, PostWorks’ director of technology, uses 4Gbps Fibre Channel HBAs from Atto in most of the studio’s Avid workstations. The HBAs provide the 350MBps to 400MBps of throughput required for HD work performed on nearly 30TB of Avid’s Unity storage systems.

Improving workflows


According to Greg Milneck, president of Baton Rouge-based Digital FX, one key to choosing the right kind of storage comes down to how well it supports the collaborative workflows of the facility. Digital FX, which recently finished 3D animation work for several National Geographic shows, uses a 4TB vStor v2000 series NAS appliance from Apace Systems on its Gigabit (1Gbps) Ethernet network. The vStor NAS system is used for standard definition capture and simultaneous access at Digital FX and is designed primarily for real-time video editing with support for DV, SD, and HD video. Capacity ranges from 2TB to 10TB per array. (Apace also sells the eStor series of storage systems for video archive and retrieval.)


Operators at Digital FX perform 3D rendering work for in-house research by collaborating on files accessed from a shared vStor v2000 series NAS appliance from Apace Systems.


“The whole facility could be working on either a single project, or all working on separate projects, depending on the demand,” says Milneck. “The Apace storage system works seamlessly in the background and supplies shared footage to the operators. The footage could be special effects shots where the editor digitizes the edit and then has access to the effects in real time.”

“Previously, it was more of a push-and-pull process throughout the facility,” Milneck adds. “The editor would push [content] to another workstation, and then an animator would work on it and push the footage back to the editor.”

High-performance NAS


The proliferation of multiple copies of files is a common problem at studios, and one that The Napoleon Group, a New York City-based postproduction facility, used to experience. “The days are long gone when operators used to say, ‘You need this file, in this format, on this drive? Okay, let me go make a copy and get it to your machine.’ People haven’t been calling me about that anymore,” says director of engineering Maciek J. Maciak. The Napoleon Group implemented a central file repository that now acts as the “backbone” of most of the company’s file storage needs. This backbone is built on a 3.5TB Max-T Sledgehammer NAS system with Fibre Channel drives from Maximum Throughput, which stores everything from accounting and front-office documents to archives of renders and jobs from all of the edit suites.

“We’ll have as many as eight operators accessing either the same footage or other footage on the drives simultaneously,” says Maciak. “The performance is really surprising.” Maximum Throughput claims an aggregate sustained throughput of more than 300MBps, with capacities ranging from 2TB to 32TB per array and support for file systems up to 16TB.


Maciek J. Maciak, director of engineering, sits in front of what he calls the storage “backbone” for The Napoleon Group, a 3.5TB Max-T Sledgehammer disk array from Maximum Throughput that’s used for editing and rendering.


Configuration files for key applications have been modified to automatically export to the Max-T system for archiving. According to Maciak, “Operators know to look for those types of folders on the Max-T, so there’s no more ‘share your drive and I’ll put this on your desktop.’ ”

Maciak also incorporates 3TB of nearline storage that is simply a PC with a RAID storage controller. When a job no longer needs to be stored on the primary Max-T system, the job is sent to a “holding” folder used to move files to nearline storage. “We have a rule that says if the Max-T is 75% full, dump all of the contents of the holding folder to nearline storage. Once the nearline storage gets full, we move it over to tape,” Maciak says.
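
Maciak’s 75% rule is simple enough to automate with a small scheduled script. The Python sketch below is only an illustration of that policy, not the studio’s actual tooling; the mount points and folder names are hypothetical.

```python
import shutil
from pathlib import Path

# Hypothetical mount points; the studio's real paths are not published.
PRIMARY = Path("/mnt/max-t")            # primary Max-T volume
HOLDING = PRIMARY / "holding"           # staging folder for jobs slated to move
NEARLINE = Path("/mnt/nearline")        # nearline PC with a RAID controller
THRESHOLD = 0.75                        # migrate when the primary is 75% full

def primary_usage() -> float:
    """Return the fraction of the primary volume currently in use."""
    usage = shutil.disk_usage(PRIMARY)
    return usage.used / usage.total

def migrate_holding_folder() -> None:
    """Move everything in the holding folder to nearline storage."""
    for item in HOLDING.iterdir():
        shutil.move(str(item), str(NEARLINE / item.name))

if __name__ == "__main__":
    # Run periodically (e.g., from a scheduler); tape migration would follow
    # the same pattern once nearline fills up.
    if primary_usage() >= THRESHOLD:
        migrate_holding_folder()
```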

iSCSI SANs


Sausalito-based music editor Malcolm Fife knows first-hand how much inefficient storage can cost a project. Fife has performed sound work on movies such as The Lord of the Rings trilogy. He is also a partner in Tyrell LLC, which focuses on sound design, music production, and postproduction editing and mixing.

Fife knows that when the director of a big-budget film comes to play the latest sound reel and suggests a few changes, you don’t want to hold up the process trying to move the file from suite to suite in your effort to cut a new version.

“These are multimillion dollar timelines. If that change isn’t handled instantly, you could blow thousands of dollars,” says Fife.

After working with London’s Abbey Road recording studio on the score for the Lord of the Rings soundtracks, Fife and his partners became enamored of the workflow there, which was based on several Studio Network Solutions A/V SAN Pro Fibre Channel storage systems that helped streamline the recording, editing, and mixing of the music score for Lord of the Rings: The Two Towers. (In addition to Abbey Road, A/V SAN Pro disk arrays are used by facilities such as Sony/ATV, Wally’s World, and Vidfilm/Technicolor.)

Fife and his partners decided to duplicate much of the Abbey Road setup in their Sausalito facility. However, instead of Fibre Channel SAN connections, they went with two Studio Network Solutions iSCSI-based globalSAN X-4 shared storage systems running on a Gigabit Ethernet network. (iSCSI SANs provide a low-cost alternative to Fibre Channel SANs.) Since the systems are based on Ethernet and iSCSI, Fife’s team found they could even do simple checks of files using a laptop in a different room. By installing SMS client software on the laptop, they could plug into the SAN from anywhere, even over the Internet.

Reducing bottlenecks


During peak production for South Park, it’s not uncommon for the 60+ animators and editors at Los Angeles-based South Park Studios to work approximately 100 hours per week.

According to J.J. Franzen, South Park Studios’ technology supervisor, work begins in earnest a week before the show is due to air, with changes often made as late as 12 hours before airtime. This timeline requires systems to be available at all times, a requirement put to the test at the start of the ninth season, when a network switch failure made it impossible to access any work-in-progress until the problem was solved.


The storage systems used for animation at South Park Studios include Apple Xserve RAID servers and a 15TB Apple Xsan system.


“We realized then we had to get rid of our older stuff and remove single points of failure on our network,” Franzen explains. The studio replaced the main file server, another potential point of failure, with Apple’s Xserve RAID servers and a 15TB Apple Xsan storage configuration capable of supporting the 30MB to 50MB of capacity needed for an average animated scene, as well as the 150MB to 200MB required for larger scenes. The storage system is also mirrored and supports automatic failover in case of failure.

In the upgrade process, Franzen also implemented Atempo’s TimeNavigator software, which takes incremental backups four times a day. TimeNavigator now provides time-stamped backups of earlier file versions so animators and editors can quickly access them without having to go through re-rendering.
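
The general pattern behind this kind of protection, periodic incremental copies written to time-stamped folders, is straightforward to sketch. The Python example below is a generic illustration run from a scheduler four times a day, not Atempo’s product or its interface; the source and backup paths are hypothetical.

```python
import shutil
import time
from pathlib import Path

# Hypothetical locations; not TimeNavigator's actual layout.
SOURCE = Path("/projects/episode")
BACKUP_ROOT = Path("/backups/episode")
STAMP_FILE = BACKUP_ROOT / ".last_run"

def incremental_backup() -> Path:
    """Copy files modified since the last run into a time-stamped folder."""
    last_run = float(STAMP_FILE.read_text()) if STAMP_FILE.exists() else 0.0
    dest = BACKUP_ROOT / time.strftime("%Y%m%d-%H%M%S")
    for src in SOURCE.rglob("*"):
        if src.is_file() and src.stat().st_mtime > last_run:
            target = dest / src.relative_to(SOURCE)
            target.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(src, target)
    STAMP_FILE.parent.mkdir(parents=True, exist_ok=True)
    STAMP_FILE.write_text(str(time.time()))
    return dest

if __name__ == "__main__":
    # Schedule this (e.g., via cron) four times a day; each run leaves a
    # time-stamped folder that editors can browse for earlier versions.
    incremental_backup()
```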

Titanic storage


Another studio that knows how to keep the digital pipeline humming is Los Angeles-based Rhythm & Hues, an animation and digital effects studio that became known as the “talking animal house” for its award-winning work on the film Babe.

Recently, Rhythm & Hues put 650 people to work during the hard-core production phase of the Disney movie Narnia. Chief tasks involved developing the animation and underlying muscle movements of key characters such as Aslan the lion, as well as computer-generated simulations of smoke, fire, and the lion’s fur.


Rhythm & Hues used three high-speed storage systems from BlueArc to handle the renders and nightly processing of up to 24TB of data during peak production of Narnia. Image © Disney Enterprises Inc. and Walden Media LLC.


According to Rhythm & Hues’ vice president of technology Mark Brown, this required about 24TB of data to move through the system each night, a process that was managed through a combination of several high-speed Titan storage systems from BlueArc and Rhythm & Hues’ home-grown virtualized file system. The file system allows data to be replicated in front of the BlueArc disk arrays so that bandwidth levels are always maintained.

“You can put 256TB on a storage server [which is what each Titan system can support], but our problem is that we have so many processors going at it that we wouldn’t have the bandwidth we need. So we only put 4TB to 6TB behind each Titan head so that we have the bandwidth to get to the data,” says Brown. BlueArc’s Titan storage systems deliver 300MBps to 400MBps of sustained throughput for Rhythm & Hues and are capable of scaling from 5Gbps to 20Gbps of throughput, according to BlueArc.
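
Rhythm & Hues’ virtualized file system is home-grown and not publicly documented, but the underlying idea, presenting one logical namespace while spreading files across many small, high-bandwidth storage heads, can be sketched as a thin lookup layer. Everything in the Python sketch below (the head names and the hashing scheme) is assumed purely for illustration.

```python
import hashlib

# Hypothetical storage heads, each fronting a deliberately small (4TB-6TB)
# pool so that aggregate bandwidth, not capacity, is the limiting factor.
STORAGE_HEADS = ["titan01", "titan02", "titan03"]

def resolve(logical_path: str) -> str:
    """Map a logical project path to the physical head that holds it."""
    digest = hashlib.md5(logical_path.encode()).hexdigest()
    head = STORAGE_HEADS[int(digest, 16) % len(STORAGE_HEADS)]
    return f"//{head}/{logical_path.lstrip('/')}"

# An artist asks for a shot by its logical name and never needs to know
# which storage head actually serves it.
print(resolve("/narnia/aslan/fur_sim/frame_0420.exr"))
```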

NAS + SAN


What do you do when the clips you produce might end up on other TV channels and on other shows, or even on the Internet? According to Jeff Mayzurk, senior vice president of technology at E! Networks, this means the underlying IT infrastructure has to let production teams quickly reuse content as needed, in a variety of forms, as soon as it’s produced.

From a storage perspective, this has meant a “hub-and-spoke” architecture that captures any footage acquired in the field once and copies it into the network’s central storage repository. This material is then distributed to various edge (or spoke) locations for their own use, whether on Avid systems or Final Cut Pro systems connected from a director’s home.

What makes this model work is custom virtualization software that masks the underlying complexity of the storage systems in use. Storage resources at E! Networks include about 200TB of SATA-based NAS from Isilon Systems, NetApp NAS servers, and two Fibre Channel SANs from DataDirect Networks. Mayzurk says the virtualization capability allows him to remain vendor-agnostic: “We didn’t want to be tied to a particular vendor or type of technology. We want the flexibility to migrate as storage technology improves.”


Jeff Mayzurk, senior vice president of technology for E! Networks, fuels the company’s multimedia productions with several hundred terabytes of networked storage and custom virtualization software.


Isilon’s IQ series of clustered storage systems includes the OneFS distributed file system, which scales to 250TB of capacity. The company claims throughput performance of 3GBps. In addition to standard Gigabit Ethernet connections, Isilon’s IQ series systems are also available with higher-speed, lower-latency InfiniBand connections.
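
E! Networks’ virtualization layer is custom and its design isn’t described in detail, but the vendor-agnostic flexibility Mayzurk mentions is commonly achieved by having production tools code against a thin storage interface rather than any one vendor’s API. The Python sketch below shows that general pattern; the class names, methods, and paths are illustrative assumptions, not E!’s implementation.

```python
from abc import ABC, abstractmethod

class StorageBackend(ABC):
    """Minimal interface the production tools code against."""

    @abstractmethod
    def read(self, path: str) -> bytes: ...

    @abstractmethod
    def write(self, path: str, data: bytes) -> None: ...

class NasBackend(StorageBackend):
    """Backed by a generic NAS mount point (vendor-neutral)."""

    def __init__(self, mount_point: str):
        self.mount_point = mount_point

    def read(self, path: str) -> bytes:
        with open(f"{self.mount_point}/{path}", "rb") as f:
            return f.read()

    def write(self, path: str, data: bytes) -> None:
        with open(f"{self.mount_point}/{path}", "wb") as f:
            f.write(data)

# Migrating to a different vendor means swapping the backend object,
# not rewriting the tools that use it.
hub = NasBackend("/mnt/hub")
```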

Road warrior


Another studio that values flexibility in its use of storage is Biscardi Creative Media. Walter Biscardi, owner of the production and postproduction studio in the Atlanta area, uses a mix of drive types and subsystems for a variety of projects, which include editing the Good Eats show for the Food Network and HD projects such as a short film called The Rough Cut, in which Biscardi made his directorial debut.

“No one type of hard drive is the solution for everything you need to do in today’s postproduction environment,” says Biscardi. His studio uses both FireWire 400 drives from Maxtor and LaCie along with FireWire 800 G-RAID storage devices from G-Technology (G-Tech). He uses these drives in conjunction with what he calls the “Mac Daddy” in his studio: a 2TB Medea VideoRaid FCR2X Fibre Channel disk array that he says is ideal for HD work or projects with very quick turnaround deadlines. The 10-drive, dual-channel VideoRaid FCR2X supports 2Gbps Fibre Channel and more than 350MBps of throughput, according to Medea.

While he has worked out how best to use each storage device in his studio, perhaps the real test of his storage strategy is his ability to take the Medea array out on the road. “I put my Fibre Channel array in a travel case and can actually go out on the road and edit high definition anywhere,” says Biscardi.

