
The art and craft of screen graphics – interview with Danny Ho

October 7th, 2024
Screen graphics for “Arrow” Season 4 Episode 14, courtesy Danny Ho and Scarab Digital.

Continuing the ongoing series of interviews on fantasy user interfaces, it’s a delight to welcome Danny Ho. His path in the industry started almost 30 years ago, at the cusp of the transition from pre-recorded video decks to fully interactive computer-generated graphics. Under his leadership, Scarab Digital is a monumental presence in the field of screen graphics (and well beyond), with over 100 productions under its belt, including more than 1,000 episodes of some of the most memorable television of the last couple of decades, from “The X-Files” to “Charmed”, from “Sanctuary” to “Siren”, from “Yellowjackets” to “Monarch”, as well as the entire span of the DC television universe in the last 12 years – “Arrow”, “Supergirl”, “Legends of Tomorrow”, “Batwoman”, “The Flash”, and “Superman & Lois”.

In the almost impossible task of fitting all of this into one interview, Danny talks about those early days of computer-generated graphics and the evolution of hardware and software stacks, challenging his artists to make an impact with their work, how much resolution is too much, the impact of Covid, and the looming emergence of generative AI tools.

Kirill: Please tell us about yourself and the path that took you to where you are today.

Danny: It was back in the 24-frame CRT computer playback days, when practical on-set video playback was already established. There was little post work, and most of it was making real graphics on set, right at the cusp of pre-recorded rendered output from a video deck. You would play linear video, there was no interactivity, and production was used to that. It was done on three-quarter-inch or Betacam decks, fully synced to an actual CRT that was synchronized with the film camera.

I got exposed to that through the company that was handling the computer side of that process and starting early explorations into interactivity. My buddy was helping someone else do it, and he didn’t mind company. I would go with him to the set, and he was, of course, designated to do the work. In between takes I would watch him rush, and try to cable things up, and go to the monitor, and connect to the computer. You’re always under a timeline, and there’s always a struggle of a sort. So of course, I’m not going to sit there and watch him [laughs]. I would go and help him and connect everything until it was all done – and then the actors would work with his stuff.

As I watched and observed, I started thinking that these guys have something here. This is not going to go away. This is going to become more demanding, because we’re natural geeks. We just know how to do this. The old legacy was purely video. They had no idea how to make anything interactive. And sure enough, they landed “The X-Files” season two. From there Chris Carter [the creator of the show] became the hottest thing and started doing other shows like “Millennium”, and I ended up working on all those spin-offs for Chris. That was the training ground for growing this whole thing and taking it to the next level.

Those three-quarter-inch decks and Betacam decks were big pieces of equipment, and they had hard cases on them. If you had more than one source, you needed more than one deck, and you had to help carry these things around. I could see that the market the video guys had would eventually converge with computers – you would have a computer that would eventually be able to play video. You could see that coming, even if it wasn’t immediate. It was years away at that point.

I was able to do both. I worked with the video guys and I worked with the computer guys. Back then I did everything as one person – I would coordinate or shoot on set during the day, build graphics at night, then do the whole cycle again the next day. I was burning the candle at both ends. And eventually, as I coordinated projects, you build a network and you solve other people’s problems. So they give you a call and say “Hey, can you help us on our show?” I can say that I was one of the only ones at the time that could do both – I could do the computer side and I could do the video playback stuff. The video guys didn’t want to learn the computer side, so they would recommend this fellow who was well-known in the community. His name is Klaus Melchior, and he had cornered the market in terms of video.

I knew there would be a certain point where he was going to sail off into the sunset and retire, and sure enough he did. I was happy for him that he was able to retire, he’s an awesome guy. So we picked up the pieces and took the reins for this transition to using computers to play video interactively.


Screen graphics for “The Predator” (2018), courtesy Danny Ho and Scarab Digital.

Kirill: Was there a particular point in time back then when it felt that computers were powerful enough to take over?

Danny: My brother had an internship with Alias Wavefront, the company whose software later became Maya. I remember visiting him, and he showed me a little trailer of a movie, just like the ones we see every day now. It was playing in full real time on an SGI computer, and it was such a foreign concept to see a video file playing on a computer. It was full resolution and completely smooth. Unbelievable at the time.

It gave me a taste of what the future was going to be on a typical regular computer. It was mind blowing at the time, but of course to us now it’s nothing.

Kirill: How wild is it to look back and remember those days and look at your desk now?

Danny: It’s unbelievable [laughs]. Your phone has more compute power than the computers we were using at the time. If you take the concept that every year you have compute power doubling, it’s pretty staggering.

That’s somewhat of a sweet spot in what I bring to the company. I was able to see where the trends were going, what the demand might be, and what people might want based on what’s happening now and what’s possible in the future. That is how I’ve grown the business, trying to always stay ahead, looking at what tools might be out there.

There are times I take tools that are completely not meant to do anything related to film or graphics, but they can fulfill a purpose for us. We adopt them into the workflow, and we’re always eager to improve and make what seems impossible, possible. Look at visual effects these days. At one point they couldn’t do plastics and reflective-looking things that well. And then later they could do fire and water. It is always progressing. There was a time back then when I was contemplating whether our company should also do visual effects. Do we do visual effects or do we stay in interactive motion graphics? If you think about what the producer wants and consider what is doable in VFX – if you graph what the producers want against what is achievable – the two lines run almost in parallel and almost never cross. It’s very difficult to exceed the expectations of a producer for any visual effects. It’s definitely more attainable now, especially when you throw enough money at it.


Screen graphics for “The Flash” Season 8 Episode 19, courtesy Danny Ho and Scarab Digital.

Kirill: Once you exceed the expectations, that becomes the new expectation for the next one.

Danny: It sets a new bar, exactly. I always found that from the creative perspective in motion graphics and FUI, you can always get a notion of what they’re trying to achieve. You can’t guarantee that you will always exceed their expectations, but there’s more gratification when you do. You can exceed their expectations when you come up with an idea that can convey the story, or even make sense of what they’ve written.

I use the word “schmience” a lot. It’s fake science. It doesn’t make sense. So how do you show that on a graphic that conveys the story and get the audience to understand it within a second?

Kirill: Some call it motion graphics, some playback design, or screen graphics, or fantasy user interfaces. Is it difficult to find one term that captures all the shifting ebbs and flows of it?

Danny: It’s always challenging. Though Mark Coleran came up with the best one – FUI. [Rest in peace, Mark!] However, mostly only the fan base of the medium knew the term. It’s funny – nobody at the film producer or production level is familiar with it, which is why you still hear playback and screen graphics. Funny – you always want to not do what everyone else does. You always want to try to make it unique, or unique to the project. We all make fun of it. How many ways can you make “Access Denied”?

This is what I tell the artists. I challenge them – as individuals, as artists, as creatives – to make a graphic that not only conveys the story, but is so compelling that the director has no choice but to shoot it, because it does such a good job that he cannot deny it. You’re basically challenging the director.

You went through all the processes, you read the story, you read what the people are talking about. And there’s a graphic that says what the script is supposed to do, or the scene is supposed to do. But the director has a choice of not even focusing on it. He could just focus on the people, and they’re talking about it, and they’re caught up in the emotion. But if your graphic is so good that he cannot deny how much storytelling it does in that 1-2 second span, then you’ve made it because you forced them to shoot an insert on your graphic.

That is the challenge I always put upon our artists. Make them a picture that tells a thousand words.


Screen graphics for “The Flash” Season 8 Episode 13, courtesy Danny Ho and Scarab Digital.

Kirill: I’m looking at the desk behind you, and I see a few cables. You’ve probably seen quite a few cable varieties in your professional career, and I have a box of my own old cables around. It feels like a love-hate relationship at this point.

Danny: I was looking at this analysis where they did an X-ray scan of a Thunderbolt cable. You can buy a cheap $3 cable, and it looks the same as the Apple one that costs $130. But when you look at that scan, it has two different power supplies and a whole chip in there. Amazon even took that cheap cable off because all the reviews were saying that it didn’t work. The technology keeps on changing, and while that Thunderbolt cable looks simple, you can see how much stuff is on the inside.

Another big leap which is not yet prevalent in our society is going to be nanotechnology. Nanotechnology is going to allow us to manipulate what we are making at the molecular level. You watch these sci-fi movies that have some sort of a fabricator that materializes any item. This is what we’ll have sooner or later.

Kirill: How big is the cabinet of all the cables that you have at your company?

Danny: We have a warehouse [laughs]. It’s bins and bins and bins of cables. We have those original thick HDMI copper cables. They still work, but none of our operators love them, because they’re so big. We use fiber cables now. There’s no end. I have to keep some of the legacy stuff because if they want a CRT, I still need to use VGA. You have to be careful to not bend those pins!


Screen graphics for “The Fall of the House of Usher” Episode 1, courtesy Danny Ho and Scarab Digital.

Kirill: Speaking of these historical artifacts, how difficult is it getting to open old files from, let’s say, 20 years ago?

Danny: There are some legacy products that we still have to open. These days, if you need to have a CRT monitor on set, we completely fake that. On the back end it’s a modern machine that can play full HD stuff, but then it converts that to 1024×768 resolution.
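As a rough illustration of that kind of down-conversion (an editorial sketch, not Scarab Digital’s actual pipeline), an HD clip can be letterboxed into a 1024×768 feed for a faked CRT with a standard ffmpeg filter chain; ffmpeg is assumed to be installed, and the file names are hypothetical:

```python
# Illustrative sketch only (not Scarab Digital's setup): letterbox a 16:9 HD
# playback clip into a 4:3 1024x768 frame for a faked CRT output.
# Assumes ffmpeg is installed; file names are hypothetical.
import subprocess

def to_crt_feed(src: str, dst: str) -> None:
    video_filter = (
        "scale=1024:768:force_original_aspect_ratio=decrease,"  # fit inside 1024x768
        "pad=1024:768:(ow-iw)/2:(oh-ih)/2"                       # center with black bars
    )
    subprocess.run(["ffmpeg", "-i", src, "-vf", video_filter, dst], check=True)

if __name__ == "__main__":
    to_crt_feed("hd_playback_clip.mov", "crt_feed_1024x768.mov")
```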

But then you run into some older hardware. One of the recent challenges was on “Cruel Summer”. It’s a period show where they had the clamshell iBook. We needed a Director file to work on the old clamshell, so we actually had to dig up the old version. Luckily we still have licenses around, and we have that machine, and we were able to run that file. This does come up once in a while, for sure.

Kirill: Is there such a thing as too much resolution? It used to be 800×600, and now it’s 4K, 6K, 8K. Is there a limit to what the human eye can perceive?

Danny: It depends on the monitor. If your physical screen is 40 inches and up, put 4K on it. But you don’t need 8K on a 40-inch screen. And then as you get to the 85-inch mark, you probably want more than 4K if it’s capable. I think the biggest you can buy economically these days is 85-inch 4K. Then when you get to 100 inches, you would probably want more pixels. It’s all about pixelation to camera, or to our own eyes for that matter. You can play 1080p on a 32-inch screen. 4K is not necessary for that size.
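For a rough sense of the arithmetic behind that (an editorial back-of-envelope sketch, not a figure from the interview), pixel density drops as the diagonal grows at a fixed resolution, which is roughly when the jump from 1080p to 4K, or 4K to 8K, starts to be visible to the camera or the eye:

```python
# Rough arithmetic only: pixels per inch for common panel sizes/resolutions,
# assuming 16:9 panels. Higher ppi means less visible pixelation up close.
import math

RESOLUTIONS = {"1080p": (1920, 1080), "4K": (3840, 2160), "8K": (7680, 4320)}

def ppi(diagonal_inches: float, width_px: int, height_px: int) -> float:
    """Pixels per inch along the diagonal."""
    return math.hypot(width_px, height_px) / diagonal_inches

for size in (32, 40, 85, 100):
    row = ", ".join(f"{name}: {ppi(size, w, h):.0f} ppi" for name, (w, h) in RESOLUTIONS.items())
    print(f'{size}": {row}')
```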


Screen graphics for “Monarch: Legacy of Monsters” Season 1 Episode 9, courtesy Danny Ho and Scarab Digital.

Kirill: Is there such a thing as a disc big enough these days, or do you always need more?

Danny: You always need more space. It’s also about the playback speed. What can be played back in real time and how many feeds? You definitely need designated media servers for certain purposes. But if it’s for conventional graphics playback, generally you don’t need too much space. In a team environment space matters more, but on set you need just enough to cover whatever files you need for the day of the shoot, or even better for the whole season of that project.

You go through lots of versions as you work on the project, but hard drive space is not where the compromise is. Hard drive space becomes a compromise when you’re at the media server level. You might have multiple screens, and each one has 3-4 layers at 4K, and that’s where you’re starting to want more space.
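To put rough numbers on why layered 4K playback fills drives and saturates media servers quickly (editorial back-of-envelope math, not figures from the interview), even a single uncompressed 4K layer runs at about a gigabyte per second before any codec is applied:

```python
# Back-of-envelope only: uncompressed data rate for stacked 4K playback layers
# (8-bit RGBA and 30 fps assumed). Real playback files are compressed, but the
# way the total scales with screens and layers is the point.
WIDTH, HEIGHT = 3840, 2160
BYTES_PER_PIXEL = 4          # 8-bit RGBA
FPS = 30

def layer_rate_gb_per_s() -> float:
    """Uncompressed data rate of one 4K layer, in gigabytes per second."""
    return WIDTH * HEIGHT * BYTES_PER_PIXEL * FPS / 1e9

for screens, layers in [(1, 1), (1, 4), (3, 4)]:
    total = screens * layers * layer_rate_gb_per_s()
    print(f"{screens} screen(s) x {layers} layer(s): ~{total:.1f} GB/s")
```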

Kirill: What’s your relationship with the color blue after all these years?

Danny: I like it, but I will say I have a preference towards gray and less blue. We work on set, and the default starting point is that the screens naturally give off a blue color. Our mission objective is always to correct it so that it becomes white light, and then follow the preference of the director of photography with color. If it’s supposed to be black and white, it’s going to look black and white as a starting point. If they want to have a look, then we’ll follow the color and look they prefer, whether that’s more blue or tungsten.


Screen graphics for “Arrow” Season 3 Episode 5, courtesy Danny Ho and Scarab Digital.

Kirill: Your studio has worked on a lot of episodic productions and on a lot of movies. Do you have a preference between the two?

Danny: As a company, we definitely focus on episodic. If it’s a good series, it’s a cash flow that you can count on. In the old days, a good show would be 24 episodes. In more recent times, I would say that’s been reduced. We just finished a project with 20 episodes, but you don’t see this very often nowadays. Recently, most of them have been contained to around 10-13 episodes.

The usual timeline for an episodic production at a certain budget level has a 10-11 day schedule to shoot one episode. That means that it’s almost always crunch time. Your stuff is never going to consistently play at the beginning or the end of every episode, so it’s all peppered throughout the episodes. You have to stay ahead of the schedule.

In terms of preference, it depends. When the schedule is longer, you might have more time to nail and focus on what the producers want. On the other hand, sometimes when you have more time before the shooting starts, the producers have more time to change their mind. In a way, when it gets that fast and furious for 10-day episodes, they don’t really have much of a choice but to take what you’ve presented. A lot of times, what we first present – as long as it nails the story – is approved, barring small things. If it looks great, and it conveys the story, and it shows what the producers and writers are going for, then typically it hits the mark.

It makes a big difference when the writers are involved in our meetings. Without the writers, we still have the director, the producing parties and the art department in the meetings, but it’s less smooth without the writers’ involvement.


Screen graphics for “The Flash” Season 8 Episode 3, courtesy Danny Ho and Scarab Digital.

We had a great long run with “The Flash”. In the beginning, the writers were not as used to writing for playback, but as they saw what we were capable of doing and the turnaround time on an episodic basis, you would see them writing specifically towards playback more and more – so that you can convey more of the story. Sometimes they used it as a tool where it was too expensive to do VFX. We would convey the same story as dots on a screen, or an overhead view of whatever the scenario is, or as a symbolized graphic conveying what is actually happening instead of seeing the visual effects.

Throughout the show we saw writers steadily gravitating towards writing more playback, especially as we showed the capability to push the envelope of what’s possible. Everyone benefits. Your production value goes up. It’s so interesting watching teams of writers who might not have written for playback before – you can see their confidence grow as they see that what they wrote on the page with “schmience” can be made into actualized graphics on the screen. You have this thing that keeps on continuing, a self-fulfilling prophecy.

Kirill: Speaking of the DC Universe, you’ve done quite a few shows there. You did “Legends of Tomorrow”, “Batwoman”, “Arrow” and “Supergirl”. How much space does it take in your head when you are involved so much in such a sprawling Universe?

Danny: It was so great to be a part of that, and to watch that Universe grow. It started from “Arrow”, and then the next spin-off was “The Flash”, and then “Legends of Tomorrow”, and then “Supergirl”. Everyone was so open about what we would contribute, and each designer had their own input into what they’d like. It was such a fun contribution to that whole Universe.


Screen graphics for “The Flash” Season 6 Episode 6, courtesy Danny Ho and Scarab Digital.

Kirill: Is there such a thing as your favorite DC hero?

Danny: As a kid, I think the Flash in general was a big favorite of mine. He was one of the few that actually got made into a show way back then. Of course, yes, you had your Batman, but it was not the same back then. Being able to contribute to the new show was pretty special.

Oddly enough, one of the more challenging sets for us, and one of the bigger sets of the entire Universe that we did, was Supergirl. They had two big sets, one for her and one for her alternate persona. The funner stuff was definitely inside the big mission control.

Kirill: On these shows, what’s the split between on set and post-production for you?

Danny: It’s around 90% on set. You do a few fixes or augmentations in post. Maybe you were getting materials in the wrong production order, where you have to shoot something and it’s not ready – that would be in post. Once in a while we would have holograms, and those would be in post.


Screen graphics for “Monarch: Legacy of Monsters”, courtesy Danny Ho and Scarab Digital.

Kirill: How was it to step into the Godzilla Universe with “Monarch”?

Danny: That was a good show, and we could tell from the time we were looking at the scripts. I want to give a nod to Twisted Media and their team. We knew that this was a prequel to Bryan Cranston’s version of the Godzilla Universe, and we studied a lot of the screens that Twisted Media had established. We wanted to make sure we didn’t go off on a completely different tangent.

We met with the production designer, and we gave them a choice. We gave them a splay of concepts of how these graphics could have looked back in the day. They can’t be too fancy and too futuristic, but still a nod to what they’ve established. So we riffed off of what the Twisted Media team had established and ran with it. The biggest compliment came afterwards, after it was aired and everything, when Chris Kieffer reached out and thanked us for our work on the graphics. And I said, “That’s awesome coming from you, because you guys established it”. It was a nice nod between comrades.

It was fun to be on that show. Sadly, I hear season two is going ahead and it’s not shooting here, so we’re not involved.


Screen graphics for “Sonic the Hedgehog”, courtesy Danny Ho and Scarab Digital.

Kirill: How different was it to do the Sonic universe? It’s a bit more playful compared to those military installations in Monarch.

Danny: That’s the opposite of the previous TV stuff that we did. About 90% of everything in Robotnik’s lair in the first installment of “Sonic the Hedgehog” is in post. There was only one practical monitor. We came up with a whole bunch of different pitches on how we were conveying all the different story points. It was fun to establish what Robotnik’s tech would be. We even pitched some other stuff that didn’t make it into the movie.

Kirill: Do you find that you have a little bit more freedom to explore color and shape and animation on productions like this?

Danny: Our lead creative director [Jer Unrau] set the tone, and we put other concepts on the wall, and the director Jeff Fowler ended up liking the first stuff that we presented. While it was fun to come up with other different concepts, Jeff led the way back to what we did originally. It was a bit simpler and cleaner.


Screen graphics for “Sonic the Hedgehog”, courtesy Danny Ho and Scarab Digital.

Kirill: Looking back at your first 30 years, do you have a favorite production or maybe a technology era that you felt was the most challenging, inspiring, or fun?

Danny: There’s one thing I do miss. I feel like some of our team members on set are spoiled in that – in this digital universe – a lot of things just work. Maybe call me super old school, and it’s not that I don’t trust stuff, but I don’t like to take chances. When I go to set and I have to interact with something, you will always see me put a hardwired keyboard in, whereas so many of our team members use a wireless keyboard. But what if the reception fails? There’s always the what-if. This is what I’m so used to. [And I guess I’m used to things failing on me at the most inappropriate time.]

And the other part is on the camera side. Everything typically works together pretty well now, but I miss syncing a film camera. Back in the day, there was a [film] gate that you would look into when you were syncing that 24-frame monitor. There’s a point where you have to phase the bar in camera, and if you don’t get that right, it ends up in the dailies. Back then you didn’t even see it in real time. You were seeing a representation of it, because it was a video camera looking at the actual film [gate]. You were not seeing the real thing. Nowadays, you literally see what’s coming off the sensor, and what it looks like, fully colored.

That is the part that I miss. The entire crew had to stop, and you had control of the camera. You were sitting right next to the film magazine, and you were looking down the lens to verify, because no one else wanted that responsibility. I liked to have that little bit of control over the set at that time. It was annoying to everyone else, but there was something powerful about that moment. You don’t have that gratification anymore.


Screen graphics for “DC’s Legends of Tomorrow” Season 3 Episode 1, courtesy Danny Ho and Scarab Digital.

Kirill: How does it feel to sit down and watch one of your productions with the final grading and the final edit and to see how it was incorporated?

Danny: It’s changed over time. Back in the film days you were using light meters and other tools to confirm that everything was shot correctly. The younger folks these days are spoiled [laughs]. What you see is what you get. You have a very good idea of what it’s going to look like.

In the film days, I would say that some of the scarier moments were with projection. You’re looking at a video tap, which is a video camera looking at the film gate. What you see with your eye doesn’t match, because it depended on how the projection was exposed. You needed to use light meters. I remember this one moment where it looked incorrect to my eye, it looked incorrect on the tap, but my light meter was saying it was correctly exposed. It was going against my gut. My gut was telling me it wasn’t correct, but my tools were telling me it was accurate. And I couldn’t know until the following day, when everyone was watching the dailies for the first time. You sit there before the screening wondering whether you got it right or wrong.

Kirill: And film was expensive.

Danny: Exactly. You only process what is a good take. There’s a lot of stuff that didn’t even get processed. It was sitting on the cutting room floor. So there was big gratification [and relief] to see that and find that I actually exposed it correctly.

The next layer after that is to see the whole movie. You see bits and pieces, but it is different to see it cut as a whole. And especially back then, seeing your credits is a different feeling. I feel grateful every time. You’re seeing a lot of what the movie will look like as you’re shooting it, so it’s not as surprising. It is exciting to see the full cut and see whether they’ve actually used the insert of your graphics.

It’s back to that question. Did you challenge the director enough that he needed your graphic in the story and he actually did an insert on it? Did it make the trailer? The gratification comes when you see that what you built made the cut into the movie, and even better, if it made it into the trailer.


Screen graphics for “DC’s Legends of Tomorrow” Season 1 Episode 6, courtesy Danny Ho and Scarab Digital.

Kirill: How was Covid for you? Do you feel that it is behind us?

Danny: Covid was interesting. The whole world stopped, and everyone stopped making movies for a certain period of time. But the industry found a way around it. They found a way to deal with the Covid protocols, and they were able to shoot again within a certain amount of time.

Of course, it sucked. It was six months of no production. I want to say we were prepared for something like this. We always have contingency plans internally, but I’m not going to say it wasn’t rough. We enacted the contingency plan, and we were able to get through it until productions came back. Everything continued, and there was a backlog of content they wanted to produce. When it came back, it came back swinging pretty hard.

The strikes are a totally different thing. I would even say that the strikes impacted and are still impacting our world more than Covid did. I see way more ripples going on post-strike than post-Covid. The main difference is, generally speaking, that post-strike there are about 30% fewer productions being greenlit. And the productions that are actually greenlit are spending about 30% less.

If you think about screen graphics and FUI, that’s a little bit of a luxury line item in a production budget. You have to have props, you have to have set decoration, you have to have costumes, hair and makeup. Screen graphics is one of those things… do you have to see that on the set? You can always hide the phone or you can talk about it. If you have to do text messaging, you can do screen graphic bubbles as a post item. There are ways around it to tell the story. If it’s the type of show that needs it, then they might greenlight it. But I feel like it’s a different world post-strike than it is post-Covid.


Screen graphics for “Peacemaker” Season 1 Episode 3, courtesy Danny Ho and Scarab Digital.

Kirill: Speaking of big changes, how do you see generative AI today?

Danny: Internally, we see it as an advantage. Recently we had an internal discussion, and I pulled up an archived video from the Today show with an interview talking about how people were receiving Photoshop. What do you mean you’re going to have footage that can be doctored and not be the real original photo of what you intended? It’s the same conversation that’s happening around AI. What do you mean you’re going to have all these deepfakes? You’re not going to be able to believe what you’re seeing anymore because it’s generated.

I see it as another tool, just like Photoshop back in the day. You’re going to be able to accelerate creating things that might otherwise be tedious. In our world of content creation, these are the things I would want to apply it to immediately. We want a really nice detailed map, but it takes a long time to create a detailed map. So if this could be generated, but look like a real map, then that’s something we can accelerate and not have to spend time on. It’s an organic thing, but it needs to have detail to feel organic.

It’s another ace up the sleeve to be able to create iteratively or quickly, so that we can be more creative on behalf of the producers, so that we can be the best we can be. And it’s going to get faster and smarter.

In another way, it’s also going to democratize content creation in general. Someone will be sitting in their home, and they have an idea, and if that idea is great enough and they put in some effort and use the tools to do that, then power to them. Look at what it took George Lucas to make “Star Wars”. If everyone had access to Industrial Light & Magic in their backyard, then they would be able to bring those ideas to life. Obviously it has to be a compelling story and a compelling visual. But if you had the right tools at your disposal, and you’re able to have a tool or a team that can get you there and convey that story, then it could be very powerful.

The interesting part about it is that at this very moment, the studios tell us specifically, on their projects: whatever you’re doing, no generative AI is allowed. Period!

Kirill: There’s a lot of concern about copyright issues, and what the sources are for the training corpus.

Danny: It does come down to that. Maybe they can have a form of it that didn’t touch the Internet, or whose source material was limited in terms of how it got to its result. Maybe you use this tool, but it only takes stuff from the Sony library. Maybe we get to the point where whoever is supplying the tool can verify that the learning models have not been influenced by the entire web.

There’s a company called Cuebric that is pretty cutting edge in the virtual production space, and as far as I understand, they’re doing exactly that. They’re limiting their generative AI and the learning models. They’re controlling what the input is, so that they can verify the input. That way you have full control of what the output is, and you know where it came from.

Kirill: If you could give a piece of advice to your younger self when you were starting out, what would it be?

Danny: I would attribute some of my success to always having an open mind, having no ego, and being open to volunteering. When I talk about the video processes that I’ve done before, it all came from volunteering. I didn’t expect to get anything out of it other than knowledge and [learning from the] experience [of others].

Be open to learning from others. Have an open mind. Don’t always think that you know everything. Even after learning from others, I would challenge what you’ve learned, because without innovating on what you’ve learned, humanity does not progress. Always challenge what you’ve learned; past experience should not always define and dictate the future. Keep pushing for what you think is impossible, because you’ll never experience it if you don’t try. “There’s no fate but what we make” [for ourselves].


Screen graphics for “Sonic the Hedgehog”, courtesy Danny Ho and Scarab Digital.

And here I’d like to thank Danny Ho for taking the time out of his busy schedule to talk with me about the art and craft of screen graphics, and for sharing the supporting materials for the interview. You can find Danny on LinkedIn, and more of Scarab Digital’s work on their main site. And if you’re interested in reading additional interviews about the wonderful world of screen graphics and user interfaces for film and TV, click here for more.