A technologically inclined Star Trek fan known as “Captain Robau” has been experimenting with machine learning algorithms (commonly marketed as “AI”) to upscale art assets from the video game Final Fantasy VII. He had such good results that, on a whim, he decided to run some tests on his favorite Trek show, Deep Space Nine. Full details are on his blog, and he plans a follow-up with even more technical details in the coming days. The results show promise, and we’ve embedded a few examples below.
As many fans know, two Star Trek shows remain stuck in standard definition: Deep Space Nine and Voyager. There is no indication that CBS will give them the treatment The Next Generation received (a total remaster from the ground up), due to the high cost and changing markets.
Like many fans, Captain Robau has lamented the fact that DS9 is stuck in the 1990s, but then he thought of using AI upscaling based on his experiences with Final Fantasy:
Just like Final Fantasy 7, whose backgrounds, textures, and videos I am upscaling in the Remako mod, DS9 was also relegated to a non-HD future. While the popular Original Series and The Next Generation were mostly shot on film, the mid-90s DS9 had its visual effects shots (space battles and such) finished on video.
While you can rescan analog film at a higher resolution, video is recorded at a fixed resolution and can’t be rescanned. This makes it much costlier to remaster this TV show, which is one of the reasons why it hasn’t happened.
This is where neural networks could come in, I thought. With tools like AI Gigapixel, I knew it might be possible to scale the low-definition frames of DS9 up to a higher resolution such as 1080p or 4K. It would never be the same as a proper remastering, but it would be a step in the right direction.
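To get a sense of what any upscaler is doing, here is a minimal sketch of the classical baseline: bilinear interpolation, which invents in-between pixels by averaging their neighbors. Neural tools like AI Gigapixel replace that simple averaging with learned predictions, but the task is the same: guessing pixels that were never recorded. (Plain illustrative Python on a tiny grayscale “frame”; nothing here is taken from Captain Robau’s actual pipeline.)

```python
def upscale2x(img):
    """Double a grayscale image (list of rows) with bilinear interpolation.

    Each output pixel maps back to a fractional position in the source
    and blends the four surrounding source pixels. This is a guess about
    detail, not a recovery of it.
    """
    h, w = len(img), len(img[0])
    out = [[0.0] * (2 * w) for _ in range(2 * h)]
    for y in range(2 * h):
        for x in range(2 * w):
            # Map the output pixel to a fractional source coordinate.
            sy = min(y / 2.0, h - 1)
            sx = min(x / 2.0, w - 1)
            y0, x0 = int(sy), int(sx)
            y1, x1 = min(y0 + 1, h - 1), min(x0 + 1, w - 1)
            fy, fx = sy - y0, sx - x0
            # Blend horizontally on the two nearest rows, then vertically.
            top = img[y0][x0] * (1 - fx) + img[y0][x1] * fx
            bot = img[y1][x0] * (1 - fx) + img[y1][x1] * fx
            out[y][x] = top * (1 - fy) + bot * fy
    return out

# A 2x2 "frame": two bright and two dark pixels.
frame = [[0, 255], [255, 0]]
big = upscale2x(frame)
# The new in-between pixels are averages of their neighbors,
# e.g. big[1][1] lands exactly between bright and dark at 127.5.
```

A learned upscaler would instead predict plausible texture for those in-between pixels, which is why its results can look sharper than interpolation but are still only educated guesses.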
For a first test he tried out still frames from the beloved episode “The Visitor.” Click to open each image in a new tab so you can see the differences in detail for yourself.
Testing the process on actual video
Captain Robau then decided to use the opening five minutes of the fan-favorite season six episode “Sacrifice of Angels” to test the machine learning algorithms on moving pictures, not just still frames. You can see the upscaling at work below in a side-by-side comparison video. Be sure to watch it full screen and at 1080p.
DS9 opening credits in UltraHD
And, getting really ambitious, he tried upsampling the opening credits to 4K. As he notes on his blog:
This nearly melted my computer, as it is a lot more intense to upscale than 1080p, so I’ll stick to this single video for 4K examples of DS9 Enhanced.
Not a substitute for a proper remaster
As Captain Robau notes, this is not going to achieve the same results a proper remastering would. Upscaling makes educated guesses, deriving higher-quality image data from a lower-quality source. The example videos show just how variable the end results can be: in some scenes there’s a noticeable uptick in detail, while in others the image is nearly as soft as the original. That’s primarily because the source simply lacks the information to do much with. It is still limited by the poor color (chroma) and brightness (luminance) information inherent in recording to NTSC video tape. Artifacts from the early CGI are also visible; since they exist in the source, they are passed on to the upsampled copy too.
While the results can be quite impressive, even with the best algorithms at work the old saying “garbage in, garbage out” still holds true. While not a perfect apples-to-apples comparison, check out the AI-upscaled image of DS9 and compare it to the space station as it appeared on TNG in the season six episode “Birthright.” Next Gen, of course, was given a full remaster starting in 2012.
The Deep Space Nine model has a ton of detail, much of which was not clearly visible before Next Gen was remastered. One of the most obvious reveals from the remaster is the periodic yellow detailing on the station, which was previously lost, turned into a muddy shade of grey in standard definition. Because the AI upscale only has the old standard-definition version to work from, that yellow detail is not present.
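The lost yellow detailing is a concrete case of “garbage in, garbage out”: detail finer than one standard-definition pixel gets averaged away at capture time, and no upscaler can tell it was ever there. This toy sketch (plain illustrative Python, not any real video pipeline) shows a fine pattern collapsing into uniform mid-grey when averaged down, after which the pattern is unrecoverable:

```python
def downsample2x(img):
    """Average each 2x2 block into one pixel.

    A crude stand-in for capturing a detailed model at SD resolution:
    anything smaller than one output pixel is blended away.
    """
    h, w = len(img) // 2, len(img[0]) // 2
    return [[(img[2*y][2*x] + img[2*y][2*x+1] +
              img[2*y+1][2*x] + img[2*y+1][2*x+1]) / 4.0
             for x in range(w)] for y in range(h)]

# A fine checkerboard of bright (255) and dark (0) pixels -- think of
# small high-contrast markings on the station model.
detail = [[255 if (x + y) % 2 == 0 else 0 for x in range(4)]
          for y in range(4)]

sd = downsample2x(detail)
# Every 2x2 block averages to the same mid-grey (127.5), so the "SD"
# copy carries no trace of the pattern. An upscaler working from sd
# alone has nothing left to reconstruct.
```

This is why an upscale can sharpen what survived the original capture, but only a remaster that goes back to the source models or film elements can restore what didn’t.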
Captain Robau has made it clear this was just a test, and he has no intention of applying this type of process to full episodes. For the reasons outlined above, it is clear that this process would not, and could never, achieve the kind of results seen with the Star Trek: The Next Generation remastering, a meticulous and labor-intensive project. However, the tests show that marginal improvements can be made to the standard-definition version of the show. Feeding the learning algorithm more examples, and ideally higher-quality images, could in theory narrow the gap a bit further. Perhaps in the coming years CBS could use a process like this for both Deep Space Nine and Voyager, at a fraction of the cost of the TNG process. The result would not be a true HD release, but it could add value for syndication and streaming sales, especially in the years to come.