Home

Abstract

In this work we explore the use of convolutional neural networks for style transfer in the context of a real-time deferred renderer such as Unreal Engine 5. In particular, we investigate feeding G-buffer data to the network in addition to the final RGB image to improve its capabilities. An initial implementation in Unreal Engine runs at 50 frames per second, and incorporating the G-buffer data significantly improves the quality of the network's output.

Thesis Code1 Code2 Code3
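
To make the idea from the abstract concrete, here is a minimal sketch of how G-buffer channels (assumed here to be world normals, depth, and base color) could be concatenated with the rendered RGB frame and fed to a style transfer network. The `StyleTransferNet` below is a placeholder architecture written in PyTorch purely for illustration; it is not the network used in the thesis.

```python
import torch
import torch.nn as nn

class StyleTransferNet(nn.Module):
    """Minimal encoder-decoder sketch; the actual thesis network differs."""
    def __init__(self, in_channels: int):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(in_channels, 32, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1),
            nn.ReLU(inplace=True),
        )
        self.decoder = nn.Sequential(
            nn.Upsample(scale_factor=2, mode="nearest"),
            nn.Conv2d(64, 32, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(32, 3, kernel_size=3, padding=1),  # stylized RGB output
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.decoder(self.encoder(x))

# Assumed input layout: final frame RGB (3) + normals (3) + depth (1) + base color (3).
rgb = torch.rand(1, 3, 120, 120)
normals = torch.rand(1, 3, 120, 120)
depth = torch.rand(1, 1, 120, 120)
base_color = torch.rand(1, 3, 120, 120)

# The key idea: stack the G-buffer channels with the final frame along the channel axis.
net_input = torch.cat([rgb, normals, depth, base_color], dim=1)  # shape (1, 10, 120, 120)
model = StyleTransferNet(in_channels=net_input.shape[1])
stylized = model(net_input)  # shape (1, 3, 120, 120)
```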

Pretrained Models

Pretrained models are available in the releases section:

Link

Code

Network Training

The source code for obtaining the dataset and training the different variants of the network can be found here:

Link.
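
For orientation, style transfer networks of this kind are typically trained with a perceptual content and style loss computed on frozen VGG features; the sketch below shows such a loss in PyTorch. The layer indices, loss weights, and the commented-out training step are illustrative assumptions and are not taken from the training code linked above.

```python
import torch
import torch.nn.functional as F
from torchvision.models import vgg16, VGG16_Weights

# Frozen VGG16 feature extractor as the perceptual backbone (a common choice, assumed here).
# Note: inputs would normally be ImageNet-normalized first; this is omitted for brevity.
vgg = vgg16(weights=VGG16_Weights.DEFAULT).features.eval()
for p in vgg.parameters():
    p.requires_grad_(False)

def gram(feat: torch.Tensor) -> torch.Tensor:
    """Gram matrix of a feature map, used for the style loss."""
    b, c, h, w = feat.shape
    f = feat.view(b, c, h * w)
    return f @ f.transpose(1, 2) / (c * h * w)

def perceptual_losses(stylized, content, style, style_layers=(3, 8, 15, 22), content_layer=15):
    """Content loss at one VGG layer, style loss at several (layer indices are assumptions)."""
    content_loss = torch.zeros((), device=stylized.device)
    style_loss = torch.zeros((), device=stylized.device)
    xs, xc, xst = stylized, content, style
    for i, layer in enumerate(vgg):
        xs, xc, xst = layer(xs), layer(xc), layer(xst)
        if i == content_layer:
            content_loss = F.mse_loss(xs, xc)
        if i in style_layers:
            style_loss = style_loss + F.mse_loss(gram(xs), gram(xst))
        if i >= max(style_layers):
            break  # no need to run deeper VGG layers
    return content_loss, style_loss

# A single training step would then look roughly like this (model, optimizer and
# data loading omitted; the style weight 1e5 is purely illustrative):
#   stylized = model(net_input)
#   c_loss, s_loss = perceptual_losses(stylized, rgb_frame, style_image)
#   loss = c_loss + 1e5 * s_loss
#   optimizer.zero_grad(); loss.backward(); optimizer.step()
```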

Unreal Project

Unreal Engine Modifications

Access to the Unreal Engine source code requires membership in the Epic Games organization on GitHub. To get access, follow these instructions: Link

The engine modifications can be found in the UnrealEngine fork on the realtime-style-transfer branch.

Link

The LFS assets of the Lyra project are too large for my GitHub storage, so only parts of the project are available there.

Plugin

The standalone style transfer plugin is available on my GitHub.

Link

Full Lyra Project

The full game project is available on my private Gitea instance. You will still need the modified Unreal Engine build to run it.

Link

Videos

rst-960-120-32-3 In Engine

rst-960-120-128-18

rst-960-120-32-18

rst-960-120-32-3

rst-960-120-128-17

rst-960-120-32-17