Abstract
In this work we explore the use of convolutional neural networks for style transfer in the context of a real-time deferred renderer such as Unreal Engine 5. In particular, we investigate feeding G-buffer data to the network in addition to the final RGB image. An initial implementation in Unreal Engine runs at 50 frames per second, and incorporating the G-buffer data significantly improves the quality of the network's output images.
Thesis | Code1 | Code2 | Code3
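The actual network and training code live in the repositories linked below and are not reproduced here. As a rough, hypothetical sketch of the core idea from the abstract (concatenating G-buffer channels with the rendered RGB frame as network input), consider the following PyTorch snippet. The layer sizes, channel counts, and the `GBufferStyleNet` name are illustrative assumptions, not the thesis architecture.

```python
# Hypothetical sketch, NOT the thesis code: a style-transfer network that
# consumes the rendered RGB frame plus additional G-buffer channels.
import torch
import torch.nn as nn

class GBufferStyleNet(nn.Module):
    def __init__(self, gbuffer_channels: int = 7):
        super().__init__()
        # RGB (3) + assumed G-buffer channels (e.g. normals, depth, base color)
        in_channels = 3 + gbuffer_channels
        self.encoder = nn.Sequential(
            nn.Conv2d(in_channels, 32, kernel_size=9, padding=4),
            nn.ReLU(inplace=True),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1),
            nn.ReLU(inplace=True),
        )
        self.decoder = nn.Sequential(
            nn.Upsample(scale_factor=2, mode="nearest"),
            nn.Conv2d(64, 32, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(32, 3, kernel_size=9, padding=4),  # stylized RGB output
        )

    def forward(self, rgb: torch.Tensor, gbuffer: torch.Tensor) -> torch.Tensor:
        # Concatenate the final RGB image with the G-buffer channels
        # along the channel dimension before running the network.
        x = torch.cat([rgb, gbuffer], dim=1)
        return self.decoder(self.encoder(x))

# Example: one 1080p frame with 7 assumed G-buffer channels.
net = GBufferStyleNet(gbuffer_channels=7)
rgb = torch.rand(1, 3, 1080, 1920)
gbuf = torch.rand(1, 7, 1080, 1920)
stylized = net(rgb, gbuf)  # shape (1, 3, 1080, 1920)
```

The key point is only the channel-wise concatenation in `forward`: the network sees scene information (normals, depth, materials) that is not recoverable from the final RGB image alone.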
Pretrained Models
Pretrained models are available in the releases section:
Code
Network Training
The source code for downloading the dataset and training the different variants of the network can be found here:
Link.
Unreal Project
Unreal Engine Modifications
To view the Unreal Engine source code, you need to be a member of the Epic Games organization on GitHub. To get access, follow these instructions: Link
The engine modifications can be found in the UnrealEngine fork on the realtime-style-transfer branch.
The LFS assets for the Lyra project are too large for my GitHub account, so only parts of the project are available on GitHub.
Plugin
The standalone style transfer plugin, on its own, is available on my GitHub.
Full Lyra Project
The full game project is available on my private Gitea instance. You will still need the modified Unreal Engine to run this project.