Compiling Unreal Engine 4.20

In our next tech post, Steve Longhurst shows you how to compile your own version of the Unreal Engine. We at DO-Games are experienced not only with Unity but also with Epic Games' Unreal Engine. The video walks through building the engine from source, an essential starting point for console development and porting.

If you are using Epic Games' Unreal Engine 4 to make a game, and you plan to port to a console such as Xbox One, Switch or PS4, you will need to compile your own custom version of the engine from source. Epic Games only supplies the console support plugins as source code once you have signed up with the console manufacturer's developer program and signed their Non-Disclosure Agreement.

Building the engine yourself might sound daunting, but it's really not hard, and Epic Games provides several useful documentation pages on how to go about it.

Compiling your own engine is also a very good way to dig into the inner workings of Unreal using the Visual Studio debugger. If you write C++ code (not just Blueprint logic), you can step through your own code and right into the engine source, allowing you to learn what's really going on under the covers. Lastly, it also means you can fix bugs or add missing features in the Unreal Engine itself, which we think is one of the big selling points of Unreal for anyone with coding experience.

This video is my experience of compiling the Unreal Engine on Windows 10, using Microsoft Visual Studio 2017. Reading the documentation will give you all the theory, but it often helps to see someone actually perform the steps. Just ten minutes of watching and you will have first-hand experience for when you come to do it yourself.

- SL

Source control with Git Large File Storage

This is the first of hopefully many technical posts; we'll be talking about Unity, game dev, and hopefully some retro game dev very soon.  So, to kick things off, let's start at the beginning with some source control...

Recently we have started a few projects that were complete PC games with a large footprint.  The first thing we do is put the project under source control, and we've always used GitHub and Git LFS, which have worked well.  However, with large projects it can be fairly common to see the dreaded:

"fatal: The remote end hung up unexpectedly"

I've seen various posts on the net suggesting switching from HTTPS to SSH, or setting the HTTP post buffer parameter to a large value:

git config http.postBuffer 524288000

However, for me none of these works reliably in all cases.  GitHub has a file size limit of 100MB, but you often hit problems well before files reach that size.

So here are some basic things we do when first pushing a project to GitHub.

Start with a small initial commit

Just set up the README, .gitignore, and your LFS tracking (see below). The reason for this is simple: if you later need to rebase and everything is in the initial commit, you will soon discover you are stuck.  A rebase needs a parent commit, and the very first commit has none, so it cannot be rebased.

At this point, don't forget to install GitLFS.

git lfs install

Split up large commits and push one at a time

Even if it does work, a single giant commit is fairly unwieldy. There is nothing more frustrating than firing off a 'git push' only to have it fail 3% from the end after 20 minutes.  In my experience the transfer of large files is fairly intolerant of hiccups, so if you missed a 'git lfs track' or the connection hangs, it's much easier if you don't have to start again from scratch, and when there is a problem this helps you narrow it down.

So add files to commits in groups, either ones that occur naturally or alphabetically.  You don't need to go too mad on the granularity; a few reasonably sized chunks is all that is needed.

Find all the large files in your project and lfs track them first!

It's quite important to track all the large files in your project to begin with.  If you don't, and you commit any large files by accident, they'll be added to your Git object database and bloat it.  Secondly, if you miss large files your pushes will fail, either when they get rejected or when the transfer times out.

While you can rebase and track the files at a later date, that is painful, and you usually end up having to clean out the blobs manually anyway.  Much better to get things under control from the start.

Using find and ls in bash, you can list all assets above a certain size: 

find . -type f -size +50M -exec ls -lh {} \;

I'm sure with some sed and grep wizardry you can just list the extensions to track, but this is a good enough start.
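One possible version of that wizardry, assuming your large files all have extensions: pipe the find output through sed to keep just each file's extension, then de-duplicate, giving a ready-made list of patterns to feed into 'git lfs track':

```shell
# List the distinct extensions of every file over 50MB.
# The sed expression keeps only the text after the final dot of each path.
find . -type f -size +50M | sed -n 's/.*\.\([^./]*\)$/\1/p' | sort -u
```

Files with no extension are silently skipped by the sed filter, so eyeball the plain find output too.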

If you're using Unity, you'll often notice examples or demos from Asset Store items in here.  It might be time to prune the project unless you really need them eating up your storage space.

GitHub allows files up to 100MB but gives a warning over 50MB.  I usually look at anything over 10 or 20MB, as well as any obvious binary-only assets.

If you do make mistakes and need to clean up afterwards, I have heard good things about the BFG Repo-Cleaner, though I have never used it myself.

Until next time...

Hopefully this will help you out when committing an existing project to any Git-based system with file size limits that require you to use LFS.  As mentioned, there will be more technical posts coming up soon, including a post-mortem of sorts for The Adventure Pals and some retro IBM EGA appreciation... so stay tuned.

- MT