Concept
This was a prototype game developed in Unity for a job interview.
It is a PC game where you can move a city around (either along the XZ plane or by tilting it) and place buildings on it, while dynamic physics-based events have to be cleared to allow more buildings to be placed. The goal is to place as many buildings as possible in a limited amount of time. Even though it is a PC game, it was built with the new Unity Input System so that it can be made XR-ready simply by adding the required bindings to each input action within the input action map.
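As a rough illustration of this plug-and-play approach (the class name, field and action name below are hypothetical, not the project's actual scripts), the same input action callback runs whether its binding points at a mouse button or an XR controller button:

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

// Hypothetical sketch: reacting to a "Place" action from the input action map.
// Adding an XR controller binding to this same action in the Editor requires
// no code changes here.
public class PlacementInputListener : MonoBehaviour
{
    [SerializeField] private InputActionReference placeAction; // assumed action, e.g. Interactions/Place

    private void OnEnable()
    {
        placeAction.action.performed += OnPlacePerformed;
        placeAction.action.Enable();
    }

    private void OnDisable()
    {
        placeAction.action.performed -= OnPlacePerformed;
        placeAction.action.Disable();
    }

    private void OnPlacePerformed(InputAction.CallbackContext context)
    {
        // Whatever device triggered the binding, the callback is the same.
        Debug.Log("Place action performed");
    }
}
```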
Play the Game
Click the link below to play the game: Virtual City Game Demo
Work Done
Overview of the work done (for a more detailed description, check out the GitHub Repository ReadMe):
- The game was created for mouse & keyboard, but with an XR game in mind. To allow for this, it uses the new Input System, so XR bindings only need to be added to the Interactions Input Action Map's actions in the Editor.
- The city GameObjects were originally created using Unity ProBuilder, but for the final version everything was switched to external visual assets, except the city ground itself.
- I created a simple mouse and keyboard controller (not needed in XR) to move the camera as needed.
- First I implemented the drag and drop from the UI to the city.
- Then I improved the UI by setting it up as a layout group with layout items in the editor.
- Created an animation class that plays an animation and sits on the prefab of the object to be spawned. Other classes extend it and override the method that sets up the animation parameters. I also added a simple particle effect.
- Then I added more city items (roads & buildings) and changed the checks so an object only spawns if it is not colliding with another city item (later I also added trees, which do not prevent the spawning of deployables but are destroyed when an object is placed on them).
- I also changed the skybox and added transparent background spheres to roughly match the background halo seen in the test's example image.
- Created a city mover which detects the player pressing on the city and, depending on the mouse button (left or right), either moves the city around the XZ plane or tilts it around the Z axis.
- Then I created an event spawner, which randomly spawns one of two events. These spawn multiple physics-based objects (with separate physics materials and different spawn behaviours). I also had to give the city a kinematic Rigidbody to prevent clipping, and changed the city movement to go through the Rigidbody rather than directly through the transform (see the Rigidbody sketch after this list).
- As additional features I added a SoundManager which handles playing sounds from a sound library scriptable object. The audio clips were recorded by me, except for the background music.
- For some more visual appeal I also added a simple moving car to the scene, and created a scene with simple game logic to make it more of a "game".
- The project scenes are: the initial sample scene (which only uses external UI assets), the scene with all the new assets plus a button to manually trigger the events (the test scene for the job interview), and a game scene which I did for fun and which is playable here.
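Below is a minimal sketch of the Rigidbody-based movement mentioned above, assuming a kinematic Rigidbody on the city root; the class, fields and method names are illustrative, not the actual CityMover implementation.

```csharp
using UnityEngine;

// Illustrative sketch: moving/tilting a kinematic Rigidbody in FixedUpdate so
// physics objects resting on the city (e.g. hazard items) interact with it
// correctly, instead of setting transform.position/rotation directly.
[RequireComponent(typeof(Rigidbody))]
public class KinematicCityBody : MonoBehaviour
{
    private Rigidbody body;
    private Vector3 targetPosition;
    private Quaternion targetRotation;

    private void Awake()
    {
        body = GetComponent<Rigidbody>();
        body.isKinematic = true;
        targetPosition = body.position;
        targetRotation = body.rotation;
    }

    // Called by the input handling code with the desired XZ position or Z tilt.
    public void SetPlanarTarget(Vector3 worldPosition) => targetPosition = worldPosition;
    public void SetTilt(float zAngleDegrees) => targetRotation = Quaternion.Euler(0f, 0f, zAngleDegrees);

    private void FixedUpdate()
    {
        // MovePosition/MoveRotation let the physics engine move the kinematic body,
        // so objects resting on it are carried along instead of clipping through it.
        body.MovePosition(targetPosition);
        body.MoveRotation(targetRotation);
    }
}
```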
The git repository is available here!
Game Programming
The script structure goes as follows:
- Input Action Map C# class generated in the Unity Editor
- CameraControlls.cs handles camera movement (not relevant for XR)
- UIPanelPositionChange.cs changes the dimensions of the layout group panel that holds the buttons when the pointer enters or exits it
- UIButtonHoverZoom.cs zooms the inner image of a button on hover
- DragNDropPlacer.cs handles deployable object placement, including fading out the UI image, checking whether placement is possible and, if so, instantiating the GameObject (via input action, so with an XR binding there should not be an issue)
- UIButtonPlacementAction.cs calls StartGameObjectPlacement on DragNDropPlacer.cs, given a deployable object prefab and a deployable object image prefab
- ImageFollowMouseCursor.cs is part of the deployable image prefab and follows the player's mouse position (via input action, so with an XR binding there should not be an issue)
- ObjectStartAnimation.cs plays an animation on startup and deploys particle effects
- VerticalAnimation.cs, SidewaysAnimation.cs and ComingDownAnimation.cs extend ObjectStartAnimation.cs, are part of the deployable object prefab and set the type of animation based on 3 parameters
- CityMover.cs handles city movement via its Rigidbody based on left or right mouse clicks and pointer position (via input action, so with an XR binding there should not be an issue)
- UIBlockDetector.cs is called by CityMover.cs to verify that the pointer is not on a UI item (via input action, so with an XR binding there should not be an issue)
- ElementalEffectsPlacer.cs generates the random physics-based events
- HazardItem.cs is the class that is extended to handle random event gameobjects
- TrashItem.cs and SnowItem.cs extend HazardItem.cs to handle the random event gameobjects' own destruction (when too far away from the city)
- UIInfoHover.cs simply displays a UI panel on hover and hides when exiting
- SoundManager.cs is a singleton that handles playing sounds using a queue of audio sources (see the sketch after this list).
- SoundLibrary.cs is a scriptable object that holds all sounds in the game
- Sound.cs defines a sound as an AudioClip, volume, pitch, loop flag and id, to later be played by SoundManager.cs
- Extra Features
- SimpleCarMover.cs handles moving a car across 4 preset points in the scene (which move with the city)
- GameManager.cs handles the game logic for the GameplayScene
- GameSceneReloader.cs is used by the end screen to reload the scene
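As a rough sketch of how the sound pieces above can fit together (a simplified stand-in with hypothetical class names, not the project's exact Sound/SoundLibrary/SoundManager code):

```csharp
using System.Collections.Generic;
using UnityEngine;

// Simplified stand-in for Sound.cs: an id plus playback settings for one clip.
[System.Serializable]
public class SoundEntry
{
    public string id;
    public AudioClip clip;
    [Range(0f, 1f)] public float volume = 1f;
    public float pitch = 1f;
    public bool loop;
}

// Simplified stand-in for SoundLibrary.cs: a ScriptableObject asset holding all sounds.
[CreateAssetMenu(menuName = "Audio/Sound Library")]
public class SimpleSoundLibrary : ScriptableObject
{
    public List<SoundEntry> sounds;
    public SoundEntry Find(string id) => sounds.Find(s => s.id == id);
}

// Simplified stand-in for SoundManager.cs: a singleton cycling through a queue of AudioSources.
public class SimpleSoundManager : MonoBehaviour
{
    public static SimpleSoundManager Instance { get; private set; }

    [SerializeField] private SimpleSoundLibrary library;
    [SerializeField] private int sourceCount = 8;
    private readonly Queue<AudioSource> sources = new Queue<AudioSource>();

    private void Awake()
    {
        if (Instance != null) { Destroy(gameObject); return; }
        Instance = this;
        for (int i = 0; i < sourceCount; i++)
            sources.Enqueue(gameObject.AddComponent<AudioSource>());
    }

    public void Play(string id)
    {
        SoundEntry sound = library.Find(id);
        if (sound == null) return;

        // Reuse the oldest source in the queue, then put it back at the end.
        AudioSource source = sources.Dequeue();
        source.clip = sound.clip;
        source.volume = sound.volume;
        source.pitch = sound.pitch;
        source.loop = sound.loop;
        source.Play();
        sources.Enqueue(source);
    }
}
```

In practice each class would sit in its own file, since Unity expects MonoBehaviour and ScriptableObject file names to match their class names.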
Summary
To summarize:
- The city was built using ProBuilder and external assets, and the game logic uses layers and tags.
- The UI uses layout groups and has on-hover visual effects (image zoom-in).
- Object placement is handled using raycasts, layers and tags, and the image follows the cursor with its own script (see the placement sketch after this list).
- Transitions between 2D icons and 3D game objects are handled by the placer and by the GameObject itself: the image is faded out in the placer, while the animation (based on 3 set animation parameters) lives in its own script, which also instantiates a particle effect.
- The physical interactions happen between event GameObjects and the city itself. These hazard GameObjects prevent deployable GameObjects from being placed, and to move them away the player has to interact with the city. The events can be spawned manually (SceneWithExternalAssets/SampleScene) or by game logic (GameplayScene). To interact with the city there is a CityMover on the city itself (made to allow for multiple cities), which either moves the city on the XZ plane or rotates it around the Z axis based on the mouse button pressed (left or right) and the movement (mouse position raycast).
- All interactions use the new Unity Input System, so adapting to XR simply means adding the XR bindings to the Interactions Action Map's actions, regenerating the C# class in the Unity Editor, and deactivating the mouse controls class.
- Extra features: skybox changes and background; a SoundManager which handles playing sounds from a sound library scriptable object (the audio clips were recorded by me, except for the background music); and a simple moving car, plus a scene with simple game logic to make it more of a "game".
- The project scenes are: the initial sample scene (which only uses external UI assets), the scene with all the new assets plus a button to manually trigger the events (the test scene for the job interview), and a game scene which I did for fun and which is playable here.
- I used external assets for the sound effects.
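For illustration, a placement check in the spirit of the raycast/layer/tag point above could look like the sketch below; the layer name, tag name and overlap radius are assumptions, not the actual DragNDropPlacer values.

```csharp
using UnityEngine;

// Hypothetical placement check: raycast from the pointer to the city ground,
// then refuse placement if another city item already occupies the spot.
public static class PlacementCheck
{
    public static bool TryGetPlacementPoint(
        Camera viewCamera, Vector2 pointerPosition, out Vector3 point)
    {
        point = Vector3.zero;
        Ray ray = viewCamera.ScreenPointToRay(pointerPosition);
        int groundMask = LayerMask.GetMask("CityGround");   // assumed layer name

        if (!Physics.Raycast(ray, out RaycastHit hit, 200f, groundMask))
            return false;

        // Reject the spot if a building or road is already there; trees would be
        // allowed here and simply destroyed when the new object is placed.
        foreach (Collider other in Physics.OverlapSphere(hit.point, 1.5f))
        {
            if (other.CompareTag("CityItem"))                // assumed tag
                return false;
        }

        point = hit.point;
        return true;
    }
}
```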
Conclusion
This was a great chance to learn how to create another game that allows for XR play, and it kept pushing me to develop more mixed and augmented reality games. I also further solidified my skills regarding object pooling, Unity physics, animations, and interactive and intuitive UI. It was a pleasure to do!
I made a lot of coding choices specifically to showcase a broad knowledge of ways to create the needed features, but if I were to optimize it as much as I could, I would do the following:
- Subscribe methods to input events (a bit hard to do since I was performing multiple actions on the same input) instead of polling in Update, e.g. in ImageFollowMouseCursor (DragNDropPlacer and CityMover are fine), or use Actions
- The DragNDropPlacer image fading could have been a separate script (or part of ImageFollowMouseCursor)
- More Scriptable Objects (objects to place and spawned event objects)
- More Object Pooling for spawned objects (DragNDropPlacer, ElementalEffectsPlacer)
- The GameManager events could have used InvokeRepeating (set up in OnEnable/OnDisable) with CancelInvoke when the game ends, rather than the timer I used for them
- City objects could have a script that updates the GameManager, calling UpdatedBuildings when fully built (avoiding unnecessary tag checks)
- A manager to handle hazard item pooling, disabling items when they get too far away, rather than having that logic in the HazardItem subclasses and the ElementalEffectsPlacer update (see the pooling sketch after this list)
- Better Animation Solution (ObjectStartAnimation):
- Instead of scripted animations in code, I could have used an Animator with Animation Clips (Animation Controller)
- As an extra note, for blending between multiple animations (moving characters, etc.) I could use Blend Trees, or shader-based animation for VFX and material-based animations (e.g., dissolve, glow, water effects)
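A minimal sketch of what such a pooling manager could look like (hypothetical names and a naive free-slot search; the project currently handles this inside the HazardItem subclasses and the spawner instead):

```csharp
using System.Collections.Generic;
using UnityEngine;

// Hypothetical hazard pool: instead of each hazard destroying itself when it
// falls too far from the city, one manager disables it and reuses it later.
public class HazardPool : MonoBehaviour
{
    [SerializeField] private GameObject hazardPrefab;
    [SerializeField] private int initialSize = 20;
    [SerializeField] private float maxDistanceFromCity = 50f;
    [SerializeField] private Transform city;

    private readonly List<GameObject> pool = new List<GameObject>();

    private void Awake()
    {
        for (int i = 0; i < initialSize; i++)
            pool.Add(CreateInstance());
    }

    private GameObject CreateInstance()
    {
        GameObject instance = Instantiate(hazardPrefab, transform);
        instance.SetActive(false);
        return instance;
    }

    public GameObject Spawn(Vector3 position, Quaternion rotation)
    {
        // Reuse a disabled instance if one exists, otherwise grow the pool.
        GameObject instance = pool.Find(go => !go.activeSelf);
        if (instance == null)
        {
            instance = CreateInstance();
            pool.Add(instance);
        }
        instance.transform.SetPositionAndRotation(position, rotation);
        instance.SetActive(true);
        return instance;
    }

    private void Update()
    {
        // One central distance check replaces per-item checks in the subclasses.
        foreach (GameObject instance in pool)
        {
            if (instance.activeSelf &&
                Vector3.Distance(instance.transform.position, city.position) > maxDistanceFromCity)
            {
                instance.SetActive(false);
            }
        }
    }
}
```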
Once again, if you want to play the game check it out below, or check out the whole project on GitHub! Virtual City Game Demo GitHub Repository