
Felix Knight GAM400A Fall 2020 Instructors: Sward, Picioccio, Kaplan

Introduction
The concept for Living Megastructure was heavily influenced by the last game I worked on as a
Technical Designer: Mu and The Little Reef. The core loop of the game is relatively similar: you explore
the environment with some basic platforming mechanics, and grow plants to either open more of the
level for yourself or help friendly NPCs.

As the technical designer for the team, my responsibilities were primarily:

- Designing and developing most elements of the planting mechanic
- Playtesting and iterating upon the planting user experience
- Designing and implementing our menus
- Guiding our team’s meetings and managing our roadmap and backlog

Objective
Living Megastructure has three pillars to which all our content and features contribute: the
challenges posed by the structure, the extrinsic and intrinsic motivations provided by the character’s
community, and the player’s ability to shape the world by planting seeds that grow into plants,
overcoming obstacles and making the structure habitable for their community. Our goal is for the player
to feel hopeful, creative, and empowered as they use their abilities to help their community.

Prior Research
There were a few key games that inspired our design and development. The first, as mentioned
previously, is the last team game I worked on, Mu and The Little Reef. The rest of the team had
playtested it, as well, and we determined what we liked and disliked about the primary mechanic of
planting. In Mu, players grew plants that then created fruit one could carry, making the fruit much more
interactive and important than the plants. Most challenges also only required a couple of plants, so the
player wasn’t using the mechanic as frequently as we had initially imagined. With Living Megastructure,
we wanted to shift the focus to the plants themselves to fully utilize the content we’d be creating and
have the player use the mechanic more frequently. The discovery of getting and using new seeds was a
solid way to keep players engaged, and they really enjoyed meeting and helping the reef’s characters, so
those were things we wanted to focus on here. We also discussed the popularity of games like
Animal Crossing and Stardew Valley, whose similar community-focused engagement may have helped
players feel more secure and personally valued over the last few tumultuous years.

On the tech side, I knew I could implement our planting mechanics in a similar way as I had
before, with a key difference learned on my solo project from the previous semester: a sandbox game
with buildings made of destructible, placeable concrete rooms. With this game, we wanted the
player to be planting seeds frequently, and I’d learned that Hierarchical Instanced Static Meshes
(HISMs) are much more performant than typical static mesh components or actors when creating a
high volume of the same static mesh. HISMs have their own difficulties, but I’d learned how to
work with them and felt prepared to use them in this game as well.
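As a rough illustration of why instancing helps, here is a minimal plain-C++ sketch of the idea behind a HISM component. The names are illustrative, not Unreal’s API: many copies of a mesh share one component that stores only per-instance transforms, so they can be submitted as a single batch instead of as one object per copy.

```cpp
#include <cstddef>
#include <vector>

// Illustrative sketch only, not Unreal's API. An instanced static mesh
// component conceptually stores one mesh plus an array of per-instance
// transforms, so N copies cost one draw call instead of N.
struct Transform {
    float position[3];
    float scale;
};

class InstancedMeshBatch {
public:
    // Returns the new instance's index, mirroring how an instance
    // handle would be used to update or remove that copy later.
    std::size_t AddInstance(const Transform& t) {
        instances_.push_back(t);
        return instances_.size() - 1;
    }
    std::size_t InstanceCount() const { return instances_.size(); }

private:
    std::vector<Transform> instances_;  // uploaded to the GPU as one buffer
};
```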

Our team never had, nor expected to have, dedicated artists, so it was important to us to define
a visual style for our world we would be capable of creating with our resources that would look as
cohesive as possible and fit the themes and mechanics of the rest of the gameplay. We looked primarily
at two references for this: NaissanceE and Control.

5/12/2021 Copyright 2020 DigiPen Institute of Technology Page 1 of 7

Image source: Steam page- https://store.steampowered.com/app/265690/NaissanceE/

Image source: Control Twitter- https://twitter.com/ControlRemedy/status/1121013192765972483


NaissanceE was made by a relatively small team, demonstrating that combining simple
modular pieces can produce varied, detailed, intriguing cityscapes. Control demonstrated
the beauty of juxtaposing natural elements against hard, imposing geometry, which fit our theme and
primary mechanic perfectly. We searched for free environment art assets, eventually deciding on
Epic’s Soul City art: a departure from the sterile brutalist structures that originally inspired us, but one
that still captured the feeling of an imposing, inhospitable megastructure.

Implementation and Design


As mentioned before, much of my work time on this project was spent implementing our game’s
planting mechanic and menus using Unreal’s Blueprint and UMG systems. The testing and evaluation of
these systems from a user perspective is described in the next section, Testing. This section will describe
roughly how these systems work and why I created them in this way.

Planting
Below is a diagram showing how the different objects and components in the planting system
interact with one another. To compartmentalize things as much as possible, so multiple people could
work on different aspects of the character at once, I made our planting inventory a mostly stand-alone
component with just a handful of interactions with the character blueprint, such as input for selecting
seeds and collision events for picking up seeds and detecting plants to compost. The seeds and plants
are driven by a data table, making it very easy to add new plants, especially ones that are just a mesh
with no additional functionality, such as our staircase and bridge.


I decided to use a radial menu for the seeds, expecting it to be more intuitive and easier to
navigate than the horizontal hotbar common in shooters and survival games, especially with an analog
stick; we had decided at the beginning that gamepad support for our platforming would be
appreciated by a fair few players. We chose the right stick to control the wheel, as we initially
planned on a static, game-controlled camera similar to A Short Hike, but, even with our
player-controlled camera, toggling seed selection with the right trigger lets the player
move and select a seed simultaneously.

Making a radial selection with a variable number of components, however, is rather challenging.
We initially didn’t have many ideas for seeds, so I put a stake in the ground at 8 seeds, making the
menu easy to memorize and navigate and aligning closely with how much content we estimated we
could create. This did create a bit of a design constraint, but 8 seeds ended up being a good
progression for our game length and cognitive load, and the only code that would need significant
change to support more or fewer seed slots is the analog angle-based selection.
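The angle-based selection can be sketched in plain C++ (a hypothetical standalone version of the math, not the project’s actual Blueprint): the stick direction is converted to an angle, offset by half a wedge so slot 0 is centered on “up”, then divided into equal wedges.

```cpp
#include <cmath>

// Illustrative sketch of analog-stick slot selection for an 8-slot radial
// menu (not the project's Blueprint). Slot 0 is "up" and slots increase
// clockwise; changing slotCount is the only change needed for more seeds.
int RadialSlotFromStick(float x, float y, int slotCount = 8) {
    const float kTau = 6.28318530718f;
    // atan2(x, y) makes "up" = 0 and angles grow clockwise toward "right".
    float angle = std::atan2(x, y);
    if (angle < 0.0f) angle += kTau;  // wrap into [0, tau)
    float slotWidth = kTau / slotCount;
    angle += slotWidth * 0.5f;        // center slot 0 on "up"
    if (angle >= kTau) angle -= kTau;
    return static_cast<int>(angle / slotWidth);
}
```

With 8 slots this maps up to slot 0, right to slot 2, down to slot 4, and left to slot 6, so the wedges line up with the directions players naturally flick toward.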

When a collider on the character overlaps seeds, the inventory identifies and destroys them,
adding to its count. When the game starts, the inventory creates a plant manager that has a HISM
component for each plant type’s mesh (driven by the plant data table). When the player is over
plantable ground (collision and line-trace data sent to the inventory) and uses the planting input, the
plant manager creates a mesh instance. To grow it over time, with the possibility of growing multiple
plants simultaneously, I had to move that task to another object: the plant manager spawns a
plant growth manager that animates the mesh and then destroys itself.
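A minimal plain-C++ sketch of the growth-manager idea (names are illustrative, not the project’s Blueprint): each grower interpolates one plant’s scale toward full size and flags itself as done so it can be destroyed, which is what lets several plants grow at once.

```cpp
#include <algorithm>
#include <vector>

// Illustrative sketch, not the project's Blueprint: one grower per
// growing plant, each interpolating its mesh instance's scale.
struct PlantGrower {
    float elapsed = 0.0f;
    float duration = 2.0f;  // seconds to reach full size (assumed value)

    // Advances the animation and returns the current scale in [0, 1].
    float Tick(float dt) {
        elapsed += dt;
        return std::min(elapsed / duration, 1.0f);
    }
    bool Done() const { return elapsed >= duration; }
};

// Several plants can grow simultaneously because each has its own grower;
// finished growers are destroyed, mirroring the self-destroying manager.
void TickAll(std::vector<PlantGrower>& growers, float dt) {
    for (auto& g : growers) g.Tick(dt);
    growers.erase(
        std::remove_if(growers.begin(), growers.end(),
                       [](const PlantGrower& g) { return g.Done(); }),
        growers.end());
}
```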

When I realized plants would need more functionality, such as being compostable, bouncing the
player, having a point light, or absorbing pollution, I implemented plant objects that are spawned at the
same location as the mesh instances and handle that additional behavior. These can be thought of as
the plants with everything but the mesh, since the mesh instances all need to live on one object. This
includes the composting animation, which is effectively growth in reverse. In hindsight, it would have
been better to put the growth animation on the plant objects as well, rather than on the growth
manager; this is a byproduct of the order in which I created the functionality, but it never became
problematic enough to warrant refactoring.

As mentioned before, defining a new plant is relatively simple with this system: meshes for the
seed and plant, its name, and an icon for the inventory are all entered in the plant data table. Each
plant gets a plant object inheriting from a base class that additional functionality can be built on, such
as adding a point light for the lantern plant.
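A hypothetical plain-C++ sketch of the data-table pattern (field names are assumptions, not the project’s actual table schema): each row fully describes a plant, so adding one is a data change rather than a code change.

```cpp
#include <map>
#include <string>

// Hypothetical plant table row, not the project's actual schema: a row
// carries everything the inventory and plant manager need to know.
struct PlantRow {
    std::string seedMesh;   // mesh shown for the seed pickup
    std::string plantMesh;  // mesh grown in the world (one HISM per type)
    std::string icon;       // icon shown in the radial inventory
    bool hasBehavior;       // false for "just a mesh" plants like the bridge
};

using PlantTable = std::map<std::string, PlantRow>;

// Illustrative content: names and meshes are invented for this sketch.
PlantTable MakePlantTable() {
    return {
        {"Staircase", {"SeedSmall", "StairMesh", "IconStair", false}},
        {"Lantern", {"SeedGlow", "LanternMesh", "IconLantern", true}},
    };
}
```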

Menus
Implementing the UI layout for the menus was rather simple: I drew on what I’d created for my
current solo project, which I had tested and found easily navigable and compliant with DigiPen’s
Game Gallery requirements. I’ve personally found making a quick mockup directly in Unreal’s UMG
editor easier than sketching; it gives me a much better sense of the needed scale for different
elements and text, as well as how they’ll adjust at different aspect ratios. That’s what I did here: the
main menu is a widget whose side buttons cycle a widget switcher through the submenus, which are
all individual widget objects.


Implementing mouse support is mostly trivial, with the exception of the options menu, where
resolutions must be added dynamically for the player’s monitor, and the window mode and resolution
dropdowns must accurately convert between strings and enums with a map.
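The string-to-enum conversion can be sketched like this in plain C++ (the enum and strings are illustrative, not Unreal’s actual EWindowMode values): one map drives both directions, so the dropdown text and the setting can never drift apart.

```cpp
#include <map>
#include <string>

// Illustrative window-mode enum and names, not Unreal's EWindowMode.
enum class WindowMode { Fullscreen, WindowedFullscreen, Windowed };

// One map serves as the single source of truth for both conversions.
const std::map<std::string, WindowMode> kModeFromString = {
    {"Fullscreen", WindowMode::Fullscreen},
    {"Windowed Fullscreen", WindowMode::WindowedFullscreen},
    {"Windowed", WindowMode::Windowed},
};

// Reverse lookup through the same map keeps the dropdown text consistent.
std::string StringFromMode(WindowMode m) {
    for (const auto& [name, mode] : kModeFromString)
        if (mode == m) return name;
    return "Fullscreen";  // safe default for unknown values
}
```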

For the different panel and button graphics, I used an icon maker from the Unreal Marketplace
to create some minimalistic, colorful 9-slice textures I could apply to a variety of different elements.

The real difficulty with the menu came down to gamepad navigation. Unreal’s focus system
doesn’t provide an easy way to find when an individual element has focus, which is necessary for
showing feedback, so I initially tried to create a system that would change focus manually using input
from the player controller. Despite my trying to disable the automatic focus switching, it still seemed
to interfere with my manual system, so I cut it and used the focus system as usual. To get the element
with focus, I’d either have to derive each element from a parent class with an override for its focused
function, or loop through the interactive elements at a relatively fast interval to find the one with focus
while the game is paused. With the entire UI created, there weren’t many interactive elements, and
the polling didn’t significantly impact performance, so I used that method.
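The polling approach can be sketched in plain C++ (a hypothetical stand-in for Unreal’s widget focus queries, not its actual API): while the game is paused, periodically scan the interactive widgets and return whichever one reports focus so it can show highlight feedback.

```cpp
#include <string>
#include <vector>

// Hypothetical widget stand-in; in Unreal this would be a focus query
// on each interactive UMG element rather than a plain bool.
struct Widget {
    std::string name;
    bool hasFocus = false;
};

// Returns the focused widget, or nullptr if none has focus. With only a
// handful of menu elements, a linear scan each polling interval is cheap.
Widget* FindFocusedWidget(std::vector<Widget>& widgets) {
    for (auto& w : widgets)
        if (w.hasFocus) return &w;
    return nullptr;
}
```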

Testing
To be transparent: testing and iteration were absolutely the weakest aspects of our process. We
identified this in the first milestone, just a few weeks in, but never fully implemented dedicated
working hours for it. I take personal responsibility for this lack of testing: I tend to get excited about
implementing everything I can in a sprint, and never set aside enough time to test every sprint, which
is how often it should have happened.

There were a few other factors that contributed to this failure. We kept scoping our game down
from the beginning, cutting features as we learned our capacity as a team, but were still rushing to
finish what we considered the bare minimum functionality to deliver our intended
experience: helping the community in a constructive way. Planting and AI were big parts of this, and
both took more time than expected; learning more about our AI changed our game mechanics and
interactions with villagers significantly. We also never allocated time to fully document and prioritize
a product backlog, which might have helped us determine what was really necessary and what could
be dropped in favor of testing and iterating on our existing features and content.

The testing we did do was a combination of remote and in-person: I was able to run three formal
tests with family on the user experience of planting, and one informal test with my cousin. We put
our build on the class’s Microsoft Teams files; it directed players to a Google Form when they quit,
providing us with anonymous feedback on the planting experience, level design, and sound.

For in-person tests, I recorded notes on specific questions in one section of a Google Form, then
gave it to the players when they were done to fill out the second section: a few 5-point Likert scales
about their experience.

I asked players what kinds of scenarios they could imagine using the plants in, in case they had
any interesting ideas about their potential uses, but they mostly focused on how they had just utilized
them: “allow villagers to go somewhere”, “make pollution go away”. From the Likert scales, I had
indications that:


- Planting felt neither entirely organic nor entirely artificial.
- Planting sounds mostly aligned with their aesthetic.
- Planting seeds, seeing the resulting plants, and discovering what they did were all engaging.
- Planting the special seed didn’t really make them feel like they were helping the other character.
  It’s worth noting the AI had changed, and I told players how it would ideally behave (walking
  out onto the plant), but it did not do so at the time.

Our remote test asked the same planting experience questions but received many more
responses. The planting questions had similar answers with wider distributions.

Most of the open feedback related to bugs, difficulty climbing the staircase plant, and the first
special plant not looking special enough, as it reuses a resized bridge seed mesh.

Feedback on our level indicated:

- Guidance wasn’t very clear.
- It was usually clear what to plant and where.
- Opinions on seeing where villagers were going were mixed (though villagers only moved once,
  as that feature wasn’t fully implemented yet).
- The level is too dark.

Responses to our audio questions indicated the audio was consistent with the world and
gameplay, well mixed, and its ambience subtle, as our sound designer intended.

Sadly, since we didn’t set aside much time for testing, we didn’t set aside much time for
iteration, either. By the time we received this remote feedback, we were already rushing to meet the
game gallery requirements, so improving the quality of working elements was necessarily a lower
priority. We were, however, able to act on process feedback from milestone meetings, such as
switching our design from three levels to one and incorporating the content from those planned
levels into our emptier spaces. After the first milestone, we focused on implementing content like art
assets to define the game’s style and feel, primarily so our sound designer could create music and
sound effects that would fit the aesthetic our free assets could afford. Each milestone resulted in us
cutting features we deemed unnecessary to our game’s vision so we could focus on the necessary
elements without burning out.

Results
For my individual tasks: planting is stable, functions consistently, didn’t interfere with other
systems or team members’ ability to work on the character, and its bugs have been fixed or don’t
significantly impact the player experience. As a mechanic, it provides decent discovery engagement,
though it could feel more organic. I think the meshes themselves would have benefited a lot from a
dedicated artist, but I could, and may, adjust the curves of the growth animations to make them
appear more organic. Multi-step animations, like the stalk growing and then its leaves, would make
plants appear much more organic, but would take a significant amount of time to implement and tune.

The main menu is quite standard, follows common interface heuristics, and didn’t receive any
negative feedback from our testers, though we didn’t ask specific questions about it. I would’ve liked to
focus and test more on this, but it’s definitely one of the “safer” interactive systems in the game. It is
functional and readable, and that’s all we needed it to be.

As a team, we have produced stable, working builds from the first week on, with a significant
increment each sprint. Players are engaged by the discovery of new seeds and by their interactions
with plants. As our AI exhibited more of the functionality we planned for it, players’ sense of assisting
their community increased. Speaking as of Monday, as we finish implementing the level scripting for
the villagers this week, I think there is a good chance the game will deliver on this goal; not as well as
we’d initially hoped, but it will still be achieved. There’s obviously less certainty and polish in that than
I would like, given it was our primary goal, but consistent progress is being made on the feature’s
implementation in our level, and even the minor interaction we had before achieved our goal
experience for a few players.
