Unit 9 - Contextual Studies

Introduction to Contextual Studies.

In Unit 9 - Contextual Studies, we will be researching and reporting on various topics that revolve around the games industry. Compared to the units in our previous Level 3 course, it will be a lot more theory- and discussion-based, so our first Contextual Studies class was dedicated to splitting into groups of two and breaking down how games have grown and impacted the society we live in. I wanted to bring the positive developments of games as a medium to light, like how games are being used as an educational tool in creative ways, and how games like Half-Life really pushed the medium forward into being recognized as just that: another medium.

Below is the brainstorm I made with another classmate. I wish we had discussed more examples and impactful events in game history, but it was a Monday morning with everyone returning to college; it happens.


This lesson was a decent foundation in getting me ready for the new unit and what we'll be focusing on. It has me excited. Whatever this unit has to offer, I'm going into it with the intention of getting another distinction on my belt, with another final major project to be proud of.



Research and Notes for the Debate.

05/09/2022


Our first lesson of the unit upon returning for my second year was dedicated to preparing for a "live debate", where two opposing sides would discuss whether or not certain demographics (i.e. race, sexuality, religion, etc.) are properly represented in video games. One side will argue for the current representation in the modern games industry, and I will be on the opposing side, arguing against it.

What I did first was look at games released within the past five years, and games released 10 to 20 years ago, to see what demographics they represent. I am very familiar with the games I've chosen to display here, and what interests me about them is that they all feature demographics I greatly want to see represented more in games: Celeste with its personal undertones of sexuality and gender, starring a trans character, and Sonic Unleashed with its emphasis on travelling around the world and seeing different cultures.


My colleagues Akbarul and Oscar and I have been talking a lot about games that represent sexuality, body type, and nationality, but we haven't talked much about religion. I was going to discuss LGBTQ+ and body type representation since they are topics I am passionate about, but then I started to wonder how many games represent religion or aspects of any specific religion, and my lack of knowledge on the subject in the context of games is what inspired me to pursue the topic of religion in games.

I'm quite nervous about looking into this topic because while I have a pebble of religious history, my lack of knowledge of other religions like Judaism, Islam and Hinduism means I will have to research aspects of those religions and see how often they are represented in games, be it respectful, offensive, minor, or major.

When thinking of "religious" games, only two came to mind: Captain Bible in Dome of Darkness, and The Binding of Isaac. Two games that could not be more different from each other in how they reference Christianity in their tone, purpose, gameplay, and year of release.

[Captain Bible in Dome of Darkness is an edutainment point-and-click game released in 1994. The Binding of Isaac is a top-down, randomly generated rogue-like released in 2011.]

I'm hopeful about how this research will pan out, but considering it is a "live debate", I will have to be on top of my research game to prove my case that religion is underrepresented in games.


Learning Blender.

06/09/2022


In my first week back at college, I spent the lesson learning the basics of the open-source 3D modelling and animation program Blender.

I had attempted to learn the basics of Blender in my free time (via step-by-step instructions from blogs, and tutorials on YouTube), and while I hadn't been able to model anything, I was at least more knowledgeable about the interface and how I could potentially sculpt a model, which made returning to Blender in the classroom all the more exciting and engaging.

In the lesson, I learned how to add objects to the viewport and manipulate their geometry to my liking. As seen in the image below, I pulled on the polygons of the "Suzanne" object in an attempt to play with the eyebrows. With the base cube, I used subdivision (a new technique I learned in this session) to play with the polygons and test the limits of how messy I could make an object.
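One thing worth keeping in mind about subdivision is how quickly the polygon count explodes. As a rough illustration (my own sketch, not Blender's actual code), each subdivision level splits every quad face into four:

```python
# Illustrative sketch: face counts under repeated subdivision.
# Each level splits every quad face into four, so a cube's
# 6 faces become 6 * 4**levels after that many subdivisions.
def subdivided_faces(start_faces: int, levels: int) -> int:
    return start_faces * 4 ** levels

for level in range(4):
    print(f"Level {level}: {subdivided_faces(6, level)} faces")
# Level 0: 6, Level 1: 24, Level 2: 96, Level 3: 384
```

That growth is why a few careless subdivisions can make an object very messy very fast.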


While playing with the user-interface, I found the texture painting window and applied colour to my deformed Suzanne object. I have seen texture files before, and experimented with texture creation, so I understood how to colour it easily.


The shortcuts, tools and sub-options were the most difficult aspects of Blender for me. There are so many alternative options and shortcut inputs for quicker access that I ended up mis-clicking a lot of shortcuts, winding up stuck, and undoing a lot of major mistakes because I had accidentally used tools I had no knowledge of how to use.

However, this was only a practice session so I could see what Blender can do and get adjusted to the user interface and basic tools (model transformation, x, y, and z axis movement, etc.). I'm really excited to learn the fundamentals and make my own assets.



Learning Blender Modifiers and Tools.

08/09/2022


For today's Blender course, we spent one half of the lesson adjusting to the tools, user interface, and modifiers we can use by applying them to the default cube. The second half of the lesson was spent using the tools we had learnt to make a 3D model of a tree.

For the first half of the lesson, I was still as confused as I was the last time I used Blender, but following the instructions provided by Annette helped a lot in leading me to the right tools. I already knew how to pull and readjust an object and its individual polygons, but the "Modifiers" were uncharted waters for me, so I had some difficulty trying to understand their functions, as well as which ones were appropriate for the moment. Eventually, I understood how to use them; you can see the specific Modifier settings I applied to my cube (which I had stretched and subdivided beforehand so the curving on the object could be smoother).

I'm very excited to see how I can utilise these tools for my personal projects in the near future.






When using modifiers and modifying polygons, another vital piece of information I learned was "Topology", and the different kinds of polygons I can use to make a model.

What is "Topology"?


Topology is a term for the way faces are linked together across a model. It can be used to lower and simplify the polygon count as a way to account for rendering. Good topology is when the model's polygons are consistent enough in flow and size for smooth, bending arcs, while bad topology is when the model deforms at an arc because the polygons connecting to each other aren't consistent in their form and size.

The different kinds of polygons used when making a model are really important to consider as well. There are three: the tri, the quad, and the n-gon.

[From left to right; Tri-polygon, !uad-polygon, and N-polygon]

Tris have only three vertices, quads have four, and n-gons have a vertex count higher than quads. When a model is triangulated for rendering, any quads and n-gons are automatically split into tris so the rendering of the model can be as smooth as possible. Because of this, topology becomes an extremely valuable mindset to be aware of, as having good topology benefits the model when it comes time to retopologise it.
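A handy rule of thumb here is that an n-sided face splits into n - 2 triangles, which is why quads only add two tris each while big n-gons balloon the triangle count. A small sketch of that arithmetic (my own illustration, not Blender code):

```python
def tris_after_triangulation(face_vertex_counts):
    """Each n-sided face splits into n - 2 triangles (fan triangulation)."""
    return sum(n - 2 for n in face_vertex_counts)

# A tri stays one tri, a quad becomes two, a six-sided n-gon becomes four.
print(tris_after_triangulation([3]))        # 1
print(tris_after_triangulation([4]))        # 2
print(tris_after_triangulation([3, 4, 6]))  # 7
```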

[Starting with a single quad face]

[I used the "subdivision" tool to split the face into multiple quads, and the "poly build" tool to add extra quads at a curved angle]

[Me applying the "loop cut" tool to split the surface into more polygons]

[Me applying the "bevel" tool in an attempt to smooth the arcs between polygons]

[My outcome with the subdivision modifier applied.]

The outcome isn't the one that I wanted, but it did help me understand how to properly use the "bevel" tool when it comes to optimizing the surface and polygons of my model.

When it came to making the tree, I used the vanilla cylinder and cone objects, but in my attempt to make it look more natural, I used the poly-build tool to make the polygons extend and clip into each other. I feel this helped give the tree more visual depth; I wanted it to look closer to a real tree rather than make do with the simple shapes.


[Assembling the base of the tree model for reference]
 
[I begin deforming a cone mesh so it resembles a collection of leaves on top of a tree]

[My finished tree model]

[I then start applying the colour onto the model]

This lesson really helped me get a better grasp of using the poly-build tool and the UI. I still have a lot to learn, but I am very excited to progress through these classes and grow more experienced with the software.



Anatomical Drawing Practice.

11/09/2022


For this lesson, we were tasked with creating a reference sheet for the 3D human models we are going to create. We were allowed to choose between a male or female body; I chose to base my anatomical sketch on the male body because I have more experience drawing that specific build. While I would have no issue trying a body type I haven't drawn before, I wanted to stick with what I know for the sake of my first 3D model looking as decent as possible.



While I'm not 100% accurate when it comes to proportions and detail, using the head size as a unit of measurement for the rest of the body was extremely helpful in drawing a body accurate enough to base my model on. The scaling of the head against the body was definitely the hardest aspect of the model sheet.
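The head-unit trick really is just arithmetic: pick a head height, then place each landmark a fixed number of head-lengths down the figure. The numbers below are generic illustrative values (a common idealised 7.5-heads-tall adult), not my exact sheet:

```python
# Illustrative head-unit proportion calculator (example values only,
# not the measurements from my own model sheet).
HEAD_HEIGHT_PX = 40          # chosen head size in pixels
HEADS_TALL = 7.5             # a common idealised adult proportion

landmarks = {                # distance from the top of the head, in head units
    "chin": 1.0,
    "navel": 3.0,
    "crotch": 4.0,
    "knees": 5.5,
    "soles": 7.5,
}

for name, heads in landmarks.items():
    print(f"{name}: {heads * HEAD_HEIGHT_PX:.0f}px from the top")

print(f"total height: {HEADS_TALL * HEAD_HEIGHT_PX:.0f}px")  # 300px
```

Scaling the whole drawing is then just a matter of changing the one head-height constant.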

Much like my personal art, I used shapes like squares and triangles to help me define parts of the body more accurately; a triangle for the pelvis, circles for the joints and squares for the torso.

[My own model sheet. Look at the clown]

I am extremely happy with how this chart came out. I was very nervous about drawing it because of the accuracy required, but reading through the rules on how to draw this type of build helped a lot in establishing a base.

Overall, I had a lot of fun drawing my own model ref sheet. Any digital drawing in Games and Animation is a great pleasure.


Game Design: From 2D to 3D (Homework).

13/09/2022


The homework I was assigned was to study and explain the key differences between a 2D game and a 3D one, and I feel the best way to demonstrate how much adding an additional axis can change a game is to compare two games from the same franchise, with the exact same game design philosophy.

My choices for this study are Sonic the Hedgehog, a 2D platformer released in 1991, and Sonic Adventure, a 3D platformer released in 1998.


In the Sonic the Hedgehog series, you're supposed to use your speed, momentum, and reflexes to navigate through obstacles, enemies and environmental hazards. You can move left and right, jump up, and defeat enemies by landing on them after a successful jump. Sonic the Hedgehog 2 and Sonic the Hedgehog 3 would give Sonic the Spindash, a technique that allows Sonic to curl into a ball and hold himself in place to build up a boost of speed.


The game greatly urges the player to make use of the level terrain whenever they can, because while Sonic is moving, he can curl into a ball that defeats any enemies he rolls into and, when rolling down slopes, increases his max speed, sending him flying upward if there's an upward slope ahead.

[Notice how much faster Sonic goes when he's curled up in a ball and riding down the slopes]
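The slope mechanic can be reduced to toy physics: while rolling, the component of gravity that points along the slope keeps adding to Sonic's speed each frame. The constants below are made up for illustration and are nothing like Sonic's real tuning values:

```python
import math

# Toy slope-momentum model (made-up constants, not Sonic's actual physics).
GRAVITY = 0.3          # acceleration per frame on a vertical drop

def roll_down_slope(speed: float, slope_degrees: float, frames: int) -> float:
    """Each frame, add the slope-aligned component of gravity to the speed."""
    accel = GRAVITY * math.sin(math.radians(slope_degrees))
    for _ in range(frames):
        speed += accel
    return speed

print(roll_down_slope(0.0, 0, 60))    # flat ground: no gain -> 0.0
print(roll_down_slope(0.0, 45, 60))   # a second of rolling down a 45 degree slope
```

Steeper slope, bigger sine, faster build-up, which is exactly the behaviour the level design rewards.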

Sometimes the level design doesn't accommodate Sonic's speed, but because the game encourages the player to make creative use of the environments to gain speed, the player can find ways to quickly navigate through what could be a slow pace-breaker in the level. Experimentation is key in Sonic the Hedgehog. Replayability is also a core aspect of the game, because the levels can split into branching paths that lead to extra rings and lives, or act as shortcuts to make progressing through the level faster.

[The green block is supposed to carry Sonic from one end of the room to the other, but if you're fast and careful enough, you can make Sonic jump on the platforms he was supposed to avoid, and make it to the end of the room quicker than if you had gone the way the game intended on your first playthrough]

For the most part, a lot of these iconic gameplay mechanics and elements are translated thoroughly well in the first fully 3D mainline game: Sonic Adventure.


Because of the additional axis, a lot of Sonic the Hedgehog's core gameplay elements have more complexity in how they're designed. Sonic can still jump, run, and perform the Spindash like in the original 2D games, except now he has so much more open space to run in and explore, which allows the levels to be bigger and denser in how they're designed and played.

In the 2D Sonic games, Sonic could run up walls but lose momentum over time. Sonic can still run up walls in Sonic Adventure, but because of that additional axis, he can now run along them instead, maintaining his momentum.

[Sonic running up a wall in Sonic the Hedgehog. He can only run up and down]

[Sonic running up a wall in Sonic Adventure. He can run alongside a wall instead of directly upwards, allowing him to maintain his momentum]

Sonic's method of attacking is one of the only major changes made to his gameplay. In the 2D games, defeating enemies never took too much effort because you only had four directions to work with, whereas in Sonic Adventure, the player automatically locks onto nearby enemies if they press the jump button while Sonic is in mid-air. This was done to circumvent the difficulty of precisely aiming Sonic on top of an enemy in a 3D space, and to keep the flow of the levels consistent.
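The lock-on can be thought of as a nearest-target search within a radius around the airborne player. A minimal sketch of the idea, with hypothetical names and values (this is my own illustration, not Sonic Adventure's actual code):

```python
import math

# Hypothetical lock-on target picker (illustrative, not Sonic Adventure's code).
def pick_target(player, enemies, max_range=10.0):
    """Return the nearest enemy within max_range of the player, else None."""
    best, best_dist = None, max_range
    for enemy in enemies:
        dist = math.dist(player, enemy)      # 3D Euclidean distance
        if dist <= best_dist:
            best, best_dist = enemy, dist
    return best

player = (0.0, 5.0, 0.0)
enemies = [(3.0, 5.0, 0.0), (1.0, 5.0, 1.0), (40.0, 0.0, 0.0)]
print(pick_target(player, enemies))   # nearest in range: (1.0, 5.0, 1.0)
```

Resolving the target automatically is what keeps precise aiming off the player's plate in 3D.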

Another aspect of 2D Sonic present in Sonic Adventure is the branching pathways and shortcuts in levels. Because the levels are so much more expansive than the compact 2D levels, the branching paths in Sonic Adventure can go in all kinds of different directions, and the 3D space lets them pose more of a challenge for players who want to reach them.

[Branching pathways in Sonic the Hedgehog]

[Branching pathways in Sonic Adventure]

While the difference in gameplay feel is major, Sonic Adventure should still feel familiar to players of the original Sonic the Hedgehog, because gameplay elements and mechanics like speed, replay value, momentum, and level design motifs from the 2D game were greatly focused on and tweaked so the character could feel familiar in a completely new game environment.



Live Debate Evaluation.

15/09/2022




We finished our live debate, and even if my script was rushed and unfinished as a result of mistaking the date the discussion would be held, I am happy to say it was very successful in terms of presentation, counter-arguments, and civil discussion.

The purpose of this live debate was to see how well we could compose primary and secondary research for why we think the side we're fighting for is valid, and I think our side did a great job of composing our thoughts and reasons for why games don't represent the demographics we stand for. Oscar talked about representing sexuality, Akbarul discussed gender issues, and I talked about religious representation. I think Oscar and Akbarul did an amazing job in terms of research and conclusive reasoning, both areas I had unfortunately skimped on. I did talk about the games that represent religion in ways I feel are appropriate, but I should have referenced more voices than my own and Akbarul's (whose opinion I asked for), even if I was short on time.

Below are screenshots from a Google Doc containing my initial brainstorming for ideas, along with inner monologues, sources of my research, and finished notes and paragraphs.






I was initially going to focus on the representation of specific sexualities, but everyone in my group had much the same idea, so I chose religion out of a need for creativity, and genuine interest: why isn't religion a more popular topic for games? Even if the conclusion I came to seemed obvious, and I didn't provide all of the proof I wanted in the short time frame, I'm still very happy with the outcome and the debate we walked away with.

Below is the script I typed in preparation for the debate. I had to ad-lib over the extremely short and unfinished paragraphs.




Making a 3D Model with Blender 3.2.0.

15/09/2022


Using my anatomical practice drawings as a reference, I have created my first full 3D character model. Considering it's my first time creating a character in 3D asset creation software, I am very happy with the results.

At first, I assumed we would start with a sphere mesh for the head, but to my surprise, we started with a default cube and manipulated it with our tools and modifiers until it became a spherical enough shape. It was very interesting to see it turn from one shape into another by my own hands, and I'm very excited about what else I can do to shapes with this knowledge.


I started with the head, and with the use of the loop cut and extrude tools, I made my way down to the feet, with the arms and hands being the last parts of the model I worked on. I am not at all happy with the results of the hands, but I was running out of time and having trouble creating the hands and individual thumbs, so I had to adopt the "it's good enough" mindset.












[I was really tempted to go off track with my model to make it unique from the models my classmates were making] 







[The aforementioned hands that I had skimped out on modelling]




Seaming and Texture Painting.

22/09/2022

From September 20th to the 22nd, I've been adjusting to the process of seaming and texturing my model. I was confused at first (as I have been throughout my time using Blender), but this lesson really helped me get a grasp of the very basics of texture placement and texture design.

Seaming allows me to blueprint the location of my textures on the model, so the seaming process was more or less planning the texture placement.



Once the seaming was done and the "island" layout for the textures had been assembled, I got to colouring the textures using the paint window, but not without light experimentation. It was at this point that I switched from using the mouse to using the drawing tablet and pen.

[Light scribbles to test the pen pressure, contrasting size between the texture islands and model, and how the colours look on the model]

After colouring the head and the torso, I found it really difficult to tell each specific island apart from where it's located on the model's surface, so to circumvent this, I covered the texture islands in saturated, contrasting colours so I could easily identify each body part.

[Island colours with varying hues and saturation, along with an unfinished texture for the torso]

Since my model's proportions weren't perfect, I figured I would make a character out of my model: a skeleton in suspenders. I feel this distracts people (and myself) from the model's imperfections, and gives it a goofier look. I've also chosen not to use any of the stock textures given to us for the time being, as I wanted to use this opportunity to practise my colouring and line art on a 3D character.


Overall, I am very excited about the further development of my model and my Blender knowledge, and I'm already planning on incorporating 3D models/visuals into my eventual FMP. I hope I can complete my skeleton character alongside the basic steps of Blender.


Hair and Particles in Blender.

26/09/2022


Today I went over the basics of the particle system in Blender, and used it to give my character hair. I gave my character asset a mohawk hairstyle and experimented with the hair particles by using the particle-exclusive tools to change its height value and shape, as seen in the screenshots below.

[The hair particles as soon as you add them]


[The finished hairstyle]

The hair particle is one of those modifiers I'm unsure of how or when I'll use again, but I still appreciate learning about the tools Blender has to offer.



Particles in Blender.

27/09/2022

This lesson, I got to practise operating Blender's particle effects. I found the method of creating particles very interesting: we first spawn a default square face, then add a particle system and assign whatever asset we have in our collection as the particle object; I chose the cube. It gave the particles a cool, abstract visual that I really like.

Being able to change the settings for how fast the particles fall, how many are being created and how big they can be really opened my mind to how I can take advantage of this system.
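Under the hood, those settings (emission rate, lifetime, size, fall speed) all feed one loop that spawns, moves, and kills particles every frame. A tiny illustrative sketch of that idea (nothing like Blender's real implementation, which is far more elaborate):

```python
import random

# Minimal particle-system sketch (illustrative; Blender's is far more elaborate).
random.seed(1)

class Particle:
    def __init__(self):
        self.pos = [0.0, 0.0, 0.0]
        self.vel = [random.uniform(-1, 1), random.uniform(-1, 1), random.uniform(2, 4)]
        self.life = 30                      # frames until the particle dies

GRAVITY = -0.2
particles = []

for frame in range(60):
    particles.append(Particle())            # emission rate: one new particle per frame
    for p in particles:
        p.vel[2] += GRAVITY                 # gravity pulls each particle down
        p.pos = [a + b for a, b in zip(p.pos, p.vel)]
        p.life -= 1
    particles = [p for p in particles if p.life > 0]

print(len(particles))   # the population settles at just under the 30-frame lifetime
```

Changing the emission rate, lifetime, or gravity constant here is the same kind of knob-turning the particle settings panel exposes.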


After the cube particles, I wanted to make a particle generator that looked like an explosion of shapes coming from a tear in the wall. It took me a while to find the right settings for how the particles react (notice the objects rotating and being launched at different heights and in different directions).


I consider my experience with the particle system another vital step in learning the ropes of Blender's tools and modifiers, as well as how to apply them in appropriate contexts.

I hope to use the particle tool in any future character models I make; it would make a good addition to any atmospheric scene.



Rigging in Blender.

29/09/2022


For this lesson, I finally moved on to rigging my "finished" model with bones so we can get to posing and animation. Instead of connecting bones together from scratch (that will be in a different lesson), we installed "Rigify", an add-on for Blender that gives us a bundle of preset bone armatures to choose from.

In the image below, I attempted to merge the advanced bone armature with my model, unaware of the basic bone armature I could have used instead. The difference between basic and advanced is the bones dedicated to individual fingers, facial muscles, and toes.

[The advanced bone rig]

The images below display the basic bone armature I used to rig my model. It was a challenge to make sure the bones all aligned properly, especially with how my model wasn't a 1:1 accurate human. However, looking at this now that I've finished the rigging process, I wonder if using an image of a human skeleton would have made for a great reference.


Below, you'll see the result of my rigged model. I adore how it came out, even with the difficulty I still have navigating the menus; before generating the rig for my model, I was unable to rotate some bones because of a setting I had accidentally enabled, and I can't remember what that setting was. Hopefully, my issue of accidentally triggering shortcut inputs will resolve itself with practice.


Overall, I am very happy with the outcome of my first rig, even with some awkwardness in the proportions; it is my first human model, after all. In regards to animation, I could manipulate the bones into unnatural poses to, ironically, make the body look more natural, since the bones don't align with the body in an accurate way.



Legal and Ethical Constraints in Games Presentation.

29/09/2022



To demonstrate my understanding of game regulations and ethics, I was tasked with creating a presentation featuring my own explanations of four individual game laws established by the government, two regulating bodies that enforce said laws, what constitutes an ethical regulation, and an example of a video game or company not adhering to the law.

I struggled a lot with this particular presentation, because making it and explaining the laws and regulations that come with the games industry meant I had to be confident in my knowledge of video game industry law, and at the time I was not. I spent a lot of valuable time on unnecessary research into regulating bodies and fact-checking the regulations I was focusing on.

While I am still content with the knowledge and research that came with making the presentation, I shouldn't have dedicated so much time to fact-checking and researching laws I probably understood well already. I could have dedicated that time to finishing a script and delivering the controversy surrounding Sensory Sweep Studios in a better fashion. However, I am happy with the points and examples used in the presentation, specifically the examples used to describe patents and I.P. licensing.

Below is the presentation I assembled:


Compared to most of my previous presentations, I feel the feedback I received was more mixed than positive; the structure and pacing of my presentation were stilted and shaky, thanks to my unwillingness to make a script at any point of the presentation's production. I slurred my words at multiple points, and at the very end, my teacher pointed out an unfinished textbox that I had not noticed before starting the presentation (whoopsie).

I learned a lot from this particular presentation, but the one lesson I'm most thankful for is to be confident in my research and beliefs. If I had gone along with what I already knew and spent less time researching multiple different sources on the same laws, my outcome would have been a lot more polished and up to the standards I expect from myself.




Manual Bone Rigging.

03/10/2022


After successfully rigging my character model, I was given a choice: either animate and pose my already-rigged model, or create a new model, this time with the intention of making it inhuman, building the armature from individual bones instead of using a preset armature provided by Rigify. I chose the latter option because I wanted to better my understanding of how the rigging process is structured.

To make the task as easy as I could for myself, I decided to use the squid from the Splatoon series of games as a source of reference for my model, because of its simplicity, as well as the rubbery, stretchy movements you see from it in-game.


I started the model creation process as I did before; I started with a cube, split it in half, and used the mirror and subdivision modifiers until I got a starting shape I was happy with. It was at this point I realised I could have started with a different mesh so I could experiment with my approach more, but going with what I knew helped me make the model relatively quickly.

[Starting with the head]

[The rest of the body is finished]

[The colouring phase]

[Hey presto, it's finished!]

One of the major hurdles I came across when modelling my squid was the extra vertices and faces that would overlap each other and, when deleted, would take a chunk of the model with them. My teacher gave me a solution, however: merge two separate vertices together so they're grouped as one vertex. That piece of advice saved me, and I can imagine it being extremely valuable when I model more characters later down the line.
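Conceptually, merging doubled-up vertices is just deduplicating points that sit at (almost) the same position. A sketch of the idea, not Blender's actual "Merge by Distance" code:

```python
# Sketch of merge-by-distance: collapse vertices closer than a threshold
# into one (illustrative; not Blender's actual implementation).
def merge_by_distance(vertices, threshold=0.001):
    merged = []
    for v in vertices:
        # Keep v only if it's farther than the threshold (per axis)
        # from every vertex we've already kept.
        if all(max(abs(a - b) for a, b in zip(v, m)) > threshold for m in merged):
            merged.append(v)
    return merged

verts = [(0, 0, 0), (0.0005, 0, 0), (1, 0, 0)]   # first two are doubles
print(merge_by_distance(verts))                   # [(0, 0, 0), (1, 0, 0)]
```

With the doubles collapsed, deleting a vertex no longer rips out a hidden duplicate face along with it.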


I applied automatic weight painting so the polygons would move with the bones, and applied the squid texture as an image over my character model. After that, I got to posing the character and experimenting with how far I could deform the limbs. In the photos below are the poses I've made; you'll also notice how I've experimented with the lighting by moving it in different directions.

(My squid model scanning the flat, tile-floored horizons of Blender)



I really like how my poses came out, although I do wish I had added more bones to the "helmet" of the squid so I could make the curving more expressive. Still, I think I did well with what I had.

I experimented with lighting more in the images below; I used different colours for a "dramatic" lighting effect. I placed the light objects at different angles and changed the power of the light itself to get the results I wanted.

(Blue light reflecting on top of my model)

(Red + blue lights on both sides of my model)


I'm very proud of how my model came out, and while I do wish I had added some extra bones and linked the shoulder bones to the correct place, I'm happy to still be learning from these mistakes and applying what I've learned from them to each new model I make.

I've been making notes of keyboard shortcuts and pathways to certain tools and options as a way of making my navigation through Blender a lot easier. Rewriting instructions in my own words has been helping a lot too, since it helps me understand the steps a lot better.


Next, I'll be moving on to animating my model, which I'm a bit nervous about because of my very amateurishly designed rig, but nevertheless, I am excited to see how I can execute it.




Animating My Unconventional Model.

10/10/2022


After giving my model bones, I went onto the animation layout in Blender to begin my first Blender walk cycle. I was very nervous at first because my only exposure to 3D animation and model posing came from Source Filmmaker (a free animation and 3D render software created by Valve), but the transition has already been incredibly comfortable! Blender's "auto-keyframing" feature helped tremendously in animating the walk cycle too.

I had already done multiple walk cycles before, so I didn't need to use a reference for animating one, especially because of how unconventional my 3D model is. I was originally going to animate at 6 frames a second, because animating at a low framerate has helped me a lot in previous projects, but Blender automatically interpolates my animation at 24 frames per second, so instead, I posed the character every 6 frames.
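Posing every 6 frames and letting Blender fill the gaps is interpolation between keyframes. A simplified linear version of the idea can be sketched like this (Blender's default is actually Bezier easing, and the values here are made up):

```python
# Simplified linear keyframe interpolation (Blender defaults to Bezier easing;
# the keyframe values here are made up for illustration).
keyframes = {0: 0.0, 6: 10.0, 12: 0.0}     # frame -> bone rotation, posed every 6 frames

def sample(frame: int) -> float:
    """Linearly interpolate the value at any frame between surrounding keys."""
    frames = sorted(keyframes)
    for lo, hi in zip(frames, frames[1:]):
        if lo <= frame <= hi:
            t = (frame - lo) / (hi - lo)
            return keyframes[lo] * (1 - t) + keyframes[hi] * t
    return keyframes[frames[-1]]

# At 24 fps, frames like 3 and 9 are in-betweens I never posed by hand.
print(sample(3))   # halfway between keys 0 and 6 -> 5.0
print(sample(9))   # halfway back down -> 5.0
```

So even though I only posed every 6th frame, the playback still moves smoothly at 24 fps.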




Below is my first Blender animation outcome; because my model is based on a squid, I wanted to emphasise bounce and floppiness through the animation, and I think I executed it fairly well, if I'm using positive feedback from classmates as a source of judgement.

The most difficult part of animating my squid was probably figuring out how the knees would arch and bend in the run; the model's very unconventional design made it harder to look natural in that regard. I consistently used my own legs as a reference for how knees feel and look when bending (a method of referencing I use a lot for my personal art and college projects).

[Bones visible in the walk-cycle]



[The finished run-cycle]

The motion is admittedly slow for a run cycle, but since it's my first Blender animation, I still think it came out really well.

For my idle animation, I wanted it to be very active and lively, instead of the usual "stand and breathe" motion that a lot of idle animations are. This proved to be very difficult for me, however, since the animation I had planned required bending the bone joints as naturally as I could on a model that would constantly deform with each limb arch.

But even with the deformation and the unnatural pace between the jumping and "knee" bending, I'm still learning the kinks of Blender while working with my model's severe limitations, so I think I did a fairly good job.


Because I'm planning for the character to always be on the move, I wanted the jumping animation to reflect that by spinning in a corkscrew motion whenever they jumped. I used an animation from Splatoon as a source of reference for this particular animation.

I wanted to change the positioning of the limbs between keyframes as well, but that took too much of my time, so I had to rotate the model in a counter-clockwise motion instead.

[My finished jumping animation]

Next week, I should be porting my model into Unity! I am very nervous because it's technically a return to coding in a game engine, but I'll cross those hurdles when I get to them.




Rendering Our Own Film in Blender.

17/10/2022


To start making my first rendered animation in Blender, we were given a folder of pre-made assets (a character model, and image textures) to use, and our animation needed to contain a house with windows and a door, a moving camera, lighting effects, and moving characters. 

I started by spawning a simple plane mesh and applying the grass texture image onto it as a material. Once I set up the origin point, I got to work on the house that the vampire would reside in. I wanted it to look more cartoony than normal, so I exaggerated the edges to point in different directions to lean into that style.

One thing I messed up on, however, were the windows; I didn't realise I was supposed to loop cut windows into the house. What I did instead was create new planes, colour them yellow, and make them glow to give the illusion of windows. I really like how it looks, but it was still a mistake, and I was too far into production with the house to make a change that major, so I continued working.


Below are tests I did for the lighting I added into the scene. You'll see the mansion casting a shadow over the grass; that's because I added a "sun" light and coloured it a soft cyan to simulate moonlight, so the mansion could look even more imposing. I feel this direction was a good enough compromise for the lack of real windows, even if it wasn't part of my lesson objectives.


(Rendered screenshots to show how light reflects off of the scene's assets)



Adding lighting objects proved to be a lot more difficult than I thought; for some reason, the spotlight object's power value needed to be in the hundred-thousands for its light to be visible. I used the spotlight so that when the door to the house opened, the light would shine out through the doorway.
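In hindsight, the huge numbers probably aren't a bug: Blender's renderers treat light strength physically (spot power is measured in watts, and brightness falls off with the square of distance), so a light that works up close needs vastly more power from far away. A rough sketch of the inverse-square rule, with made-up numbers:

```python
def required_power(base_power_w, base_dist_m, new_dist_m):
    """Power needed to keep the same perceived surface brightness when a
    light moves further away, assuming inverse-square falloff."""
    return base_power_w * (new_dist_m / base_dist_m) ** 2

# A spot that reads well at 1 m with 100 W needs 100x that at 10 m...
print(required_power(100, 1, 10))   # 10000.0
# ...and climbs into the hundred-thousands a few tens of metres away.
print(required_power(100, 1, 35))   # 122500.0
```

So a spotlight lighting a whole doorway across a scene genuinely can need six-figure wattage values.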


Overall, I think I did a really good job on the asset and character creation aspect of Blender, and I've grown comfortable with the user interface.




Rendered Blender Film.

31/10/2022

This is the vampire animation I've been working on. I used ScreenToGif (a free screen-recording program) to record my rendered outcome because of Blender's unique exporting process, which isn't really friendly to a newbie like me. When rendering an animation, Blender renders each frame as a PNG image, which I would then have to import into a video editing program to create a viewable video. I didn't know how to do this, however, and I was very short on time as we had to move on to other class subjects, so I just screen-recorded the rendered animation and converted it into a GIF.
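For future reference, one common way to stitch numbered PNG frames into a video is ffmpeg (Blender's built-in Video Sequencer can also do it). This is a sketch of how the command would be assembled; the folder and file names are hypothetical:

```python
def ffmpeg_cmd(frame_pattern, fps, out_file):
    """Build an ffmpeg command that stitches numbered PNG frames into an MP4."""
    return [
        "ffmpeg",
        "-framerate", str(fps),  # input frame rate, matching the render (24)
        "-i", frame_pattern,     # e.g. render/0001.png, render/0002.png, ...
        "-c:v", "libx264",       # widely supported H.264 video
        "-pix_fmt", "yuv420p",   # pixel format most players accept
        out_file,
    ]

cmd = ffmpeg_cmd("render/%04d.png", 24, "vampire.mp4")
print(" ".join(cmd))
```

Running the printed command in a terminal (assuming ffmpeg is installed) would produce a playable video without needing a separate editing program.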

I was originally intending to have my Squid armature walk up to the door and knock on it, then the vampire would open the door, confused to see no one, only to look down and notice the Squid seconds after, but I was once again held back by time, so I limited the interaction to a creaking door, followed by the squid instantly grabbing the door hinge and slamming it shut.

I think it adds a small flair of comedy considering how ominous the animation started, and the camera weirdly panning away from the house, in the style of the endings a lot of Christmas films have, is the cherry on top.
 

Admittedly, I had spent an inordinate amount of time on the visuals and lighting of the environment rather than having the characters interact with most of the assets in the animation. It is supposed to be about a character opening a door, entering their home, and closing said door, but I ended up animating less of that. 

I'm very happy with how my first fully rendered Blender animation came out. I wish I hadn't skimped on the windows and adding sound, but I think it's a good reflection of my skills as an artist regardless.



Localisation and Censorship Report.

31/10/2022

To demonstrate our knowledge on the subjects of censorship and localisation in the media industry, we were tasked to pick two pieces of media; one affected by localisation, and another affected by censorship, and we would dissect each of them with what changes were made, why they were changed, and what the overall public reaction was.

I had to skip a lot of the changes made to each foreign broadcast of Steven Universe because of how many there were, so I picked the ones I recognised from watching the series, and that I felt were the easiest to explain and understand. I also wanted to go into more detail about Splatoon 2's history of changed dialogue and character interactions, but lack of time was once again an issue, so I focused only on the Splatfests and exclusive gear.


The most difficult aspect of writing this report was utilising my own primary research to back my points. Aside from comparing the UK and US broadcasts of Steven Universe using different streaming platforms, there isn't much primary research involved beyond me watching Steven Universe and playing Splatoon for myself, and that's an aspect of my reports I'm hoping to change. I have my own personal experiences of encountering these localisations and censorships, but I didn't have to do any primary research because a lot of the information was known to me ahead of time, or was there on the internet for me to find.

I still thoroughly enjoyed writing this, though, and I think injecting my personal writing style and opinions into these kinds of reports helps elevate them to an even higher level of quality. Next time I write a report, I'm hoping to use my own independent research alongside my secondary research.



Unity Engine Testing and Scripting.

04/11/2022

Moving on from Blender, we finally got our hands on Unity, and our first task was to create a game session with a working user interface, click controls, and spawning assets. Even though our introduction to Blender is finished, I'm hoping to keep experimenting with it so I can improve at 3D animation and character modelling.




After moving from Blender and following the on-paper instructions our teacher gave us (as with the previous lessons), getting a grasp on Unity's UI wasn't as difficult as I expected, so creating assets and materials felt familiar. Where I struggled a LOT with Unity, however, was the general process of game coding via scripting.



It took me far too long to understand how its scripting language (C#) is formatted. I eventually figured out how lines should be separated, and how blocks of code are grouped with curly brackets, but it took me upwards of an hour to properly and manually type out the code from the instructions.


When it came to adding meshes and organising assets, that's where I excelled with the program, and the areas in which I struggled were made much easier to improve thanks to the help of my classmate Charlie, who has experience with the Unity game engine. Thanks to him, I was able to understand how the player HUD can be edited, and was made aware of "hard code".


It may look like I wasn't taking the course seriously, but it was simply my way of coping with the steep difficulty curve Unity has given me. I was under a lot of distress because of the new user-interface I needed to adjust to, along with the process of coding, which I had very little experience in, and that was in a 2D engine.

(Experimenting with the HUD texture images and the game text)

Understanding the Unity game engine is very tough for me at the moment, but as long as I play along with the coding format and understand it, I believe my time with Unity will be much easier and faster compared to my first session with it.



Controversies in the Games Industry.





To demonstrate our understanding of ethics and game industry laws, we were tasked with creating a presentation showcasing three different controversies, breaking them down into how they happened and how impactful they were to the developers and the public, while giving our own opinion on the cases and the people behind them.

This is my final outcome of that presentation. The three cases chosen relate to topics I care about: Bayonetta 3 and how Hellena Taylor's behaviour could sour a lot of people's views on voice actors, Valve's greedy handling of community-made content, and the Epic Games v. Apple case pulling the curtain back on how shady major companies like them are.


The feedback I received was very positive, with the negative criticism directed at how I didn't really go into the aftermath of the Bayonetta 3 situation, and how I explained the Epic Games v. Apple case. I could have talked about how Hellena Taylor's dishonesty could make it even more difficult for voice actors to secure decent payment standards, as voice actors are truthfully underpaid. The Epic Games v. Apple cases were very long and eventually extended beyond just Apple's and Epic Games' business practices, and I should have had a real script by my side to keep me on track with what I should be talking about.

My most positive feedback came from how emotive and conversational I was; hearing that I was talking with an audience instead of at them was very relieving, since that's what I was going for. I don't consider this my best presentation because I was admittedly slightly underprepared, but all things considered, it still went relatively well.



Unity Engine Practice: Pusherman


To further expand our knowledge and adapt to how the Unity engine operates, we were tasked to create a small game using pre-made scripts of code and assets. The game would revolve around controlling a character in a 3D space to push a ball through a maze to its destination. I may have gotten heavily sidetracked into making my own type of "game", but I think doing so benefitted me in the long run.


To start, we added a "terrain" mesh that could be moulded and changed to our liking, and it's here I made my first mistake: not accounting for the scale of the player character against the game world. Terrain I had designed as bite-sized platforming challenges turned out to be giant hills. Why, at this point in the class, I would design a world without taking the actual size of the player character into account is beyond me.




The trees were something I struggled with at first; we were supposed to import the stock tree assets from the resources folder, but I didn't understand how to import assets into my scene at the time, so I resorted to attempting to create my own tree. I ended up learning how to import assets into my project (just drag and drop the 3D assets into the left-hand sidebar and hey presto), but I still found the experience of creating my own tree to be really cool, even if it came out in an unnatural, dead, white hue. So, I ended up using it to develop my world.


I'm embarrassed to admit that I spent more time detailing and creating the game world than coding in any actual game elements, which did result in a slow coding process, but I'm not too beat up about it; I'm still learning the ins and outs of the engine.

It was at this point I was smoothing out jagged mountains and bumpy hills with the intention of allowing the player to run as freely as possible.


After working on my world for long enough, I eventually imported our third-person character preset into the game world, copy-pasted the controller script, positioned the camera, and hit play to test the controls.


Having a game, regardless of how early in development or amateurish it is, still makes for an exciting experience. It got even better once I changed the character's running speed, as well as its jump height, mass, and turning speed.

My game was shaping up in a way I really enjoyed. The only issue was that once I copy-pasted the camera script from our resources folder, it did work, but the camera was stuck at a fixed, stationary angle as it followed the player. To get around this, I searched for a free-to-use controllable camera script, and after copy-pasting the code into a new script and linking it with the player object, I eventually got the camera to face the direction the player character was looking in.

It was very confusing at first because I'm still getting to grips with the coding process and the language it uses; I got the code working through sheer guesswork, after all. Even with that in mind, I somewhat understood it afterwards.
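Conceptually, the "camera faces where the player looks" behaviour comes down to keeping the camera at a fixed offset rotated by the player's yaw. Unity scripts are written in C#, but the underlying maths is language-agnostic, so here's a minimal Python sketch of the idea (the 5-unit follow distance is an arbitrary choice of mine, and height is ignored for brevity):

```python
import math

def follow_position(player_xz, player_yaw_deg, distance=5.0):
    """Place the camera `distance` units directly behind the player,
    matching their yaw. Positions are (x, z) pairs on the ground plane."""
    yaw = math.radians(player_yaw_deg)
    # The player's forward direction on the ground plane...
    forward = (math.sin(yaw), math.cos(yaw))
    # ...and the camera sits the given distance behind it.
    return (player_xz[0] - forward[0] * distance,
            player_xz[1] - forward[1] * distance)

print(follow_position((0.0, 0.0), 0.0))  # (0.0, -5.0): behind a north-facing player
```

Running this update every frame (and pointing the camera at the player) gives the follow behaviour; a stationary camera is just this with the yaw never updated.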


(I changed the colour of the world light to a soft orange to give my world a warm, sunset colour and feel.)


While working on my Unity game project, we were instructed to use a new player script separate from the one that came pre-packaged with the player model, and it gave me a chance to try implementing a movable camera into the game script. Since I had trouble trying to script my own camera, I researched a free-to-use alternative and found a script I could freely implement into my project.


I had also added a new area to my world terrain to see how well my character could walk through tight spaces, but the collision mixed with the movement speed made it very difficult, to the point where even walking up hills was a struggle. I'm hoping there is another script I could follow or learn from to make the collision between characters and the world's terrain smoother.


I've also been importing some of my Blender assets into Unity, and while it gave me mixed results because of how differently the two programs handle rigging and lighting, it was still very exciting. I first imported my mansion model from Blender, which came out far bigger than I had anticipated because I hadn't scaled it correctly (oops), but scaling it down was easy to do.


I am currently attempting to port my Squid character model into Unity as a replacement for the third-person player model. It is very difficult to do, but with the right coding and script linking, I'm hoping it will be a smooth transition. One of the reasons it is so difficult is that I have to link certain animations I made with it through scripting and code trees, which given how much I had deviated from the coding aspect of Unity, proved to be a very difficult process for me.

Even with all of this uncertainty, I am still very much excited for the additional things I'll learn in Blender, even if it causes regular headaches. Since finishing the Blender lessons in college, I have dedicated some of my free time to learning more about Blender modelling, and it's something I will continue to do with the hope of pursuing 3D modelling, and hopefully making some more character models to put into Unity.




First-Person Shooter Project

24/11/2022


Continuing with our Unity development, we are now tasked with developing a first-person shooter project to expand the range of games we can make with the engine. We had to create an environment with a plane for the floor, four walls to prevent any players or enemies from walking off, and environmental obstacles and shapes for the player and enemy to interact and collide with.


I spent a long time thinking about where to add the objects. Since we were designing the level with a first-person character in mind, I wanted to treat it as a map within an FPS game, so I put a lot of thought into where I should place assets; maybe too much thought.

Because we were making a level designed for a first-person shooter, my immediate thought was to place a large "clock tower" near the centre of the level and add a "warehouse" with a staircase and a stack of boxes for the player to hide behind. I'm really happy with how the small building with the staircase came along; I wanted to give my hypothetical players a place to scout ahead for enemies, while leaving them vulnerable on the way up with how exposed they would be running up the stairs to the roof. I will be adding more assets to the roof eventually; maybe I could use something from the Asset Store to make the alleyway behind the warehouse more visually interesting.


I wanted to add terrain outside of the boxed level to give some visual depth to a level greatly lacking in it. I was inspired by the game maps from Team Fortress 2, where a lot of the background visuals outside the player bounds are mountainous and desolate. It also let me practise scaling objects to make them seem smaller than they actually are.

It didn't work as well as I wanted, however, but that's only because I hadn't put much planning into how I scaled the mountains, which look more like hills because they're placed closer to the player than they should be.


The enemy creation was the hardest aspect of this lesson, primarily because it required me to make animations for it in-engine and assign those animations to scripts that the enemy can follow. This part of the lesson threw me for a loop, and it took me around a week to somewhat understand how scripting non-player characters in this engine works.

For the enemy to move, I needed to assign it a "navigation mesh" component, which scans the entire scene for where the enemy can and can't move. I had to mark the stack of boxes and the clock tower as unwalkable, as the enemy couldn't reach those surfaces.
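The idea behind a navigation mesh, boiled down, is to precompute which surfaces an agent can actually reach and never path through the rest. Unity bakes this over real 3D geometry, but a toy grid version shows the principle; the level layout below is made up, with `#` standing in for unwalkable surfaces like the box stack and clock tower:

```python
from collections import deque

def reachable(grid, start):
    """All cells an agent can reach from `start` by flood fill;
    '#' marks unwalkable cells, '.' marks walkable ones."""
    rows, cols = len(grid), len(grid[0])
    seen, queue = {start}, deque([start])
    while queue:
        r, c = queue.popleft()
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols \
                    and grid[nr][nc] != "#" and (nr, nc) not in seen:
                seen.add((nr, nc))
                queue.append((nr, nc))
    return seen

level = ["..#",
         "..#",
         "..."]
print(len(reachable(level, (0, 0))))  # 7 -- the two '#' cells are unreachable
```

Marking a surface unwalkable in Unity is effectively the same as flagging those cells `#` here: the pathfinder simply never considers them.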

The enemy I had implemented wasn't perfect; its navigation mesh was standard and it could walk up slopes, but it could not walk up the stairs on the side of the warehouse (which probably has to do with how each individual step of the staircase has its own collision).


I eventually added the carbine and reticle to my HUD, and I was going to figure out how to implement the particle effects for the player weapon, but I ran into a problem where my movement would be drastically affected depending on which direction I looked in: I would shoot to the right if I looked diagonally to the left, move backwards if I looked to the right, and so on.

I know the carbine mesh has something to do with it, since this only started happening after I added it, but even after removing its collision I got the same results, and it was very frustrating to deal with.




Overall, this lesson proved to be the messiest learning experience I've had with Unity so far, and it made me concerned enough to sit on the fence about game development and making a game on my own, reconsidering whether it is really possible for me at this time. However, I know that is because I spent more time on certain aspects of this project than they needed, and if I had moved quickly from one aspect of the game to the next, I think it would have gone a lot smoother.


Vehicle Waypoint Directing


For our next Unity assignment, we were tasked with using a pre-made waypoint script to command an asset to move. I started by spawning my terrain, applying one of our stock texture images to it, and spawning in our pre-made tank asset.


Afterwards, I created an asset titled "waypoint" and gave it an extremely saturated, metallic material so it could be easily identified. I then applied the waypoint script to the red waypoint asset and duplicated it in an anti-clockwise order, since I assumed the tank would follow the waypoints from oldest spawned to newest, and I was eventually proved correct. The tank didn't move at first, which had me concerned, because this is a game engine I still barely know how to operate, but I had simply forgotten to add the second script we were given, which allowed the tank to follow the waypoints.
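I can't see inside the pre-made script, but the core of waypoint-following usually boils down to: head for the current waypoint, and once close enough, advance to the next one in spawn order, wrapping back to the start. A minimal Python sketch of that advance step (the route coordinates and threshold are my own made-up numbers):

```python
import math

def advance(pos, waypoints, index, threshold=0.5):
    """Return the index of the waypoint to chase next: the current one,
    or the following one (wrapping around) once we're within `threshold`."""
    wx, wy = waypoints[index]
    if math.hypot(wx - pos[0], wy - pos[1]) < threshold:
        return (index + 1) % len(waypoints)  # oldest-spawned first, looping
    return index

route = [(0, 0), (10, 0), (10, 10)]
print(advance((9.8, 0.1), route, 1))  # 2 -- close enough, move on
print(advance((5.0, 0.0), route, 1))  # 1 -- keep heading to waypoint 1
```

This also explains the oldest-to-newest ordering I observed: the script presumably just walks the waypoint list in the order the objects were registered.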





After understanding how the tank and waypoints function, my immediate next thought was to import an entire racetrack from Mario Kart into Unity and mark the track with waypoints so the tank could drive around it. I used models-resource.com to download a ZIP file of the Nintendo DS version of Mushroom Bridge, a racetrack originally from Mario Kart: Double Dash on the GameCube. I chose the DS version specifically because of its significantly lower polygon count, and I didn't want to risk anything going wrong by using the much higher-polygon original.



To make sure the tank moved around the track, I went through the entire track leaving waypoints underneath the road as breadcrumbs for the tank to follow, but for some reason, the tank would phase through the ground on an upward slope, or completely ignore the other waypoint markers. I experimented with the collision settings on the tank's mesh and applied standard physics to the tank asset, but I couldn't figure out why these issues were occurring.

Once I get back to using Unity, I'll have to take a look at how these waypoint scripts are formed so I can gain a better understanding of how to write my own code; I feel that will help me understand the process of scripting a lot more. My time with Unity is done for now, but I would like to go back and learn its ins and outs in my personal time. For the time being, I'm quite content with my minimal knowledge of the engine.



Unit 9 - Contextual Studies Evaluation

09/02/2023



In this unit, our primary goals were to understand the fundamentals of Unity, the game engine, and Blender, the 3D modelling and animation software. Alongside that, we also had to develop our understanding of the impact games have as a medium through reports, deconstructions of games, and a live debate held to speak in favour of a demographic we felt was underrepresented in game media.

With those unit goals, I had goals of my own. My primary goal was to properly understand two pieces of software that I feel are important in determining my career path: Unity and Blender. Blender would determine my path in 3D modelling and animation, while Unity would determine my path in game design and programming. I understand it is a generalisation, but these two programs would be my big introduction to the type of media I want to make, and now that I am finished with Unit 9, I can safely say that Blender and 3D animation are something I want to pursue further, but I left Unity undecided on whether I truly want to pursue programming and game creation.

With Blender, because it's used for animation alongside 3D modelling, I was able to apply the rules of traditional 2D animation, so the transition wasn't as difficult; it also proved a great test of my character design philosophies and skills as an artist when I got to translate a character from 2D to 3D. The modelling process was something I thoroughly enjoyed because I was applying skills I knew from elsewhere in a completely new way. The same can be said for Unity, even if to a lesser extent: I was applying my knowledge of game design and environmental design through work I was producing myself, even if the execution was amateur.

I came to understand general user-interface conventions as I used more varied software on this course, so while Blender's user interface was one of its hardest challenges, it also proved to have the smoothest learning curve. There are still some tools in Blender I can't fully understand, like the skeleton rig construction or the skeleton layering, but those are components I have no doubt I'll understand easily once I pursue my learning with Blender even further.




















