
Monday, September 17, 2012

Project Green: Enemy AI and NGUI

Been doing lots of updates on the game lately: completing the enemy AI, fixing some bugs, and porting the game's main UI into NGUI (Next-Gen User Interface).

Enemy AI

Originally, I was planning to write a post entirely about the game's enemy AI, but I think I'm just gonna squeeze a shorter version into this post instead.

First of all, Project Green is a stealth adventure game, inspired by Hitman, Metal Gear Solid, and Zelda. Thus, you can sorta imagine how the AI would turn out.

In Hitman's format, the enemy AI behaves like this:
- minding their own business at first (idle)
- when they spot a mysterious figure, they become suspicious of him
- after a while, if the mysterious figure is still there, they walk over and investigate
- if he turns out to be (really) suspicious, they take out their gun and start shooting (at 47)

In MGS's format, the enemy AI is more alert, because anyone other than their own kind is the enemy (you can't wear a disguise and blend into plain sight as much as you can in Hitman):
- minding their own business, or just patrolling around (idle)
- if they hear a sound, they turn around and suspect something fishy
- if they hear more of those sounds, they walk over to the noise source and investigate
- if they see Snake, they aim their gun and start shooting

In Zelda's format... Okay, I haven't played most of the Zelda games, except Spirit Tracks (DS), which contains a small amount of stealth element. Like the early part where Link has to escort Zelda out of the castle without getting the guards' attention (they can see you, but you can't let them see Zelda sneaking out). Their enemy AI is pretty much static, except when they see Zelda, they'll run over to her and bring her back. It's still a pretty exciting gameplay experience though.

For Project Green, I originally coded the AI in Hitman's format. But when I started playing Snake Eater this March (yeah, I only started diving into the MGS world this year; I had never played a single MGS game before March 2012), my mind was totally blown by how amazing that stealth-action game is, and my AI design started to follow the MGS way because it fits better. I mean, it's the main character (a human) against a bunch of barbaric ogres; they have to get alerted when they spot someone who isn't their kind.

The AI for Project Green has 4 major states:
- Idle: stand or walk around a given set of waypoints
- Suspect: hear a sound somewhere and suspect something
- Investigate: walk towards the suspected sound source and check around
- Alert: see Green (the name of the main protagonist) and start chasing him

Just like those games I mentioned above. There are two kinds of "trigger events" that activate these states:
- the "I see you, and you're not my kind, so I must attack you" event
- noise event

The first one is pretty self-explanatory: when an enemy sees you within a given distance, they'll attack you. The second one activates the states based on the amount of noise that Green makes (when he walks or runs around), kinda like MGS. Gonna add in the "love box" item later on.
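To give a rough idea of how the four states and the two triggers fit together, here's a minimal sketch. It's not the actual Project Green code; the class, field and method names (EnemyAI, sightRange, hearingRange, HearNoise and so on) are made up for the example.

```csharp
using UnityEngine;

// Illustrative only: a minimal four-state enemy FSM, not the actual Project Green script.
public class EnemyAI : MonoBehaviour
{
    public enum State { Idle, Suspect, Investigate, Alert }

    public Transform player;          // Green
    public float sightRange = 10f;    // "I see you" trigger distance
    public float hearingRange = 15f;  // how far a noise can be heard
    public float suspicionDelay = 2f; // how long to stay suspicious before investigating

    State state = State.Idle;
    Vector3 noiseSource;
    float suspicionTimer;

    void Update()
    {
        // Sight trigger: seeing Green overrides everything else.
        if (Vector3.Distance(transform.position, player.position) < sightRange)
            state = State.Alert;

        switch (state)
        {
            case State.Idle:
                // stand around or walk between waypoints
                break;
            case State.Suspect:
                suspicionTimer += Time.deltaTime;
                if (suspicionTimer > suspicionDelay)
                    state = State.Investigate;
                break;
            case State.Investigate:
                // walk towards noiseSource, look around,
                // then drop back to Idle if nothing is found
                break;
            case State.Alert:
                // chase Green, attack, etc.
                break;
        }
    }

    // Noise trigger: called when Green makes a sound (walking, running, ...).
    public void HearNoise(Vector3 position, float loudness)
    {
        if (state == State.Alert) return; // already chasing, ignore noises
        if (Vector3.Distance(transform.position, position) > hearingRange * loudness) return;

        noiseSource = position;
        if (state == State.Idle)
        {
            state = State.Suspect;
            suspicionTimer = 0f;
        }
        else if (state == State.Suspect)
        {
            // repeated noises push the guard straight into investigating
            state = State.Investigate;
        }
    }
}
```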


NGUI

Been thinking a lot about optimizing the game lately. So I went and got the NGUI framework from the Asset Store last week, while September's Asset Store madness was still on.

What is NGUI? It's short for "Next-Gen User Interface". The description from the original site says:
"NGUI is a powerful UI system and event notification framework for Unity (both Pro and Free)... ... For a programmer this means a much easier time when it comes to working with the kit — from extending its functionality to tweaking the existing one. For everyone else this means better performance, less frustration, and more fun."

The description is quite broad, but ultimately it means squeezing all your UI into a single draw call.

The past few days have been quite a hard time for me, as I was trying to port my original Unity UI to NGUI while making sure everything works properly. In Unity's GUI, button events are handled simply with "if (GUI.Button())"; in NGUI, you need to add a script to every single button you create and handle its OnClick, OnHover, OnPress and other mouse/touch events. Which means more work for me to do.
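Roughly, the difference looks like this (just a sketch, the actual button scripts in the game are different): the old immediate-mode GUI draws and checks the button inline every frame, while an NGUI-style button gets a small component whose OnClick / OnHover / OnPress methods are called by the UI event system.

```csharp
using UnityEngine;

// Old way: Unity's immediate-mode GUI, drawn and checked in OnGUI every frame.
public class OldMenu : MonoBehaviour
{
    void OnGUI()
    {
        if (GUI.Button(new Rect(10, 10, 120, 40), "Start"))
        {
            Debug.Log("Start pressed");
        }
    }
}

// NGUI way: attach a small script to the button object and
// respond to the events it receives (OnClick, OnHover, OnPress, ...).
public class StartButton : MonoBehaviour
{
    void OnClick()
    {
        Debug.Log("Start pressed");
    }

    void OnHover(bool isOver)
    {
        // highlight the button, play a sound, etc.
    }

    void OnPress(bool isPressed)
    {
        // touch/mouse down and release
    }
}
```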

But all that work is worth it, for the sake of my game running more smoothly on mobile devices.

Up Next

Features I'll be working on next:
- Group AI: enemies acting together, attacking together, etc.
- Mission Checking: set and check missions, check the player's stats and calculate a simple result
- Cutscene System: Simple camera tracking, characters animation, dialogue system
- NPC
- Item Collecting: more of an improvement/update, as the feature already exists.
