Monday, January 28, 2019

battleMETAL - And Yet, Somehow it all works - part 3 - Animatics

Quite an animated conversation

One of the features I wanted battleMETAL to have was ‘animatics.’ For background, an ‘animatic’ is like a glorified slideshow: a series of static images displayed in a sequenced order, like a very slow animation. Why would battleMETAL need these exactly? For character transmissions, of course. I wanted characters to be able to talk at the player, or about the player to other characters, in a way the player could visually see and understand. Radio transmissions that don’t show some sort of image are hard to follow in the ebb and flow of a video game, and the ‘radio operator character’ the player interacts with is a long-established mechanic in gaming.


Part of the original scope of battleMETAL was that the player wouldn’t interact with characters outside of combat. This was mainly due to time and resource constraints, but also because of the story and who the player is in this world. The player still needs some context to go off of while playing, though, so I felt a workable animatic system could cover this gap. The system I finished is ‘good enough’ but could probably use some refactoring to make it ‘best’.


I started with CSQC, since animatics are, in my mind, an entirely client-side event. I created a short set of API functions to handle the flow of the animatic system. The overall design is something along the lines of:
 

    Receive event from server
    Load animation file
    Validate file
    Setup playback variables
    Render frame 1 
    Render next frame
    End playback
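
The flow above can be sketched as a small client-side state machine. A minimal QuakeC sketch, where names like Anim_Begin, Anim_Load, Anim_End, Anim_DrawFrame, and ANM_FRAME_COUNT are illustrative stand-ins, not battleMETAL’s actual API:

```
// Illustrative sketch of the CSQC animatic flow; only ANM_FRAME_TIM
// comes from the real code, the rest are hypothetical names.
float ANM_PLAYING;      // TRUE while an animatic is active
float ANM_CUR_FRAME;    // index of the frame being rendered
float ANM_FRAME_END;    // time at which the current frame expires

void(string fname) Anim_Begin =
{
	if (!Anim_Load(fname))      // load + validate the .anim file
		return;
	ANM_PLAYING = TRUE;
	ANM_CUR_FRAME = 0;
	ANM_FRAME_END = time + ANM_FRAME_TIM[0];
};

void() Anim_Think =             // called once per CSQC draw frame
{
	if (!ANM_PLAYING)
		return;
	if (time > ANM_FRAME_END)
	{
		ANM_CUR_FRAME = ANM_CUR_FRAME + 1;
		if (ANM_CUR_FRAME >= ANM_FRAME_COUNT)
		{
			Anim_End();         // restore music, hand control back
			return;
		}
		ANM_FRAME_END = time + ANM_FRAME_TIM[ANM_CUR_FRAME];
	}
	Anim_DrawFrame(ANM_CUR_FRAME);
};
```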


The kick-off is the server sending a command to the client to begin an animatic event. Quake was designed from the ground up as a client-server game, even in single player. Single player in vanilla Quake is just a local game server with a max player count of 1. So even though battleMETAL only ever has one player on its server, the code still treats that player like any other remotely connected client, which I think is a good thing. On the server, I created a custom map object, event_animatic, that can be triggered by player touch or by other map objects; when triggered, it sends a command to the target client to begin the animatic.
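
A server-side trigger like that could look roughly like this in QuakeC. This is a hedged sketch, not battleMETAL’s actual entity code: the .anim field, the cl_animatic console command, and the one-shot behavior are all assumptions, and stuffcmd is just one possible transport for the server-to-client message:

```
// Hypothetical spawn function for event_animatic. The "anim" field
// would name the .anim file; the client's CSQC intercepts the command.
.string anim;

void() event_animatic_touch =
{
	if (other.classname != "player")
		return;
	// One simple transport: stuff a console command down to that client.
	stuffcmd(other, strcat("cl_animatic ", self.anim, "\n"));
	remove(self);               // assume a one-shot trigger
};

void() event_animatic =
{
	InitTrigger();              // standard Quake trigger setup
	self.touch = event_animatic_touch;
};
```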


When the player’s CSQC receives the command (in this case, the player’s “state” variable changing to _ANIMATIC), the CSQC begins playback of the animatic. The first step is loading the file. I decided that storing animatics in plaintext files was super handy for both readability and performance: the game engine doesn’t need to keep possibly dozens of animatic scripts in memory during gameplay, and because animatics are a low-delta event with no read/write commands, having the game load them from text files seemed like a good approach. I’ve always enjoyed the JSON syntax for data storage, considering it a better alternative to XML, so, leveraging the Darkplaces source port’s ability to parse text files, I whipped up a crude JSON-style parsing function. Besides, it’s always fun making your own file extensions.

So in test.anim we see the following:

{
  'music' : ,
  'nomusic' : 0,
  'backimg' : ,

  'trans' : ,
}
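
Parsing a file like this can be sketched with Darkplaces’ FRIK_FILE builtins (fopen/fgets/fclose) plus tokenize/argv. The sketch below is illustrative, assuming a hypothetical Anim_StoreMeta helper and glossing over the real parser’s cleanup work:

```
// Crude JSON-ish line parser sketch using Darkplaces builtins.
float(string fname) Anim_Load =
{
	float fh;
	string line;

	fh = fopen(fname, FILE_READ);
	if (fh < 0)
		return FALSE;           // validation: file must exist

	while ((line = fgets(fh)))
	{
		if (line == "{" || line == "}")
			continue;           // block markers carry no data
		// "'music' : track01," -> tokens: 'music'  :  track01,
		tokenize(line);
		// (the real parser would also strip quotes and trailing commas)
		Anim_StoreMeta(argv(0), argv(2));
	}
	fclose(fh);
	return TRUE;
};
```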

The first { } block is always the ‘metadata’ tags for the animatic file. Music is the sound to play when the entire animatic starts up; it plays until the end of the animatic. NoMusic shuts off any CD music playing in the background (don’t worry, the stopped music resumes when the animatic finishes). BackImg is an optional single background image rendered beneath all subsequent image frames. Finally, Trans is ‘transparency’, the global alpha value of the animatic during playback. These values are stored in a one-dimensional string array simply named ANM_META_DATA. Once the metadata is loaded, the text parser expects frame data to come next. Frames are defined by { } as well, and while there’s no true limit, for the sake of sanity I imposed a 20-frame max (do you really need moar?).


{
  'image' : gfx/hud/target_box.png,
  'pos' : 0.79 0.225,
  'size' : 0.05 0.05,
  'sound' : sound/anim/t1m4_lineb.ogg,
  'text' : ,
  'alpha' : 0.85,
  'color' : 1 0 0,
  'text_color' : 0 0 0,
  'time' : 3,
}

Image is the image this frame displays, and it can be set to null / empty.

Pos is the screen position. Important: this is given as a percentage of the screen, so the coordinates are screen-size agnostic.

Size is the size of the image. Important: this too is a percentage of the screen, for the same screen-size-agnostic reason.

Sound is the sound file to play on this frame. It’s non-looping and is not clipped by any sound on the next frame, so playback overlap is a risk here.

Text is any text you want rendered; I have it hardcoded to render centered at the bottom of the screen.

Alpha is the transparency of the frame.

Color is the color tint applied to the frame image and text, an RGB vector with each channel between 0.001 and 1.0.

Text_color is the same as Color but overrides the color for the text.

Time is how many seconds to render the frame.
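
Putting the percentage-based layout to work at draw time might look like the sketch below. It assumes a scr_size global filled in from CSQC_UpdateView’s width/height, and an illustrative Anim_DrawFrame name; drawpic is the real Darkplaces CSQC builtin:

```
// Scale percent-of-screen values to the current 2D canvas each draw
// call, so the layout survives resolution changes.
vector scr_size;    // assumed: set from CSQC_UpdateView's width/height

void(float f) Anim_DrawFrame =
{
	vector p, pos, siz;

	p = ANM_FRAME_POS[f];
	pos_x = p_x * scr_size_x;   // e.g. 0.79 -> pixel column
	pos_y = p_y * scr_size_y;
	p = ANM_FRAME_SIZE[f];
	siz_x = p_x * scr_size_x;
	siz_y = p_y * scr_size_y;

	if (ANM_FRAME_IMG[f] != "")
		drawpic(pos, ANM_FRAME_IMG[f], siz,
		        ANM_FRAME_COLOR[f], ANM_FRAME_ALPHA[f], 0);
};
```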


In addition to these tags, the code automatically does a fade-in/fade-out effect for each frame, though I’m still on the fence about whether that’s necessary. When the text parser reads these tags into the code, the frame’s index number is used as the primary key into a series of parallel arrays. I’m not particularly happy with this solution, but I’ve chalked it up to the roughness of Quake C more than anything else…
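
The fade effect can be a simple alpha ramp over the edges of each frame’s play time. A self-contained sketch (the half-second ramp length and the Anim_FadeAlpha name are my assumptions; only ANM_FRAME_TIM and ANM_FRAME_ALPHA are from the real arrays):

```
// Hypothetical fade ramp: alpha rises over the first half-second of
// the frame and falls over the last half-second. frame_start is the
// time at which the frame began rendering.
float(float f, float frame_start) Anim_FadeAlpha =
{
	float elapsed, remaining, a;

	elapsed = time - frame_start;
	remaining = ANM_FRAME_TIM[f] - elapsed;
	a = ANM_FRAME_ALPHA[f];

	if (elapsed < 0.5)
		a = a * (elapsed / 0.5);       // fade in
	else if (remaining < 0.5)
		a = a * (remaining / 0.5);     // fade out
	if (a < 0)
		a = 0;
	return a;
};
```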

string ANM_FRAME_IMG[20];
string ANM_FRAME_SND[20];
float ANM_FRAME_TIM[20];
vector ANM_FRAME_POS[20];
vector ANM_FRAME_SIZE[20];
string ANM_FRAME_MSG[20];
float ANM_FRAME_ALPHA[20];
vector ANM_FRAME_COLOR[20];
vector ANM_FRAME_TXT_CLR[20];


Did I mention that collections of any sort are non-existent in Quake C?

The outcome of all this is a simple but working system for cutscenes and character transmissions in battleMETAL. There’s still some refinement to be had, mostly due to a lack of use and testing of the code, but as it stands the code works.
