Posts Tagged ‘unity3d’

Unity Networking looks like it is getting closer!

April 15, 2015

Exciting!  It looks like the first iteration of Unity Networking might be implemented in Unity 5.1.  I’m quite enthused by this, as this is the key development I’ve been waiting on before investing heavily in building the next version of my multiplayer networking project.  Well, it’s not necessarily the only option any more: recently I acquired a license for the Bolt engine, which looks like a reasonable alternative contender for this sort of functionality in Unity.  So I may actually proceed with that for the time being, and potentially also for the foreseeable future.

In other news, I was working quite hard a month or two back to deploy a custom build of BrowserQuest on Google App Engine.  I’ll spare you, the reader, the details, but basically I found the main obstacle to be documented in this Stack Overflow issue, which led its author to raise this ticket.  So it doesn’t look like this sort of thing is possible at the moment – at least on Google App Engine.  On Amazon Elastic Compute Cloud, though, it should certainly be quite doable.  Maybe one might even be able to write a Lambda function to handle the Node.js server requests.  Now that would be an interesting learning experience.

Dungeons and Dragons – video and alpha brainstorming

February 28, 2015

Hi folks,

I just thought that I would provide a brief update as to my current thinking on my Dungeons and Dragons project.  As mentioned in a previous post, a key barrier to progress here was having access to a good in-game level editor.  At first I thought opened was the best tool available to work with, and found it quite limited when I played around with it.  However, I recently discovered that it is possible to obtain runtime level editors from the Unity Asset Store, relatively affordably.  So I followed through and purchased one, downloaded it, and then ran it locally in Unity 4.6.2.  You can see the results of my investigations below:

As you can see, it is possible to raise and lower terrain, and add objects to the scene, such as buildings, trees, and jeeps, etc.  So basically everything that I was after, and more.  Furthermore, the package provides complete access to the underlying source, so it is eminently feasible to plumb additional functionality into the program, which I think is quite exciting.

Hence, it now becomes possible to start working towards a first version of my dungeons and dragons dungeon master style multiplayer game.  In particular, I think there are a number of things that I’d now like to do:

  • Plumb in the multiplayer functionality from my previous project.
  • Introduce a simple and straightforward database model for players and dungeon masters.
  • Allow players to spawn in a world in an appropriate way (without falling forever).
  • Allow the dungeon master to move players around.
  • Allow the dungeon master to switch their camera to a creature under their control and move it around.

There are other things I’d like to do, but will probably defer for a later release:

  • Allow the dungeon master to switch their view to a player, default: passive (optional: override player controls, and notify the player that their controls are overridden).
  • Saving and loading levels.  The runtime level editor that I acquired does have a save and load functionality (it encodes levels as text files), but it doesn’t work quite the way I’d like it to currently.  Ideally levels should be written as blobs to a database with a secondary key being the user who generated the level so that they only have access to their level list.
  • Give the dungeon master access to master database controls, eg, a switch to reset player positions (if some of them have started to fall forever, for instance).  I’d probably like to give players a reset switch, too (but limited to themselves only, of course).

And then, in a release after that:

  • Enable persistence of player position in each of the worlds in which they have played.  So for instance if player Rodriguez has played Bob’s level ‘OrcTown’ and Jill’s level ‘CottageIndustry’, if either respective DM loads said levels and then Rodriguez logs back in, Rodriguez should appear with the position and rotation coordinates he last held while playing that level.

Plumbing in the multiplayer functionality should be relatively straightforward.  I will need to create prefabs for the players, of course, or at least migrate the ones in my earlier project across.  I will need to create any necessary prefabs for creatures that the dungeon master is to introduce into the world.  I will need to reintroduce a lobby, the ability to log in and have passwords authenticated, and the ability to create a game (to become the DM of that game) or join a game (that another player has created).  A messaging server will need to be created (using SmartFox, for instance, though that may change with Unity 5), and some sensible database structure built.

On creating a game, a player should have their ‘DM’ flag in the player table set to ‘true’.  If a player joins a game, their ‘DM’ flag should be set to false.

A game should not be joinable (in the lobby) if a terrain has not been deployed.  In this instance the CanJoinGame flag in the player table for a DM who is preparing a level should be set to false, and the game should not appear in the list of available games in the lobby.  If a game is full (eg, 4/4 players), it should not appear in the list of games in the lobby either, but that is something that can be deferred until later.  (Also, one might want to distinguish between ‘public’ and ‘private’ games.  If a game is public, anyone can join; if a game is private, only users who have been added to the dungeon master’s campaign list in PlayerCampaign should have visibility of the game.)  Ultimately, too, one would like to be able to filter games in progress, so that one could find exactly the one one wished to play.
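To make the intended lobby rules concrete, here is a rough Python sketch of the visibility filter described above.  The field names (can_join, public, the campaign list) are just stand-ins for whatever the real schema ends up using, and the real implementation would of course live server-side:

```python
def visible_games(games, viewer, campaign_lists, max_players=4):
    """Return the games a given player should see in the lobby.

    A game is hidden if its DM has not yet deployed a terrain
    (can_join is False), if it is full, or if it is private and the
    viewer is not on the DM's campaign list.  Field names here are
    illustrative stand-ins, not the actual schema.
    """
    visible = []
    for game in games:
        if not game["can_join"]:                  # terrain not deployed yet
            continue
        if len(game["players"]) >= max_players:   # game is full
            continue
        if not game["public"]:
            # private game: only players on the DM's campaign list see it
            if viewer not in campaign_lists.get(game["dm"], []):
                continue
        visible.append(game)
    return visible
```

The same predicate could later double as the server-side join check, so the lobby list and the join permission can never drift apart.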

Once a player has joined a game, they should spawn just above the terrain, say in the centre of the map.  Since terrain can be raised and lowered, this will be an interesting (but important) problem to solve.  Alternatively, perhaps the players could appear as tokens that the dungeon master (or the players themselves) could place at their will and leisure.  This might be a better way to go: players would have a trimmed-down version of the dungeon master’s interface, but they would still be able to drag and drop themselves, once, into the game.  Then they could jump between the global map view and a first person perspective from any of the creatures they control (the default being just one, their character).
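As a rough illustration of the ‘spawn just above the terrain’ idea, here is a hypothetical Python sketch using a heightmap lookup.  In Unity itself one would more likely sample the terrain height (or raycast downwards) at the chosen point, but the principle is the same:

```python
def spawn_position(heightmap, clearance=1.0):
    """Pick a spawn point just above the terrain at the map centre.

    heightmap is a 2D grid of terrain heights (a list of rows).  The
    spawn sits `clearance` units above the surface, so that however
    the DM has raised or lowered the terrain, the player never spawns
    inside it (or falls forever beneath it).
    """
    rows, cols = len(heightmap), len(heightmap[0])
    cx, cz = cols // 2, rows // 2
    y = heightmap[cz][cx] + clearance
    return (cx, y, cz)
```

The same lookup, applied at an arbitrary (x, z), would also cover the token drag-and-drop placement variant.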

This leads to the need for a fairly important feature in the first version of the game: the ability to toggle between the global controller / camera, and to avatar into a currently controlled token, and use its controller / camera.  This may be difficult to do but I think that the payoff will be worth it.

Moving tokens and objects around should be relatively straightforward, as that is already built into the base template.

Unity, Git, and other meanderings

August 2, 2013

My Unity projects have been starting to become a bit large and in need of a good pruning / refactoring / reorganisation.  However, I am loath to do so with abandon, since I am afraid that I might break something irretrievably!  So, I’ve started to develop an interest in version control systems.  Indeed, with the release of Unity 4.2 I was interested in what could be done with the new version control system that is built to work well with it.

My initial best guess was that it would be Git – instead, it is a different animal, namely, Perforce.  Perforce, apparently, is supposed to be uniquely suited to dealing with teams working on very large (100s of MB, if not upwards of 1 or 2 GB) projects, offering the “best of Git” together with the “best of Subversion”.  Merging is apparently done on the server, rather than on individual machines (terminals?) of users – as opposed to Git, where the merges must be done locally.  Interestingly, it appears that it is possible to obtain fairly reasonably priced plans with the tech – in the cloud – for a reasonable amount of storage, too!

So I had a bit of a poke to see what the story was with implementing a setup of the thing.  Largely speaking, however, I found the profusion of products, terms, and other general gobbledegook on the main Perforce website a bit too confusing, and difficult to navigate to find, well, the main Perforce VCS (maybe it is P4? not sure).  Indeed, even then, I’m still not convinced that it suits my purposes – since I am still working largely by myself with Unity, and therefore am just after a VCS where repositories can be locally managed.  This does not mean that using Perforce isn’t ultimately the best option for VCS with Unity – I’m sure that there may well be some logic to the whole thing – but I’m currently interested in a quick fix.

So, since I already understand Git – and have never version controlled a Unity project before, anyway – I decided to have a poke around the web for ways to get started directly using Git with Unity.  A post on the Unity forums led me to this series of posts, which is essentially more or less a comprehensive ‘how to’ of using these technologies together.

In other news, I’ve been starting to work on my first open-source collaboration.  This has been quite interesting.  I’ve been learning about OAuth for Google and Twitter, in terms of “logging in” to a service from a website.  Fascinating stuff!  Also some deep and interesting things about Python, too.

While I was ferreting around on StackOverflow, moreover, and trying to understand the structure of the project in question, I came across another goldmine of useful information regarding things Pythonic – a series of three posts written by a fairly knowledgeable chap.

There is also an extremely useful post / discussion about *args and **kwargs here.   The top two comments there clarified matters for me on these terms no end.

Regardless, I’m currently up to the stage with the project where I’ve managed to, sort of, get Twitter authentication working.  The main roadblock I’m encountering now is a ‘pickle error’ (yes, another Pythonesque thing I’d never encountered before this point).  It turns out that in certain circumstances Python encodes objects (like functions or classes) as a bytestream (this is ‘pickling’), which is sent somewhere, and then decoded (this is called ‘unpickling’).  However, certain things cannot be pickled.  For instance, a function defined locally inside another function (as I had done), rather than as a method on a module-level class, is a no-go.
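Here is a minimal Python demonstration of the issue.  Pickle serialises functions by reference (module plus qualified name), so a function defined inside another function has no importable name and cannot be pickled, whereas an object whose class lives at module level is fine:

```python
import pickle

def make_handler():
    # A function defined inside another function: pickle stores
    # functions by reference (module + qualified name), and a local
    # function has no importable name, so it cannot be pickled.
    def handler():
        return "hello"
    return handler

class Authenticator:
    # A method on a module-level class is fine: an *instance* is
    # pickled via its importable class, not via code bytes.
    def handler(self):
        return "hello"

def can_pickle(obj):
    """Return True if obj survives a pickle round trip."""
    try:
        pickle.loads(pickle.dumps(obj))
        return True
    except (pickle.PicklingError, AttributeError, TypeError):
        return False
```

This is exactly why consolidating the loose functions into an authentication class (as below) makes the error go away.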

So basically the plan to eliminate the pickle error is to consolidate a few functions, hopefully in a relatively sensible manner, into an authentication class.  That should be interesting.

Guards vs Skeletons

June 4, 2013

Hi folks,

So I’ve done a little more work on my project.  This time I’ve sandboxed the progress in a new scene, with a road and some buildings.  The main aspects of experimentation have been Unity 4.1’s new MecAnim animation system, patrol pathing, and basic AI for recognition, attack, retarget, knockout and respawn.

TL;DR – here is the video:

As you can see the combat behaviour is not totally optimised but the general idea is there.

In terms of working materials, I have used a couple of assets from the asset store (one purchased – the guard and associated mecanim animations, the other free – the skeleton model and animations), and also some AI scripts following a breadcrumb trail starting here.  Working with the AI scripts was quite interesting.  Amongst the key learnings I gathered (or rather, the key bugs), I discovered that quite a few were simply due to the fact that I had transforms in the wrong location.  Making extensive use of Debug.Log statements to check that the flow of the script was working as I expected was very useful and helped guide me to avoid various pitfalls.  I also found that simply defining obvious booleans (eg, Engaging, Attacking, Friendly) was helpful for debugging.

Basically what I wanted to implement was the following – a situation where there are three types of creature: players, guards, and mobs (enemies).  Enemies view players and guards as viable targets; players and guards view enemies as viable targets.  If none of the other type are present, guards (or monsters) will path along a preset sequence of waypoints.  If a creature hostile to them passes into their line of sight (I found that ray collision checks largely came up empty unless I raised the ray emanating from the creature by a nominal height; this was due to the nature of the capsule colliders I am using for creatures), or is sufficiently close (their “noise” > a given threshold value), then they move to engage that target.

Once they are adjacent to the target they move to attack it (I found that I needed to reset the transform periodically for creatures when they were attacking, otherwise it bugged out, ie they did not stay fixed in place.  Even when I did this I became confused since it was still bugging out, but then I realised I need to fix the rotation as well.  So prior to attack I calculated the initial position and rotation information and then fed these as temp vars into the attack function).  If the target moves out of range they follow and re-engage.  If the target is reduced to < 0 hp it is destroyed; the associated spawn point for that target is then empty.  The spawn point then detects this and a cooldown timer is initiated.  Once this cooldown timer reaches 0 a new creature is popped at the spawn point and starts to patrol, etc.  As part of the spawn process the list of targets for the creature which struck the knockout blow is refreshed, and a new goal (hostile creature) assigned if there are any in range.  It then proceeds to attack the new hostile creature.
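The destroy-cooldown-respawn cycle described above can be sketched in isolation.  This is a hypothetical Python version – the names and timings are illustrative, not taken from the actual scripts:

```python
class SpawnPoint:
    """Sketch of the respawn cycle: when the creature tied to this
    spawn point is destroyed, a cooldown starts; when the cooldown
    reaches zero, a replacement creature is popped and begins to
    patrol.  Numbers are illustrative only."""

    def __init__(self, cooldown=10.0):
        self.cooldown = cooldown
        self.timer = 0.0
        self.creature_alive = True

    def notify_destroyed(self):
        """Called when the spawn point detects its creature has died."""
        self.creature_alive = False
        self.timer = self.cooldown

    def tick(self, dt):
        """Advance by dt seconds; returns True if a creature respawned."""
        if self.creature_alive:
            return False
        self.timer -= dt
        if self.timer <= 0:
            self.creature_alive = True   # pop a new creature, start patrol
            return True
        return False
```

In the actual game the respawn event would also trigger the target-list refresh for the creature that struck the knockout blow, as described above.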

I decided that skeletons should have 50hp and guards 100hp.  They both do a similar amount of damage.  In the video you can see the cycle of destroy-respawn-destroy-respawn, an unending tide of guards and skeletons leaping to rejoin the fray to replace their fallen comrades.

One more thing is worth mentioning.  I also kept track of variables to make sure that the animation playing is appropriate – this is the MecAnim part of the picture.  So the guards and skeletons have state machines for their animations, as well as for their AI behaviour.

————————————-

I am reasonably satisfied with the outcome, but I know that in theory the implementation could be a lot cleaner.  My code is not totally unreadable but still is, for some scripts, way too long and should be written in a more sensible, concise format.  I’m also certain that in a few places I’ve written out the same functionality more than once; a bit more focus on reusability certainly would not go amiss.  And also redundancy – I’m fairly certain that the code is not as tight as it probably should be, in fact, I know that it isn’t.  This applies more generally to the project at large.  There are many places in my work where I’ve coded the same functionality for different things in different contexts.  Or even functions that have not been used.

So a little work needs to be done at some point on said matters.

In particular, todos that spring foremost to mind with this particular line of work (many of which, such as style, readability, best practice etc. I likely will not implement since this is primarily a “finding my coding feet” ongoing exercise):

  • make it clearer if a creature is in combat (ie, none of this running around in circles spontaneously business) – it could be that artifacts such as running around in circles occur because noise isn’t refreshed immediately for the goal creature when the original target is destroyed.  This could be fairly easily redressed.  Or maybe I might like to simply have a fixed (small) radius wherein if the goal is within that circle the creature automatically moves to attack (this is currently not the case, but it should be, and that is what noise/”sound” detection (generalisation of said idea) was supposed to do)
  • possibly aim to fix creatures in place during combat – try to make things more static
  • morale stat for creatures – chance to run away if allies nearby down / no support (possibly hard) or hp low; then move randomly away from attacker, or seek “wall” tagged object to attempt to throw off attacker?
  • knockout animation & possible option (if player dealt k.o.) for ability to loot the creature; then despawn timer prior to respawn
  • refactor the code to make some of the classes less lengthy and so that they have a more logical structure
  • subclass creature.cs (my main AI script) as part of a SFS_creature.cs script so that it interacts well with smartfoxserver
  • NPC_targeting.cs (my main AI targeting / listing script) – make this dependent more on “Broadcast messages”, possibly local messages.  There is a messaging system I am currently using for health bars – this however is more general and can be extended.  If the world becomes quite large and hence contains 100s of monsters, guards, players etc I do not want every single monster / guard / player to have a list of length 400 encoding and tracking the position of everything else in the world – that would be untenable.  But that is precisely how the program works currently.  Evidently this will have to be fixed, so that only enemies / guards in noise / detection / aggro radius are detected.  Or “targeting radius” which is larger than aggro radius.
  • Incorporate the idea of “threat” levels depending on which creature is damaging which – ie first calculated purely as a function of total damage dealt, then maybe factoring in threat generation abilities (eg taunt – sets to top threat) or threat reduction abilities (feint), and threat reset effects (moving beyond targeting radius – as per point above).  Evidently want the creature with highest threat to be the creature targeted.
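The threat idea in the last bullet might be sketched as follows – a hypothetical Python version, with damage-based accrual, a taunt that jumps to top threat, and a reset for creatures leaving the targeting radius (all names and numbers are illustrative):

```python
class ThreatTable:
    """Per-creature threat tracking: threat accrues as total damage
    dealt, taunt sets the taunter to top threat, and moving beyond
    the targeting radius resets that attacker's threat entirely."""

    def __init__(self):
        self.threat = {}

    def add_damage(self, attacker, amount):
        self.threat[attacker] = self.threat.get(attacker, 0) + amount

    def taunt(self, attacker):
        # taunt: jump the taunter just above the current top threat
        top = max(self.threat.values(), default=0)
        self.threat[attacker] = top + 1

    def reset(self, attacker):
        # e.g. the attacker moved beyond the targeting radius
        self.threat.pop(attacker, None)

    def current_target(self):
        """The creature with highest threat, or None if untargeted."""
        if not self.threat:
            return None
        return max(self.threat, key=self.threat.get)
```

A feint-style threat reduction would just be a negative add_damage, clamped at zero if desired.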

The next part of the project will focus on hooking up a MySQL database with the SmartFoxServer software I am using, as described here.  Basically I’d like to aim at storing 8 fields: username, password, spawn location (x,y,z) and last seen location (x,y,z).  Essentially I would like the last location to be updated every time the player logs out, and then persisted in the database, so that when they log back in (if it is not a first time logon), they appear at that location.

If it is a first time logon they appear at the default spawn location, and last location is null until populated.

The architecture of the goal application will essentially be

client unity application <-> server SFS2X application <-> mysql database

with <-> denoting two way communication, the client application written in C#, and the server application written in Java, as is currently the case; the new component will be the database.  Naturally a database can consist of more than one table, and more than 8 fields.  In a full blown, sophisticated application, there is no reason that all data about the players could not be stored in an appropriate database schema.  So certainly there is considerable room for extending this aspect of my learning, once I’ve managed to get this proof-of-principle up and running.

Consequent to this I might look at readapting this patrol example for SFS2X-deployed unity games, or something completely different.

Ideas as to things I should probably look into doing next

May 27, 2013

Hi folks,

In addition to some of the things mentioned in the last post, I’ve had a bit of a further play, and have identified more clearly what I would like to try to implement next.

Namely, I have been having a look into MecAnim to a greater or lesser extent – in particular finding this tutorial quite useful.  I’ve since accumulated a bit of a library of basic animations, and a few models, old and new.  In addition to this, I’ve been looking into patrol pathing for the creatures in question, sandboxing this work within a new scene.

Essentially what I would like to achieve is to have a group of friendlies – guards, on a set patrol path, and a group of enemies – skeletons, also on a set patrol path (inspired by this resource). When the two come within sight radius of each other, I want them to close for melee combat.  Upon attack, I would like the other creature to sense this and also join combat.  Creatures will have to face in the correct direction and be within a certain range for combat to be permissible.  In addition, I would like each creature to have a set number of hit points.  When this reaches zero, I would like an appropriate animation to play – then a moderate delay prior to destroying the prefab.  Then I would like a replacement enemy / guard to spawn and start walking along the original patrol path until it sees an enemy / guard, then charges to initiate combat, etc.

In terms of animations, I would like a walking animation to play while the creatures are walking, maybe a pause and idle animation at each waypoint, or certain specific waypoints, and if an enemy comes into sight radius / arc I would like the model to charge until within melee distance – at which point it is to stop and play an attack animation.  For a more convincing experience, I might also like to build in a “morale” stat, which determines at what health % the enemy / guard is likely to break and run in a random direction, at reduced speed.

I would also like the player to be able to join combat on one side or the other, most likely in favour of the guards.

For a bonus feature, I would like to have a button on the player HUD that triggers the following effect – random spawning of large boulders falling from the sky, at say 5 or 6 (or maybe 10?) fixed and predetermined spawn points.  That is, at every predetermined time interval (allowing for delta time, so that the implementation is hardware agnostic), a random number (say 1 to 3) of spawn points are randomly selected.  Prior to impact I would like there to be an obvious red glowing marker at the location where the boulder will fall.  Then I would like the boulder to spawn and fall at some predetermined speed, and autocrush any avatars beneath, forcing respawn.  Subsequent to impact I would like the boulder to decay after some period of time, maybe with a lifetime of 1 minute.  (In the intervening time I would like it to be treated as a wall for pathing purposes, according to the particular AI script I will be using.)
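The selection step – picking 1 to 3 of the fixed spawn points each interval, without repeats – is straightforward; a hypothetical Python sketch:

```python
import random

def pick_boulder_spawns(spawn_points, rng=random):
    """Choose a random number (1 to 3) of the fixed boulder spawn
    points, without repeats.  The marker, fall, crush, and decay
    timing would be handled by the game loop; this is only the
    per-interval selection step."""
    count = rng.randint(1, min(3, len(spawn_points)))
    return rng.sample(spawn_points, count)
```

Passing the rng in explicitly makes the selection deterministic under a fixed seed, which is handy for testing (and, later, for synchronising the effect across clients).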

I would then be interested in implementing a dive and roll animation into the player controls, and possibly introduce a stat “reactions”/”reflex” to guards / skeletons, which make it more likely that they will detect imminent impact and also dive out of the way, even if they are in combat.  I would also like to be able to switch this effect off from the player HUD at will.

Later, time and resources permitting, ideally I would like to introduce missile combat animations (bow and arrow, and/or crossbow, sling, etc), and magic combat animations, to allow for guard / skeleton variants that can engage in combat at range.

So that’s my animation plans.  For the time being, as mentioned above, I intend to sandbox this activity, since converting the work to something that is synchronised in a SmartFoxServer might make things harder to build initially; I can do the legwork for that later.

The second thing I am interested in doing, and will probably take much longer to figure out, is to determine how to integrate a MySQL database with a SmartFoxServer application.  The ultimate goal of this will be to implement an architecture wherein I can store player specific login names, passwords (so, for instance if this was a licensed game, they would need to transact, or at least register to obtain a name and password), together with a table containing the set of all that player’s characters and their names, together with a table that, for each character name (uniquely specified for the server (for chat considerations)), contains the information as to health, stats, and inventory.  Inventory would be a string that contained a list of words, eg, potion, wand, leather armour, broadsword, that would presumably be read by the SFS extension jar file and allow the server to transmit to the client the data that relates to the items they have.  That sort of idea, anyhow – so that the data, related to persistence of character state, for the collection of characters each player has, is stored on the server.

So that’s the goal.  I do not expect implementation to be easy, particularly since documentation seems relatively scarce regarding such.  However, this post looks like a good place to start on said matter.  In particular, it is indicated how one might set up a custom SmartFoxServer via the RightScale platform if one wanted to deploy the server in the cloud, say on Amazon EC2, via the 12th point of the recipe (one can directly modify files while booting up/setting up a virtual machine on RightScale).

Game Development #4 – Shop, basic dialogue, and game state persistence

May 26, 2013

I’ve made some further progress with my unity game.  Here is the latest demonstration video:

Previous videos in the series, in reverse order of appearance:

Bridge Battle

Animation and multiplayer demonstration

Transmission of player location via smartfoxserver

In this video, I essentially demonstrate the integration of some functionality from a package from the Unity asset store into my game.  The features added are

  • an action HUD, with “examine”, “use”, “talk”, and “focus” as allowable actions.
  • the ability to shop at a store
  • basic dialogue (managed as a state machine) – together with the option to switch from English to Afrikaans
  • game state persistence
  • basic inventory and ability to equip items

The things that I would like to look into next would be, in no particular order:

  • The option to choose an avatar (or save game) on entry.  Perhaps allow multiple avatars per player, but have persistence of their state (like in a standard MMO).  Allow for different characters to be played (ie, different avatar appearance) but synchronise the chosen avatar across smartfoxserver so that all players see the same thing; this could be done, for instance, by indexing a set of prefabs which was the allowable avatar form list.  Use MecAnim to standardise animations across all avatars, so that there is no ambiguity when it comes to making a call to a player animation in order to do something in particular (eg walk, strafe, jump, run, fight (melee), fight (archery), fight (magic) ).  [This is in fact a great strength of the current incarnation of Unity, the ability to essentially map animations to different character frames]
  • NPC guards on a patrol path.  I thought I might be able to use Rain{Indie} for this, although that might be slightly over the top.
  • A better inventory system.  Allowing one to have object representations for items and mouseover text, if possible.
  • A character representation for item placement.  Allowing one to drag and drop items to particular locations on the character’s frame, or, alternatively, to be able to select an inventory slot and equip / unequip whatever is available in character inventory.
  • Monsters vs NPCs, with spawn points for both – for an ongoing pitched battle.

Two excellent pieces of news

May 22, 2013

Hi folks,

I’m afraid this is not another content post, but the following items may be of interest.

The first is that Unity is now allowing developers who use their software to deploy not only to standalone, and to the Unity webplayer, but now also to Android and Apple mobile platforms for free.  In order to do this, merely open up Unity, open the “Unity” tab, click “Manage License” and then select “Check for Updates”.  Your license will be updated automatically, and mobile unlocked.  Via this post.

Another interesting development – you may recall that earlier I wrote about dotCloud’s amazing open-source release of Docker.  Now the same chaps have made it even easier to independently deploy “sandbox” style applications on same, via the release of a new project on their GitHub page, Sandbox.  The original blog post describing this release is located here.

I have not yet investigated this wonderful new tool, but if you, the reader, are interested in checking it out, the installation instructions, both for docker and sandbox, can be found at this location.

Convolution reverb and audio editing

May 11, 2013

Audio is not really something that is my key interest or passion, but I have heard from someone who knows that, when it comes to sound editing or studio recording, reverb is an extremely useful effect to add to a particular line of the track.  (By line, I mean a singular audio input; multiple audio inputs would obviously come from the different instruments involved.)  There are other important effects, too, that interest sound engineers – here is a list of the most commonly used – but reverb is the most important, since it provides the impression that the instruments are playing somewhere.

Reverb is useful for completely digital music, music recorded in a studio (where the room is designed so that there are no reflections), or some mix of the two.

It is also a fairly lucrative business to get into.  ValhallaRoom, one of the leading amateur plugins, has sold hundreds of thousands of copies; and other commercial software of this flavour can cost thousands of dollars per license.

Valhalla Room reverb

So what is reverb, and how does it work?

Most generally, a room or space will have a characteristic wave equation associated to it: (\Delta - \partial^{2}_{t})f = 0, with particular boundary conditions determined by the geometry.  If we apply a forcing term to this equation, (\Delta - \partial^{2}_{t})f(\vec{x},t) = h(\vec{x},t), then we essentially simulate the addition of sound sources to the room.  This is the general situation we wish to solve for: given an input signal (h), what is the audio response (f)?

HERE THERE BE GREENS FUNCTIONS

It turns out that this problem is soluble if one introduces the concept of a Green’s function, which in audio speak / signal processing lingo (for this particular PDE) is also known as the impulse response.

We solve the equation Lf := (\Delta - \partial^{2}_{t})f(\vec{x},t) = \delta(\vec{x} - \vec{y},t - \tau) instead.  Various techniques can then be used; separation of variables is probably best from this point, if one observes \delta(\vec{x} - \vec{y},t - \tau) = \delta(\vec{x} - \vec{y})\delta(t - \tau); this is the response of the system to a point source of strength one at location (\vec{y}, \tau).  It is then, with a bit of work, possible to determine a solution G(\vec{x} - \vec{y},t - \tau) to the above equation; the Green’s function solution.

Then, with a bit more work, if we observe that

h(\vec{x},t) = \int_{\vec{y},\tau}\delta(\vec{x} - \vec{y})\delta(t - \tau)h(\vec{y}, \tau)d\tau d\vec{y}

but h = Lf, and \delta = LG, which suggests that our solution is

f(\vec{x},t) = \int_{\vec{y},\tau}G(\vec{x} - \vec{y}, t - \tau)h(\vec{y},\tau)d\tau d\vec{y}

which, with a bit of careful reasoning, can actually be proved to be the case.  So our solution is the convolution of the impulse response, G, with our input signal, h.
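In discrete time the convolution above becomes a simple sum, which is exactly what a convolution reverb computes.  A minimal pure-Python sketch (no audio I/O; real implementations use FFT-based fast convolution rather than this direct O(nm) loop):

```python
def convolve(signal, impulse_response):
    """Discrete convolution: out[n] = sum_k h[k] * x[n - k].

    In a convolution reverb, impulse_response is the measured (or
    computed) response G of the room, and signal is the dry input h;
    the output is the 'wet' reverberated signal."""
    n, m = len(signal), len(impulse_response)
    out = [0.0] * (n + m - 1)
    for i, x in enumerate(signal):
        for j, h in enumerate(impulse_response):
            out[i + j] += x * h
    return out
```

Note that convolving a unit impulse [1, 0, 0, …] simply plays back the impulse response itself, which is a handy sanity check.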

PRACTICALITY AND IMPLEMENTATION

WLOG, we can simplify and absorb the boundary and shape information into the metric of the Laplacian, and assume we are within a cubical domain.  With a further simplification – the more practical scenario – we can simply consider the ODE

f''(t) + Af'(t - c) + Bf(t - d) = \delta(t - \tau) to compute a Green’s function for a ‘room’ with shape parameters A, B, c, and d.  Then to return our signal to the listener, we merely compute the convolution of the solution to this with the original input.  For more of this flavour, there is actually an impressive series of lecture notes at this location: mi.eng.cam.ac.uk/~rwp/Maths/vid09 , in particular l2notes.pdf .
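One way to play with this is to integrate the delay ODE numerically and use the result as a discrete impulse response.  The following is a rough forward-Euler sketch (purely illustrative – step size, stability, and normalisation would all need care in practice):

```python
def room_impulse_response(A, B, c, d, dt=0.001, duration=1.0):
    """Forward-Euler integration of the delay ODE above,
    f''(t) + A f'(t - c) + B f(t - d) = delta(t),
    giving a discrete approximation to its Green's function.  A and B
    are the 'room' shape parameters, c and d the delays."""
    n = int(duration / dt)
    n_c, n_d = int(c / dt), int(d / dt)
    f = [0.0] * n
    v = [0.0] * n            # approximates f'
    v[0] = 1.0               # the unit impulse appears as a jump in f' at t = 0
    for i in range(n - 1):
        v_delayed = v[i - n_c] if i >= n_c else 0.0
        f_delayed = f[i - n_d] if i >= n_d else 0.0
        accel = -A * v_delayed - B * f_delayed   # f'' from the ODE
        v[i + 1] = v[i] + dt * accel
        f[i + 1] = f[i] + dt * v[i]
    return f
```

Convolving the returned list with a dry signal then yields the ‘room’-coloured output, with A, B, c, and d as the tweakable knobs.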

But can we do more than this?  Perhaps, if we allow for nonlinear terms.  Or higher order, ‘soliton’ terms – although I’m not sure that this consideration would be helpful.  But whatever the outcome, at the end of the day, we get a system that has tweakable parameters, like ValhallaRoom, above.

ADVANCED CONSIDERATIONS

So that’s all well and good, but what about the general case?  What happens if one considers an environment with a very complex spatial geometry indeed?  Would it be possible to say create a game environment and experience sound sources from different locations in the scene?  This would be an excellent (if not extremely difficult!) project.

Well, it turns out that this work has already been done.  Some researchers have managed to implement the full solution of the general wave equation, in a general game environment, like Unity, by precomputing the impulse response (requiring anywhere between hours and many 100s of hours of computer time), to ‘bake’ in the impulse response so that it can be convolved with sound sources as the player is walking around the scene.  Note: ‘baking’ is also used for other things that would be computationally expensive to do on the fly in-game, such as lighting.  Apart from the fact that these folks have a working solution, they also seem to have got around a number of the roadblocks that have short-circuited the practicality of such in the past, such as compressing the size of the data file needed to store the impulse response information for an outdoor scene from about 100 gigabytes down to a few megabytes (maybe using techniques such as fast Fourier transforms, as well as simplifying assumptions), and using various tricks to make shifting geometry and movement for audio effects computable on the fly.

In the demonstration video, some really cool effects are demonstrated: diffraction of sound around objects, the characteristic ‘double-ring’ of a bell in a bell tower caused by sound reflections, muffling or muting of sound caused by obstacles being in the way, and more!  It is very, very cool.  If such a technology was available for Unity as a plugin/asset/resource from the Unity Asset Store, for, say, an affordable price (like ValhallaRoom) I’d easily snap it up in an instant.

The paper itself, describing a sketch as to the algorithm and method used in the video above, is located here.  The video itself is downloadable from this location.

PLANS / PROJECTS

Consequent to this summary of the basics and also the state of the art, doubtless the question might be raised, where does this leave me on this matter?  Well, as mentioned before, I’d love to use the technology demonstrated above in a Unity game.  As a secondary consideration, I’d be interested in having a play with a simple reverb signal modifier, seeing if I can hack one together, as described above.

Regardless, I’ll be sure to write if I make any progress along such lines.