From ray tracing to AI: best gaming technology advances in the last decade

Back in 2010, Halo: Reach launched to much fanfare. Almost ten years later, the revamped version has just hit PC, receiving critical acclaim once more and bringing the decade full circle.

A lot has changed in those ten years, and you can now enjoy Halo: Reach in 8K at a smooth 60 frames per second (if budget is no object for your PC hardware, that is). And of course, a lot more has changed in the broader world of video games throughout the 2010s.

In this article, we’re going to highlight a selection of technological advances which we feel have had a big impact on the gaming industry. That includes the introduction of ray tracing, adaptive sync technology, and huge leaps in existing tech like motion capture, along with other advancements which have combined to make games far more realistic-looking and immersive – and more besides.

Ray tracing

(Image credit: Remedy Entertainment)

Ray tracing may be nothing new in the world of movies, but it is in games, and it hit the headlines in a big way when Nvidia launched its GeForce RTX graphics cards in 2018 (although note that ray tracing had popped up previously on PCs, in very limited fashion, in early tech demos).

For the uninitiated – if there is indeed still anyone out there yet to hear about this technology – as the name suggests, it involves tracing the paths of individual light rays to create a lifelike depiction of how light works in the real world.
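At its heart, a ray tracer fires a ray through each pixel and tests what it hits; shadows and reflections are built from more such rays. As a purely illustrative sketch – not the method any particular engine uses – here is the classic ray-sphere intersection test in Python:

```python
import math

def ray_sphere(origin, direction, center, radius):
    """Distance along the ray to the nearest sphere hit, or None for a miss.
    direction is assumed to be a unit vector."""
    oc = [o - c for o, c in zip(origin, center)]
    b = 2 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * c          # discriminant of the quadratic (a = 1)
    if disc < 0:
        return None               # the ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2
    return t if t > 0 else None   # ignore hits behind the camera

# A ray fired straight down the z-axis at a sphere centered 5 units away
print(ray_sphere((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # 4.0 (the near surface)
```

Real-time ray tracing amounts to tests like this (and the triangle-mesh equivalent) performed billions of times per second, which is why dedicated hardware support matters so much.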
The result is far more realistic shadows, reflections and so forth, for much better-looking graphics all round (compared to rasterization, the traditional method of real-time rendering).

When RTX GPUs were first launched, there were a number of folks arguing that ray tracing was really a dastardly plot cooked up by Nvidia to sell a load more expensive graphics cards. And while there’s an element of truth in that – it’s certainly part of a plan to further dominate GPU sales – this isn’t just a moneymaking marketing drive.

Firstly, remember that ray tracing isn’t just an Nvidia thing, but a step forward on the graphics front across the whole games industry.
On PC, AMD and Intel will also be getting on board, and indeed Microsoft has built ray tracing support right into DirectX 12.

But more importantly, both next-gen consoles from Microsoft and Sony are going to be ray tracing-capable, with Tatu Aalto, lead graphics programmer at Remedy (which produced Control), noting that ray tracing coming to Xbox Series X and PS5 represents a “fundamental shift in how to approach computer graphics”.

It seems clear enough, then, that all major game developers are eventually going to be on board with pushing ray tracing – and many already are.
Control looks fantastic and uses ray tracing for elements like diffuse lighting, reflections on glass, and ultra-realistic debris, meaning that compared to the game’s normal graphics, it’s an absolute night-and-day difference.

Now, before we get too carried away, there are lots of caveats around ray tracing in its current state. There are serious performance hits – particularly outside of RTX graphics cards – and games carry different levels of support; Call of Duty: Modern Warfare, for example, only does ray-traced shadows.

But as everyone adopts the tech, and GPUs evolve and are continually honed (complete with driver and software innovations) to better cope with ray tracing, we can expect more new games (and indeed old ones) to fully showcase the visual finery on offer, particularly when the next-gen consoles arrive.
In short, it seems that the future of video game graphics, no less, kicked off in 2018.

Motion capture

Motion capture, known informally as mo-cap, is nothing new, of course. The process of capturing the movements of human actors – traditionally by sticking a hundred golf balls to a black spandex bodysuit – to make for more realistic-looking characters in games has been around since the 90s (and way before then in movies).
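The geometry behind marker-based capture is simple enough to sketch: once the cameras have triangulated each marker’s 3D position, joint angles fall out of basic vector math. A toy example in Python (the marker positions are made up for illustration):

```python
import math

def joint_angle(a, b, c):
    """Angle (degrees) at marker b, formed by markers a-b-c,
    e.g. an elbow flanked by shoulder and wrist markers."""
    ba = [p - q for p, q in zip(a, b)]
    bc = [p - q for p, q in zip(c, b)]
    dot = sum(p * q for p, q in zip(ba, bc))
    return math.degrees(math.acos(dot / (math.dist(a, b) * math.dist(c, b))))

# Hypothetical shoulder, elbow and wrist markers forming a right angle
print(joint_angle((0, 1, 0), (0, 0, 0), (1, 0, 0)))  # 90.0
```

The hard part of mo-cap was never this arithmetic, but tracking dozens of markers reliably through occlusion and fast movement – which is where the decade’s advances came from.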
But in the past decade, some big leaps have been made with the technology which are worth highlighting as revolutionary, taking the animation process to a new level of realism.

Hellblade: Senua’s Sacrifice, a 2017 action-adventure from Ninja Theory, used Vicon’s real-time motion capture system (shown in the above video) to facilitate incredibly lifelike expressions, and coupled with Epic’s Unreal Engine, the game delivered some quite outstanding results in terms of realism.

However, as Engadget points out, there was a bit of fudging with the real-time mo-cap aspect here, such as the actor having to stick to a roughly choreographed set of actions so the software could keep up. But now, Vicon and Epic’s latest mo-cap demo ‘Siren’ offers true live motion capture, whereby a human actor can do whatever they like, with the results transferred directly to a 3D on-screen avatar.

Since Hellblade, computing and GPU power have increased massively.
Jeffrey Ovadya, sales director at Vicon, told Engadget: “Unreal Engine has added even more capability and even more realism, so all of a sudden, all of the background effort that went into Senua’s Sacrifice became just easier, shorter, quicker, and they were able to get to that point faster.

“So if they were investing the same amount of time for a project, let’s just say 100 hours, and the mo-cap part took 80 for Senua’s Sacrifice, for Siren it took an hour.”

With motion capture becoming a seriously swift and painless process, developers will be able to utilize it more easily, and have more time freed up to hone visual details and animations, with the end result being far more realistic and immersive games (tying in, of course, with the ray tracing developments discussed above).

Adaptive sync

(Image credit: Nvidia)

Nvidia’s G-Sync and AMD’s rival FreeSync were introduced in the middle of the last decade, and are monitor technologies designed to combat screen tearing and stuttering.
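The difference these technologies make can be sketched with a little arithmetic: a fixed-refresh monitor only updates on its own schedule, so a frame that misses a refresh must wait for the next one (stutter) or be swapped mid-scan (tearing), while an adaptive sync display simply refreshes when the frame is ready. A rough illustrative model in Python (the frame times are invented for the example):

```python
def present_times(frame_times_ms, refresh_ms=None):
    """When each finished frame actually appears on screen. With a fixed
    refresh (e.g. 16.7ms for 60Hz), a frame waits for the next scan-out;
    with refresh_ms=None (adaptive sync), the display refreshes on demand."""
    shown, done, tick = [], 0.0, 0
    for ft in frame_times_ms:
        done += ft                    # time the GPU finishes this frame
        if refresh_ms is None:
            shown.append(done)        # adaptive: scan out immediately
        else:
            # wait for the next refresh tick at or after 'done'
            tick = max(tick + 1, -int(-done // refresh_ms))
            shown.append(tick * refresh_ms)
    return shown

frames = [12.0, 25.0, 12.0]         # GPU frame times straddling the 60Hz budget
print(present_times(frames, 16.7))  # fixed 60Hz: uneven on-screen pacing (judder)
print(present_times(frames))        # adaptive: pacing matches the GPU exactly
```

In this toy run, the fixed display shows the three frames one, three and four refreshes in, even though the GPU delivered them far more evenly – exactly the judder adaptive sync eliminates.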
In other words, they aim to give you a smoother gameplay experience, particularly with the likes of fast-paced shooters, where any such stuttering effects could put off your aim.

There’s one sentiment commonly trotted out online when potential buyers ask whether to get a display with either G-Sync or FreeSync, and that is adopters saying they could never go back to a normal monitor now they’ve experienced the joys of whichever flavor of ‘adaptive sync’ they’ve chosen.

When it came to picking out the right adaptive sync-powered monitor, this used to be a straight choice between AMD and Nvidia. AMD-powered screens have traditionally been cheaper because the tech is royalty-free and doesn’t require proprietary hardware in the monitor, so it doesn’t cost manufacturers anything extra to build in support.

Also, in the past, you needed an Nvidia or AMD graphics card for G-Sync or FreeSync respectively, and there was no cross-brand pollination.
However, that changed recently, when Nvidia made a move to allow its 10-series and newer GPUs to support adaptive sync on third-party monitors, including FreeSync models.

Even more excitingly for many gamers, these adaptive sync technologies are no longer confined to monitors. As 4K TVs have matured – and become considerably cheaper, which is certainly another boon this decade has witnessed – we’ve seen the introduction of Variable Refresh Rate (VRR) with HDMI 2.1.

This means you now get big-screen TVs which support adaptive sync; for example, LG introduced G-Sync to its range of 2019 OLEDs. Furthermore, this opens things up to consoles, too, as the Xbox One X supports VRR tech (though with the caveat that only some games benefit).
However, with next-gen consoles, we can expect a further push along these lines (we recently heard that the PS5 and Xbox Series X – as well as AMD graphics cards – might work with future Nvidia G-Sync monitors).

Smooth gameplay and fluid frame rates are as big a deal as realistic and immersive graphics, particularly when it comes to playing online games competitively. So for us, adaptive sync’s introduction – and its broader push for accessibility across PCs, big-screen TVs and consoles – represents a sizeable step forward for gaming in the 2010s.

Live streaming: Twitch (and more)

(Image credit: Twitch)

Back in 2011, Twitch was born, and quickly gained millions upon millions of viewers, all watching professional streamers play games.
The service really hit the big time in 2014, when Amazon bought it, and Twitch has gone from strength to strength since, with recent stats indicating the platform has some 15 million daily viewers (and 2.2 million broadcasters, to boot).

Live streaming video games isn’t just about Twitch, of course.
There’s also YouTube, Mixer (Microsoft’s effort, which is doing some serious poaching), and other options besides, plus streaming has become such a big phenomenon these days that the functionality is built directly into consoles.

Really, though, the rise of live streaming represents a cluster of technologies advancing together, alongside the creation of the actual platforms to stream on. Certainly one major part of the puzzle was the rollout of faster internet connections (in the UK, BT’s fiber products launched in 2010), which are vital to enable playing online and streaming that footage simultaneously.

And AMD’s hardware has also played its part in enabling live streaming, by introducing affordable multi-core CPUs with its Ryzen range – processors which are accessible to a mainstream audience, and capable of coping with the considerable multitasking demands of playing a game and streaming it fluidly at the same time.

All of which has led to the explosion in popularity of games as a spectator sport.
Being a pro streamer is now a potential career path for those able to blend expert game-playing with suitably entertaining and charismatic commentary. Such is the popularity of watching games that we’ve even witnessed esports being covered on TV by the likes of Sky as early as 2016…

Not only is it fun to spectate professional streamers, but it shouldn’t be underestimated as a powerful learning tool. For those serious about doing well at the latest competitive shooter, or RTS, or whatever – watching how it’s really done can be an incredibly informative experience.
Streaming has certainly revolutionized the concept of tips and tricks for popular games, and taken it to another level.

Game streaming

(Image credit: Google)

While live streaming is about playing a game and broadcasting that experience online, actual game streaming means you’re remotely playing a game which is hosted on a server somewhere else.

Also known as cloud gaming, this kind of service actually kicked off a surprisingly long time ago, with OnLive launching right at the start of the decade.
Ultimately OnLive flopped, because 2010 was simply too early given the internet infrastructure of the time, but with the same faster broadband connections that have helped drive the rise of live streaming, game streaming has become a realistic proposition.

Since the early days of OnLive, we’ve seen the likes of PlayStation Now, and multiple big-name players sprang up towards the end of the decade, including Nvidia’s GeForce Now and Microsoft’s Project xCloud, although those products are still in beta.

All of these streaming services have their different angles and approaches as to exactly how they work, but probably the most high-profile offering is Google Stadia. This was officially launched in November 2019, and as we noted in our review, it’s a “true console alternative and, in time, a potential platform killer”.
In our testing, we found Stadia to be very stable, and it allows for gaming across a range of devices – including Chromecast Ultra – plus the service streams in 4K HDR.

All that said, there are definite issues here, including that hardcore or competitive gamers won’t find the Stadia experience palatable compared to playing on a local machine, and those with slower broadband connections will struggle. You need a seriously fast (100Mbps+) net connection to get good performance, so in a way, you’re swapping worries about hardware spec for worries about the ‘spec’ of your internet connection.

Stadia has also been blighted by other launch bugbears, like a rather limited game library and missing features.
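It is worth pausing on why game streaming demands so much bandwidth in the first place. A back-of-the-envelope estimate in Python makes the point (the compression ratio here is an illustrative assumption, not a figure from Stadia or any other service):

```python
# Why cloud gaming is so bandwidth-hungry: a back-of-the-envelope estimate
width, height, fps, bits_per_pixel = 3840, 2160, 60, 24   # 4K, 60fps, 8-bit RGB
raw_mbps = width * height * fps * bits_per_pixel / 1e6
print(f"Uncompressed 4K60 video: {raw_mbps:.0f} Mbps")    # roughly 12 Gbps

# Modern codecs compress video heavily; this ratio is an illustrative guess
compression_ratio = 300
print(f"At {compression_ratio}:1 compression: {raw_mbps / compression_ratio:.0f} Mbps")
```

Even with aggressive compression, a 4K60 stream lands in the tens of megabits per second, and unlike a film, it has to be encoded and delivered with almost no buffering, since your inputs depend on what is on screen.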
But overall, we can’t ignore the promise that game streaming is now (finally) showing, particularly as we look towards a future of still faster broadband connections.

AI

(Image credit: Future)

Artificial intelligence (or AI) is another technology which has been driven forward massively over the past ten years. And while a lot of that progress might involve endless mentions of phrases like ‘machine learning’, ‘training models’ and ‘inferencing’, which may well threaten to send you into a deep (learning) sleep, AI applies to games as well as all this deadly dull-sounding stuff.

AI is a pretty broad umbrella, of course, and accomplishes all manner of things in video games. Your first thought might be about AI governing the behavior of NPCs (non-player characters) – we’ll discuss the nuances involved there later – or maybe even AI-driven ‘bots’ who can stand in as competition if human opposition isn’t available (or for training purposes).

You may have seen DeepMind hitting the headlines regularly in recent times.
The British firm made big waves when its AlphaGo neural network beat South Korean champion Lee Se-dol at the game of Go back in 2016. More recently, another of its AI programs, AlphaStar, proved itself capable of besting 99.8% of all human StarCraft II players, reaching the top ‘grandmaster’ tier. And then we’ve had the likes of the OpenAI engine beating Dota 2 world champs…

We can remember being impressed by the AI in Warcraft III, way back in 2002.
Having beaten the single-player campaign, engaging an AI opponent in multiplayer resulted in a quick shock defeat, and a brutal lesson that we were far from optimal when it came to our base-building and ‘creeping’ strategies (and that the computer was surprisingly good; far better than we were expecting).

Looking at where we are now, with AI engines overturning the very best human players, points to the possibility of an entirely new standard of AI to train against – so will these artificial intelligence chops spill over to make for much more cunning or evolved NPCs in the likes of single-player campaigns?

No is the short answer to that.
This is because what we typically think of as AI in single-player gaming isn’t really the true artificial intelligence that the likes of DeepMind or machine learning researchers truck in. It’s more of a fudge, an illusion of intelligence, with NPCs admittedly still making complex decisions – but based on cleverly crafted behavior trees, as opposed to ‘true’ AI or self-learning processes.

As an illuminating article by The Verge points out, NPC actions actually require a certain degree of predictability, to ensure that the narrative and overall game experience flows along the lines of what the developers have in mind for the player.
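To make the distinction concrete, here is a minimal behavior tree sketch in Python – a hand-rolled toy, not any particular engine’s implementation. The NPC’s ‘decisions’ are just hand-authored priorities, which is exactly what gives designers the predictability described above:

```python
class Sequence:
    """Run children in order; fail as soon as one fails."""
    def __init__(self, *children): self.children = children
    def tick(self, npc): return all(c.tick(npc) for c in self.children)

class Selector:
    """Try children by priority; succeed on the first that succeeds."""
    def __init__(self, *children): self.children = children
    def tick(self, npc): return any(c.tick(npc) for c in self.children)

class Condition:
    def __init__(self, test): self.test = test
    def tick(self, npc): return self.test(npc)

class Action:
    """Record the chosen action; a real game would play an animation here."""
    def __init__(self, name): self.name = name
    def tick(self, npc):
        npc["action"] = self.name
        return True

# Hand-authored priorities for a hypothetical guard NPC
guard = Selector(
    Sequence(Condition(lambda n: n["health"] < 20), Action("flee")),
    Sequence(Condition(lambda n: n["sees_player"]), Action("attack")),
    Action("patrol"),
)

npc = {"health": 100, "sees_player": True}
guard.tick(npc)
print(npc["action"])  # attack
```

Because a designer wrote the priority order, the guard’s behavior looks intelligent but is fully predictable – quite unlike a self-learning agent in the DeepMind mold.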
A true AI which could make all sorts of weird and wonderful decisions of its own volition would make for a far more unpredictable game, but this wouldn’t deliver a satisfying experience. To pick a basic example, an NPC might decide not to perform a critical action which is needed for the player to complete a quest.

In short, the game world would be a far more chaotic place, and that might be a lot more like the real world – but it wouldn’t be much fun (and could maybe even break the game).

The Verge suggests that rather than trying to create freethinking NPCs in games, AI will be involved in automated game design, facilitating the creation of art assets or levels.
And eventually, from a broader perspective, an AI could serve as a ‘games master’ that changes things on the fly to better suit the player, based on what it predicts they might enjoy given what they’ve done and how things have gone so far.

We’ve already seen artificial intelligence spread its tendrils into all manner of different facets of the gaming world. For example, DeepMind is now helping you choose which games – and apps – you might want to download from the Google Play store.

AI is also helping to revamp the graphics of old games, with modders using the technology to upscale the visuals of classic titles at speed, such as giving Morrowind a fresh coat of paint.

In the nearer term, though, competitive gamers might be set to benefit from AI the most, in terms of the training possibilities we’ve already mentioned.
Imagine an AI ‘game assistant’ hoovering up data as you play a shooter, analyzing every element of your play-style, comparing that to data crunched from top pro players, and suggesting ways you could improve, or pointing out bad habits that are getting you killed.

And from a wider perspective, away from multiplayer or esports, this sort of analytics running across an entire player base, maybe coupled with the aforementioned AI games master at the helm, could usher in a kind of evolving game which constantly adjusts and hones itself over time.

Looking at the progress of AI over the last decade, this sort of experience could easily be in the pipeline, and if that isn’t an intriguing prospect, we don’t know what is (just don’t start thinking about how this is, of course, wandering further down the privacy-stripping path of data mining).