Early Impressions of the 'AI People' Alpha | AI and Games Newsletter 27/11/24
Sponsored by Little Learning Machines
The AI and Games Newsletter brings concise and informative discussion on artificial intelligence for video games each and every week, plus a summary of all the content we’ve released across our various channels, from YouTube videos to episodes of our podcast ‘Branching Factor’ and in-person events like the AI and Games Conference.
You can subscribe to and support AI and Games on Substack, with weekly editions appearing in your inbox. The newsletter is also shared with our audience on LinkedIn. If you’d like to sponsor this newsletter, and get your name in front of our audience of over 6000 readers, please visit our sponsorship enquiries page.
Hello all,
Tommy here and welcome back to the newsletter. I feel like this is the first issue that’s close to what we might call ‘normal’ around here for a little while, given the recent conference and now the Kickstarter campaign. This week we’re going to be talking a little about AI People: a game currently in alpha that is essentially trying to use generative AI technologies, notably large language models (LLMs), to make a ‘Sims’-like game. I’ve been checking out the alpha playtest and will share some of my experiences later on in the issue.
But also this week we have the first of both my and our readers’ choices for Game of the Year 2024, plus we dig into the AI stories in the news and try to shed some light on them.
With Thanks to Our Sponsor:
Little Learning Machines
You might recall that earlier this year, the newsletter was sponsored by the good folks over at Transitional Forms, whose game Little Learning Machines was the subject of a case study episode. They’re back today to sponsor this issue of the newsletter!
Discover a world where you can nurture and guide real AI companions, shaping their personalities and abilities—no coding required. In Little Learning Machines, every decision you make helps your robots grow as you guide them through charming islands filled with exciting challenges. Customize your companions, train them in real-time, and unlock endless possibilities to level up your AI training skills.
With the Steam Autumn Sale in full swing, get your copy at 40% off! Now with 18 additional islands, fresh quests, and charming new outfits for your robots.
Little Learning Machines will be on sale until December 4th, 2024.
Announcements
Some quick AI and Games related announcements before we get into the news and other stories of the week!
Goal State is funding Stretch Goals
As mentioned in last week’s Digest, Goal State has successfully been funded on Kickstarter, with hundreds of pledges coming in to help make this project a reality. A huge thank you once again to everyone who has supported me in getting this off the ground!
We have a stack of stretch goals for the Kickstarter live now - see the article linked below. But the first three coming up are shown in the diagram above, with tutorials in popular game engines Unity and Unreal, plus the first of four planned source code deep dives into classic games. Starting with the godfather of first-person shooters: DOOM!
Plus tune in later this week as we announce the next phase of the crowdfunding, and how Pledge Goals will add even more value for people interested in all things AI and Games.
Paid Memberships 40% Off Until Xmas
Speaking of financially supporting AI and Games, a huge thank you to everyone who has sponsored this here newsletter in the past year. It’s nice to know people value this stuff enough to throw money at it! On that note, we’re running a discount on annual subscriptions until the end of the calendar year. So yeah, click here and get yourself a paid membership.
As a reminder of what a paid membership offers:
Access to our supporters-only corner of the AI and Games Discord Server.
Members-only issues of the newsletter, including the sponsor newsletter and the monthly digest.
Ad-free episodes of the Branching Factor podcast.
Access to the ‘archive’ of old case studies on the site.
Bragging rights, lots of bragging rights.
Thanks once again to everyone - both paid and free - for your support in 2024. It’s been a wild year, and I look forward to doing it all over again in 2025!
GOTY Submissions Still Open
Starting with today’s issue we’re announcing some of my and our readers’ selections for 2024 Game of the Year. Go and check out my musings below, plus two reviews from our Discord community. If you want to share a few words about your favourite game of 2024 - a game that you played this year, regardless of when it originally came out - then head over to the Discord server’s #gaming-grove channel, or message me below.
AI (for Games) in the News
Arcane’s Banner Image Pulled for Generative AI Concerns
As reported on Eurogamer, a Twitter user pointed out some weird issues in the hands featured in the above image: a promotional banner on Netflix for the League of Legends show Arcane. The hand at the far right of the image appears to have been tweaked with generative AI. Franchise owner Riot has since had the image taken down, with brand lead for Riot Games and Arcane, Alex Shahmiri, thanking viewers for spotting the issue. Shahmiri later stated that Riot has a “strict stance of no ai for anything relating to arcane cause it's disrespectful to the incredible artists who worked on the show.” While it’s unclear where the fault lies, Shahmiri’s comments suggest that Netflix perhaps put the image through an AI tool to extend it so it would fit their UI.
Any Arcane fans reading the newsletter? I’ll concede I’ve not got around to it given I have zero knowledge of League. But Castlevania: Nocturne wasn’t bad!
Itch.io Now Requires Developers to Tag (Generative) AI
As generative AI continues its march into the games industry, and while we are seeing some interesting explorations of the technology (I mean, keep reading, it’s this week’s story), the bulk of conversation - and the deserved frustration - surrounds the use of generative art. Developers of all shapes and sizes are slipping that stuff into their games, and some players are actively seeking it out, or outright avoiding it. Plus of course, there are the legal implications of being held responsible for selling a game that uses this technology and its outputs.
So digital distribution platform itch.io is now adding a requirement that all developers tag their games to make clear whether they use generative AI or not. This has, judging by the announcement in the forums, been well received by the community, but interestingly we’re seeing some meaningful discussion of what comprises ‘AI’ within game development. It’s clear also that itch will continue to evolve this feature in the future.
Pokémon Go Players Help Train Geospatial AI Models
Remember when Pokémon Go was the biggest thing on the planet? And people were wandering around everywhere, even committing the odd trespass or two, all in pursuit of that 50th Psyduck? Well, it’s perhaps no surprise that when you start thinking about all the data those players generate, you realise you could do something interesting with it.
In fact, over the years the positional data - alongside the realtime images scanned as you try to capture your next Snorlax - has been used to train millions of artificial neural networks that can understand local geopositional data, effectively making a 3D representation of a real-world space. The next step was to expand that into what is referred to as a Large Geospatial Model (LGM) that consolidates all of that information: essentially a mechanism to build a 3D representation of geographic locations all over the world. An impressive and somewhat scary feat.
Scary, and even a little invasive, but Niantic themselves were quick to clarify that they didn’t simply scrape data off of all players of their games. Rather, the data used for these models is curated from players scanning publicly accessible locations and then sharing those scans with the developer.
That said, at the recent Bellingfest event hosted by investigative journalism group Bellingcat, Niantic’s Senior VP of Engineering Brian McClendon faced questions over whether the company would allow governments and militaries to purchase use of their model for their own purposes. It’s one thing scanning the local church because there’s a Dialga in the current raid, but selling that data so someone can figure out how to blow the church up sounds like a real problem.
Unsurprisingly, his answer was a little evasive.
PlayStation 5 Pro’s PSSR Isn’t a Magic Bullet for Fancy Graphics
So the PlayStation 5 Pro released to something of a muted response a couple of weeks back. It’s a very expensive device for not an awful lot of bang for your buck. A point I made back when they announced it earlier this year.
But one of the big selling points is PSSR, PlayStation Spectral Super Resolution: an AI supersampling technology that helps upscale graphics on the PS5, similar to what NVIDIA’s DLSS-powered GPUs are capable of. But recent headlines help remind everyone that introducing the technology alone does not magically ensure a boost in visual fidelity: it requires work by the developers to support it and get the best out of it. As reported at VGC, several PS5 titles identified as ‘PS5 Pro Enhanced’, including Silent Hill 2 Remake and Star Wars: Jedi Survivor, actually wind up with poorer visual quality as a result.
Both games are promising patches to address this, with Silent Hill 2 already receiving one at the time of writing. For more insight check out Digital Foundry’s analysis of Jedi Survivor’s performance on PS5 Pro.
Meet F.A.C.U.L.: The Revolutionary FPS AI Companion
This one somehow escaped my notice until about a week ago, but this is a project Tencent are working on as part of their live service extraction shooter Arena Breakout Infinite - a game that I’ll concede I’d never heard of until now. It’s currently in early-access on PC, and is a big-budget reimagining of Arena Breakout, which is a popular mobile game.
In the video, we see a collection of non-player characters supporting the player, and responding to their voice commands. Now if this works as advertised it is quite a fun addition, and something that has long been explored in games. One only needs to cast their mind back to the likes of 2008’s Tom Clancy’s EndWar and 2012’s Binary Domain on the Xbox 360, which tried to have NPCs react to voice commands from the player. Both were fun ideas, but not really something that actually worked - especially if you’re Scottish!
What’s interesting about this demonstration is that it shows voice commands interfacing with pre-built AI behaviours in the game. So telling a character to investigate a particular area of interest triggers the existing AI code to execute that action. That said, I won’t be convinced until I actually try this sort of thing myself. For one, the voice work is clearly re-recorded after the fact (it’s too clean), and there are specific elements that either don’t work as intended (telling an NPC to follow the player upstairs, only for it to take the lead) or seem too clean in execution - after all, telling the NPC to check out ‘that red car’ in a map no doubt full of red cars requires extensive pre-processing of that query. But nonetheless, I could see this being a viable option for tactical shooters with NPC squad mates, as a way to get them to move into position. Something that a future Rainbow Six or Ghost Recon game could exploit.
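To make that idea concrete, here’s a speculative sketch of the first half of such a pipeline: a speech-to-text transcript being matched to an existing squad behaviour. The intent names and phrase lists are invented for illustration - Tencent hasn’t published how F.A.C.U.L. actually does this, and an LLM may well be doing the matching instead of simple phrase lookup.

```python
# Speculative sketch of voice commands driving pre-built behaviours.
# Intent names and phrases are invented; this is not Tencent's pipeline.

INTENTS = {
    "investigate": ("check out", "investigate", "look at"),
    "follow":      ("follow me", "stay with me", "on me"),
    "hold":        ("hold position", "stay here", "wait"),
}

def match_intent(transcript: str):
    """Map a speech-to-text transcript onto an existing AI behaviour."""
    text = transcript.lower()
    for intent, phrases in INTENTS.items():
        if any(phrase in text for phrase in phrases):
            return intent
    return None  # unrecognised: the squadmate keeps its current behaviour

# e.g. match_intent("Check out that red car") -> "investigate"; the game
# AI must then resolve "that red car" to a world object and trigger its
# existing investigate behaviour - which is the genuinely hard part.
```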
Game of the Year 2024
As the year begins to come to a close, it’s time to start talking about some of the best games released, or simply played, these past 12 months. I’ll be sharing with you my top 10 of the year over the next few weeks, plus we have some insights from our readers as well!
Tommy’s #10: Still Wakes the Deep
The Chinese Room, PC / PS5 / Xbox Series S|X, 2024
Still Wakes the Deep sees Brighton-based developers The Chinese Room return to their roots of story-driven first-person psychological horror with a bleak but interesting setup. Cameron ‘Caz’ McLeary is a man on the run, and has fled to an oil rig off the coast of Scotland, working as a contract electrician until the heat dies down. But now the police are the least of his worries, as the recent drilling operation has uncovered something foreign, something unknown, from the depths of the ocean. And if they don’t do something about it, it could threaten all life on earth.
Those familiar with The Chinese Room’s previous works, notably Amnesia: A Machine for Pigs, and Everybody’s Gone to the Rapture will be right at home with Still Wakes the Deep. It’s a little rough around the edges at times, be it with the climbing mechanics or the design of some of the stealth sequences as you hide from monstrous entities, but what helped cement it in my top 10 this year was something simple yet complex: it’s arguably the best representation of Scottish people I’ve ever seen in a video game.
Last year over on VGIM, I appeared in the GOTY round-up discussing Venba by Visai Games: a wonderful story-driven puzzle game that challenges players with learning how to cook dishes of the protagonist’s homeland of India, with recipes and language lifted from Tamil culture. It’s a wonderful example of how to introduce people to South-Asian culture and cuisine. As someone who has many friends from that corner of the world, I loved getting to see their culture and their stories shown in a contemporary video game. It was subsequently nominated for numerous awards, winning at the likes of the IGF, the Game Developers Choice Awards, and the BAFTAs - all very much deserved in my opinion. As I said on VGIM last year, it was perhaps the most profound and wonderful 2 hours of gaming I had in 2023.

Still Wakes the Deep isn’t as rewarding as that, nor is it quite as culturally enriching. But it really hit home for me personally, as Scottish accents and culture are seldom reflected in media in a way that I would consider remotely authentic. They’re often exaggerated for dramatic - or comedic - effect, something I think Scotland shares with the likes of South or East London, or even Texas in the US. It’s the first time I heard two characters converse with one another in a game in a way that not only felt genuinely Scottish, but was clearly evocative of the part of the world that I come from.
The game is laced with small references to real-world locales: not only does protagonist Caz originate from a region of Glasgow I know well, but the ‘incident’ he’s running away from occurred about a 10-minute walk from where I used to live. The use of specific Scottish (and other British) dialects and phrasing adds an authenticity to the whole thing that I adored. Sure, that meant some of the dialogue is not for the faint of heart - I had a friend tell me they’d never heard that much foul language in the first 10 minutes of a video game before. But dare I say that’s where I come from, and that’s how many people speak to one another. Sure, the game takes place about 10 years before I was born, but the patter was still largely accurate.
The black humour as everything goes to hell, the panicked conversations laced with sarcasm and cuss words, the grounded way in which every crisis was just another thing to overcome. All of it made me care for the characters in a way that I haven’t felt in a game for a very long time. Because they sounded like people I knew growing up. It speaks to the power of authentic representation in media. Sure, as a Scotsman I readily concede we don’t come out of this game sounding like poets, but it’s not often I can think of a game where I cried both from laughter and sorrow in quick succession.
From the AI and Games Community
But of course it’s not just about me. I wanted to hear what your big games of the year were. Check out some quick reviews below, and if you want your words to appear on the newsletter, head to the #gaming-grove channel on the Discord, or drop me a message.
Pacific Drive
Ironwood Studios, PC / PS5, 2024
Review by Elbi on Discord
I looked through my list [of games played this year], and Pacific Drive still stands out. For one because it's actually from 2024. But also because its Roguelike-ness frequently had me feel stressed, yet never to the point of paralysis. I was truly surprised how much I - a full-time pedestrian - cared about this car, and how much it felt like mine, with its quirks and loadouts and rituals. Add to that an interesting world full of STALKER/X-Files-esque oddities, and I had a great time just vibing to the radio.
Warhammer 40,000: Space Marine 2
Saber Interactive, PC / PS5 / Xbox Series S|X, 2024
Review by Skupe on Discord
Space Marine 2 does [Warhammer] 40k really well. It has the scale right. Looks and sounds amazing. It's maybe an 8/10 game, but being 40k gets it another 2 points. Watching that swarm of 'nids turn from a background prop into a swarm of enemies is just👌
Early Impressions of the ‘AI People’ Alpha
Back in September, registration opened up to participate in an alpha playtest for a little-known game called ‘AI People’: a game that in many ways is heavily inspired by EA’s The Sims franchise, but focusses entirely on the interaction between the player and numerous non-player characters (NPCs). As we’ll discuss in a moment, it does this by utilising contemporary AI technologies, notably Large Language Models (LLMs), and fitting them into traditional game AI architectures to make these NPCs more believable and interactable.
The Game’s Origins
AI People is developed by GoodAI: a startup founded in 2014 by Marek Rosa. GoodAI’s mission is to “develop safe general intelligence - as fast as possible - to help humanity and understand the universe”. No easy task! While he is founder, CEO and CTO of GoodAI, Rosa is largely known as the founder of Keen Software House, the creators of the popular sandbox simulation game Space Engineers. His interest in AI led to him investing $10 million of his own money into the founding of GoodAI.
This setup is, I would argue, somewhat emblematic of how things are evolving in the world of generative AI for games. When the first wave of generative AI companies with an interest in games emerged, they often came with little prior experience of the challenges and processes of game development. But here you have a game developer who’s migrated into AI research, and then built a game to showcase their developments. GoodAI are not the first, and as we saw in my recent conversations with Jeff Orkin of BitPart AI, they won’t be the last in this ongoing trend.
It’s worth stressing that, as noted by the development team, AI People is very much in its earlier stages. A ‘transparency notice’ on the platform’s website reads as follows:
We want to be fully transparent: AI People is currently in early alpha. If you're not comfortable with the idea of playing an early version with potential issues and limitations, we encourage you to wait for future updates. Your enjoyment is important to us, and we understand that some players may prefer a more polished experience.
The game has received several updates since launch, most notably the removal of a 10-minute time limit that was imposed on each play session. I suspect this was originally a means to stagger server resources. Why is that important? Well, let’s talk about what AI People is actually doing.
How it Plays
The AI People alpha has a variety of scenarios for players to explore. In Quick Start, you find yourself in a field with numerous items and objects to interact with, each of which can have some impact on the world around it. Plus there are a handful of non-player characters also in the space, doing their own thing. Outside of Quick Start, you can either craft your own scenario, or rely on one built by the community of players.
The gameplay itself is rather freeform. While there are some high-level objectives in each scenario, the gameplay is designed to be emergent. Each NPC is free to go about their business and can interact with the player when they choose to, while also listening to the player’s text or voice-based input to respond to their questions or take action within the world. To get a sense of the gameplay, I recorded a few minutes of my time in the Quick Start mode and shared it below.
How Does it Work?
The specifics of the technical implementation have not been made public, but what we can glean from the public releases, alongside a talk delivered by Rosa at the AI and Games Summer School in 2023, is that it’s a combination of traditional game AI systems and Large Language Models.
One aspect of LLMs that has proven interesting - though quite often overblown - in the past few years is that they’re capable of becoming reasoning engines within a defined space. Context-based learning and processing can occur with an LLM if it is given a sufficient amount of information about the problem. This is achieved through a combination of factors: training the LLM to understand the problem space, refining its understanding of the narratives or themes you want through human-driven feedback, and, on top of that, tailoring the model to use custom input and output formats through strong prompt engineering.
And this is the secret to how AI People works: the game feeds the LLM a whole bunch of information about the current game state - what is in proximity of each character, what the player has said or done, the recent action history of the game - and from that it attempts to infer what the characters should do. The resulting output of the LLM is then translated into action, which a traditional game AI system running under the hood manages and executes. But ultimately, in order for the characters to really ‘think’, there’s always a small delay as information passes from the game state and the player up to the language model.
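As a rough illustration, here’s a minimal sketch of what such a decision loop might look like. Every name here is an assumption - GoodAI hasn’t published AI People’s implementation - but the shape of it (serialise state, prompt the model, validate the structured output, hand it to the game AI layer) follows what’s described above.

```python
import json

# Hypothetical sketch of an LLM-driven NPC decision loop; none of these
# names come from GoodAI, whose implementation is not public.

ALLOWED_ACTIONS = {"goto", "pick_up", "talk", "idle"}

def build_prompt(npc_name, game_state, player_utterance):
    """Serialise the relevant slice of game state into the LLM's context."""
    return json.dumps({
        "npc": npc_name,
        "nearby_objects": game_state["nearby_objects"],
        "recent_events": game_state["recent_events"][-5:],  # keep context small
        "player_said": player_utterance,
        "respond_with": 'JSON: {"action": ..., "target": ..., "dialogue": ...}',
    })

def decide(npc_name, game_state, player_utterance, query_llm):
    """Ask the (remote) LLM for a structured decision, validate it, then
    hand it to the traditional game AI layer for execution."""
    raw = query_llm(build_prompt(npc_name, game_state, player_utterance))
    try:
        decision = json.loads(raw)
    except json.JSONDecodeError:
        return {"action": "idle", "dialogue": None}  # malformed output: do nothing
    if decision.get("action") not in ALLOWED_ACTIONS:
        return {"action": "idle", "dialogue": None}  # reject verbs the game can't run
    return decision
```

The validation step matters: anything the model suggests that the game can’t actually execute has to be caught before it reaches the behaviour layer, and the round trip to a remotely hosted model is where that small ‘thinking’ delay comes from.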
Naturally, this leads to an additional consideration, in that the game needs to connect to an LLM that’s being hosted online in order to run. So even as alpha players, you have to pay for the resources you use while playing the game. When you sign up, your paid monthly subscription allocates you 1000 ‘tokens’ per month, which per the FAQ equates to around 15 hours of gameplay. I’ve not come anywhere near using my initial allocation, with around 70% remaining after a couple of hours of play. This speaks to a larger challenge for games like this: they need to be monetised not just to cover the overheads of running AI models in data centres, but to actually make a profit!
And so as advertised, you can do whatever you like in the space. You can attempt to befriend an NPC, try to romance it, try to hurt it, and both that character and others around them will react. But also they are aware of the broader narrative surrounding their current situation and try to roleplay within it.
A Sense of Space
One aspect that AI People addresses that has bugged me in almost every single generative AI demo of the past year or so, is that the characters have some basis of understanding of the world, and that they can move around within it. It was a real bugbear of mine when playing Inworld’s Origins demo last year, and something that only a handful of Convai’s demos had addressed at the time. A non-player character that can converse with the player is not particularly interesting (to me) if it doesn’t have some semblance of understanding of the world and how to interact with it.
It’s not enough for a character to converse with a player. If an NPC is to feel like it’s part of the same world, then a player would expect that a character that can hold a conversation should also be able to recognise, approach, and interact with objects around them - or at minimum acknowledge their immediate surroundings and have at least a basic level of spatial awareness.
The reason you don’t see that in a lot of generative AI demos is that it’s not a generative AI problem, it’s a game AI problem. Traditionally, you need to ensure each object in the world is tagged with a unique identifier, that the NPC can determine which objects are interactable, and that it can differentiate objects from one another. Then of course there’s the navigation element. In games we rely on navigation meshes for movement - a technology that’s been in games for over 20 years, but with no clear successor at this time (see the video above). So the game needs a navmesh built into it, and then if the player tells the NPC to walk over to an object, a backend game AI system (like say a behaviour tree) needs to process the request and use the navmesh to compute a path to the object.
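Here’s a toy version of that plumbing, purely for illustration: tagged objects are filtered against a parsed request, a target is chosen, and a path query (standing in for the navmesh) produces the movement the behaviour layer would then execute. All the names here are invented, not drawn from AI People.

```python
from dataclasses import dataclass

# Illustrative only: a toy version of the tagging-plus-navigation plumbing
# described above. A real engine would query a navmesh; find_path stands
# in for that here.

@dataclass
class WorldObject:
    object_id: str
    tags: set        # e.g. {"car", "red", "interactable"}
    position: tuple  # (x, y, z)

def handle_command(npc_position, world, required_tags, find_path):
    """Turn a parsed request like 'go to the red car' into a navigation
    action plus an interaction for the game AI layer to execute."""
    candidates = [o for o in world if required_tags <= o.tags]
    if not candidates:
        return None  # nothing matched: the NPC should say it can't comply

    # Pick the nearest match; real games disambiguate far more carefully.
    def dist_sq(obj):
        return sum((a - b) ** 2 for a, b in zip(npc_position, obj.position))

    target = min(candidates, key=dist_sq)
    return {
        "path": find_path(npc_position, target.position),  # navmesh query
        "interact_with": target.object_id,
    }
```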
AI People begins to address this, in that characters can move around the world and respond to interaction requests. If you ask them to go and pick up a specific object, they can parse the command into a navigation action and an interaction. It’s an easy win in my book, just to have NPCs that look like they actually exist in the world by knowing how to navigate it. Naturally, that doesn’t mean it’s easy to implement, and I respect there were no doubt numerous annoying challenges faced in making it a reality. But asking a character where to find an object, and having it then retrieve that object for me, was a satisfying interaction.
I wouldn’t say this is entirely resolved though. Having characters wander around and explore the world across several playthroughs proved somewhat interesting, but it’s clear that the environments are not tagged sufficiently for them to understand how to operate with everything in proximity, nor are their sensory systems built to immediately collate and recognise similar objects. Plus there are common things one would expect to achieve with these objects that aren’t attainable at this time.
For example, in the scenario shown above, we’re in an apartment and the characters are worried about their supplies for surviving whatever commotion is happening outside. Be it a riot or zombie apocalypse. Sarah was left to conduct an audit of the food supplies, and while she found four apples by picking them up - mostly from the bowl of apples behind her - she couldn’t see the cooked food on the counter, or the bread, cheese, and banana. Plus I had asked Sarah to find all of the food she could, and then leave it on the kitchen table. I don’t think she can do that at this time. My repeated attempts to ask her proved fruitless - no pun intended.
And don’t worry if it sounds like Sarah is getting a hard time: I asked Connor - the other NPC in this scenario - to log onto the computer to try and communicate with someone outside the apartment, but he apparently couldn’t figure out how to do that, and came back with a book on survival strategies that he just happened to have lying around. Thanks Connor.
A Shot of Agency
Much like The Sims, the characters in AI People are free to wander around and do their own thing. They operate within the bounds of the fiction, and often communicate via thought bubbles what their current interests or objectives are. This flavour text helps give some indication of what the NPC’s current objectives are, albeit transposed through the lens of a virtual character to give it some emotional value.
But while the characters move around and do their own thing, they’re so subservient to the player’s experience that they lack any real sense of identity. There’s perhaps an implication of some global objective set by the scenario, but I seldom see these characters move towards it without my guidance. In the field scenario I encouraged the characters to help me figure out what I was supposed to do, and while they would provide exposition on what the objectives are, they would subsequently wait for me to act, rather than show me how to do it myself.
It felt like the blind leading the blind, rather than these characters encouraging me to enter the space and learn how to interact within it alongside them. While of course as the player we want to have some sort of control over the scenario, I struggle to feel like I’m part of an ecosystem where these characters feel ‘alive’ when they’re clearly subservient to my interests.
Personally I’d like to see a stronger sense of personality as well as agency in these characters, such that I can begin to understand how to work with or against them in the context of the scenario. Rather than feeling like I’m handling the strings of multiple semi-autonomous puppets at once.
On top of these agency issues is the need for characters to communicate to the player when they cannot do something. This feels like an interesting problem to solve, given the check would have to sit somewhere between the original message being sent and it being parsed by the LLM - I doubt the language model knows the state of the implemented gameplay mechanics. Getting LLMs to respond by saying they don’t know something requires significant additional work, given they’re statistical AI models: if they don’t know something, they just make it up (or as some people call it, hallucinate). Knowing when not to respond is an equally interesting thing to address. It’s a common challenge with a lot of these NPC conversation systems, in that it’s difficult to get them to shut up!
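One plausible shape for that check - and I stress this is a sketch under my own assumptions, not how AI People does it - is a guard sitting between the model’s parsed output and the game, falling back to a scripted, honest admission rather than letting the model improvise an answer:

```python
# Sketch of surfacing "I can't do that" to the player. IMPLEMENTED and
# the action names are assumptions, not AI People's actual API.

IMPLEMENTED = {"goto", "pick_up", "talk"}

def execute_or_explain(decision, execute, say):
    """Run supported actions; admit honestly when one isn't supported."""
    action = decision.get("action")
    if action in IMPLEMENTED:
        return execute(decision)  # hand off to the game AI layer as normal
    # Scripted fallback: the NPC admits the limitation instead of the
    # model hallucinating a result for a mechanic that doesn't exist.
    say(f"I'm not sure how to {action or 'do that'} yet.")
    return None
```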
A Need for Direction
For me right now the thing that AI People is lacking is a sense of purpose. Understandably, the alpha is focussed on having you interact with these NPCs, to showcase their ability to not just do as instructed, but converse with the player, engage with the ongoing events happening within the game world, and potentially create some drama. But for me personally, I need some more structure to these interactions in order to make them worthwhile.
A common complaint I read online is that building games with NPCs that can converse with the player for hours is not an interesting game concept. Just because they can communicate like a real person doesn’t mean you would want to. It’s a thesis I largely agree with, unless you can find ways to make that interaction interesting. Going up and asking every NPC in Skyrim about their history and knowledge of Tamriel sounds incredibly dire to me. I’ll concede that conversing at all with an NPC doesn’t sound particularly interesting unless you give it structure and purpose.
If it’s clear to me that the character is being evasive or distant - that there is something underneath it all worth discussing - then that sounds like something you can, for lack of a better term, gamify. This is something that Ubisoft’s NEO demo did at GDC this year, in that you built relationships with these characters over time in a way that was rather ‘videogame-esque’. I think that needs to be done in situations like this, because in real life we rely on the choice of words, the tone of voice, and body language to get a sense of how people think. And none of that can be done effectively even using generative AI. It’s why game AI has spent years building approaches to communicate these ideas to players - because it’s hard to do!
AI People Has Potential
Despite what might sound like a rather lengthy list of issues I’ve raised with AI People, I do see the potential of the game in the long run. I feel it just has a lot of work ahead of it in order to build something that is as flexible and as engaging as it hopes to be. But critically, the core of the experience - of actually conversing with NPCs and getting them to act - seems largely in place?
I think the larger issues are in building structure around these interactions, and then giving NPCs the freedom to act within it. These are largely issues of content and scale, and it will require more work on the part of the developers to recognise all the ways in which people interact with the game, and to build up a broader library of NPC actions, environmental interactions, and sensory perception for use in context. I don’t think that’s impossible. Rather, it’s the ‘easier’ part, given it’s a game design problem we’ve explored in many situations before, ranging from The Sims to Scribblenauts.
But one aspect we’ve yet to see, and hopefully will be explored in future updates, is handling more complex scenarios and narratives. The scenarios we see in the demo are frankly very lightweight in terms of narrative structure or detail, and it strikes me as an interesting technical challenge to have them interface with the broader narrative and the actions that the player is making within it. So no doubt I’ll be coming back to check out AI People at some point in the future.
You can check out AI People yourself and register to join the alpha at: https://www.aipeoplegame.com/
Wrapping Up
And there we have it, finally things are returning to normal around here. Thanks for tuning in to this week’s issue of the AI and Games Newsletter, I hope you enjoyed it! We’ll be back next week with the Sponsor Newsletter, but before the paywall we’ll have even more GOTY 2024 entries to check out. Until then, take care folks!
Thanks once again to Transitional Forms for sponsoring this issue. Don’t forget that Little Learning Machines will be on sale until December 4th, 2024.