Crowdfunding & Conferences | AI and Games Newsletter 13/11/2024
In the sweet spot between the two biggest things AI and Games has ever done...
The AI and Games Newsletter brings concise and informative discussion on artificial intelligence for video games each and every week, plus a summary of all our content released across various channels, from YouTube videos to episodes of our podcast ‘Branching Factor’ and in-person events like the AI and Games Conference.
You can subscribe to and support AI and Games on Substack, with weekly editions appearing in your inbox. The newsletter is also shared with our audience on LinkedIn. If you’d like to sponsor this newsletter, and get your name in front of our audience of over 6000 readers, please visit our sponsorship enquiries page.
Hello all,
Welcome back to the newsletter. This week’s issue is a little later than usual, because we had some other content to drop earlier today in the form of our final update on Goal State prior to its launch on Kickstarter on Friday November 15th. But it’s also been a funny old week. It’s been mere days since our first ever AI and Games Conference, and of course the crowdfunding campaign kicks off mere days from now. So it’s safe to say I’m busy, but it’s also been a very emotional and rewarding period.

For this issue, we’re going to talk a bit about the AI and Games Conference, the Goal State Kickstarter, and my recent visit to Sheridans’ Online Safety Summit. AI in the news has been surprisingly quiet this week, at least to my knowledge. I suspect much of that is down to the recent US presidential elections: they consume so much of the news cycle that announcing anything significant isn’t going to get the traction you want if you’re a big tech firm or games studio.
Alrighty, let’s do this!
Announcements & Updates
Not a lot to announce this week, but we have some events, and also GOTY!
AI and Games - Game of the Year 2024
A couple of weeks back I opened the call for the AI and Games GOTY 2024. I’ll be working my way through my favourite games from this epoch of the Gregorian calendar. But we’re also looking to hear from you about what your favourite games have been this year.
We’re only looking for a couple of paragraphs. Plus it’s worth saying that your GOTY 2024 doesn’t have to be a game that was released in 2024 - otherwise I’d have little to talk about.
Drop me a message in the AI and Games Discord (we have a thread running in the #gaming-grove chat), or via the messaging app here on Substack to be considered for publication in December!
Events
For those who missed out on the fantastic Next Level event that ran at King’s College London last month, a write-up of the event is now available on the KCL website.
Surely I’ve done enough events this year? Surely!?!? Well, not quite. I’ve deliberately kept my calendar clear after the AI and Games Conference because I think it’s safe to say I might be tired. However, I just agreed to be a speaker at Game Developers Session (GDS) which runs from December 13th to 14th in Prague, Czech Republic. This will most definitely be my final speaking engagement of the year. But I’m very much looking forward to it!
On that note, if you’re running an event in 2025, I’m booking speaking engagements now and happy to discuss if you’re interested in having me present.
Ok, enough shameless self-promotion. Time to talk about the event I ran, that… has my name all over it…
AI and Games Conference 2024
I’m still in something of a daze, quite frankly. The last couple of months have been a real emotional journey, getting this event up off the ground and making it happen. But seeing it all come together across Thursday and Friday made me realise how much it was worth it.
In short, it was fantastic. I’m so proud of the work our team has done in putting this together: our organisers, our sponsors, our speakers, and of course everyone who travelled down to London to be a part of the day. One of the key pieces of feedback I’ve received, both written and anecdotal, was the quality of our speakers. The talks were of a very high standard, and that’s something my team worked very hard on, reaching out to our contacts in the industry while also selecting from our open call for submissions. The quality of submissions was also exceptional, which made deciding which talks to accept very difficult.
Some quick announcements with regards to the conference:
A formal round-up, including a video vlog, will be available on the AI and Games YouTube channel in the coming weeks.
All talks have been recorded and we are building the final presentation versions for release in early 2025.
We will be returning for a 2025 iteration, and will have more to say on that early next year.
You’d think today I’d be writing up the post-mortem on this, but no, given we have Goal State happening this week. However, I have it on good authority that my wonderful co-organiser will have a few words to share in this week’s issue.

Goal State Launches November 15th!
Goal State launches on Kickstarter this Friday! It’s crazy to think we’re actually here. It’s about to happen!
Earlier today I dropped a new article that fills in the last piece of the Goal State puzzle: the reward tiers! So if you want to know how much it will cost to jump into Goal State at launch, this reveals all! Critically, two things I announced today are:
There are early bird tiers for the two main products (Goal State, and Goal State Plus).
We will be offering late pledges, meaning that if we’re successful and you want to support it when you’re financially able, you’ll be able to do so for a limited time.
It’s exciting to think we’re almost there. Fingers crossed we can get it over the line!
The Big Story: AI’s Intersection with Online Safety
So in amongst everything else I have going on, I spent yesterday afternoon in central London at an ‘Online Safety Summit’ event hosted by Sheridans - a media and technology law firm based in London. Regular readers may recognise the name, given that just last week at the AI and Games Conference we had Anna Poulter-Jones, an associate in the firm’s Computer Games and Digital Media groups, present to our community on intellectual property risks when handling generative AI technologies.
However, while AI wasn’t the focus at this event, as we’ll see shortly it’s having an impact on every key part of the conversation. As explained by Antonia Gold - a partner in the Technology and Digital Media groups at Sheridans - online activity is now coming under increasing regulation courtesy of two different sets of laws:
The Digital Services Act (DSA): A series of regulations imposed by the European Union that came into force in 2022.
The Online Safety Act (OSA): A set of rules established by the UK government that was signed into law in 2023.
Both sets of laws share the same goal: to ensure safe online spaces exist for children and adults, and that the companies who create these spaces take appropriate steps to protect them. While the two pieces of legislation differ, and are rolling out at different stages, they share a lot of DNA, so it’s important to start thinking about how to ensure compliance with regulators such as Ofcom - who handle the implementation of the OSA in the UK.
Throughout the day we heard from numerous speakers about the challenges faced. Perhaps the most obvious, right from the get-go, is that games have somewhat been treated as an afterthought - a sentiment that Fred Langford, the director of Trust and Safety Technology at Ofcom, largely agreed with.
Much of the OSA in particular was built with social media platforms in mind. The UK government wanted to ensure younger demographics have safer access to online spaces, that activities by terrorist organisations are minimised, and that ultimately a lot of the more undesirable aspects of social platforms are reduced. Hence we’ve seen new features like ‘teen accounts’ on Instagram, which roll out globally in 2025 as a means to address these issues. After all, companies with users based in the UK and EU now face significant financial penalties if they do not follow the law.
But games add additional layers to social interaction that are sometimes harder to encapsulate: real-time text or voice chat during gameplay, user-generated content, and in-game ‘griefing’, where a player might not use language to harass another player but does so through in-game actions. As discussed by Gold, this leads to two significant headaches for game developers:
Mechanisms are required for identifying, capturing and reporting content. This introduces processes of assessment, a means to allow appeals, and a requirement for data related to this activity to be shared (either with regulators or the general public).
The OSA adds a new compliance measure in ‘age assurance’: a platform holder needs to be able to successfully establish - within reason - how old their users are, and then have that influence what level of service they’re provided.
While many live service online games already have reporting mechanisms in place, the bigger headache is age assurance. Most children lie about their age on online platforms so that they can access them, and simply relying on them to fill out a form is no longer sufficient. The OSA requires that companies work towards estimating the ‘true’ age of the user if no legal identification paperwork can be presented.
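To make the idea concrete, here’s a toy sketch of what ‘age assurance’ can mean in practice: trust hard evidence first, fall back to a sufficiently confident estimate, and default to the most restrictive experience otherwise. Every name and threshold here is hypothetical - this is an illustration of the principle, not any real platform’s implementation.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class AgeSignal:
    declared_age: int                    # age the user typed into a form
    verified_age: Optional[int] = None   # from legal ID documents, if provided
    estimated_age: Optional[int] = None  # e.g. from an age-estimation model
    estimate_confidence: float = 0.0     # the model's confidence in its estimate

def service_tier(signal: AgeSignal) -> str:
    """Pick a service level: trust verified documents first, then a
    high-confidence estimate, and otherwise restrict by default."""
    if signal.verified_age is not None:
        age = signal.verified_age
    elif signal.estimated_age is not None and signal.estimate_confidence >= 0.9:
        age = signal.estimated_age
    else:
        # A self-declared age alone is no longer sufficient under the OSA:
        # default to the most restrictive tier until assurance improves.
        return "restricted"
    return "full" if age >= 18 else "teen"
```

Note that the declared age never decides the outcome on its own - it only ever loses to stronger evidence, which is the crux of the compliance shift.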
So with all of this in place, this is where artificial intelligence comes into play. I was the ‘AI guy’ at this event, quite happy to sit and listen to experts in a field that is not my own. While I wasn’t surprised to hear AI brought up in almost every presentation as a means to help handle the complexities of these challenges, I was rather concerned with how readily we treat it as a solution without deeper consideration.
AI and Moderation
Ultimately, the true benefit of using AI technologies for content moderation is simply handling volume. Millions of posts appear on social media platforms every day, but games often generate even more content with smaller user bases - and in turn smaller staff resources. As mentioned by Steve Wilson, the Trust and Safety Manager at Jagex, the ever-popular MMO RuneScape sees around 13 million lines of text chat per day.
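To put that volume in perspective, a quick back-of-the-envelope calculation:

```python
# 13 million chat lines per day, spread over 86,400 seconds
lines_per_day = 13_000_000
seconds_per_day = 24 * 60 * 60
lines_per_second = lines_per_day / seconds_per_day
print(round(lines_per_second))  # prints 150
```

That’s roughly 150 chat lines every single second, around the clock - far beyond what any human team could read line by line, which is why volume is the one argument for AI in moderation nobody disputes.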
The conversation around AI adoption in these processes came up during a fireside chat with Laura Higgins, Senior Director of Community Safety and Civility at Roblox - though admittedly that was in part because I asked Laura a question about what complexities Roblox’s own generative AI tools add to the situation. Perhaps the sentiment surrounding all this was best expressed by a person whose name I sadly didn’t catch, who asked Fred Langford about the requirement to use AI as part of moderation. Langford clarified that Ofcom does not enforce the use of AI as a solution to these problems, but it spoke to a broader realisation that it’s impossible to keep on top of the sheer volume of content, and the requirements of these regulations, without it.
Naturally, the technical aspects of all of this weren’t really discussed, but AI solutions being plugged into different corners of the moderation process came up throughout the afternoon: monitoring text chat, translating voice chat into text for analysis, monitoring in-game behaviour, and - arguably the one mentioned most, and one I wonder whether people realise is actually AI - the age estimation and assurance processes.
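For readers curious what ‘plugging AI into the moderation process’ might look like in the simplest possible terms, here’s a minimal sketch of a chat-line router: auto-action the clear violations, escalate the uncertain middle band to a human, and let everything else through. The classifier, word lists, scores and thresholds are all hypothetical stand-ins, not any studio’s real system.

```python
# Entirely illustrative word lists standing in for a trained ML classifier.
SEVERE = {"kys"}
SUSPICIOUS = {"scam", "noob"}

def classify_toxicity(message: str) -> float:
    """Stand-in for an ML model returning a toxicity score in [0, 1].
    Here, a trivial keyword heuristic purely for illustration."""
    words = set(message.lower().split())
    if words & SEVERE:
        return 1.0
    if words & SUSPICIOUS:
        return 0.6
    return 0.0

def moderate(message: str, block_threshold: float = 0.9,
             review_threshold: float = 0.5) -> str:
    """Route a chat line: block clear violations automatically, escalate
    uncertain cases to a human moderator, and allow the rest."""
    score = classify_toxicity(message)
    if score >= block_threshold:
        return "blocked"
    if score >= review_threshold:
        return "human_review"
    return "allowed"
```

The middle band is the interesting part: it’s where the sheer-volume argument for AI meets the human-in-the-loop argument that the rest of this piece turns to.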
Human in the Loop
I am always concerned when we see technology as the primary solution to complex human problems; I find it naïve and uninformed at best, and simply foolhardy at worst. But throughout the day it was reassuring to hear how many of those involved see the value in communication and the exchange of information.
Steve Wilson’s experience in this space clearly came to the fore here. RuneScape has, of course, been around for over 25 years, and as such Jagex have been at the forefront of this battle since the beginning - long before AI could help at all. They’ve seen the benefits, and often the lack thereof, of technology-led solutions, which for them are frequently ill-fitting given they need to work with the game’s now admittedly ancient tech stack. But they’ve also led the charge in understanding good human processes. That doesn’t mean relying on end-user reporting (most of which is not reliable), but rather being proactive and relying on the judgement of your moderation team. Critically, it also means recognising and supporting the work of moderators, through strong onboarding and support mechanisms, and by building relationships with local law enforcement to help them be better informed on how to act on specific information they receive.
Laura Higgins reaffirmed this by highlighting the need to talk to experts across various fields to help with both policy and education: introducing children at an early age to what safe online spaces should look like, and working with kids, parents and schools to reinforce good practice and identify bad actors.
But a recurring point made throughout, highlighted one last time by Luc Delany of k-ID, was the sharing of information. Games have been tackling these issues for so long that knowledge exchange on known issues and good practice has long been established in the industry. While technology is essential in handling the sheer volume of content being generated, and AI is playing a huge part in that, it’s vital that humans remain in the loop.
Overall, it was an interesting afternoon, and it left me with plenty to think about.
Wrapping Up
Things are going to get back to normal around here, I promise! Next week will be the monthly digest, and we’ll wrap up November with an issue focused on events surrounding AI for games in the news. It’s probably going to be about the AI People alpha that I’ve been playing.
Alrighty, until then thanks for reading. Take care and I’ll be back!