Blizzard intends to buy ‘Spellbreak’ studio Proletariat to speed up ‘WoW’ development

It’s a busy spell for Blizzard, with Diablo Immortal, Overwatch 2 and mobile game Warcraft Arclight Rumble all arriving this year. The studio has another major release lined up in the form of World of Warcraft expansion Dragonflight, which is expected to arrive by the end of 2022. To help get WoW expansions out on time and ensure they meet the quality bar, Blizzard intends to buy Spellbreak studio Proletariat to bolster its developer ranks, as GamesBeat reports.

The news comes one day after Proletariat announced it will shut down Spellbreak early next year. The free-to-play game is an intriguing take on the battle royale genre, with players using magical powers instead of guns. The game never took off, though. It had an average player count of 166 on Steam over the last month. Apex Legends, on the other hand, has more than a thousand times as many players at any given time on Steam alone.

More than 100 developers from Proletariat may soon be focused on World of Warcraft, though the studio has been working with Blizzard since last month. The Boston-based studio also plans to expand its team.

WoW general manager John Hight has spoken of the difficulties his team has had in hiring to deliver content updates to players more quickly (the public turmoil at the studio over the last year might have played a role in that). Bringing Proletariat on board should help.

“A big part of caring for our teams is making sure we have the resources to produce experiences our communities will love while giving our teams space to explore even more creative opportunities within their projects,” Blizzard Entertainment president Mike Ybarra said. “Proletariat is a perfect fit for supporting Blizzard’s mission in bringing high-quality content to our players more often.”

Activision Blizzard is itself in the process of being bought by Microsoft for $68.7 billion. Given the ongoing labor and workplace culture issues at the company, there’s a bit of irony in Blizzard snapping up a studio called Proletariat.

Correction 6/29/22 5:40pm ET: A previous version of this story stated that Blizzard had acquired Proletariat studios. The sale has not been completed and the copy above has been changed to reflect that.

Return to Monkey Island’s first gameplay trailer is a swashbuckling trip of nostalgia

When Return to Monkey Island arrives later this year, players will finally discover the secret of Monkey Island. That’s the pitch series creator Ron Gilbert made in the game’s newest trailer, which premiered today during Nintendo’s latest Direct s…

Both of Valve’s classic Portal games arrive on the Switch today

A few months ago, Valve announced that both of its excellent Portal games were coming to the Nintendo Switch, but we didn’t know when. Today’s Nintendo Direct presentation cleared that up. Portal Companion Collection will arrive on the Switch later today for $19.99. The collection includes both the original Portal from 2007 as well as the more expansive, story-driven Portal 2 from 2011. Whether you missed these games the first time out or just want to replay a pair of classics, this collection sounds like a good way to return to one of the most intriguing worlds Valve ever created.

While the original Portal was strictly a single-player experience, Portal 2 has a split-screen co-op mode, which you can also play with a friend online. And while these games originated on the PC, Valve also released Portal 2 for the PlayStation 3 — and if I recall, the game’s controls mapped to a controller very well. Given that the Portal series is more puzzle-based than traditional first-person games, you shouldn’t have any problems navigating the world with a pair of Joy-Con controllers.

Meta’s latest auditory AIs promise a more immersive AR/VR experience

The Metaverse, as Meta CEO Mark Zuckerberg envisions it, will be a fully immersive virtual experience that rivals reality, at least from the waist up. But the visuals are only part of the overall Metaverse experience.

“Getting spatial audio right is key to delivering a realistic sense of presence in the metaverse,” Zuckerberg wrote in a Friday blog post. “If you’re at a concert, or just talking with friends around a virtual table, a realistic sense of where sound is coming from makes you feel like you’re actually there.”

That concert, the blog post notes, will sound very different if performed in a full-sized concert hall than in a middle school auditorium on account of the differences between their physical spaces and acoustics. As such, Meta’s AI and Reality Lab (MAIR, formerly FAIR) is collaborating with researchers from UT Austin to develop a trio of open source audio “understanding tasks” that will help developers build more immersive AR and VR experiences with more lifelike audio.

The first is MAIR’s Visual Acoustic Matching model, which can adapt a sample audio clip to any given environment using just a picture of the space. Want to hear what the NY Philharmonic would sound like inside San Francisco’s Boom Boom Room? Now you can. Previous simulation models were able to recreate a room’s acoustics based on its layout — but only if the precise geometry and material properties were already known — or from audio sampled within the space, neither of which produced particularly accurate results.

MAIR’s model, called AViTAR, “learns acoustic matching from in-the-wild web videos, despite their lack of acoustically mismatched audio and unlabeled data,” according to the post.

“One future use case we are interested in involves reliving past memories,” Zuckerberg wrote, betting on nostalgia. “Imagine being able to put on a pair of AR glasses and see an object with the option to play a memory associated with it, such as picking up a tutu and seeing a hologram of your child’s ballet recital. The audio strips away reverberation and makes the memory sound just like the time you experienced it, sitting in your exact seat in the audience.”

MAIR’s Visually-Informed Dereverberation model (VIDA), on the other hand, will strip the echoey effect from playing an instrument in a large, open space like a subway station or cathedral. You’ll hear just the violin, not the reverberation of it bouncing off distant surfaces. Specifically, it “learns to remove reverberation based on both the observed sounds and the visual stream, which reveals cues about room geometry, materials, and speaker locations,” the post explained. This technology could be used to more effectively isolate vocals and spoken commands, making them easier for both humans and machines to understand.

VisualVoice does the same as VIDA but for voices. It uses both visual and audio cues to learn how to separate voices from background noises during its self-supervised training sessions. Meta anticipates this model getting a lot of work in machine understanding applications and in improving accessibility. Think more accurate subtitles, Siri understanding your request even when the room isn’t dead silent, or the acoustics in a virtual chat room shifting as speakers move around the digital space. Again, just ignore the lack of legs.

“We envision a future where people can put on AR glasses and relive a holographic memory that looks and sounds the exact way they experienced it from their vantage point, or feel immersed by not just the graphics but also the sounds as they play games in a virtual world,” Zuckerberg wrote, noting that AViTAR and VIDA can only apply their tasks to the one picture they were trained for and will need a lot more development before public release. “These models are bringing us even closer to the multimodal, immersive experiences we want to build in the future.”

‘Teenage Mutant Ninja Turtles: Shredder’s Revenge’ is a glorious beat-’em-up revival

If you visited arcades in the late ‘80s or early ‘90s, you surely remember the golden age of beat-em-up games. Cabinets like Teenage Mutant Ninja Turtles, The Simpsons, X-Men and more followed a fairly simple formula: take a popular franchise and have …

‘Strange New Worlds’ mixes the maudlin and irreverent

The following article discusses spoilers for The Elysian Kingdom. There’s a genre of writing best embodied by the serial escalation of premises found on forum threads in certain corners of the internet. It’s the sort of energy that imbues this week’s St…

Spotify’s Live Events Feed makes it easier to find out when your favorite artist is touring

Spotify has expanded its old Concert Hub and added more features to make it easier to find information and tickets for live events in your location. The streaming service sources listings for the hub, now called Live Events Feed, from its ticketing partners that include Ticketmaster, AXS, DICE, Eventbrite and See Tickets, among other companies. During the height of COVID-19 lockdowns, the Concert Hub helped users find at-home or studio performances, podcast recordings and other online performances. Turns out Spotify was studying user behavior at the same time. 

Sam Sheridan, Product Manager for Live Events Discovery, said Spotify spent the past two years studying the music industry and its users. One of the most important behaviors the company noticed was that fans would engage with artists on the platform and then leave to search for concert listings or to follow them on social media to be able to stay on top of any upcoming tour dates. “We think the Live Events Feed is an opportunity to help close this loop,” Sheridan said. 

If you don’t see the Live Events Feed in your app, simply search for “live events.” You’ll see a listing of all the performances in your area, and tapping on any of them will take you to an interface that includes a link where you can find and buy tickets. If the artist you’re listening to has an upcoming tour date, Spotify will show you that event in-app while you’re listening. Spotify has also built a new messaging tool that can notify you about upcoming concerts based on your listening habits. Don’t worry — you can tweak your notification preferences so you don’t have to get messages if you don’t want to.

Sheridan says Spotify will work “to even further integrate event discovery directly into the app” to make it more intertwined with the listening experience, so we’ll likely see more updates to Live Events in the future. 

Instagram is testing an AI face-scanning tool that can verify your age

Instagram is testing new age verification methods, including asking followers to vouch for your age and even using AI that can estimate your age via a video selfie. It’s part of a push to ensure users are at least the minimum 13 years old and “to make sure that teens and adults are in the right experience for their age group,” it announced.

For the “social vouching” system, Instagram asks three mutual followers of the user to confirm their age. Those followers must be at least 18 and have three days to respond to the request. Users can still verify their age with pictures of ID cards as well. 

The AI part requires you to take a video selfie, which Meta-owned Instagram then shares with a company called Yoti (it doesn’t provide any other information to Yoti, only the image, it says). “Yoti’s technology estimates your age based on your facial features and shares that estimate with us. Meta and Yoti then delete the image. The technology cannot recognize your identity — only your age,” Instagram says in the blog post.

Despite those reassurances, the system is bound to be controversial. Users widely distrust both Facebook and Instagram with their data, to start with. On top of that, Yoti’s age-estimation AI has higher error rates for some genders, age ranges and skin tones.

Yoti’s system is already used by the UK and German governments to detect age using deep learning after being trained on “hundreds of thousands” of pictures, Yoti cofounder Robin Tombs told Wired last year. Much like other neural networks, though, how it works is a bit of a black box, so even the company doesn’t know exactly which facial characteristics it uses. Yoti has a YouTube demo (above) where it applies makeup to young users to see if the system can still correctly guess their ages (it can). 

You can try Yoti’s age estimation yourself — I found it estimated me as considerably younger (by four years) when I took off my glasses, so your own mileage may vary. In general, it’s the least accurate (plus or minus 3.97 years) on female faces with dark skin and the most accurate (2.38 years) on light-skinned male faces.

Instagram says it aims to use AI to understand people’s ages to “prevent teens from accessing Facebook Dating, adults from messaging teens and helps teens from receiving restricted ad content, for example.” And this looks like just the start: the company said it plans to expand the use of the technology “widely across our technologies.”