Razer wants you to feel your games with Universal Haptics

During the recent Game Developers Conference, I had a chance to escape the din of the event to go to Razer’s offices in San Francisco and hear and feel the pounding of a new set of headphones.

These prototype headphones pounded my ears harder than usual because they used Razer’s new universal haptics technology, which lets you feel the sound.

Last year, Razer bought a company called Interhaptics, and at GDC 2023 it introduced its Universal Haptics software development kit and directional haptics. It’s all in the name of improving immersion, coming from a company that wants to be the global lifestyle brand for gamers.

>>Follow VentureBeat’s ongoing GDC 2023 coverage<<


The free SDK release focuses on heightening immersion in games, bringing audio and visual effects to life with HD haptic feedback that developers can now fully customize through the Interhaptics SDK.

When I wore the Razer HyperSense directional-enabled headphones, the sound started small and then built up to a kind of crescendo. As I listened to the classic THX sound wave, at some point the sensation shifted from hearing the sound with my ears to feeling it beating on my skin. It was an example of the Interhaptics technology in action.

Razer lets you feel sound on your skin as well as hear it.
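To make that “feel the sound” idea concrete, here is a minimal sketch of how audio-driven haptics generally works, not Razer’s actual pipeline: isolate the low frequencies of an audio signal, track their energy envelope, and use that envelope to drive the vibration strength of the actuators. The function name and parameters below are illustrative assumptions, not anything from the Interhaptics SDK.

import numpy as np

def audio_to_haptic_envelope(samples, sample_rate, cutoff_hz=120.0, frame_ms=10.0):
    """Map an audio signal to a 0..1 haptic intensity envelope (illustrative only)."""
    # Crude low-pass filter: keep only FFT bins below the cutoff frequency.
    spectrum = np.fft.rfft(samples)
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
    spectrum[freqs > cutoff_hz] = 0.0
    low = np.fft.irfft(spectrum, n=len(samples))

    # RMS energy per short frame becomes the vibration intensity over time.
    frame = max(1, int(sample_rate * frame_ms / 1000.0))
    n_frames = len(low) // frame
    rms = np.sqrt(np.mean(low[:n_frames * frame].reshape(n_frames, frame) ** 2, axis=1))
    peak = rms.max()
    return rms / peak if peak > 0 else rms

# Example: a 60 Hz rumble buried in noise still produces a strong haptic envelope.
sr = 48_000
t = np.arange(sr) / sr
audio = 0.8 * np.sin(2 * np.pi * 60 * t) + 0.1 * np.random.randn(sr)
print(audio_to_haptic_envelope(audio, sr)[:5])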

Developers can sign up for the waiting list for the Razer Kraken V3 HyperSense Dev Kit with programmable directional HD haptics at the Interhaptics website. And Interhaptics has expanded its support to include PlayStation 5, PlayStation 4, Meta Quest 2, XInput controllers, iOS, and Android devices for game engines such as Unity and Unreal Engine.

The headphones that I wore were expressive, like having a DualSense controller instead of a normal rumble controller. The haptics were also easily “composable,” meaning a developer could look at a graph visualizing the haptic effects that I felt. Moving around the graph and changing the curve changed the haptic effects too, making them come in harder or softer.
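As a rough illustration of what “composable” means here, assuming nothing about the actual Interhaptics file format or API, you can think of a haptic effect as a keyframed intensity curve that gets sampled into per-tick actuator strengths; dragging a keyframe on the graph reshapes the whole felt effect.

import numpy as np

def sample_haptic_curve(keyframes, duration_s, rate_hz=200):
    """Sample a keyframed (time, strength) curve into per-tick intensities."""
    times, strengths = zip(*sorted(keyframes))
    ticks = np.linspace(0.0, duration_s, int(duration_s * rate_hz))
    # Linear interpolation between keyframes; editing a keyframe changes
    # the whole effect, harder or softer, like dragging a point on the graph.
    return np.interp(ticks, times, strengths)

# A "building crescendo" effect: quiet start, hard peak, quick fade-out.
crescendo = [(0.0, 0.05), (1.5, 0.4), (2.5, 1.0), (3.0, 0.0)]
intensities = sample_haptic_curve(crescendo, duration_s=3.0)
print(round(intensities.max(), 2), len(intensities))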

Razer bought Interhaptics in 2022.

The game developers themselves aren’t the only ones who can compose these haptics. Fans who gather on Discord channels can also learn how to do it. You can imagine watching a game trailer on YouTube and having the sound effects turn into touch effects. It reminds me of the effect in Electronic Arts’ remake of Dead Space, where you can feel the heartbeat of Isaac, the game’s main character.

The Razer folks hinted that we’ll see more devices coming with the technology, introducing haptics to Razer fans in different ways.

Author: Dean Takahashi
Source: VentureBeat
