Sony announces Inzone, a PC-focused gaming gear brand


Sony is expanding further into the PC gaming market. Its latest move is the launch of Inzone, a gaming gear brand aimed at PC players. The brand's name refers to the immersive feeling gamers experience when they are "in the zone," Kazuo Kii, Sony's president of home entertainment and sound products, told The Washington Post in an exclusive interview.

“We are entering the gaming gear industry with monitors and headsets at an exciting time, since gaming and esports have gotten even more popular over the last few years,” Kii said. “We are leveraging Sony’s high quality display and audio technologies to deliver products that will allow gamers to immerse themselves into their gaming world.”

Inzone's debut products include monitors and headsets. On the monitor side, an $899 4K monitor with a 144-hertz refresh rate launches this summer, and a $529 1080p monitor with a 240-hertz refresh rate arrives later this year. As for headsets, the Inzone line will include a $299 wireless model featuring noise cancellation and synthetic leather earpads, a $229 wireless headset without the leather or noise-canceling features, and $99 wired headphones. All three models will come with a spatial sound field feature.

“We are not saying we are not focusing on the PS5 users. But because we are latecomers to monitors and headphones for [the] gaming segment, we believe we have a chance to catch up,” Kii said. “I believe if top players from top companies mention ‘Oh, Sony’s Inzone is great,’ we can catch up.”

While the monitors and headsets are geared towards PC gaming, the color palette seems to be inspired by the PS5. Indeed, the monitors are compatible with the PS5 and will optimize screen colors once connected. The monitors are also said to have a "switcher" feature that allows a single mouse, keyboard, and headset setup to be connected to both a PC and a PS5 simultaneously and switched between the two.

SOURCE
 

urbanman2004

Well-Known Member
Member
Joined
Jan 10, 2013
Messages
422
Trophies
0
XP
614
Country
United States
> No they're right, 1440p is the .40 S&W of the resolution world: An unnecessary middle between two perfectly fine points, either in resolution, or in caliber.
I actually like the middle ground. Nothing wrong w/ more options to suit a player's preference, contrary to your sentiment 😁😈. My graphics card isn't all that good performance-wise at 4K, but my image comes out fuzzy at 1080p, so 1440p hits that mark perfectly on large displays.
 

LainaGabranth

Well-Known Member
Member
Joined
Jun 26, 2022
Messages
426
Trophies
0
Age
53
Location
Sneed's Feed and Seed
XP
735
Country
United States
> I actually like the middle ground. Nothing wrong w/ more options to suit a player's preference contrary to your sentiment 😁😈. My graphics card isn't all that good performance-wise in 4K, but my image comes out fuzzy in 1080p so 1440p meets that mark perfectly on large displays.
To each their own. I prefer to play at 720p with Nvidia scaling for that dank framerate.
 

SG854

$$$$$
Member
Joined
Feb 17, 2017
Messages
4,937
Trophies
1
XP
7,144
Country
Gabon
> I actually like the middle ground. Nothing wrong w/ more options to suit a player's preference contrary to your sentiment 😁😈. My graphics card isn't all that good performance-wise in 4K, but my image comes out fuzzy in 1080p so 1440p meets that mark perfectly on large displays.
The complaint doesn't make sense to me at all. All three resolution tiers are available on the market right now, based on what a person wants and/or needs.

1080p for people on a budget, or at super high Hz for competitive gaming.

1440p as a good middle ground between picture quality and performance.

And 4K for content creators and people into high-res gaming.
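The tradeoff behind these tiers is ultimately pixel throughput: how many pixels the GPU has to render (and the cable has to carry) per second. A rough back-of-the-envelope sketch, where the resolution/refresh pairings are illustrative examples rather than specs of any particular monitor:

```python
# Rough pixels-per-second comparison for common resolution/refresh pairings.
# The pairings below are illustrative, not tied to any particular monitor.

def pixel_throughput(width: int, height: int, refresh_hz: int) -> int:
    """Pixels the GPU must deliver per second to saturate this display mode."""
    return width * height * refresh_hz

modes = {
    "1080p @ 240Hz": pixel_throughput(1920, 1080, 240),
    "1440p @ 144Hz": pixel_throughput(2560, 1440, 144),
    "4K    @ 144Hz": pixel_throughput(3840, 2160, 144),
}

for name, pps in modes.items():
    print(f"{name}: {pps / 1e6:.0f} megapixels/s")
```

By this count, 1080p/240Hz and 1440p/144Hz land in roughly the same ballpark (~500 megapixels/s), while 4K/144Hz demands well over double that, which is why a midrange card that feels fuzzy at 1080p but chokes at 4K tends to land on 1440p.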
 

urbanman2004
> The complaint does not make sense to me at all. A 3 types of resolutions are available on the market right now. Based on what a person wants and/or needs.
>
> 1080p for people on a budget or super high hz for competitive gaming.
>
> 1440p a good middle ground for picture quality and performance.
>
> And 4k for content creators and people into high res gaming.
I agree wholeheartedly w/ your sentiment; the more options, the better to accommodate end users' preferences.
 

urbanman2004
> They're going to release 500hz panels this year. I would do the same, lower res to get big framerates.
Yeah, it's awesome that improvements to display tech are being churned out at such a rapid pace, but LainaGabranth's premise was to intentionally tank performance by playing at a low resolution, which I don't think is your intended result.
 

SG854
> Yeah, that's awesome how good improvements to display tech are being churned out at such a rapid pace, but LainaGabranth's premise was to intentionally tank performance while playing at a low resolution which I don't think is your intended result.
I'm waiting for that golden 1000Hz display: clear, CRT-like motion without a flicker impulse for modern games, and CRT emulation for classic games. With a 1000Hz monitor you can emulate the way a CRT draws scanlines line by line, without fake scanline filters, and also get motion-clarity benefits you don't get from scanline filters. Black Frame Insertion has its place, but actual CRT emulation would be golden.
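The rolling-scan idea described above can be sketched in a few lines: with a 1000Hz panel showing 60fps content, each source frame spans roughly 16 refresh cycles, so software can fully light only a sliding horizontal band each cycle while earlier bands fade, approximating a CRT beam sweep plus phosphor decay. A minimal illustration, where the band count and decay factor are made-up parameters, not anyone's actual implementation:

```python
def scan_band_brightness(rows: int, refreshes_per_frame: int,
                         refresh_index: int, decay: float = 0.5) -> list[float]:
    """Per-row brightness for one sub-refresh of a simulated CRT rolling scan.

    The frame is split into `refreshes_per_frame` horizontal bands; band
    `refresh_index` is fully lit, earlier bands fade by `decay` per step
    (a crude stand-in for phosphor decay), and later bands stay dark.
    """
    band_height = rows // refreshes_per_frame
    brightness = [0.0] * rows
    for band in range(refresh_index + 1):
        age = refresh_index - band   # how many refreshes ago this band was lit
        level = decay ** age         # exponential phosphor-style falloff
        for row in range(band * band_height, (band + 1) * band_height):
            brightness[row] = level
    return brightness

# 1000Hz panel, 60fps content -> ~16 refreshes per source frame.
frame = scan_band_brightness(rows=1080, refreshes_per_frame=16, refresh_index=3)
```

Real beam simulators do this per-pixel in a shader with subpixel timing; this only shows the core band-indexing idea.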
 

urbanman2004
> I'm waiting for that golden 1000hz display. Clear CRT like motion without flicker impulse for modern games. And CRT emulation for classic games. Can emulate the way a CRT draws scanlines line by line with a 1000hz monitor without fake scanline filters and also get motion clarity benefits you don't get with scanline filters. Black Frame Insertion has its place. But actual CRT emulation would be golden.
CRT tech was analog, though, so users benefited from lower latency. I highly doubt display manufacturers could invent similar tech to replicate its performance; emulation is nothing but a pipedream stopgap, and even if it could happen, it would take years and millions of dollars' worth of R&D.
 

urbanman2004
> Lmfao, what??? I get 90s at 1080p, and over 100 with NIS at 720p, soooooooooooo.
I don't know anything about your PC specs, so I could only infer your situation from the context of your prior statements... Your comments had me under the impression that you intentionally play at lower resolutions on a display w/ a higher native resolution, using budget-oriented/outdated GPUs, just to keep games at least "playable" [and I also thought you might've been sarcastic]. However, I can understand if the monitors you use are natively lower-resolution, but why use a scaling technology at such low resolutions 🤦🏽‍♂🤷🏽‍♂?
 

SG854
> CRT tech was analog though so users benefited from lower latency. I highly doubt display manufacturers could invent similar tech to replicate its performance and emulation is nothing, but a pipedream stopgap, and if it could happen then it would take years and millions of dollars worth of R&D.
No company would be interested in emulating CRTs, or in making any CRT tech. But an enthusiast with a 1000Hz LCD display could, through software. You could emulate scanline drawing and phosphor decay using the sheer Hz headroom of a 1000Hz display.
 

urbanman2004
> No company would be interested in emulating CRT. Or making any CRT tech. But enthusiast with a 1000hz LCD display could through software. You could emulate scanline drawing and phosphor decay using sheer hz performance with 1000hz display.
You're actually supporting the point I was making w/ my previous comment, lol.
 

SG854
> You're actually supporting the point I was making w/ my previous comment, lol.
There are plans to make CRT beam simulators; they're easier to implement than current motion interpolation, with no need for years of R&D or millions spent. You can get good results with a 240Hz OLED, though accurate results need 1000Hz, and it will take time for those displays to come out. There are already 8K 1000Hz prototypes in laboratories right now.

https://forums.blurbusters.com/viewtopic.php?f=7&p=81936
 

urbanman2004
> There is plans to make CRT beam simulators, easier to implement then current motion interpolation, and no need for years of R&D and millions spent. You can get good results with a 240hz OLED. With accurate results at 1000hz though it will take time for those displays to come out. There is already 8k 1000hz prototypes in laboratories right now.
>
> https://forums.blurbusters.com/viewtopic.php?f=7&p=81936
My comment was originally referencing the manufacture of some type of modern-day CRT tech, not emulation, hence my considering emulation "a pipedream stopgap".
 

SG854
> My comment was originally referencing the manufacture of some type of modern-day CRT tech, not emulation, hence me considering emulation "a pipedream stopgap".
Emulation doesn't need to be a stopgap. It can be the final end goal, and probably better than the original CRT hardware: no downsides from magnetic interference, no geometry and convergence issues, no display that gets blurrier as it ages. There's no need to see emulation as lesser than an actual tube display being produced. With what I was talking about, you'd see no difference between a real CRT and an emulated one.
 

Spider_Man

Well-Known Member
Member
Joined
May 28, 2015
Messages
3,682
Trophies
0
Age
36
XP
4,456
Country
United States
> "Those" prices? $299, $229, and $99 are "those" prices? It's great you're happy with your purchase, but I'll have to assume you're not very knowledgeable in the headphone department & have never heard a $50 set of cans vs a $300+ pair that actually has decent frequency response consistency, treble/midrange/bass accuracy, a good soundstage, etc. $300 is kinda nothing for a good headset. Try a pair of $1500+ open-back Sennheisers and suddenly that $50 Pulse 3D headset sounds like a $1 set of ear buds from Wish. Although it's not necessary to spend THAT much on a headset to get something that will blow the Pulse 3D out of the water in a heartbeat. The Astro A50 Gen 4 is under $300 and annihilates the Pulse 3D in almost every aspect. There's a huge difference between "just fine" and actual quality. With some things you really do get what you pay for. As stated, I'm glad you're happy with your purchase. I just personally prefer audio quality over price, and I would never be happy with "just fine" headphones.
But then if Sony came in at a price point matching your tastes, you'd moan it's too expensive.

But that aside, Sony announcing it's making products aimed at PC gamers... what's next, products aimed at home cinema?
 

AbdNuts

Member
Newcomer
Joined
Aug 16, 2020
Messages
14
Trophies
0
XP
132
Country
Saudi Arabia
I probably wouldn't buy anything from them since I prefer Samsung for my monitors, but that's still neat. More competition means more room for companies to start making better products.
 

LainaGabranth
> I don't know anything a/b your PC specs so I could only infer your situation based on the context of your prior statements... Your comments had me under the impression that you intentionally play at lower resolutions on a display w/ a higher native resolution while using budget oriented/outdated GPUs for minimal performance for games to at least be "playable" [and I also thought you might've come off as sarcastic]. However, I can understand if the monitors you do use come native w/ a lower resolution, but why use a downsampling technology on such low resolutions 🤦🏽‍♂🤷🏽‍♂?
it's cool
that's why, simple as
 

alt_Human

Well-Known Member
Member
Joined
Jun 9, 2022
Messages
125
Trophies
0
Location
U.S.S. Cygnus
XP
136
Country
United States
> Man I don't give a fuck about HDR, that shit doesn't justify a nearly $150 price hike over competitors.

Just because you don't care about HDR doesn't mean no one else does, or that it isn't worth an additional $150. From my own personal experience, I don't notice tons of difference between 1080p and 4K unless my face is planted on the screen. But what I DO notice a huge difference with is when Dolby Vision or HDR10+ is added on top of that 4K. I would personally NEVER buy a TV or monitor that didn't support those technologies. And I'll gladly pay more, because the difference between 4K and 4K HDR is significant and WELL worth it for me personally, since I do equal amounts of gaming and movie watching.
 