Hacking Marseille upscaler

notimp

Well-Known Member
Member
Joined
Sep 18, 2007
Messages
5,779
Trophies
1
XP
4,420
Country
Laos
Spend 100 USD to make the picture oversharpened. Well, if that's your thing, you'll be able to use it without CFW.
It has a tendency to oversharpen (it has been found to produce white lines on the edges of black/white high-contrast patterns to increase 'edge definition'), but it does a few more things. Anti-aliasing (averaging color levels around what it detects as edges), for once - without the usual tendency to smear the image (because it leans towards artificially sharpening again afterwards :) ).

Also - it doesn't only do edge sharpening, but also full-field deblurring (sharpening).. :)

To create detail that wasn't there in the original. Which means - not 'creator's intent' - so any purist will shudder at that point automatically.

But then - we live in an age where we use AI-based upscaling to make FF9 look better than ever before (the mClassic is not similar - it's real-time, low-effort, best guess, based on nothing but a stupid algo.. ;)). (See: https://sites.google.com/view/moguri-mod/ - and yes, that is also oversharpened.) We use texture filtering in 3D game emulation to do something similar.

And it does something your TV upscaler usually does not - the anti-aliasing part (best guess, based on contrast differences) - so it's still interesting (I'm buying one).


Now - all of their marketing is BS. It is not a graphics card. It is not magic. It is not a replacement for a new graphics card. It cannot make 720p look like 1080p.

It is an image post-processor that only works at the image level (it doesn't know what's a texture and what's an object edge - and therefore has to guess). It's probably best compared to how FXAA works (https://en.wikipedia.org/wiki/Fast_approximate_anti-aliasing), except that it has to upscale the image to be effective (inventing 'sub-pixels' by trying to - quickly - identify what's what) - and that it then oversharpens the image. ;)
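A toy sketch of that kind of pipeline (my own illustration - Marseille's actual algorithm is undisclosed): find high-contrast pixels via a local average, blend them toward that average (the anti-aliasing step), then unsharp-mask the whole field (the re-sharpening step). On a hard black/white edge this produces exactly the over-/undershoot halos ('white lines') mentioned above:

```python
def blur3(img):
    """3x3 box blur with edge clamping, on a list-of-lists grayscale image."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            acc, n = 0.0, 0
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    yy, xx = y + dy, x + dx
                    if 0 <= yy < h and 0 <= xx < w:
                        acc += img[yy][xx]
                        n += 1
            out[y][x] = acc / n
    return out

def aa_then_sharpen(img, edge_thresh=0.2, sharpen=0.5):
    soft = blur3(img)
    h, w = len(img), len(img[0])
    # Anti-aliasing step: where local contrast is high (an "edge"),
    # pull the pixel toward the neighborhood average.
    aa = [[soft[y][x] if abs(img[y][x] - soft[y][x]) > edge_thresh
           else img[y][x] for x in range(w)] for y in range(h)]
    # Re-sharpening step (unsharp mask): out = aa + k * (aa - blur(aa)).
    # Deliberately unclamped, so the halo overshoot shows up in the numbers.
    soft2 = blur3(aa)
    return [[aa[y][x] + sharpen * (aa[y][x] - soft2[y][x])
             for x in range(w)] for y in range(h)]

# A hard black-to-white edge (values in 0.0-1.0):
strip = [[0.0, 0.0, 0.0, 1.0, 1.0, 1.0] for _ in range(3)]
result = aa_then_sharpen(strip)
# result[1] overshoots above 1.0 on the bright side of the edge (a
# "whiter than white" line) and undershoots below 0.0 on the dark side.
```

The `edge_thresh` and `sharpen` parameters are made up for the demo; the point is only that AA-then-sharpen on a hard edge rings, which is the halo effect described above.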

For how much is wrong with that product's marketing, it's still interesting. It does something akin to anti-aliasing - as a post-processing effect - and the result doesn't look completely crap. :)


This is a good article to read if you are trying to wrap your head around what it does:
https://pcper.com/2017/09/marseille-mcable-gaming-edition-remove-aliasing-with-an-hdmi-cable/

This is a good video to watch for the same reason:


And yet - at the same time, in that same video, the youtuber looks at a clearly oversharpened image of Luigi's Mansion and hypes it. ;)

So...

Hard to say. Interesting enough that I'm buying one to see, mostly for retro games. (Me: an AV nerd who calibrates all his TVs (with equipment) and turns artificial sharpness settings down to 0 (where they belong).)

The problem here also is that with Indiegogo, as you are "helping a company get a product out" (yeah, right), you don't have a return policy once you buy to 'test'. Be aware of that as well.

--------------------- MERGED ---------------------------

Also, from how the campaign 'feels', this is a Chinese product that is event-marketed in the US by people who don't know a single thing about the product and are making up all kinds of bull as they go along.

But in their Indiegogo comments section, they are able to answer highly nerdy questions - it's just not the company founder who can. Also, it is a product that's pretty well thought out in terms of functionality (it does the right stuff with signal levels, so the user doesn't have to change TV settings), so at least some intelligent people worked on it.

They have something. :) We just don't know what exactly. And those buying before testing are paying to find out. ;)
 
Last edited by notimp,

teamlocust

Well-Known Member
Member
Joined
Oct 28, 2017
Messages
315
Trophies
0
Age
40
XP
1,491
Country
India
My Samsung SUHD KS9500 delivers awesome graphics and no jaggies, whether PS4 Pro, Xbox One X or Switch.
It all depends on your television or monitor upscaler's capabilities, so the mCable is basically for people with cheap-ass TVs or monitors...
I know people will hate me for saying the above, but I know the truth hurts..
 

notimp

Well-Known Member
Member
Joined
Sep 18, 2007
Messages
5,779
Trophies
1
XP
4,420
Country
Laos
Also - one more thing:

If your TV's scaler is crap (most are), and this thing gives you results that are subjectively better, you just upgraded your TV for 60-80 bucks. ;) Which kind of isn't bad. (The TV will then still scale from 1440p to 4K - but still.)

Also - you could still add more blur with your TV's post-processing. :) (It's usually called MPEG noise reduction in TV settings.. ;) )

--------------------- MERGED ---------------------------

My Samsung SUHD KS9500 delivers awesome graphics and no jaggies, whether PS4 Pro, Xbox One X or Switch.
It all depends on your television or monitor upscaler's capabilities, so the mCable is basically for people with cheap-ass TVs or monitors...
I know people will hate me for saying the above, but I know the truth hurts..
But it does so with lag. :) If you are playing in game mode, it does none of what you are saying - and this is just your imagination at work.

Which kind of also is an issue here. Highly subjective stuff. :)

Also - I have a TV that's more expensive than yours. ;) (Just for good-measure one-upping.. ;) )

Also - if you feed 480p or 720p signals from retro consoles, those will always have jaggies, because they mostly did no form of anti-aliasing back then. (For quick reference, see: https://old.reddit.com/r/dreamcast/comments/1h2enu/what_dreamcast_games_use_antialiasing/ )

But - where you are entirely correct - DON'T buy this for use with a PS4 Pro or Xbox One X. This thing needs to "upscale" to work, so those consoles outputting native 1440p on many games means it cannot upscale them at all. (It only upscales to 1440p for 31-120fps content.)

On 24p (fps) and 30p (fps) content, it upscales to 4K. On most games it will only upscale to 1440p. That's also the reason they demo it with 1440p monitors. It has a bandwidth limitation that doesn't allow it to upscale to 4K on content above 30fps (which games usually are - they usually output at 60Hz even if their framerates are barely over 30fps).

So they don't do it only because monitors don't have great scalers. There are actual reasons why that's the best thing to do for a demo.
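The bandwidth argument above checks out on the back of an envelope. The ~300 MHz pixel clock below is the HDMI 1.4-class ceiling; whether that's the mCable chip's exact limit is an assumption, but the ordering of the modes is what matters:

```python
# Assumed output limit: an HDMI 1.4-class pixel clock (~300 MHz).
MAX_PIXEL_CLOCK = 300e6  # pixels per second

def pixel_rate(width, height, fps, blanking=1.2):
    """Pixels per second for a video mode, padded roughly 20% for the
    blanking intervals that real HDMI timings carry on top of active video."""
    return width * height * fps * blanking

modes = {
    "4K@30":    pixel_rate(3840, 2160, 30),
    "4K@60":    pixel_rate(3840, 2160, 60),
    "1440p@60": pixel_rate(2560, 1440, 60),
}
for name, rate in modes.items():
    verdict = "fits" if rate <= MAX_PIXEL_CLOCK else "too much"
    print(f"{name}: {rate / 1e6:.0f} Mpx/s -> {verdict}")
```

4K@30 and 1440p@60 both squeeze under that ceiling; 4K@60 is roughly double it. So "4K only up to 30fps, otherwise 1440p" is exactly what a chip with that output limit would have to do.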

But then - we only have opinions here. Your guess is as good as mine (mine _is_ informed, though.. ;) ).
 
Last edited by notimp,

notimp

Well-Known Member
Member
Joined
Sep 18, 2007
Messages
5,779
Trophies
1
XP
4,420
Country
Laos
Also - if your favorite youtuber talks a lot about "enhancing" color depth or color vibrancy, you now know that you've been listening to a moron all those years. :)

So it's great for that as well. :)

The thing here is that youtubers who capture a really big color difference in the image have a result that looks like a big gamma change. Which most likely points at a signal level mismatch in the chain. The product does its best to prevent that (it reads your TV's EDID and tries to match the signal level to its capabilities), but if, for example, the capture card they are using isn't reporting the right information, they end up with a signal level (full/limited) mismatch.
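A quick sketch (mine, purely for illustration) of why a full/limited mismatch reads as a gamma shift: limited ("video") range maps black to code 16 and white to code 235, while full ("PC") range uses 0-255. If one device in the chain flags it wrong, blacks come out grey or shadows get crushed:

```python
def limited_to_full(v):
    """Expand a limited-range (16-235) code value to full range (0-255),
    per the BT.709 quantization levels."""
    return round((v - 16) * 255 / 219)

def full_to_limited(v):
    """Compress a full-range (0-255) value into limited range (16-235)."""
    return round(v * 219 / 255 + 16)

# Correctly flagged, black stays black and white stays white:
black_code = full_to_limited(0)     # 16 on the wire
white_code = full_to_limited(255)   # 235 on the wire

# Mismatch: a display treating limited-range input as full range shows
# code 16 literally as a dark grey (16/255) instead of true black, and
# code 235 as a dull white - which the eye reads as a gamma/contrast
# change, not as a "color enhancement".
```

The reverse mismatch (full input treated as limited) clips everything below 16 and above 235, i.e. crushed shadows and blown highlights.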

If you watch the footage captured by Adam Koralik, or the folks at PCPer, there simply isn't that much color difference introduced (which is good in my book). There is some, but only minor, and never to an extent where "blacks look darker and highlights look brighter". That's bull you get fed by the youtubers (the ones that aren't influenced by getting their products free of charge).

I'm positive that I am correct here, because the company has explained several times in the Indiegogo comments that it isn't HDR compatible, and even that its output signal has 8-bit color depth, which means "deeper colors" aren't even theoretically possible here.. ;) (8-bit is the BT.709 standard. ;) )

So any "color adjustment" the device does, is simply making colors more wrong in the non HDR color spectrum. (Probably by not accounting for differences between NTSC and bt.709 color spaces - being optimized for rec709, in the Adam Koralik example) (Thats why minimal color change is preferable anyhow.. ;) )

I might measure what it does exactly once I have it - but again, the sources listed above mostly get it right. Everyone else on youtube currently doesn't. They are more likely to hype any differences, because of "something". They probably like Arabian Prince from N.W.A. (Again - event marketing.)

:)

Not sure why I'm taking the time to write about all this.

Probably just because I have a healthy dislike for people getting fed the wrong terms and the wrong conclusions by morons whom they trust. ;)
 
Last edited by notimp,

notimp

Well-Known Member
Member
Joined
Sep 18, 2007
Messages
5,779
Trophies
1
XP
4,420
Country
Laos
Also - at the on-site presentations at E3, they didn't set the TVs correctly (on purpose, probably) (brightness level mismatches), to get more people to say that the mClassic/mCable looks superior.

Those things are always fun. :) But you see even brands like Samsung resorting to that in demos. It's an industry standard. ;)

In their 1440p monitor test video on youtube they didn't, though. So that's much appreciated. ;)

Their main claim to fame is that Linus liked them several months back (previous product):


So they are now trying to replicate that awareness hike with more youtuber marketing. The thing is, they can't. There is no more influential youtuber out there in their 'field' of potential customers than him. ;)
 
Last edited by notimp,

maddenmike95

Well-Known Member
Newcomer
Joined
Mar 26, 2009
Messages
89
Trophies
1
XP
529
Country
United States
I will be getting one for sure when it's released, mainly for my modded Wii U. This is going to be great for playing some older Nintendo classics - for me, mainly GameCube and Wii games. It's a perfect add-on for a modded Wii U, because that console can play the whole back catalog of Nintendo games from every Nintendo console made before the 3DS and Switch. Not to mention all the available emulators for the console.
 
Last edited by maddenmike95,

Rahkeesh

Well-Known Member
Member
Joined
Apr 3, 2018
Messages
2,178
Trophies
1
Age
42
XP
3,261
Country
United States
Monitor scalers are always garbage, so this seems pretty good for connecting your Switch to a 1440p monitor, or prior gens to 1080p.
 

fvig2001

Well-Known Member
Member
Joined
Aug 21, 2006
Messages
931
Trophies
1
XP
2,926
Country
Philippines
I own the older model, which I got for like $30. It's okay. It does improve the graphics, but it's very subtle. I only notice it if I really look for it. I tried chaining two and it gets too shiny at times. Annoyingly, it doesn't work well with the PS2 HDMI adapter. This one is probably better in some ways due to the forced 4:3 selector, but I wouldn't upgrade to it given my setup. If it upscaled 1080p@60 to 4K@60, then maybe I would consider upgrading five years later. I'd rather be cheap and wait for emulators to catch up so that we can upscale natively, rather than use an adapter that guesses the best edges based only on the current frame.
 
Last edited by fvig2001,

notimp

Well-Known Member
Member
Joined
Sep 18, 2007
Messages
5,779
Trophies
1
XP
4,420
Country
Laos
They are now talking about open-platform scalers for potential future products in the comments. So in the future they might release a product where you could customize the scaling profile to your liking (amount of sharpening...). Not for this product, though. :)

And it's not confirmed yet - they are looking into it. :)

Also, the 4:3 force doesn't seem to be a "hard force" on 720p+ resolutions. Those always get pushed through "as is" and not morphed to 4:3.

They are trying to minimize user error here. ;)
 

LoggerMan

Well-Known Member
Member
Joined
Jun 10, 2011
Messages
566
Trophies
1
XP
842
Country
Anyone ordered one of the mClassic cables? Apparently for some GameCube/Wii games it makes a huge difference. I am playing a lot of those old Nintendo games on my Wii U, and the picture already looks great, except for the jagged aliasing on hard edges. You can count the jagged edges from across the room, and it distracts from the image a lot. If the mClassic could blur even half of those edges, it'd improve the image quality a lot imo. The cable costs $170 AUD and I want one, but I know that next year there will be an even better and cheaper cable, and five years from now smart TVs will come with programmable input modes that let developers make apps that do all of this on the fly in the TV itself - I mean, probably. If you keep putting super-powered ARM processors inside TVs, eventually people will want to use them for all kinds of niche things.

Is the cable worth buying now just to smooth out a few jagged edges? I could wait a few years for something better and cheaper, but will I even be interested in playing old games anymore by then?
 

Nincompoopdo

Well-Known Member
Member
Joined
May 20, 2017
Messages
597
Trophies
0
XP
2,683
Country
United States
Are there any HDMI frame-interpolating devices out there? Sony's PSVR comes with a black box that turns 60fps into 120fps on the PS4's HDMI output signal. I always wondered if there's a third-party device that can do the same thing for a TV. It would be great to double the frame rate of 30fps Switch games to 60fps.
 

pcwizard7

Well-Known Member
Member
Joined
Aug 2, 2013
Messages
1,409
Trophies
0
XP
1,688
Country
Australia
I'll give this a go - if it's junk, oh well. I do use an HDMI switch setup, so I might plug it in there.

The PS2 isn't a fair test, since it has the worst video processor/output - which is why component is the best you're going to get from it, and no upscale looks good with it.
 
Last edited by pcwizard7,

The Real Jdbye

*is birb*
Member
Joined
Mar 17, 2010
Messages
23,285
Trophies
4
Location
Space
XP
13,843
Country
Norway
Anyone ordered one of the mClassic cables? Apparently for some GameCube/Wii games it makes a huge difference. I am playing a lot of those old Nintendo games on my Wii U, and the picture already looks great, except for the jagged aliasing on hard edges. You can count the jagged edges from across the room, and it distracts from the image a lot. If the mClassic could blur even half of those edges, it'd improve the image quality a lot imo. The cable costs $170 AUD and I want one, but I know that next year there will be an even better and cheaper cable, and five years from now smart TVs will come with programmable input modes that let developers make apps that do all of this on the fly in the TV itself - I mean, probably. If you keep putting super-powered ARM processors inside TVs, eventually people will want to use them for all kinds of niche things.

Is the cable worth buying now just to smooth out a few jagged edges? I could wait a few years for something better and cheaper, but will I even be interested in playing old games anymore by then?
Dunno, it's supposed to help a little, but it's still upscaling. If you want the best quality possible, Dolphin is the way to go. Some games look like they were made for HD when played through Dolphin, like Super Mario Galaxy 1 & 2.
Are there any HDMI frame-interpolating devices out there? Sony PSVR comes with a black box that turns 60 fps to 120 fps through PS4 HDMI output signal. I always wondered if there's a 3rd party device that can do the same thing for TV. It will be great to double the frame rate of 30 fps Switch games to 60 fps.
Most TVs already have it built in. I'm not aware of a separate device to do it, because most people seem to hate that effect.
 

The Real Jdbye

*is birb*
Member
Joined
Mar 17, 2010
Messages
23,285
Trophies
4
Location
Space
XP
13,843
Country
Norway
My TV doesn't have that feature, and neither does my computer monitor. 60fps is bad for movies, but it's good for games and porn. :D
It won't be called that, it'll be called something dumb like "Crystal Motion" or something like that. Every manufacturer has their own name for it.
You can't double the framerate from 30 to 60 anyway, because the signal is already output at 60Hz. The best you could do is double it to 120. Some manufacturers even claim 200Hz or 300Hz, but I doubt the panels are actually capable of displaying those refresh rates, so I'm not sure where they're getting those numbers from.
60fps is fine for both movies and games (the only reason they still shoot movies at 24Hz is that existing equipment already works with it, and it takes far less editing than 48 or 60. Back in the day 24Hz was a necessity, not a choice, but these days it's just a remnant of a bygone era). But when it comes to interpolation, you can't create information that wasn't in the original signal, so it's not going to be the same as a true 60fps signal. Never tried it with games on my TV; didn't mind it for TV/movies, but as said, most people seem to hate it, whether it's for games or movies.
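A minimal sketch of what frame interpolation does in principle (my own illustration - real TVs use motion-vector estimation, not plain blending): synthesize an in-between frame from two real ones. The crude linear blend below makes the point above concrete - the inserted frame is a guess derived from its neighbors, not information that was in the 30fps signal.

```python
def interpolate(frame_a, frame_b, t=0.5):
    """Linear blend between two frames (here: flat lists of pixel values)."""
    return [a + (b - a) * t for a, b in zip(frame_a, frame_b)]

def double_framerate(frames):
    """Turn a 30fps sequence into pseudo-60fps by inserting a blended
    frame between each pair of real frames."""
    out = []
    for a, b in zip(frames, frames[1:]):
        out.append(a)
        out.append(interpolate(a, b))  # the invented in-between frame
    out.append(frames[-1])
    return out

clip = [[0, 0], [10, 20], [20, 40]]   # three tiny "frames", two pixels each
smooth = double_framerate(clip)        # five frames: real, blend, real, ...
```

Blending is also why cheap motion smoothing looks ghosted on fast motion: halfway between "object here" and "object there" is a double image, which is exactly what motion-vector-based interpolators try (and sometimes fail) to avoid.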
 
