Absolutely, and that's the same reason there should be no controversy when a platform like Parler gets kicked off Google Play and the App Store for failing to moderate at all. I expect any Trump-founded social media platform will share that total lack of moderation, and share the same fate as a result.
You have to consider that Twitter and Facebook have the capital required to deploy AI and handle a lot of the moderation automatically. Platforms like Parler (which I personally think is a grift) and Gab (which I'm more partial to, but it's still too janky for me to join) moderate manually with limited manpower, mostly basing decisions on reports. Honestly, I would almost prefer those decisions always be up to a human instead of a machine - it makes the network more organic - but that simply doesn't scale unless you let everyone run wild.
Besides, the Apple and Google Store issue is more a matter of antitrust than 230 - it's very clear to me that the two companies are simply putting up a hedge around their marketplaces, knowing full well that those emerging platforms, or any emerging platform for that matter, will not have the capabilities of Twitter or Facebook overnight. The last thing I want is a sanitised Web, but since we live in closed app ecosystems now, a new service doesn't even get a chance to grow if it doesn't have an app. Scale needs to be a factor. If anything, smaller platforms deserve *more* protection than Facebook or Twitter, since they don't have the means to moderate content nearly as effectively.
If 230 isn't modified, and quickly, the future of the free exchange of thoughts and ideas lies in encrypted and/or distributed platforms like Mastodon, where every peer is a part of the network. I am Spartacus. Can recommend, actually - Mastodon servers are fun. Be careful not to play into Zuck's or Jack's hands too much - you might live to regret it when they go looking for a new boogeyman.
