Homebrew Collection of old devkitPro versions

bobmcjr (Well-Known Member, United States)
i hard disagree.
their "whining" is correct, and a hard fork is a stupid idea seeing how much work is put into the toolkits provided by devkitPRO.
they work their asses off and pay huge bills for hosting, only to have hostile, problematic forks and half-arsed guides or help cause users to barrage them (who are doing this in their free time, mind you) with issues about broken and insane installations caused by projects and behavior like this.

also: to this day, i have yet to see a project that does not have stupid or insane programming practices that break with updates of a mature devkitPRO library (unless it's caused by nintendo's changes, like with the switch's control scheme). things like retroarch including an ancient libogc version in their repo, projects using internal libogc functions, or just doing stuff that is fragile af.
none of that would be an issue had they used decent, sane code, or if it weren't from ~2 decades back when libogc wasn't mature.
and im not saying the libraries provided by devkitPRO are perfect, far from it (libogc is in bad shape, but libnx is lovely).
its because they are fixing said issues that, yes, things will sometimes break. but everyone benefits from said fixes if developers, or whoever is compiling, would update the code.

as for your statement that "they don't give a damn about maintaining compatibility": you have NO idea how they work or what decisions they make, and you are blinded by your own hate/anger. as a maintainer of libogc, and being in close contact with them, i can tell you they most certainly do care, or else libogc would look completely different now and not be as shitty as it is. but that would break compatibility, so we dont do that (same with libnds & libgba)


this is true, and will always be true.
they would rather have homebrew be updated for the latest tools (and therefore get clean, good code), but it all depends on who you end up talking to and what your experience in coding is. i know wintermute can sound hostile, and i can only say that you don't know what has pushed him to be like that, or what kind of shitty code & setups they get to look at. that is a topic on its own, and this might not be the best place for it.

i would love to help people update their wii/gc code, and im sure i would be a friendly chat, so keep in mind that who you talk to matters more than "the whole organization" (which is a lot more people than you'd think). a few "sour" apples do not ruin a community (i mean, look at this shithole of a forum/website. most users would say it isn't a shithole. not me, but some users)

also, they kinda can't ignore the mirrors, seeing how many problems they cause them. be it bills and traffic control, or people copy-pasting bad code from some random place and then publicly saying it's all broken because they copied bad code from some random mirror/fork/project. things are a lot more complex than that, sadly.

i could go on and on about the inner workings of devkitPRO, and its problems, but thats kinda out of scope here. i'd love to have a decent convo about it, if youre open to it and want a meaningful discussion instead of just bashing.
Given nobody's stepped forward to fix Nintendont's issues that have plagued it on the latest versions of dkP, I can only assume it's very non-trivial to debug and fix, to the point where effort is better spent remaining on the same version of the tools rather than upgrading and attempting to validate everything afterwards. I have to deal with that enough at my actual job with its insanely long product lifecycles, and honestly console homebrew isn't much different. Sticking with older toolchains and containerizing them is a valid strategy. I've run out of patience with Intel/TI/etc.'s toolchain maintenance/licensing practices, which is why I especially don't appreciate a FOSS project playing hard to get with older versions/migration. Especially given FOSS tools *usually* keep old versions of everything around forever and want things preserved (e.g. this release of GCC from 1987: https://ftp.nluug.nl/languages/gcc/old-releases/gcc-1/gcc-0.9.tar.bz2 ).

Look, all I really want is the ability to bisect code generation myself, and roll back to older versions to facilitate that bisect by having some working version as a ground truth. To me, it's insane that dkP makes it that difficult to perform the standard software development process of a bisect between toolchain versions. Even though some versions are half broken, even TI currently attempts to provide older releases of their toolchain/IDE. I know dkP at least offers docker containers, and I applaud them for seemingly not deleting the older tags, but these are a fairly recent addition, and can we trust that these tags will never be deleted?
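To illustrate the point: the workflow being asked for here is just an ordinary binary search over toolchain releases. A minimal sketch, where the release list and the `builds_and_runs` predicate are hypothetical stand-ins for pulling a pinned toolchain version and building/testing the app:

```python
# Hypothetical sketch: bisect a chronologically sorted list of toolchain
# releases to find the first one that breaks a given homebrew project.
# In practice the predicate would fetch a pinned container tag, rebuild,
# and run the app on hardware or an accurate emulator.
def first_broken(releases, builds_and_runs):
    """Return the earliest release for which builds_and_runs() fails,
    assuming every release after a broken one is also broken."""
    lo, hi = 0, len(releases) - 1
    first_bad = None
    while lo <= hi:
        mid = (lo + hi) // 2
        if builds_and_runs(releases[mid]):
            lo = mid + 1          # still good: breakage is later
        else:
            first_bad = releases[mid]
            hi = mid - 1          # broken: look for an earlier break
    return first_bad

# Toy usage with a fake predicate: everything from "r54" on is broken.
releases = ["r46", "r47", "r53", "r54", "r55"]
print(first_broken(releases, lambda r: r < "r54"))  # -> r54
```

None of this is exotic; the only missing ingredient is access to the old releases themselves, which is exactly the complaint.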

This homebrew, for example, seems to have broken some time between the dkP releases of 2017 and 2022. I could be missing something, but it isn't doing anything obviously wrong that I could quickly identify:
https://gbatemp.net/threads/bad-apple-for-the-nintendo-ds.466504/
Edit: oh, and as a bonus, the new dkP build only crashes on hardware; it runs fine in emulators. Very easy to debug.

I will say, yes, libnx and libctru are pretty cool. It's just odd that something that should be as mature as libnds/the rest of its toolchain is still at a point where it can break between 2017 and 2022.

And while I could probably ask for support in this case, given dkP, or at least a certain individual in a position of power at dkP, has apparently completely barred people from support/contributing simply for being involved with TWLMenu, why should I even bother? Who's to say what other projects are "problematic" enough by simply existing to warrant a GitHub-wide block (maybe Nintendont, ULGX)? This is clearly an instance where I'd say "a few sour apples" *can* ruin the community.
Maybe I'm missing something here too, but at a surface level it appears as simple as "TWLMenu dev -> blocked". Locking a seemingly valid PR as "too heated" (one that seems to have been reworked and merged by dkP staff later) without any visible trouble doesn't reflect well on the organization.

Yes, people will write bad code and use tools poorly. And yes, I fully understand that well over a decade of that can drive a man to madness. But that is the point where one needs to step back and let others handle the first line of support, rather than advertising themselves as the first point of contact. Maybe document some of these edge cases or obscure migration steps on the wiki. idk. I've seen communities die at the hands of someone in power who couldn't take a back seat from user support, and while this may not be one of those situations yet, I've talked to enough people who actively avoid engaging with dkP for these very reasons.

Maybe the mirror-bills thing was valid pre-pacman, but given all of dkP's heavy infrastructure is hosted by github/docker now, bills aren't a reason not to scrape/mirror anymore, beyond the forum and wiki. People will always fuck up tutorials and whatever else, so I wouldn't use that as a reason to withhold older tools from people who know what they're doing.
 
Last edited by bobmcjr,

DacoTaco (Well-Known Member, Antarctica)
a lot to unpack here, so allow me to reply in parts
Given nobody's stepped forward to fix Nintendont's issues that have plagued it on the latest versions of dkP, I can only assume it's very non-trivial to debug and fix, to the point where effort is better spent remaining on the same version of the tools rather than upgrading and attempting to validate everything afterwards. I have to deal with that enough at my actual job with its insanely long product lifecycles, and honestly console homebrew isn't much different. Sticking with older toolchains and containerizing them is a valid strategy. I've run out of patience with Intel/TI/etc.'s toolchain maintenance/licensing practices, which is why I especially don't appreciate a FOSS project playing hard to get with older versions/migration. Especially given FOSS tools *usually* keep old versions of everything around forever and want things preserved (e.g. this release of GCC from 1987: https://ftp.nluug.nl/languages/gcc/old-releases/gcc-1/gcc-0.9.tar.bz2 ).
i don't know anything about any nintendont issues caused by a devkitPPC/libogc release. all i know about nintendont is that its code is messy and does some questionable stuff, so im not surprised it breaks somewhere. 8/10 times im sure it's one of the many warnings gcc throws at it. that said, cIOS also hasn't been updated in all these years, and devkitARM throws a shit ton of warnings on it too. i looked at its code once and it was full of questionable stuff. i sure hope starstruck will prove its stupidity, but thats a few years off.
i know old versions are usually available with FOSS tools, but devkitPRO was semi-forced to do what they did. do you know how many people did (and still are) knocking on their door after fucking up their installation environment in the most fucked up ways? hell, some libraries still require people to do so. you should never have to touch the devkitppc or libogc directories manually. ever.
in all these years, i have never had any of my applications break due to updates. at worst i had to deal with things being renamed or improved, but never magic "i break now" stuff, and this includes starstruck, which has a lot of low-level starlet code. breakage just means bad code was written, or code written without knowing what a compiler should and can do. and tools like nintendont and cIOS should be written by people who do know what it'll do, or be maintained version by version, by the very nature of the tool.
this has also been my annoyance with 9/10 tools that are here on gbafail.
also, with "dkP" i assume you mean devkitPPC? because devkitPRO is an organisation, not a tool.

Look, all I really want is the ability to bisect code generation myself, and roll back to older versions to facilitate that bisect by having some working version as a ground truth. To me, it's insane that dkP makes it that difficult to perform the standard software development process of a bisect between toolchain versions. Even though some versions are half broken, even TI currently attempts to provide older releases of their toolchain/IDE. I know dkP at least offers docker containers, and I applaud them for seemingly not deleting the older tags, but these are a fairly recent addition, and can we trust that these tags will never be deleted?
yes, yes you can trust those. the docker containers are meant for CI builds, because people were downloading and installing devkitPRO's toolkits in CI builds, which is stupid as hell. this is why the docker containers exist, and why they were created.
i personally can't see a reason why downgrading would be needed; just debug and fix the code. but in case you do have a nasty issue and want to revert: revert, fix the issue, and upgrade again. its stupid to stay on old versions, and this is what people are doing, while fucking up their installations...
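for reference, the intended CI usage is just running the build inside the published container instead of installing the toolchain in the job. a rough sketch of a GitHub Actions job (the image tag and `make` target are assumptions about a given project's setup, not a prescribed layout):

```yaml
# Hypothetical CI job: build inside devkitPRO's published container
# instead of downloading/installing toolkits in every CI run.
jobs:
  build:
    runs-on: ubuntu-latest
    container: devkitpro/devkitppc:latest  # pin a dated tag for reproducible builds
    steps:
      - uses: actions/checkout@v4
      - run: make
```

pinning a dated tag instead of `latest` is also exactly what makes the bisect workflow discussed above possible.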

This homebrew, for example, seems to have broken some time between the dkP releases of 2017 and 2022. I could be missing something, but it isn't doing anything obviously wrong that I could quickly identify:
https://gbatemp.net/threads/bad-apple-for-the-nintendo-ds.466504/
Edit: oh, and as a bonus, the new dkP build only crashes on hardware; it runs fine in emulators. Very easy to debug.
"dkP" is a company, not a tool. i assume you mean devkitARM if youre talking about the ds.
i can't say i know ds hardware well, but keep in mind that emulators are generally not really accurate. the only emulators i know are accurate enough to debug against are mgba & bgb. the issue is annoying, but the code seems to do a few weird things. i do know a few data access problems were fixed in devkitARM/libnds that allow for faster code and are more accurate to the hardware specs. if a problem comes up, i'd check what the difference between the app and the examples is. they are often used as a test to check if code still works.
im not saying this isn't a problem, don't get me wrong. i just think it gets overblown a bit because many don't know or realize what has changed in the toolkit. is this problem even reported? do they know about it? important questions imo

I will say, yes, libnx and libctru are pretty cool. It's just odd that something that should be as mature as libnds/the rest of its toolchain is still at a point where it can break between 2017 and 2022.
easy. the gba scene was a mess, and had a messy merge of toolkits to make devkitARM & devkitPRO in general. nobody really knew what they were doing back then. with every generation of console or handheld, step by step, they started to make better code, more optimized and with end users in mind. cooperation with devkitPRO has gotten us some advantageous stuff.
this sadly leaves libnds & libogc, for example, suboptimal and requiring some work (especially libogc, which has a shit ton of technical debt and is surrounded by a broken community... sadly).

And while I could probably ask for support in this case, given dkP, or at least a certain individual in a position of power at dkP, has apparently completely barred people from support/contributing simply for being involved with TWLMenu, why should I even bother? Who's to say what other projects are "problematic" enough by simply existing to warrant a GitHub-wide block (maybe Nintendont, ULGX)? This is clearly an instance where I'd say "a few sour apples" *can* ruin the community.
Maybe I'm missing something here too, but at a surface level it appears as simple as "TWLMenu dev -> blocked". Locking a seemingly valid PR as "too heated" (one that seems to have been reworked and merged by dkP staff later) without any visible trouble doesn't reflect well on the organization.
i personally don't know the history between TWLMenu & devkitPRO, so i can not comment on this. i'd need to check or dig for that to know what happened (from both sides) to say if it was right or not. it could have been that they were wrong, or somebody in TWLMenu was pissing off/working against what devkitPRO was building/doing. many options.
that said, generally if a PR is locked its because of a heated discussion, or because the PR has bad code that is not up to standards. again, i don't know what happened, so im just grasping at straws here and can't comment on it further.

Yes, people will write bad code and use tools poorly. And yes, I fully understand that well over a decade of that can drive a man to madness. But that is the point where one needs to step back and let others handle the first line of support, rather than advertising themselves as the first point of contact. Maybe document some of these edge cases or obscure migration steps on the wiki. idk. I've seen communities die at the hands of someone in power who couldn't take a back seat from user support, and while this may not be one of those situations yet, I've talked to enough people who actively avoid engaging with dkP for these very reasons.

Maybe the mirror-bills thing was valid pre-pacman, but given all of dkP's heavy infrastructure is hosted by github/docker now, bills aren't a reason not to scrape/mirror anymore, beyond the forum and wiki. People will always fuck up tutorials and whatever else, so I wouldn't use that as a reason to withhold older tools from people who know what they're doing.
i can't say youre wrong, and there is a reason why some of us have started doing some public work. ive been trying to improve relations with the wii communities. im here on this forum (sometimes, because fuck gbafail) and a few discords (including the awesome /r/wiihacks crew), and im a firm believer that playing more open book could work to our advantage. i realize that communications from devkitPRO have been bad, and i would like to see that changed, because i understand both sides and can't blame either side.
i still believe devkitPRO should be the point of contact, but they need people to help maintain it, because they alone can't, and shouldn't, do so. the organization needs a community of people who know what they are doing. what i mean by this is like how ive been trying to look after libogc, even though i don't consider myself worthy of it haha.
if you post something on that repo, i will probably be the first point of contact. imo, every part of the toolkits and libraries needs this, but its hell trying to find people whose vision aligns with devkitPRO and who know what they are saying.

also, where do you think the pacman packages and docker containers are hosted? on their own host, not github.
and clearly you don't have to deal with the questions of people who fuck up their installations or compilation by using said tutorials, or those who refuse to update because "their [shitty] code no longer compiles right" instead of reporting the issue (i currently have 55 unread messages on the forum and i am NOT looking forward to those)

Look, i believe the problem with devkitPRO is a problem on both ends of the line. both Wintermute being misunderstood because of the lack of knowledge/history, and people like you who bash on devkitPRO because they are doing weird things (why the hell do you need to compare toolkit/gcc versions? there is also ghidra for that), but who also don't fully understand what is going on behind the curtains...
 

Extrems (GameCube Wizard, www.extremscorner.org)
and clearly you don't have to deal with the questions of people who fuck up their installations or compilation by using said tutorials, or those who refuse to update because "their [shitty] code no longer compiles right" instead of reporting the issue (i currently have 55 unread messages on the forum and i am NOT looking forward to those)
What about when it's libogc internal code that breaks? :P
Unfortunately I got blocked from the organization for doing this as a temporary solution.

https://github.com/devkitPro/libogc/pull/103
 

bobmcjr (Well-Known Member, United States)
i don't know anything about any nintendont issues caused by a devkitPPC/libogc release. all i know about nintendont is that its code is messy and does some questionable stuff, so im not surprised it breaks somewhere. 8/10 times im sure it's one of the many warnings gcc throws at it. that said, cIOS also hasn't been updated in all these years, and devkitARM throws a shit ton of warnings on it too. i looked at its code once and it was full of questionable stuff. i sure hope starstruck will prove its stupidity, but thats a few years off.
i know old versions are usually available with FOSS tools, but devkitPRO was semi-forced to do what they did. do you know how many people did (and still are) knocking on their door after fucking up their installation environment in the most fucked up ways? hell, some libraries still require people to do so. you should never have to touch the devkitppc or libogc directories manually. ever.
in all these years, i have never had any of my applications break due to updates. at worst i had to deal with things being renamed or improved, but never magic "i break now" stuff, and this includes starstruck, which has a lot of low-level starlet code. breakage just means bad code was written, or code written without knowing what a compiler should and can do. and tools like nintendont and cIOS should be written by people who do know what it'll do, or be maintained version by version, by the very nature of the tool.
this has also been my annoyance with 9/10 tools that are here on gbafail.
also, with "dkP" i assume you mean devkitPPC? because devkitPRO is an organisation, not a tool.
I'm still of the belief that outright nuking older versions of the tools is never acceptable. Yes, managing support can be terrible. Yes, you can hide the older versions and make it very clear that no support is provided, or demand they retry in a clean modern devkitPRO environment set up in a very specific way. These are solved issues that other projects have dealt with. Just imo, if you want more competent FOSS people to join your FOSS toolchain project, stick to standard FOSS practices. There are times when user noise can be ignored.

In regards to Nintendont in particular, there are a couple of issues. But as I understand it, the big one stemming from devkitPRO's policies is this:
  • Nintendont runs on Starlet by hijacking IOS58 and patching it to load its own code
  • Starlet is big-endian ARM
  • devkitARM dropped support for big-endian ARM, which is 100% understandable
  • devkitPRO nukes the older releases and has DMCA'd FIX94's upload of a compatible devkitARM in the past
What is the solution here? The currently-shipped gcc itself still supports big-endian ARM, and Nintendont has just resorted to uploading the required precompiled static library/object files + CRT stub to the project itself. Hardly ideal, but that's the state of things.
(and fwiw the main issue on newer toolchain releases is Nintendont hanging/freezing the Wii when exiting from a game).
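For context on the big-endian point: GCC's ARM backend exposes big-endian code generation through ordinary target flags, so in principle a build only needs the right switches rather than a specially built toolchain. A hypothetical Makefile fragment to illustrate (the prefix, CPU selection, and linker script name are placeholders, not Nintendont's actual build settings):

```makefile
# Hypothetical fragment: building big-endian ARM objects for Starlet-style
# code with a standard arm-none-eabi GCC. All names here are placeholders.
PREFIX  := arm-none-eabi-
CFLAGS  := -mbig-endian -mcpu=arm926ej-s -Os -ffreestanding
LDFLAGS := -mbig-endian -nostartfiles -T starlet.ld

%.o: %.c
	$(PREFIX)gcc $(CFLAGS) -c $< -o $@
```

Whether a given devkitARM release ships the big-endian multilibs those flags need is a separate question, which is exactly the point of contention in this thread.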


yes, yes you can trust those. the docker containers are meant for CI builds, because people were downloading and installing devkitPRO's toolkits in CI builds, which is stupid as hell. this is why the docker containers exist, and why they were created.
i personally can't see a reason why downgrading would be needed; just debug and fix the code. but in case you do have a nasty issue and want to revert: revert, fix the issue, and upgrade again. its stupid to stay on old versions, and this is what people are doing, while fucking up their installations...
I'm just forced to question it because the older toolchains used to be available too. It doesn't instill a good sense of trust.

"dkP" is a company, not a tool. i assume you mean devkitARM if youre talking about the ds.
i can't say i know ds hardware well, but keep in mind that emulators are generally not really accurate. the only emulators i know are accurate enough to debug against are mgba & bgb. the issue is annoying, but the code seems to do a few weird things. i do know a few data access problems were fixed in devkitARM/libnds that allow for faster code and are more accurate to the hardware specs. if a problem comes up, i'd check what the difference between the app and the examples is. they are often used as a test to check if code still works.
im not saying this isn't a problem, don't get me wrong. i just think it gets overblown a bit because many don't know or realize what has changed in the toolkit. is this problem even reported? do they know about it? important questions imo
One of my main issues is that breakage is something I've encountered more often than not. Which is fine; I don't expect old projects to keep up with new toolchains. For any other project, at my job or elsewhere, I would then go construct a build environment using an old version of Ubuntu in a container, or whatever else. And then I get frustrated because I can't find the old versions of the tools. There are enough problems to deal with when attempting to maintain/upgrade code without having to fight the tools.

i personally don't know the history between TWLMenu & devkitPRO, so i can not comment on this. i'd need to check or dig for that to know what happened (from both sides) to say if it was right or not. it could have been that they were wrong, or somebody in TWLMenu was pissing off/working against what devkitPRO was building/doing. many options.
that said, generally if a PR is locked its because of a heated discussion, or because the PR has bad code that is not up to standards. again, i don't know what happened, so im just grasping at straws here and can't comment on it further.
The big thing is optics. I make a conscious effort to avoid projects where locking seemingly gets abused, as it hinders discussion and requires extra steps to address anything; where seemingly legitimate bug reports, feature requests the submitter intends to work on, or good-faith contributions get locked and closed with minimal public discourse.

i can't say youre wrong, and there is a reason why some of us have started doing some public work. ive been trying to improve relations with the wii communities. im here on this forum (sometimes, because fuck gbafail) and a few discords (including the awesome /r/wiihacks crew), and im a firm believer that playing more open book could work to our advantage. i realize that communications from devkitPRO have been bad, and i would like to see that changed, because i understand both sides and can't blame either side.
i still believe devkitPRO should be the point of contact, but they need people to help maintain it, because they alone can't, and shouldn't, do so. the organization needs a community of people who know what they are doing. what i mean by this is like how ive been trying to look after libogc, even though i don't consider myself worthy of it haha.
if you post something on that repo, i will probably be the first point of contact. imo, every part of the toolkits and libraries needs this, but its hell trying to find people whose vision aligns with devkitPRO and who know what they are saying.

also, where do you think the pacman packages and docker containers are hosted? on their own host, not github.
and clearly you don't have to deal with the questions of people who fuck up their installations or compilation by using said tutorials, or those who refuse to update because "their [shitty] code no longer compiles right" instead of reporting the issue (i currently have 55 unread messages on the forum and i am NOT looking forward to those)

Look, i believe the problem with devkitPRO is a problem on both ends of the line. both Wintermute being misunderstood because of the lack of knowledge/history, and people like you who bash on devkitPRO because they are doing weird things (why the hell do you need to compare toolkit/gcc versions? there is also ghidra for that), but who also don't fully understand what is going on behind the curtains...
Again, optics. I believe many of us perceive devkitPRO as rather hostile. And honestly, someone new shouldn't need to know Wintermute's history; users need a better onboarding experience if that's the case. New contributors generally don't go straight to Torvalds, who is also known for his wonderful ability to call out bullshit. They start on support forums or IRC and work their way up as needed. Something needs to change, and yes, a community is a very important thing to have. Again, yes, users are shitty. There's no changing that. But I am saying there are better ways to deal with the situation than how it's being handled now. Legitimate users of the toolchain are being negatively affected by devkitPRO's reactions to what they see as users doing shitty things.

I apologize, I do see the pacman packages are indeed hosted on the website. I was confused and assumed it was the pacman metadata only. Is the docker account sponsored by devkitPRO? As I understood it the docker hub images pull from docker hub's servers which should be free for FOSS.

I will reiterate: comparing builds between toolchains is not an uncommon practice. It will certainly point you in a better direction when comparing generated code, rather than just digging into Ghidra blind, and if you're lucky you can read through the commits of GCC itself/the supporting toolchain to figure out what in particular may have changed. More information is good.
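The comparison itself is mundane: dump both builds to assembly and diff after stripping the parts that always change between builds. A minimal sketch (it assumes you already have two `objdump -d` style listings on hand; the normalization is a toy illustration for fixed-width ARM encodings, not a complete tool):

```python
import difflib

def normalize(disasm):
    """Strip addresses and raw encodings from objdump -d style lines
    ("8000: e3a00000  mov r0, #0") so a diff highlights codegen changes
    rather than trivial address/layout shifts."""
    out = []
    for line in disasm.splitlines():
        parts = line.split(None, 2)  # "8000:", "e3a00000", "mov r0, #0"
        if len(parts) == 3 and parts[0].endswith(":"):
            out.append(parts[2].strip())
    return out

def codegen_diff(old, new):
    """Unified diff of two normalized disassembly listings."""
    return list(difflib.unified_diff(normalize(old), normalize(new), lineterm=""))

# Toy listings standing in for output from two toolchain versions.
old = "   8000: e3a00000  mov r0, #0\n   8004: e12fff1e  bx lr\n"
new = "   8100: e3a00001  mov r0, #1\n   8104: e12fff1e  bx lr\n"
print("\n".join(codegen_diff(old, new)))
```

Note the changed instruction shows up in the diff while the relocated-but-identical `bx lr` does not, which is exactly the signal you want when bisecting codegen changes.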
 

DacoTaco (Well-Known Member, Antarctica)
I'm still of the belief that outright nuking older versions of the tools is never acceptable. Yes, managing support can be terrible. Yes, you can hide the older versions and make it very clear that no support is provided, or demand they retry in a clean modern devkitPRO environment set up in a very specific way. These are solved issues that other projects have dealt with. Just imo, if you want more competent FOSS people to join your FOSS toolchain project, stick to standard FOSS practices. There are times when user noise can be ignored.

In regards to Nintendont in particular, there are a couple of issues. But as I understand it, the big one stemming from devkitPRO's policies is this:
  • Nintendont runs on Starlet by hijacking IOS58 and patching it to load its own code
  • Starlet is big-endian ARM
  • devkitARM dropped support for big-endian ARM, which is 100% understandable
  • devkitPRO nukes the older releases and has DMCA'd FIX94's upload of a compatible devkitARM in the past
What is the solution here? The currently-shipped gcc itself still supports big-endian ARM, and Nintendont has just resorted to uploading the required precompiled static library/object files + CRT stub to the project itself. Hardly ideal, but that's the state of things.
(and fwiw the main issue on newer toolchain releases is Nintendont hanging/freezing the Wii when exiting from a game).
see, and this is where misinformation and, euh... bullshit claims begin, and where things go wrong. devkitARM still has big-endian support, and compiling stuff for starlet still works just fine...
if i can create an open source IOS from scratch with the latest devkitARM, then the whole claim is bullshit.
as for the DMCA, i understand why they did it. it was a stupid move to upload that to the repo, and it was pushing people to do the same. that is not ok and very bad practice. i wouldn't be surprised if that is why they did it, but don't quote me on it. so the solution? just use the latest devkitARM and fix that code :)

I'm just forced to question it because the older toolchains used to be available too. It doesn't instill a good sense of trust.
time will tell, but im sure its a-ok. and until then, trust the system man.

One of my main issues is that breakage is something I've encountered more often than not. Which is fine; I don't expect old projects to keep up with new toolchains. For any other project, at my job or elsewhere, I would then go construct a build environment using an old version of Ubuntu in a container, or whatever else. And then I get frustrated because I can't find the old versions of the tools. There are enough problems to deal with when attempting to maintain/upgrade code without having to fight the tools.
thats the thing: you don't have to fight the tools. why don't you do what everyone else does: build, fix errors, build, fix errors, build, fix warnings, build, fix more warnings, test and fix. that said, i can understand the need for old tools then, but this is not why they dislike the whole old-versions thing. what they do dislike is exactly what was in nintendont, and guides and people telling random people to install the old versions. that is a huge nono, and its better to just install the latest and port the app. is it hard? yes, but you'll end up having wasted less time and probably refactored code that needed refactoring to begin with.

The big thing is optics. I make a conscious effort to avoid projects where locking seemingly gets abused, as it hinders discussion and requires extra steps to address anything; where seemingly legitimate bug reports, feature requests the submitter intends to work on, or good-faith contributions get locked and closed with minimal public discourse.
i can understand that, and optics is a thing. you see those cases from the end user's point of view; i see them from the devkitPRO point of view, as i know what they are fighting sometimes. you need to moderate discussions at some point, because people are putting things out there that are just... bad. i see this in my professional life too: developers that copy-paste shit from the internet (stack overflow, github, ...) without knowing what they are copy-pasting.
sometimes i would have loved for some things to be filtered out myself, so coworkers didn't copy-paste bad, unstable code "because it worked". sure, it works, but its like saying a diesel car with petrol in the tank would run. sure, it runs... for 5 minutes, and then it breaks. after which you could shout "BUT IT WORKED".
i won't deny the closing/hiding etc. stuff looks bad from the outside, but at some point you need to do something.

Again, optics. I believe many of us perceive devkitPRO as rather hostile. And honestly, I believe someone new shouldn't need to know Wintermute's history. Users need a better onboarding experience if that is the case. New contributors generally don't go straight to Torvalds who is also known for his wonderful ability to call out bullshit. They start on support forums or IRC and work their way up as needed. Something needs to change, and yes, a community is a very important thing to have. Again, yes, users are shitty. There's no changing that. But I am saying there are better ways to deal with the situation than how it is being handled now. Legitimate users of the toolchain are being negatively affected by devkitPRO's reactions to what they see as users doing shitty things.
this is why i want to change the face of libogc and devkitPPC by doing communications. i understand both sides, and want to get some hands to be shaken again haha.
that said, if i can filter stuff from wintermute, fincs & mtheall, i will because a filter is required, as you have said yourself.
this is also why we need a community, because we are all busy and all have a personal life. im not going to do 16h of dev work every day you know :P
devkitPRO has issues trusting other communities, because of all the drama they have witnessed (lol gc, wii & switch drama).

I apologize, I do see the pacman packages are indeed hosted on the website. I was confused and assumed it was the pacman metadata only. Is the Docker account sponsored by devkitPRO? As I understood it, the Docker Hub images are pulled from Docker Hub's servers, which should be free for FOSS.

I will reiterate, comparing builds between toolchains is not an uncommon practice. It will certainly point you in a better direction when comparing generated code rather than just digging into Ghidra blind, and if you're lucky, you can read through the commits of GCC itself/the supporting toolchain to figure out what in particular may have changed. More information is good.
nah, the packages are hosted on the server. this is why comcast connections had a lovely issue recently of not being able to download them: the 'download' in the hostname triggered comcast to inject their shit into the connection, fucking it all up. the docker is part of their stuff, yes.
is it really digging into ghidra blind though, if you have the elf from the release?
i also still believe it's better to rewrite code that is broken because of a toolchain update than anything else. if a toolchain update broke it, that shit is very unstable and unaware of compiler stuff
 
  • Like
Reactions: vgmoose

bobmcjr

Well-Known Member
Member
Joined
Apr 26, 2013
Messages
1,156
Trophies
1
XP
3,230
Country
United States
thats the thing, you don't have to fight the tools. why don't you do what everyone else does: build, fix errors, build, fix errors, build, fix warnings, build, fix more warnings, test and fix. that said, i can understand the need for old tools then, but this is not the reason they dislike the whole old versions thing. what they do dislike is exactly what was in nintendont and guides and people telling random people to install the old versions. that is a huge nono and it's better to just install latest and port the app. is it hard? yes, but you'll end up having wasted less time and probably refactoring code that needed refactoring to begin with.

Honestly, the last thing I have to say: fixing the code is a nice thing to strive for, but ultimately I don't see fixing and maintaining every single project as a realistically achievable goal. For projects whose core development is finished, especially ones of a certain complexity needing only a quick bugfix or small feature addition, it is not reasonable to demand that the project do major refactoring to upgrade to the latest and greatest. Especially in industry, you're not going to upgrade a working codebase targeting a platform 10+ years old. You're going to containerize the working build environment and ensure that build environment is portable (via VM or whatever else), backed up, and keeps working. "Build, fix" is too idealistic. When you're working with things of a certain complexity, you don't even necessarily know what much of the code is supposed to do in detail, and that is a prerequisite to even get to a state where you can confidently say you've correctly updated the codebase. Debugging spooky, subtle behavioral bugs that are exposed only on newer toolchains will take much, much more time than performing that quick bugfix or feature addition on the old toolchain that builds the software such that it works correctly.

Final thoughts: I and many others will still perceive devkitPRO as overbearing, overly opinionated, and hostile so long as the silent org-bans of talented developers such as Extrems and the TWLMenu team continue (and really, the burden of proof is on devkitPRO for at least these particular bans), and so long as devkitPRO believes the DMCA and other coercive methods are acceptable for strong-arming projects into following what devkitPRO believes are "best practices", rather than working with them and offering pointers and code changes/PRs publicly, while still understanding that some projects may not have the resources to constantly refactor and upgrade. If people want to try to put a square peg in a round hole, it's not your problem. Let them fail, and take a more passive approach to support.
 

Extrems

GameCube Wizard
Member
Joined
Jan 17, 2013
Messages
429
Trophies
1
Location
Quebec, Canada
Website
www.extremscorner.org
XP
2,992
Country
Canada
It's not a showstopper for me since I've worked independently from them for more than a decade (Swiss is my only project using devkitPPC), but it is problematic for sharing back to a wider audience, and so I've formally forked libogc (again) to do just that in a manner that works for me.

Despite all this, I've so far contributed USD$135 via Patreon.
 
Last edited by Extrems,

godreborn

Welcome to the Machine
Member
Joined
Oct 10, 2009
Messages
38,471
Trophies
3
XP
29,138
Country
United States
The one issue I'm aware of with nintendont and later versions of libogc, devkitPPC, and devkitARM is that it freezes upon exit, so you have to use the older versions. It can be frustrating, because you expect a project to fail to compile if something is wrong.
 

vgmoose

Well-Known Member
Member
Joined
Jan 31, 2016
Messages
360
Trophies
1
Website
github.com
XP
3,078
Country
United States
i still believe devkitPRO should be the point of contact, but they require people to help maintain it because they alone can't, and shouldn't, do so. the organization needs a community of people that know what they are doing. what i mean with this is like how i've been trying to look after libogc, even though i don't consider myself to be worthy of it haha.

as for the DMCA, i understand why they did it. it was a stupid move to upload that to the repo and it was pushing people to do the same. that is not ok and very bad practice. i wouldn't be surprised if that is why they did it, but don't quote me on it. so the solution? just use latest devkitARM and fix that code :)

Thanks for having the conversation, and I agree with much of what's been said, but these two statements still seem in conflict with each other. A community of people is going to do "stupid" things, and we should hold firmly that a DMCA should never be understandable or acceptable to issue in that kind of situation. The DMCA is the "bad practice" here, not FIX94 providing the open-source toolchain as a direct download. Leseratte's archive of older dkP tools here is one example of the community helping to try to maintain it, and implying that it shouldn't exist because the tools and code should just be updated instead is rejecting that community assistance.

It's less work to let the community sort it out, rather than trying to moderate and stamp out every occurrence of it. I'm really trying to sympathize here, but how is the fix to this problem not to just redirect them to dkp-pacman and the latest installation instructions?

My initial interaction with WM was on this thread, where I had posted Linux-oriented instructions for how to compile and "use" the still-in-development libnx (a question others were asking) to build a hello world binary. In those old instructions, I suggested some build steps that I now fully understand are undesired by the devkitPro organization. I did end up replacing the instructions with a redirect to an official dkP page on how to set up the toolchain (4 months after they were posted, following a private chat w/ WM). This redirect is actually still the top Google search result for "compiling libnx"... (I have voiced to a major switchbrew contributor the concern about needing to address their own Google SEO issues, but it's a separate topic).

However, the dkP wiki doesn't tell you how to build libnx, only how to install the latest pre-built version via pacman. Granted, if you know what you're doing, building it can be considered a pretty straightforward process. But, this is the Internet, and there are tons of tech-oriented articles for how to configure packages, download utilities, create binaries, maintain environments... Especially for a developer audience. Googling error codes and piecing together information online through sometimes-outdated-articles is a huge part of the developer skillset.

I won't link to my old instructions (the removed content can still be viewed through a git diff), but I did, just now, try to follow them. They're 5 years old! There's an outdated link to sourceforge, which you could replace with a different download source for devkitA64 (pacman repo, Leseratte's archive, mega download, etc...), and then there's a build error that actually has a search result on the libnx issues page. The topic is locked, and fincs states that building it isn't advised, but the original poster did comment a "fix" that solves the error (which essentially boils down to: put the other required tools on the PATH).

So we're arriving at the elephant in the room, which is... sometimes, when you're a developer, you need to have a different environment than others, depending on what it is you're looking to do. There are some in the FOSS scene who will build everything from source anyway, just to make sure it's all reproducible! On Linux, this might mean, for example, that you sometimes have a custom or temporary modification to the build folders and paths that you use. "Just" updating $DEVKITPRO to point to an older installation directory actually works very well!

To be even more specific, building libnx from master in this manner (or an older revision w/ an older devkitA64) is NOT like "saying a diesel car that had petrol put in the tank would run"; I would be hard pressed to imagine the "official" packages are built any other way, actually. And we're talking about disagreements over stuff like which environment variables and paths to use! Maybe on Windows this is a bigger deal that involves the GUI, but on Linux it's almost a relief to see an article suggest that you need to set some ENV variables or update your PATH; that might address your issue!

I'm glad they now have some docker images available, although I do think the community would be able to help here as well. Personally, I now do most of my development in docker (arm64 Linux) on an M1 Mac, but the actual release/development binaries are built in the cloud via GitHub Actions CI (x86_64 Linux). Since I target multiple platforms (A64+ARM+PPC) and need to have the same versions of the tooling across two different OS architectures, I use my own cross-arch docker image (the image is built once via a locally-hosted CI runner and is not a strain on their servers). The images and old tags are hosted for free by the GitHub Container Registry, and the date of each tag reflects the state of all the required tools at build time.

This is a common practice across the industry, both in the FOSS community and internally within private orgs. Spinning up your own image for you and others to use for no-dependency reproducible builds is almost the point of containerization technology. So that's how I personally have solved the "how to install and version these tools" issue. I've read and pored over WM's blog posts (1, and 2) on this topic. This is not a confusion of cats, except maybe if you were a diehard Windows user and don't see the value in these kinds of practices.

I can say for certain that what IS more like a "bad practice" is distributing an unsigned binary of another OS's package manager and asking the user to install it by overriding system policy. I can absolutely see why the effort of making this work across every OS and its package manager is not trivial; however, that's exactly the sort of stuff a community can help create. In the case of macOS, for instance, I would expect to be able to do something like "brew install dkp-homebrew-tools" to just pull in everything. And when there are known bugs, other users can make posts and articles about them, without having to worry about being copyright-struck or escalating things. If a community maintainer or set of instructions is outdated, HB devs can judge that for themselves by looking at the dates and comments to decide whether this guide or another guide makes more sense.

And sometimes stuff actually and unavoidably breaks. For instance, the way input was detected changed between Switch system firmware versions. In that case, literally the only advice that can be given is to pull in the latest release of libnx and re-compile all old binaries. If it's "maintained" via community wrappers, or even a frozen docker image, it's easy to see how this could create confusion or delays. I will readily admit that! However, this is not a unique problem either. All you have to do in that scenario is ask the user to check whether they're using the latest toolset, and if the wrapper is popular enough, it will either be updated or someone will fork and fix it.

Anyway, it's another long post, but I still believe it's better to treat dkP more like an actual organization and less like a set of FOSS tools that a community should be involved in, unless there's been a serious change in opinion on the issues surrounding re-hosting and redistributing older (or even current) versions of their tools. From this perspective, I can fully understand the concern around wanting to prevent derivatives and forks, in the same manner that Nintendo patches their consoles/games and doesn't allow downgrades. You want one cohesive set of tools that is endorsed, supported, and all works together. It just runs against the goal of community-led efforts to catalogue, document, and make certain "unsupported" use cases easier.
 
  • Like
Reactions: cristian64

vgmoose

Well-Known Member
Member
Joined
Jan 31, 2016
Messages
360
Trophies
1
Website
github.com
XP
3,078
Country
United States
At least for docker, pulling and pushing them to Github or Gitlab's container registry should be pretty straightforward. Really annoying move from Docker here...
 

impeeza

¡Kabito!
Member
Joined
Apr 5, 2011
Messages
6,397
Trophies
3
Age
46
Location
At my chair.
XP
18,900
Country
Colombia
Just now you became my lifesaver: Studious-pancake has an incompatibility with the latest devkitA64, so downgrading using your repo is the way to go now. THANKS A LOT. :grog:
 

DacoTaco

Well-Known Member
Member
Joined
Oct 8, 2017
Messages
196
Trophies
0
XP
1,299
Country
Antarctica
my C++ knowledge is not good enough, but this seems to be an issue of specs not being enforced by the compiler before, while now they are (since it's now using GCC 13)
see : https://stackoverflow.com/questions/11069108/uint32-t-does-not-name-a-type

if you want to use std::uint32_t you should include <cstdint>. this was always the case, but GCC used to implicitly include it.
this has been changed so that code needs to actually include what it will use.
see : https://gcc.gnu.org/gcc-13/porting_to.html

... this was 5 min of looking around and asking 1 C++ developer...
 
Last edited by DacoTaco,
  • Love
Reactions: impeeza

impeeza

¡Kabito!
Member
Joined
Apr 5, 2011
Messages
6,397
Trophies
3
Age
46
Location
At my chair.
XP
18,900
Country
Colombia
my C++ knowledge is not good enough, but this seems to be an issue of specs not being enforced by the compiler before, while now they are (since it's now using GCC 13)
see : https://stackoverflow.com/questions/11069108/uint32-t-does-not-name-a-type

if you want to use std::uint32_t you should include <cstdint>. this was always the case, but GCC used to implicitly include it.
this has been changed so that code needs to actually include what it will use.
see : https://gcc.gnu.org/gcc-13/porting_to.html

... this was 5 min of looking around and asking 1 C++ developer...
Thanks, yes, I am on the same train; I really don't know too much about C++. The strange thing is: if your environment has devkitA64 r21-3, the same code builds fine, but once you update to devkitA64 r22-1, the exact same Studious code generates that error.
 
