Giving ratings in itself maybe isn't the big issue here. It's rather that you announce you are doing it, that the ratings will matter, that they aren't entirely transparent, but that people can do a few things to improve them.
Also, to make sure that people believe in their importance, you establish that everyone monitors each other, while the state monitors you. Then you tell people that you are doing it that way. (And that it will impact your tax score.)
What you are producing is 'desired' behavior, a social expectation of surveillance, and self-censorship, together with a feeling of 'this is the new reality'. Kind of like what 'Facebook optimizing' meant to people, but without an opt-out.
And on the 'bad end' of the outcome, you can't buy insurance or board a train.
The ratings stuff is not so outrageous; people tend to do it all the time anyhow. It's actually the importance that the rating gets, and that you are aware of it.
And there's the 'virtual invisible hand' idea, which means that, as a state, you can influence markets without anyone knowing. I think I have to spell that out once more.
If you build in those cascading effects, you can, for example, harm a company that isn't your target in order to harm the target. And you can do it by flipping a few bits.
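To make the cascading-effect idea concrete, here is a minimal sketch. All company names, coupling weights, and the propagation rule are invented for illustration; the point is only that downgrading one entity's rating drags down the scores of entities that depend on it, without those entities ever being touched directly.

```python
# Hypothetical ratings and a made-up dependency graph: who depends on whom,
# with a coupling weight. None of this reflects any real system.
ratings = {"SupplierA": 0.9, "TargetCorp": 0.8, "Bystander": 0.7}
depends_on = {
    "TargetCorp": [("SupplierA", 0.5)],
    "Bystander": [("TargetCorp", 0.3)],
}

def propagate(ratings, depends_on, rounds=2):
    """Each round, a company's rating is dragged down in proportion to
    how far its dependencies sit below a 'good' rating of 1.0."""
    for _ in range(rounds):
        updated = dict(ratings)
        for company, deps in depends_on.items():
            for dep, weight in deps:
                updated[company] -= weight * (1.0 - ratings[dep])
        ratings = updated
    return ratings

# 'Flipping a few bits': downgrade SupplierA, not the actual target.
ratings["SupplierA"] = 0.2
after = propagate(ratings, depends_on)
print(after)  # TargetCorp (and even Bystander) drop without being touched
```

Under these toy numbers, TargetCorp's rating collapses and even the uninvolved Bystander takes a hit, purely through the dependency graph.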
Oh - and this is related:
https://gbatemp.net/threads/millenn...epts-fascinating-at-hacker-conference.527431/
If you act based on ratings that aren't transparent to people, it becomes a problem as well, because of non-explicit biases, and because people have no chance to counteract them. But that's maybe a lesser level of problem (because we got used to it a while ago). The bigger problem here is that we get a delegation of responsibility - the "computer said no" problem. You don't know why (no causality, just probability), and engineers will always 'tweak along' to try to make stuff 'work' (again). And they do that with messy data, based on a hunch. (Big Data issues.)
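A toy sketch of the "computer said no" problem, with entirely invented weights and proxy features: the applicant gets a yes/no, the features are correlations rather than causes, and no explanation is ever attached.

```python
# Invented proxy features and weights - illustration only.
# The applicant never sees these, only the final answer.
weights = {"postcode_risk": -0.8, "friend_avg_score": 0.6, "night_purchases": -0.4}

def computer_says(features, threshold=0.0):
    """Opaque linear score: correlation in, yes/no out, no reasons given."""
    score = sum(weights[k] * v for k, v in features.items())
    return "yes" if score >= threshold else "no"

applicant = {"postcode_risk": 0.9, "friend_avg_score": 0.5, "night_purchases": 0.7}
print(computer_says(applicant))  # 'no' - and nothing tells the applicant why
```

Nothing in the output says which feature sank the score, so there is nothing concrete for the person to counteract, and nobody in particular to hold responsible.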
Last edited by notimp,

