

BGA, like in the photo, isn’t the only option. There are parts only slightly larger in hand-solderable packages (if you’re good at soldering)
In principle, yes. It depends on your Linux distribution though, I’m not familiar with the one you’re using
It’s asking for the ability to take screenshots, which is definitely suspicious unless there’s an in-app screenshot feature, and for the ability to launch Discord and interact with it. The catch is that it’ll be interacting using your Discord account, I expect. That means it’ll be able to see your conversations and all the servers you’re in, and it’ll also be able to post as you. Again, that’s the sort of thing which is very suspicious unless the app has some legitimate reason to hold conversations over Discord (maybe a bug report button, or a social feature).
Basically, I’d consider both of these alarming but not necessarily evidence that they’re spying on you to collect personal data or training data for an AI
True, that should have occurred to me. That’s what I get for not touching a compiler since the Christmas holidays started
That’s easy. The 2038 problem is fixed by moving to a 64-bit time_t, which is what you get on 64-bit processors running 64-bit applications. Just about everything built in the last 15 years has already got the fix
Using that fix, the problem doesn’t come up again for roughly 292 billion years
Stories about events we can identify in the archeological record, probably. Forest fires, major battles, geological events, things like that which can be used to line the stories up with specific real-world events
No, I’m arguing that the extra complexity is something to avoid because it creates new attack surfaces, new opportunities for bugs, and is very unlikely to accurately deal with all of the edge cases.
Especially when you consider that the behaviour we have was established well before there was even a Unicode standard that could have been applied, and that the alternative you want isn’t unambiguously better than what we have now.
“What is language” is a far more insightful question than you apparently intended, because our collective best answer to it right now is the Unicode standard, and even that’s not perfect. Making the very core of the filesystem deal with that is a can of worms which a competent engineer wouldn’t open without a very good reason, and at best I’m seeing a weak and subjective reason here.
The reason, I suspect, is fundamentally because there’s no relationship between the uppercase and lowercase characters unless someone goes out of their way to create it. That requires that the filesystem contain knowledge of the alphabet, which might work if all you wanted was to handle ASCII in American English, but isn’t good for a system which needs to support the whole world.
In fact, the UNIX filesystem isn’t ASCII. It’s not Unicode either. UNIX uses arbitrary byte strings, with special significance given to a very small number of bytes (just ‘/’ and ‘\0’, I think). That means people are free to label files in whatever way they like, and their terminals or other applications are free to render them in whatever way seems appropriate, without the filesystem having to understand Unicode.
Adding case insensitivity would therefore mean adding significant, unnecessary complexity to the filesystem drivers, and we’d probably take a big step backwards in support for other languages
I couldn’t find the actual pinout for the 8 pin package, but the block diagrams make me think they’re power, ground, and 6 general purpose pins which can all be GPIO. Other functions, like ADC, SPI and I2C (all of which it has) will be secondary or tertiary functions on those same pins, selected in software.
So the actual answer you’re looking for is basically that all of the pins are everything, and the pinout is almost entirely software defined