• 1 Post
  • 81 Comments
Joined 3 years ago
Cake day: January 17th, 2022





  • The general consensus amongst the Android community is that rooting is detrimental to privacy. In a sense, I agree with them since privilege escalation because of human error becomes a much bigger threat if the user has root access.

    No, that’s BS. It entirely depends on your “threat model” just like security.

    Namely, if you go full OSHW/FLOSS and yet you volunteer your data to Facebook.com (or whatever that website is called today), then you have no privacy. It’s not a technical problem, it’s a behavior problem.

    If your threat model is about a government hiring dedicated staff to know what you are up to, or about the infrastructure you rely on not being trustworthy, then rooting is the last of your problems.

    I’m not saying you shouldn’t worry, but I don’t see the relevance of rooting Android in that situation. Rooted or not, your modem behaves the same; you’re still at the mercy of the drivers.

    I recommend you check projects like Precursor (at https://precursor.dev/, redirecting to the CrowdSupply page) which try to tackle, if I understood correctly, the kind of worry you have, namely actually understanding the entire stack.

    That being said, even in such a context, you still rely on some infrastructure to relay messages to others, so you need that infrastructure and the recipients to also respect your privacy. If not (which would be a fair assumption) then at least you must understand the cryptographic primitives you rely on… and if you don’t (which most people don’t, me included, despite my interest in the mathematics behind that, in particular one-way functions) then you have to place some trust in the public research in the domain.

    So… I do have a Precursor and tinker with it, plus a PinePhone and a PinePhone Pro, had an iOS phone until recently, switched to (rooted) /e/OS, and my personal position is that while interacting with others (and a mobile is 100% about that) one has to be pragmatic about their choices.



  • the world runs off GitHub whether we like it or not

    It doesn’t and we don’t like it anyway.

    PS: to clarify, yes GitHub is wildly popular but, and the kernel is a particularly interesting example, it does not host ALL projects, only a lot of popular ones. A lot of very popular ones are also NOT there but rather on their own Git hosting, mailing lists, GitLab instances, Gitea, etc. It’s a shortcut, I understand that, but by asserting it as “truth” it hides a reality that is quite different, one showing that reliable alternatives do exist.




  • main difference between raster graphics and vector graphics was the quality

    It’s not. The primitives, the most basic constitutive building blocks, are different: for raster it’s the pixel (a mix of colors, e.g. red/green/blue) whereas for vector it’s the… vector (elements defined by relative positions, e.g. a line, circle, rectangle or text, to start with).

    This is a fundamental distinction in how you interact with the content. For raster you basically paint over pixels, changing their values, whereas for vector you change the values of elements and add/remove elements. Both can be lossless though (vector always is), since raster can be stored with no compression or with lossless compression. That being said, raster does have a grid size (i.e. how many pixels are stored, e.g. 800x600) whereas vector does not, letting you zoom infinitely and see no aliasing on straight lines.
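
    To make that concrete, here is a rough sketch in the browser (everything in it is created purely for illustration, not taken from any particular page): with raster you rewrite pixel values on a fixed grid, with vector you change attributes of elements.

    ```javascript
    // Raster: you change the values of individual pixels on a fixed grid.
    const canvas = document.createElement('canvas');
    canvas.width = 800;                         // the grid size is fixed, e.g. 800x600
    canvas.height = 600;
    const ctx = canvas.getContext('2d');
    const pixel = ctx.getImageData(0, 0, 1, 1); // read a single pixel
    pixel.data.set([255, 0, 0, 255]);           // set it to red (R, G, B, A)
    ctx.putImageData(pixel, 0, 0);              // paint it back

    // Vector: you change attributes of elements, or add/remove elements.
    const svgNS = 'http://www.w3.org/2000/svg';
    const svg = document.createElementNS(svgNS, 'svg');
    const rect = document.createElementNS(svgNS, 'rect');
    rect.setAttribute('width', '100');
    rect.setAttribute('height', '50');
    rect.setAttribute('fill', 'red');           // stays sharp at any zoom level
    svg.appendChild(rect);
    ```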

    Anyway, yes, it’s fascinating. In fact you can even modify SVG straight from the browser, no image editor or text editor needed, thanks to your browser inspector (easy to change the color of a rectangle, for example) or even the console itself: via JavaScript and contentDocument you can change a lot more programmatically (e.g. change the color of all rectangles).
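
    For example, here is a minimal sketch you could paste into the browser console, assuming the page embeds an SVG via an <object> tag; the id and file name below are made up purely for illustration:

    ```javascript
    // Assumed embed (hypothetical id and file name):
    // <object id="drawing" type="image/svg+xml" data="drawing.svg"></object>
    const svgDoc = document.getElementById('drawing').contentDocument;

    // Change the fill color of every <rect> element in the SVG programmatically.
    svgDoc.querySelectorAll('rect').forEach(rect => {
      rect.setAttribute('fill', 'tomato');
    });
    ```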

    It’s a lot of fun to tinker with!


  • Switched from iOS (iPhone XS) to Android (/e/OS on CMF Nothing, installed by Murena) and 0 regret.

    I switched the same day but I didn’t transfer all content, only contacts and 2FA auth, and installed most apps I needed. The transition was very easy thanks to Firefox Account and because most of what I really use is Web based anyway (e.g. HomeAssistant for my self-hosted IoT setup). KDE Connect was indeed a great surprise, I thought it’d be the same as on iOS but it’s a LOT more functional. Also, using Termux (rather than iSH on iOS) with access to the storage made tinkering way easier and more powerful.

    My new phone is actually 1/3rd of the price of the flagship I bought 6 years ago… but they feel the same. I like that a lot because I do NOT want my phone to “feel” special, I want it to “just” be a functional piece of tech, valuable only for what it does, not what it “is”. It’s not a totem, it’s just a thing I rely on. So yes switching made that very striking.

    Overall, if you want to “just” move away from iOS or Googled Android, I find Murena’s value proposition to be on point.




  • never could get away from Windows entirely. Especially for gaming, and a few critical apps.

    Been gaming exclusively on Linux now for a few years, including in VR. Just a few hours ago, before my work day, I was playing Elden Ring with a controller. 0 tinkering: System key, “EL”, [ENTER], then play. So… unless you need kernel-level anti-cheat, Linux is pretty good for gaming nowadays.

    Same for the few “critical” apps: I don’t know what these are, but rare are the ones without an equivalent and/or that don’t work with Wine, sometimes even better than on Windows.

    Anyway: Debian. Plain and simple, no BS with a mixed bag of installers (but you can still use AppImage or am or even nix whenever you want to). It just works and keeps on working.





    The propaganda aspect is important so I’m adding this as a reply rather than yet another edit.

    This research is interesting. What the article tries to do isn’t clarifying the work but rather putting a nation “first”. Other nations do that too. That’s not a good thing. We should celebrate research as a better understanding of our world, both natural and engineered. We should share what has been learned and build on top of each other’s work.

    Now when a nation, be it China, or the US, or any other country, says it is “first” and “ahead” of everybody else, it’s to bolster nationalistic pride. It’s not to educate citizens on the topic. It’s important to be able to disentangle the two regardless of the source.

    That’s WHY I’m being so finicky about facts in here. It’s not that I care about the topic particularly, rather it’s about the overall political process, not the science.


  • Thanks for taking the time to clarify all that.

    It’s not a typo because the paper itself does mention 3090 as a benchmark.

    I do tinker with FPGAs at home, for the fun of it (I’m no expert, but the fact that I own a few already shows that I know more about the topic than most people, who don’t even know what it is or what it’s for), so I’m quite aware of what some of the benefits (and trade-offs) can be. It’s an interesting research path (again, otherwise I wouldn’t even have invested my own resources to learn more about that architecture in the first place) so I’m not criticizing that either.

    What I’m calling BS on… is the title and the “popularization” (and propaganda, let’s be honest here) article. Qualifying a 5-year-old chip as a flagship (when, again, it never was), and implying what the title does, is wrong. It overblows otherwise interesting work. That being said, I’m not surprised; OP shares this kind of thing regularly, to the point that I ended up blocking him.

    Edit: not sure if I really have to say so but the 4090, in March 2025, is NOT the NVIDIA flagship, that’s 1 generation behind. I’m not arguing for the quality of NVIDIA or AMD or whatever chip here. I’m again only trying to highlight the sensationalization of the article to make the title look more impressive.

    Edit2: the 5090, in March 2025 again, is NOT even the flagship in this context anyway. That’s only for gamers… but here the article, again, is talking about “energy-efficient AI systems” and for that, NVIDIA has an entire array of products, from Jetson to GB200. So… sure the 3090 isn’t a “bad” card for a benchmark but in that context, it is no flagship.

    PS: taking the occasion to highlight that I do wish OP would actually go to China, work and live there. If that’s their true belief and they can do so, rather than solely “admiring” a political system from the outside, from the perspective of not participating in it, they should give up their citizenship and actually move to China.