DefederateLemmyMl

  • Gen𝕏
  • Engineer ⚙
  • Techie 💻
  • Linux user 🐧
  • Ukraine supporter 🇺🇦
  • Pro science 💉
  • Dutch speaker
  • 0 Posts
  • 67 Comments
Joined 1 year ago
Cake day: August 8th, 2023

  • if it’s good enough for the majority of historians

    It isn’t. Historians would love to have independent evidence of the existence and crucifixion of Jesus, but there isn’t any… so most historians refrain from taking a position one way or the other. The ones that do have to make do with what little objective information they have, and the best they can come up with is: well, because of this embarrassing thing, it’s more likely that he did exist and was crucified than that he didn’t, because why would they make that up?

    That’s rather weak evidence, and far from “proof”.

    Not sure why you’d need more

    Well for one, because the more prominent people who have studied this have a vested interest in wanting it to be true. For example, John P. Meier, who posited this criterion of embarrassment that I outlined in my previous comment, isn’t really a historian but a Catholic priest, a professor of theology (not history), and a writer of books on the subject.



  • We are talking about addresses, not counters. An inherently hierarchical one at that. If you don’t use the bits you are actually wasting them.

    Bullshit.

    I have a 64-bit computer. It can address up to 18.4 exabytes, but it only has 32GB of RAM, so I will never use the vast majority of that address space. Am I “wasting” it?

    All the 128 bits are used in IPv6. ;)

    Yes, they are all “used”, but you don’t need them. We are not using 2^128 IP addresses in the world. In your own terminology: you are using 4 registers for a 2-register problem. That is much more wasteful in terms of hardware than using 40 bits to represent an IP address and wasting 24 bits.
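    To put these numbers in perspective, a quick back-of-the-envelope sketch (my own arithmetic, with an assumed device count, not a figure from either side of the argument):

```python
# A 64-bit machine can address 2**64 bytes, but on a 32 GB system
# only a tiny fraction of that space is ever backed by RAM.
address_space = 2**64           # ~1.8e19 bytes (about 18.4 EB)
installed_ram = 32 * 2**30      # 32 GiB
print(address_space // installed_ram)  # 536870912: the unused factor

# Likewise, 2**128 IPv6 addresses dwarf any plausible device count.
devices = 10**11                # ~100 billion devices, a generous guess
print(2**128 // devices)        # astronomically many addresses each
```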


  • you are wasting 24 bits of a 64-bit register

    You’re not “wasting” them if you just don’t need the extra bits. Are you wasting a 32-bit integer if your program only ever counts up to 1000000?

    Even so, when you do start to need them, you can gradually make the other bits available in the form of more octets. Like you can just define it as a.b.c.d.e = 0.a.b.c.d.e = 0.0.a.b.c.d.e = 0.0.0.a.b.c.d.e

    Recall that IPv6 came out just a year before the Nintendo 64

    If you’re worried about wasting registers it makes even less sense to switch from a 32-bit addressing space to a 128-bit one in one go.

    Anyway, your explanation is a perfect example of the “second system effect” at work. You get all caught up in the mistakes of the first system, in this case the lack of addressing bits, and then you go all out to correct those mistakes in your second system, giving it all the bits humanity could ever need before the heat death of the universe, while ignoring the real-world implications of your choices. And now you are surprised that nobody wants to use your 128-bit abomination.
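    The gradual-octet idea above can be sketched in a few lines. This is purely illustrative of the hypothetical scheme from this comment, not any real protocol: shorter addresses are implicitly left-padded with zero octets, so a.b.c.d.e parses the same as 0.a.b.c.d.e.

```python
def parse_addr(s: str, width: int = 8) -> int:
    """Parse a dotted address of up to `width` octets into an integer,
    left-padding with zero octets (hypothetical scheme, not a real RFC)."""
    octets = [int(p) for p in s.split(".")]
    if len(octets) > width or any(not 0 <= o <= 255 for o in octets):
        raise ValueError(f"invalid address: {s}")
    octets = [0] * (width - len(octets)) + octets
    value = 0
    for o in octets:
        value = (value << 8) | o
    return value

# Plain IPv4 addresses keep their usual value...
assert parse_addr("192.168.0.1") == 0xC0A80001
# ...and longer forms stay backward compatible via zero-padding.
assert parse_addr("1.2.3.4.5") == parse_addr("0.0.1.2.3.4.5")
```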




  • You don’t even have to NAT the fuck out of your network. NAT is usually only needed in one place: where your internal network meets the outside world, and it provides a clean separation between the two as well, which I like.

    For most internal networks there really are no advantages to moving to IPv6 other than bragging rights.

    The more I think about it, the more I find IPv6 a huge, overly complicated mistake. For the issue they wanted to solve, the worldwide public IP shortage, they could have just added an octet to IPv4 to multiply the number of available addresses by 256 and called it a day. Not every square cm of the planet needs a public IP.
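    The arithmetic behind “just add an octet”, for what it’s worth (my own figures):

```python
ipv4_total = 2**32                   # ~4.29 billion addresses
with_extra_octet = ipv4_total * 256  # one more octet = 40 bits total
print(f"{with_extra_octet:,}")       # 1,099,511,627,776 (~1.1 trillion)
```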


  • It’s when you have to set static routes and such.

    For example, I have a couple of locations tied together with a WireGuard site-to-site VPN, each with several subnets. I had to write wg config files and set static routes with hardcoded subnets and IP addresses. Writing the wg config files and getting it all working was already a bit daunting with IPv4, because I was also wrapping my head around WireGuard concepts at the same time. It would have been so much worse to debug with unreadable IPv6 subnet names.

    Network ACLs and firewall rules are another place where you have to work with raw IPv6 addresses. For example: let’s say you have a Samba share or proxy server that you only want to be accessible from one specific subnet. Then you have to use IPv6 addresses; you can’t solve that with DNS names.

    Anyway my point is: the idea that you can simply avoid IPv6’s complexity by using DNS names is just wrong.
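    For illustration, a minimal WireGuard peer section of the kind described, with every key, hostname and subnet invented for the example. The point is that AllowedIPs and routes must be written out as literal subnets, which is where long IPv6 prefixes get painful:

```ini
# wg0.conf — hypothetical site-to-site example (all values made up)
[Interface]
Address    = 10.10.0.1/24
PrivateKey = <site-A-private-key>
ListenPort = 51820

[Peer]
PublicKey  = <site-B-public-key>
Endpoint   = siteb.example.invalid:51820
# Subnets behind site B, hardcoded. With IPv6 these literals would
# look like fd12:3456:789a:1::/64 instead of 10.20.x.0/24.
AllowedIPs = 10.20.0.0/24, 10.20.1.0/24
```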



  • white-adjacent

    You keep using that word as if it will somehow transform the color yellow into white and make your argument for you. It won’t happen. It’s yellow, and not just pale yellow but an extremely saturated and bright version of yellow. It is clearly not a natural skin tone of any race unless that person is very ill.

    If you look at a white person’s skin tone, it’s not a saturated color and the hue is certainly not yellow. If anything, it’s pink. How you can arrive at “yellow = white-adjacent” just boggles my mind. There are literally billions of people on this planet who are not white and whose skin tone is closer to the yellow of a smiley face. By that logic you can call any color with sufficient luminosity white-adjacent. Bright blue: white-adjacent. Bright red: white-adjacent. Bright green: white-adjacent. Whee, look at all those white-adjacent colors:

    Anyway, I’m done with this discussion because I find you truly insufferable and I no longer want to spend my energy on it. If I can give you one piece of life advice: go find something worthwhile to get up in arms about.


    yellow skin tone is clearly adjacent to whiteness and this was well established before the aughts.

    No, it was not, and it still isn’t. The reason we think of the Simpsons as white is because the context makes it crystal clear that they’re a typical white suburban family, not because of their color. If Matt Groening had made the Simpsons green, purple or blue, we’d still think of them as white, and at the same time smileys and later emojis would still be yellow. At best there is some parallel evolution here, in the sense that Matt Groening and Harvey Ball both chose yellow for the same reason: because it is perceived as a bright, happy color.

    If you then associate yellowness exclusively with whiteness that’s purely a you thing, and honestly I find it pretty fucked up to see racial connotations like this in the most innocent things. Stop projecting your own prejudices.

    emojis caught widespread support in the mid/late aughts

    My argument is that bright yellow smileys have their own cultural lineage dating back to 1963, and it has nothing to do with skin color or race. Using these yellow smileys to express emotion in computer programs has been a thing since at least the mid nineties, not the mid/late aughts as you claim. The reason that it only appeared in the mid nineties and not earlier is technological and cultural. It has to do with the developing graphical and networking capabilities of computers around that time, and because smileys were popular in other aspects of culture around the same time. It has nothing to do with The Simpsons or other supposedly white cartoon characters.


    The Simpsons came out in 88. You are saying most of the world got the Simpsons about half a decade later. I would say this proves the exact opposite of your point and that it is a huge worldwide cultural phenomenon. I’m shocked that I’m having to defend the Simpsons as one of the most important and impactful TV shows of all time.

    My point is, I didn’t even hear about the Simpsons until I was in Uni, which puts it around 1995-ish, but I sure knew what a yellow smiley was.

    Emoticon != emoji. Characters don’t have skin tone colors. The first emojis didn’t come out until 1999

    I meant smileys really, because that’s what they were initially called. “Emoji” is a more recent, retroactive rebranding/appropriation of smileys by Apple when they launched the iPhone.

    Anyway, ICQ had yellow smiley faces 1996-ish. AIM had them 1997-ish. Yahoo! Pager, later Yahoo! Messenger, had yellow smileys in 1998. And MSN definitely had them in 1999.

    And then there’s friggin’ Minesweeper, which had a yellow smiley face all the way back in 1992:

    [Image: Minesweeper’s yellow smiley face button]

    I guess they all watched too much Simpsons?


  • My point is that everyone, who is being honest at least, interprets the Simpsons as being white. Do you think they’re white?

    Yes, from the context it’s crystal clear that they’re white, they could be purple or green and they’d still be “white”, but I think it’s not relevant in a discussion about emojis.

    As I said, it’s no surprise the default emoji is closest to white skin. Even if that association comes from the Simpsons, emojis didn’t come out until decades after the Simpsons became a cultural mainstay.

    My point is that yellow smiley faces have been a cultural mainstay independent of the Simpsons, and that you grossly overestimate the worldwide cultural impact of the Simpsons. Most of the non-US world didn’t even get the Simpsons on TV until the mid 1990s, while smiley face t-shirts and pins were all the rage in the late 1980s and 1990s. Source: I wore them myself when I was a kid, and from your comment I’m guessing you weren’t born yet.

    And decades? The Simpsons started in 1989, while the first instant messengers already had smiley face emoticons in the mid 90s.