• barsquid@lemmy.world
      8 months ago

      Going from 1 TB to 2 TB of SSD on this iPad is somehow $400? That is a clown price, but per gigabyte it is a 50% discount on going from 512 GB to 1 TB. What the actual fuck?

        • Valmond@lemmy.world
          8 months ago

          8 GB should, like, be illegal, lol. Wonder if you can upgrade it (or is it soldered), or if it’s just for the landfill.

          • bamboo@lemm.ee
            8 months ago

            All of Apple’s M-series chips have the RAM right on the same package as the rest of the SoC. Not upgradeable, but also much faster than traditional off-package memory.
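The speedup from on-package memory is mostly a matter of bus width and transfer rate. A rough sketch of the arithmetic (the interface widths and speeds below are common illustrative configurations, not figures quoted in this thread):

```python
# Theoretical peak DRAM bandwidth: (bus width in bytes) * (transfers per second).

def peak_bandwidth_gbs(bus_width_bits: int, mega_transfers_per_s: int) -> float:
    """Approximate peak bandwidth in GB/s for a DRAM interface."""
    bytes_per_transfer = bus_width_bits / 8
    return bytes_per_transfer * mega_transfers_per_s / 1000  # MB/s -> GB/s

# A typical laptop with dual-channel DDR4-3200 (128-bit total bus):
print(peak_bandwidth_gbs(128, 3200))   # 51.2 GB/s

# A wide on-package LPDDR5-6400 interface (512-bit, in the ballpark of
# the larger M-series parts):
print(peak_bandwidth_gbs(512, 6400))   # 409.6 GB/s
```

Same DRAM technology, roughly 8x the peak bandwidth, purely from soldering a much wider interface right next to the SoC.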

            • Valmond@lemmy.world
              7 months ago

              Funnily enough, Apple marketed their slower RAM as something better back in the nineties, when memory speed was king. Now that nobody cares about memory speed (I bet most people confuse bandwidth with speed, forgetting latency too) but we all like having lots of it because we run lots of programs and apps, Apple is doing the reverse.

              Lol.

              • bamboo@lemm.ee
                7 months ago

                In this case I was referring to bandwidth and latency, both of which on-package memory helps with. It does make a difference in memory-intensive applications, but the majority of people would never notice it. Also, Apple will absolutely give you a ton of memory; you just have to pay for it. They offer 128 GB on the MacBook Pro, and it’s unified, so the GPU has full access to it, which makes it surprisingly good for running LLMs locally, for example.
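A back-of-the-envelope estimate (my own sketch, not anything from Apple) of why a big unified-memory pool matters for local LLMs: the weights alone need roughly parameter count times bytes per weight.

```python
# GB needed just to hold the model weights (ignores KV cache and activations).

def model_weight_gb(params_billions: float, bits_per_weight: int) -> float:
    """Approximate memory footprint of an LLM's weights, in GB."""
    return params_billions * 1e9 * bits_per_weight / 8 / 1e9

# A 70B-parameter model:
print(model_weight_gb(70, 16))  # 140.0 GB at fp16 -- beyond any consumer GPU's VRAM
print(model_weight_gb(70, 4))   # 35.0 GB at 4-bit -- fits comfortably in 128 GB unified memory
```

Since the memory is unified, the GPU can address that whole pool directly instead of being capped at a discrete card's 16-24 GB of VRAM.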