• 9point6@lemmy.world · +156 / -2 · 2 days ago

    We really need someone other than Qualcomm & Apple to come up with lossless Bluetooth audio codecs.

    TBF the whole Bluetooth audio situation is a complete mess

      • cmnybo@discuss.tchncs.de · +20 · 1 day ago

        That’s what happens when you have a 25-year-old protocol and try to maintain backwards compatibility through all of the versions.

        • Comment105@lemm.ee · +6 · edited 23 hours ago

          The world of audio would be more of a mess if Bluetooth had been developed, scrapped, and replaced according to what seem to be your recommendations. I’m glad they did it the way they did.

          It’s not time for change. Just alternatives for snobs.

      • tabularasa@lemmy.ca · +11 · 1 day ago

        Can we name a more poorly implemented protocol? Probably. One used as much as Bluetooth? Probably not.

      • Onomatopoeia@lemmy.cafe · +7 · 2 days ago

        Comes from being a compromise “standard”. The name says it all: it’s named after a king who brought multiple tribes together.

    • Natanael@slrpnk.net · +64 / -1 · 2 days ago

      Opus! It’s a merge of a codec designed for speech (from Skype!) with one designed for high-quality audio by Xiph (the same people who made Ogg Vorbis).

      It needs some more work on latency, though: by default it prefers bigger frames than Bluetooth packets like, but I’ve seen there’s work on standardizing a version that fits Bluetooth. Google even has it implemented now on Pixel devices. (Rough encoder sketch at the end of this comment.)

      Fully free codec!
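
      A minimal sketch of that frame-size/latency trade-off, assuming an ffmpeg build that includes the libopus encoder (file names are placeholders, not anything from a real Bluetooth stack):

      ```python
      import subprocess

      # Hypothetical transcode: FLAC source -> Opus at 192 kbps with short frames.
      # Smaller frames mean lower per-packet latency, which is what a Bluetooth
      # link cares about, at a small cost in coding efficiency.
      subprocess.run(
          [
              "ffmpeg", "-y",
              "-i", "input.flac",           # placeholder source file
              "-c:a", "libopus",            # encode with libopus
              "-b:a", "192k",               # bitrate widely treated as transparent
              "-frame_duration", "10",      # 10 ms frames instead of the 20 ms default
              "-application", "lowdelay",   # tune the encoder for low latency
              "output.opus",
          ],
          check=True,
      )
      ```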

        • Natanael@slrpnk.net · +24 / -10 · edited 12 hours ago

          Nobody needs lossless over Bluetooth

          Edit: plenty of downvotes from people who have never done ABX tests comparing high-quality lossy versus lossless

          At high bitrates, lossy is literally indistinguishable. There’s math to prove it:

          https://en.wikipedia.org/wiki/Nyquist–Shannon_sampling_theorem

          At 44.1 kHz / 16-bit, with over 192 kbps and a good encoder, your ear literally can’t physically discern the difference.
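
          For scale, a back-of-the-envelope comparison (my own numbers, not from any spec):

          ```latex
          % Nyquist: sampling at f_s captures everything below f_s / 2
          f_s = 44.1\ \text{kHz} \;\Rightarrow\; f_{\max} = 22.05\ \text{kHz} \;>\; \approx 20\ \text{kHz (typical upper limit of hearing)}

          % Raw CD-quality PCM bitrate vs. a "transparent" lossy bitrate
          44\,100\ \tfrac{\text{samples}}{\text{s}} \times 16\ \text{bit} \times 2\ \text{ch} = 1411.2\ \text{kbps}
          \quad \text{vs.} \quad 192\ \text{kbps (Opus)}
          ```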

            • Natanael@slrpnk.net · +2 · edited 11 hours ago

              Why use lossless for that when transparent lossy compression already does that with so much less bandwidth?

              Opus is indistinguishable from lossless at 192 kbps. Lossless needs roughly 800–1400 kbps. That’s a savings of between 4x and 7x with the exact same perceived quality (rough ratios at the end of this comment).

              Your wireless antenna often draws more energy in proportion to bandwidth use than the decoder chip does, so using high quality lossy even gives you better battery life, on top of also being more tolerant to radio noise (easier to add error correction) and having better latency (less time needed to send each audio packet). And you can even get better range with equivalent radio chips due to needing less bandwidth!

              You only need lossless for editing or as a source for transcoding; there’s no need for it when you’re just listening to media.
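
              Quick ratios from the numbers above:

              ```latex
              \frac{800\ \text{kbps}}{192\ \text{kbps}} \approx 4.2
              \qquad
              \frac{1400\ \text{kbps}}{192\ \text{kbps}} \approx 7.3
              ```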

              • gaylord_fartmaster@lemmy.world · +2 · 11 hours ago

                This has strong “nobody needs a monitor over 120Hz because the human eye can’t see it” logic. Transparency is completely subjective and people have different perceptions and sensitivities to audio and video compression artifacts. The quality of the hardware playing it back is also going to make a difference, and different setups are going to have a different ceiling for what can be heard.

                The vast majority of people are genuinely going to hear zero difference between even 320 kbps and a FLAC, but that doesn’t mean there actually is zero difference; you’re still losing audio data. Even going from a 24-bit to a 16-bit FLAC can have a perceptible difference.
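
                For the bit-depth point, the usual rule of thumb for PCM dynamic range is roughly 6 dB per bit:

                ```latex
                \text{DR} \approx 6.02 \cdot N\ \text{dB}
                \qquad 16\ \text{bit} \approx 96\ \text{dB}, \quad 24\ \text{bit} \approx 144\ \text{dB}
                ```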

                • Natanael@slrpnk.net · +3 · edited 6 hours ago

                  The Nyquist-Shannon sampling theorem isn’t subjective, it’s physics.

                  Your example isn’t great because it’s about misconceptions about the eye, not about physical limits. The physical limits for transparency are real and absolute, not subjective. The eye can perceive quick flashes of objects that take less than a thousandth of a second. The reason we rarely go above 120 Hz for monitors (other than cost) is that differences in continuous movement can barely be perceived, so it’s rarely worth it.

                  We know where the upper limits for perception are. The difference typically lies in the encoder/decoder or the physical setup, not in the information a good codec is able to embed at that bitrate.

        • Natanael@slrpnk.net · +5 · 1 day ago

          That’s more than a codec question; it’s a Bluetooth audio profile question. Bluetooth LE Audio should support higher quality (including with Opus).

    • BlackEco@lemmy.blackeco.com · +23 · 2 days ago

      Wait, did Apple implement its own codec? I thought even the AirPods Max used AAC, which is lossy.

      As for Qualcomm, only aptX Lossless is actually lossless, and I’m not aware of many products supporting it (most support aptX HD at most).

      • cogman@lemmy.world · +24 · 2 days ago

        Yeah, the problem (imo) isn’t lossy v lossless. It’s that the supported codecs are part of the Bluetooth standard and they were developed in like the 90s.

        There are far better codecs out there and we can’t use them without incompatible extensions on Bluetooth.

        • Natanael@slrpnk.net · +15 / -1 · 2 days ago

          There’s a push for Opus now; it’s the perfect codec for Bluetooth because it’s a single codec that covers the whole spectrum from low-bandwidth speech to high-quality audio, and it’s fully free.

            • Natanael@slrpnk.net · +7 · 1 day ago

              Transparency is good enough; it’s intended to be a good fit for streaming, not for masters used in editing.

                • Natanael@slrpnk.net · +11 · 1 day ago

                  You literally cannot distinguish 192 kbps Opus from true lossless, not even with movie-theater-grade speakers. You only benefit from lossless if you’re editing, applying multiple effects, etc., which you will not do at the receiving end of a Bluetooth connection.

      • KoalaUnknown@lemmy.world · +8 · 2 days ago

        The newer H2 SoC AirPods support ALAC, Apple’s lossless codec; however, their phones don’t yet support it, so the only way to use it is with the Vision Pro.

        • zod000@lemmy.ml · +5 / -1 · 2 days ago

          AFAIK, ALAC will not actually be lossless over Bluetooth for the same reason LDAC can’t be lossless: there simply isn’t enough bandwidth. That doesn’t mean it won’t sound great or perhaps work better than LDAC.

            • zod000@lemmy.ml · +5 · 2 days ago

              Oh, so they aren’t on Bluetooth at all? That’s an entirely different story; thanks for the info.

              • KoalaUnknown@lemmy.world · +2 · 2 days ago

                Yes, the protocol used is currently proprietary. That being said, so was ALAC at launch, and they later made it open source and royalty-free.

    • conicalscientist@lemmy.world · +1 / -2 · 2 days ago

      Well, Bluetooth doesn’t carry enough bitrate to accomplish this. Besides, Apple won’t and doesn’t need to, because their AAC encoder is superior. There is no other Bluetooth codec that comes even close. Every codec that claims to be the best one yet is more marketing than anything.

      Vendors reframed the narrative so SBC is seen as dog shit and they can push their own codecs as cutting-edge new tech. In reality, SBC isn’t that bad and the vendor codecs aren’t that good. And Apple has some kind of secret sauce in their AAC encoder that results in really good audio reproduction.

      As far as I’ve seen, most of the gimmicky codecs are spins on existing old technology. AAC itself is old too, but at least one vendor, Apple, has focused on making its implementation good. We don’t need another standard+1; we just need a common standard done well. If only Apple would open theirs.

      • JohnEdwa@sopuli.xyz · +4 · 1 day ago

        BT 5 has a max bandwidth of 2 Mbps, which would in theory be enough for “CD quality”, i.e. 44.1 kHz / 16-bit raw uncompressed audio, as that’s around 1.4 Mbps. In real-life conditions it isn’t. AFAIK aptX Lossless gets close by doing some compression.

        But if you go full audiophile and start demanding lossless 192 kHz / 24-bit audio, that’s over 9 Mbps uncompressed and not even remotely possible over BT no matter what you’d try (rough math below).
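
        Rough uncompressed-PCM math behind those figures (stereo assumed):

        ```latex
        44\,100 \times 16\ \text{bit} \times 2\ \text{ch} \approx 1.41\ \text{Mbps}
        \qquad
        192\,000 \times 24\ \text{bit} \times 2\ \text{ch} \approx 9.2\ \text{Mbps}
        ```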