• @carl_dungeon@lemmy.world
    10 · 1 year ago

    I mean, kudos to them for participating in the arms race, but again, M3s are shipping right now, not in 6 months, and judging by the only 11-month window between M2 and M3, we’ll see another bump from Apple next year too. Further, the article goes on and on about how the 80-watt TDP is for the whole system rather than just the chip, but then also points out that no one knows the chip’s power draw at all, yet still goes on and on about efficiency.

    I’m happy that other chip makers are jumping into the race; even Intel has gotten a kick in the pants to stop sitting around releasing high-priced garbage for lack of competition. But all this hype around the X Elite is premature, a marketing gimmick to keep people interested for 6 more months until something finally ships.

    • @morrowind@lemmy.mlOP
      1 · 1 year ago

      It’s still fairly impressive for their first chip.

      Also, Apple just moved to a new node with the M3. I doubt next year’s will be as big a jump.

      • @carl_dungeon@lemmy.world
        1 · 1 year ago

        Compared to what, the M1? The M1 and M2 use the same 5nm process, with the M2 adding small evolutionary improvements: https://www.tomsguide.com/face-off/apple-m2-vs-m1

        The M3 has been another fairly substantial leap forward. I got an M3 MacBook last week and I’m seeing insane speedups over the octa-core i9 it’s replacing, especially for heavy compute tasks like video encoding and ML workloads, while it generates next to no heat, rarely spins up the fans, and has crazy battery life. Generating Stable Diffusion images went from about a minute to 5 seconds, and Handbrake encodes went from about 35fps to 196fps. I fired up Steam and gamed for 4 or 5 hours and barely used 25% of the battery. 95% of the tasks I do don’t even turn on the fans; it’s totally silent passive cooling most of the time, and the whole thing is barely warm on my lap.

        Now, my i9 on the other hand was a lap burner that had two fan settings: annoying and holy hell.

        • @iopq@lemmy.world
          0 · 1 year ago

          That’s what I’m saying: the M4 will be a refresh of the M3, so don’t expect more than an Intel-style generational performance increase.

  • @Hiko0@feddit.de
    10 · 1 year ago

    Let’s wait for independent benchmarks, then. I wouldn’t trust any company’s own numbers after the devastating shake-up of the industry that followed the release of Apple Silicon for computers.

  • AutoTL;DR (bot)
    -1 · 1 year ago

    This is the best summary I could come up with:


    That makes all those benchmarks relevant but a tad outdated, especially since Intel’s Meteor Lake is a considerable jump in efficiency and performance, as claimed by the company.

    Recently, those Apple M3 chips have been reviewed, primarily to critical acclaim, and we can see how they compare to Qualcomm’s Snapdragon X Elite and its Oryon processor (via Geekbench 6).

    The company is coming out with its first-gen Oryon processor, and not only does it do well against the M2 series, but it also competes very well against the new M3 and M3 Pro (M3 Max is much more powerful due to it having 16 cores versus Qualcomm’s 12).

    That said, if we’re being honest, it is Intel and AMD that are Qualcomm’s real competition, as most humans don’t jump between PC and Mac when buying a new laptop and instead tend to stick with one ecosystem or even one brand due to familiarity.

    Clock speed, cache size, memory structure, and the chip’s architecture also matter, so AMD often competes against Intel using fewer cores, costing less money, and using less power.

    Likewise, Apple is cagey with its performance-per-watt claims, only noting that when matching the M1’s peak performance (so the M3 is not running at its actual higher speed), the new M3 uses 50% less power.


    The original article contains 1,802 words, the summary contains 216 words. Saved 88%. I’m a bot and I’m open source!