Apple Silicon Macs: Why All the Hate? | Painfully Honest Tech



The new Apple Silicon Macs are about to arrive, but they seem to be more controversial than I expected. Reviews, tweets, videos: a lot of them are negative, in a lot of ways. But why? I have some ideas. Let's look at some benchmarks and see if that helps.

JoMo's video: https://youtu.be/xUkDku_Qt5c
Linus's video: https://youtu.be/ljApzn9YWmk
More PHT Apple content: https://www.youtube.com/playlist?list=PLWrpcd0vRa5Ss6bKxAIAgGtgogL1v29N5
#M1 #MacBook
Subscribe to Painfully Honest Tech: https://bit.ly/2FJNZeE
Become a channel member: https://tinyurl.com/thyrzbf
Twitter: @jasontlewisPHT
Email: [email protected]
PHT Facebook group: http://bit.ly/PHTfacebook
+++
The best tool for any YouTuber - Tube Buddy: https://www.tubebuddy.com/PainfullyHonestTech
Try Audible and get a free audiobook: http://www.audibletrial.com/painfullyhonesttech
+++
My music: http://bit.ly/StarCityTunes http://bit.ly/SadIronMusic
My novels: http://amzn.to/2FQS9Pt
+++
☑️ This video and description contain affiliate links, which means that if you click on one of the product links, I'll receive a small commission.

21 comments
  1. I read an article on MacRumors titled "Developer delves into reasons why Apple's M1 chip is so fast" by Juli Clover. The article says that the reason the M1 is so fast is not that it is a fast general-purpose CPU like Intel's and AMD's, but mainly that it is a collection of specialized, dedicated hardware blocks that do specific work very fast. So instead of using the general-purpose CPU to encrypt files, it has a specific hardware block dedicated to encryption. Similarly, instead of using the general-purpose CPU to encode/decode h.264 and h.265 video, the M1 has one block for h.264 encoding/decoding in hardware and another to encode and decode h.265 video. The M1 has many other blocks to hardware-accelerate decompressing music, processing images, etc., covering the functions power users rely on most. So the M1 appears fast, but if there is a function it does not have hardware acceleration for, it falls back to its general-purpose CPU and carries that function out at speeds comparable to Intel and AMD. And on top of that, it has unified memory to accelerate memory usage. So this is why the M1 appears to be a very fast general-purpose CPU.
    Comparing the M1 to Intel/AMD is like comparing apples to oranges: one is a general-purpose CPU and the other is a collection of very fast accelerators. Ultimately, the end result is impressive. "All" Intel/AMD have to do is build a whole bunch of hardware instruction sets into their CPUs for the most-used computer functions to imitate the M1 (and also reduce power consumption). All this time people have been so impressed, thinking the M1 is a super-fast general-purpose CPU, but the magic is in the specialized acceleration blocks built in. The weakness of the M1 will show when it is called on to do something it was not designed to handle with hardware acceleration. When the computer is programmed to do something completely new (say, encoding video with a future h.266 codec), it will struggle and fall behind faster general-purpose Intel/AMD CPUs.
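    The dispatch model this comment describes can be sketched as a toy in Python. This is purely illustrative: the operation names, the set of accelerated operations, and the routing logic are made up for the example and are not Apple's actual scheduling behavior.

    ```python
    # Toy model of "fixed-function accelerator vs. general-purpose CPU" dispatch.
    # The operation names and the accelerated set are hypothetical.

    # Operations with a dedicated hardware block on this imaginary SoC.
    ACCELERATED_OPS = {"aes_encrypt", "h264_encode", "h265_encode", "image_resize"}

    def run(op: str) -> str:
        """Route an operation to a fixed-function unit when one exists;
        otherwise fall back to the general-purpose CPU cores."""
        if op in ACCELERATED_OPS:
            return f"{op}: dedicated accelerator (fast, low power)"
        return f"{op}: general-purpose CPU (speed comparable to other CPUs)"

    print(run("h264_encode"))  # hits the (hypothetical) hardware encoder
    print(run("h266_encode"))  # no fixed-function block yet, so CPU fallback
    ```

    The point of the sketch is the asymmetry: the same call is dramatically cheaper when a matching accelerator exists, and merely ordinary when it does not, which is exactly the benchmark pattern the comment predicts.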

  2. The M1 is just way better than equivalent Intels. No contest. Linus, you are wrong. Period. No need to cater and pander to all parties. Just like EVs. They are better. Period.

  3. Great video! I completely agree. Let’s be happy and enthusiastic about all new development – especially when something as exciting as the M1 happens.

    Don’t get discouraged by all the haters out there. Most of them have no experience whatsoever with the system they hate on, which makes many of the comments utterly pathetic – regardless of which camp they come from.

    In the end, Macs and PCs are tools to be creative on. Whichever system you succeed on is fine – as long as you have fun, find joy, and stay productive.

    Nothing else matters…..

    Personally, a Mac does exactly that for me!

    Kudos to you for being objective!

  4. When you’re enjoying using your blisteringly fast new Mac, spare a thought for the neophobic; it must be a difficult mindset to live with in a sea of endless change.

  5. This is exactly why I unsubscribed from Linus’ channels. His videos on Apple products are very toxic. He often makes fun of their products first, then adds something nice to say at the end. I liked watching his videos to learn about new tech but apparently, if it’s from Apple, to Linus and his team it’s just to be mocked and made fun of. He hasn’t used any M1 Mac yet and his video is already calling Apple’s announcement a “slow motion dumpster fire.” So far, the M1 has proven him wrong, and I do hope his fanboy attitude towards anything not Apple will change. What he has failed to realize is how powerful iPhones have become – so powerful that their chips can now be used in laptops and desktops. THAT is one of the biggest innovations in the past 5, 10 years!

    But for now, to me, he’s the YouTube equivalent of internet trolls who can’t accept the fact that not all companies are the same and some just do their own thing, and who put out videos just for the views (and money).

  6. The architecture is similar to Fujitsu’s A64FX in the #1 supercomputer: integrated “GPU” and memory on one unified SoC. This has been around for some time – and it’s water cooled!

  7. I love Linus, but idk why he is in denial. He has seen the damn scores: the M1 is murdering Intel and AMD on single core, and the integrated graphics are literally performing close to a 1060. That’s absolutely insane. Who cares about graphs?

  8. I’m the same – I love to see something that pushes us forward. Look at AMD’s tech. Competition is brilliant for pushing the envelope. I’m getting one of the new Apple computers, just because they look really good and I love technology. I’m a PC developer/gamer, and I need to learn more about macOS and those devices too.

  9. The M1 is a customised chip with zero upgradability on anything, including RAM, GPU, and other interfaces, and with closed specs. It runs only one operating system and just a few applications natively. I don't think it's groundbreaking. Many custom solutions work really well in their custom ecosystem, but as soon as they start supporting generic hardware configurations, operating systems, and applications, they start becoming less efficient and underperforming. The DDR bus itself consumes a lot of power; PCIe costs a lot of power and performance too. So the M1 is a good effort, but that's pretty much all it is.

  10. I think you nailed it. The negativity/cynicism/denial really bums me out, too.

    Sure, we may be biased, and Apple may be overselling it a bit, or even putting a beast of a processor in their low-end machines – one not much slower than their future top-end offerings – just because they can make them dirt cheap compared to Intel chips. But I sense a massive reckoning coming. You know, maybe not at the same scale, but almost like when we saw Steve Ballmer and mobile phone manufacturer execs laughing the iPhone off, only to be subsequently destroyed and kept at a safe 5-year distance.

  11. Take “Apple” out of the discussion: it’s an ARM chip. Early ARM chips “worked” even when “turned off” (on residual power). What 680xx Motorola, Intel, AMD, or Cyrix chip ever “worked” when “turned off”?

  12. Sir, I've never watched your videos before, but you've got yourself a new follower! I wish everyone was as objective and factual as you are. Jumping on the "let's bash Apple for views" train is getting old, especially before the products are even released.

  13. Ha! You young whippersnapper, you! We remember the transition from the 680x0 to PowerPC, and I was opposed to that one because I felt the 680x0 was a clearly superior architecture.
    My wife was morally opposed to the transition to Intel and swore to never buy an Intel Mac.
    She died seven years ago and I'm beginning to suspect she was right after seeing how "Intel-thinking" has infected Mac analysts. Nowadays we're all concerned about clock cycles and "thermal throttling."
    I just heard Linus complaining about Apple wanting us to beta-test their new hardware designs as if this is the first time Apple has ever designed custom chips or something.
    These new M1 computers are stretching everyone's expectations.
    I mean, they can't be serious when they say they're faster than 98% of all PCs sold last year – except they really are. Yeah, it's hurting a lot of people's brains to find that everything they previously knew was wrong.

Comments are closed.