You don't need me to tell you making tech is hard... or maybe you do?
Self-taught hardware engineer discovers that GPUs really are ridiculously complex and hard to design after all
The tech I'm most psyched to see this year is anything but traditional PC silicon, and certainly isn't x86
Even back in the old days when you could actually see an individual transistor, back when computers were the size of a family home, making tech was tough. Now, when you have to contend with the laws of quantum mechanics just to power a single chip, it's exponentially harder. And the complexity of what today's hardware is being designed to do is another layer of brain-paining difficulty on top of the challenge of just designing and building the stuff.
So, yeah, it's hard. And everyone will tell you that, and demonstrate it through their own missteps and failures, of which we're seeing multiple examples at the moment. Whether that's Intel's struggles with its baseline BIOS settings and its motherboard makers (not to mention the continued battle to get its discrete GPUs up to par), Apple failing to hit any decent level of sales for its Vision Pro goggle box, Qualcomm being accused of making things up to flatter its Snapdragon X Elite CPUs before they've even shipped, or Microsoft finding that Windows 11's market share is actually dropping.
Basically, there are a host of failings to point at when making the case about what a bloody miracle it is that we get any sort of functional tech at all.
So, it's with a mix of 'duh, yeah' and 'holy hell, that's amazing' that we witness one intrepid engineer trying to build his own GPU from essentially first principles. Engineer Adam Majmudar has been chronicling his journey on X (formerly positively regarded), starting with learning how to create a simple processor and then moving on to fashioning his own GPU design.
And yeah, it turns out that it's hard to do. Even though he was 'just' trying to make a GPU for parallel compute tasks, and not a fully fledged gaming chip for chucking pixels and textured polygons around a screen at hundreds of frames per second.
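For a sense of what 'a GPU for parallel compute tasks' actually means, here's a toy sketch of the SIMT (single instruction, multiple threads) idea that even a minimal compute-only GPU implements: every thread in a lockstep group runs the same instruction stream over different data. This is purely illustrative and isn't taken from Majmudar's design; the function and names are made up for the example.

```python
def simt_execute(kernel, data, warp_size=4):
    """Toy SIMT model: run one kernel over data in lockstep groups ('warps')."""
    results = []
    for start in range(0, len(data), warp_size):
        warp = data[start:start + warp_size]
        # Every 'thread' in the warp executes the same instruction stream;
        # only the data element it sees (its thread index) differs.
        results.extend(kernel(x) for x in warp)
    return results

# Example: the classic first compute kernel, an element-wise multiply-add.
out = simt_execute(lambda x: 2 * x + 1, list(range(8)))
print(out)  # [1, 3, 5, 7, 9, 11, 13, 15]
```

A real GPU does this in hardware across thousands of threads at once, which is exactly where the design difficulty lives.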
Next time you're cursing your own frame rate, and castigating your PC for perceived poor performance, spare a thought for just what a lucky thing you are to have any kind of working silicon at all.
Anyone who has studied electrical engineering will know that with the right tools, know-how, and a lot of perseverance, it's possible to design a basic CPU from scratch. It won't be anything like today's processors, of course, but what about GPUs? Surely they can't be that much harder, right? Well, one determined software engineer decided to find out, and discovered that it's much harder than you'd think.
I'm a PC gamer, a tech obsessive, and probably a bit of a contrarian, too. Which is maybe why I'm always keen to support the underdog story when it comes to a new player entering the PC gaming arena. I mean, it may be a billion-dollar underdog, but when it comes to proper Windows PC silicon, Qualcomm is still a n00b, and I can't wait to see how its new laptop CPUs finally shape up.
Microsoft must be wringing its hands in frustration as it tries to get users to move on from Windows 10 and embrace the wonders of Windows 11. It seems no matter what it tries, users just stubbornly refuse to move on from that old dinosaur of an OS.