Tech Talk
Another year ends—and the magic of technology continues

By Michael E. Duffy

So, here it is, December of 2024. I've been thinking a lot about software lately, something that I've been intimately involved with since 1975—almost 50 years now. At nearly 70 years old, I am starting a new job with Electronic Arts in January, still working at the leading (if not the cutting) edge of technology, about which I am understandably proud. In 1975, when I first started programming professionally, there were no personal computers, no smartphones, no internet or World Wide Web. Today, software—as Marc Andreessen predicted—has eaten the world. I really want to convey some of the importance of that to you.
I am eternally curious. For someone with that affliction, Google is like crack cocaine. An obscure phrase ("orbicularis oculi")? Google has the answer. But what really astounds me is what happens after I ask that question. In a matter of a couple of seconds, Google returns hundreds or thousands of relevant results (now augmented with an "AI Overview" of the information, for better or worse). Since I have some understanding of what goes on between the time I hit Enter and Google presents an answer, the result is nearly unbelievable. For most people, though, it's just a fact of life.

What I am trying to convey here is the amazing speed that underlies all of this. Humans are particularly bad with large numbers. A million seconds is about 11 days. A billion seconds is nearly 32 years! What we sometimes overlook is that the number of operations a computer can perform in a single second is also a very large number. The basic operational unit of a computer—a single addition, storage operation or conditional test—takes place in billionths of a second. As of early 2024, the fastest Central Processing Units (CPUs), like the Intel Core i9-14900KS, can reach clock speeds of 6.2 GHz, which means 6.2 billion cycles every second. It's incomprehensible to mere mortals. Our intuition fails us at those speeds.

Think about it. When you press Enter on a Google query in your browser, your text is transmitted to Google's servers. It is interpreted and generates a request to a vast database representing all the information Google has retrieved from pages across the internet. Google decides which pages are most relevant to your query and presents the result to you. In seconds. It's unreal.

Part of the secret is parallelism. Nowadays, a single CPU contains multiple "cores," each of which is a separate processor. If a computer program exploits that parallelism perfectly, it basically multiplies the amount of information that can be processed per second by the number of cores. For example, the MacBook Pro I'll be using on my new job has an M3 chip with 14 cores. That Intel chip I mentioned above has 24 cores. Most desktop software, however, doesn't really make much use of multiple cores. Google has teams of very smart people making sure that its server software does exactly that.
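For readers who want to see that idea in miniature, here is a small sketch in Python (purely illustrative, and nothing like Google's actual code): it adds up a long run of numbers on one core, then splits the same work across several worker processes so each core handles a chunk. On a machine with enough cores, the parallel version finishes in roughly the serial time divided by the number of workers, which is the "multiply by the number of cores" effect described above.

# Toy illustration of multi-core parallelism: sum a range of integers
# serially, then split the range into chunks handled by separate processes.
import time
from concurrent.futures import ProcessPoolExecutor

def partial_sum(bounds):
    # Each worker sums the integers in its half-open range [start, stop).
    start, stop = bounds
    return sum(range(start, stop))

if __name__ == "__main__":
    N = 50_000_000       # enough work for the timing difference to show
    WORKERS = 4          # assume at least 4 cores are available

    t0 = time.time()
    serial_total = sum(range(N))
    print(f"serial:   {time.time() - t0:.2f} seconds")

    # Divide [0, N) into equal chunks, one per worker process.
    step = N // WORKERS
    chunks = [(i * step, (i + 1) * step) for i in range(WORKERS)]
    chunks[-1] = (chunks[-1][0], N)   # last chunk absorbs any remainder

    t0 = time.time()
    with ProcessPoolExecutor(max_workers=WORKERS) as pool:
        parallel_total = sum(pool.map(partial_sum, chunks))
    print(f"parallel: {time.time() - t0:.2f} seconds")

    assert serial_total == parallel_total   # same answer, spread over several cores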
In addition to the parallelism of multi-core CPUs, cloud-based services like Google use complete systems in parallel. For example, your Google search query isn't executed on a single machine. Multiple systems each execute a search on a part of Google's stored data about the web, and those results are then combined and presented to you. The Google Search index covers hundreds of billions of webpages and is well over 100 million gigabytes in size. It's like the index in the back of a book—with an entry for every word seen on every webpage that Google has indexed. Your brain isn't equipped to handle the sheer amount of data that it represents.
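Again purely as an illustration (a real search system is vastly more elaborate), the short Python sketch below shows both ideas at toy scale: an inverted index, the book-style index that maps each word to the pages containing it, split into "shards" that can each be searched independently, with the partial results merged at the end.

# Toy sharded search: each shard holds an inverted index (word -> pages)
# for part of the web; a query goes to every shard and the hits are merged.

def build_index(pages):
    # Build an inverted index: word -> set of page ids containing that word.
    index = {}
    for page_id, text in pages.items():
        for word in text.lower().split():
            index.setdefault(word, set()).add(page_id)
    return index

# Pretend each shard lives on a different machine in a data center.
shard_one = build_index({
    "page1": "the quick brown fox",
    "page2": "the lazy dog sleeps",
})
shard_two = build_index({
    "page3": "a quick dog runs",
    "page4": "brown bears hibernate",
})

def search(word, shards):
    # "Scatter" the query to every shard, then "gather" and merge the results.
    hits = set()
    for shard in shards:              # in a real system these run in parallel
        hits |= shard.get(word.lower(), set())
    return sorted(hits)

print(search("quick", [shard_one, shard_two]))   # ['page1', 'page3']
print(search("dog", [shard_one, shard_two]))     # ['page2', 'page3']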
As Arthur C. Clarke put it: "Any sufficiently advanced technology is indistinguishable from magic." Google search is Big Magic. So is Netflix streaming HD movies to your TV. Smartphones, too—we have access to the sum of human knowledge in our pocket or purse. So much of what we take for granted is arguably magic.

And yet, for the most part, it's also completely predictable. A single computer instruction does a very specific thing, and each time, the same way as before, billions of times a second. Aside from the nearly impossible chance of a cosmic ray interfering with the execution of an instruction (which really does happen, causing a glitch), computers do exactly what they are instructed to do. Boring, and yet magical. Superhuman function, created by humans (arguably, some wicked smart humans).
Software is everywhere around us now. In our cars (even ones that don't drive themselves), in our phones, even our refrigerators and thermostats. When the internet goes down, or we don't have a cell signal, the magic is temporarily extinguished, and we see how much we rely on it. Fortunately for all of us, it comes back on, and we go back to ignoring the incredible wonder of it all.
Michael E. Duffy is a senior software engineer for Atlanta-based mobile gaming company Global Worldwide (globalworldwide.com). He lives in Sonoma County and has been writing about technology and business for NorthBay biz since 2001.