Tech Talk
AI, yi, yi… By Michael E. Duffy
It’s hard to believe that ChatGPT was released in late November of 2022, just a little over two years ago. At the time, Bill Gates wrote: “In my lifetime, I’ve seen two demonstrations of technology that struck me as revolutionary. The first time was in 1980, when I was introduced to a graphical user interface.” After seeing ChatGPT just before its general release, he added, “I knew I had just seen the most important advance in technology since the graphical user interface.”

ChatGPT is a text-based interface to a Large Language Model (LLM) named GPT. It’s important to understand that LLMs are trained on large bodies of text to predict the next word in a sentence—they are basically “autocorrect on steroids.” There are those who would disagree with that assessment, arguing that today’s LLMs actually display human-like reasoning. Tesler’s Theorem humorously states, “AI is whatever hasn’t been done yet.” And, indeed, the term “artificial intelligence” has been bandied about so much as to be worthless as a descriptive term. LLMs are more accurately referred to as “generative AI”: software which generates new content based on text prompts (for example, “write a sonnet”). You can try ChatGPT for yourself at chatgpt.com. Even better, you can now talk with ChatGPT for 15 minutes by dialing 800-CHATGPT. It’s free!

Developing AI depends heavily on computing power, specifically GPUs (Graphics Processing Units), which excel at the vector-based computations used by LLMs. Nvidia is the leading producer of GPUs, and the de facto hardware choice for LLMs. Its stock traded at about $15 at the time ChatGPT was announced—today, it’s over $130 a share. AI has been very good for Nvidia stock.

Buying and running all those GPUs requires a lot of money. OpenAI, the company behind GPT, has raised $21.9 billion since its founding in 2015. Other companies focused on AI include Anthropic (whose LLM is named “Claude”), Meta (“Llama”) and Google (“Gemini”), all of whom need money to fund those efforts. OpenAI CEO Sam Altman has stated that the cost to train GPT-4 (the latest version of GPT) was “more than $100 million.” He has also said that he doesn’t expect OpenAI to be profitable until 2029.

Nevertheless, investors remain willing to take a big chance to own a piece of a company that unlocks the holy grail of Artificial General Intelligence (AGI)—software which “matches or surpasses human cognitive capabilities across a wide range of cognitive tasks.” There are a number of benchmarks used to gauge the performance (I won’t say “intelligence”) of existing LLMs. GPT-4 has scored in the 90th percentile on the Uniform Bar Exam, and in the 96th percentile on the SAT. These achievements seem impressive, except for the fact that GPT (and other LLMs) are trained on the text available on the entire public internet. One of the issues with testing LLMs is that you have to be certain the answers to the test aren’t already somewhere online.

But these test scores really don’t matter. Commercial applications of LLMs demonstrate that they can be used to assist and replace human workers, delivering value to the companies that use them. Nor is the application of LLMs solely the province of Fortune 500 companies. Read this Forbes magazine article on Google’s NotebookLM, entitled “Why Google’s NotebookLM Is A Great App For Small Business” (tinyurl.com/techtalk2025-02). NotebookLM lets you apply an LLM’s summarization skills to data of your own choosing, much as you see search results being summarized at the top of Google searches these days. To quote the author, “I’m happy to create my own little LLM to offer better and more accessible information for my team to help me run my business better. NotebookLM is doing that right now. It’s shaping up to be 2025’s killer app for small business.”

So, these LLMs are here today, delivering value for businesses large and not-so-large. We will continue to see companies experiment with the technology. Some will use it to replace (for better or worse) employees; others will use it to enhance the productivity of their existing staff. But bigger changes are on the horizon: so-called “agents” and, of course, AGI.

Agents are the next step on the road to AGI. As with current chatbots, agents will take a prompt. But instead of responding with text (or voice, as many chatbots now can), agents will be able to actually carry out tasks on your behalf, functioning as a real executive assistant or junior researcher might. As I write this, there are rumors that OpenAI will announce “Ph.D.-level super-agents to do complex human tasks.” That would certainly be A Very Big Deal.™

I remain skeptical about true AGI, but some very smart people simply point to the evolution of the capabilities demonstrated by LLMs over time. By extending the trendline into the future, AGI seems inevitable. I’ll write more on AI next month. In the meantime, I welcome your questions about AI and all things tech at mike@mikeduffy.com.
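For curious readers, the “autocorrect on steroids” idea can be made concrete with a toy sketch. Real LLMs use neural networks trained on billions of words, not word counts, but the core task is the same: given the words so far, predict the most likely next word. Here is a deliberately tiny illustration (the corpus and function names are my own invention, not anything from an actual LLM):

```python
from collections import Counter, defaultdict

# Toy illustration only (real LLMs use neural networks, not counts):
# "train" on a tiny corpus by counting which word follows each word,
# then predict the next word by picking the most frequent follower.
corpus = (
    "the cat sat on the mat "
    "the dog sat on the rug "
    "the cat chased a mouse"
).split()

follows = defaultdict(Counter)
for word, nxt in zip(corpus, corpus[1:]):
    follows[word][nxt] += 1

def predict_next(word):
    """Return the word most often seen after `word` in the corpus."""
    candidates = follows.get(word)
    return candidates.most_common(1)[0][0] if candidates else None

print(predict_next("the"))  # "cat" — it follows "the" most often here
print(predict_next("sat"))  # "on" — "sat" is always followed by "on"
```

An LLM does the same job with a vastly richer model of context, which is why its “autocorrect” can produce whole sonnets rather than a single word.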
Michael E. Duffy is a senior software engineer and lives in Sonoma County. He has been writing about technology and business for NorthBay biz since 2001.
February 2025