Tech Talk
Nothing Is Real: AI-generated video is the future
By Michael E. Duffy

Every day it gets harder to differentiate between real video and video that has been generated using AI tools. In a world of images created by AI, how can we believe our own eyes?

The other day, I shared a video on my Facebook page. It showed a scuba diver removing a barnacle-laden tire from the neck of a hapless monk seal using a saw (tinyurl.com/techtAlk11255). One of my followers pointed out that the video was clearly AI-generated, which I hadn’t really noticed. It certainly seemed plausible at first glance. Of course, once the issues were pointed out to me, it was obviously a fake. Doh!

Then a Facebook friend of mine posted a video he created, based on a still picture of him standing next to a bronze statue of a bear. The video (tinyurl.com/techtalk1125) shows the bear coming to life and licking his face. All it took was the original still image and the prompt “the bear licks the man’s face.” Insanely easy. And while obviously not real, the movements of the bronze bear appear natural.

After chatting with my friend to understand how he made his video, I decided to try it for myself, using a tool called Runway (runwayml.com). I signed up for a free account, which gave me 125 credits to use. I cut-and-pasted an image of me and my daughter’s dog, Mabel, sitting next to each other in the car, and told Runway to “make the dog lick the man” (yeah, pretty unoriginal, but I just wanted to see if I could make something work). Sure enough, Runway generated a 5-second video, and used up 30 of my credits. Here’s the result: tinyurl.com/techtalk11253.

I tried another example, using an old photograph of me working as a guide on the Jungle Cruise at Disneyland. Runway did a passable job, but it insisted that I was talking into a telephone receiver, rather than a push-to-talk microphone. The original showed the left side of my face, and when the video showed my head turning toward the camera, my face became unrecognizable as me. You can prompt Runway to correct its errors, but in this case, it didn’t help much. You can see for yourself at tinyurl.com/techtalk11254.

Runway credits get used up pretty quickly, though. Each credit is worth 25 seconds of time, and I burned about 25 credits every time I created or modified a video. Once you’ve used your 125 credits on the free plan, that’s it; you can’t buy more. The least expensive plan is $12 a month, billed as an annual amount, and that gets you 625 credits that renew each month. You can also buy additional credits for a penny each (minimum $10, or 1,000 credits), which is reasonable.

I found Runway straightforward to use, but I also wanted to try other tools, using my Jungle Cruise photo as a source. Google offers Veo 3, but I couldn’t figure out how to access its video-creation functions. My next stop was Pika Labs (pika.art), where I uploaded the same image as before and asked it to create a video from it. It took a few minutes, but eventually rendered a 5-second video. Even though I specified “quality” over “speed,” the result was considerably worse than what Runway produced.

Finally, I tried Sora from OpenAI, the company that created ChatGPT. It didn’t like my Jungle Cruise image, saying, “This content can't be shown for now. We're still developing how we evaluate which content conflicts with our policies.” Oh, well. Instead, I went back to the picture of me and Mabel the dog. My basic prompt of “the dog licks the man’s face” didn’t seem to work very well. Even with some additional direction (“don’t pan the camera”), I still wasn’t happy with the results. I recommend you stick with Runway.

If your business creates video for marketing, training or social media, you should definitely be learning to use AI-generated video. This 8-minute video, entitled “How to Use Text to Video AI Tools to Create Content for your Business,” will get you started (tinyurl.com/techtalk11257). It covers the entire process of creating a video using AI tools. You can also download an e-book with additional information.

Don’t expect to produce a long video with a single prompt. One of the key techniques for success is to break things down into a series of scenes, i.e., a storyboard for the entire video. For each scene, you generate a video. You then use a tool like Adobe Premiere to merge the individual videos into the final version. A good example is the video of the scuba diver and the seal I mentioned above, although it suffers from the problem that items, like the saw, change their appearance slightly from scene to scene.

A gentle reminder: The first time you try using these tools, the results are likely to suck. To be successful with AI-generated video, you have to be comfortable with that, and persevere. Good luck!
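A footnote for the technically inclined: if you don’t have Adobe Premiere, the free command-line tool ffmpeg can handle the clip-merging step described above using its concat demuxer. The sketch below is only an illustration, not a polished script; the scene file names are made up, and a no-re-encode join like this assumes all the clips share the same codec and resolution.

```python
# Sketch: merge per-scene AI-generated clips into one video with ffmpeg.
# Assumes ffmpeg is installed and scene01.mp4 ... scene03.mp4 exist
# (the file names here are hypothetical).
from pathlib import Path

clips = ["scene01.mp4", "scene02.mp4", "scene03.mp4"]

# ffmpeg's concat demuxer reads a text file listing the inputs in order.
list_file = Path("clips.txt")
list_file.write_text("".join(f"file '{c}'\n" for c in clips))

# -c copy joins the clips without re-encoding; -safe 0 allows plain paths.
cmd = ["ffmpeg", "-f", "concat", "-safe", "0",
       "-i", str(list_file), "-c", "copy", "final.mp4"]
print(" ".join(cmd))  # run it with subprocess.run(cmd, check=True) when ready
```

Once the command runs, final.mp4 contains the scenes back to back, ready for upload.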
Michael E. Duffy is a 70-year-old senior software engineer for Electronic Arts. He lives in Sonoma County and has been writing about technology and business for NorthBay biz since 2001.
38 NorthBaybiz
November 2025