Saturday, October 18, 2025

Laggard or Luddite?


Artificial Intelligence, or AI for short, is all the rage these days. Silicon Valley is all over it. AI-related start-ups are everywhere, and AI-related posts and stories are flooding social media, LinkedIn and the business news. Market valuations for AI-related companies are soaring just as dotcom valuations did at the turn of the millennium.

Take Nvidia, for example. This relatively modest chip-maker once focused on graphics chips (the technology that renders images and video on device screens). In recent years, it’s become a poster child for the AI boom. Nvidia’s chips are ideal for AI because they can handle thousands of calculations simultaneously, making them perfect for the massive data processing AI requires.

Its valuation recently climbed to over $4 trillion, making it one of the most valuable companies on the planet. In essence, investors believe that as demand for AI hardware (like the specialised chips Nvidia produces) grows, so too will its profits. However, when valuations shoot up so fast, there’s always a question of whether expectations are running ahead of what’s actually possible.


Sadly, I’m old enough to remember more than one tech boom in decades past. When I began my career at IBM in 1991, the desktop computing phenomenon was taking off, and local area networking, the technology linking desktops together, was emerging rapidly alongside it. Likewise, new desktop applications were on the rise, including tools we take for granted these days, such as spreadsheets, word processors and email.

When I joined IDC, a technology industry analyst firm, later the same year, I quickly learned that in the previous decade, minicomputers had been the proverbial golden child. They'd progressively transferred the capabilities of room-sized, and horrendously expensive, mainframe computers onto smaller and cheaper machines. I was writing about the industry's evolution as research manager for The Computing 100, the local technology industry's annual bible. At the time, the minicomputer boom was drawing to a close, and the industry was in recession for the first time in more than a decade.


As an aside, I ghostwrote most of the publication in 1993. It subsequently opened the door to my 20-year career in technology public relations. In March 1993, I responded to an ad in the computer section of the Sydney Morning Herald's employment pages. A company called Recognition Public Relations was seeking applicants who enjoyed writing about computers. During my job interview, Steve Townsend, the company's owner, asked about my relevant writing experience. I placed a copy of The Computing 100 on the desk and said, "I wrote this." Needless to say, I got the job.

By the late 1990s, the rise of the internet had superseded the desktop revolution. At the time, Recognition Public Relations was actively promoting this burgeoning technology and its day-to-day use. I still recall the first time I wrote about the internet. It was a user story prepared on behalf of a client called Softway, which had helped implement a Unix-based internet solution for the Royal Botanic Gardens in Sydney. I interviewed the Chief Botanist, asking him about his use of the internet. He explained that he could instantly view leaf images held in the archives of Kew Gardens in London. I remember thinking, "How on earth do I make this internet thing sound remotely interesting?"

Soon after, the dotcom boom was upon us, followed swiftly by the dotcom bust. A few years later, the mobile app and social media boom emerged, and a new wave of Silicon Valley unicorns was born. Scroll forward another decade or so, and now it’s the turn of AI.

I’ll be honest: I still don’t fully grasp the whole “AI” phenomenon, and there’s a distinct hint of déjà vu about it. Everyone talks about it like it’s the dawn of a new age, just as they did about desktop computers, the internet and mobile phones. I’ve also attended numerous events where I've seen stunning demonstrations of AI writing, drawing, creating video content and holding conversations, even making decisions of its own. It’s impressive stuff.

It's mind-blowing that machines can now “learn” from experience. I understand the theory, but part of me still finds it unsettling — a tool that thinks for itself somehow blurs the line between human and machine. In simple terms, artificial intelligence is just a fancy way of saying “teaching computers to think a bit like people.” It’s not magic — it’s math and pattern recognition.

Machine learning, the part that gives AI its smarts, works by feeding the computer huge amounts of data — photos, text, sounds, you name it — and letting it find patterns on its own. The more examples it sees, the better it gets at guessing what comes next. It’s a bit like how we learn: by making mistakes, getting feedback, and slowly improving. Show it enough pictures of cats and dogs, and eventually it can tell them apart — though it doesn’t see them the way we do. It’s really just number crunching on a massive scale, but somehow, out of all that math, it starts to look a lot like thinking.
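For what it’s worth, here’s a tiny sketch of that idea in Python, using the scikit-learn library. The numbers are entirely made up for illustration: each animal is described by just two invented measurements, and the computer simply guesses a new animal’s label from the known examples it most resembles.

```python
# A toy "cats vs dogs" classifier using scikit-learn's k-nearest-neighbours model.
# The two numbers per animal (weight in kg, ear length in cm) are invented purely
# for illustration; a real system would learn from thousands of labelled photos.
from sklearn.neighbors import KNeighborsClassifier

# Training examples the computer "sees": measurements plus the right answer.
examples = [
    [4.0, 7.5],    # cat
    [3.5, 6.0],    # cat
    [5.0, 8.0],    # cat
    [25.0, 11.0],  # dog
    [30.0, 12.5],  # dog
    [18.0, 10.0],  # dog
]
labels = ["cat", "cat", "cat", "dog", "dog", "dog"]

# "Learning" here just means remembering which known examples a new one sits closest to.
model = KNeighborsClassifier(n_neighbors=3)
model.fit(examples, labels)

# A new, unseen animal: 4.2 kg with 7 cm ears. The model guesses from the patterns above.
print(model.predict([[4.2, 7.0]]))  # -> ['cat']
```

Real systems work the same way in principle, just with millions of examples and far richer descriptions of each one.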


As I look at my own business, I struggle to see the role that AI can play within it. However, we’ve been experimenting with AI in small ways. For example, we’ve used it to create lifestyle images for some of our older products that lack this kind of marketing content (like the image above). We also use it to quickly generate code for new product pages on our website. I’ve used it to help me create the first draft of advertorial promotions. I even used it today to help draft this blog post.

However, I realise that I’m finally showing my age. The thought of trying to learn AI and master its use does my head in. I’d rather leave this challenge to a younger generation. I vividly recall my father expressing similar sentiments about spreadsheets. He wasn’t keen on learning how to use and apply this desktop technology. As someone who’s used spreadsheets for my entire working career, I couldn’t imagine life without them. No doubt some young person will say the same thing about AI in the years ahead.

While I’m not keen to be a digital pioneer, I'm certainly no technology Luddite. I’ve learned over time that entrepreneurial organisations eventually find clever ways to integrate new technologies into the mainstream and help smaller businesses deploy them in meaningful ways. Therefore, I'm more than happy to adopt this new technology as practical applications emerge and become seamless to set up and use day to day. I wonder what I’ll be saying about AI a decade from now.
