Jeffrey Zeldman, one of the OGs of web design, recently weighed in on a debate that has been gaining momentum lately: with the rise of AI, is the Web over? After all, the new OpenAI browser looks like it was built for an entirely different era of the Internet.
Zeldman’s point is simple, and I can certainly appreciate it: people have been declaring the Web dead for as long as it’s been alive (and those declarations have been ridiculously wrong). I’d like to take a moment to reflect on a specific naysayer: George Colony.
Colony’s name may not ring a bell if you’re outside the technology industry, but he is the founder of Forrester Research, one of the largest technology and business consulting firms in the world. If you’re a reporter working on a story and you need an analyst, you’ve probably talked to someone from Forrester. I’ve talked to Forrester several times myself, and their analysis is generally pretty good.
But there is one area where the company, and specifically Colony, keeps getting it wrong. It concerns the World Wide Web, which Colony has declared “dead” or dying on several occasions over a 30-year period. In each case, Colony was trying to make a bigger point about where online technology was going, without actually giving the Web enough credit for being able to get there.
The ’90s: The Web is dead because it’s not interactive enough
Colony’s first anti-Web salvo came around 1995, when his comments were referenced in a Christian Science Monitor article:
Another critic, technology analyst George Colony, has focused on the graphical part of the Internet, the World Wide Web. “The Web is dead,” he says, criticizing the system because it’s not very interactive.
His quotes were referenced alongside two related naysayers: Clifford Stoll, author of Silicon Snake Oil, and Paul Saffo, who was then director of the Institute for the Future. Each had a deeper underlying point. Stoll criticized the Web’s social failings; Saffo suggested the Web was too one-directional. Colony, meanwhile, felt the Web was too static. Computers could do so much more.
Colony was the one who kept beating this drum. While Saffo and Stoll somewhat faded into history (although Stoll has modern-day defenders in the commentariat), Colony retained his prominence.
In 1997, the Forrester founder was quoted in Network World saying that the Web was not the horse that would take us to the next stage of digital nirvana. Instead, he said, Java would take over online. (See, if he had said JavaScript, that would have been right. But plain-jane Java? Not so much. As far as consumer-facing experiences go, Flash ate its lunch.)
In another article from the same period, he explained that the problem he saw was that the Web did not go far enough. “With today’s technology we will not be able to reach an Internet economy of this size,” he said at a company-sponsored event, as reported by the National Post.
Can you see the inherent flaw in his argument? He essentially assumed that Web technology would never improve and would be replaced by something else, whereas what actually happened is that the Web eventually absorbed everything he wanted, plus much more.
Which is funny, because Forrester’s main competitor, International Data Corp., essentially got it right in the same article. “The Web is a dirt road, it’s infrastructure,” said IDC analyst Michael Sullivan-Trainor. “The concept that you can scrap the Web and start from square one is ridiculous. We’re talking about using the Web, developing it.”
The 2000s: Coming soon, the X Internet
In the 2000s, George Colony kept the “Web is dead” talk going, taking care to emphasize that it was the Web, not the Internet, that would be killed off.
Colony’s comments proved extremely divisive in the IT field, as highlighted in a 2001 roundup of reader criticism The Register received after writing about them. As one reader put it:
I think your article failed to take into account the larger equation, that nothing is static and everything is evolving, including technology. Everything described by you and your sources here has been known since before the birth of the Web, before the birth of easily accessible email, and since the creation of the ARPANET.
Despite being handed his ass in The Register’s comment section, Colony kept it going. He still stood by his “websites are static” stance, even as the concept he was trying to sell came to be called “Web services,” as in this 2003 InfoWorld piece:
To clear up confusion about Web services, Colony provided a definition: “Web services are not the Web or services, but Internet middleware that enables you to connect with customers, partners, and operating groups.”
For example, when a user searches for information about how to implement a new HR process, what comes back are implementation and training tools.
Futurists invent their own terms all the time, and sometimes those terms prove prescient; Gartner’s “Hype Cycle” is a classic example. But Colony’s attempt to rebrand the Web did not take.
But during this period, one company was getting this naming-things strategy right again and again: the publishing firm O’Reilly Media, which gave us, among other things, the LAMP stack, the Maker Movement, and Web 2.0. Meanwhile, the X Internet, whose name was likely a nod to XML (which was being incorporated into HTML at the time), went nowhere.
(Such is the futurist’s plight. At one point during the 2000s, in a piece where he also shared his “Web is dead” prediction, Colony claimed that Oracle would become a commodity player, which turned out not to be true.)
Colony himself eventually embraced Web 2.0, writing in 2007:
Web 2.0 has forever changed the relationship between your company and your customer. Who is best positioned to understand and build this new relationship? Marketing. Who is best at creating the technology to accomplish this? Your business technology/IT group. So there is only one way forward: marketing and technology in your company must work together to design and implement your Web 2.0 strategy. And you, and only you, can breed the dogs with the cats. It is an unnatural act, but it must happen if your company is to be an opportunist rather than a victim in the world of Web 2.0.
It’s strange that the “Web is dead” guy would express such strong views on how great the Web was. But another shift was coming, and he would be right on top of that wave.
The 2010s: It’s going to be an app ecosystem!
George Colony’s Web-is-dead viewpoint found high-profile backing in the pages of Wired (a perennial Tedium topic) in the 2010s, with its cover story “The Web Is Dead. Long Live the Internet,” co-written by Michael Wolff.
(The story disappointingly does not mention Colony or Forrester’s earlier predictions.)
That story generated a lot of conversation, as did Colony himself after bringing up the topic in revised form at the 2011 LeWeb conference. It’s a bold choice to declare the Web dead at a conference whose name is French for “The Web,” but he shrugged that off, saying:
The periphery of the network is constantly getting smarter. And that tells us that many of the old architectures are now gone.

The first model, of course, is the PC model, which says you put all your executables on the desktop. The problem with that model is that it doesn’t take advantage of the cloud, so that’s a dead model. The other model says, “Oh yes, put everything in the Web, put everything in the cloud.” And the problem with that approach is that you have to run it all through a network that is improving, but not at the same rate as processors and storage. It is not taking advantage of this extraordinary growth and power at the periphery. So the Web, as you know it, with 95 percent of the executable on the server rather than on your powerful PC, and the cloud sitting in a central data center, we think that’s an old model too.

So what comes next? We see a model emerging that we call the Internet of Apps, which says that we will have very powerful services in the cloud, in data centers, et cetera, connected to very powerful applications on these local devices. When I say local device, I don’t just mean iPad or mobile. It also means PC. Ultimately it will also mean servers, with a transparent interplay between the applications and those services.
And what we have today on Android is a very simplified version of this.
This proved to be a bad prediction. Apps and the Web both continued to grow, and the cloud was at the center of it all. It turns out networks kept getting faster, and it’s easier to put a more powerful server in the cloud than it is to make your phone or laptop faster. Chips, especially from Intel, began to stagnate during the 2010s.
I’m sure he kept talking about the death of the Web after this, but things really went quiet for a while, until…
The 2020s: AI takes aim at the Web
So the recent chatter, in the wake of OpenAI’s new browser Atlas (terrible name: the browser doesn’t work like an atlas at all, even metaphorically), is that AI will eat the Web.
And as early as 2023, Colony was ready with his take on the matter, in a presentation covered by InformationWeek. Colony’s point, essentially, is that the Web is disorganized, and ChatGPT is going to organize it:
“It’s all we had. But think about it. (The Web) is very, very poorly organized. The Web is really a big mess,” he said.
This is similar to the commentary we’re seeing now, if a bit earlier than most of the critics. But given that he has been saying this literally since 1995, except for a brief stretch when Web 2.0 and later the cloud proved him wrong, it loses its punch.
In August, Colony revisited the idea, offering a take that described the Web not as dead, but as something more pathetic: “Like AM radio, still around, but nobody listens.” (Is he using the same Internet I am? It certainly doesn’t seem like it.)
You can believe the Web is dying all you want; many people have said as much over the years. But maybe don’t keep saying it every few years across a 30-year span, in constantly shifting contexts, or you’ll start sounding like the Chicken Little of the Web.
I, for one, think the Web will do what it always does: democratize knowledge.
