SEO experts worldwide have long debated whether releasing thousands upon thousands of new pages could hurt a site’s Google ranking. For example, what would happen if a company launched 50,000 new product descriptions at once? The question is not unreasonable: customers who rely on automated text creation and use a Text Robot for it face exactly this decision.

There has been much debate in professional circles about the possible effects. Is it good to put so much new content online at once? Does it actually harm the ranking? Or does it not matter at all?

Since Google had never issued an official statement on the matter, the credo was to play it safe and publish in several stages: better safe than sorry.

A clear statement for the first time

Now John Müller, Google’s Webmaster Trends Analyst and one of the search engine giant’s best-known voices on search, has taken a stand on precisely these cases in an expert Q&A session. Asked whether it was OK to publish 100,000 web pages at once, he said: “Sure, no problem!”

From an SEO point of view, it is absolutely fine to publish everything at once. Only two things need to be ensured: that Google can discover the new content, and that the web server can handle Googlebot’s crawl requests.
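One proven way to make a large batch of new pages discoverable is an XML sitemap. As a minimal sketch (the domain and URL pattern are hypothetical placeholders), the following Python script splits 100,000 URLs into sitemap files of at most 50,000 entries each, the limit defined by the sitemap protocol, and ties them together with a sitemap index:

# Minimal sketch: chunk a large URL list into sitemap files plus a
# sitemap index, so Googlebot can discover all new pages at once.
# Domain and URL pattern below are hypothetical placeholders.
from xml.sax.saxutils import escape

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
CHUNK_SIZE = 50_000  # the sitemap protocol allows at most 50,000 URLs per file

def write_sitemaps(urls, base="https://www.example.com"):
    """Write chunked sitemap files and an index file referencing them."""
    chunks = [urls[i:i + CHUNK_SIZE] for i in range(0, len(urls), CHUNK_SIZE)]
    for n, chunk in enumerate(chunks, start=1):
        with open(f"sitemap-{n}.xml", "w", encoding="utf-8") as f:
            f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
            f.write(f'<urlset xmlns="{SITEMAP_NS}">\n')
            for url in chunk:
                f.write(f"  <url><loc>{escape(url)}</loc></url>\n")
            f.write("</urlset>\n")
    # The index is the single entry point that gets submitted to Google.
    with open("sitemap-index.xml", "w", encoding="utf-8") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        f.write(f'<sitemapindex xmlns="{SITEMAP_NS}">\n')
        for n in range(1, len(chunks) + 1):
            f.write(f"  <sitemap><loc>{base}/sitemap-{n}.xml</loc></sitemap>\n")
        f.write("</sitemapindex>\n")

# Example: 100,000 hypothetical product description pages.
write_sitemaps([f"https://www.example.com/products/{i}" for i in range(100_000)])

Submitting the single index file, for example in Google Search Console, then points the crawler at every new page in one step.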

“In one go” is actually preferable

What is more, Müller advises against step-by-step publication: building the search engine index is less problematic overall when it does not have to be assembled piece by piece from successive crawl results.

One thing is thus clear: if you rely on automated text creation and have a Text Robot produce a lot of fresh, unique content, you can publish it all immediately. And that is exactly how it is meant to be used.

You can find the interview with the question and Müller’s answer in this video (from 19:12).

The transcript of the original question and answer:

Can I publish a hundred thousand pages at once?

Sure! No problem.

I think the biggest impact, the biggest effect you’ll see is that Googlebot will try to crawl a lot of pages. Especially if it’s a larger, well-known site, where we already know that crawling pages and indexing pages from that site makes a lot of sense. Then we’ll probably kind of try to crawl a lot of pages on your server, and it would be good if your server had the capacity to deal with that.

But from an SEO point of view that’s generally not an issue.
Some sites put their archive online all at once. I think artificially introducing a kind of trickle into the index is something that often causes more problems than it solves. So obviously, putting a hundred thousand pages online all at once doesn’t mean that these pages will rank all at once immediately. It’s still going to take time for us to actually rank those pages properly in the search results. But I don’t see anything against putting them all online.
