Thank you for explaining. Sounds like it would be wiser to go back to self-hosting. Although perhaps stopping all bots entirely would make sense. Or would we get dropped from the search engines, ineffective as they are?
You can be selective about which bots can access your site and which cannot. It just takes some homework to build that block/allow list; a rough sketch is below. The simplest approach is to allow the biggest, most popular search engine bots and block all the rest. But my whole complaint is that hosts are not helping with this issue when they absolutely should be. They should supply a filter to block AI crawlers as part of their service. It is in their best interest too: while they might be making more money now from the extra AI traffic, AI is going to put them completely out of business in the long run.
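For what it’s worth, here is a minimal sketch of what that allow/block list could look like as a robots.txt file at the site root. It assumes Googlebot and Bingbot are the crawlers you want to keep (swap in whichever ones matter to you), and it only works against crawlers that actually honor robots.txt:

```
# robots.txt sketch: allow a short list of search bots, block everything else.
# Bot names here are assumptions; adjust to the crawlers you actually want.

# Allow the major search engine bots
User-agent: Googlebot
Allow: /

User-agent: Bingbot
Allow: /

# Name a couple of known AI crawlers explicitly (they would be caught by the
# catch-all below anyway, but listing them makes the intent clear)
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /

# Block every other bot
User-agent: *
Disallow: /
```

Keep in mind this only keeps out the polite bots that identify themselves and follow the rules; scrapers that ignore robots.txt have to be blocked at the server level by user agent or IP, which is exactly the kind of filtering the hosts should be providing.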
As for self-hosting, that is a double-edged sword. I don’t blame the FR for going with third-party hosting. The server infrastructure, hardware, maintenance, etc. are on the host. They can handle a lot more traffic if needed. And they supply some tools that are extremely helpful, especially security tools. So it has its advantages.
But at the same time, it now looks like hosting companies are going to collaborate with the powers that be and start allowing backdoor access to databases and user details. I think Palantir is doing that right now by setting up total collection of everything from everywhere. That is where a private server would be an advantage. But you would still have to set up the bot controls and fight your own security battles.
I honestly can’t see an easy way to run from this mess as long as we continue to use the top-level domain system and hosts do not help, especially when it is a site that is open to the public out front. Ours is a closed, private party; you have to log in to view it. This helps greatly because ALL access can be shut off to everyone, including bots, unless they are logged in as a trusted member.
The problem is that you can block the declared bots, but AI scrapers that ignore the rules can still read and crawl the text on the public side. So it still ends up being a load, just like massive human traffic hitting the public pages. If the site is closed off from the public, they cannot do this. See the situation they are putting us in? We pretty much have to shut down our own sites, sacrifice revenue, or give up the free advertising of normal search bots to get away from the AI. It is an extremely unfair business practice that needs to be addressed. And we haven’t even addressed the theft of copyrighted material AI is stealing from domains yet.
I am now sure this is the plan: to destroy the open, independent internet and independent hosting and make everyone dependent on their AI services. This will eliminate interaction and communication between humans, and it will give them complete control over information. They can literally alter reality and change history as they like; as you can see, they already are with search result control. The truth is, even though they advertise AI as an “option”, they have actually already implemented it as their main search protocol and are hiding that fact. They are not using regular old-school web crawlers anymore; it is all AI-based collection and information control.
Like I said, it is a fun free toy now, but in the long run we are going to pay dearly. We are going to lose the open, independent internet as we know it. And if the hosting industry wants to continue to exist, it had better get on board now before it is too late. And it is time for domain owners to get proactive and put pressure on them.