BOOJ websites are blocking Google via robots.txt: a major SEO issue

We have been working with a large brokerage to transition them off the Booj platform, and one of the big reasons they hired us was that their SEO performance was so bad on Booj.

@Michael_C, the lead of our SEO team, did a deep dive, and while there were a lot of problems to fix, there was one that I quite honestly can’t even comprehend.

BOOJ is quite literally blocking Google from crawling (and, in effect, indexing) their websites via the robots.txt file.

I thought this might have been an accident on the one site, or a configuration error. However, on another call today, a broker client of ours in a different state brought on one of his agents, who wanted help understanding why his website was also (in effect) banned from Google.

I took a look, and sure enough, Booj has blocked Google (and all other search engines and spiders) from crawling his website too.
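If you want to verify a block like this yourself, Python’s standard library can parse a robots.txt and tell you whether a given crawler is allowed in. A quick sketch (example.com stands in here, not an actual Booj domain):

```python
from urllib.robotparser import RobotFileParser

# The exact two-line rule found on these sites: block every
# user agent from every path on the domain.
parser = RobotFileParser()
parser.parse([
    "User-agent: *",
    "Disallow: /",
])

# Googlebot is denied access to the homepage and every other page,
# and so is every other well-behaved crawler.
print(parser.can_fetch("Googlebot", "https://example.com/"))          # False
print(parser.can_fetch("Googlebot", "https://example.com/listings"))  # False
print(parser.can_fetch("Bingbot", "https://example.com/"))            # False
```

To check a live site, you can swap `parser.parse(...)` for `parser.set_url("https://example.com/robots.txt")` followed by `parser.read()`.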

This is CRAZY!! (Screenshot below)

As a professional SEO, there is no greater sin (it’s a death sentence for your business) than accidentally getting your clients banned from Google.

Why would any website company (in effect) do this on purpose?

These agents, teams and brokers on Booj can’t even rank for their own names. (Search the domain and this is how it shows up)

Now, Booj does not claim (that I am aware of) to be an SEO company or to provide SEO services, so I do want to give them the benefit of the doubt that this could in fact be a mistake due to a lack of SEO knowledge (maybe they were trying to accomplish something else here?). But honestly, this is MASSIVE as errors go.

If you want to stop blocking Google, just delete these lines from your robots.txt:

User-agent: *
Disallow: /

(you’re welcome)
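For reference, a robots.txt that lets all crawlers in can simply be empty, or use the same rule with an empty Disallow value (an empty Disallow means “nothing is off-limits”):

```
User-agent: *
Disallow:
```

Either way, double-check the live file at yourdomain.com/robots.txt after the change, since the platform may regenerate it.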

If you are on Booj and you want to know if this is affecting you, post your URL below, we’re happy to take a look.

And if you want to hire professional SEOs who know what they’re doing? You know where to find us.