There has been a slight mismatch between the number of sites submitted and the number indexed. It turns out that 10 sites have a "User-agent: * Disallow: /" in their robots.txt, which blocks all crawlers. I've added those sites to the do-not-index list, which means that if you resubmit them you'll see the message '... has previously been submitted but ... Access blocked by robots.txt'. If you see this but have since updated your robots.txt to allow searchmysite.net, let me know and I'll move the site back to the index list. #searchmysite
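For reference, the blocking directive looks like this, and a site can instead allow a specific crawler with its own User-agent section (the exact user-agent token for searchmysite.net is an assumption here; check the crawler's documentation):

User-agent: searchmysite
Disallow:

User-agent: *
Disallow: /

Crawlers use the most specific matching User-agent group, so this blocks everything except the named crawler.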