Hello,
I've tried adding a robots.txt file on my server and a robots meta tag in my pages. I've also removed my sitemap and disabled it in the settings, but none of this seems to stop search engines from crawling my site.
Are there any other known procedures to keep my site from showing up in search engines? Thanks.
XFileSharing Pro - Remove site from search engines
robots.txt is only a set of rules asking search engine crawlers not to crawl the site; they can choose not to follow it.
Unfortunately, you cannot completely stop your site from appearing in search engines. Your links will end up all over the web, posted in forums, emails, and blogs, where search engine robots can crawl them.
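Since robots.txt is only advisory, the supported way to get pages dropped from the index is the robots meta tag. One gotcha worth knowing: if robots.txt blocks a page entirely, crawlers never fetch it, so they never see a noindex tag on it. A minimal sketch of both pieces (paths and values here are generic examples, not XFileSharing Pro settings):

```
# robots.txt (in the site root) -- a polite request, not enforcement
User-agent: *
Disallow: /

<!-- In each page's <head>; this is what actually removes an
     already-crawled page from the index -->
<meta name="robots" content="noindex, nofollow">
```

To get noindex applied, you may need to temporarily allow crawling in robots.txt so the crawler can re-fetch the pages and see the tag.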
Of course.
But what I don't understand is that even files uploaded as non-public appear in Google after a few days.
And they don't appear in any forum when I search for them; they show up as their own links in Google, next to our other pages like the TOS, FAQ, etc.
Are you 100% sure those files must have been posted somewhere in a forum to appear?
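Non-public file URLs can leak in other ways (referrer headers, browser toolbars, shared links), so even unlisted pages may get discovered. To keep the file pages themselves out of the index regardless of how they were found, the `X-Robots-Tag` response header can be sent server-wide. A sketch for Apache with mod_headers enabled (the FilesMatch pattern is an assumption; adjust it to your file-page URLs):

```
# .htaccess -- tell crawlers not to index matching responses,
# even if they discover the URL somewhere
<FilesMatch "\.(cgi|html)$">
    Header set X-Robots-Tag "noindex, nofollow"
</FilesMatch>
```

Unlike robots.txt, this is honored at index time by major search engines, so discovered pages are still dropped from results.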
The file will not be generated by the script.
Based on the code below from index.cgi, if the file catalogue.rss is not found, the rest of the code that updates catalogue.rss is skipped.
Code:
# Only regenerate the feed if catalogue.rss already exists
# and is more than 3 seconds old.
if (-f "$c->{site_path}/catalogue.rss" && time - (lstat("$c->{site_path}/catalogue.rss"))[9] > 3)
{
    # Fetch the 20 most recent public files for the feed.
    my $last = $db->SelectARef("SELECT file_code,file_name,file_descr,DATE_FORMAT(CONVERT_TZ(file_created, 'SYSTEM', '+0:00'),'%a, %d %b %Y %T GMT') as date FROM Files WHERE file_public=1 ORDER BY file_created DESC LIMIT 20");
    for (@$last)
    {
        $_->{download_link} = $ses->makeFileLink($_);
        $_->{download_link} =~ s/\&amp;/&/gs;
        $_->{download_link} = $ses->SecureStr($_->{download_link});
        $_->{file_name} =~ s/\&amp;/&/gs;
        $_->{file_name} = $ses->SecureStr($_->{file_name});
    }
    my $tt = $ses->CreateTemplate("feed.rss");
    $tt->param(list => $last);
    open FILE, ">$c->{site_path}/catalogue.rss";
    print FILE $tt->output;
    close FILE;
}
exit unless $ses->{dc};
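Since the `-f` check means the script only updates an existing catalogue.rss, one way to kick-start generation is to create an empty file at that path by hand; the script should then fill it in on the next page request. A sketch, assuming a typical site path (substitute your actual `$c->{site_path}` value):

```shell
#!/bin/sh
# SITE_PATH is a placeholder -- use the directory your $c->{site_path} points to.
SITE_PATH="/var/www/mysite"

# Create an empty catalogue.rss so the "-f" test in index.cgi passes.
touch "$SITE_PATH/catalogue.rss"

# Make sure the web server can read (and the CGI can rewrite) the file.
chmod 644 "$SITE_PATH/catalogue.rss"
```

The file's owner must match the user the CGI runs as, or the `open FILE, ">..."` call will fail silently when it tries to rewrite the feed.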