Does anyone have an idea how to add a robots.txt file to an Azure deployment of Sitefinity for different environments? For DEV and TEST I don't want search engines crawling my sites, so I want a robots.txt file like this:
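That is, the standard disallow-all rules:

```
User-agent: *
Disallow: /
```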
However, for production I would want them to crawl it.
I'm not sure how to get the file up to Azure other than adding it to the web project. Adding it to the package is fine, but how do you vary the file based on the deployment environment?
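One way to sidestep packaging a different file per environment is to serve robots.txt dynamically and let each environment's configuration decide the content. This is only a sketch: `BlockCrawlers` is a hypothetical appSetting you would define per environment (e.g. in each environment's service configuration or web.config transform), and the handler name/registration is illustrative, not Sitefinity's own mechanism.

```csharp
using System;
using System.Configuration;
using System.Web;

// Hypothetical handler that serves robots.txt based on a per-environment
// "BlockCrawlers" appSetting: true in DEV/TEST, false (or absent) in PROD.
public class RobotsHandler : IHttpHandler
{
    public bool IsReusable { get { return true; } }

    public void ProcessRequest(HttpContext context)
    {
        context.Response.ContentType = "text/plain";

        bool block = string.Equals(
            ConfigurationManager.AppSettings["BlockCrawlers"],
            "true",
            StringComparison.OrdinalIgnoreCase);

        context.Response.Write("User-agent: *\n");
        // "Disallow: /" blocks everything; an empty Disallow allows everything.
        context.Response.Write(block ? "Disallow: /\n" : "Disallow:\n");
    }
}

// Registered in web.config (system.webServer/handlers), e.g.:
// <add name="Robots" verb="GET" path="robots.txt" type="RobotsHandler" />
```

With this approach the deployment package is identical across environments; only the configuration value changes, which fits the Azure model of swapping service configuration rather than files.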
// Fragment: fetch the front-end page nodes for the current site
var pageMan = PageManager.GetManager();
var pages = pageMan.GetPageNodes()
    .Where(p => p.RootNodeId == SiteInitializer.CurrentFrontendRootNodeId);