We create all our site collections programmatically with a custom site definition/template. Everything works as expected, except for the crawler: it's apparently denied access to the sites. The crawl log says:
http://server.localnetwork.lan/somesites/siteName The object was not found. (The item was deleted because it was either not found or the crawler was denied access to it.)
And in the log files I'm getting this:
08/11/2009 14:20:34.01  OWSTIMER.EXE (0x0674)  0x1560  Search Server Common  MS Search Administration  7hmh  High  exception in SearchUpgradeProvisioner Keyword Config System.InvalidOperationException: jobServerSearchServiceInstance is null
   at Microsoft.Office.Server.Search.Administration.SearchUpgradeProvisioner..ctor(SearchServiceInstance searchServiceInstance)
   at Microsoft.Office.Server.Search.Administration.OSSPrimaryGathererProject.ProvisionContentSources()
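The "jobServerSearchServiceInstance is null" part suggests the timer job can't find a search service instance on the server running OWSTIMER. As a sanity check (this is just a diagnostic sketch of mine, not part of our provisioning code), the service instances on the local server can be enumerated with the object model:

using System;
using Microsoft.SharePoint.Administration;
using Microsoft.Office.Server.Search.Administration;

class ServiceInstanceCheck
{
    static void Main()
    {
        // List every service instance on this server and flag the search
        // ones, to see whether one exists and whether it is Online.
        foreach (SPServiceInstance instance in SPServer.Local.ServiceInstances)
        {
            string marker = instance is SearchServiceInstance ? "[search] " : "";
            Console.WriteLine("{0}{1}: {2}", marker, instance.TypeName, instance.Status);
        }
    }
}

If the search instance shows up as anything other than Online here, that might explain the provisioning exception.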
If I create a site collection manually, the crawler is able to access it. The same users/accounts have the same access on both site collections, so permissions shouldn't be the issue.
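To rule permissions out more rigorously, the web can be asked directly whether the crawl account has view rights on the new site. A minimal sketch, assuming a placeholder URL and content access account (substitute the real values from the SSP):

using System;
using Microsoft.SharePoint;

class CrawlAccessCheck
{
    static void Main()
    {
        // Placeholder site URL and login -- not our real values.
        using (SPSite site = new SPSite("http://server.localnetwork.lan/somesites/siteName"))
        {
            SPWeb web = site.RootWeb; // disposed along with the SPSite
            bool canView = web.DoesUserHavePermissions(
                @"DOMAIN\crawlAccount", SPBasePermissions.ViewPages);
            Console.WriteLine("Crawl account can view pages: {0}", canView);
        }
    }
}

If that returns true on the programmatically created site as well, plain permissions are probably not the problem.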
The code we use to actually create the site collection looks a little like this:
SPWebApplication app = SPWebApplication.Lookup(new Uri("WebApplicationUrl"));
app.FormDigestSettings.Enabled = false;
// Note: the fourth argument (nLCID) is a uint, e.g. 1033 for en-US
app.Sites.Add("url", "title", "description", 1033,
    "SiteTemplateName", "Owner.Username", "Owner.Fullname", "Owner.Email");
app.FormDigestSettings.Enabled = true;
The code has been slightly altered to protect the innocent... ;)
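One unrelated thing worth noting: if Sites.Add throws, the code above leaves form digest validation disabled. A more defensive variant (just a sketch of the same call wrapped in try/finally):

SPWebApplication app = SPWebApplication.Lookup(new Uri("WebApplicationUrl"));
app.FormDigestSettings.Enabled = false;
try
{
    app.Sites.Add("url", "title", "description", 1033,
        "SiteTemplateName", "Owner.Username", "Owner.Fullname", "Owner.Email");
}
finally
{
    // Re-enable digest validation even if provisioning fails
    app.FormDigestSettings.Enabled = true;
}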
Any idea what we're doing wrong?
(Please note, I'm not sure whether this is a programming error or a config/setup error, so I'm cross-posting to Stack Overflow.)
Answer on Stack Overflow