I am developing a new CMS for our company and, for simplicity and clean code, decided to have just one file in the root, index.php. We use search-engine-friendly URLs as well as unique IDs to load pages. With SEF URLs, Google indexes the pages individually just fine, even though they are all parsed through index.php.
However, sometimes we have a client whose host either runs Windows or has mod_rewrite off, so we can't use SEF URLs. Our URLs in that case look like http://website.com/?pid=2
Would Google still index each page in this case? It would still see a separate page title and description for each page, along with different content and layout. But the URL points to index.php every time, just with some GET data on the end. I'm not sure whether all the pages would be indexed.
Can someone answer this? I don't want to build a CMS that parses everything through index.php if Google won't list pages that don't have SEF URLs.
Answers: 2 for answer № 1
If you want your pages to be indexed by Google, your best bet is to include a sitemap that references them all. This way Google knows to look for them and will list them as individual pages.
This lets you include pages that are reachable only via a GET variable by listing each one in the XML as a separate entry:
<?xml version="1.0" encoding="utf-8"?>
<urlset xmlns="http://www.google.com/schemas/sitemap/0.90">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2013-04-27T23:55:42+01:00</lastmod>
    <changefreq>daily</changefreq>
    <priority>0.5</priority>
  </url>
  <url>
    <loc>http://www.example.com/?id=12</loc>
    <lastmod>2013-04-26T17:24:27+01:00</lastmod>
    <changefreq>daily</changefreq>
    <priority>0.5</priority>
  </url>
</urlset>
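Since the CMS already knows every page ID, the sitemap can be generated from that list rather than maintained by hand. A minimal sketch in PHP, assuming a `buildSitemap()` helper and a `$pages` array shape that are not part of the original CMS (adapt the names and the data source to your own schema):

```php
<?php
// Hypothetical sketch: build a sitemap for query-string-only page URLs.
// buildSitemap() and the $pages array shape are assumptions, not CMS code.
function buildSitemap(array $pages): string
{
    $xml  = '<?xml version="1.0" encoding="utf-8"?>' . "\n";
    $xml .= '<urlset xmlns="http://www.google.com/schemas/sitemap/0.90">' . "\n";
    foreach ($pages as $page) {
        // Each page becomes its own <url> entry, even though every
        // URL resolves to index.php with a different pid parameter.
        $loc  = 'http://www.example.com/?pid=' . urlencode((string) $page['pid']);
        $xml .= "  <url>\n";
        $xml .= '    <loc>' . htmlspecialchars($loc) . "</loc>\n";
        $xml .= '    <lastmod>' . $page['modified'] . "</lastmod>\n";
        $xml .= "    <changefreq>daily</changefreq>\n";
        $xml .= "  </url>\n";
    }
    $xml .= "</urlset>\n";
    return $xml;
}

// Example: serve the sitemap (e.g. from a sitemap.php endpoint),
// pulling the rows from wherever the CMS stores its pages.
header('Content-Type: application/xml; charset=utf-8');
echo buildSitemap([
    ['pid' => 1, 'modified' => '2013-04-27'],
    ['pid' => 2, 'modified' => '2013-04-26'],
]);
```

Regenerating the file on each request (or on a schedule) keeps the sitemap in step with the pages the CMS actually serves.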
0 for answer № 2
Of course. Search Google for
site:www.example.com, where example.com is a known site that uses this method. You will see that Google indexes those pages accordingly.