I thought I asked this before, but I can't find a thread by me with this question, so I'll try again.
Basically, on one of my sites I have used a new-to-me technique to try to increase the site's relevance to more than one town. I did this by adding links for each town, then changing the keywords to match that town. It seems to be working OK, but I'm concerned about the duplicate content Google may see on each of my pages, as very little has changed.
If it's OK as it is, I'm considering doing this for another site offering more services, so I will have, say, 6 services, each linking to 6 towns with the same content for each service, resulting in 36 pages, with only 6 being different. Will this still be OK, given that the duplicate content is only on my own site?

You could use the rel=canonical tag in your on-site links, and try to revise some of the wording or use a spinner to make each page unique; otherwise it could affect the SERPs for the landing pages. In my own experience, using different title tags and description tags for the different linked pages works fine. Using the town name together with your main targeted keywords, or variants of them, would be better still.
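If it helps, here's a rough sketch of what that might look like in practice: a throwaway script that prints a per-town title tag, meta description, and a self-referencing canonical link. The domain, towns, and keyword below are made-up placeholders, not anything from a real site:

```python
# Hypothetical sketch: stamp out per-town head tags so each page gets a
# unique title/description plus a self-referencing canonical link.
# Domain, towns, and keyword are invented examples.

TOWNS = ["Springfield", "Riverton", "Lakeside"]
KEYWORD = "plumbing services"          # main targeted keyword
DOMAIN = "https://www.example.com"

def head_tags(town: str) -> str:
    slug = town.lower().replace(" ", "-")
    url = f"{DOMAIN}/{slug}-{KEYWORD.replace(' ', '-')}"
    return "\n".join([
        f"<title>{KEYWORD.title()} in {town} | Example Co.</title>",
        f'<meta name="description" content="Affordable {KEYWORD} in {town}. Call for a free quote.">',
        f'<link rel="canonical" href="{url}">',
    ])

for town in TOWNS:
    print(head_tags(town), end="\n\n")
```

The point is just that the title and description vary with the town while each page's canonical points at itself, so you aren't telling Google to fold the town pages into one.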
Thanks.

Since each page has only a small change made to it, I would assume each one is only about 10% unique. Google will pick that up soon enough, and the pages will most likely be de-indexed.
I would recommend spinning the content yourself, or hiring someone to do it. You should make each page at least 40% unique to be safe.

I know exactly what you mean. Google will only index some of those pages, not all of them.
Try to make the page sizes different too, maybe by adding more or fewer graphics to each page. Google compares file size as well.
Your method does work, but not all of the pages get indexed, and it takes a long time to copy pages and change the title, keyword, and description tags. It depends on how many pages you are talking about; I personally did the same trick and generated close to 700 pages that way.
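For what it's worth, the copy-and-swap part is easy to script rather than do by hand. Something along these lines would generate one page per service/town pair from a single template; the services, towns, and output folder here are invented just to show the idea:

```python
# Rough sketch of the copy-and-swap approach: one body template, one generated
# page per service/town pair, with only the head tags and a few placeholders
# varying. Services, towns, and paths are made up for illustration.
from pathlib import Path
from itertools import product

SERVICES = ["roofing", "guttering", "fascias"]
TOWNS = ["Springfield", "Riverton", "Lakeside"]

PAGE_TEMPLATE = """<!doctype html>
<html>
<head>
  <title>{service} in {town}</title>
  <meta name="description" content="Local {service} in {town} and surrounding areas.">
  <meta name="keywords" content="{service}, {town}, {service} {town}">
</head>
<body>
  <h1>{service} in {town}</h1>
  <p>Body copy for the {service} service, lightly reworded per town.</p>
</body>
</html>
"""

out_dir = Path("generated_pages")
out_dir.mkdir(exist_ok=True)

for service, town in product(SERVICES, TOWNS):
    html = PAGE_TEMPLATE.format(service=service.title(), town=town)
    (out_dir / f"{service}-{town.lower()}.html").write_text(html)
```

Scripting it gets you the 6 services x 6 towns = 36 pages in seconds, but it doesn't change the underlying issue: the body copy still needs to be varied per page if you want them all indexed.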
Glad to hear it's working for you, but I always err on the safe side and use as much fresh content as possible. It just looks more natural, and your visitors will enjoy it more too. Also glad to hear the good advice from Fox Trots - I did not know that Google compares file size.

Although there is NO duplicate content penalty, you are probably better off varying the content within each individual page to some degree. When Google's algorithms crawl your pages, they compare the keywords, the relevance of the keywords to the content, and the quality of the content itself. So if the content on some of your secondary pages is more relevant to the keywords on your homepage, your homepage's SERPs may be affected. NOW, not to be misinterpreted: this only means that once the pages with higher relevance are indexed, the other pages will be ignored. So you are probably better off having as much original content as you can.

It's not that difficult to create content for the 6 services, so why not just write new content for them? That way you can add keywords that target those specific pages, rather than just the site as a whole.