How to Fix Duplicate Content Issue in Blogger
- Are you facing a duplicate content issue?
- Is your ranking dropping as your post count grows day by day?
- Did you get hit by Google Panda?

If any of the above questions is true, then you must fight the duplicate content issue within your site. I already wrote an article on how to identify and remove low-quality content from Google, but in this article we will discuss how to fix this issue permanently. Duplicate content within a site is a big issue, and most bloggers are not aware of how to fix it. Even my blog's ranking dropped due to a duplicate content issue after moving to a responsive theme.

How to Identify Duplicate Content:
- Go to Google and search for site:yoursite.com.
- This will show the list of all the pages of your site that are indexed in Google.
*(Screenshot: site:alltechbuzz.net search results)*
Sometimes you may notice mobile versions of pages and search-result pages getting indexed in Google, such as:
- yoursite.com/2013/04/keyword1-keyword2-keyword3.html?m=1
- yoursite.com/2013/04/keyword1-keyword2-keyword3.html?m=0
- yoursite.com/2013/04/keyword1-keyword2-keyword3.html/search?abcd

The content on such pages duplicates the content of the main URL, thereby creating duplicate content.
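To make the relationship concrete: the mobile variants above differ from the main post URL only by the `m` query parameter. A minimal Python sketch (my own illustration, not part of Blogger) that maps such a variant back to its canonical URL:

```python
from urllib.parse import parse_qsl, urlencode, urlsplit, urlunsplit

def canonical_url(url):
    """Strip Blogger's mobile 'm' query parameter (?m=0 / ?m=1) so the
    mobile variant maps back to the main post URL."""
    parts = urlsplit(url)
    # Drop the 'm' parameter but keep any other query parameters
    query = [(k, v) for k, v in parse_qsl(parts.query) if k != "m"]
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       urlencode(query), parts.fragment))

print(canonical_url("https://yoursite.com/2013/04/keyword1.html?m=1"))
# -> https://yoursite.com/2013/04/keyword1.html
```

This is exactly the mapping Google should be making on its own; the rest of this article is about forcing that outcome.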
You can also find duplicate content via Google Webmaster Tools:
- Go to Google Webmaster Tools.
- Navigate to Optimization, then HTML Improvements.
- There you will find a list of all the duplicate content issues.
*(Screenshot: Duplicate Content in Webmaster Tools)*
- The most commonly faced problem is a duplicate meta description. In this article, I will explain how to deal with this issue.
How to Fix Duplicate Content Issue?
Remove Duplicate URLs from Google:
- Go to Google.com and search for site:yoursite.com.
- This will list all the URLs of your site that are indexed in Google.
- Find the URLs that end with the wrong extension, like ?m=1, ?m=0, etc.
- Open each such page and copy its URL.
- Then go to Webmaster Tools and navigate to Optimization, then Remove URLs.
- Click on Create New URL Removal Request.
- Paste the URL there and submit.
- The next time Google indexes your website, those submitted URLs should disappear from search results.
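The "find the wrong extensions" step above can be partially automated. Given a list of indexed URLs copied from the site: results (the sample list here is hypothetical), a short Python sketch flags the variants worth submitting for removal:

```python
import re

# Hypothetical sample of URLs copied from a site:yoursite.com search
indexed = [
    "https://yoursite.com/2013/04/post.html",
    "https://yoursite.com/2013/04/post.html?m=1",
    "https://yoursite.com/2013/04/post.html?m=0",
    "https://yoursite.com/2013/04/post.html/search?abcd",
]

# Flag mobile variants (?m=0 / ?m=1) and stray /search pages
duplicates = [u for u in indexed
              if re.search(r"\.html\?m=[01]$", u) or "/search?" in u]

for url in duplicates:
    print(url)
```

Only the first URL in the sample survives the filter; the other three are the duplicates to submit.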
Steps to Avoid Duplicate Content Getting Indexed in Google in the Future:
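The code block this section refers to does not appear in the text. Based on the description that follows, a robots.txt fragment along these lines would have the described effect (a sketch, not necessarily the author's original; the same pair of `/*.html` rules appears in the full robots.txt later in this article, and the `*` and `$` wildcards are Google-specific extensions):

```
User-agent: *
Disallow: /*.html
Allow: /*.html$
```

Here `Disallow: /*.html` blocks any URL containing `.html` followed by more characters (such as `?m=1`), while `Allow: /*.html$` re-permits URLs that end exactly in `.html`.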
What does this code do?
- This code makes Google index only those pages ending with “.html” and blocks all pages whose URLs end with “.html?m=1”, “.html?m=0”, etc.
Configuring URL Parameters in Google Webmaster Tools:
This step is optional. Skip it if you are not confident configuring Webmaster Tools settings.
Google Webmaster Tools provides an additional feature to configure which URL parameters get indexed in Google. Use it with caution, or your whole website might get de-indexed.
- In Webmaster Tools navigate to Configuration then URL Parameters.
- You can see the url parameters as follows.
*(Screenshot: URL Parameters in Webmaster Tools)*
- By default, Google sets a certain crawl value for each of these parameters.
- The main issue is with the mobile-version parameter, which is “m”.
- Click on Edit, apply the settings shown in the screenshot below, and save them.
*(Screenshot: Mobile Parameter Settings)*
Changing Robots.txt File:
Note: This is for advanced users only. Use it only if you are using a responsive theme, or if you are unable to fix the duplicate content issue after following all the other steps.
- Open your Blogger Dashboard.
- Then go to Settings then Search Preferences.
- In Search Preferences, enable custom robots.txt, which is disabled by default.
- Enable it and paste the following code.
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Disallow: /*.html
Allow: /*.html$
Allow: /

Sitemap: https://www.alltechbuzz.net/feeds/posts/default?orderby=UPDATED
- In the above code, replace www.alltechbuzz.net with your own website URL, then save it.
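To sanity-check what the `User-agent: *` rules above actually do, here is a small Python sketch of Google-style robots.txt pattern matching. The matcher, the hard-coded rule list, and the longest-match tie-breaking are my own simplified model of Google's documented behaviour, not an official library:

```python
import re

def rule_matches(pattern, path):
    # Translate a robots.txt pattern ('*' wildcard, trailing '$' anchor) to a regex
    anchored = pattern.endswith("$")
    core = pattern[:-1] if anchored else pattern
    regex = "".join(".*" if ch == "*" else re.escape(ch) for ch in core)
    return re.match("^" + regex + ("$" if anchored else ""), path) is not None

def is_allowed(path, rules):
    # Longest matching pattern wins; Allow wins ties (Google's documented rule)
    best_kind, best_pat = "Allow", ""
    for kind, pat in rules:
        if rule_matches(pat, path):
            if len(pat) > len(best_pat) or (len(pat) == len(best_pat) and kind == "Allow"):
                best_kind, best_pat = kind, pat
    return best_kind == "Allow"

# The User-agent: * rules from the robots.txt above
rules = [("Disallow", "/search"),
         ("Allow", "/"),
         ("Disallow", "/*.html"),
         ("Allow", "/*.html$")]

print(is_allowed("/2013/04/post.html", rules))       # True  (canonical page)
print(is_allowed("/2013/04/post.html?m=1", rules))   # False (mobile duplicate)
print(is_allowed("/search?abcd", rules))             # False (search page)
```

Because `/*.html$` is the longest matching rule for URLs ending in `.html`, the canonical pages stay crawlable while the `?m=` variants and `/search` pages are blocked.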
Custom Robots Header:
- Open the Blogger dashboard, go to Settings, then choose Search Preferences.
- Enable Custom Robots Header Tags, which are disabled by default, and apply the settings shown in the screenshot below.
*(Screenshot: Custom Robots Header for Blogger/Blogspot)*
- Once you are done with all the settings, save them so they take effect.
But won't this robots.txt block Google's mobile bot? Then won't our mobile rankings decrease?
I analyzed my blog with Google's mobile-friendly test while using your robots.txt, and it said the blog redirects to a blocked page.
Thanks bro, the m parameter is not showing in URL Parameters. I will fix it in the robots.txt file. How many days does it take to remove the duplicate content issue? Please reply as soon as possible.
Thanks, brother, for this useful tutorial. How much time will it take to remove the duplicate content issue?
This information is highly useful, though I discovered that if one disables ?m=1, the mobile version will vanish from the search engine. I used a tool to check my duplicated content and found that every page appeared twice. After searching on Google I found only my home-page URL indexed, so after some thought I realized the second copies were the mobile pages, and from the results, the PC pages have more authority than the mobile pages.
If you are having a duplicate content issue on your blog, try using a canonical link in your HTML. This approach is specifically recommended by Moz, and it worked for me.
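For reference, the canonical link this commenter describes is a single tag in the page's head, pointing every variant of a page at the preferred URL (the URL below is a placeholder):

```html
<head>
  <!-- Tells search engines which URL is the preferred version of this page -->
  <link rel="canonical" href="https://yoursite.com/2013/04/keyword1-keyword2.html"/>
</head>
```

With this tag in place, the ?m=1 and ?m=0 variants consolidate their signals onto the main URL instead of competing with it.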