Page Indexing without content
-
Hello.
I have a problem with pages being indexed without content. My website is in three different languages, and two of the language versions are indexing just fine, but one language version (the most important one) is being indexed without content. When I search using the site: operator, the page comes up, but when I search for unique keywords that I should rank for without question, nothing comes up.
These pages were indexing just fine until a couple of days ago, when the problem arose after a Google update finished rolling out. Looking further, the problem is language-related: every page in the given language that is newly indexed has this problem, while pages that were last crawled around a week ago are just fine.
Has anyone run into this type of problem?
-
I've encountered a similar indexing issue on my website, https://sunasusa.com/. To resolve it, ensure that the language markup and content accessibility on the affected pages are correct. Review any recent changes and the quality of your content. Utilize Google Search Console for insights, or consider reaching out to Google support for assistance.
-
To remove hacked URLs in bulk from Google's index, clean up your website, secure it, and then use Google Search Console to request removal of the unwanted URLs. Additionally, submit a new sitemap containing only valid URLs to expedite re-indexing.
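For the sitemap step, here is a minimal sketch of a sitemap that lists only clean, valid URLs (the domain, paths and dates below are placeholders, not real ones):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <!-- List only the legitimate, canonical URLs; leave out anything the hack created -->
      <url>
        <loc>https://www.example.com/</loc>
        <lastmod>2024-05-01</lastmod>
      </url>
      <url>
        <loc>https://www.example.com/services/</loc>
        <lastmod>2024-05-01</lastmod>
      </url>
    </urlset>

Submit the file in Search Console's Sitemaps report once the cleanup is done, so Google re-crawls the good pages first.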
-
It seems that after a recent Google update, one language version of your website is experiencing indexing issues, while others remain unaffected. This could be due to factors like changes in algorithms or technical issues. To address this:
- Check hreflang tags and content quality (see the example hreflang markup after this list).
- Review technical aspects like crawlability and indexing directives.
- Monitor Google Search Console for errors.
- Consider seeking expert assistance if needed.
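To illustrate the hreflang point in the first bullet, this is roughly what the alternate-language annotations in each page's <head> could look like (example.com and the language codes are placeholders; every language version should list all the others plus itself):

    <link rel="alternate" hreflang="en" href="https://www.example.com/en/page/" />
    <link rel="alternate" hreflang="de" href="https://www.example.com/de/seite/" />
    <link rel="alternate" hreflang="fr" href="https://www.example.com/fr/page/" />
    <link rel="alternate" hreflang="x-default" href="https://www.example.com/en/page/" />

If the affected language version is missing from this set, or its URLs redirect or are blocked, Google may serve or index the wrong version for that language while the other versions stay fine.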
-
@AtuliSulava Re: Website blog is hacked. What's the best practice to remove bad URLs?
A similar problem also happened to me: many URLs are indexed but they have no actual content. My website (scaleme) was hacked and thousands of URLs with Japanese content were added to it. These URLs are now indexed by Google. How can I remove them in bulk? (Screenshot attached.)
I am the owner of this website. Thousands of Japanese-language URLs (more than 4,400) were added to my site. I am aware of Google's URL removal tool, but adding URLs one by one and submitting them for removal is not feasible because such a large number of URLs are indexed by Google.
Is there a way to find these URLs, download a list, and remove them in bulk? Does Moz have any tool to solve this problem?
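One way to get that list without clicking through URLs one at a time: export the affected URLs from Search Console (the page indexing/coverage report and the Performance report both have an Export option) and filter the export for Japanese characters. A rough Python sketch, assuming a CSV export with the URL in the first column (the filename and column position are assumptions, adjust them to your export):

    import csv
    import re
    from urllib.parse import unquote

    # Hiragana, Katakana and common CJK ideograph ranges, enough to flag the spam URLs
    JAPANESE = re.compile(r"[\u3040-\u30ff\u4e00-\u9fff]")

    hacked = []
    with open("search_console_export.csv", newline="", encoding="utf-8") as f:  # assumed filename
        reader = csv.reader(f)
        next(reader, None)  # skip the header row if the export has one
        for row in reader:
            if not row:
                continue
            url = row[0]
            if JAPANESE.search(unquote(url)):  # decode %-encoding before testing
                hacked.append(url)

    # Save the list so it can drive your 404/410 rules or removal requests
    with open("hacked_urls.txt", "w", encoding="utf-8") as out:
        out.write("\n".join(hacked))

    print(f"Found {len(hacked)} suspicious URLs")

Once those URLs all return 404 or 410, and if they share a common folder, you can use the Removals tool's "Remove all URLs with this prefix" option for that path instead of submitting them one by one; Google will drop the rest as it re-crawls.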
-
I've faced a similar indexing issue on my own website, https://mobilespackages.in/. To resolve it, ensure correct language markup and content accessibility on the affected pages. Review recent changes and content quality. Use Google Search Console for insights or reach out to Google support.
-
@AtuliSulava It sounds like you're experiencing a frustrating issue with your website's indexing. I have faced this issue myself; unfortunately, I once prevented my own website from being indexed in Google by mistake. Here are some steps you can take to troubleshoot and potentially resolve the problem:
Check Robots.txt: Ensure that your site's robots.txt file is not blocking search engine bots from accessing the content on the affected pages.
Review Meta Tags: Check the affected pages for a <meta name="robots" content="noindex"> tag. If present, remove it to allow indexing (a sample <head> snippet follows these steps).
Content Accessibility: Make sure that the content on the affected pages is accessible to search engine bots. Check for any JavaScript, CSS, or other elements that might be blocking access to the content.
Canonical Tags: Verify that the canonical tags on the affected pages are correctly pointing to the preferred version of the page.
Structured Data Markup: Ensure that your pages have correct structured data markup to help search engines understand the content better.
URL Inspection: Use Google Search Console's URL Inspection tool (the replacement for the old "Fetch as Google") to see how Googlebot sees your page and whether there are any issues with rendering or accessing the content.
Monitor Google Search Console: Keep an eye on Google Search Console for any messages or issues related to indexing and crawlability of your site.
Wait for Re-crawl: Sometimes, Google's indexing issues resolve themselves over time as the search engine re-crawls and re-indexes your site. If the problem persists, consider requesting a re-crawl through Google Search Console.
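As a quick reference for the robots.txt, meta-robots and canonical checks above, here is a sketch of what an indexable page's <head> should roughly contain (the URL is a placeholder):

    <head>
      <!-- Either omit the robots meta tag entirely or keep it permissive; no "noindex" here -->
      <meta name="robots" content="index, follow">
      <!-- The canonical should point at the preferred version of this same page, not somewhere else -->
      <link rel="canonical" href="https://www.example.com/page/">
    </head>

And in robots.txt, make sure no Disallow rule covers the affected pages; for example, a leftover "Disallow: /" or a rule matching that language folder would keep Googlebot from ever fetching the content.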
If the issue continues, it might be beneficial to seek help from a professional SEO consultant who can perform a detailed analysis of your website and provide specific recommendations tailored to your situation.
-
@AtuliSulava Perhaps indexing of those blank pages is blocked somewhere on the site; look into which site files (robots.txt, meta tags and so on) could be carrying an indexing ban.
Related Questions
-
Chat GPT
I want to get your thoughts on ChatGPT for creating articles on my site to drive SEO. Does Google approve of this type of content or not? It seems quite good quality. I suppose a key question also is: is it duplicate content? I have used it on the Propress website and also on blog sites, so I need to understand whether this will reduce my rankings. Thanks
Content Development | | Katie231
Matthew1
-
Rankings preferring English URL
We've recently had a redesign of our website and we have both a Dutch and an English version. However, in MOZ for both NL and BE-NL it seems to favor the English URLs. This never used to be the case and I'm wondering why it's happening and whether it could actually be hurting our SEO, as search engines would favor local languages for search queries.
Local SEO | | Billywig0
-
Is page speed important to improve SEO ranking?
I saw on an SEO agency's site (https://burstdgtl.com/search-engine-optimization/) that page speed apparently affects Google rankings. Is this true? And if it is, how do I improve it? Do I need an agency?
On-Page Optimization | | jasparcj0
-
Lots of Pages Dropped Out of Google's Index?
Until yesterday, my website had about 1,200 pages indexed in Google. I made lots of changes: removed low-quality content, rewrote passable content to make it better, wrote high-quality content, got lots of likes and shares on social networks, etc. Now this morning I see that out of 1,252 pages submitted, only 691 are indexed. Is that a temporary situation related to the recent updates? Is anyone else seeing this? What should I make of it?
Technical SEO | | sbrault740
-
Noindex Pages indexed
I'm having a problem where Google is indexing my search results pages even though I have added the "noindex" meta tag. Is the best fix to block the robot from crawling those pages using robots.txt?
Technical SEO | | Tedred0
-
Why is my office page not being indexed?
Good morning from 24 degrees C, partly cloudy Wetherby, UK 🙂 This page is not being indexed by Google: http://www.sandersonweatherall.co.uk/office-to-let-leeds/
1st question: I've checked the robots.txt file and found no problems, and I'm in the midst of updating the XML sitemap (it had the old one in place). The page only has one link pointing to it, from http://www.sandersonweatherall.co.uk/Site-Map/. So is the reason it's not being indexed simply a lack of SEO juice from inbound links, and does the remedy lie in routing more inbound links to the offending page?
2nd question: Is the quickest way to diagnose whether a web address is not indexed to paste the URL into the Google search box and, if it doesn't return the page, conclude there's a problem? Thanks in advance, David
Technical SEO | | Nightwing0
-
Pages with different content and meta description marked as duplicate content
I am running into an issue where I have pages with completely different body content and meta descriptions, but they are still being marked as having the same content (Duplicate Page Content error). What am I missing here? Examples:
http://www.wallstreetoasis.com/forums/what-to-expect-in-the-summer-internship
and
http://www.wallstreetoasis.com/blog/something-ventured

http://www.wallstreetoasis.com/forums/im-in-the-long-run
and
http://www.wallstreetoasis.com/image/jhjpeg0
Technical SEO | | WallStreetOasis.com
-
How do https pages affect indexing?
Our site involves e-commerce transactions that we want users to be able to complete via JavaScript popup/overlay boxes. In order to make the credit card form secure, we need the referring page to be secure, so we are considering making the entire site secure, meaning all of our site links would be https. (PayPal works this way.) Do you think this will negatively impact whether Google and other search engines are able to index our pages?
Technical SEO | | seozeelot0