Why Do Sites Drop in Google?
Google’s algorithms aim to help people find high-quality sites by reducing the rankings of low-quality content.
But rankings can fluctuate for multiple reasons. Sometimes the drop is due to human error and is entirely preventable.
A quick list of reasons sites drop:
- Tracking the wrong keywords
- Recent Google algorithm update
- New website
- Low quality content
- Low quality links
- Losing high quality links
- Cheap hosting
- Google penalty
- Competitors trying harder
- Business moved locations
Below are more in-depth explanations of what you can do to improve your ranking.
Check Your Content
Content is still king. While other elements of SEO are important to reaching the top of the search engines, high-quality content remains at the top of the list.
The effects of low-quality link schemes are silent but deadly. Unsuspecting business owners can be subject to these penalties.
Unethical SEO companies build poor backlinks on spammy third-party sites to generate short-term results for clients. Kind of like steroids for websites.
But be aware: Google’s webspam team may take manual action on unnatural links.
When a manual action is taken, Google will notify you through Webmaster Tools. If you have a warning for unnatural links to your site, it’s best to remove these links and request a reconsideration.
You can fix unnatural links by making sure they don’t pass PageRank. To do this, add a rel="nofollow" attribute or remove the links entirely.
After fixing or removing the unnatural links to your site, let Google know by submitting a reconsideration request in Search Console.
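As a quick sketch of the fix above, this is what a nofollowed link looks like in HTML (the URL and anchor text here are placeholders, not from a real site):

```html
<!-- rel="nofollow" tells search engines not to pass PageRank through this link -->
<!-- example.com is a placeholder domain -->
<a href="https://example.com/some-page" rel="nofollow">Example anchor text</a>
```

If you control the linking page, adding rel="nofollow" or deleting the anchor entirely both stop the link from passing PageRank.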
General SEO Issues
If you’re handling the SEO for your site, investigating these key areas is going to help you kick major butt in fixing your SEO issues.
- Lack of communication: Before you panic, check with your marketing department, IT department, or in-house webmaster. A simple SEO fix might be one email away.
- Site crawl errors: Website errors can prevent your site from showing up in search engines. Use Google Webmaster Tools to find your site crawl errors.
- Search engine algorithm changes: Learning about algorithms can help pinpoint recent fluctuations in your site’s ranking. Moz has a great historical record of Google updates.
- Recent site updates: If your site was recently updated by a team member, it’s possible they inadvertently triggered an SEO failure.
- Recent CMS updates: WordPress sites regularly undergo system-wide software updates to fix bugs and security exploits. If your site has custom features, it’s possible a recent update triggered an SEO mistake.
- Domain name expired: It’s possible your credit card expired and your domain registrar could not charge your card to renew your domain.
Technical SEO Issues
After examining your big-picture SEO health, it’s time to find and fix the problem. A large majority of SEO problems are due to human error.
- Rel=canonicals: The rel=canonical tag functions as a 301 redirect for search engines without physically redirecting readers.
- Nofollow links: If used in the wrong places, nofollow links can cause havoc by not passing link juice to deeper pages.
- Meta Robots: Excludes individual pages from search results.
- Robots.txt: A text file created by webmasters to tell search engine robots how to crawl and index your site.
- Page titles: Verify their visibility in search results and in your source code.
- H1 tags: Verify your H1 tags contain relevant keywords to describe your pages.
- Forgetting image tags: Ensure every image on your website can be searched and indexed by search spiders by including an ALT tag on each image.
- AJAX sites: Sometimes they can serve search engines the wrong content. This is common with website builders like Wix.
- Hreflang: Verify your language tags are serving the correct pages.
- Navigation links: Verify that you have not recently changed your site-wide navigation. A lack of links to deeper pages can hamper their placement in search results.
- Internal links: Same concept as above, but related to links within the content of a page.
- Meta Descriptions: Used to describe the content of a page to visitors in search results. Verify their visibility in search results and in your source code.
- Shared Hosting: Sometimes servers go down and neighbors commit crimes. Check with your hosting provider and make sure your site does not share the same IP with shady neighbors.
- Https / SSL: Duplicate content issues can arise if you are serving both secured and unsecured sites to search engines.
- Failed plugins: When not consistently maintained, plugins can have a cascading effect on the rest of your website.
- Site Speed: Google uses site speed as an SEO ranking factor—not a huge one, but definitely an important one.
- WordPress reading settings: Visit the reading settings in your WordPress dashboard. Make sure you have unchecked the Search Engine Visibility box labeled “Discourage search engines from indexing this site.”
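To make several of the checklist items above concrete, here is a rough sketch of what those tags look like inside a page’s head section. All URLs, titles, and language codes below are made-up placeholders:

```html
<head>
  <!-- Page title: shown as the headline in search results -->
  <title>Blue Widgets | Example Store</title>

  <!-- Meta description: the snippet shown under the title in results -->
  <meta name="description" content="Shop our selection of blue widgets with free shipping.">

  <!-- rel=canonical: points search engines at the preferred URL for this content -->
  <link rel="canonical" href="https://example.com/blue-widgets">

  <!-- Hreflang: serves the correct language version of the page to each audience -->
  <link rel="alternate" hreflang="en" href="https://example.com/blue-widgets">
  <link rel="alternate" hreflang="es" href="https://example.com/es/blue-widgets">

  <!-- Meta robots: use noindex ONLY on pages you want excluded from search results -->
  <!-- <meta name="robots" content="noindex, follow"> -->
</head>
```

The meta robots line is commented out on purpose: a noindex tag on a page you want ranked is one of the most common human-error causes of a ranking drop.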