
5 Silly Optimization Mistakes SEOs Must Avoid

Tarun Gupta · December 31st, 2016 · Search Engine Optimization


The growing acceptance of digital marketing has somewhat sidelined traditional search engine optimization practices. Given the results marketers see from paid search, social media and content marketing, many consider it futile to invest effort in keywords, tags and backlinks. The fact remains, however, that SEO can't be cut off completely: it determines a site's overall performance in search.


Marketers can still use on-page and off-page optimization to achieve higher rankings, more website traffic and improved conversions. However, with frequent algorithm refreshes, Google has once again reshaped the landscape of on-page and off-page optimization and signaled how tough it will be on spammers.


It's not easy to stay on top amid the changing realities of search. You need to keep the basics of optimization in sync with evolving search updates. I know many fellow marketers who frequently make silly mistakes and unintentionally skip key points of optimization.


Following are some SEO blunders that even ace professionals commit time and again, and that can destroy an entire digital marketing campaign:


5 SEO Mistakes That Kill Your Campaign

1. You Forget To Tweak .htaccess:

.htaccess is a configuration file that contains directives used to block or unblock access to a website's document directories. It helps webmasters generate cleaner URLs, set up redirects and adjust caching. It is a crucial tool for cleaning up your site's indexing process.


While setting up an .htaccess file, you need to be vigilant and cautious: a single syntax error can drag your site's indexing and rankings to nowhere. If in doubt, get help from an experienced coder who can review and fix the errors in your .htaccess file.
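As a reference point, here is a minimal sketch of the kind of directives involved, assuming an Apache server with mod_rewrite and mod_expires enabled (example.com is a placeholder domain):

    # Redirect the www host to the bare domain with a permanent (301) redirect
    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
    RewriteRule ^(.*)$ https://example.com/$1 [R=301,L]

    # Cache static images in the browser for a month
    <IfModule mod_expires.c>
        ExpiresActive On
        ExpiresByType image/png "access plus 1 month"
    </IfModule>

A 301 redirect like this also consolidates duplicate www/non-www URLs, which is exactly the kind of indexing cleanup the file is good for.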

2. You Discourage Search Engines from Indexing in CMS:

Most SEO plugins and CMS platforms like WordPress and Joomla have a built-in setting that tells search engines not to crawl the website. In WordPress, for example, navigate to Settings → Reading and tick the "Discourage search engines from indexing this site" check box to stop search bots from indexing your site. Review this setting at least once a week to make sure the box hasn't been ticked accidentally.


Bear in mind that this check box is only a request: even with it ticked, search bots may keep indexing your website. If you really want to keep the site out of the index, it's better to change the settings in your .htaccess or robots.txt files.
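For illustration, here is a sketch of the two stronger options. A robots.txt file at the site root can ask all bots not to crawl anything:

    User-agent: *
    Disallow: /

Alternatively, a meta robots tag in a page's <head> tells compliant bots not to index that page:

    <meta name="robots" content="noindex, nofollow">

Note the trade-off: robots.txt blocks crawling, but a blocked URL can still surface in results if other sites link to it, while the noindex tag only works if bots are allowed to crawl the page and see it.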

3. You Set Your Robots.txt File Crawlable:

Many marketers and websites make this mistake time and again, and it is the worst of all. Don't leave your robots.txt file wide open for crawling, because it could lead to serious security and privacy issues. If you are a beginner, learn in depth how to set up and manage robots.txt files.
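As a starting point, here is a hypothetical robots.txt that keeps non-public areas out of crawlers' reach (the /admin/ and /tmp/ paths are placeholders for your own directories):

    User-agent: *
    Disallow: /admin/
    Disallow: /tmp/

    Sitemap: https://example.com/sitemap.xml

One caveat: robots.txt is itself publicly readable, so don't rely on it to hide truly sensitive paths; protect those with authentication or server-level access rules instead.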

4. You Do Not Add “nofollow” Tag Attribute to Outbound Links:

Despite being debated at length, links are still among the most critical factors for SEO and search rankings. You shouldn't pass your own link juice to other sites just for the sake of building backlinks. Using the "nofollow" attribute is a way to keep earning high-quality backlinks without draining the link power of your own site.


Prepare a list of the outbound links on your website and apply the "nofollow" attribute where necessary. Don't go overboard, though: since link exchange is a mutual practice, tagging every link with nofollow may push other SEO professionals to nofollow you as well.
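For reference, the attribute sits on the anchor tag itself (example.com stands in for any outbound destination):

    <!-- A normal outbound link passes link equity to the target -->
    <a href="https://example.com/partner">Partner site</a>

    <!-- rel="nofollow" asks search engines not to pass equity through this link -->
    <a href="https://example.com/partner" rel="nofollow">Partner site</a>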


5. You Don't Check Website Code in a Validator:

A website consists of many thousands of lines of complicated code, and the better that code is, the better the site's visibility will be. A neatly coded website allows search bots to scan and index it more effectively.


So, before starting search engine optimization of a website, use a tool such as the W3C markup validator to check the site's code in depth, list the errors and ask a developer to fix them. Although Google doesn't punish you for invalid HTML/CSS markup, it pays to run a validator once in a while.
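One lightweight way to do this from the command line is to post a saved page to the W3C's Nu Html Checker, which returns machine-readable results. A sketch, assuming curl is installed and the page is saved locally as index.html:

    # Send the local file to the public checker and get findings as JSON
    curl -s -H "Content-Type: text/html; charset=utf-8" \
         --data-binary @index.html \
         "https://validator.w3.org/nu/?out=json"

The same checker also has a web interface at validator.w3.org if you'd rather just paste a URL.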


