Previously, we wrote an article on the steps to diagnose a decrease in visitor count, which eventually helped many webmasters. The reason for writing this one is that many webmasters keep asking why there is a dramatic fall in traffic, practically within a 3-4 day period, when for most of them the traffic source is Google organic search (above 90% of the traffic). So here is another article, which focuses mainly on loss of traffic from Google search results.
The host server goes offline unpredictably
Do you know that our traffic once dropped to half within 5 days? We use Pingdom to report when the server goes offline or becomes unavailable, but it could not detect the outage. We found from Google Webmaster Tools that the Google spider had found the server unavailable and simply went away. This is a very bad thing indeed. Google cannot rely on a website that runs on an unstable server.
Choose a quality hosting service. If you cannot afford a dedicated host but the traffic load is high, think about using your own computer as a server over a high-speed dedicated cable Internet connection.
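If you want a second, independent record of downtime alongside a service like Pingdom, a minimal sketch is below. It assumes Python 3 with the requests library; the URL and check interval are placeholders, not recommendations.

```python
# Minimal uptime logger (sketch): prints a timestamp and the HTTP status of
# your site at a fixed interval, so you have your own log of outages.
import time
import datetime
import requests

SITE_URL = "https://example.com/"   # placeholder: put your own site here
CHECK_INTERVAL = 300                # seconds between checks (placeholder)

def check_once():
    """Return the HTTP status code, or a short DOWN marker on failure."""
    try:
        response = requests.get(SITE_URL, timeout=10)
        return response.status_code
    except requests.RequestException as error:
        return f"DOWN ({error.__class__.__name__})"

if __name__ == "__main__":
    while True:
        stamp = datetime.datetime.now().isoformat(timespec="seconds")
        print(f"{stamp}  {SITE_URL}  ->  {check_once()}")
        time.sleep(CHECK_INTERVAL)
```

Run it from a machine outside your hosting network, otherwise it may miss exactly the outages that Google sees.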
---

We suffered once too
Duplicate content issue
We wrote earlier about how to prevent Google Webmaster Tools from showing the duplicate content issue; you might be interested in reading that. Feeds can create a duplicate content issue too.
Changes to the site
Some changes can affect traffic: moving to another server, changing the site structure, changing or merging tags, errors in robots.txt, and changes in site loading speed. After changing the theme, our page load time dropped to half of what it was before, yet we still lost traffic for a few days. It is paradoxical, but the traffic comes back again.
Black hat SEO
Hidden links, hidden text, or redirects that send users to malicious scripts on the page.
Modified search engine algorithms
Search engines are constantly improving, adding to and changing their algorithms. Sometimes such changes are hardly noticeable, and sometimes they can throw many webmasters into shock and turn everything into either a plus or a minus for you.
The site is not updated for a long period
A blog means fresh content, every day.
Link trading
Well, you yourself, as a nice person, may not do this. But your competitors can insert your website’s name into free link exchange websites. It happens, though rarely; people are getting more inhuman day by day. Search for your domain name in Google and ask questions in the Google Webmaster Forum.
If we lost 80% of our traffic, we would straightforwardly ask there; we would need the answer then.
Hacking?
Quite common. A tag might appear against your site in search results saying “This site could threaten the security of your computer”. Some of your top pages can also be redirected using .htaccess; a quick way to check for that is sketched below.
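Hacked .htaccess files often redirect only visitors who arrive from Google, so a page can look perfectly normal when you open it directly. The rough diagnostic sketch below compares where a top page ends up with and without a Google referer header; it assumes Python 3 with the requests library, and the page URL is a placeholder. It only catches server-side redirects, not JavaScript ones.

```python
# Rough check for referer-based redirects (a common .htaccess hack symptom):
# fetch the same page twice and compare the final URLs after redirects.
import requests

PAGE_URL = "https://example.com/popular-post/"   # placeholder: one of your top pages

def final_url(headers):
    """Follow redirects and return the URL the request finally lands on."""
    response = requests.get(PAGE_URL, headers=headers, timeout=10,
                            allow_redirects=True)
    return response.url

plain = final_url({"User-Agent": "Mozilla/5.0"})
from_google = final_url({"User-Agent": "Mozilla/5.0",
                         "Referer": "https://www.google.com/"})

if plain != from_google:
    print("Suspicious: visitors coming from Google land on", from_google)
else:
    print("No referer-based redirect detected for", PAGE_URL)
```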
Do not use pirated themes or free themes for serious websites or blogs.
Seasonal decline in traffic
It is peculiar that the traffic curve follows a sine-wave-like pattern (only the positive half). People surf less during the summer; surfing and the number of pageviews depend on the season too.
New competitor?
When this website launched, the traffic it now gets (4500 visits/day) was obviously shared by other websites, maybe by 20 websites, for example. So each of them loses around 225 visits/day on average.
New advertisement units?
Do not use pop-ups, click-unders, and similar stuff.
Conclusion
Wait, wait, and wait. Wait for at least 20 days; just check that everything is right and increase your post frequency. Yes, the opinion of a professional service might be needed in extreme cases. You can also ask us for free first-step advice via email through our contact page.
Please do not panic, and sleep well. Good times and bad times are part of life. Keep your nerves strong; we do care for you. Just write to us rather than searching here and there.

Clearly explained. But I have a doubt: does robots.txt affect traffic? If so, how do I go back to the default robots.txt? Is simply deleting robots.txt enough to make the website use no robots.txt at all?
The presence of a physical robots.txt is an important part of on-site SEO. You can check Google’s own robots.txt at http://www.google.com/robots.txt
By default, WordPress has no physical robots.txt but serves a virtual robots.txt. Deleting a physically created robots.txt will automatically bring the virtual robots.txt back, unless the WP script has been manipulated.
If there is no robots.txt, all kinds of well-behaved search engine bots might index not-so-important files or URLs ahead of the desired ones (posts, pages, and uploaded images).
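For reference, a minimal robots.txt for a typical WordPress site could look something like the sketch below; the exact Disallow lines and the sitemap URL are illustrative assumptions, not a universal recommendation (blocking /feed/ also ties in with the duplicate content point above).

```
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /feed/
Allow: /wp-content/uploads/

Sitemap: https://example.com/sitemap.xml
```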
If your website gets good traffic, you will notice in Webmaster Tools that Googlebot actually fetches robots.txt very frequently – for us it is around every 2 hours.
I changed my theme from Gazpomag to Eri by Elmastudio, and since then I have been getting seriously low traffic!