Google Updates, Penalties, and Changes:
Google+ can understand a picture, but Googlebot cannot do the same, so it reads the ALT tag instead. This is why it is important to have an alt tag on your images. Google Images is still an experimental place, and Google is still working on its final shape; perhaps it will arrive soon.
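As a quick sketch, the alt attribute sits directly on the img tag; the filename and description below are hypothetical examples:

```html
<!-- Googlebot cannot "see" the photo, so the alt text describes it -->
<img src="red-running-shoes.jpg"
     alt="Red lightweight running shoes, side view">
```

A short, descriptive phrase works better than keyword stuffing, since the alt text is what Googlebot actually indexes for the image.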
The Penguin update hit only 1% of US websites, but according to Google it will expand to cover all websites, removing problematic results and bringing quality to the web.
Schema does not optimize pages by itself. It simply connects information such as your operating hours, customer ratings, or reviews to Google's search output. For example, if you have an overall rating of 4.5, that score can be highlighted in the Google search result when your page carries schema markup. It also helps show extra snippets just below your listing in Google search.
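As an illustration, here is a minimal sketch of a rating marked up with schema.org's JSON-LD syntax; the product name and review count are made up for the example:

```html
<!-- Hypothetical product page: exposes a 4.5 rating to Google's snippets -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Product",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.5",
    "reviewCount": "120"
  }
}
</script>
```

Note that the markup only describes data already visible on the page; it does not improve the page's ranking on its own.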
Content piracy is a major problem for big content-based sites, and DMCA notices are the only way to handle it. If a website files DMCA notices regularly, Google may take action and start paying attention to it: it will look for the site's pirated content and remove it from the web quickly. Web admins will still have to file a DMCA notice for each case; the change is that Google will respond to pirated-content removal requests faster.
What may happen if we build a site that blocks part of the content, such as ads, the menu bar, or some other element on the page, for mobile users? It can look suspicious to Google, as if someone is hiding content. If users can see the same content on mobile and desktop devices, Googlebot has no problem understanding what you want to show your users.
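One safe pattern, sketched below with a hypothetical menu, is to serve the same HTML to every device and let CSS only change the layout rather than removing content:

```html
<!-- The same nav is sent to every device; nothing is hidden from mobile -->
<nav id="menu">
  <a href="/home">Home</a>
  <a href="/products">Products</a>
</nav>
<style>
  /* On small screens the menu stacks vertically instead of disappearing */
  @media (max-width: 600px) {
    #menu a { display: block; }
  }
</style>
```

Since mobile and desktop users receive identical content, Googlebot sees exactly what visitors see on either device.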
Updating your RSS feed is important, as Google tries to link and compare the sitemap data with the RSS feed. And if Googlebot crawls our site once a day, we need to update our RSS feed once every day.
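To illustrate the link between the two, here is a sketch of a sitemap entry and the matching RSS item updated on the same day; the URL and dates are hypothetical:

```xml
<!-- sitemap.xml entry -->
<url>
  <loc>https://www.example.com/blog/post-1</loc>
  <lastmod>2013-06-10</lastmod>
</url>

<!-- matching RSS <item>, published the same day -->
<item>
  <title>Post 1</title>
  <link>https://www.example.com/blog/post-1</link>
  <pubDate>Mon, 10 Jun 2013 09:00:00 GMT</pubDate>
</item>
```

When the dates in the feed and the sitemap agree, Google can more easily confirm that the page really changed when the sitemap claims it did.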
A high or low crawl rate for any particular page or site does not define its state of optimization; Google is just trying to keep itself up to date. Sometimes it crawls certain sites more because they change regularly, and since Google wants to stay current, it crawls those sites or pages more often. So the crawl rate is not an indicator of how optimized your website is or will be.
If your website is affected by one Panda update, it may not be affected by the next, provided you correct your website in the meantime. Conversely, a website untouched by one Panda version may still be influenced by a later release such as Panda 2.0, 2.1, or 2.2. So keeping our websites aligned with Google's best parameters should be our target: it helps Google understand us better and provide users with the best results.
Panda and Penguin rollouts have no fixed schedule, but they will become regular events in the future to keep Google competitive. As optimizers work out new techniques to manipulate Google's search results, Google will do its part to filter out all impurities as frequently as possible.
We see duplicate content on many websites, especially e-commerce and product sites where different users post the same product with different descriptions, or the same description phrased in a different way. This is a worry for many site owners. So what does Google do in these cases? In Google's view, it knows there is a lot of duplicate content on the web, and it cannot remove, penalize, or even find all of it; the amount is enormous, and it is a technical problem even for Google. So what does Google do when it finds duplicate content? It looks for the most appropriate and best content for the same product among all the duplicate pages, and favors the pages whose content is the most relevant and well optimized for that product.
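One common way for site owners to signal which duplicate should be treated as the main version, not covered in the discussion above but widely used, is a rel=canonical link in the page head; the URL below is hypothetical:

```html
<!-- Placed on a duplicate product page to point Google at the preferred one -->
<link rel="canonical" href="https://www.example.com/products/blue-widget">
```

This does not remove the duplicates, but it tells Google which page you would prefer to appear in the search results.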
Here is a look at how Google sees duplicate content:
To learn more about Google updates, you can join the best digital marketing course in Kolkata.