Monday 25 July 2016

Screaming Frog SEO Spider Update – Version 6.0

We have some very exciting new features ready to release in the latest update, which includes the following –
1) Rendered Crawling (JavaScript):
There were two things we set out to do at the start of the year. Firstly, to see exactly what the search engines are able to crawl and index. This is why we built the Screaming Frog Log File Analyser, as a crawler will only ever be a simulation of search bot behaviour.
Secondly, we wanted to crawl rendered pages and read the DOM. It's been known for a long time that Googlebot acts more like a modern-day browser, rendering content and crawling and indexing JavaScript and dynamically generated content rather well. The SEO Spider is now able to render and crawl web pages in a similar way.
You can choose whether to crawl the static HTML, obey the old AJAX crawling scheme or fully render web pages, meaning executing and crawling JavaScript and dynamic content.
Google deprecated their old AJAX crawling scheme and we have seen JavaScript frameworks such as AngularJS (with links or utilising the HTML5 History API) crawled, indexed and ranking like a typical static HTML site. I highly recommend reading Adam Audette’s Googlebot JavaScript testing from last year if you’re not already familiar.
After much research and testing, we integrated the Chromium project library for our rendering engine to emulate Google as closely as possible. Some of you may remember the excellent ‘Googlebot is Chrome‘ post from Mike King back in 2011 which discusses Googlebot essentially being a headless browser.
The new rendering mode is really powerful, but there are a few things to remember –
  • Typically crawling is slower even though it's still multi-threaded, as the SEO Spider has to wait longer for the content to load and gather all the resources to be able to render a page. Our internal testing suggests Google waits approximately 5 seconds for a page to render, so this is the default AJAX timeout in the SEO Spider. Google may adjust this based upon server response and other signals, so you can configure this to your own requirements if a site is slower to load a page.
  • The crawling experience is quite different, as it can take time for anything to appear in the UI to start with, and then lots of URLs suddenly appear at once. This is because the SEO Spider waits for all the resources to be fetched to render a page before the data is displayed.
  • To be able to render content properly, resources such as JavaScript and CSS should not be blocked from the SEO Spider. You can see URLs blocked by robots.txt (and the corresponding robots.txt disallow line) under ‘Response Codes > Blocked By Robots.txt’. You should also make sure that you crawl JS, CSS and external resources in the SEO Spider configuration.
It's also important to note that the SEO Spider renders content like a browser from your machine, so this can impact analytics and anything else that relies upon JavaScript.
By default the SEO Spider excludes executing Google Analytics JavaScript tags within its engine; however, if a site is using other analytics solutions or JavaScript that shouldn't be executed, remember to use the exclude feature.
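To illustrate the difference between the static HTML and the rendered DOM, here's a minimal Python sketch using headless Chrome with a five-second wait, mirroring the default AJAX timeout described above. The URL is a hypothetical placeholder and this is an illustration of the concept only, not the SEO Spider's own implementation.

# Minimal sketch: compare raw HTML with the rendered DOM via headless Chrome.
# Illustrative only - not how the SEO Spider itself is implemented.
import time
import requests
from selenium import webdriver
from selenium.webdriver.chrome.options import Options

URL = "https://example.com/"  # hypothetical URL

# 1) Static HTML, as a non-rendering crawler would see it.
static_html = requests.get(URL, timeout=10).text

# 2) Rendered DOM: execute JavaScript and wait ~5 seconds (like the default
#    AJAX timeout mentioned above) before reading the page source.
options = Options()
options.add_argument("--headless")
driver = webdriver.Chrome(options=options)
try:
    driver.get(URL)
    time.sleep(5)  # configurable, just as the SEO Spider's timeout is
    rendered_dom = driver.page_source
finally:
    driver.quit()

print(f"Static HTML: {len(static_html)} bytes, rendered DOM: {len(rendered_dom)} bytes")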
2) Configurable Columns & Ordering
You’re now able to configure which columns are displayed in each tab of the SEO Spider (by clicking the ‘+’ in the top window pane).
You can also drag and drop the columns into any order and this will be remembered (even after a restart).
To revert back to the default columns and ordering, simply right click on the ‘+’ symbol and click ‘Reset Columns’ or click on ‘Configuration > User Interface > Reset Columns For All Tables’.
3) XML Sitemap & Sitemap Index Crawling
The SEO Spider already allows crawling of XML sitemaps in list mode by uploading the .xml file (number 8 in the ‘10 features in the SEO Spider you should really know‘ post), which was always a little clunky when the sitemap was already live, as you had to save it first (though handy when it wasn't yet uploaded!).
So we’ve now introduced the ability to enter a sitemap URL to crawl it (‘List Mode > Download Sitemap’).
Previously if a site had multiple sitemaps, you’d have to upload and crawl them separately as well.
Now if you have a sitemap index file to manage multiple sitemaps, you can enter the sitemap index file URL and the SEO Spider will download all sitemaps and subsequent URLs within them!
This should help save plenty of time!
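As a rough illustration of what "download all sitemaps and the URLs within them" involves, here's a short Python sketch. The index URL is hypothetical and this isn't the tool's own code.

# Minimal sketch: download a sitemap index, fetch each child sitemap and
# collect every <loc> URL. Illustrative only.
import requests
import xml.etree.ElementTree as ET

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def locs(xml_text):
    """Return all <loc> values from a sitemap or sitemap index document."""
    root = ET.fromstring(xml_text)
    return [el.text.strip() for el in root.findall(".//sm:loc", NS)]

def crawl_sitemap_index(index_url):
    index_xml = requests.get(index_url, timeout=10).text
    page_urls = []
    for sitemap_url in locs(index_xml):       # each child sitemap in the index
        sitemap_xml = requests.get(sitemap_url, timeout=10).text
        page_urls.extend(locs(sitemap_xml))   # URLs listed in that sitemap
    return page_urls

# Hypothetical usage:
# urls = crawl_sitemap_index("https://example.com/sitemap_index.xml")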
4) Improved Custom Extraction – Multiple Values & Functions
We listened to feedback that users often wanted to extract multiple values without having to use multiple extractors. For example, previously, to collect 10 values you'd need to use 10 extractors and index selectors ([1], [2] etc.) with XPath.
We've changed this behaviour, so by default a single extractor will collect all values found and report them together, for XPath, CSS Path and Regex. If you have 20 hreflang values, you can use a single extractor to collect them all and the SEO Spider will dynamically add additional columns for however many are required. You'll still have 9 extractors left to play with as well. So a single XPath, for example one targeting every hreflang attribute, will now collect all values discovered.
You can still choose to extract just the first instance by using an index selector as well. For example, if you just wanted to collect the first h3 on a page, you could add a [1] index selector to the h3 XPath.
Functions can also be used anywhere in XPath, but you can now use one on its own as well via the ‘function value’ dropdown. So if you wanted to count the number of links on a page, you could use the count() function.
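The exact expressions in the original post were shown as screenshots, so the XPaths below are hypothetical equivalents rather than the originals. This small lxml sketch simply demonstrates the three patterns: collect all matches, take the first instance with [1], and use a function on its own.

# Illustrative XPath patterns only - the original post's exact expressions
# were screenshots, so these are hypothetical equivalents.
from lxml import html

doc = html.fromstring("""
<html><head>
  <link rel="alternate" hreflang="en" href="/en/">
  <link rel="alternate" hreflang="de" href="/de/">
</head><body>
  <h3>First heading</h3><h3>Second heading</h3>
  <a href="/a">A</a><a href="/b">B</a><a href="/c">C</a>
</body></html>""")

# 1) Collect every value - a single extractor now returns them all.
all_hreflang = doc.xpath("//*[@hreflang]/@hreflang")   # ['en', 'de']

# 2) Index selector - just the first instance.
first_h3 = doc.xpath("(//h3)[1]/text()")               # ['First heading']

# 3) A function on its own ('function value') - count the links on the page.
link_count = doc.xpath("count(//a)")                   # 3.0

print(all_hreflang, first_h3, link_count)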
I’d recommend reading our updated guide to web scraping for more information.
5) rel=“next” and rel=“prev” Elements Now Crawled
The SEO Spider can now crawl rel=“next” and rel=“prev” elements whereas previously the tool merely reported them. Now if a URL has not already been discovered, the URL will be added to the queue and the URLs will be crawled if the configuration is enabled (‘Configuration > Spider > Basic Tab > Crawl Next/Prev’).
rel=“next” and rel=“prev” elements are not counted as ‘Inlinks’ (in the lower window tab) as they are not links in a traditional sense. Hence, if a URL does not have any ‘Inlinks’ in the crawl, it might well have been discovered via a rel=“next”/rel=“prev” element or a canonical. We recommend using the ‘Crawl Path Report‘ to show how the page was discovered, which will show the full path.
There’s also a new ‘respect next/prev’ configuration option (under ‘Configuration > Spider > Advanced tab’) which will hide any URLs with a ‘prev’ element, so they are not considered as duplicates of the first page in the series.
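For context, rel="next" and rel="prev" elements live in the page head, so "crawling" them simply means extracting their href values and queuing any undiscovered URLs. Here's a minimal sketch of that discovery step, using assumed markup rather than the SEO Spider's own code.

# Minimal sketch of discovering pagination URLs from rel="next"/"prev"
# elements in the <head>. Illustrative only.
from urllib.parse import urljoin
from lxml import html

def pagination_urls(page_url, page_html):
    doc = html.fromstring(page_html)
    hrefs = doc.xpath("//head/link[@rel='next' or @rel='prev']/@href")
    return [urljoin(page_url, href) for href in hrefs]

# Hypothetical example:
sample = '<html><head><link rel="next" href="/blog/page/2/"></head><body></body></html>'
print(pagination_urls("https://example.com/blog/", sample))
# ['https://example.com/blog/page/2/']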
6) Updated SERP Snippet Emulator
Earlier this year, in May, Google increased the column width of the organic SERPs from 512px to 600px on desktop, which means title and description snippets are longer. Google displays and truncates SERP snippets based on the pixel width of the characters rather than the number of characters, which can make them challenging to optimise.
Our previous research showed Google used to truncate page titles at around 482px on desktop. With the change, we have updated our research and the logic in the SERP snippet emulator to match Google's new truncation point before an ellipsis (…), which for page titles on desktop is around 570px.
Our research shows that while the space for descriptions has also increased, they are still being truncated far earlier, at a similar point to the older 512px-width SERP. The SERP snippet emulator will only bold keywords within the snippet description, not in the title, in the same way as the Google SERPs.
Please note – you may occasionally see our SERP snippet emulator be a word out in either direction compared to what you see in the Google SERP. There will always be some pixel differences, which means the pixel boundary might not be in the exact same spot that Google calculates 100% of the time.
We are still seeing Google play by different rules at times as well, where some snippets have a longer pixel cut-off point, particularly for descriptions! The SERP snippet emulator is therefore not always exact, but a good rule of thumb.
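As a rough illustration of pixel-based truncation, here's a small Python sketch. The per-character widths and the 570px limit are approximate assumptions for illustration only, not Google's or the SEO Spider's actual metrics.

# Rough sketch of pixel-width truncation. The character widths and the
# 570px limit are approximations for illustration, not Google's metrics.
APPROX_WIDTHS = {"i": 5, "l": 5, "j": 5, "f": 6, "t": 6, "r": 7, " ": 5,
                 "m": 15, "w": 14, "M": 16, "W": 18}
DEFAULT_WIDTH = 10          # assumed average width for other characters
TITLE_LIMIT_PX = 570        # desktop title truncation point discussed above

def pixel_width(text):
    return sum(APPROX_WIDTHS.get(ch, DEFAULT_WIDTH) for ch in text)

def truncate_title(title, limit=TITLE_LIMIT_PX):
    """Trim the title just before it crosses the pixel limit, adding '...'."""
    if pixel_width(title) <= limit:
        return title
    out = ""
    for ch in title:
        if pixel_width(out + ch + "...") > limit:
            break
        out += ch
    return out.rstrip() + "..."

print(truncate_title("A very long page title " * 5))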
Other Updates
We have also included some other smaller updates and bug fixes in version 6.0 of the Screaming Frog SEO Spider, which include the following –
  • A new ‘Text Ratio’ column has been introduced in the ‘Internal’ tab, which calculates the text to HTML ratio (a rough sketch of one way to compute this follows the list below).
  • Google updated their Search Analytics API, so the SEO Spider can now retrieve more than 5k rows of data from Search Console.
  • There’s a new ‘search query filter’ for Search Console, which allows users to include or exclude keywords (under ‘Configuration > API Access > Google Search Console > Dimension tab’). This should be useful for excluding brand queries for example.
  • There’s a new configuration to extract images from the IMG srcset attribute under ‘Configuration > Advanced’.
  • The new Googlebot smartphone user-agent has been included.
  • Updated our support for relative base tags.
  • Removed the blank line at the start of Excel exports.
  • Fixed a bug with word count which could make it less accurate.
  • Fixed a bug with GSC CTR numbers.
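As referenced in the ‘Text Ratio’ item above, here's a quick sketch of one way a text-to-HTML ratio might be computed. This is a hypothetical illustration, not necessarily the SEO Spider's exact formula.

# Hypothetical text-to-HTML ratio: visible text length divided by the total
# HTML length. Not necessarily the exact formula the SEO Spider uses.
from lxml import html

def text_ratio(raw_html):
    doc = html.fromstring(raw_html)
    # Drop script/style content so only visible text is counted.
    for el in doc.xpath("//script | //style"):
        el.getparent().remove(el)
    visible = "".join(doc.text_content().split())   # ignore whitespace
    return len(visible) / max(len(raw_html), 1)

print(f"{text_ratio('<html><body><p>Hello world</p></body></html>'):.1%}")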
Thanks to everyone for all the feedback and suggestions.

Monday 4 July 2016

Google Algorithm Update - 2016 (RankBrain)

In October 2015, Google launched what is known to be the first use of artificial intelligence in its ranking algorithm. While it has been more than half a year since its release, very little information about it has been made official by Google, and a lot of people were left in the dark. What is it actually? What kind of signals does RankBrain track in order to rank websites?
Google is using a machine learning technology called RankBrain to help deliver its search results. Here's what we know about it.

Wondering how this machine-learning artificial intelligence system works and fits in with Google's overall ranking system? Here's what we know about RankBrain.

The information covered below comes from three different sources and has been updated over time, with notes where updates have happened. Here are those sources:

First is the Bloomberg story that broke the news about RankBrain (see also our review of it). Second, additional information that Google has now provided directly to Search Engine Land. Third, our own knowledge and best assumptions in places where Google isn't providing answers. We'll make clear where these sources are used, when deemed necessary, apart from general background information.
What is RankBrain?
RankBrain is Google’s name for a machine-learning artificial intelligence system that’s used to help process its search results, as was reported by Bloomberg and also confirmed to us by Google.
What is machine learning?
Machine learning is where a computer teaches itself how to do something, rather than being taught by humans or following detailed programming.
What is artificial intelligence?
True artificial intelligence, or AI for short, is where a computer can be as smart as a human being, at least in the sense of acquiring knowledge both from being taught and from building on what it knows and making new connections.
True AI exists only in science fiction novels, of course. In practice, AI is used to refer to computer systems that are designed to learn and make connections.
How’s AI different from machine learning? In terms of RankBrain, it seems to us they’re fairly synonymous. You may hear them both used interchangeably, or you may hear machine learning used to describe the type of artificial intelligence approach being employed.
So RankBrain is the new way Google ranks search results?
No. RankBrain is part of Google’s overall search “algorithm,” a computer program that’s used to sort through the billions of pages it knows about and find the ones deemed most relevant for particular queries.
What’s the name of Google’s search algorithm?


It’s called Hummingbird, as we reported in the past. For years, the overall algorithm didn’t have a formal name. But in the middle of 2013, Google overhauled that algorithm and gave it a name, Hummingbird.
So RankBrain is part of Google’s Hummingbird search algorithm?
That's our understanding. Hummingbird is the overall search algorithm, just like a car has an overall engine in it. The engine itself may be made up of various parts, such as an oil filter, a fuel pump, a radiator and so on. In the same way, Hummingbird encompasses various parts, with RankBrain being one of the newest.

In particular, we know RankBrain is part of the overall Hummingbird algorithm because the Bloomberg article makes clear that RankBrain doesn't handle all searches, as only the overall algorithm would.

Hummingbird also contains other parts with names familiar to those in the SEO space, such as Panda, Penguin and Payday, designed to fight spam; Pigeon, designed to improve local results; Top Heavy, designed to demote ad-heavy pages; Mobile Friendly, designed to reward mobile-friendly pages; and Pirate, designed to fight copyright infringement.

I thought the Google algorithm was called “PageRank”
PageRank is part of the overall Hummingbird algorithm that covers a specific way of giving pages credit based on the links from other pages pointing at them.

PageRank is special because it’s the first name that Google ever gave to one of the parts of its ranking algorithm, way back at the time the search engine began, in 1998.

What about these “signals” that Google uses for ranking?
Signals are things Google uses to help determine how to rank web pages. For example, it will read the words on a web page, so words are a signal. If some words are in bold, that might be another signal that's noted. The calculations used as part of PageRank give a page a PageRank score that's used as a signal. If a page is noted as being mobile-friendly, that's another signal that's registered.
All of these signals get processed by various parts within the Hummingbird algorithm to figure out which pages Google shows in response to various searches.
How will RankBrain Affect Us?
For us marketers and SEOs, the answer is, surprisingly, not much. Think of it as an optimization effort by Google to improve its processing.
That being said, there are still quite a few things that RankBrain brings to the table:
  • Increased tolerance for long-tail keywords – Previously, users needed to “play around” with the keywords they used in searches in order to get the right result. By processing long, complex searches, RankBrain allows users to type in queries in a more natural way.
  • Supporting platform for voice search – Speaking of queries phrased in a natural way, we can't avoid talking about voice search. Voice search technologies like Siri, Google Now and Cortana are seeing more and more usage. Intentionally or not, RankBrain plays a vital role in helping voice search assistants return more accurate results from voice searches.
  • Less emphasis on long-tail keywords – Here's one thing that makes things easier for marketers. Now that we know RankBrain handles long queries and breaks them down into more commonly used terms (see the toy sketch after this list), we no longer have to put as much emphasis on optimizing for long-tail keywords.
  • More flexibility in keyword usage – This also opens up more room for flexibility in keyword usage. Instead of spamming the same exact-match keyword, we are free to use more LSI keywords, synonyms and alternate terms to improve readability.
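As a purely conceptual illustration of "breaking a long query down into more commonly used terms", here's a toy Python sketch that matches a long-tail query to the closest known query by word overlap. This is a heavy simplification for illustration only, and emphatically not Google's actual method.

# Toy illustration of relating a long-tail query to a more common one by
# word overlap. Purely conceptual - this is NOT how RankBrain actually works.
def similarity(query_a, query_b):
    a, b = set(query_a.lower().split()), set(query_b.lower().split())
    return len(a & b) / len(a | b)          # Jaccard overlap of the words

known_queries = [
    "best running shoes",
    "cheap flights to london",
    "how to bake bread",
]

long_tail = "what are the best shoes for running a marathon as a beginner"
best_match = max(known_queries, key=lambda q: similarity(long_tail, q))
print(best_match)   # 'best running shoes'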
Will RankBrain Kill SEO?
As usual, whenever Google rolls out a new algorithm, people inevitably start clamouring that SEO is dead yet again. It doesn't help that this time around it's about an AI system supposedly hell-bent on taking over the world.
The answer, of course, is that it won't. RankBrain might change a little how we do keyword research and implement keywords in our content, but that's mostly it.
In other words, SEO will become even more sophisticated, and we will have to rely a lot more on quality rather than purely technical SEO.
Addressing things like the semantic web, LSI keywords and content readability becomes more important than ever. Competition will become even harsher, as basic SEO just won't cut it anymore.
The Future of RankBrain
Furthermore, not surprisingly, nothing is certain, particularly in fast-moving industries like SEO. For now, RankBrain's application is, to put it in simple terms, fairly limited.

Later on, it's possible that RankBrain will be used in a much broader application than simply interpreting search queries.

Should the AI become advanced enough to truly understand all content on the web, it would be a huge breakthrough for the industry, so huge it might even make more technical elements like tags and metadata obsolete.

To wrap everything up, I'd like to make it clear that the vast majority of the information in this post comes from examining Google's official announcements, a patent related to RankBrain, and what various experts have thought about it.

Saturday 2 July 2016

Web Analytics Tools for Web Design

Web analytics tools can be used to examine various aspects of your website, in order to gather data for market research and to focus you on particular areas of the site.
These tools may show you anything from traffic to age demographics, and they are invaluable in any admin's arsenal, particularly since there are so many alternative ways these days to measure your site statistics beyond the conventional traffic numbers.
The sites listed in this article include both free and paid-for options, and offer a variety of different tools, some standard, some proprietary.

Google Analytics

An obvious choice, and widely used, Google Analytics is particularly valuable when used in conjunction with the whole suite of Google software. It's an appealing option, especially if your site uses WordPress as its CMS, as Google offers a plugin so you can see your Google Analytics stats directly on your dashboard. As mentioned before, other Google products, particularly AdSense, work seamlessly with Analytics, being integrated into the interface.
As far as criticism goes for this tool, it doesn't offer the trademark simplicity of Google, instead burying itself in several levels of confusingly arranged menus, despite a recent update. This is typical of Google's more niche software, however, and it is perfectly usable for the majority of people. Other than that, the interface is reasonably easy to use, and once you've found the specific stats you were looking for, they're laid out in an easy-to-read way. The export functionality is a great feature of Google Analytics, allowing you to analyse and edit your data in a number of different formats, as well as storing it offline. You can also opt in to having your Analytics reports emailed to you at intervals of your choice, which could be hugely useful.

Alexa

Alexa offers free tools on its site and boasts a capable search function, guessing, as Google does, your search before you've finished typing it as it queries its database of sites. It will then rank your results by traffic and keywords. Alexa's most notable feature is its "reputation" score, which is simply the number of sites that link to yours, demonstrating your site's influence across the web. It also ranks your site against all the others listed in Alexa; interestingly, WDD fared somewhat better on Alexa than on Woorank, but this is because Woorank aggregates its score from a variety of third-party sites, including Alexa. Alongside traffic stats, it tells you the top search queries that have led to your site, which will help with SEO analysis, and it also shows you how to improve your SEO by revealing powerful keywords used by competitors.
On the demographic side of things, Alexa will tell you your visitors' age, education level, browsing location, gender and even whether they have children or not. While these figures won't be 100% accurate, they will give a broad understanding of your audience. As a free set of tools, Alexa is close to ideal given the amount of data it provides, including showing you "upstream" and "downstream" sites, where users went immediately before and after your site. The interface needs some work, however, and may prove somewhat convoluted for the average user; it also doesn't offer the security and peace of mind in the reliability of statistics that paid sites can. There are, however, the Alexa Pro packages, which users can purchase if they wish to take their analysis further.

Piwik

Piwik isn't browser-based, and it isn't run by a large for-profit company like the others; instead, it's open-source software. The interface is fairly nice, however the inconvenience of downloading it, along with the huge amount of data clogging it up, offsets this slightly. It uses a dashboard interface, with editable tiles of information, which can be dragged around to your liking. As far as how much data is provided, Piwik feels like an aggregation of all the other sites mentioned above, offering far more stats than you'll ever need, for free!

Optimizely

Optimizely is an A/B tester, and for those of you who don’t know about A/B testers, they work as follows: You have two designs of a website: A and B. Typically, A is the existing design (called the control), and B is the new design. You split your website traffic between these two versions and measure their performance using metrics that you care about (conversion rate, sales, bounce rate, etc.). In the end, you select the version that performs best.
Most A/B testers are difficult to implement and have a back-end interface, which is why Optimizely stands out: it has an easy-to-use interface that makes it simple to edit your B page, with redirects to it coming from one line of JavaScript that links your site to the third-party Optimizely editor. It is compatible with third-party analytics software you may already be using, such as Google Analytics or Adobe Omniture SiteCatalyst, and the script redirect doesn't (noticeably) affect load time. The higher, more expensive plans also support multivariate testing.
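To make the A/B mechanics described above concrete, here's a small, self-contained Python sketch: visitors are split randomly between two variants and their conversion rates are compared. The traffic numbers and conversion rates are invented for illustration; this is not Optimizely's implementation.

# Toy A/B test: split visitors randomly between variant A (control) and
# variant B, then compare conversion rates. Illustrative only.
import random

random.seed(42)

# Assumed "true" conversion rates, just to simulate visitor behaviour.
TRUE_RATES = {"A": 0.10, "B": 0.12}

counts = {"A": {"visitors": 0, "conversions": 0},
          "B": {"visitors": 0, "conversions": 0}}

for _ in range(10_000):
    variant = random.choice(["A", "B"])          # 50/50 traffic split
    counts[variant]["visitors"] += 1
    if random.random() < TRUE_RATES[variant]:    # did the visitor convert?
        counts[variant]["conversions"] += 1

for variant, c in counts.items():
    rate = c["conversions"] / c["visitors"]
    print(f"Variant {variant}: {c['visitors']} visitors, conversion rate {rate:.2%}")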

