I’ve written quite a bit about browser testing solutions, trying to help identify techniques and tools that make cross-browser development easier. My last article on the subject covered how to use BrowserStack to test any number of browsers all from one central tool: your own browser.
I was on a Windows PC back then, so testing multiple browsers was a bit easier and testing tools were mainly complementary to my work. Now that I’m on OS X, the need for tools to round out my testing strategy is even more important, specifically because of the lack of Internet Explorer on the OS.
I’m a bit of a stickler for what I install on my computers and I prefer online tools when available. I’m also always on the hunt for new tools that make cross-browser testing easier and decided to give CrossBrowserTesting.com a run. I’ll go over some of the key features of the service and how to leverage it to improve your testing capabilities.
ZOMG That’s a Lot of Browsers
First, let’s mention that like every reliable service in this space, CrossBrowserTesting.com charges a monthly fee. I’m not surprised at all by this because the bottom line is that they have an infrastructure to support and, well, that costs money. Their fee structure is based on the number of minutes you’d like available to you each month, with a unique twist: if you don’t use all of your minutes, they let you roll a certain number over to the next month.
On to the service itself. There are a few things that are important to me in these types of services:
- Breadth of browser support across major OS versions
- Mobile support (as I’m starting to shift to mobile web)
- Debugging tool support
- Responsiveness of the UI
- Form factor support
- Local system testing support (for example: proxy-based debugging)
All of these matter because they provide the broadest testing surface across multiple devices. But to be honest, without debugging tool support (like Chrome DevTools, the IE F12 Tools, etc.), a service like this would be far less compelling and only marginally better than a screenshot service. And being able to test locally is an obvious must-have, allowing you to test interactively before deploying to staging or production. So these criteria are important to consider.
The first thing I noticed about the service is its amazing breadth of browser and device form factor support. Every major OS is covered (including Ubuntu) and every OS version has a fairly comprehensive list of supported browser versions for testing.
In addition, there’s extensive support for mobile devices and browsers, covering earlier and more modern versions of Android, iOS, BlackBerry Bold and Windows Phone 8. The interesting (and really beneficial) thing is that for specific Android versions, they let you test against competing browsers like Firefox Mobile, Maxthon and Opera.
Testing With the Service
If you’ve used BrowserStack or a similar service, you’ll feel right at home in CrossBrowserTesting.com. The user experience closely matches what I’ve seen before, which made jumping in fairly trivial. You’re initially presented with a dashboard that gives you access to the main features. These include:
- Live browser testing
- Automated screenshot service
- Establishing a local connection
The live browser testing is what I’m most interested in. I need to ensure that the rendering is consistent, so the first thing I did was run a baseline test to see whether a site renders the same in the virtual browser as it does in my local browser. To mimic my local settings, I chose to start the session in Mavericks, running the most recent stable version of Chrome:
One thing to note is that in the OS/browser selection form, you’re only presented with the browser options available for that specific OS version, like this:
I went with GNC’s website because, well, I’m a bit of a fitness buff and they have a lot of interactive points as well, such as JavaScript-based fly-over menus and cycling feature panels. I figured it was a good test to see if the service could handle all of the interaction.
Looking at the two screenshots, you can see that the rendering for Chrome on Mavericks on both systems is exactly the same. This is a good thing, although it’s a bit trippy to see Chrome on Mavericks within Chrome on Mavericks. Inception anyone?
Local machine
Remote virtual browser
Once your session is running, you can change your target OS and browser version at any time by clicking on the Change Configuration button which displays the panel with dropdown choices. Note that changing the OS or browser will reload your session but it sure beats having to spark up multiple virtual machines, especially for cursory reviews of pages.
Getting the baseline UI was great, but a more important test is to see how the site responds to interaction. Let me preface this by saying that I’ve not found a service like this that offers instantaneous response. There will always be a lag because these browsers are virtualized. The key thing is to ensure that normal interaction, like hovering over a menu or operating UI widgets (like a scrolling panel), performs as expected (albeit a little slower). For example, GNC’s site has a dropdown menu system that expands when you hover over a menu option. Notice that hovering over it expands the menu and, equally important, gives me the option to drill down into it.
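As a side note, if the remote session’s lag makes precise hovering fiddly, you can also trigger hover behavior from the browser’s console. This is just a rough sketch of my own: the `.nav-menu` selector is hypothetical, and the `MouseEvent` constructor isn’t available in older IE, where you’d fall back to `document.createEvent`.

```javascript
// Programmatically fire a mouseover on the first top-level menu item
// so a hover-driven dropdown opens without precise mouse positioning.
// The selector is a placeholder; adjust it for the site under test.
var menuItem = document.querySelector('.nav-menu > li');
if (menuItem) {
  menuItem.dispatchEvent(new MouseEvent('mouseover', { bubbles: true }));
}
```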
This interactivity is what makes these services so valuable. The days of having to rely on screenshot services and a ton of VMs to see how your site renders across a ton of browsers are gone.
What About Debugging?
Good question. Browser-based developer tools have really progressed nicely and we depend on them daily. Thankfully, CrossBrowserTesting.com has included the default debugging tools with each browser, giving us access to Chrome DevTools, the IE F12 Developer Tools, and Firefox’s Web Developer Tools, as well as Firebug for older versions of the browser. Notice here that I’ve fired up the IE F12 tools in IE11 on Windows 7.
The tools are completely functional, allowing me to inspect the markup and DOM structure of the page as well as set styles and change text, just as I would on my local PC. You can see here how I’m able to update the inline JavaScript on the site:
What this translates to is the ability to leverage the debuggers to do advanced debugging work like script debugging across any browser and browser version.
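As a quick illustration of the kind of script-level poking around this enables, here’s the sort of snippet you might paste into the remote browser’s console to trace a function before deciding where the real fix belongs. The `toggleMenu` function is hypothetical, not something taken from GNC’s site.

```javascript
// Wrap a (hypothetical) global function so every call is logged,
// which is handy for confirming what fires before setting breakpoints.
if (typeof window.toggleMenu === 'function') {
  var originalToggle = window.toggleMenu;
  window.toggleMenu = function () {
    console.log('toggleMenu called with', arguments);
    return originalToggle.apply(this, arguments);
  };
}
```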
One thing I was concerned about was whether the tools would accurately show page load times via the network traffic monitoring panels, and in my tests they seemed consistent with what I saw locally. This means I can feel confident, to some degree, that the load times will be more or less on par (taking network issues into account, of course).
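If you want a second opinion beyond the network panel, one option (standard browser functionality, not a feature of the service) is to log the Navigation Timing API’s milestones in both the local and remote sessions and compare them:

```javascript
// Navigation Timing is supported in IE9+, Chrome and Firefox, so the
// same snippet works in both the local and the virtualized browser.
// Run it after the load event has fired, otherwise loadEventEnd is 0.
var t = window.performance.timing;
console.log('Time to first byte: ' + (t.responseStart - t.navigationStart) + 'ms');
console.log('DOM content loaded: ' + (t.domContentLoadedEventEnd - t.navigationStart) + 'ms');
console.log('Full page load: ' + (t.loadEventEnd - t.navigationStart) + 'ms');
```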
The one thing that I think would be very hard to measure, though, is page performance via the new suite of performance profilers included in Chrome and Internet Explorer. A lot of that data is directly affected by the characteristics of your computer, especially when rendering is GPU-accelerated. Testing this on virtualized browsers or virtual machines just isn’t real-world, so I wouldn’t recommend it. If you’re an interactive developer (games, for example), it’s best to test on your own devices to get a better understanding of performance.
Testing Different Form Factors
As I begin focusing on mobile more and more, the need to test across multiple mobile OSs and different form factors becomes a high priority. Unfortunately, short of getting a very big inheritance, winning the lotto, or finding a loving sponsor, building a full-featured mobile device lab just isn’t in the cards. And at the pace things are going, things are only going to get tougher as manufacturers continue to push the limits of mobile browsers and device sizes.
CrossBrowserTesting.com offers the ability to test across the major mobile OSs, simulating most of the popular mobile devices like iPads, iPhones, Nexus 7s and such. This is certainly not an all-encompassing list of mobile devices, and I’m assuming it’s meant to cover the most modern OSs and devices available.
The process for testing is exactly the same as what we did for desktop browsers, except the rendering is constrained to the dimensions of the specific mobile device you’ve selected:
Again, the service uses simulators to let you test how your site renders on a mobile device. Keep in mind, though, that while simulators are good, it’s always best to test against a real device if possible.
New devices come out all the time and I wouldn’t expect every form factor to be covered here. I think a nice addition would be to allow users of the service to define the viewport size themselves, rather than only being presented with default screen resolutions. This would also offer more flexibility in testing responsive sites.
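In the meantime, a quick way to confirm exactly what viewport a virtual device is reporting (and therefore which responsive breakpoint is active) is to run something like this in the remote session’s console. The 768px breakpoint is just an example value, not something taken from the service.

```javascript
// Log the viewport the simulated device exposes to your CSS.
console.log('Viewport: ' + window.innerWidth + 'x' + window.innerHeight);
console.log('Device pixel ratio: ' + (window.devicePixelRatio || 1));
// matchMedia is available in IE10+, Chrome, Firefox, Safari and the
// stock mobile browsers, so this works in most of the simulated devices.
console.log('Tablet breakpoint active: ' +
  window.matchMedia('(min-width: 768px)').matches);
```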
Screenshots
Before interactive services like CrossBrowserTesting.com became available, screenshot services were known as one of the quickest ways of seeing how your site rendered across multiple browsers. While they’re kind of passé now, they’re still useful and, interestingly enough, I’m seeing most of these browser testing services spin up screenshot capture as part of their offerings. So it seems this practice is having a bit of a renaissance, most likely driven by the increasing number of browser versions, devices and form factors we need to account for.
Using the service is straightforward and as easy as entering a URL, selecting the browsers you’d like screenshots from, and clicking the Take Screenshots button:
The nice thing about this is that it allows you to choose as many device/OS/browser combinations as you’d like as well as define the resolution on a per-target basis. This generates a series of snapshots that you can review:
Clicking individual screenshots displays a larger image allowing you to get a detailed view of the rendering.
A couple of things to keep in mind: It takes a little while for the screenshots to be captured and rendered. So the more browsers you select, the longer you’ll wait. Unlike other services where you wait your turn in a queue, this wait seems to be simply associated with processing time. You’re paying for the service so I can’t imagine there being a queue like BrowserShots.org. Also bear in mind that some of these screenshots are invariably derived from simulators and as I mentioned before, simulators don’t always render the same as a real browser. Lastly, the screenshot is for a specific page, not the entire site.
Nonetheless, the fact that I can fairly quickly get an idea of how my site is rendering across so many devices helps me to drill-down into specific browser combinations that need special attention.
And that’s where a really neat feature comes in. The service offers the ability to compare layouts side-by-side so you can see rendering differences between different browsers:
As you can see in the screenshot, it goes a step further by also detailing the differences and creating a transparent yellow overlay on each panel to highlight the actual differences. I’m sure you can relate to the frustration many a developer has felt over discovering slight layout differences after the fact. This helps to bring that forward during the testing process. And you can scroll through and compare multiple scenarios by clicking the Prev and Next buttons.
Testing Local Files Remotely
The true value of a service like this is in facilitating your local debugging efforts; only being able to test publicly-available sites offers limited value in terms of your overall testing strategy. CrossBrowserTesting.com provides the ability to test your local files against their remote servers using a Java-based proxy applet or the command line, again leveraging Java to create a proxy. This is similar to other services and is necessary to establish the connection between your local PC and the remote servers, as well as letting you tunnel past any firewalls your company might have. Once the connection is established, you’re able to test both local files accessed directly and pages served via URL from your local web server.
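For the local web server side of that setup, any static server will do. Here’s a minimal Node.js sketch (an illustration of my own, not part of the service’s tooling; the tunnel itself is handled by CrossBrowserTesting.com’s proxy):

```javascript
// Serve files from the current directory at http://localhost:8080 so the
// tunnelled remote browsers can request them like any other URL.
var http = require('http');
var fs = require('fs');
var path = require('path');

http.createServer(function (req, res) {
  var file = path.join(__dirname, req.url === '/' ? 'index.html' : req.url);
  fs.readFile(file, function (err, data) {
    if (err) {
      res.writeHead(404);
      return res.end('Not found');
    }
    res.writeHead(200);
    res.end(data);
  });
}).listen(8080, function () {
  console.log('Serving local files at http://localhost:8080');
});
```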
The team at CrossBrowserTesting.com has created a video that gives a good explanation and demonstration of how this part of the service works.
Closing Thoughts
It’d be truly great if we didn’t need these services. That would mean every browser rendered exactly as expected on every device that supports it. Unfortunately, we still have a fair bit of browser fragmentation, and every browser version tends to have its own quirks to contend with. So services like CrossBrowserTesting.com provide real value in streamlining cross-browser testing.
Overall, I think the service is very good, albeit not without some quirks of its own. I experienced some intermittent lockups in the live testing, which may be attributable to Flash, and in some sessions, seeing a number of browser icons in the OS dock left me scratching my head as to why they were there when I had chosen a specific target browser. These issues didn’t necessarily prevent me from doing what I wanted to do (testing), but it felt like things needed to be tidied up a bit.
The layout comparison feature, though, was pretty hot and something I could see myself using regularly.
What I am seeing is that price could be a big success factor given the breadth of services they’re offering. CrossBrowserTesting.com appears to have set itself at a very competitive price point, incorporating live testing, screenshots and local testing into one fixed monthly cost as opposed to separate pricing for specific services. This is very appealing, especially for price-conscious developers.
The big factor, though, will be how much time you need for testing. From experience, two and a half hours (the amount of time allotted for the Basic plan) seems a little limited, especially when accounting for the latency of rendering. Again, your mileage may vary, but it’s certainly something to consider.