Friday, November 30, 2012

Latest User Forum Posts

Beginning in mid-2010 we have directed all technical support questions to the eValid User Forum. We have learned that when one user has an issue, all users can profit from the answer.

Here is an additional selection of some of the posts that we think would be of general interest.

New User Forum Format Introduced

Our eValid User Forum has a new look and feel. More compact, more efficient, prettier.

One of the new things to look for is a new CAPTCHA in the user login section, which is one of the reasons for the upgrade: believe it or not, the image-based CAPTCHA used before had been hacked...we were seeing several hundred completely bogus logins per day. So it's a double win: a prettier new format, and a new, more effective question/answer based CAPTCHA.

FYI, the current statistics on the eValid User Forum show: 3240+ posts, 1560+ topics, 14,000+ members.

Wednesday, November 28, 2012

Testing Mobile Device Web Applications

According to a recent trade press article, by 2016 there'll be more mobile devices than people. Really! More than one per person, worldwide, the article said.

The uses to which these devices are put -- telecommunications (making phone calls), texting, social networking, shopping, photography, book reading, etc. -- often involve very significant web interaction. Web "apps" -- which run natively on mobile devices -- also often involve a great deal of website interaction, even though they run natively on the device.

Our primary interest is in the quality control testing and analysis of applications that run on mobile devices using some form of browser to interact with a website. This is a fairly broad area, and involves tens of thousands of applications ranging from bus-system arrival time updates to online shopping. Given this focus on quality, what are the components of web application behavior that are the most important? Here's a starting list -- what most users would consider important -- the key performance indicators:
  • Appearance: How does the web application appear to the user on this device? Are all the essential items of information present?
  • [Relative] Performance: How fast is the application -- when delivered to the device? Does the speed vary with the device? How much?
  • Availability: What is the uptime percentage? How often is the application served incorrectly, or not at all?
  • Data Volume: How much data does the server deliver to this device? Does the amount of data vary, device to device? What are the biggest items being sent over the internet to the device? How essential are they?
  • Response Validity: Is the information presented to the user correct? Are the answers the ones you expect? Is anything left out that should be there but isn't?
No single test solution can address 100% of these factors -- that's beyond the state of the art. But we believe that the eValid test engine can cover a majority of these concerns by providing quantitative measurements of key quality factors. It can do this because it is able to imitate devices realistically, and to make detailed measurements "as if" the web pages were being sent to the actual device, even though the device is only imitated.
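
To make the idea of quantitative measurement concrete, here is a minimal sketch -- in Python, using the third-party requests library rather than eValid itself, and with a placeholder URL and expected text -- of how four of these indicators (performance, availability, data volume, and response validity) might be measured for a single page:

    import time
    import requests

    URL = "https://www.example.com/"       # placeholder page under test
    EXPECTED_TEXT = "Example Domain"       # placeholder string for the validity check

    start = time.time()
    try:
        response = requests.get(URL, timeout=30)
        elapsed = time.time() - start                     # [relative] performance
        available = (response.status_code == 200)         # availability
        volume = len(response.content)                     # data volume, in bytes
        valid = EXPECTED_TEXT in response.text             # response validity
        print("time=%.2fs status=%d bytes=%d expected-text-found=%s"
              % (elapsed, response.status_code, volume, valid))
    except requests.RequestException as err:
        print("request failed:", err)                      # counts against availability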

Imitating Devices
The way eValid imitates devices is through the User Agent String (UAS) (if you click this link you'll see the user agent that applies to your browser). What this does is set the string that the browser reports to the server to whatever you specify in your settings. eValid has a special SetUserAgent command that sets the string for a newly launched sub-window, and a similar command-line switch for batch runs.

Once the UAS is set, it stays that way for the duration of the run (or the duration of the sub-window). The server responds to every request as if it were responding to the browser on that particular device. In many cases the material that the server sends to you differs based on which UAS you're using: devices with smaller screens, or with limited bandwidth, tend to be sent smaller volumes of data.
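
eValid's SetUserAgent command is its own mechanism, but the underlying idea -- that the server's answer depends on the UAS you present -- can be illustrated with a small Python sketch (third-party requests library; the UAS value and URL are placeholders) that fetches the same page twice, once with the default desktop identification and once posing as a phone:

    import requests

    URL = "https://www.example.com/"        # placeholder site home page

    # Placeholder UAS for an imitated phone; substitute the real string for
    # whatever device you want to imitate.
    PHONE_UAS = "ExamplePhoneBrowser/1.0 (imitated mobile device)"

    desktop = requests.get(URL, timeout=30)                                   # default (desktop) UAS
    phone = requests.get(URL, headers={"User-Agent": PHONE_UAS}, timeout=30)  # imitated device UAS

    print("desktop bytes:", len(desktop.content))
    print("phone   bytes:", len(phone.content))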

The main advantage of this approach is that you can use eValid as a single point of measurement for all of the main factors that you need to look at in terms of quality of a web application: content, size, and speed. Because it is a single-point solution, measurements can be made in parallel for comparison purposes if you want. You could have 10 sub-windows open on your screen, each of them imitating a different device. eValid's regular reporting of HTML download speed and volume is available -- just as it is when eValid "imitates" the IE browser, of which it is an identical twin.
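
As a rough analogy to running several imitating sub-windows at once -- again in Python rather than eValid, with placeholder UAS values and URL -- here is a sketch that fetches the same page in parallel under several different device identities and compares the download volumes:

    from concurrent.futures import ThreadPoolExecutor
    import requests

    URL = "https://www.example.com/"    # placeholder page to compare across devices

    # Placeholder UAS values, one per imitated device.
    DEVICES = {
        "desktop": "ExampleDesktopBrowser/1.0",
        "phone":   "ExamplePhoneBrowser/1.0",
        "tablet":  "ExampleTabletBrowser/1.0",
    }

    def fetch(item):
        name, uas = item
        response = requests.get(URL, headers={"User-Agent": uas}, timeout=30)
        return name, len(response.content)

    # Fetch all of the device variants at the same time, roughly like
    # having several imitating sub-windows open in parallel.
    with ThreadPoolExecutor(max_workers=len(DEVICES)) as pool:
        for name, volume in pool.map(fetch, DEVICES.items()):
            print("%-8s %d bytes" % (name, volume))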

Examples
Here are some examples of how this kind of analysis works and the kinds of results you can get.
  • In this One Phone, Multiple Sites example you see eValid imitating an iPhone (a currently preeminent smartphone) as it navigates to six different website home pages. What is noteworthy is that the data volume ratio -- comparing the HTML delivered to the PC with that sent to the imitated device -- ranges from 5% to 93% (but is never 100%).
  • In this One Site, Multiple Devices example we've set eValid up to imitate five different mobile tablet devices, including an Apple iPad and a current Samsung Galaxy. In this case all of the devices are navigated to the Amazon.com home page. As before, the variance in what is displayed -- as well as the wide range in the data volume downloaded -- is striking.
  • As a final example, here is a run where we had 20 different mobile devices look at (read and render) the same Amazon.com landing page. The Selected Device Screenshots show how easy it is to confirm operation of a website with many different mobile devices at the same time. The total byte count for the downloaded page size is shown for each screenshot. This data was generated automatically with built-in capabilities of eValid as described in the Single Platform Testing page, which shows the parametric script we used to collect the data.
Summary
eValid can imitate and test any kind of mobile device reliably, with little technical difficulty, with full realism in terms of download timing and volume, and with the capability to inspect page layout and other aesthetics.

Tuesday, November 27, 2012

Latest User Forum Posts

Beginning in mid-2010 we have directed all technical support questions to the eValid User Forum. We have learned that when one user has an issue, all users can profit from the answer.

Here is an additional selection of some of the posts that we think would be of general interest.

Monday, November 26, 2012

What makes a browser test enabled?

The other day someone made this observation: "You say your browser is 'test enabled.' But what exactly does that mean?" Considering that this is a main architectural feature of the eValid product, the centerpiece of the eValid Test Suite, it's important to understand why we use that terminology.

About Experiments
The idea of any kind of testing is rooted in the notions of experimental science: Known inputs applied to an "application under test (AUT)" produce results that imply facts about the AUT.

This isn't different whether what's being tested is hardware or software. The Software Testing process for a web-browser-enabled application typically involves applying a Test Method through a Testbed that permits application of Test Data to the AUT -- here, the web application itself is the AUT. Each Software Test constitutes one instance of a scientifically oriented Experiment, the purpose of which is to accomplish successful Verification and Validation of the web application.

How Do You Test Software?
To test software, you need a place to run it -- to feed it data and make observations about what happens when you do. That is, a Testbed.

An obvious choice for a testbed for a web browser enabled application is to use a Web Browser as the launch-point for individual tests in the experiments performed to establish web application quality.

OK, you say, great. But how do you make a browser into a testbed for software applications? Every browser can test web applications after a fashion: you type, you look, you can even take a screenshot. But how can you mechanize that?

Browser Architecture Exploited by eValid
This is where eValid is different from regular browsers. The inventive thing we did was to craft the test enablement from the components of a regular browser. eValid has a separate script recording and script playback engine that is built into the browser executable, but this is done in a way that does not change the way the browser works (so the results are accurate), and does not modify how JavaScript operates inside the browser (so the test browser can continue to run complex AJAX applications).

The details, of course, can be complicated, but eValid makes use of resources that every browser automatically has built in. The main resource that eValid uses is the Document Object Model (DOM), which provides direct and immediate facts about a page that has been read and rendered into the browser face. Having this resource available makes it straightforward to validate and verify a web application.

The conceptually simple approach that eValid takes provides a number of very important practical advantages. The main one is verisimilitude -- the tests run by the eValid browser with its testing augmentation are as similar as possible to what happens with an actual browser. The advantages also include low overhead, the ability to extract accurate timings and component byte counts, and the ability to assure test synchronization using non-interfering methods that interrogate the DOM.
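
eValid's recording and playback engine is built into the browser itself, but the general pattern of DOM-based synchronization and validation can be sketched in another toolchain. The following Python/Selenium fragment -- the element ID, URL, and expected text are hypothetical -- waits for a DOM element to appear (synchronization) and then checks its rendered content (validation):

    from selenium import webdriver
    from selenium.webdriver.common.by import By
    from selenium.webdriver.support.ui import WebDriverWait
    from selenium.webdriver.support import expected_conditions as EC

    driver = webdriver.Chrome()
    try:
        driver.get("https://www.example.com/")       # placeholder application URL

        # Synchronize: wait until the DOM actually contains the element of
        # interest, instead of sleeping for a fixed amount of time.
        element = WebDriverWait(driver, 10).until(
            EC.presence_of_element_located((By.ID, "result"))   # hypothetical element ID
        )

        # Validate: check the rendered content directly from the DOM.
        if "Expected text" in element.text:                      # hypothetical expected value
            print("PASS")
        else:
            print("FAIL")
    finally:
        driver.quit()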

How It Plays Together
From a scientific point of view let's see how the eValid architecture shapes up compared with a conventional experimental setup:

  • Scientific Experiment: The set of hypotheses -- the test data -- applied to the test object to elicit the behavior of interest.
    eValid Test Perspective: The eValid script, with its instructions on how to drive the browser through the application.
  • Scientific Experiment: The testbed setup that reveals the results of applying the test data.
    eValid Test Perspective: The resulting rendered page(s) that the input script directs the browser to visit.
  • Scientific Experiment: The measurements from the test object used to confirm (or refute) the hypotheses.
    eValid Test Perspective: The set of validations or checks that determine whether the test PASSes or FAILs.

 In short, the eValid testbed system really is a Test Enabled Web Browser.

Tuesday, November 20, 2012

How eValid Supports Software Development Out-Sourcing Firms

We all know that a lot of web application development is out-sourced to very competent technical organizations worldwide. Out-sourcing this work to offshore firms isn't seen as a negative any more -- just a more efficient way to get the job done.

A typical offshore out-sourcing firm has very skilled people who build and deliver a web application -- usually at very attractive (and low) cost.

But when it comes to testing the web application, the situation can often be quite different -- as happened the other day in a conversation with a very bright technical sales rep who contacted us about using his firm's services.

When asked how they test their applications, the answer came back that they use special, project-specific tools that are built "in house" by their engineers. But no, those are neither tools nor tests that they can export to their customer, sorry. In-house use only. Trust us.

We believe strongly that that kind of answer serves nobody. Customers have the right to see how their application was tested. But, as our intrepid sales rep went on, all of the "big vendors" make that very difficult for them, because they require the customer to purchase the test engine, often at prohibitively high cost.

 For such situations, when a firm is rightly concerned with delivering a quality product at a fair cost, the eValid solution can break this impasse.

eValid offers the following:
  • A low cost, fully productized solution that competes with the "big guys" products.
  • Full technical support to your team, even if you're working for the customer.
  • Authorization to deliver eValid outputs (your test suite) to the customer.
  • Reseller discount if you buy eValid on behalf of the customer.
  • Rebate if your recommendation leads to a sale of eValid to the customer.
The point is, eValid makes an ideal test engine for use by companies who don't want to be limited by the contractual norms and sales expectations of the "big guys".

Friday, November 16, 2012

Latest User Forum Posts

Beginning in mid-2010 we have directed all technical support questions to the eValid User Forum. We have learned that when one user has an issue, all users can profit from the answer.

Here is an additional selection of some of the posts that we think would be of general interest.

Friday, November 9, 2012

Latest User Forum Posts

Beginning in mid-2010 we have directed all technical support questions to the eValid User Forum. We have learned that when one user has an issue, all users can profit from the answer.

Here is an additional selection of some of the posts that we think would be of general interest.