Monday, December 31, 2012

New eValid Patent Issued

In our continuing effort to beef up the Intellectual Property surrounding the eValid product, we are pleased to announce that an additional patent has been approved by the USPTO. Here is the complete text of US Patent #8,327,271, issued 4 December 2012.

This patent joins two other previously issued patents, and is soon to be joined by other patents covering the eValid product and the technology used to implement it. You may also want to take a look at the Business Development (BizDev) Opportunities that are based on eValid technology.

Thursday, December 13, 2012

Selected Recent Forum Posts

Beginning in mid-2010 we have directed all technical support questions to the eValid User Forum. We have learned that when one user has an issue, all users can profit from the answer.

Here is an additional selection of some of the posts that we think would be of general interest.

Thursday, December 6, 2012

Latest User Forum Posts

Beginning in mid-2010 we have directed all technical support questions to the eValid User Forum. We have learned that when one user has an issue, all users can profit from the answer.

Here is an additional selection of some of the posts that we think would be of general interest.

Friday, November 30, 2012

Latest User Forum Posts

Beginning in mid-2010 we have directed all technical support questions to the eValid User Forum. We have learned that when one user has an issue, all users can profit from the answer.

Here is an additional selection of some of the posts that we think would be of general interest.

New User Forum Format Introduced

Our eValid User Forum has a new look and feel. More compact, more efficient, prettier.

One of the new things to look for is a new CAPTCHA in the user login section. That's one of the reasons for the upgrade: believe it or not, the image-based CAPTCHA used before had been hacked -- we were seeing several hundred completely bogus logins per day. So it's a double win: a prettier new format, and a new, more effective question/answer based CAPTCHA.

FYI, the current statistics on the eValid User Forum show: 3240+ posts, 1560+ topics, 14,000+ members.

Wednesday, November 28, 2012

Testing Mobile Device Web Applications

According to recent trade press articles, by 2016 there will be more mobile devices than people. Really! More than one per person, worldwide.

The uses to which these devices are put -- telecommunications (making phone calls), texting, social networking, shopping, photography, book reading, etc. -- often involve very significant web interaction. Web "apps" -- which run natively on mobile devices -- also often have a great deal of website interaction, even though they run natively on the device.

Our primary interest is in the quality control testing and analysis of applications that run on mobile devices using some form of browser to interact with a website. This is a fairly broad area, and involves tens of thousands of applications ranging from bus-system arrival time updates to online shopping. Given this focus on quality, what are the components of web application behavior that are the most important? Here's a starting list -- what most users would consider important -- the key performance indicators:
  • Appearance: How does the web application appear to the user on this device? Are all the essential items of information present?
  • [Relative] Performance: How fast is the application -- when delivered to the device? Does the speed vary with the device? How much?
  • Availability: What is the uptime percentage? How often is the application served incorrectly, or not at all?
  • Data Volume: How much data does the server deliver to this device? Does the amount of data vary, device to device? What are the biggest items being sent over the internet to the device? How essential are they?
  • Response Validity: Is the information presented to the user correct? Are the answers the ones you expect? Is anything left out that should be there but isn't?
No single test solution can address 100% of these factors -- that's beyond the state of the art. But we believe that the eValid test engine can cover a majority of these concerns by providing quantitative measurements of key quality factors. It can do this because it can imitate devices realistically and can make detailed measurements "as if" the web pages were being sent to the actual device, even though the device is only imitated.

Imitating Devices
The way eValid imitates devices is through the User Agent String (UAS) (if you click this link you'll see the user agent that applies to your browser). Setting the UAS changes the identifying string that the browser reports to the server to whatever you specify in your settings. eValid has a special SetUserAgent command that sets the string for a newly launched sub-window, and a similar command line switch for a batch run.

Once the UAS is set, it stays that way for the duration of the run (or the duration of the sub-window). The server responds to every request as if it were responding to the browser on that particular device. In many cases the material that the server sends you differs based on which UAS you're using. Devices with smaller screens, or with limited bandwidth, tend to be sent smaller volumes of data.

The main advantage of this approach is that you can use eValid as a single point of measurement for all of the main factors that you need to look at in terms of the quality of a web application: content, size, and speed. Because it is a single-point solution, measurements can be made in parallel for comparison purposes if you want: you could have 10 sub-windows open on your screen, each of them imitating a different device. eValid's regular reporting of HTML download speed and volume is available -- just as it is when eValid "imitates" the IE browser, of which it is an identical twin.
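To make the mechanism concrete, here is a minimal sketch of the same idea in plain Python (this is not eValid code, and it measures only the base HTML document rather than every page component the way eValid does). The URL and the two User Agent Strings are placeholders; the script fetches the same page once as a desktop PC and once as an imitated mobile device, then compares delivered byte counts, download times, and the resulting data volume ratio.

    import time
    import urllib.request

    # Placeholder target page and example UAS values; substitute your own.
    URL = "http://www.example.com/"
    AGENTS = {
        "Desktop PC": "Mozilla/5.0 (Windows NT 6.1; Trident/7.0; rv:11.0) like Gecko",
        "Mobile device": ("Mozilla/5.0 (iPhone; CPU iPhone OS 5_0 like Mac OS X) "
                          "AppleWebKit/534.46 (KHTML, like Gecko) Mobile/9A334"),
    }

    def fetch(url, user_agent):
        """Download one page with the given User-Agent; return (bytes, seconds)."""
        request = urllib.request.Request(url, headers={"User-Agent": user_agent})
        start = time.time()
        body = urllib.request.urlopen(request, timeout=30).read()
        return len(body), time.time() - start

    sizes = {}
    for name, uas in AGENTS.items():
        size, elapsed = fetch(URL, uas)
        sizes[name] = size
        print("%s: %d bytes in %.2f s" % (name, size, elapsed))

    # Data volume ratio: mobile payload relative to the desktop payload.
    ratio = 100.0 * sizes["Mobile device"] / sizes["Desktop PC"]
    print("Mobile/PC data volume ratio: %.0f%%" % ratio)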

Examples
Here are some examples of how this kind of analysis works and the kinds of results you can get.
  • In this One Phone, Multiple Sites example you see eValid imitating an iPhone (a currently preeminent smartphone) as it navigates to six different website home pages. What is noteworthy is the data volume ratio: comparing the HTML delivered to the PC with that sent to the imitated device, the ratio ranges from 5% to 93% (but is never 100%).
  • In this One Site, Multiple Devices example we've set eValid up to imitate five different mobile tablet devices, including an Apple iPad and a current Samsung Galaxy. In this case all of the devices are navigated to the Amazon.com home page. As before, the variance in what is displayed -- as well as the wide range in the data volume downloaded -- is striking.
  • As a final example, here is a run where we had 20 different mobile devices look at (read and render) the same Amazon.com landing page. The Selected Device Screenshots show how easy it is to confirm operation of a website with many different mobile devices at the same time. The total byte count for the downloaded page size is shown for each screenshot. This data was generated automatically with built-in capabilities of eValid as described in the Single Platform Testing page, which shows the parametric script we used to collect the data.
Summary
eValid can imitate and test any kind of mobile device reliably, with little technical difficulty, and with full realism in terms of download timing and capacity, and with the capability to inspect page layout and other aesthetics.

Tuesday, November 27, 2012

Latest User Forum Posts

Beginning in mid-2010 we have directed all technical support questions to the eValid User Forum. We have learned that when one user has an issue, all users can profit from the answer.

Here is an additional selection of some of the posts that we think would be of general interest.

Monday, November 26, 2012

What makes a browser test enabled?

The other day someone made this observation: "You say your browser is 'test enabled.' But what exactly does that mean?" Considering that this is a main architectural feature of the eValid product, the centerpiece of the eValid Test Suite, it's important to understand why we use that terminology.

About Experiments
The idea of any kind of testing is rooted in the notions of experimental science: Known inputs applied to an "application under test (AUT)" produce results that imply facts about the AUT.

This isn't different whether what's being tested is hardware or software. The Software Testing process for a web browser enabled application typically involves applying a Test Method through a Testbed that permits application of Test Data to the AUT -- here the web application itself is the AUT. Each Software Test constitutes one instance of a scientifically oriented Experiment whose purpose is to accomplish successful Verification and Validation of the web application.

How Do You Test Software?
To test software, you need a place to run it -- to feed it data and make observations about what happens when you do. That is, a Testbed.

An obvious choice for a testbed for a web browser enabled application is to use a Web Browser as the launch-point for individual tests in the experiments performed to establish web application quality.

OK, you say, great. But how do you make a browser into a testbed for software applications? Any browser can be used to test web applications by hand: you type, you see, you can even take a screenshot. But how can you mechanize that?

Browser Architecture Exploited by eValid
This is where eValid is different from regular browsers. The inventive thing we did is craft the test enablement using the components of a regular browser. eValid has a separate script recording and script playback engine that is built into the browser executable, but this is done in a way that does not change the way the browser works (so the results are accurate), and does not modify how JavaScript operates inside the browser (so the test browser can continue to run complex AJAX applications).

The details, of course, can be complicated, but eValid makes use of resources that every browser automatically has built in. The main resource that eValid uses is the Document Object Model (DOM), which provides direct and immediate facts about a page that has been read and rendered in the browser window. Having this resource available makes it easy and straightforward to validate and verify a web application.

The conceptually simple approach that eValid takes provides a number of very important practical advantages. The main one is verisimilitude -- that the tests run by the eValid browser with its testing augmentation are as similar as possible to what happens with an actual browser. The advantages also include low overhead, ability to extract accurate timings and component byte counts, and the ability to assure test synchronization using non-interfering methods that interrogate the DOM.
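As a rough, generic illustration of what "facts from the DOM" means, here is a short Python sketch. It is not eValid's built-in command set, and it works only on the delivered HTML rather than on the live, rendered DOM the way eValid does; the URL and the expected link are placeholders. It parses the page's markup and verifies a structural fact about it -- here, that a particular link is present.

    import urllib.request
    from html.parser import HTMLParser

    class LinkCollector(HTMLParser):
        """Gather the href attribute of every <a> element on the page."""
        def __init__(self):
            super().__init__()
            self.links = []
        def handle_starttag(self, tag, attrs):
            if tag == "a":
                href = dict(attrs).get("href")
                if href:
                    self.links.append(href)

    URL = "http://www.example.com/"    # page under test (placeholder)
    EXPECTED_LINK = "/checkout"        # link the page must contain (placeholder)

    html = urllib.request.urlopen(URL, timeout=30).read().decode("utf-8", "replace")
    collector = LinkCollector()
    collector.feed(html)

    print("%d links found" % len(collector.links))
    print("PASS" if EXPECTED_LINK in collector.links else "FAIL: expected link missing")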

How It Plays Together
From a scientific point of view let's see how the eValid architecture shapes up compared with a conventional experimental setup:

  • Scientific experiment: the set of hypotheses -- the test data -- that you apply to the test object to illustrate the needed behavior. eValid test perspective: the eValid script, with its instructions on how to drive the browser through the application.
  • Scientific experiment: the testbed setup that reveals the results of applying the test data. eValid test perspective: the resulting rendered page(s) that the input script directs the browser to visit.
  • Scientific experiment: the measurements from the test object that are used to verify that the hypotheses are true (or are not). eValid test perspective: the set of validations or checks that determine whether the test PASSes or FAILs.

 In short, the eValid testbed system really is a Test Enabled Web Browser.

Tuesday, November 20, 2012

How eValid Supports Software Development Out-Sourcing Firms

We all know that a lot of web application development is out-sourced to very competent technical organizations worldwide. Out-sourcing this work to offshore firms isn't seen as a negative any more -- just a more efficient way to get the job done.

A typical offshore out-sourcing firm has very skilled people who build and deliver a web application -- usually at very attractive (and low) cost.

But, when it comes to testing the web application, the situation can often be quite different. That's what happened the other day in a conversation with a very bright technical sales rep who contacted us about using his firm's services.

When asked how they test their applications, the answer came back that they use special, project-specific tools that are built "in house" by their engineers. But no, neither the tools nor the tests are something they can export to their customer, sorry. In-house use only. Trust us.

We believe strongly that that kind of answer serves nobody. Customers have the right to see how their application was tested. But, as our intrepid sales rep went on, all of the "big vendors" make that very difficult for them, because they require the customer to purchase the test engine, often at prohibitively high cost.

 For such situations, when a firm is rightly concerned with delivering a quality product at a fair cost, the eValid solution can break this impasse.

eValid offers the following:
  • A low cost, fully productized solution that competes with the "big guys" products.
  • Full technical support to your team, even if you're working for the customer.
  • Authorization to deliver eValid outputs (your test suite) to the customer.
  • Reseller discount if you buy eValid on behalf of the customer.
  • Rebate if your recommendation leads to a sale of eValid to the customer.
The point is, eValid makes an ideal test engine for use by companies who don't want to be limited by the contractual norms and sales expectations of the "big guys".

Friday, November 16, 2012

Latest User Forum Posts

Beginning in mid-2010 we have directed all technical support questions to the eValid User Forum. We have learned that when one user has an issue, all users can profit from the answer.

Here is an additional selection of some of the posts that we think would be of general interest.

Friday, November 9, 2012

Latest User Forum Posts

Beginning in mid-2010 we have directed all technical support questions to the eValid User Forum. We have learned that when one user has an issue, all users can profit from the answer.

Here is an additional selection of some of the posts that we think would be of general interest.

Monday, October 29, 2012

Latest User Forum Posts

Beginning in mid-2010 we have directed all technical support questions to the eValid User Forum. We have learned that when one user has an issue, all users can profit from the answer.

Here is an additional selection of some of the posts that we think would be of general interest.

Tuesday, October 23, 2012

Latest User Forum Posts

Beginning in mid-2010 we have directed all technical support questions to the eValid User Forum. We have learned that when one user has an issue, all users can profit from the answer.

Here is an additional selection of some of the posts that we think would be of general interest.

Tuesday, October 16, 2012

Latest User Forum Posts

Beginning in mid-2010 we have directed all technical support questions to the eValid User Forum. We have learned that when one user has an issue, all users can profit from the answer.

Here is an additional selection of some of the posts that we think would be of general interest.

Monday, October 8, 2012

Webinar: Comparative Mobile Device Performance Testing

Comparative Mobile Device Performance Testing

Run Simultaneous Tests of Mobile Browser Web Apps
Full AJAX Realism
Imitate Any Mobile Device
Identify Performance Issues Due To Load
Register
Wednesday, 24 October 2012
2:00 PM Eastern Time / 11:00 AM Pacific Time

QA/Testing/Tuning/Performance projects need to qualify relative performance of complex AJAX web applications -- within strict budget and time constraints -- to make sure their server-stack setups can meet the load for multiple devices.

The traditional methods of studying mobile app performance, based on using HTTP/S simulations or "VUs", don't always work when asynchronous AJAX applications are involved. VU's don't do AJAX. You need a real browser.

eValid mobile device imitation technology offers quick-to-create, realistic, and fully synchronized AJAX functional tests. Plus you can lift those tests into parallel performance/loading scenarios that can involve 100's or even 1,000's of Browser Users ("BUs") per machine.

In this webinar you'll learn: how special eValid commands overcome problems with variable-time playback dependency; how to create full-reality AJAX tests quickly; how to "dress" eValid to imitate ANY mobile device; how to adjust tests to be totally self-synchronizing under stressed AJAX conditions; how to incorporate tests in an eValid LoadTest scenario; how to collect specific device-dependent performance data; how to run 100's or even 1,000's of tests in parallel using multiple Browser User (BU) instances; and, how to analyze consolidated performance summary data to identify server-stack bottlenecks.

This unique approach demonstrates how eValid becomes a genuine force multiplier in your web application performance testing efforts.

Webinar Topic Summary:
  • eValid Architecture and Structure: How eValid functional and performance testing works.
  • Functional Testing: How to make reliable recordings of AJAX applications.
  • Making AJAX Tests "Desktop Safe": How to augment tests for complete AJAX synchronization.
  • Creating Multi-Device Performance Test Scenarios: How to use the eValid scenario editor to organize realistic multi-device control scripts.
  • Running Performance Tests: How to launch single and multiple-instance runs using "cloud computing" resources.
  • Finding Bottlenecks: How to read the collected data, and how to use other information to help spot server-stack issues.
 
You are cordially invited to attend this free Webinar.
Register now

Tuesday, September 25, 2012

Latest User Forum Posts

Beginning in mid-2010 we have directed all technical support questions to the eValid User Forum. We have learned that when one user has an issue, all users can profit from the answer.

Here is an additional selection of some of the posts that we think would be of general interest.

Wednesday, September 19, 2012

Latest User Forum Posts

Beginning in mid-2010 we have directed all technical support questions to the eValid User Forum. We have learned that when one user has an issue, all users can profit from the answer.

Here is an additional selection of some of the posts that we think would be of general interest.

Thursday, August 30, 2012

Monitoring Data Collected By eValid

The use of eValid as a "smart" monitoring agent continues to grow. Customers are finding that eValid can monitor a complex AJAX page, involving complicated interactive transactions, with very high reliability and repeatability.

The key to this is the "structural testing" capability in eValid, which works by directly accessing the DOM with eValid's DOM manipulation commands. Here is an outline of how Structural/Algorithmic Testing compares with regular record/play testing. In addition, as the chart shows, eValid's Structural/Algorithmic tests can be extended to a fully programmatic mode using the eValid Programmatic Interface (EPI) for C++.
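For readers who want a feel for how such a structural check might look, here is a rough, generic sketch in ordinary Python. It is not eValid's structural command set, and it inspects only the delivered HTML rather than the live DOM of the rendered page; the URL, expected title, and polling interval are placeholders. On every cycle it fetches the page, checks a structural fact about it -- here, the text of the <title> element -- and logs PASS or FAIL together with the response time.

    import time
    import urllib.request
    from html.parser import HTMLParser

    class TitleParser(HTMLParser):
        """Collect the text content of the page's <title> element."""
        def __init__(self):
            super().__init__()
            self.in_title = False
            self.title = ""
        def handle_starttag(self, tag, attrs):
            if tag == "title":
                self.in_title = True
        def handle_endtag(self, tag):
            if tag == "title":
                self.in_title = False
        def handle_data(self, data):
            if self.in_title:
                self.title += data

    URL = "http://www.example.com/"     # monitored page (placeholder)
    EXPECTED_TITLE = "Example Domain"   # expected structural fact (placeholder)
    INTERVAL = 300                      # seconds between monitoring cycles

    while True:
        start = time.time()
        html = urllib.request.urlopen(URL, timeout=30).read().decode("utf-8", "replace")
        parser = TitleParser()
        parser.feed(html)
        status = "PASS" if parser.title.strip() == EXPECTED_TITLE else "FAIL"
        print("%s  %s  %.2fs  title=%r" %
              (time.ctime(), status, time.time() - start, parser.title))
        time.sleep(INTERVAL)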

Here is a short description of How Well eValid Works in this kind of monitoring application, taken from the website of our Swiss Monitoring Partner RealStuff Informatik AG. Installations they have made at local companies in Switzerland rely on the advanced Structural/Algorithmic approach for much of the monitoring done.

Monday, August 13, 2012

Latest User Forum Posts


Beginning in mid-2010 we have directed all technical support questions to the eValid User Forum. We have learned that when one user has an issue, all users can profit from the answer.

Here is an additional selection of some of the posts that we think would be of general interest.

Thursday, August 9, 2012

Selected User Forum Posts

Beginning in mid-2010 we have directed all technical support questions to the eValid User Forum. We have learned that when one user has an issue, all users can profit from the answer.

Here is an additional selection of some of the posts that we think would be of general interest.

Monday, August 6, 2012

Academic Users Review eValid After Real-World Project Use

Since 2009, eValid has been part of student projects at the Center for Systems and Software Engineering (CSSE) at the University of Southern California (USC), under the leadership of Dr. Barry Boehm and CSSE Assistant Director Prof. A. Winsor Brown. For the new academic year, Dr. Sue Koolmanojwong has decided to continue using eValid in her two-semester course.

In the two-semester team project based course sequence CSCI577a/b, students learn to use best software engineering practices to develop software systems for real clients. They adopt the Incremental Commitment Model (ICM) to develop software projects, and in most cases, the students have little to no industry experience and little project management knowledge.

During the second half of the software engineering course sequence, the student teams implement and test the systems that they have designed. These software systems are then put to use in real operational scenarios by real clients; therefore, they require a certain level of testing to verify and validate that the quality of service, or level of service, of the systems can be achieved. For the past few years, some of the project teams have been utilizing eValid to automate their testing process as well as to prove their achievement of the level-of-service goals, especially for load and performance testing. At the end of each project life cycle, students get the chance to reflect on and review the entire development process that they went through, including the testing process.

PhD student P. Aroonvatanaporn, came to this conclusion: "Despite certain areas that eValid may be lacking, it has been proven that the eValid tool has played significant roles in help creating successful projects in the software engineering course at USC. The projects developed and deployed as a result of this course have achieved roughly 90% on-time delivery and client satisfactory. Part of this success comes from the use of good testing suites, such as eValid, to ensure product quality before they are delivered to the clients."

Here are some of the students' specific observations:
  • "Our team found it [eValid] very easy to use as we had to check for large amount of data in form of documents, images and digitized documents for load testing. Moreover it performs line-by-line checking and catches any loose ends that have been overlooked." -- A. Paradkar (Team 13).
  • "This tool is very useful and powerful; it not only can perform functional testing but also can perform pressure [load] testing." -- Qi Li, (Team 12).
  • "I used eValid to test 10 concurrent users. I first recorded a playback of one of our test cases. I then created a load test that ran the saved playback 10 times, with a wait period of 0 between each call. When I ran the load test, eValid opened 9 more instances of itself and ran the playback in each instance" -- W. Halverson (Team 5).
  • "eValid provides the user with an interface that is simple and intuitive enough that it does not require users to high technical knowledge to use the tool." -- P. Aroonvatanaporn (Team Administrator)
  • "[eValid] could be improved by making the Load Test Functionality automated. Modification of a Test Script should be minimum and support [the] change [of] number of users to any amount should be configurable through the GUI. Similarly, for Performance Testing one should be able to set the timer at any given point in the sequence of action/activity recorded." -- V. Bhaskar Rao (Team 3). (Our observation is that this student did not read the manual, as both of his requirements are actually implemented in the eValid Product.)
We are very happy to provide the CSSE students the opportunity to make good use of our product and to have a hands-on learning experience with real-world applications.

We certainly appreciate their work and the work of everyone at USC/CSSE, and we especially thank Mr. Aroonvatanaporn for this very positive review.

Wednesday, August 1, 2012

Do you ever take on "test suite development" work?

In our User Forum, Kobep wrote:
Do you ever take on "test suite development" work?

Our primary goal is to support the eValid product suite, but yes, on occasion we do take on the job of building a test suite for a customer. It's important to note that we are not primarily a service organization, and as a result we do NOT send any script development work offshore.

A typical test suite that we would develop would be for a relatively stable product, for which there is reasonably stable documentation. Our goal would be to build up a suite of tests -- a nominal count of 100 tests, arranged ideally in ten groups of ten tests. While we welcome the chance to make the tests powerful and effective -- and we generally succeed in this goal -- the main purpose of such a project is to help do a "technology transfer" to the customer.

We've found over and over that when a customer has a set of tests in the customer's own context -- they run on the customer's machines and exercise a web application that is familiar to all of the staff involved on the customer side -- the suite works very well as a mechanism for the customer to take over long-term maintenance. Having the context, a working sample, and ownership of the actual tests all serve to make it very easy for customer staff members to understand and maintain/modify/upgrade the tests.

In some cases, when there is some structural testing script modification that has to be done, then the examples in the nominal 100-case suite also serve as the basis for future modification.

The good news is that we like to do these projects on a very short term basis -- in some cases it's taken less than a week or two to flesh out tests for an application. And, yes, some of those tests take a LOT longer than most of them; there always seems to be a rough one in the bunch.

If you're interested in the details, give us a call.

eValid Support


Tuesday, July 31, 2012

Starting eValid with Common Device Settings

We have been asked many times, "What is the best way to have eValid imitate mobile device XYZ?"

The best solution is to start eValid using the -AGENT switch to specify the User Agent String (UAS) that eValid is supposed to use.

When eValid starts up it will continue to use that particular UAS until you close that instance.  All of the browsing you do in this mode will be reported to the server as arising from that specified device.
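As an illustration only: the -AGENT switch is described in the eValid documentation, but the executable name, argument layout, and UAS strings below are assumptions, so take the exact startup sequences from the Common Devices page linked below. This small Python sketch keeps a table of device UAS values and launches an instance dressed as a chosen device.

    import subprocess

    # Example UAS strings only; take exact values from the Common Devices page.
    DEVICE_AGENTS = {
        "iPhone": ("Mozilla/5.0 (iPhone; CPU iPhone OS 5_0 like Mac OS X) "
                   "AppleWebKit/534.46 (KHTML, like Gecko) Mobile/9A334"),
        "iPad":   ("Mozilla/5.0 (iPad; CPU OS 5_0 like Mac OS X) "
                   "AppleWebKit/534.46 (KHTML, like Gecko) Mobile/9A334"),
    }

    def launch_as(device):
        """Start an eValid instance reporting the given device's UAS.

        NOTE: "eValid.exe" and the argument layout are placeholders -- consult
        the product documentation for the real startup sequence.
        """
        subprocess.Popen(["eValid.exe", "-AGENT", DEVICE_AGENTS[device]])

    launch_as("iPhone")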

To see how this appears you might want to look at this page:
Starting eValid With A Specific User Agent String (UAS).

This page illustrates how different the delivered HTML appears when it has been tailored by the web server for the particular device. (In this case, an iPad and an iPhone).

Here are a dozen start up sequences for some Common Devices.

Tuesday, July 24, 2012

Selected User Forum Posts

Beginning in mid-2010 we have directed all technical support questions to the eValid User Forum. We have learned that when one user has an issue, all users can profit from the answer.

Here is an additional selection of some of the posts that we think would be of general interest.

Friday, July 20, 2012

Selected User Forum Posts

Beginning in mid-2010 we have directed all technical support questions to the eValid User Forum. We have learned that when one user has an issue, all users can profit from the answer.

Here is an additional selection of some of the posts that we think would be of general interest.

Sunday, July 15, 2012

Selected User Forum Posts

Beginning in mid-2010 we have directed all technical support questions to the eValid User Forum. We have learned that when one user has an issue, all users can profit from the answer.

Here is an additional selection of some of the posts that we think would be of general interest.

  1. How important is "realism" in load testing? -- Some of the background on how eValid assures realistic testing. 
  2. Time Modal Dialogs -- Hints on timing modal dialogs.
  3. Can eValid test AJAX apps that run on mobile devices? -- Dressing eValid to look like a mobile device that runs an AJAX app. 
  4. eValid record from the Google search input field -- Recording in the presence of an autocomplete input field. 
  5. Which databases and which monitoring environments -- Suggestions about capturing and displaying eValid monitoring data.

Wednesday, June 20, 2012

Handling Modal Dialogs in Microsoft Dynamics

The Microsoft Dynamics application provides tools for CRM and ERP in a very powerful and sophisticated online context.

As part of a recent customer support exercise, some issues arose in having eValid process some Microsoft Dynamics modal dialogs. Specifically, the problem the customer had was being able to successfully dismiss (terminate) a modal dialog that was launched as part of a normal test.

Two routes to dismissing the modal dialog are discussed. One uses an available eValid utility, eVclick.exe, that sends a left click to a particular spot on the desktop after a fixed time delay. The time delay allows the modal page to fully render before being dismissed.
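To show the shape of that first approach, here is a minimal sketch of the timed-click idea in Python. This is not eVclick.exe itself; it uses the third-party pyautogui package, and the delay and coordinates are placeholders for your own modal dialog's layout. It waits long enough for the modal page to render, then sends one left click at a fixed desktop position.

    import time
    import pyautogui   # third-party package for sending desktop mouse events

    DELAY_SECONDS = 5            # give the modal page time to render fully
    CLICK_X, CLICK_Y = 640, 480  # desktop position of the dialog's OK/Close button

    time.sleep(DELAY_SECONDS)          # fixed delay, as with the eVclick.exe approach
    pyautogui.click(CLICK_X, CLICK_Y)  # one left click at that spot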

The other route involves using the Dashboard to switch eValid from normal recording mode -- after the modal page appears -- into Desktop Recording mode, where the modal can be dismissed easily.

Our solution is described in detail in Testing Microsoft Dynamics -- Modal Dialog Processing.

Monday, June 18, 2012

Webinar: Load Testing Mobile Apps


Load Testing Mobile Apps

Run 1,000's of Mobile Browser Users
Full AJAX Realism
Any Mobile Device
Identify Performance Bottlenecks
Register
Wednesday, 27 June 2012
2:00 PM Eastern Time / 11:00 AM Pacific Time

QA/Testing/Tuning/Performance projects need to qualify performance of complex AJAX web applications -- within strict budget and time constraints -- to make sure their server-stack setups can meet the load. The traditional methods of ramping up load, based on using HTTP/S simulations or "VUs", don't always work when asynchronous AJAX applications are involved. VU's don't do AJAX. You need a browser.

eValid server loading methods offer both quick-to-create, realistic, and fully synchronized AJAX functional tests. Plus you can lift those tests into performance/loading scenarios that can involve 100's or 1,000's or 10,000's of Browser Users ("BUs") per machine.

In this webinar you'll learn: how special eValid commands overcome problems with variable-time playback dependency; how to create full-reality AJAX tests quickly; how to adjust tests to be totally self-synchronizing under stressed AJAX conditions; how to incorporate tests in an eValid LoadTest scenario; how to launch 100's or 1,000's or 10,000's of Browser User (BU) instances; and, how to analyze consolidated performance summary data to identify server-stack bottlenecks.

This unique approach demonstrates how eValid becomes a genuine force multiplier in your web application performance testing efforts.

Webinar Topic Summary:
  • eValid Architecture and Structure: How eValid functional and performance testing works.
  • Functional Testing: How to make reliable recordings of AJAX applications.
  • Making AJAX Tests "LoadTest Safe": How to augment tests for complete AJAX synchronization.
  • Creating LoadTest Scenarios: How to use the LoadTest scenario editor to organize realistic LoadTest control scripts.
  • Running LoadTests: How to launch single and multiple-instance runs using "cloud computing" resources.
  • Finding Bottlenecks: How to read the LoadTest and other raw data to help spot server-stack issues.
You are cordially invited to attend this free Webinar.
Register now

Monday, June 4, 2012

Sample Site Analysis Runs from iPhone

We had some questions the other day about whether the eValid approach to emulating a mobile device will let eValid do a site analysis of a web site from the perspective of a mobile device.

So we ran a couple of site analysis runs, aimed at a couple of local transportation systems, and everything worked fine.  Here are the two scans, both done while eValid is dressed as (and behaving like) an iPhone:

  • Site Analysis of BART
  • Site Analysis of SF MUNI

Friday, June 1, 2012

Selected User Forum Posts

Beginning in mid-2010 we have directed all technical support questions to the eValid User Forum. We have learned that when one user has an issue, all users can profit from the answer.

Here is an additional selection of some of the posts that we think would be of general interest.

Thursday, May 31, 2012

Selected User Forum Posts

Beginning in mid-2010 we have directed all technical support questions to the eValid User Forum. We have learned that when one user has an issue, all users can profit from the answer.

Here is an additional selection of some of the posts that we think would be of general interest.

Monday, May 21, 2012

Is eValid done with JavaScript? Why not?

In our User Forum, Amirr wrote:
Is eValid done with JavaScript? Why not?

No, eValid is NOT implemented with JavaScript, and there are several very good reasons why we chose to implement eValid the way we did. Here are some of them:
  • JavaScript runs in a single thread inside the browser, and the eValid view is that when the JavaScript engine is running, the test engine should NEVER interfere with what the browser is doing naturally. To do so would be to violate a fundamental principle: Don't interfere with the process under test -- in this case, with the normal operation of the browser.
  • If you are running an AJAX application, then the most common problem in testing is to assure synchronization of playback. Even though eValid records realistic "wait times" during the recording process, we NEVER suggest that relying on wait times for synchronization of AJAX is a reliable approach.
  • In fact, it's usually just the opposite, as we have found out many times. Think of it this way: no matter how well designed your "user wait times" are, there will always be an instance where the wait times you have put into your script are not long enough. The consequences of a failure to sync in a test process are severe: most of the time the test is ruined.
  • eValid actually uses a separate engine -- running in a thread of the browser process that does not interfere with normal browsing activity -- to run commands that "validate and synchronize on DOM property values." There is a family of these commands, with both positive and negative synchronization modes; a rough sketch of the idea appears after this list.
  • The key feature of eValid's architecture -- to avoid anything more than the minimum interference with the browser as it does its work, under control of the eValid recording or playback engine -- assures that the results you obtain with eValid are as realistic as possible. We very firmly believe that unnecessary interference with the behavior of the web application you are testing is a very serious "no-no" -- and eValid does not cross that line.
  • We understand there are several other web testing solutions that are based on the use of JavaScript and the APIs available within the browser, but we specifically chose NOT to use that interface, for the reasons outlined above. We wanted eValid based web application tests to be accurate, reliable, credible and efficient.
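As a rough sketch of the synchronization idea mentioned above (plain Python, not eValid script syntax): instead of a fixed wait, poll a condition derived from the page until it becomes true (positive sync) or until it goes away (negative sync), with a hard timeout as the only fixed number. The page_contains helper in the usage comment is hypothetical.

    import time

    def sync_on(condition, timeout=60.0, interval=0.25, expect=True):
        """Poll condition() until it returns `expect`, or give up after `timeout` seconds."""
        deadline = time.time() + timeout
        while time.time() < deadline:
            if bool(condition()) == expect:
                return True       # synchronized -- safe to continue the test
            time.sleep(interval)
        return False              # timed out -- treat as a playback failure

    # Hypothetical usage: hold the next test step until a "Loading..." indicator
    # disappears, however long the AJAX request actually takes.
    #   ok = sync_on(lambda: page_contains("Loading..."), expect=False)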

Wednesday, May 16, 2012

How does eValid licensing work?

From time to time we are asked this question: "How does eValid licensing work?"  Here is a short summary of some of the available licensing options.
  • Regular single-machine, multi-user floating license:  You install eValid on one machine, and eValid is available to any number of users (user accounts) on the console or via Remote Desktop Connect (RDC), but at most one at a time.

    This is our most commonly used license. With it you can share the eValid copy among all members of a small team of testers.

  • Enterprise license: This is the same structure as above, but the EPRISE license manager allows multiple users on the console or by RDC.  For example, an EPRISE06 license allows up to six simultaneous users.
  • Commercial License:  This license requires web access, and eValid authenticates itself at launch time on a per-use basis by interrogating a special web page.  A commercial license can be organized as a subscription, or you can use the "pay per play" mode that is employed by firms that use eValid in monitoring mode.
  • Autoplay License: After special processing by eValid staff, this license allows any script you want to run on any machine, anywhere, for a fixed length of time.  The special AUTOPLAY script unlock key is generated by eValid on a 1-day response time.  There is a fixed fee and discount structure for multiple Autoplay licenses.
  • Corporate License.  In this case, you have complete freedom to use eValid anywhere in your firm, depending on how the agreement is negotiated.  Typical corporate licenses are for 100's of users.
eValid has a number of different sets of features -- we call them bundles -- that can be tailored for specific purposes and applications.  Examples include the functional testing, regression testing, monitoring, AJAX monitoring, monitoring-agent, single-user server loading, multiple-user server loading, and site analysis bundles. Any bundle of features can be implemented with any of the above types of licenses.

Please contact us for details and we can work out the licensing scheme that meets your needs.

Tuesday, May 8, 2012

Webinar: Load Testing Mobile Apps

Run 1,000's of Mobile Browser Users
Full AJAX Realism
Any Mobile Device
Identify Performance Bottlenecks
Register
Wednesday, 23 May 2012
2:00 PM Eastern Time / 11:00 AM Pacific Time

QA/Testing/Tuning/Performance projects need to qualify performance of complex AJAX web applications -- within strict budget and time constraints -- to make sure their server-stack setups can meet the load.
The traditional methods of ramping up load, based on using HTTP/S simulations or "VUs", don't always work when asynchronous AJAX applications are involved. VU's don't do AJAX. You need a browser.

eValid server loading methods offer both quick-to-create, realistic, and fully synchronized AJAX functional tests. Plus you can lift those tests into performance/loading scenarios that can involve 100's or 1,000's or 10,000's of Browser Users ("BUs") per machine.

In this webinar you'll learn: how special eValid commands overcome problems with variable-time playback dependency; how to create full-reality AJAX tests quickly; how to adjust tests to be totally self-synchronizing under stressed AJAX conditions; how to incorporate tests in an eValid LoadTest scenario; how to launch 100's or 1,000's or 10,000's of Browser User (BU) instances; and, how to analyze consolidated performance summary data to identify server-stack bottlenecks.
This unique approach demonstrates how eValid becomes a genuine force multiplier in your web application performance testing efforts.

Webinar Topic Summary:
  • eValid Architecture and Structure: How eValid functional and performance testing works.
  • Functional Testing: How to make reliable recordings of AJAX applications.
  • Making AJAX Tests "LoadTest Safe": How to augment tests for complete AJAX synchronization.
  • Creating LoadTest Scenarios: How to use the LoadTest scenario editor to organize realistic LoadTest control scripts.
  • Running LoadTests: How to launch single and multiple-instance runs using "cloud computing" resources.
  • Finding Bottlenecks: How to read the LoadTest and other raw data to help spot server-stack issues.
You are cordially invited to attend this free Webinar.
Register now

Friday, May 4, 2012

Selected User Forum Posts

Beginning in mid-2010 we have directed all technical support questions to the eValid User Forum. We have learned that when one user has an issue, all users can profit from the answer.

Here is an additional selection of some of the posts that we think would be of general interest.

Monday, April 30, 2012

Selected User Forum Posts

Beginning in mid-2010 we have directed all technical support questions to the eValid User Forum. We have learned that when one user has an issue, all users can profit from the answer.

Here is an additional selection of some of the posts that we think would be of general interest.

Friday, April 20, 2012

Selected User Forum Posts

Beginning in mid-2010 we have directed all technical support questions to the eValid User Forum. We have learned that when one user has an issue, all users can profit from the answer.

Here is an additional selection of some of the posts that we think would be of general interest.

Tuesday, April 10, 2012

Selected User Forum Posts

Beginning in mid-2010 we have directed all technical support questions to the eValid User Forum. We have learned that when one user has an issue, all users can profit from the answer.

Here is an additional selection of some of the posts that we think would be of general interest.

Monday, March 19, 2012

Mobile Device Testing Solution

Recently we have had a number of people showing interest in eValid as the basis for all kinds of testing of mobile or portable devices -- from smartphones to tablets to whatever.

Here is a link to a summary of how the eValid solution addresses the issues of testing mobile devices:

  http://www.e-valid.com/Products/Documentation.9/Mobile/mobile.solution.html

Wednesday, March 14, 2012

Selected User Forum Posts

Beginning in mid-2010 we have directed all technical support questions to the eValid User Forum. We have learned that when one user has an issue, all users can profit from the answer.

Here is an additional selection of some of the posts that we think would be of general interest.



Wednesday, March 7, 2012

Selected User Forum Posts

Beginning in mid-2010 we have directed all technical support questions to the eValid User Forum. We have learned that when one user has an issue, all users can profit from the answer.

Here is an additional selection of some of the posts that we think would be of general interest.

Friday, February 17, 2012

Product Documentation Updated, Expanded

From time to time we update the eValid documentation to include new descriptions and additional material that we have developed.  In the past few weeks our efforts have expanded the complete eValid V9 package by about 40% over the prior version.

In addition, we have updated the internal search capability to pick up as many pages as possible (previously it delivered matches only on a selected subset of pages).

Overall we believe eValid users who access the full product documentation from the eValid browser will appreciate the improvements.

Monday, January 30, 2012

Selected User Forum Posts

Beginning in mid-2010 we have directed all technical support questions to the eValid User Forum. We have learned that when one user has an issue, all users can profit from the answer.

Here is an additional selection of some of the posts that we think would be of general interest.

Tuesday, January 17, 2012

Webinar: Performance Testing Mobile Web Apps

Run 1,000's of Mobile Browser Users
Full AJAX Realism for Any Mobile Device
Easily Identify Performance Bottlenecks


Wednesday, 25 January 2012
2:00 PM Eastern Time / 11:00 AM Pacific Time

QA/Testing/Tuning/Performance projects need to qualify performance of complex AJAX web applications -- within strict budget and time constraints -- to make sure their server-stack setups can meet the load.

The traditional methods of ramping up load, based on using HTTP/S simulations or "VUs", don't always work when asynchronous AJAX applications are involved. VU's don't do AJAX. You need a browser.

eValid server loading methods offer both quick-to-create, realistic, and fully synchronized AJAX functional tests. Plus you can lift those tests into performance/loading scenarios that can involve 100's or 1,000's or 10,000's of Browser Users ("BUs") per machine.

In this webinar you'll learn: how special eValid commands overcome problems with variable-time playback dependency; how to create full-reality AJAX tests quickly; how to adjust tests to be totally self-synchronizing under stressed AJAX conditions; how to incorporate tests in an eValid LoadTest scenario; how to launch 100's or 1,000's or 10,000's of Browser User (BU) instances; and, how to analyze consolidated performance summary data to identify server-stack bottlenecks.

This unique approach demonstrates how eValid becomes a genuine force multiplier in your web application performance testing efforts.

Webinar Topic Summary:
  • eValid Architecture and Structure: How eValid functional and performance testing works.
  • Functional Testing: How to make reliable recordings of AJAX applications.
  • Making AJAX Tests "LoadTest Safe": How to augment tests for complete AJAX synchronization.
  • Creating LoadTest Scenarios: How to use the LoadTest scenario editor to organize realistic LoadTest control scripts.
  • Running LoadTests: How to launch single and multiple-instance runs using "cloud computing" resources.
  • Finding Bottlenecks: How to read the LoadTest and other raw data to help spot server-stack issues.
You are cordially invited to attend this free Webinar.
Register now

Wednesday, January 4, 2012

Mobile Testing FAQs

As eValid users know, eValid is a single-point solution for testing the functional quality, performance, and supporting server capacity for all kinds of web applications -- including Mobile Devices that run web applications.

We've prepared a Mobile Testing Technology -- Frequently Asked Questions (FAQ) page that answers some of the most common questions about how eValid fills this role. Please take a look, and don't hesitate to respond or ask questions.

Sunday, January 1, 2012

Happy New Year!

We would like to wish everyone in the eValid community -- users, resellers, and visitors and potential customers alike -- the very best of health, happiness and success in the coming year!

As more and more applications move into the "mobile" domain, you will see more and more challenges in testing these complex applications. As some of our technical work in the past several months has shown, the quality and relative performance of mobile applications can indeed be tested reliably and effectively. In coming months you can expect to see more and more about testing complex mobile web applications from the eValid team.

HAPPY NEW YEAR!