Monday, 11 September 2017

Test Automation in Large Scale



Digital businesses must deploy new functionality into production on a daily basis to take full advantage of the flexibility offered by the virtual world.

Automation for digital business is not science fiction. It’s real and happening today. For one of the largest U.S. Federal programs, more than half a million business processes were automated every day, in 2 hours, on more than 100 virtual machines.
In another example, a global manufacturer of luxury brands validates their core business processes worldwide with 600 hours of automation on more than 30 virtual machines every day!
We can see every day that today’s digital technology projects move on a swifter timeline than ever before, where typical projects last days-to-weeks rather than months-to-years. Large-scale automation makes it possible to deploy this new business functionality early and often – while significantly mitigating the risk of business disruption or major glitches along the way.
The real-world scenario, explained in three words:
FAST                FREQUENT                  FLAWLESS
The quality of an application is determined by its swiftness, its frequency of updates, and its defects.
Automation is the key. Test Automation for process validation and functional testing allows you to run through literally millions of business process steps just like the end users, to validate that no code is broken.
Though test automation may not look rewarding or feasible at first because of the costs involved in setting up the environment and the tools, in the long run its benefits surely surpass the initial hurdles.
It is said that “there’s a method to every madness”.
To maximize the ROI using Test Automation, a specific plan of action is required.
What, then, is the best way of implementing large-scale test automation so as to maximize ROI?
  1. Robust Product Road Map: When organizations begin to automate anything, they should ensure that they have a clear goal set. There should be absolute clarity on the product road map and on the automation checklist. Otherwise, there will be confusion and chaos around the whole process, and the effort will prove to be a large-scale failure. Hence, defining the product roadmap is a must.
  2. Optimized Test Process: To meet a defined roadmap, a clear demarcation of the test process is a must. The test process must be defined keeping in mind the risks, costs, and training involved while implementing test automation.
  3. The Framework: Whether it is a data-driven or a keyword-driven framework, or a hybrid one, defining and selecting a framework will have a definite positive impact on the ROI as it will implement the road map you defined earlier.
  4. Tools to be used: Though testers have a wide variety of choice for testing solutions, wrong decisions regarding the test automation tools can lead to effects that may not be reversible. Therefore, selecting the right tools is a business-critical requirement for ensuring successful test runs. All variables such as integration, installation, cost, maintenance, and compatibility with the testing environment must be considered while selecting a tool. 
  5. Script Management: The scope of script management usually covers well-documented standard processes, logging the errors that are generated, and, most importantly, testing whether the scripts are able to withstand unexpected behaviors.
  6. Manual Effort: Automation is the end result of a lot of manual activity, such as writing scripts and test cases, setting up the machines, selecting the tests, and, once the tests are run, analyzing the results. These cannot be done in a jiffy, and need to be planned for in both cost and effort.
  7. Testing Team: If an organization realizes the significance of the manual effort required, it must apply due diligence when planning the work and effort estimation with regard to the team and the skills required for implementing test automation.
  8. True Measure of Success: Good examples of this definition include measuring benefits such as faster time to market, improved ROI, or a decrease in the number of bugs.
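To make the framework choice in point 3 concrete, here is a minimal sketch of the data-driven idea in Python – the test logic is written once, and each row of test data drives one run. All names and data here are illustrative, not drawn from any specific tool:

```python
# Minimal data-driven sketch: the test logic is written once and each
# row of test data drives one execution of it.

# Illustrative test data: (username, password, expected_result)
LOGIN_CASES = [
    ("alice", "correct-pass", "success"),
    ("alice", "wrong-pass", "failure"),
    ("", "any-pass", "failure"),
]

def login(username, password):
    """Stand-in for the system under test."""
    if username == "alice" and password == "correct-pass":
        return "success"
    return "failure"

def run_data_driven_suite(cases):
    """Run every data row through the same test logic; collect pass/fail."""
    results = []
    for username, password, expected in cases:
        actual = login(username, password)
        results.append((username, actual == expected))
    return results

if __name__ == "__main__":
    for name, passed in run_data_driven_suite(LOGIN_CASES):
        print(name or "<empty user>", "PASS" if passed else "FAIL")
```

Adding a new test case is then a matter of adding a data row, not writing new code, which is what makes data-driven frameworks pay off at scale.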

Scaling up with virtual machines

The automation framework can be run on masses of virtual machines in the cloud to achieve scale. During automated business process validation, each of these machines interacts with the applications, running through all the complex business scenarios with real-time data. This verifies that things are working as expected. The machines are turned up only on demand and decommissioned when the scenarios finish executing.
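The on-demand scale-out described above can be sketched in miniature with a worker pool standing in for cloud virtual machines – workers exist for the run and are released when it finishes. This only illustrates the pattern; it is not a cloud provisioning implementation:

```python
from concurrent.futures import ThreadPoolExecutor

def run_scenario(scenario_id):
    """Stand-in for executing one business-process scenario on a machine."""
    return (scenario_id, "passed")

def run_at_scale(scenario_ids, machines=4):
    # The worker pool is created on demand and torn down when the 'with'
    # block exits, mirroring VMs that are commissioned for the run and
    # decommissioned when the scenarios finish executing.
    with ThreadPoolExecutor(max_workers=machines) as pool:
        return list(pool.map(run_scenario, scenario_ids))

if __name__ == "__main__":
    results = run_at_scale(range(100), machines=8)
    print(sum(1 for _, s in results if s == "passed"), "of", len(results), "passed")
```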
Please share your thoughts on test automation. Subscribe to get notifications on our upcoming test automation blogs.

Sunday, 3 September 2017

Software Testing Services

Indium Software Testing Services
‘Making a Right Choice’
“Global Pure Play Software Testing Services Market 2017-2021”
Technavio’s analysts forecast the global pure play software testing services market to grow at a CAGR of 19.41% during the period 2017-2021, reaching USD 9.08 billion by 2021.
In today’s technology-driven business environment, software apps and products are expected to satisfy critical checkpoints, such as ease of installation, performance, compatibility, recovery after a failure, resource consumption, portability, security, and reliability.
The global customer acceptance and deeper implementations of Cloud, Big Data, Mobile, and Gaming demand immediate attention and a redefined approach to application development. When there is a new upgrade or deployment for existing business needs, IT enterprises should make sure that the implementation is accurate and secure. Nevertheless, it is very difficult to maintain that same level of intensive care for the everyday functioning of a business application.
This has set a higher standard for software testing services companies. They cater to these problems with a holistic testing strategy and framework to test and certify the quality of an application and make sure it is defect free.
The cost of software testing is primarily driven by the factors shown in the image below.
World Quality Report
Cost is almost always the predominant problem that IT decision-makers worry about. Skyrocketing costs are usually the result of fundamental difficulties that fall into several categories, including:
  • Inefficient processes
  • Underproductive people
  • Inadequate tools and technologies
  • Dynamic and unstable business and technological landscape
  • Poor decision-making or implementation errors
These shortcomings result in catastrophic application failures where not only is money misspent, but the reputations of otherwise smart professionals are smudged.
On top of these challenges, senior executives face another big one: gaining access to the right testing tool.
The Right Testing Tool
Along with good testing processes and people, testing tools are an integral part of the trio that boosts the speed and quality of software testing programs. Testing tools allow you to leverage testing servers to create, maintain and run test scripts, both manual and automated. A good software testing tool provides a turn-key solution that allows teams to focus on creating robust and efficient tests as quickly as possible. Such tools provide the extra vigour required to execute successfully. This is where a software testing services company comes into the picture.
These are independent technology companies that take up outsourced software testing projects.
Their primary objective is to serve global technology customers with software testing. These services apply across industry sectors such as retail, e-commerce, banking, gaming, technology, education, manufacturing, life sciences, healthcare and travel/transportation/hospitality.
Software testing services include
  • Test Automation
  • Installation/Configuration Testing
  • Regression Testing
  • Mainframe Testing
  • Continuous Integration Testing
  • Managed Crowd Testing
  • ETL Testing
  • Accessibility Testing
  • Exploratory Testing
  • Internationalization/Localization Testing
  • Automated Test Script Development
  • User Acceptance Testing (UAT)

Benefits of Partnering with Software Testing Services Company

  • Streamlined software testing process
  • QA testers who can optimize the QA system
  • High-quality software projects, mobile-ready apps, and market-ready service deployments with less time to market
  • Cost-benefit awareness of the QA effort, enhancing the economic efficiency of the business
  • Strengthening of the current QA thrust through automation and workflows
  • High ROI on QA investments
  • Comprehensive test coverage
  • Faster test cycles at reduced cost
We will discuss more about software testing services in our upcoming blogs.
Till then, stay tuned!

Monday, 21 August 2017

Mobile Application – Performance Testing & Engineering


Why is performance testing important for mobile apps?

Why is performance testing important and why should you be doing it? Well, very simply, it is the key to user engagement, getting people to use your app and continue using your app.
User engagement/experience is definitely a challenge on mobile. This graph, based on data from Apple and Google, shows that 80% of apps are never used again after the first day they are downloaded and less than 4% are still used a month after they are downloaded. Looking at the graph, it’s evident that keeping users engaged with your mobile app is difficult.
A side point: I have had people say in the past that while they accept this for consumer-facing apps, it’s not relevant to them because they are making employee-facing apps or other B2B-type apps. But the reality is that adoption and engagement are just as much a challenge in employee-facing and B2B apps as they are in consumer-facing ones.
Going a bit further, I would like to illustrate with two data points how performance impacts user engagement and, ultimately, revenue:
  • Amazon has reported that a 100-millisecond increase in response time of their app reduces revenue by 1%.
  • Google reported similar findings: a 100-400 millisecond increase in search response time led to a $90 million reduction in ad revenue.
It’s important to note that there’s nothing unique about Amazon or Google; this impacts everyone. Clearly, performance is critical to users, so we need to be testing it.

Importance of Mobile Apps Performance

As the usage of mobile phones is increasing, the importance of mobile apps performance is also increasing simultaneously.  Mobile phone users are very demanding and always expect mobile apps to perform like their desktop apps.
According to a survey,
  • 71% of users expect a website to open on their mobile as quickly as on their desktop.
  • 5 seconds is considered the acceptable turnaround time for mobile applications and websites.
  • 74% of users leave mobile websites, and 50% exit mobile apps, when response time exceeds 5 seconds.
  • 35% of frustrated users go to competitors’ applications.
Normally, mobile phone users try a troubled application twice, and nearly half of them never return if they still face the same issue(s) on their mobile device. Achieving and maintaining a good response time is more challenging for performance engineers on a mobile application than on a desktop application, due to its complex architecture and the fewer resources it has to work with.
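As a simple illustration of acting on these numbers, the sketch below flags transactions that breach the 5-second threshold the survey cites. The transaction names and timings are made up for the example:

```python
# Flag transactions slower than the 5-second threshold cited above.
THRESHOLD_SECONDS = 5.0

def find_slow_transactions(timings, threshold=THRESHOLD_SECONDS):
    """Return the (name, seconds) pairs that breach the threshold."""
    return [(name, t) for name, t in timings if t > threshold]

if __name__ == "__main__":
    # Illustrative sample measurements, not real data.
    sample = [("login", 1.2), ("search", 6.4), ("checkout", 4.9)]
    for name, t in find_slow_transactions(sample):
        print(f"SLOW: {name} took {t:.1f}s")
```

A check like this can sit at the end of a load-test run and fail the build when any transaction breaches the budget.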

Difference between Traditional Desktop Web Apps & Mobile Apps

Desktop web applications and mobile applications are built on quite similar technologies, so there is no need to select a unique set of performance testing tools for mobile apps.

Top 5 Mobile App performance related issues

Client Side
  • Network (Wi-Fi/2G/3G/4G) connectivity impact
  • CPU usage
  • Memory usage
  • 2D & 3D graphics rendering
  • Battery consumption
Server Side
  • CPU usage
  • Memory usage
  • Cache
  • I/O
  • Bandwidth usage
  • Connections

Top 3 performance issues we have encountered while load testing mobile applications, and their solutions

1. Bandwidth consumption issue
Issue: During mobile app performance test execution, network bandwidth usage was high because page and image sizes were huge.
Solution: Bandwidth consumption was decreased by compressing and reducing page and image resources, making pages load faster.
2. Response time / page load issue
Issue: During mobile app performance test execution, the page load time was high.
Solution: Reduce the number of HTTP requests between the client and the server.
3. Data loss issue
Issue: In one of our microfinance client applications, if the network connection dropped while users were filling in forms and uploading images, all the entered data was lost.
Solution: Create an offline mode/data-save option in the mobile application so that the data is not lost when the network drops, and the activity can resume once the connection is re-established. Travel apps commonly use this kind of approach.
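The offline-save idea for the data loss issue can be sketched as follows – form data is persisted locally before any upload, so a dropped connection loses nothing. This is a minimal illustration in Python; a real mobile app would use its platform’s local storage instead, and all names here are invented:

```python
import json
import os
import tempfile

def save_draft(form_data, path):
    """Persist the partially completed form locally before any upload."""
    with open(path, "w") as f:
        json.dump(form_data, f)

def resume_draft(path):
    """Reload the saved form once the connection is re-established."""
    if not os.path.exists(path):
        return None
    with open(path) as f:
        return json.load(f)

if __name__ == "__main__":
    draft_path = os.path.join(tempfile.gettempdir(), "loan_form_draft.json")
    save_draft({"name": "A. Client", "amount": 5000}, draft_path)
    # ...network drops here; on reconnect the data is still available:
    print(resume_draft(draft_path))
```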

 Conclusion

There is no questioning the exponential growth of mobile application usage. For a better user experience, mobile apps should respond as quickly as the customer expects.
Performance testing of mobile apps is not different from that of traditional web/desktop applications. However, performance engineers need to understand each mobile app’s architecture and details completely (network bandwidth, screen size, processing power, etc.) so that they can improve the user experience and, ultimately, the user base and revenue.
References:
  1. http://www.testplant.com/2015/04/23/mobile-apps-need-performance-testing-too/
  2. http://www.agileload.com/agileload/blog/2013/01/14/mobile-performance-testing-overall-analysis—whitepaper

Wednesday, 16 August 2017

Web Services Testing: An Overview

Web services deliver a standard mode of interoperability between software applications running across platforms and frameworks. They form the basis of connectivity for services that are connected together into a Service Oriented Architecture (SOA).
These services communicate with each other using web services, which enforce a standard way of integrating web-based applications using XML, Simple Object Access Protocol (SOAP), Web Services Description Language (WSDL), and UDDI open standards over an internet protocol. A web service is offered by one electronic device to another, the two communicating via the World Wide Web.
A failed web service creates chaos not only for the managers but also for the administrators responsible for server maintenance, and it discomforts the clients trying to call that web service.
The key elements of web services are a repository, messaging, and service. Since web services are distributed over networks and applications, the testing requirements should also include interfaces. Web services are integrally susceptible to added risks in the areas of integration and interoperability.
A web service can be implemented using any programming language on any platform, on condition that a standard XML interface description called WSDL is available. A standard messaging protocol called SOAP is used alongside it. Web services often run over HTTP but may also use other application-layer transport protocols. Automating web services testing reduces the overall testing effort.

Web Services Testing

Web services provide continuous connections from one software application to another over private intranets and the Internet. Web services testing includes functional and load aspects, to check how a web service performs for a single client and how it scales as the accessing load increases. Testing enables the detection of errors and the evaluation and approval of system qualities at an early stage. An automated test approach in particular helps repeat tests efficiently whenever needed and improves time to market. Test automation is essential to a sound and efficient web services development process, for assessing the functionality, performance, security, scalability, and UI of web services.

Web Services Testing Steps are

Step 1: Generate the client /skeleton code of the web service
Step 2: Define the required test inputs
Step 3: Invoke the web service using the client/skeleton code
Step 4: Capture the web service response
Step 5: Compare the actual response with the expected response
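The steps above can be illustrated without a live endpoint: the sketch below builds a SOAP request for a hypothetical Add operation, “invokes” it against a stubbed service, and verifies the actual response against the expected one. The operation, element names, and stub are all invented for the example:

```python
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"

def build_request(a, b):
    """Step 2: wrap the test inputs in a SOAP envelope."""
    envelope = ET.Element(f"{{{SOAP_NS}}}Envelope")
    body = ET.SubElement(envelope, f"{{{SOAP_NS}}}Body")
    add = ET.SubElement(body, "Add")       # hypothetical operation
    ET.SubElement(add, "a").text = str(a)
    ET.SubElement(add, "b").text = str(b)
    return ET.tostring(envelope)

def stub_service(request_xml):
    """Step 3: stand-in for invoking the real web service."""
    root = ET.fromstring(request_xml)
    a = int(root.find(".//a").text)
    b = int(root.find(".//b").text)
    return f"<AddResult>{a + b}</AddResult>"

def verify(response_xml, expected):
    """Step 5: compare the actual response with the expected response."""
    actual = int(ET.fromstring(response_xml).text)
    return actual == expected

if __name__ == "__main__":
    response = stub_service(build_request(2, 3))
    print("verified:", verify(response, expected=5))
```

In a real test, the stub would be replaced by an HTTP call to the service endpoint described in its WSDL, while steps 2 and 5 stay the same.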

Web Services Testing Tools

There are many web services testing tools available, such as:
  • ManageEngine
  • QEngine
  • SoapUI
  • TestMaker
  • WebInject toolsets
These tools are built on open source tools/libraries/frameworks and hence help reduce overall costs. They help to increase automation efficiency by minimizing initial coding effort. These tools can also help to create and execute the tests quickly.
In this fast-paced world, a faster test cycle at lower cost is crucial to stay competitive, and thus reusable test automation frameworks coupled with open source tools and technologies are a key solution to shrink test cycle times and costs.
There are web services test automation frameworks designed and developed by many testing service providers, and adopting them brings greater effectiveness.
We will discuss more about Web Services Testing in our upcoming blogs.
Subscribe to hear more from our web services testing experts.

Tuesday, 25 July 2017

IoT Testing Complexities





A comprehensive QA strategy is essential to cover the dimensions of IoT testing. The strategy should include the types of testing, test lab setup, testing tools and simulators/emulators that are to be deployed. Considering the practical hiccups in generating big data from the thing (device) in a testing environment, it is crucial to evaluate data simulation and virtualization methods.
Stubs can be considered an option during the early stages, whereas data recorders can serve as an alternative in the later stages. Beyond test planning and data simulation, metrics-driven, exhaustive test execution is performed to achieve a stable system. The QA team can split the IoT test area into two layers: the device interaction layer and the user interaction layer. QA has to be performed across both layers, and it is easier to identify the techniques and types of testing that can be adapted to each layer to enhance the QA strategy.

The device interaction layer

This layer covers the connectivity between the software and hardware components of a real-time IoT environment. A typical example is a Bluetooth device transmitting real-time data to a mobile device application. Much of the interaction testing here is done on the functional side of QA.
However, other types of testing could also be required. IoT testing will cover the spectrum of other required elements listed below, in addition to typical software testing:
Conformance with standards: These are mostly device performance traits that are specific to the devices and sensors. These attributes must be validated against the standards of the device and its communications protocol. Hardware vendors perform most of these tests, but there could be certain domain- or use-case-specific requirements, such as the use of such devices in an environment that was not tested.
Interoperability: The ability of different devices to support the required functionality among themselves, other external devices and implementations.
Security: With billions of sensors in the making, it’s crucial to tackle data privacy and the security concerns across the IoT ecosystem.
The following are the different types of security testing requirements:
  • Identity and authentication
  • Data protection
  • Data encryption
  • Data storage security in local and remote clouds

The user interaction layer:

This layer is the touch point between the thing (IoT Device) and the user. The success of the overall system depends on the seamless user experience.

Key testing areas in this layer include:

Network capability and device level tests: The specific aspects of network communication such as connectivity are validated by simulating different network modes in addition to device-level validation such as energy consumption tests, etc.
Usability and user experience: These are important in terms of real-time usability; this involves both human and machine interaction, as well as the real-time experience that the IoT system provides.
For example, contactless payments compared with physical card-based payments.
The IoT services and back-end IoT environment:
While integration testing of the interfaces is a key, there is a complex data layer that comes into play.
For example, a classic IoT system embeds a complex analytical engine to ensure an exceptional user experience. This calls for a QA environment that supports validation of such interfaces by addressing the growing data volume, velocity, and variety challenges. Front-end validation can be done by assembling data recorders and simulators. The service and data layer validations will involve complex simulation, such as generating millions of sensor hits, machine learning algorithms, and time-boxed traffic. There are a few methods to create such an ecosystem; for example, leveraging sandboxes of development services or creating mock environments using virtualization tools. However, numerous implementation synergies are required to establish a working set of environments for thorough services and back-end validation.
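Generating sensor traffic for such a mock environment can start as simply as the sketch below, which produces a reproducible stream of synthetic readings. The field names and value ranges are illustrative:

```python
import random

def simulate_sensor_hits(count, seed=42):
    """Yield `count` synthetic temperature readings from virtual sensors."""
    rng = random.Random(seed)  # seeded so the generated data is reproducible
    for i in range(count):
        yield {
            "sensor_id": f"sensor-{rng.randint(1, 100):03d}",
            "temperature_c": round(rng.uniform(-10.0, 45.0), 2),
            "sequence": i,
        }

if __name__ == "__main__":
    hits = list(simulate_sensor_hits(1000))
    print(len(hits), "readings generated; first:", hits[0])
```

Scaling the count up and streaming the readings at a controlled rate is what turns a generator like this into a load source for the data layer.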
Let’s discuss more about IoT testing in our upcoming blogs.
Please comment with the IoT topics that interest you.

Sunday, 23 July 2017

A day in the life of a Game Tester!

To give an idea of what a game tester’s day looks like, we talked to a tester who goes by the gamer tag ‘Dalda’.
Is game testing a job or a career?
While a game testing career starts out as a lower-paying job, there is high potential to grow in the industry very quickly. For those who have the urge and commitment, and who wish to make a name for themselves in the gaming industry, game testing provides a big platform to grow. If one stays focused, keeps learning, and stays current, he or she can reach a managerial role within ten years.
How did you get started?
I have found myself playing a lot of video games for as far back as I can remember. From my college days I was interested in computers, and my long-standing interest in video games led me to conclude that I wanted to be part of the gaming industry.
The first job I got was at a company where one of the executives was a friend’s uncle. Knowing the bunch of us and our addiction to gaming, he offered us a job.
What do you do all day?
Playing games was something I knew how to do. However, to learn how to be a tester, I had to unlearn what I knew: killing bad guys, smashing stuff, and leveling up faster. That does not help a tester test a game effectively. On a regular day my schedule was pretty tight. I had to do all sorts of background work like anyone does at their job, like keeping track of what I did.
What skills do you think are most useful for a QA tester?
Attention to detail, and understanding the game. Learning the game properly and in detail helps you understand patterns within the game, which helps you break the game and find more bugs.
What’s the pay like?
Unlike software engineering, game testing is not a highly paid job at entry level. But a game tester with 2 years of exposure can have more knowledge than a software tester with 5 years of experience; the learning curve is much steeper. All of that helps one grow quickly into bigger roles with better pay.
Are there any games you dread testing?
Definitely. Testing a game in a foreign language, without any support documentation or without anyone to explain the game, makes your job a lot harder.
Do you have a say in what games you test?
No. You go where you are asked to go and are expected to work on anything that is sent your way.
What’s the best and worst part of being a QA tester?
Best part is that you get to learn so much. You see a game when it is just in the developmental stages. You get to track the whole game through its various phases. You get to lend a hand in shaping the game and making it better for everyone to enjoy. You get to see the game through to the end. And at the end of it all, you are happy because you know you helped make the game better and everyone is enjoying the game!
Worst part would be that, after working a whole day on a game, I ended up dreaming I was a character stuck in the game, unable to clear the level to get out!
Do you play the games you test in your down-time?
Yes, occasionally. One of the games I tested – Section 8, my colleagues and I ended up staying late for a few hours past the end of day just to enjoy the game. Some more games in that list are CS: Condition Zero, Unreal Tournament.
Has the job affected your enjoyment of gaming?
After a small initial stage of not wanting to look at a video-game after working for a whole day on video-games, I have gone back to finding and playing new games in my free time and it is still a lot of fun.
What’s your favorite game and why?
Last of Us! No game has had me being so mentally involved with the protagonist.
Favorite movie based off a video game?
Halo Legends and the Resident Evil series. These movies stayed close to the video game storyline, while most other movie versions of games neither portrayed the storyline nor conveyed the experience of playing the game.

Tuesday, 4 July 2017

Learning Management System Compliance




A very common scenario that learning management system (LMS) vendors experience: a potential client arrives at their exhibition booth and asks, “Does your LMS comply with regulatory standards?”
The answer is not very simple.
The worry with compliance is that it is driven by laws and regulations that require specific training. Companies want to avoid liabilities, implement business objectives, and reduce paperwork.
To understand the topic thoroughly, we have to distinguish between the ability of an LMS to track compliance in a number of areas, and the LMS itself, as a piece of software, being compliant with certain standards and regulations. It is crucial to understand that laws and regulations are specific to individual countries or states, and will vary widely from one jurisdiction to another.
The below definitions and examples are drawn from the United States and Canada but there will be similar issues in the regulation of other countries as well.
Below are the main compliance issues that a learning management system vendor needs to consider:
Accessibility standards – In the United States, accessibility regulations come under the Americans with Disabilities Act (ADA) and Section 508 (29 U.S.C. ‘794d) of the 1998 Rehabilitation Act. These standards follow the World Wide Web Consortium’s (W3C) Web Content Accessibility Guidelines 1.0. An LMS adhering to these standards will help to increase the user base.
Regulatory compliance tracking – LMSs are basically large databases and are often asked to track regulatory compliance with specific government regulations.
Example: In the United States, these include the Health Insurance Portability and Accountability Act (HIPAA) and the Occupational Safety and Health Administration (OSHA) regulations.
In Canada, the requirements of training for the Workplace Hazardous Materials Information System (WHIMIS) are at times tracked by an LMS.
Security standards – Many organizations require secured systems to store personal information. This is true in the medical field but can apply in other areas as well.
These requirements will include
  • maintenance of audit trails,
  • deployment of completely closed systems,
  • usage of digital signatures, such as the US Federal Drug Administration regulation FDA 21 CFR Part 11 for the medical, biotech, and pharmaceutical industries,
  • use of high degrees of encryption for groups such as the banking industry.
Interoperability standards – One of the most basic demands of LMS users is the interoperability, within the same LMS, of courses developed by different developers. This led to one of the first sets of LMS standards, those of the Aircraft Industry CBT Committee (AICC). In 1997, the IMS Global Learning Consortium (IMS GLC), a non-profit organization keen on setting specifications and standards for the learning industry, was formed; it has issued many sets of specifications since. The same year saw the announcement of the Advanced Distributed Learning (ADL) initiative of the US Department of Defense. ADL developed the Shareable Content Object Reference Model (SCORM) and the ADL Registry of SCORM-compliant software. LMS vendors are well aware of both AICC and SCORM, and are generally compliant with both standards.
However, most LMS implementations don’t work well with these standards and often require a period of adjustment and tuning to make them work seamlessly.
Tracking training for certification – There are many regulatory compliance requirements for training in specific industries, where employees must be certified before being employed and recertified on a regular basis. Many LMSs track certification and will trigger automatic alerts as the date for recertification approaches. This need can be driven by legislation or by standards imposed by a specific industry or company.
Tracking training for liability reduction – Training can help reduce liability for employers in controversial areas. Courses or educational materials on topics such as sexual harassment or employment discrimination might be much less expensive to provide than settling lawsuits in case of disputes.
Reduction in paperwork – Finally, a fully functional LMS will have features that reduce the workload and paperwork of compliance management. Such features include auto-enrolment in compliance training based on job role, automated alerts to managers and workers on failure to complete compliance training, assessment and evaluation of the training, certificate issue on training completion, and many other configurations for tracking and reporting in this area.
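As an illustration of the automated-alert feature, the sketch below flags workers whose certification expires within a configurable window. The records and the window size are invented for the example:

```python
from datetime import date, timedelta

ALERT_WINDOW_DAYS = 30  # illustrative alert window, not a regulatory value

def due_for_recertification(records, today, window_days=ALERT_WINDOW_DAYS):
    """Return the names whose certification expires within the alert window."""
    cutoff = today + timedelta(days=window_days)
    return [r["name"] for r in records if r["expires"] <= cutoff]

if __name__ == "__main__":
    records = [
        {"name": "Worker A", "expires": date(2017, 8, 1)},
        {"name": "Worker B", "expires": date(2018, 1, 15)},
    ]
    print(due_for_recertification(records, today=date(2017, 7, 10)))
```

A real LMS would run a query like this on a schedule and route the results to managers as notifications.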
So the answers to questions about compliance and learning management systems are not simple; they are multidimensional.
Hopefully, this article will help you to sort out what you need in this area.