I wrote this piece yesterday in response to my resume being rejected by a hiring manager. According to my recruiter, she said she “didn’t see much experience on my resume on my E2E testing”. I wasn’t sure what she meant by that; not everything can be ascertained from just a resume. My recruiter wanted me to write a paragraph about “end to end testing” in response to the hiring manager. I gladly obliged and wrote this piece, then realized it was my entire approach to E2E testing.
To prepare for E2E testing we gather as much information about the feature under test as possible by sitting down with the product owner and tech lead. We go through the epic in Jira and look at the tickets in the epic regarding the feature. E2E tests are conducted to ensure everything in the new feature works as per requirements. “The E2E test goes through every operation the application can perform. We test how the application communicates with hardware, network connectivity, external dependencies, databases, and other applications.” [BrowserStack]
We gather URLs for all our brands (national and international) and the data the feature requires, a list of embedded features that need to be tested, the environment, the mobile devices along with the device and browser versions that will be supported, and the feature flags that need to be tested. We also cover API testing using Swagger and database testing with DynamoDB.
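As a rough illustration, the test matrix we assemble from all that gathered information can be sketched as a small script. The brand URLs, browser versions, and feature flags below are placeholders, not our real ones:

```python
from itertools import product

# Hypothetical placeholders -- the real brand URLs, supported browsers,
# and feature flags come from the Jira epic and the device support list.
brands = ["https://brand-a.example.com", "https://brand-b.example.co.uk"]
browsers = ["chrome-120", "safari-17", "firefox-121"]
feature_flags = ["new-ad-roll-on", "new-ad-roll-off"]

# Every combination becomes one row in the E2E test matrix.
matrix = [
    {"url": url, "browser": browser, "flag": flag}
    for url, browser, flag in product(brands, browsers, feature_flags)
]

print(len(matrix))  # 2 brands x 3 browsers x 2 flags = 12 combinations
```

In practice the matrix gets trimmed to the combinations the product owner and tech lead agree matter, but enumerating it first makes it hard to miss a brand or browser.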
We know our critical feature is ad rolls, because they bring money to the company. We have to make sure ad rolls are autoplaying and working as they should, and that all the ads on the page are displaying correctly.
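A minimal sketch of the kind of ad-slot check this implies, assuming the slot data has already been scraped from the page with a browser automation tool (the slot fields and ids here are hypothetical):

```python
def failing_ad_slots(slots):
    """Return the ids of ad slots that are not autoplaying or not visible."""
    return [
        slot["id"]
        for slot in slots
        if not (slot.get("autoplay") and slot.get("visible"))
    ]

# Example run: two healthy slots and one broken ad roll.
slots = [
    {"id": "top-banner", "autoplay": True, "visible": True},
    {"id": "ad-roll-1", "autoplay": False, "visible": True},  # not autoplaying
    {"id": "sidebar", "autoplay": True, "visible": True},
]
print(failing_ad_slots(slots))  # ['ad-roll-1']
```

Anything the check returns would be logged as a defect against the revenue-critical ad path before anything else.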
We also get information on other teams that might be affected by these changes, and decide what the focus should be if another team is affected by the feature under test. We take everything and create an Excel sheet on Google Drive, then send it to the product owner, tech leads, and CTO so they can look at the testing activities conducted by the QA team. As we test, we update the sheet live with screenshots and a comprehensive list of the defects we find. The session usually runs for an hour or two depending on the size of the feature.
After the session is over, the product owner, tech lead, and QA lead sit down to go over the defects and prioritize them according to severity. Once the bugs get fixed we have another E2E session and retest everything as per the test plan to make sure it all works. If everything goes well during the retest, we sign off collectively to launch in production. Once it goes live, we each do a quick smoke test: ads autoplaying, ads displaying correctly, creating an article. Just so that everything is working and there are no surprises. If something is wrong, we tackle it and notify the product owner and tech lead right away.
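The triage step above can be sketched as a simple sort by severity, so the worst defects get fixed and retested first. The severity levels and ticket ids are illustrative, not our real scheme:

```python
# Hypothetical severity scale; lower number = fixed first.
SEVERITY_ORDER = {"critical": 0, "major": 1, "minor": 2, "trivial": 3}

def prioritize(defects):
    """Sort defects so the most severe are addressed (and retested) first."""
    return sorted(defects, key=lambda d: SEVERITY_ORDER[d["severity"]])

defects = [
    {"id": "QA-103", "severity": "minor"},
    {"id": "QA-101", "severity": "critical"},  # e.g. ad rolls not autoplaying
    {"id": "QA-102", "severity": "major"},
]
print([d["id"] for d in prioritize(defects)])  # ['QA-101', 'QA-102', 'QA-103']
```

The real prioritization is a conversation between the product owner, tech lead, and QA lead, but the output is effectively this ordering.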
NOTE: I found some great info about E2E testing. I will post it here when I get the chance.