There is no such statement as 'I am now prepared for the interview'. When facing a Testing interview, no matter how many interview questions and answers you have gone through, there is always more to read 🙂 Continuing our Interview series, let's look at some more interesting FAQs…
Checkpoints, as the name suggests, are validation points that compare the current value of specified properties of an object (or the current state of an object) with the expected value. A checkpoint can be inserted at any point in the script. If the current and expected values match, it generates a PASS status; otherwise, FAIL.
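The checkpoint idea above can be sketched in a few lines of plain Python. This is a minimal, tool-agnostic sketch: the `checkpoint` function and the sample button properties are illustrative, not from any specific automation tool.

```python
# Minimal checkpoint sketch: compare an object's current property value
# against the expected value and report PASS/FAIL. Names are illustrative.

def checkpoint(name, current, expected):
    """Compare current vs expected and return a PASS/FAIL status."""
    status = "PASS" if current == expected else "FAIL"
    print(f"Checkpoint '{name}': expected={expected!r}, current={current!r} -> {status}")
    return status

# Example: validating a button's label and enabled state mid-script
button = {"label": "Submit", "enabled": True}
checkpoint("button label", button["label"], "Submit")   # PASS
checkpoint("button enabled", button["enabled"], False)  # FAIL
```

In a real tool, the "current value" would be read from the application under test at runtime; here it is hard-coded to keep the sketch self-contained.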
Business Intelligence is the process of collecting raw business data and turning it into information that is useful and meaningful; it is generally used for Reporting, Analysis, Data mining, Data quality and Interpretation, Predictive Analysis, etc.
An ETL tool is used to extract data from different data sources, transform the data, and load it into a Data Warehouse system, whereas a BI tool is used to generate interactive and ad-hoc reports, dashboards and data visualizations for further decision making.
- Development Environment (Dev): where actual development (including defect fixes) happens.
- System Test Environment (ST): Also known as QA environment – where testing happens.
- Performance Test Environment: for performance testing.
- Acceptance Test Environment (UAT): After successful System testing, the build is deployed in UAT environment for User Acceptance tests (performed by Business team).
- Production Environment (Live): Also known as 'Live environment' – this is where the end-user accesses the application. Real-world example – we (end-users) access our bank's net-banking application in the 'production environment' to initiate transfers.
Depending on the criticality and the type of application-under-test (AUT), there might be a few variations, such as a staging environment (migrate/test on real production data), a pre-production environment (where you can test user profiles, security, disaster recovery and back-up) and training environments (where the client wants the organization's user training to happen before the actual go-live).
- Stubs/Drivers: The biggest of all is the lack of interfaces for services. A service connects two components, e.g. a website and a database. Most of the time, while testing a web service, one of the components will be unavailable (under development, third-party, not tested, etc.). Stubs and drivers are used to imitate the missing interfaces.
- Before progressing, the complete architecture of the application needs to be understood, even by the Test team
- Since business needs and technical solutions are closely aligned in an SOA application, test design needs to be based on both business and technical analysis
- Test Data: Testing process spans across multiple systems thus creating complex data needs
- Distributed: Service-Oriented Architecture is a collection of heterogeneous technologies, i.e. SOA Testing requires people with different skill sets. Additionally, it might involve different third-parties developing and testing different components.
- Since the application is an integration of multiple services, Security testing (authentication and authorization) is quite difficult.
- Since an individual service might be used by different applications, Performance testing should be done for fine tuning and optimum performance.
- All levels of testing should be covered for a successful delivery – Service, Process & End-to-End
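The stubs/drivers point above is easiest to see in code. Below is a hedged sketch in which a stub stands in for an unavailable payment service so the component that depends on it can still be tested; the class and method names are hypothetical, not from any real framework.

```python
# A stub imitating an unavailable service's interface with canned responses,
# so the dependent component can be tested in isolation. Names are hypothetical.

class PaymentServiceStub:
    """Imitates the real payment service's request-response interface."""
    def charge(self, account_id, amount):
        # The real service would call a third-party gateway; the stub
        # returns a fixed, predictable response instead.
        return {"account_id": account_id, "amount": amount, "status": "approved"}

class CheckoutComponent:
    """Component under test; depends only on the payment service interface."""
    def __init__(self, payment_service):
        self.payment_service = payment_service

    def place_order(self, account_id, total):
        response = self.payment_service.charge(account_id, total)
        return response["status"] == "approved"

# Driver code exercising the component through the stub
checkout = CheckoutComponent(PaymentServiceStub())
assert checkout.place_order("ACC-1", 49.99) is True
```

The same pattern works in reverse: a driver exercises a finished lower-level service when the calling component is the piece that does not exist yet.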
In simple terms – Testing of the Extract, Transform and Load functions before data is actually moved into a production Data Warehouse system. It is sometimes also known as Table Balancing or production reconciliation.
Data is important for businesses to make critical business decisions. ETL Data Warehouse testing is a data-centric testing process whose main objective is to identify and mitigate data defects and general errors before the data is processed for analytical reporting. It plays a significant role in validating and ensuring that the business information is accurate, consistent and reliable.
Note: Common ETL Data Warehouse testing tools include QuerySurge, Informatica, etc.
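A basic "table balancing" check can be sketched in plain Python. The example below uses `sqlite3` as a stand-in for both the source system and the target warehouse, and the table/column names are illustrative; a real ETL test would run the same reconciliation queries against the actual source and warehouse databases.

```python
# Hedged sketch of an ETL reconciliation ("table balancing") check:
# compare row counts and a column aggregate between source and target.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE src_orders (id INTEGER, amount REAL)")  # source system
cur.execute("CREATE TABLE dw_orders (id INTEGER, amount REAL)")   # warehouse target
cur.executemany("INSERT INTO src_orders VALUES (?, ?)", [(1, 10.0), (2, 25.5)])
cur.executemany("INSERT INTO dw_orders VALUES (?, ?)", [(1, 10.0), (2, 25.5)])

src_count, src_sum = cur.execute("SELECT COUNT(*), SUM(amount) FROM src_orders").fetchone()
dw_count, dw_sum = cur.execute("SELECT COUNT(*), SUM(amount) FROM dw_orders").fetchone()

# Reconciliation: counts and totals must balance after the load
assert src_count == dw_count, "row counts do not balance"
assert src_sum == dw_sum, "amount totals do not balance"
print("Source and warehouse tables balance")
```

Tools like QuerySurge automate exactly this kind of source-vs-target comparison at high volume.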
Testing of the applications built using Service-Oriented Architecture, i.e. different loosely coupled services. While testing a SOA based application – what should be the main levels or focus areas of testing? Let’s discuss…
- Service: First & foremost – the services 🙂 Test whether the deployed service satisfies the business function it is designed for. Each and every service needs to be tested independently first, i.e. request-response.
- Process: Secondly, the integration – whether the integrated application (front-end > service(s) > back-end) works as expected.
- End-to-End: Last but not least – the user interface, the front-end. After all, that's the first thing the user sees 🙂
Though there are many tools available in the market for SOA testing, SoapUI stands out as the most popular of all – perhaps because it is open source 🙂 Developed by SmartBear, SoapUI is an open-source desktop tool and by far the most popular for functional testing of services and APIs, supporting multiple protocols – SOAP, REST, HTTP, JMS, AMF & JDBC.
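Service-level (request-response) testing can be sketched without any tool at all. In the hedged example below, a local HTTP server stands in for the deployed service so the sketch is self-contained; in practice, a tool like SoapUI would send the request to the real endpoint. The path and payload are illustrative.

```python
# Hedged sketch of a service-level request-response test in plain Python.
# A local HTTP server plays the role of the deployed service.
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class FakeService(BaseHTTPRequestHandler):
    def do_GET(self):
        body = json.dumps({"status": "ok", "service": "orders"}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)
    def log_message(self, *args):  # silence per-request logging
        pass

server = HTTPServer(("127.0.0.1", 0), FakeService)  # port 0 = any free port
threading.Thread(target=server.serve_forever, daemon=True).start()

# The actual service test: send a request, validate status code and body
url = f"http://127.0.0.1:{server.server_port}/orders/health"
with urllib.request.urlopen(url) as resp:
    assert resp.status == 200
    payload = json.loads(resp.read())
assert payload["status"] == "ok"
server.shutdown()
print("Service request-response test passed")
```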
Though both are related to data, ETL and database testing are two different concepts focused in two different directions. ETL Data Warehouse testing is not about database interactions – store, modify and retrieve. The main difference: ETL Data Warehouse testing is normally performed on high-volume data involving heterogeneous systems and a Data Warehouse (extract-transform-load), whereas database testing is commonly performed on small-scale data involving a homogeneous transactional system (CRUD – create-read-update-delete operations to/from a single database).
Differences between an OLTP and OLAP system: OLTP stands for Online Transactional Processing system which is commonly a relational database and is used to manage day-to-day transactions. OLAP stands for Online Analytical Processing system which is commonly a multidimensional system and is also called a Data warehouse.
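The contrast can be made concrete with two queries. This illustrative sketch uses `sqlite3` for both sides purely for self-containment; in reality OLTP and OLAP workloads run on separate systems, and the table is hypothetical.

```python
# Illustrative contrast: an OLTP-style operation (individual transactions)
# vs an OLAP-style operation (an aggregate over history for analysis).
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE sales (region TEXT, amount REAL)")

# OLTP: record day-to-day transactions, one row at a time
cur.executemany("INSERT INTO sales VALUES (?, ?)",
                [("east", 100.0), ("east", 50.0), ("west", 75.0)])
conn.commit()

# OLAP: analytical query summarizing many rows for decision making
rows = cur.execute(
    "SELECT region, SUM(amount) FROM sales GROUP BY region ORDER BY region"
).fetchall()
print(rows)  # [('east', 150.0), ('west', 75.0)]
```

A real warehouse would additionally organize such data into multidimensional cubes (region x time x product), which is what "multidimensional system" refers to above.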
When analyzing security of a mobile app, testers perform three major checks: client-side, server-side and the protocols by which data is transferred between them.
Mobile Data Encryption at Rest
- Insecure Data Storage: Trace all the routes of data out of an app and flag code elements that can lead to compromised data (e.g. saving unencrypted data, use of cloud storage)
- Improper Session Handling: Check the use of the Universally Unique Identifier (UUID) and verify it is not being used for session management
- Malicious code injection, such as requests or queries that can trip up the app and cause it to divulge otherwise protected information
Mobile Data Encryption in Transit
- Insufficient Transport Layer Protection: Verify that appropriate Secure Sockets Layer (SSL) or Transport Layer Security (TLS) capabilities are being employed for data in transit
- Client-Side Injection: Trace all the routes of data into the application and verify that input validation is being performed to counter core injection attacks (e.g. SQL injection)
- Poor Authentication and Authorization: Identify areas where the user is challenged, and trace where IDs and passwords enter and exit the application
- Login-related weaknesses, such as being able to bypass the login prompt to perform functions like interacting with external Web applications and services
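The client-side injection check above comes down to whether user input is treated as data or as code. This hedged sketch shows the same lookup done unsafely (string concatenation) and safely (a parameterized query); the table and credentials are illustrative.

```python
# SQL injection sketch: unsafe string concatenation vs a parameterized query.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE users (name TEXT, secret TEXT)")
cur.execute("INSERT INTO users VALUES ('alice', 's3cret')")

malicious = "nobody' OR '1'='1"

# Vulnerable: attacker input is concatenated into the SQL text, so the
# OR clause becomes part of the query and the row leaks
leaked = cur.execute(
    f"SELECT name FROM users WHERE name = '{malicious}'"
).fetchall()
print("vulnerable query returned:", leaked)   # [('alice',)]

# Safe: the parameterized query treats the input as data, not SQL
safe = cur.execute(
    "SELECT name FROM users WHERE name = ?", (malicious,)
).fetchall()
print("parameterized query returned:", safe)  # []
```

A tester tracing input routes would flag any query built by concatenation, on the client or the back-end, as a candidate for exactly this failure.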
Mobile Application Back-ends
- Weak Server Side Controls: Scan the back-end API calls
- Side Channel Data Leakage: Uncover data leaking to various data sinks such as the clipboard, log files, etc.
- Sensitive Information Disclosure: Similar to side channel data leakage – verify that specific data elements are not leaving the app (e.g. to the network, via notifications, to peripherals, etc.)
Waterfall methodology is a sequential model divided into different phases wherein each phase is designed for performing specific activity. The output of one phase becomes input for the next phase, i.e. it is mandatory for a phase to be completed before the next phase starts. Progress is seen as flowing steadily downwards (like a waterfall) through the phases.
Waterfall Methodology Phases
- Requirements gathering and Analysis: the requirements are defined, discussed, analyzed, clarified and documented. Result – Requirement Specification document.
- Design: With requirements as input, specify hardware and software requirements and define the overall system architecture
- Implementation (Code and Unit Test): With inputs from system design, develop-unit test-integrate the software.
- Testing: Testing of individual components, integration and complete system for the systematic discovery and debugging of defects.
- Deployment: After test-fix-retest cycles, the software/application is deployed to the production environment for use by end-users
- Maintenance: Support and maintenance to fix production issues (if any), provide user training, and release minor enhancements or defect patches.