AI-Enabled Metrics for Release Decision
https://www.indiumsoftware.com/blog/ai-enabled-metrics-for-release-decision/

Developments in artificial intelligence (AI) can enable faster, well-informed strategic decision-making by assessing data, recognizing patterns and variables in complex circumstances, and recommending optimal solutions. The purpose of AI in decision-making is not complete automation; rather, the goal is to help us make quicker and better decisions through streamlined processes and effective use of data.

In a QA cycle, we capture various metrics to gauge the testing we have done against baseline values drawn from industry standards. In this article, we use an AI model to make the release sign-off decision, calculated from automated metrics.

AI-Enabled Model

AI-based release decision, often referred to as AI model deployment or rollout, involves determining when and under what conditions an AI system should be put into production or made available to end-users. Here are some key considerations for making AI-based release decisions:

Model Evaluation: Before making a release decision, it’s essential to thoroughly evaluate the AI model’s performance using appropriate metrics. This evaluation should include various aspects, such as accuracy, precision, and any other relevant performance indicators. The model should meet predefined quality and accuracy standards.
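
As an illustration, such an evaluation gate can be expressed in a few lines of Python. The metric set and threshold values below are assumptions for the sketch, not criteria from our model:

```python
# A minimal sketch of a pre-release model evaluation gate.
# The thresholds are illustrative assumptions, not actual release criteria.
from sklearn.metrics import accuracy_score, precision_score, recall_score

# Hypothetical quality standards a model must meet before release.
THRESHOLDS = {"accuracy": 0.95, "precision": 0.90, "recall": 0.90}

def evaluate_model(y_true, y_pred):
    """Compute evaluation metrics and check them against release thresholds."""
    metrics = {
        "accuracy": accuracy_score(y_true, y_pred),
        "precision": precision_score(y_true, y_pred),
        "recall": recall_score(y_true, y_pred),
    }
    # Any metric below its threshold blocks the release.
    failures = {k: v for k, v in metrics.items() if v < THRESHOLDS[k]}
    return metrics, failures

metrics, failures = evaluate_model([1, 0, 1, 1], [1, 0, 1, 0])
print(metrics, "PASS" if not failures else f"FAIL: {failures}")
```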

Here is the AI model designed…

Based on the model above, the most important decisions are derived; these are described below.

Release Tollgate Decision

This decision applies the production-readiness criteria to determine whether or not to sign off for production, based on the captured values.
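
A minimal sketch of what this tollgate logic might look like; the criteria names and limits are hypothetical placeholders, not the actual sign-off rules:

```python
# A minimal sketch of a release tollgate: sign off only when every
# production-readiness criterion holds. Criteria and limits are hypothetical.
def release_tollgate(quality_quotient, open_critical_defects, test_pass_rate):
    ready = (
        quality_quotient >= 90.0        # percentage from the Quality Quotient
        and open_critical_defects == 0  # no unresolved critical defects
        and test_pass_rate >= 98.0      # percentage of executed tests passing
    )
    return "GO" if ready else "NO-GO"

print(release_tollgate(quality_quotient=94.5,
                       open_critical_defects=0,
                       test_pass_rate=99.1))  # GO
```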

Quality Quotient

The Quality Quotient is a percentage derived from established metrics used for assessing and improving software quality. The following parameters are captured, and the Quality Quotient is determined with a predefined formula. The decision is based on where the result falls within the range of 0% to 98%.
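
The exact formula is predefined internally; the sketch below shows the general shape as a weighted average, with the metrics and weights chosen purely for illustration:

```python
# A hedged sketch of a Quality Quotient as a weighted average of normalized
# metric scores (each in 0..1). The metrics and weights are assumptions;
# the actual formula is predefined internally.
WEIGHTS = {
    "test_pass_rate": 0.4,
    "requirement_coverage": 0.3,
    "defect_closure_rate": 0.2,
    "automation_coverage": 0.1,
}

def quality_quotient(scores):
    """Combine per-metric scores (0..1) into a single percentage."""
    qq = 100 * sum(WEIGHTS[m] * scores[m] for m in WEIGHTS)
    return min(qq, 98.0)  # capped to the stated 0%..98% range

print(quality_quotient({
    "test_pass_rate": 0.99,
    "requirement_coverage": 0.95,
    "defect_closure_rate": 0.90,
    "automation_coverage": 0.80,
}))  # 94.1
```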

Testing & Validation

Extensive testing is necessary to identify and address potential issues, including edge cases that the AI model might encounter. Testing should cover a wide range of inputs to ensure the system’s robustness. Validation involves verifying that the AI model’s performance aligns with business objectives and requirements to contribute to the desired goals.
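
A minimal robustness check might look like the following, where the model and its edge-case inputs are stand-ins for illustration:

```python
# A minimal robustness check: run a (hypothetical) model over a wide range of
# inputs, including edge cases, and assert the output is always well-formed.
import math

EDGE_CASES = [0.0, -1.0, 1e9, -1e9, float("nan"), float("inf")]

def predict(x):
    """Stand-in for the real model; returns a probability-like score."""
    if not math.isfinite(x):
        return 0.5  # defined fallback for non-finite input
    return 1 / (1 + math.exp(-max(min(x, 50), -50)))  # clamped logistic

for x in EDGE_CASES:
    y = predict(x)
    assert 0.0 <= y <= 1.0, f"out-of-range output {y} for input {x}"
print("all edge cases produced valid outputs")
```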

Use Cases

This model was evaluated on two projects. The first is in the social media domain, with weekly pushes to production. The model captures the status of tests and defects through tools like JIRA and qTest, and the captured data feeds a dynamic dashboard with built-in formulas that calculate the metrics needed for sign-off.
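
As a sketch of how this data capture might look, the snippet below pulls an issue count from JIRA's search REST API; the instance URL, JQL filter, and credentials are placeholders:

```python
# A hedged sketch of pulling defect counts from JIRA's search REST API to
# feed a metrics dashboard. URL, JQL, and credentials are placeholders.
import requests

JIRA_URL = "https://example.atlassian.net/rest/api/2/search"
AUTH = ("user@example.com", "api-token")  # placeholder credentials

def count_issues(jql):
    """Return the number of issues matching a JQL query."""
    resp = requests.get(JIRA_URL, params={"jql": jql, "maxResults": 0}, auth=AUTH)
    resp.raise_for_status()
    return resp.json()["total"]

open_defects = count_issues('project = APP AND type = Bug AND status != Done')
print(f"open defects: {open_defects}")
```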

The results proved highly useful in making the release decision. Feedback mechanisms helped the model evolve, and we are recommending it to the customer.

The second is a financial-domain project with fortnightly releases. Here, the model produced indicative results for making the release decision.

Release decisions should be data-driven and grounded in a well-defined process that considers the AI system’s technical and business aspects. It’s crucial to strike a balance between delivering AI solutions swiftly and ensuring they adhere to quality, ethical, and security standards. Regularly reviewing and updating the release criteria is essential as the AI system evolves and new information emerges.

Mendix Application Development – Continuous Refactoring
https://www.indiumsoftware.com/blog/mendix-application-development-continuous-refactoring/

While designing an application, we need to consider its multiple aspects to reduce the effort spent maintaining it and to make changes easier to adopt during development. If refactoring pain points are considered at design time, the same refactoring will be easier to do in the future, and rework can be avoided.

To ensure this, we should create small modules in the app, each dedicated to a specific piece of functionality. All related entities should reside in their respective modules so that access control and security configuration remain easy. Refactoring will be required throughout development, for reasons such as requirement changes or growing complexity in specific features; if we do not refactor the code, the app will be difficult to maintain in the future.

The two major areas of refactoring in Mendix are the domain model and microflows.

Domain Model Refactoring

In traditional development we call this normalization: the process of organizing data in a relational database to reduce redundancy, improve data integrity, and minimize the risk of data inconsistencies. It involves breaking down larger tables into smaller, more manageable ones that are linked through associations and constraints. The goal of normalization is to optimize data storage and retrieval performance and improve the overall efficiency and scalability of the database.

Similarly, normalizing a domain model in Mendix involves breaking down large or complex entities into smaller, more manageable ones that are better optimized for data storage and retrieval. The goal is the same: to minimize redundancy, improve data integrity, and reduce the risk of data inconsistencies.
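
Mendix domain models are built visually rather than in code, but the idea maps directly; the following plain-Python sketch uses invented Order and Customer entities to show the before and after:

```python
# Illustration only: Mendix domain models are visual, but the normalization
# idea maps directly. The entities below are invented examples.
from dataclasses import dataclass

# Before: one wide entity with duplicated customer data on every order.
@dataclass
class OrderDenormalized:
    order_id: int
    customer_name: str    # repeated for every order by the same customer
    customer_email: str   # repeated, and can drift out of sync
    amount: float

# After: customer data lives once, linked by an association.
@dataclass
class Customer:
    customer_id: int
    name: str
    email: str

@dataclass
class Order:
    order_id: int
    customer_id: int  # reference to Customer, like a Mendix association
    amount: float
```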

Unfortunately, many Mendix developers are not familiar with traditional database development, which leads to unnecessary entities, unnecessary relationships between entities, and misuse of persistent and non-persistent entities.

At a high level to refactor a domain model in Mendix, developers can follow best practices such as:

  1. Identifying entity dependencies: Developers should identify which entities depend on others and how they are related. This information can be used to determine which entities should be broken down and how they should be linked.
  2. Analyzing entity attributes: Developers should analyze the attributes of each entity to identify any redundant or duplicate information. Attributes that are common across multiple entities can be moved to a separate entity and linked using associations.
  3. Normalizing relationships: Developers should use best practices for entity relationships, such as many-to-many associations, to ensure that the data is structured in a way that is easy to manage and retrieve.
  4. Defining constraints: Developers should define constraints such as primary keys, unique keys, and foreign keys to ensure that data is stored and retrieved correctly and efficiently.
  5. Testing and validation: Developers should thoroughly test and validate the normalized domain model to ensure that it is functioning as expected and that there are no data inconsistencies.

Refactoring a domain model in Mendix can be a challenging process that requires careful planning and execution. The domain model is a critical part of a Mendix application, and any changes to it can have a significant impact on the application’s functionality, performance, and scalability. Some of the challenges that developers may face when refactoring a domain model in Mendix include:

  1. Data migration: When making changes to a domain model, developers may need to migrate data from the old model to the new one. This can be a complex process that requires careful planning and execution to avoid data loss or corruption (a sketch of such a migration follows this list).
  2. Impact Analysis: Changes to a domain model can have a ripple effect on other parts of the application, such as microflows, pages, and widgets. Developers need to carefully review and update these components to ensure that they are compatible with the new domain model.
  3. Testing: Refactoring a domain model can introduce new bugs or issues, which can be difficult to identify and fix. Developers need to thoroughly test the application to ensure that it is functioning correctly after the changes have been made.
  4. Time and resource constraints: Refactoring a domain model can be a time-consuming process that requires a significant amount of developer resources. This can be challenging in cases where there are strict deadlines or limited development resources.
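
For the data-migration challenge above, the sketch below shows the typical shape of such a migration, reusing the invented Customer and Order entities from the earlier sketch; in Mendix this logic would usually live in a migration microflow or Java action:

```python
# A hedged sketch of migrating data from an old, denormalized entity to the
# new normalized pair from the previous example. Names are illustrative.
def migrate(old_orders):
    """Split denormalized orders into unique customers plus slim orders."""
    customers, orders = {}, []
    for o in old_orders:
        key = (o.customer_name, o.customer_email)
        if key not in customers:  # create each customer exactly once
            customers[key] = Customer(
                customer_id=len(customers) + 1,
                name=o.customer_name,
                email=o.customer_email,
            )
        orders.append(Order(
            order_id=o.order_id,
            customer_id=customers[key].customer_id,
            amount=o.amount,
        ))
    return list(customers.values()), orders
```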

To address these challenges, developers can follow standard operating procedures (SOPs), such as creating a backup of the existing domain model before making changes, carefully planning and testing data migration, and using automated testing tools to identify and fix issues. Additionally, developers can work closely with stakeholders and end-users to ensure that any changes to the domain model are aligned with business requirements and do not impact the end-user experience.

Microflow Refactoring

This process involves restructuring and optimizing microflows to improve their performance, maintainability, and scalability. Microflows are a fundamental building block in Mendix application development, used to model business processes and workflows. Over time, as the application grows and evolves, microflows can become complex and difficult to manage, leading to issues such as poor performance, increased technical debt, and reduced maintainability. For example, each microflow should be dedicated to achieving a single piece of functionality, and a main microflow can call multiple sub-microflows to achieve the complete goal. Any business logic used in multiple places should be moved into a sub-microflow.
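
Microflows are visual, but the decomposition principle is the same as extracting functions from a monolithic routine; here is a plain-Python analogy with invented names:

```python
# Analogy only: a main flow that delegates to focused sub-flows, the way a
# main microflow should call sub-microflows. All names are invented.
def validate_order(order):      # sub-microflow: validation logic only
    return order["amount"] > 0

def apply_discount(order):      # sub-microflow: reusable business rule
    if order["amount"] > 100:
        order["amount"] *= 0.9
    return order

def process_order(order):       # main microflow: orchestration only
    if not validate_order(order):
        raise ValueError("invalid order")
    return apply_discount(order)

print(process_order({"amount": 120.0}))  # {'amount': 108.0}
```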

To address these issues, Mendix developers should regularly review and refactor their microflows. This involves breaking down complex microflows into smaller, more manageable ones, simplifying conditional expressions and loops, removing unnecessary actions, and optimizing queries and other data access operations.

There are several benefits of microflow refactoring in Mendix. Firstly, it can improve the performance of the application by reducing the number of database queries and optimizing data retrieval and manipulation. Secondly, it can improve the maintainability of the application by making microflows easier to understand and modify, reducing the risk of bugs and other issues. Finally, it can improve the scalability of the application by ensuring that microflows can handle increased loads and volumes of data.

To perform microflow refactoring in Mendix, developers should follow Mendix-provided guidelines, such as defining clear naming conventions, breaking down microflows into smaller, more manageable ones, avoiding unnecessary complexity, and following best practices for data access and manipulation. Additionally, developers can use tools such as the Mendix Modeler to analyze the complexity of their microflows and identify areas that need optimization.

Additionally, the following practices can be considered part of refactoring in Mendix.

Application Refactoring

  • Dedicate one module to each specific functional requirement, applying SOLID principles.
  • Move system admin configuration into a single module.
  • Create a separate module for each integration with an external app or API.
  • Design a proper folder structure in each module to organize the business logic and UI pages.
  • Under each functionality's folder, keep separate folders for validation logic, business logic, and pages.
  • Move code common to multiple pages into snippets.

Read about our success here: Development Of The Mendix Application For Middle Eastern Government Services.

Conclusion

Refactoring is an important process in Mendix development that involves improving the design, structure, and quality of the application to improve maintainability, performance, and scalability. However, it is essential to ensure that the refactoring process does not result in rework, which can occur when changes are not properly planned or executed.

To avoid rework during Mendix refactoring, it is crucial to consider design considerations such as architecture, data models, and software patterns. These design considerations help to ensure that the refactoring process is executed in a way that is compatible with the existing codebase, and that the changes made do not introduce new bugs or issues.

For example, when refactoring a Mendix application, developers should consider the existing architecture and ensure that the changes made align with the overall design principles of the application. They should also consider the data models used in the application and ensure that any changes made do not impact data integrity or consistency.

