Water analysis control system
This is one of the most meaningful and valuable projects I have had the opportunity to contribute to. For one of the largest drinking water companies in the Netherlands, I contributed to the development of an internal system that safeguards the reliability and integrity of water quality measurements while simultaneously optimizing the underlying business process.
Project info
- Start: March 2024
- End: July 2025
- Complexity: 8 / 10
- Team size: 3
- Type: QAS
- Stack: C#, JavaScript, HTML & CSS, SQL, PowerShell, YAML, Blazor WASM, Entity Framework Core, xUnit / NUnit, MudBlazor, Git, IIS, Scrum, Kanban, CI/CD, DevOps
About the Project
For one of the largest drinking water companies in the Netherlands, I contributed to the development of a new internal system that safeguards the reliability and integrity of water quality measurements. The company supplies clean drinking water to millions of residents on a daily basis and also performs independent measurements for a large number of external clients. The accuracy and reliability of this data are therefore critical.
The objective of the project was to replace a large number of Excel-based processes, built on complex macros, with a centralized, controllable, and reproducible system. The result is an application that analyzes complete measurement runs, applies corrections based on internal standards, automatically detects deviations, and releases results to the LIMS system in a controlled manner. This has improved not only efficiency, but also the reliability and traceability of the entire process.
The Team, My Role, and the Collaboration
The team consisted of a senior lead developer, a second developer, a tester, and a product owner from the client organization. I joined the project full-time approximately three months after it started. The active development phase then continued for about another year.
My role extended beyond implementation. In close collaboration with analysts, we often started from existing Excel screens and macros. Rather than translating them one-to-one, we critically analyzed and redesigned them with the goal of optimizing the underlying business processes. This was not merely a digital transformation effort, but a structured redesign of a critical operational workflow to make it more robust and future-proof.
As a result, we were deeply involved in functional design, workflow definition, and user experience. We were given significant trust and freedom to contribute substantively and make well-founded design decisions.
After the departure of the senior lead developer, I assumed responsibility for the project. Shortly thereafter, the system was rolled out in phases to production, which proceeded in a controlled manner without notable disruptions.
Functional Operation of the System
The system supports the complete laboratory run analysis process and safeguards the quality and integrity of measurement results across various instruments and methods. A run consists of a sequence of samples processed together on an analytical machine. Within such a run, different types of samples are included, each serving a specific role in quality assurance.
In addition to customer samples, a run includes first-line control samples, blank samples, and samples containing internal standards. These are not evaluated in isolation but always in relation to the entire run. The quality of a single measurement does not stand on its own; reliability is determined by the overall context.
Correction factors are calculated based on internal standards and applied to the measured values. Blank samples are used to detect potential contamination or carry-over. The system then performs automated validations at both sample and run level, including checks against acceptance limits, internal consistency, and deviations from historical or statistical expectations.
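To make the correction and blank-check steps concrete, here is a minimal, hypothetical C# sketch. The record shape, the nominal-response parameter, and the limits are invented for illustration and are not taken from the actual system.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Illustrative only: names, shapes, and limits are hypothetical.
public record Measurement(string SampleId, double Value, double InternalStandardResponse);

public static class RunCorrection
{
    // A correction factor compares the internal standard's nominal (expected)
    // response against the response actually measured in this run.
    public static double CorrectionFactor(double nominalResponse, double measuredResponse)
        => nominalResponse / measuredResponse;

    // Apply each sample's own correction factor to its measured value.
    public static IEnumerable<double> ApplyCorrection(
        IEnumerable<Measurement> run, double nominalResponse)
        => run.Select(m => m.Value * CorrectionFactor(nominalResponse, m.InternalStandardResponse));

    // A blank above its detection threshold suggests contamination or carry-over.
    public static bool BlankExceedsLimit(double blankValue, double limit)
        => blankValue > limit;
}
```

In this sketch, a value of 10.0 measured while the internal standard responded at 80% of its nominal level would be corrected upward to 12.5.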
If a run does not meet the defined criteria, this is explicitly flagged. The analyst is provided with detailed insight into the deviations and can make a substantiated decision to approve, adjust, or reject the run. Only after this validation step are first-line control samples processed into ELC (first-line control) charts and the final results written to the LIMS system.
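The flag-then-decide flow described above can be sketched as follows. The rule name, verdict enum, and tuple shape are hypothetical, chosen only to illustrate flagging at run level; the real rule set is far richer.

```csharp
using System;
using System.Collections.Generic;

// Hypothetical sketch: the real checks also cover internal consistency and
// historical/statistical expectations, but the flag-then-decide shape is the same.
public enum RunVerdict { Pass, Flagged }

public record Deviation(string SampleId, string Rule, string Detail);

public static class RunValidator
{
    public static (RunVerdict Verdict, List<Deviation> Deviations) Validate(
        IReadOnlyList<(string SampleId, double Value)> samples,
        double lowerLimit, double upperLimit)
    {
        var deviations = new List<Deviation>();
        foreach (var (id, value) in samples)
        {
            if (value < lowerLimit || value > upperLimit)
                deviations.Add(new Deviation(id, "AcceptanceLimit",
                    $"{value} outside [{lowerLimit}, {upperLimit}]"));
        }

        // Any deviation flags the whole run; the analyst, not the system,
        // makes the final call to approve, adjust, or reject.
        var verdict = deviations.Count == 0 ? RunVerdict.Pass : RunVerdict.Flagged;
        return (verdict, deviations);
    }
}
```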
In addition, the system facilitates periodic reporting and annual evaluations, providing insight into trends and structural deviations. This supports not only daily operations but also long-term quality monitoring.
The strength of the system lies in the combination of automated validation, transparent decision-making, and human oversight. This ensures that measurement results are not only technically correct, but also explainable, reproducible, and demonstrably reliable.
Architecture and Technical Choices
Given the complexity of the domain logic, a deliberate choice was made for an architecture that enforces separation of concerns and supports long-term maintainability. The application is structured according to Clean Architecture principles, in which the core of the system (the validation and correction logic) is fully decoupled from infrastructure and presentation. This ensures that the domain logic remains leading, testable, and independent of technical implementation details.
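As a sketch of what that decoupling looks like in practice (the interface and class names here are invented, not the project's): the application core defines an abstraction, and the use case depends only on that abstraction, never directly on EF Core, Blazor, or the LIMS integration.

```csharp
using System;
using System.Linq;
using System.Threading.Tasks;

// Hypothetical names; only the dependency direction reflects the architecture.
public interface IRunRepository              // defined in the application core
{
    Task<double[]> GetMeasuredValuesAsync(Guid runId);
}

public sealed class ReleaseRunUseCase        // pure application logic
{
    private readonly IRunRepository _repository;

    public ReleaseRunUseCase(IRunRepository repository) => _repository = repository;

    // Testable without a database: any IRunRepository implementation will do.
    public async Task<bool> CanReleaseAsync(Guid runId, double upperLimit)
    {
        var values = await _repository.GetMeasuredValuesAsync(runId);
        return values.All(v => v <= upperLimit);
    }
}
// The EF Core implementation of IRunRepository lives in the infrastructure
// layer and is wired up via dependency injection at the composition root.
```

Because the use case sees only the interface, the domain logic can be unit tested with an in-memory fake, which is exactly what keeps it leading, testable, and independent.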
Within the application layer, a CQRS approach was adopted in combination with MediatR. Commands and queries are explicitly separated, making use cases clearly defined and the processing steps of a run transparent and traceable. This aligns well with a system where auditability and control are essential.
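MediatR expresses this pattern with `IRequest<TResponse>` and `IRequestHandler<TRequest, TResponse>`. To stay self-contained, the sketch below hand-rolls the same command/handler shape with hypothetical names; it is an illustration of the pattern, not project code.

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;

// Hand-rolled stand-ins for MediatR's IRequest<T> / IRequestHandler<TRequest, TResponse>.
public interface ICommand<TResult> { }

public interface ICommandHandler<TCommand, TResult> where TCommand : ICommand<TResult>
{
    Task<TResult> Handle(TCommand command, CancellationToken cancellationToken);
}

// One explicit use case: a command carries exactly the data its handler needs.
public record ApproveRunCommand(Guid RunId, string AnalystId) : ICommand<bool>;

public sealed class ApproveRunHandler : ICommandHandler<ApproveRunCommand, bool>
{
    public Task<bool> Handle(ApproveRunCommand command, CancellationToken cancellationToken)
    {
        // In a real system this step would re-check validations, record the
        // analyst's decision for the audit trail, and trigger the release.
        return Task.FromResult(true);
    }
}
```

Because every state change enters the system as a named command, the processing steps of a run can be logged and audited one handler at a time, which is what makes this approach a good fit for auditability.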
Entity Framework Core is used for the data layer, with careful modeling of runs and samples to ensure consistency, cohesion, and performance. The front end is built using Blazor WebAssembly, and the application is hosted in IIS. It is integrated into a CI/CD pipeline to enable controlled and reproducible deployments, including database migrations.
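A hypothetical sketch of the run/sample shape (entity and property names are invented): one run owns many samples, and EF Core maps such a one-to-many relationship by convention from the foreign key.

```csharp
using System;
using System.Collections.Generic;

// Hypothetical entities; the real model is richer and carefully tuned.
public class Run
{
    public Guid Id { get; set; }
    public DateTime AnalyzedAt { get; set; }

    // One run owns its samples; they are loaded and validated as a unit.
    public List<Sample> Samples { get; set; } = new();
}

public class Sample
{
    public Guid Id { get; set; }
    public Guid RunId { get; set; }           // FK back to the owning run
    public string Type { get; set; } = "";    // e.g. customer, blank, control
    public double MeasuredValue { get; set; }
}
// EF Core discovers this one-to-many pair by convention; further tuning
// (indexes, value conversions, query behavior) belongs in OnModelCreating.
```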
The chosen architecture and technology stack support what matters most for this system: reliability, testability, and sustainable maintainability of complex domain logic.
Reflection
For me, this has been a particularly meaningful project. The complex subject matter, strong collaboration, and active involvement in functional design made it both intellectually and technically challenging. Throughout the project, we were given substantial trust and freedom to carefully design and implement the system, resulting in a successful phased production rollout and a stable end result. All of this makes it one of the most meaningful and valuable projects I have had the opportunity to contribute to.
Note: Due to the nature of the system and the sensitivity of the underlying data, I am limited in what I can publicly share. Therefore, this website does not include screenshots, code examples, or detailed technical documentation related to this project.