Epidemiological surveillance systems evolve over time, depending on the context and the data already collected. Consequently, the software used must evolve to meet the new requirements. However, introducing new requirements to update the software takes time, is expensive, and may lead to software regression. Failed software developed for epidemiological surveillance is often the result of an unsystematic transfer of business requirements to the implementation. This problem can be avoided if the system is built on a well-defined framework/architecture that permits rapid development and updating of the surveillance software. Empirical research shows, on the one hand, that Model Driven Techniques such as Model Driven Architecture (MDA) are more effective than code-centric approaches for the development and maintenance of software. On the other hand, Agile Processes such as Scrum are more effective than Structured Processes when requirements are subject to frequent change. Researchers have demonstrated that developers of medical software, such as epidemiological surveillance software, experience difficulties when following Structured Processes and code-centric approaches. The main goal of this empirical study was to apply the combination of Scrum and Model Driven Architecture to the development of epidemiological surveillance software for tuberculosis. During this research, we found the approach easy to use and very useful when the MDA tool can […]
In this article we consider the inverse problem of reconstructing piecewise Lamé coefficients from boundary measurements. We reformulate the inverse problem as a minimization problem using a Kohn-Vogelius type functional. We study the stability of the parameters when the jump across the discontinuity is perturbed. Using tools of shape calculus, we give a quantitative stability result for the local optimal solution.
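For orientation, the display below sketches one common form of a Kohn-Vogelius type functional for the linear elasticity (Lamé) system; the boundary data $f$, $g$ and the auxiliary states $u_D$, $u_N$ are notation assumed here for illustration and are not taken from the abstract itself.

```latex
% Minimal sketch of a Kohn-Vogelius type functional for the Lame system.
% Assumptions (not from the abstract): displacement data f and traction data g
% are measured on \partial\Omega; u_D and u_N solve the elasticity system with
% the candidate coefficients (\lambda,\mu), using the Dirichlet data f and the
% Neumann data g respectively.
\[
  J_{KV}(\lambda,\mu)
  = \frac{1}{2}\int_{\Omega}
      \sigma_{\lambda,\mu}\bigl(u_D-u_N\bigr) : \varepsilon\bigl(u_D-u_N\bigr)\,dx,
  \qquad
  \sigma_{\lambda,\mu}(u) = 2\mu\,\varepsilon(u) + \lambda\,\operatorname{tr}\bigl(\varepsilon(u)\bigr) I,
  \quad
  \varepsilon(u) = \tfrac{1}{2}\bigl(\nabla u + \nabla u^{\top}\bigr).
\]
```

In this form the functional vanishes exactly when the two auxiliary states coincide, i.e. when the candidate coefficients are compatible with both boundary measurements, which is what makes it a natural misfit for the minimization reformulation.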
Sequential Monte Carlo methods have been a major breakthrough in the field of numerical signal processing for stochastic dynamical state-space systems with partial and noisy observations. However, these methods still present certain weaknesses. One of the most fundamental is the degeneracy of the filter due to the impoverishment of the particles: the prediction step allows the particles to explore the state space, but it can lead to impoverishment if this exploration is poorly conducted or if it conflicts with the subsequent observation used to evaluate the likelihood of each particle. In this article, in order to improve this last step within the framework of the classic bootstrap particle filter, we propose a simple approximation of the one-step fixed-lag smoother. At each time iteration, we propose to perform additional simulations during the prediction step in order to improve the likelihood of the selected particles.
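To make the idea concrete, here is a minimal, self-contained Python sketch of a bootstrap particle filter in which each particle is propagated several times during the prediction step and the resulting candidates are selected according to their likelihood under the next observation. The toy linear-Gaussian model, the number of candidates, and the selection scheme are assumptions made for illustration; they do not reproduce the article's exact approximation of the one-step fixed-lag smoother.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear-Gaussian state-space model used only for illustration:
#   x_t = PHI * x_{t-1} + N(0, SIG_X^2),   y_t = x_t + N(0, SIG_Y^2)
PHI, SIG_X, SIG_Y = 0.9, 1.0, 0.5


def transition(x):
    """Prediction (mutation) kernel of the bootstrap filter."""
    return PHI * x + SIG_X * rng.standard_normal(x.shape)


def log_likelihood(y, x):
    """Log-likelihood of observation y given state x (up to a constant)."""
    return -0.5 * ((y - x) / SIG_Y) ** 2


def step_with_extra_simulations(particles, y_next, n_candidates=5):
    """One time iteration: each particle is propagated n_candidates times
    (the additional prediction simulations), every candidate is weighted by
    the likelihood of the next observation, and N particles are resampled."""
    n = particles.shape[0]
    candidates = transition(np.repeat(particles, n_candidates))
    log_w = log_likelihood(y_next, candidates)
    w = np.exp(log_w - log_w.max())
    w /= w.sum()
    idx = rng.choice(candidates.shape[0], size=n, p=w)
    return candidates[idx]


# Usage: simulate data, then run the filter and report the filtering mean.
T, N = 50, 200
x_true, y = np.zeros(T), np.zeros(T)
for t in range(1, T):
    x_true[t] = PHI * x_true[t - 1] + SIG_X * rng.standard_normal()
    y[t] = x_true[t] + SIG_Y * rng.standard_normal()

particles = rng.standard_normal(N)
for t in range(1, T):
    particles = step_with_extra_simulations(particles, y[t])
print("final filtering mean:", particles.mean(), "true state:", x_true[-1])
```

In this sketch the extra exploration comes at a cost: the number of transition draws and likelihood evaluations per iteration grows from N to N × n_candidates, which is the trade-off against the reduced particle impoverishment.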