Table 4.
Synthesis of the underlying reasoning behind the scoring of the tools according to the 11 functional aspects.
Functional aspect | ISSEP (CA and UK) | ECoSur (VN and DK) | ATLASS (DK) | PMP-AMR (BE, DK, IT, and NO) | NEOH (DK, BE, IT, NO, and NL) | SURVTOOLS (DK and NL) |
---|---|---|---|---|---|---|
User friendliness | Conceptual framework easy to follow. Evaluation(s) more complicated | Relatively easy to understand and could be improved with a web interface | Can be used without much preparation | Easy to understand and fill in without training | Complex without training, long/exhausting. Scoring OH attributes is relatively simple | Tool itself is easy to fill in, but more complex to conduct evaluations |
Meets evaluation needs/requirements | Relationships of integrated surveillance activities/outputs described. No guidance on evaluation | Measures the level of collaboration, but not the overall added value of collaborating for surveillance activities | Predefined network is comprehensive, but measurement of smaller progressions not possible | Qualitative scoring system could be improved. Partially meets needs for AMU and AMR evaluation(s) | Comprehensive, less intuitive to use for specific technical details/laboratory part | Epidemiological performance easiest to perform, other parts more difficult |
Efficiency | Requires a lot of time to conduct evaluation(s) | Evaluation matrix easy to understand/apply. Validation meeting with stakeholder required | Questionable whether all data are really needed | Easy to fill in. Immediate generation of results. Suitable for administrators | Takes a long time to fill in tool. “Theory of change” (ToC) could be better integrated. Not a management tool | Takes some time to fill in the tool and longer time for evaluations |
Use of a step-wise approach to the evaluation | The tool has five evaluation levels | Only possible to follow progress of collaboration if evaluation is repeated | Follows a step-wise approach with areas containing sub-categories reflecting the level of implementation and geography | Follows an inherent step-wise approach with four levels with logical progression. Level 1: planning of activity/locally; levels 2, 3, and 4: undertaking activities/regionally/nationally | Step-wise approach to evaluation with the following steps: context description, initiative within context description, OH-ness, and ToC (outcome and impact). If evaluating progress, repeated evaluations over time needed | Does not follow a step-wise approach. Order would be given by choice of evaluation question(s) and not by the tool itself |
Overall appearance | The conceptual framework is well-presented | Well-structured, web platform needed | Useful for evaluation of AMU and AMR and residue surveillance at laboratory level | The general assessment part excellent, the sector specific less so. Nice layout, some parts could be improved | Extensive handbook. Excel tool is mostly understandable but too compressed in layout | Generates evaluation plan. Takes time to evaluate integrated surveillance. Objective results |
Actionable evaluation outputs | No clearly defined actionable outputs | Generation of three graphical outputs of results: one for organizational attributes, one for organizational indexes, and one for functional attributes | Monitors progress and suggests next level | Actions can be agreed upon during assessment. Graphics could be improved. Gaps in sector evaluation | A web diagram makes it easy to identify gaps. Scoring is subjective: may lead to biased results | Not generated by tool. Evaluation could generate first-level actionable outputs (e.g., effect of designs). Other outputs on, e.g., awareness more difficult to obtain |
Evaluation of OH aspects | Comprehensive | Existence of specific attributes measuring OH aspects, e.g., shared leadership | All sectors covered and measures integration | Not addressed in particular | Major strength of the systems approach and the tool | Can be used for all aspects. Layout does not support all components |
Workability regarding required data (1: very complex and 4: simple) | Large amounts of data required | Dependent on the complexity of the surveillance system evaluated | Large amounts of data required | Apparently simple. Data are easily accessible | Requires effort/time to gather data. Some data complex to get (e.g., learning/system organization) | Relatively simple to get the data for filling in tool, but for some evaluation questions/objectives, it is complex to acquire the data |
Workability regarding required people (1: many and 4: few) | Stakeholders from all sectors required | Meant to be applied by an evaluation team | Needs expertise from several areas | All stakeholders invited to evaluation meetings (2 days). One person can do evaluation, but then data capture needed (e.g., through interviews) | Interview of essential actors and stakeholders, but only one evaluator needed | Few people needed |
Workability regarding analysis to be done (1: difficult and 4: simple) | No guidance on analysis provided | Easy identification of the criteria influencing the evaluation results to support formulation of recommendations | Automated analysis | Generated by the tool. Mostly yes/no answers to questions | Once tool is filled in, it provides support for analyses. Comparing ToC and scoring difficult | Dependent on the number and complexity of evaluation question(s) |
Time (1: >2 months, 2: 1–2 months, 3: 1 week−1 month, and 4: <1 week) | Long time required for evaluation(s) | Dependent on the complexity of the surveillance system evaluated | If the assessor is experienced in surveillance or a detailed NAP report is available, takes relatively little time | Takes a relatively short time | Filling in the Excel tool is relatively fast once the information is ready. Defining the ToC and gathering data is time-consuming | Short time to fill in tool. Long time for some of the evaluation objectives/questions |