Evaluation Offline Control App

Last modified by David Nestle on 2019/02/15 15:26


The Evaluation Offline Control App is the central administration tool for configuring automated and manual OGEMA evaluation. The app can also generate KPI result pages and send daily status and alarm reports. Furthermore, a graphical view and a CSV export of the raw data used by certain EvaluationProviders can be opened via the app.

For custom projects, more focused versions of the app can be provided. Such apps should usually have a dependency on Evaluation Offline Control and only adapt certain classes, as the smartr-eval-admin app does. Evaluation Offline Control is usually not intended to be used by end users directly, but serves for configuration and manual evaluations, for backups, and as a base for providing the configurations that open Schedule Viewer Expert and KPI pages in derived, more limited applications.

This app is the basic evaluation control and monitoring tool for the OGEMA framework. The technical details and concepts of this framework are documented in OGEMA GaRo Evaluation API.

GUI Pages in the Evaluation Offline Control App

Evaluation Provider Overview

  • All GaRo evaluation providers on the system are listed
  • You can open the page to start and configure daily auto-evaluation for each evaluation provider
  • You can check the result types of each evaluation provider
  • Fundamental characteristics such as number of outputs (results) and pre-evaluations required are shown.

Offline Evaluation Control (Evaluation starter page)

  • Via the button in the second line you can switch between the configuration of a single evaluation run and the configuration of a daily evaluation that is scheduled automatically (auto-eval). Once the auto-eval configuration has been opened, the respective configuration resource is created, and you can then activate/deactivate auto-evaluation for the provider via a checkbox.
    If you want to make sure that auto-evaluation calculates all results of a provider, even if new results are added by the developer later, you should deselect all results in the auto-eval configuration. Note that this is not possible for manual evaluation, but there all available results are selected by default anyway.
  • Via the button "Schedule Viewer" you can open a schedule viewer that allows you to plot all input time series of the currently selected evaluation configuration. For more information on the schedule viewer see Timeseries Viewer Expert.
  • The exact way an evaluation is performed is determined by the settings on the page "Configuration". If the configuration "Autodetect and complete required Pre-Evaluation data" is enabled (default), required pre-evaluations are automatically queued and generated. The dropdown "Shall existing results for the time-span be re-calculated" decides whether a calculation is performed even if data is already available. This is necessary if the code or the input data of the evaluation has changed. The default option "ONLY_PROVIDER_REQUESTED" means that existing pre-evaluation data is not re-calculated, but the results of the actually selected provider are calculated even if data exists.
  • If the configuration "Perform Auto-Evaluation" on the configuration page is disabled, no more evaluations will be queued automatically, even if auto-evaluation is configured for certain providers.
  • If the provider itself defines special KPI pages, you can add or update them by hitting the button "Add KPI-pages offered by provider", which is shown in the bottom right corner if the provider offers pages. Usually the evaluation providers that are required to generate the results displayed by a page should be configured for auto-evaluation, so that the page always shows the most current results without the evaluation having to be performed manually.
    These pages may also define messages that are sent after the auto-evaluations have taken place. On a server instance collecting and evaluating data from various gateways in the field, auto-evaluation as well as the sending of messages has to take place each day after the gateways have sent their data. The number of milliseconds to wait after the start of the day can be controlled via the property org.ogema.eval.utilextended.autoevalretardmilli.
  • The gateways selected as "isUsed" in the page Configuration Page -> Gateway Configuration are pre-selected for evaluation.
  • In the Configuration Page you can also select to provide a "BackupButton". In the left column you will then find a button that creates a zipped backup of all remote slotsdb files and log files for the gateways and the time interval selected and stores the zip file in ../evaluationresults/remoteSlotsBackup.zip. This can also be used to rapidly create a data set for development.
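The timing rule described above (daily auto-evaluation runs a configurable number of milliseconds after the start of the day, so that field gateways have already transmitted their data) can be sketched in plain Java. This is an illustrative sketch only; the 4-hour fallback value is an assumption, not a default documented for the app:

```java
import java.time.Duration;
import java.time.LocalDate;
import java.time.LocalDateTime;

/**
 * Sketch of the auto-evaluation trigger timing: the daily run starts a
 * configurable number of milliseconds after the start of the day.
 * The 4 h fallback below is an assumption for illustration only.
 */
public class AutoEvalTrigger {

    /** Reads the documented system property; the 4 h fallback is hypothetical. */
    static long retardMillis() {
        return Long.getLong("org.ogema.eval.utilextended.autoevalretardmilli",
                4 * 3600 * 1000L);
    }

    /** Trigger time for a given day: start of day plus the configured delay. */
    static LocalDateTime triggerTime(LocalDate day, long retardMillis) {
        return day.atStartOfDay().plus(Duration.ofMillis(retardMillis));
    }

    public static void main(String[] args) {
        System.out.println(triggerTime(LocalDate.now(), retardMillis()));
    }
}
```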

Results Overview (JSON Result Descriptor Overview)

  • The page shows all JSON result files that are calculated as results of evaluation runs. Usually each queued evaluation run that is shown before and during execution on the page "Queued Evaluation Runs" ("Queued Evaluation Run Overview") generates a JSON result file when finished.
  • Only files that have an entry in the ResourceList jsonOGEMAFileManagementData/workspaceData/DefaultWS/fileData are shown (the workspace location may differ based on the workspace selected in the top-level dropdown of the page). So after a clean start there may be JSON files in the directory that are not shown here.
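The mismatch described above (JSON files present on disk but without a registration entry after a clean start) can be checked with a small sketch. The helper below is purely hypothetical and not part of the app:

```java
import java.util.List;
import java.util.Set;
import java.util.stream.Collectors;

/**
 * Sketch: compare the JSON result file names found on disk with the names
 * registered in the fileData ResourceList. Files without a registration
 * entry exist on disk but are not shown on the Results Overview page.
 * Illustrative helper only.
 */
public class ResultFileCheck {

    /** Returns the JSON file names that have no registration entry. */
    static List<String> unregistered(List<String> filesOnDisk, Set<String> registered) {
        return filesOnDisk.stream()
                .filter(name -> name.endsWith(".json"))   // only JSON result files
                .filter(name -> !registered.contains(name))
                .sorted()
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<String> onDisk = List.of("run1.json", "run2.json", "notes.txt");
        Set<String> registered = Set.of("run1.json");
        System.out.println(unregistered(onDisk, registered)); // [run2.json]
    }
}
```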

KPI Overviews

  • There is a set of default KPI pages that are always offered via the menu.
  • Specific KPI pages for certain evaluation providers can be activated via the configuration page; such pages are implemented by the class KPIPageGWOverviewMultiKPI. The pages offered there are hard-coded. If such a page is offered by an evaluation provider via the method GaRoSingleEvalProvider.getPageDefinitionsOffered(), a button "Add KPI-pages offered by provider" is shown at the bottom right of the respective Evaluation Starter Page.
  • For each such page implemented by the class KPIPageGWOverviewMultiKPI, an entry is made in the ResourceList offlineEvaluationControlConfig/kpiPageConfigs. If the element pageId is present and active, the page is re-created on startup under a URL defined by the pageId. Otherwise the page would disappear on every restart of the server, even if it is not a clean start.
  • The gateways selected as "isShown" in the page Configuration Page -> Gateway Configuration are shown on the page (but only gateways that are evaluated can be shown, so it usually does not make sense to show a gateway that is not used, unless the gateway is selected manually or older data shall be shown from a time when more gateways were selected as "isUsed").
  • By default, KPIs are saved as overall results per day, week and month for all gateways evaluated. Provider-specific views, however, are always also gateway-specific; in this case gateway-specific KPIs are stored.
  • You can open a graphical view in the Timeseries Viewer Expert showing all KPI values for a single gateway or the overall values shown on the page. In contrast to the view opened via the Offline Evaluation Control page, this is not the raw data of the time series but the KPI values. So in the view opened here you usually have a single value per day, whereas the view of the Offline Evaluation Control may contain one value per minute or even per second.
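The relation between gateway-specific KPIs and the overall KPI across all evaluated gateways can be illustrated with a minimal sketch. The actual aggregation is defined by each evaluation provider; plain averaging is used here purely as an assumption for illustration:

```java
import java.util.LinkedHashMap;
import java.util.Map;

/**
 * Sketch: gateway-specific daily KPI values plus an overall KPI across all
 * evaluated gateways. Real providers define their own aggregation; simple
 * averaging is used here for illustration only.
 */
public class KpiAggregation {

    /** Daily KPI per gateway: here simply the mean of the day's raw values. */
    static Map<String, Double> perGatewayDailyKpi(Map<String, double[]> rawPerGateway) {
        Map<String, Double> result = new LinkedHashMap<>();
        rawPerGateway.forEach((gateway, values) -> result.put(gateway, mean(values)));
        return result;
    }

    /** Overall KPI: mean over the gateway-specific KPI values. */
    static double overallKpi(Map<String, Double> perGateway) {
        return perGateway.values().stream()
                .mapToDouble(Double::doubleValue).average().orElse(Double.NaN);
    }

    static double mean(double[] values) {
        double sum = 0;
        for (double v : values) sum += v;
        return sum / values.length;
    }

    public static void main(String[] args) {
        Map<String, double[]> raw = new LinkedHashMap<>();
        raw.put("gw1", new double[] {1.0, 3.0});
        raw.put("gw2", new double[] {5.0, 7.0});
        Map<String, Double> daily = perGatewayDailyKpi(raw);
        System.out.println(daily + " overall=" + overallKpi(daily)); // {gw1=2.0, gw2=6.0} overall=4.0
    }
}
```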

Evaluation configuration management

  • Note that the remote slotsdb directory has to be set for the timeseries-multieval-garo-jaxb DataProvider in the property org.smartrplace.analysis.backup.parser.basepath. The default path for storing result workspaces is ../evaluationresults; it can be changed via the property de.iwes.tools.timeseries-multieval.resultpath.
  • Separate configurations for auto-evaluation and single-run evaluation are necessary => problem that results are shown separately?
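The two properties above can be resolved with plain system-property lookups. The sketch below uses the result-path default stated on this page; no default is assumed for the slotsdb base path, since it has to be set explicitly:

```java
/**
 * Sketch: resolving the configuration properties mentioned above. The
 * result-path default "../evaluationresults" is the one documented here;
 * the slotsdb base path has no default and must be configured.
 */
public class EvalPaths {

    /** Base path of the remote slotsdb directory; must be set explicitly. */
    static String remoteSlotsDbBasePath() {
        String path = System.getProperty("org.smartrplace.analysis.backup.parser.basepath");
        if (path == null)
            throw new IllegalStateException(
                    "org.smartrplace.analysis.backup.parser.basepath is not set");
        return path;
    }

    /** Path where result workspaces are stored. */
    static String resultPath() {
        return System.getProperty("de.iwes.tools.timeseries-multieval.resultpath",
                "../evaluationresults");
    }

    public static void main(String[] args) {
        System.out.println("Result path: " + resultPath());
    }
}
```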

Basic quality check and result analysis

  • Perform single evaluations of basic-quality_eval_provider, e.g. by selecting "ThreeFullDaysBeforeNow", or schedule auto-evaluation for the provider
  • Check if the number of data rows with data and with good data match the expectations.
    TODO: Develop documentation what can be expected.
  • If the expectation is not met, analyse the log file of the evaluation on the data server.
    If you are using WinSCP, set up a bookmark.
Created by David Nestle on 2018/10/19 11:05