
v2.6


Version 2.6 concentrates on extended support for premined data models, a smoother user interface, and refreshed Template Apps, while also delivering a series of stability and maintenance updates.

Extended support for premined data models

Version 2.6 introduces further capabilities for loading externally mined mpmX data models directly into Qlik apps. This approach builds on the Portable Backend Adapter (PBA) and enables use cases where the process mining logic has already been executed outside of Qlik on existing or more scalable data platforms (such as cloud DWHs or RDBMS), so that the resulting model can be reused with use-case-specific or end-user-specific UIs or tool chains. The first fully supported backend type is Snowflake, for which we provide a native app that configures our mining logic and generates the mpmX-compliant data model.

This reflects mpmX’s open platform approach and enables greater flexibility, particularly in hybrid or multi-system environments.

This functionality is technically designed to support a broader range of SQL-compatible backends. However, successful integration depends on several key conditions:

  • The data model must follow the mpmX schema and field naming conventions.
  • The target system must be accessible via a connector from the Qlik ODBC Connector Package (available in Standard QSEoW as well as QSaaS).
  • A defined minimum SQL feature set must be available (e.g. support for basic filtering, subqueries to support delta load).
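To illustrate the kind of SQL the minimum feature set must cover, the sketch below builds a delta-load query that uses a subquery as a load watermark. All table and field names (`EVENT_LOG`, `LOAD_TS`, `LOAD_WATERMARK`) are hypothetical and not part of the actual mpmX schema:

```python
def build_delta_query(table: str, ts_field: str) -> str:
    """Select only rows newer than the last already-loaded timestamp.

    The subquery against a hypothetical watermark table is exactly the kind
    of construct the backend must support for delta loads.
    """
    return (
        f"SELECT * FROM {table} "
        f"WHERE {ts_field} > ("
        f"SELECT MAX(last_loaded) FROM LOAD_WATERMARK "
        f"WHERE source_table = '{table}')"
    )

print(build_delta_query("EVENT_LOG", "LOAD_TS"))
```

A backend that cannot evaluate a scalar subquery in a `WHERE` clause would force a full reload on every run, which is why this is part of the minimum feature set.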

At this stage, full implementation and testing have only been completed for Snowflake. While adaptations for additional data platforms are already prepared, official support depends on the availability of mpmX mining implementations on those platforms. For the next release, we are working on support for another major cloud data platform.

Preconfigured Template App for users of the Snowflake mining App

A preconfigured template app for customers who also use our Snowflake native mining app is part of the release bundle and showcases the easy setup of the integration. The integration also ensures full backward compatibility of the model with previously generated Qlik models, enabling offloading of Qlik ETL/mining workloads to Snowflake without changes to master items, the UI, and so on. Several customers already use this in Qlik Sense Enterprise on Windows as well as Qlik SaaS environments.

Important!

While the underlying logic is broadly extensible, current Wizard support and official compatibility are limited to tested scenarios. Broader database support is planned but depends on infrastructure availability and further validation. Partners and customers interested in extending the approach to other platforms should coordinate with our consulting team.

Improved Native Event Log Loading and Mining in Qlik

Since version 2.5, it has been possible to load event logs from supported database systems via the Import Wizard and then mine the process natively within Qlik. With versions 2.5.1 and 2.6, this capability has been further refined in terms of compatibility and usability.

The feature allows users to configure and launch full process analyses directly from externally provided event logs without requiring manual scripting. This supports self-service use cases where the technical infrastructure provides ready-to-use logs.

Technical Requirements

  • The data source must be accessible via a connector included in the Qlik ODBC Connector Package.
  • The system must support basic SQL operations (e.g. SELECT, plus INFORMATION_SCHEMA queries that list the databases, schemas, and tables accessible to a given Qlik data connection; these power the event log selection dialogs in the Wizard).
  • The Wizard must be able to detect the database type from the connection string to adapt script generation accordingly.
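The detection step in the last requirement can be sketched as a simple keyword match on the connection string. This is a minimal illustration, not the Wizard's actual implementation; the provider keywords are assumptions, since real connection strings vary by driver:

```python
def detect_db_type(connection_string: str) -> str:
    """Guess the backend type from a data-connection string so that
    script generation can be adapted accordingly.

    Keyword matching is illustrative only; real connection strings
    differ per ODBC driver and environment.
    """
    s = connection_string.lower()
    if "snowflake" in s:
        return "snowflake"
    if "postgres" in s:
        return "postgresql"
    if "sql server" in s or "azure" in s:
        return "mssql"
    return "unknown"

print(detect_db_type("Provider=Snowflake;Server=acme.snowflakecomputing.com"))
# → snowflake
```

If the backend cannot be identified, a wizard would typically fall back to generic SQL generation or ask the user to choose the database type manually.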

Validated integrations include PostgreSQL (v14.8 or later), Snowflake, and MS SQL / Azure SQL.

Import Wizard Improvements

The Import Wizard has been further refined to simplify and accelerate the setup of Mining Apps. Usability improvements and clearer system feedback help reduce configuration time and support a smoother onboarding experience.

| Change | Benefit |
| --- | --- |
| Standardized source-selection buttons with icons | Consistent look-and-feel; unavailable options are greyed out, active ones highlighted. |
| Compact hint boxes and clearer button text | Key guidance is visible without taking up excess space. |
| “Clear” icon in input fields | One click removes current values; fields are disabled automatically when not applicable. |
| Alphabetical sorting in all database dropdowns | Faster navigation through long lists. |
| Improved keyboard support in FilterSelector | The Return key now confirms a choice exactly like a mouse click. |

Template App

Two previously available sheets are now part of the standard Template App again:

Sheet: Ad Hoc Lead Time Analysis

Analyze the time between any two process steps. Filter by lead time, compare different contexts, and view time distributions.

Business value: Helps identify inefficiencies, spot delays, and optimize subprocess timing.

Use Case:

In Purchase-to-Pay, check if payments are made too early or too late by analyzing time between Invoice Receipt and Payment.
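The underlying calculation is straightforward: per case, take the time between the first occurrence of each of the two activities. The sketch below uses a toy event log with invented case IDs and timestamps; it illustrates the idea only and is not mpmX code:

```python
from datetime import datetime

# Toy event log: (case_id, activity, timestamp). Data is invented.
events = [
    ("PO-1", "Invoice Receipt", datetime(2024, 1, 2)),
    ("PO-1", "Payment",         datetime(2024, 1, 30)),
    ("PO-2", "Invoice Receipt", datetime(2024, 1, 5)),
    ("PO-2", "Payment",         datetime(2024, 1, 10)),
]

def lead_times(events, start_act, end_act):
    """Days between the first start_act and the first end_act per case."""
    starts, ends = {}, {}
    for case, act, ts in events:
        if act == start_act:
            starts.setdefault(case, ts)
        elif act == end_act:
            ends.setdefault(case, ts)
    return {c: (ends[c] - starts[c]).days for c in starts if c in ends}

print(lead_times(events, "Invoice Receipt", "Payment"))
# → {'PO-1': 28, 'PO-2': 5}
```

Comparing these per-case lead times against payment terms immediately shows which invoices were paid too early or too late.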

Sheet: Process Log

View the raw event log in a structured format. Follow individual cases step by step and see who or what triggered each activity.

Business value: Supports validation, audit, and troubleshooting by showing the process at event level.

Other Notable Changes

  • Enhancements to internal request handling ensure complete data retrieval in large Qlik SaaS environments.
  • Session scripts now substitute initialization variables automatically, reducing redundancy.
  • Wizard start-up and script-version errors are reported with clearer messages.
  • ActivityOriginID is now mandatory in OCPM mode only.
  • Numerous fixes address list loading, filtering, column ordering and related edge cases.
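The automatic substitution of initialization variables mentioned above can be pictured as a simple placeholder replacement over the generated script. This is a minimal sketch of the general mechanism, not mpmX's actual implementation; the variable name and path are invented:

```python
import re

def substitute_vars(script: str, variables: dict) -> str:
    """Replace $(varName) placeholders, Qlik-style.

    Unknown variable names are left untouched so partially resolved
    scripts remain inspectable.
    """
    def repl(m):
        return str(variables.get(m.group(1), m.group(0)))
    return re.sub(r"\$\((\w+)\)", repl, script)

script = "LOAD * FROM [$(vEventLogPath)] (qvd);"
print(substitute_vars(script, {"vEventLogPath": "lib://data/events.qvd"}))
# → LOAD * FROM [lib://data/events.qvd] (qvd);
```

Substituting at session-script generation time means the same initialization values no longer need to be declared redundantly in each script.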

Security

XSRF-Token Support
(applies to new Qlik installations; existing installations remain as before)

During every session, mpmX now requests and re-uses an XSRF token. If no token is returned, the system continues with the previous method, so existing installations work without extra steps.
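The fallback behavior can be sketched as follows: include the token in subsequent request headers when the server issued one, and otherwise continue with the previous token-less method. The header name used here is illustrative, not necessarily the one mpmX or Qlik uses:

```python
from typing import Optional

def auth_headers(xsrf_token: Optional[str]) -> dict:
    """Build request headers for a session.

    If the server returned an XSRF token at session start, re-use it on
    every request; if no token was returned, fall back to the previous
    (token-less) method so existing installations keep working.
    The header name "X-XSRF-TOKEN" is an illustrative convention.
    """
    headers = {"Content-Type": "application/json"}
    if xsrf_token:
        headers["X-XSRF-TOKEN"] = xsrf_token
    return headers
```

Because the token is simply omitted when absent, no configuration change is needed on installations whose server never issues one.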