Software AG, founded in 1969, is headquartered in Darmstadt, Germany, and has offices worldwide. Notable recent acquisitions include Apama (2013), Zementis (2016), Cumulocity (2017) and TrendMiner (2018). Apama itself was first released in 2001 as a complex event processing platform targeted at algorithmic trading in capital markets, a space in which it remains a market leader. It was acquired by Progress Software in 2005 before being sold to Software AG in 2013.
The company behind TrendMiner was founded in Belgium in 2008 as a spin-off from the University of Leuven. The current product was conceived in 2013, with its first beta customers in 2014 and a formal launch at the beginning of 2016.
TrendMiner is a time-series based analytics solution intended for use in process manufacturing. The product has capabilities that could easily be deployed in other sectors, and Software AG is currently considering an extension into discrete manufacturing. TrendMiner is part of the company’s IoT and Analytics business unit, in which Software AG has more than 300 clients across manufacturing, retail, healthcare and transport, amongst others. Users include Bosch, Siemens and Deutsche Telekom, while TOTAL, Lanxess and Novo Nordisk are all TrendMiner customers.
Apama is the core of Software AG’s streaming analytics platform, and was the first product of its type to be introduced to the market. However, it is not the only component of Software AG’s streaming analytics solution. Under the covers, the platform comprises several additional components including, but not limited to, Cumulocity Core for managing, integrating and connecting to IoT sensors and actuators; Zementis for deploying predictive models; the Industry Analytics Kit, which provides pre-built capabilities of various types; and MashZone NextGen, which is a visualisation and dashboarding product.
Apama itself is currently at version 10.1 and is available as both a proprietary Enterprise Edition and a freely available Community Edition. It can be deployed in the cloud, on-premises, or on IoT gateways and devices, and can also run in a Docker or Cloud Foundry container.
Customer Quotes
"Equipment servicing costs are down an astounding 40 percent, and revenue is forecast to grow by half (thanks to implementing Apama, Terracotta DB and WebMethods)" Global leader in image technology and print services
"The ability to respond quickly to client requests and roll out completely new service offerings in two months gave us a huge strategic advantage. Our team, working with Software AG’s IoT platform (Apama and WebMethods), made it happen." Royal Dirkzwager
Software AG Apama essentially has three elements: an integration framework for connecting to external systems and data feeds, an execution engine known as the Apama Correlator, and front-end tools for developers. As far as integration is concerned, Apama ships with connectivity to JMS, Universal Messaging, MQTT, Kafka, HTTP, databases, web services, files on disk and 27 capital markets-specific sources. There is also a connectivity plug-in framework that allows you to create your own in-process adapters and to reuse elements of these plug-ins across other adapters. For example, you can reuse JSON mapping logic across multiple adapters, so you only have to define such processes once.
Apama Correlator consists of multiple parallelised containers known as “contexts”. Many contexts (potentially thousands) may be active simultaneously, and each context can process any number of Apama applications, which are written in either EPL (Apama’s event processing language) or Java (Apama has a built-in JVM). An optimised thread scheduler allows elastic scaling. Within the Correlator there are three major components: the Apama HyperTree, the Temporal Sequencer and the Stream Processor. Respectively, these provide an indexing method that allows low-latency matching between events and the patterns you are trying to detect; support for temporal and spatially-based correlations; and the ability to store and organise events that occur within a time interval and orchestrate real-time analytics across those time windows. Additionally, events can be stored in Software AG’s TerracottaDB, an in-memory NoSQL database, which is useful, for example, for long-running queries. Apama Correlator also features tight integration with Zementis for deploying predictive models in Apama, with support for both PMML (Predictive Model Markup Language) and TensorFlow models.
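By way of illustration only – this is a generic Python sketch of windowed event correlation, not Apama EPL and not the Correlator's internals – the general principle of evaluating an analytic over a sliding time window of events looks like this:

```python
from collections import deque

class WindowedAnalytic:
    """Toy illustration of window-based event correlation: keep events
    inside a fixed time window and re-evaluate a condition over that
    window as each new event arrives."""

    def __init__(self, window_seconds, threshold, min_count):
        self.window = window_seconds
        self.threshold = threshold
        self.min_count = min_count
        self.events = deque()  # (timestamp, value) pairs

    def on_event(self, ts, value):
        # Evict events that have fallen out of the time window.
        while self.events and ts - self.events[0][0] > self.window:
            self.events.popleft()
        self.events.append((ts, value))
        # Fire if enough in-window events breach the threshold.
        breaches = sum(1 for _, v in self.events if v > self.threshold)
        return breaches >= self.min_count

analytic = WindowedAnalytic(window_seconds=10, threshold=100.0, min_count=3)
fired = [analytic.on_event(ts, v)
         for ts, v in [(0, 50), (2, 120), (4, 130), (6, 90), (7, 140)]]
# fired -> [False, False, False, False, True]
```

A real correlator, of course, runs many such analytics concurrently across its contexts and matches far richer patterns than a simple threshold count.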
There are three front-end tools for building Apama applications, intended for use by developers, business analysts and end users respectively. The first is an Eclipse-based environment that supports development in EPL; the second provides a drag-and-drop interface in which you graphically model declarative patterns across windows of data; and the third is a web-based, wizard-driven interface through which you parameterise pre-built templates. Software AG also offers a range of pre-built applications (accelerators) that span a variety of use cases, including IoT, intelligent GRC (governance, risk and compliance), and smart logistics.
Figure 1 – A dashboard created in MashZone NextGen
A variety of pre-built components and capabilities are available inside Apama via the Industry Analytics Kit. A number of quality issues commonly arise with streaming events, and various fundamental analytics are regularly reused; what Software AG has done is to pre-build these components for you. It has also developed extended capabilities that link different components together, such as connecting a threshold breach to a missing-data analytic that looks for gaps in an otherwise regular series of breaches. Software AG has also developed a number of proprietary, industry-specific capabilities that are available in the Industry Analytics Kit.
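The chaining idea can be illustrated generically. In the following Python sketch – the function names and data are our own, not the Industry Analytics Kit's – the output of a threshold-breach analytic feeds a gap-detection analytic:

```python
def threshold_breaches(samples, limit):
    """First analytic: timestamps at which a value breaches a limit."""
    return [ts for ts, v in samples if v > limit]

def gaps(timestamps, expected_interval):
    """Second analytic: pairs of consecutive timestamps separated by
    more than the expected interval, i.e. where data appears missing."""
    return [(a, b) for a, b in zip(timestamps, timestamps[1:])
            if b - a > expected_interval]

# Chain the two: feed breach times into the gap detector.
samples = [(0, 110), (5, 112), (10, 90), (15, 95), (20, 115), (25, 111)]
breaches = threshold_breaches(samples, limit=100)   # [0, 5, 20, 25]
missing = gaps(breaches, expected_interval=5)       # [(5, 20)]
```

The gap between timestamps 5 and 20 flags a stretch where breaches that normally recur every 5 time units failed to appear.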
Apama also integrates bi-directionally with MashZone NextGen, Software AG’s self-service analytics environment for preparing, visualising and analysing data. In particular, both Apama and MashZone NextGen can be used to build real time, interactive dashboards (an example of which can be seen in Figure 1).
Software AG Apama was the first product of its type to be introduced and it has been, and remains, a market leader. Software AG’s streaming analytics portfolio, Apama included, is extremely comprehensive and performant. As evidence of the latter, Apama has been benchmarked (by Intel) at a throughput of approximately 70 million records per second on a single server, yet it is capable of running on hardware as modest as an ARM v7 processor or a Raspberry Pi.
Moreover, Software AG is ahead of the curve when it comes to IoT. Apama is deployable on IoT gateways and at the edge, while Software AG’s Cumulocity IoT platform (which integrates with Apama) supports more than 350 protocols used by a variety of vendors of sensors, industrial devices and gateways. Where other vendors in the space are only beginning to see IoT deployments, Software AG already has over 300.
The Bottom Line
From a streaming analytics point of view, it is difficult to think of anything that you might require that is not provided within Apama and its surrounding products. Moreover, the Industry Analytics Kit and Software AG’s accelerators mean that you can get started with an Apama-based project much more quickly than you might expect. In summary, the company provides an impressive product set for streaming analytics.
Software AG CONNX provides mainframe and legacy-oriented data integration, connectivity, data virtualisation and replication capabilities.
CONNX offers three main sets of capabilities: data access (allowing applications to access supported data sources in real-time), data virtualisation (allowing federated query access across data sources, typically either from or including mainframe environments) and data movement (classical extract, transform and load [ETL], as well as extract, load and transform [ELT], and/or change data capture).
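The data virtualisation concept – a single query federated across otherwise separate sources – can be illustrated with a small Python sketch. Here SQLite's ATTACH mechanism stands in for CONNX's (far more capable) federation engine, with two in-memory databases playing the roles of a legacy source and a cloud database:

```python
import sqlite3

con = sqlite3.connect(":memory:")           # the "federating" connection
con.execute("ATTACH ':memory:' AS legacy")  # stand-in for a mainframe source
con.execute("ATTACH ':memory:' AS cloud")   # stand-in for a cloud database

con.execute("CREATE TABLE legacy.orders (id INTEGER, cust INTEGER, amount REAL)")
con.execute("CREATE TABLE cloud.customers (cust INTEGER, name TEXT)")
con.executemany("INSERT INTO legacy.orders VALUES (?, ?, ?)",
                [(1, 100, 250.0), (2, 101, 99.5)])
con.executemany("INSERT INTO cloud.customers VALUES (?, ?)",
                [(100, 'Acme'), (101, 'Globex')])

# A single SQL statement joins across both "sources".
rows = con.execute(
    "SELECT c.name, SUM(o.amount) FROM legacy.orders o "
    "JOIN cloud.customers c ON o.cust = c.cust "
    "GROUP BY c.name ORDER BY c.name").fetchall()
# rows -> [('Acme', 250.0), ('Globex', 99.5)]
```

The point of a product such as CONNX is that the "attached" sources need not be relational at all: the federation layer presents non-relational and mainframe data through the same SQL interface.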
Connectivity options provide native SQL access to more than 150 data sources. These are not limited to mainframe or legacy environments but extend to platforms such as Hadoop and databases such as VoltDB, MongoDB, AWS RedShift, Azure DB and Snowflake.
The trend in the integration market is for vendors to offer all integration capabilities through a single platform. This includes not just database access but also Internet of Things (IoT) connectivity and integration, application integration, and API-based integration and management. In the case of Software AG these capabilities are spread across multiple products: CONNX, various webMethods offerings, and Cumulocity (for IoT connectivity).
Software AG has a clear strategy with respect to how CONNX and webMethods work in conjunction, though this is less obvious with respect to Cumulocity.
Leaving that aside, as a stand-alone data integration platform for accessing and leveraging a variety of data sources, CONNX has a lot to commend it. It is particularly noteworthy that Software AG has installed CONNX for new clients that are not mainframe users. Given that a major differentiator for the company is a depth of understanding of mainframe and legacy resources that few, if any, other companies can match, it is heartening that even non-mainframe users are interested in – and willing to implement – CONNX.
TrendMiner, a Software AG company
Last Updated: 28th February 2020
Mutable Award: Diamond 2019
TrendMiner is a self-service analytics solution designed for domain experts within the process manufacturing space, providing trend analysis and root cause identification in the case of production issues, and predictive capabilities. It addresses both batch-based process manufacturing, as in the chemical and pharmaceutical sectors, and continuous processes as in oil refining.
Customer Quotes
“A TrendMiner soft alarm on impurity concentration gives direct insight in product quality. As a result, truck loading can start earlier, lead time is shorter and the probability of rejection due to off-spec conditions has significantly decreased.” European Chemical Company
“TrendMiner is a user-friendly self-service analytics tool that provides us with tremendous time savings and delivers better quality results than data models. Process Engineers, Technologists, Run Engineers or Operators… Thanks to TrendMiner, they can unlock the power of the industrial data for their day-to-day questions.” TOTAL Refining and Chemicals
TrendMiner is used to analyse time-series data (Figure 1 provides a typical illustration) collected from perhaps millions of sensors spread across a process manufacturing site. There are three principal components within the product: TrendHub, ContextHub and DashHub. The first is used for actual trend and root cause analysis, but also allows you to monitor good process behaviour, providing early warnings; it may also be used with soft sensors. The second is used to capture automatically generated or manually entered operational events, and to augment the collected data with third-party contextual information from systems such as SAP PM, LIMS or ERP. The third is used to create analytics-driven visualisations for personal production cockpits. However, rather than continue to describe each of these separately, it will be more useful to describe how the solution works in practice.
Figure 2
Suppose, for example, that you are in a batch processing environment and some of your batches have been identified as sub-standard. You want to determine what has gone wrong. You start by identifying a baseline (Software AG calls it a fingerprint) of what a normal batch process should look like. You do this by taking the time-based profiles of multiple successful batches and overlaying them on one another, filtering by key performance indicator (KPI). This is a very simple and more or less automated process. You then compare – again in an automated fashion – the anomalous results with the ones you expect, with the results presented to you visually. Figure 2 represents the results of such an exploration: the grey, blue and purple fingerprints, each representing a different KPI, do not show any significant differences, whereas the deep red fingerprint diverges significantly, suggesting that there is a problem with whatever that fingerprint represents.
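The overlay-and-compare step can be sketched in a few lines of Python. This is our own simplified illustration, not TrendMiner's implementation: a "fingerprint" is built as a per-time-step mean and spread across good batches, and a suspect batch is flagged wherever it leaves that envelope:

```python
def fingerprint(batches):
    """Per-time-step mean and maximum deviation across good batches."""
    n = len(batches)
    means = [sum(vals) / n for vals in zip(*batches)]
    spread = [max(abs(v - m) for v in vals)
              for vals, m in zip(zip(*batches), means)]
    return means, spread

def divergences(batch, means, spread, slack=1.0):
    """Time-step indices where a batch leaves the baseline envelope."""
    return [i for i, (v, m, s) in enumerate(zip(batch, means, spread))
            if abs(v - m) > s + slack]

good = [[10, 20, 30, 20], [11, 21, 29, 19], [9, 19, 31, 21]]
means, spread = fingerprint(good)       # means [10, 20, 30, 20], spread [1, 1, 1, 1]
bad = [10, 20, 45, 20]
diverging = divergences(bad, means, spread)   # -> [2]
```

The suspect batch matches the fingerprint everywhere except the third time step, which is exactly the kind of divergence the deep red fingerprint in Figure 2 makes visible.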
Reverting to a more technical description of TrendMiner, notable additional features include both a recommendation engine and (patented) pattern recognition. In the case of the former, the engine – enabled by machine learning – will make suggestions about what might be the cause of a problem, which will be particularly useful when causes might be non-obvious. As far as the latter is concerned, the software can recognise recurring patterns and use these to support predictive maintenance and similar functions. This is further supported by the ability to raise alerts or trouble tickets in the event that a particular pattern, or combination of patterns, or the absence thereof, is identified as having (not) occurred. The product also includes a similarity search capability – searching for similar patterns – whereby you define how precise you want this similarity to be: 95%, 99% or whatever.
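As an illustration of the similarity-threshold idea – TrendMiner's actual pattern recognition is patented and its algorithm is not described here, so this Python sketch simply uses Pearson correlation rescaled to the 0–1 range – a pattern can be slid along a series and reported wherever the similarity score clears the chosen bar:

```python
def similarity(a, b):
    """Pearson correlation of two equal-length patterns, mapped to 0..1."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a) ** 0.5
    vb = sum((y - mb) ** 2 for y in b) ** 0.5
    return (cov / (va * vb) + 1) / 2 if va and vb else 0.0

def find_similar(series, pattern, min_similarity=0.95):
    """Offsets in the series where the pattern recurs."""
    k = len(pattern)
    return [i for i in range(len(series) - k + 1)
            if similarity(series[i:i + k], pattern) >= min_similarity]

series = [0, 1, 4, 1, 0, 0, 2, 8, 2, 0]
matches = find_similar(series, pattern=[0, 1, 4, 1, 0])  # offsets 0 and 5
```

Note that correlation-based similarity is shape-based, so the second occurrence is found even though its amplitude is doubled; raising `min_similarity` from 95% to 99% is the sketch's equivalent of tightening the precision dial the text describes.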
Some relevant geo-spatial capabilities are supported by TrendMiner through the use of APIs or via Software AG’s integration capabilities. For example, you can create a map of your assets that relates directly to asset-related sensor data as part of analysing their performance. In this way, the geo-spatial structure of your plant can help speed up root cause analysis. Similarly, there are facilities for, say, power grids or gas distribution companies to represent their assets on a map.
DARPA (the Defense Advanced Research Projects Agency) has stated that it believes the next generation of artificial intelligence will involve what it describes as “contextual (machine) learning”. This is precisely the space within which TrendMiner operates. In particular, it supports a self-service approach to analysis that does not require domain experts to have any knowledge of data science. This contextual strength, the fact that it integrates with other Software AG products such as webMethods and Cumulocity, and, not least, its core capabilities, make the product an obvious choice for shortlisting when it comes to process manufacturing. But therein lies the rub: there is a paucity of general-purpose analytics tools that work with time-series, and we would like to see TrendMiner extended, not just to discrete manufacturing but to time-series analysis more generally, particularly in support of predictive and prescriptive maintenance.
The Bottom Line
“Self-service” is a term that is increasingly bandied about, quite often when it shouldn’t be. However, it is an expression that you can genuinely apply to TrendMiner and we find the approach taken by the company to be very intuitive. It is highly recommended.