Realtime Enterprise for IoT and beyond

Frank Stienhans


Posted on August 25, 2016


In my previous post, we zoomed into realtime feeds from your IoT devices to SAP systems. I mentioned that an enterprise should not use the databases underneath its core business processes (ERP, CRM, SCM ...) to perform combined IoT processing. There are several reasons for this:

  1. Security: IoT devices often live in public spaces and can potentially be physically accessed by any individual. The Core Business Backbones sit at the opposite end of the security spectrum. Even with security gates in front, you should not feed your device data into your core business backbone.
  2. Data Volume: With successful IoT-based products in the market, it is very likely that IoT data volume will soon outgrow your core enterprise data.
  3. Utilization: Performing Combined IoT Processing in the database running your core business is likely to impact your core business processes due to the massive amount of extra load.
  4. Innovation Speed: The Core Business Backbone evolves at a monthly, quarterly, or even slower cadence. IoT intelligence often needs to evolve daily or faster, including A/B testing etc.
  5. Uptime: You always want to reduce the blast radius of a downtime. Also, the downtime impact of the core business processes and of Combined IoT Processing is significantly different.
  6. Choices: Decoupling Core Business Backbone technology from the technology for Combined IoT Processing provides you with more choices on how to best architect your overall landscape.

High Level Architecture

I think we can jointly agree that this architecture is the logical conclusion:

While you should run both Combined IoT Processing and your Core Business on a large-scale cloud (as explained last week), you should run them in separate (AWS...) accounts and networks for maximum isolation. Please note that in practice this does not need to lead to extra cost, slower performance, or reduced central governance. There are companies with hundreds of AWS accounts for exactly those reasons. (And I was fortunate to assist some of them.)

Now, how do we pinch holes into those firewalls so that we can get data from the IoT devices and the enterprise backbone into our IoT Processing area? 

Well, we don't.

In my previous post I described how we use AWS Platform Capabilities to get a realtime IoT feed into SAP HANA (as example).

A realtime feed from your Core Business

What are the options to get a realtime feed from the core business backbone running on SAP HANA Systems? I analyzed a number of SAP product options, including Smart Data Integration, Smart Data Streaming, HANA System Replication, HANA Table Replication and others.

I wanted a solution that is lightweight, has significant scaling capability, and does not lock me into a particular way of thinking. We settled on an approach that builds on Smart Data Adapters. Smart Data Adapters allow SAP HANA to query databases from other vendors. We can also turn that around and write through the same channel into other databases. For security and a list of other reasons, however, we do not want to write directly into the Combined-IoT-Processing clusters.

We created a special Kinesis "producer agent" that exposes a database server interface via ODBC. HANA interacts with this "database" via the Smart Data Adapter. In SAP HANA Studio, the Amazon Kinesis integration into our SAP HANA system looks like this:

In our implementation, SAP HANA table triggers issue INSERT and UPDATE commands to SAP HANA virtual tables, which are translated into Amazon Kinesis stream records. The agent operates transiently (no disk I/O) and statelessly, with low memory and CPU utilization. Because it is based on HANA table triggers, it does not put significant load on your HANA server.
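To make the trigger-to-stream translation concrete, here is a minimal sketch in Python. This is a hypothetical illustration, not the actual Ocean9 agent: the function, table, and column names are assumptions. It shows how a captured INSERT or UPDATE on a HANA table might be packaged into an Amazon Kinesis record, using the row's primary key as the partition key so that all changes to the same row stay ordered within one shard.

```python
import datetime
import json

def change_to_kinesis_record(table, operation, row, key_column):
    """Package one captured table change as a Kinesis record dict.

    `table`, `operation` ('INSERT'/'UPDATE'), `row` (column -> value),
    and `key_column` are illustrative names, not the real agent's API.
    Using the primary key as PartitionKey keeps all changes to the
    same row in order within a single Kinesis shard.
    """
    payload = {
        "table": table,
        "op": operation,
        "row": row,
        "captured_at": datetime.datetime.utcnow().isoformat() + "Z",
    }
    return {
        "Data": json.dumps(payload).encode("utf-8"),
        "PartitionKey": str(row[key_column]),
    }

# In production, the agent would hand this dict to the Kinesis API,
# e.g. boto3.client("kinesis").put_record(StreamName=..., **record).
# Here we just build and inspect the record locally.
record = change_to_kinesis_record(
    "SALES_ORDERS", "INSERT", {"ORDER_ID": 4711, "STATUS": "NEW"}, "ORDER_ID"
)
print(record["PartitionKey"])  # prints 4711
```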

A single Amazon Kinesis stream scales to terabytes of data per hour and has a Triple-AZ Service Architecture enabling high end uptime and durability, just like Ocean9 for SAP HANA Systems.

At the receiving end of this Kinesis Stream you can have:

  • SAP HANA (see last week's post)
  • Spark Streaming (via the standard adapter for Amazon Kinesis)
  • Amazon ElasticSearch Service (via Amazon Kinesis Firehose)
  • Amazon Redshift (via Amazon Kinesis Firehose)
  • Amazon S3 (via Amazon Kinesis Firehose)
  • your own Kinesis consumer adapter for any technology of your choice.
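The last option above, a custom consumer, is straightforward to sketch. The decoder below assumes the JSON record layout of a change-producing agent (the `table`/`op`/`row` fields and the stream name are assumptions for illustration), and the commented-out loop mirrors the standard shard-iterator polling pattern of the Kinesis API.

```python
import json

def decode_change_record(data_bytes):
    """Decode one Kinesis record body produced by a JSON change agent.

    The 'table'/'op'/'row' layout is an assumed example format, not a
    standard one."""
    event = json.loads(data_bytes.decode("utf-8"))
    return event["table"], event["op"], event["row"]

# Sketch of the standard shard-iterator polling loop (requires boto3
# and AWS credentials, so it is not executed here):
#
#   kinesis = boto3.client("kinesis")
#   shard_it = kinesis.get_shard_iterator(
#       StreamName="hana-change-stream",   # assumed stream name
#       ShardId="shardId-000000000000",
#       ShardIteratorType="LATEST",
#   )["ShardIterator"]
#   while True:
#       out = kinesis.get_records(ShardIterator=shard_it, Limit=100)
#       for rec in out["Records"]:
#           table, op, row = decode_change_record(rec["Data"])
#           ...  # apply the change to the target of your choice
#       shard_it = out["NextShardIterator"]

sample = json.dumps({"table": "SALES_ORDERS", "op": "UPDATE",
                     "row": {"ORDER_ID": 4711, "STATUS": "SHIPPED"}}).encode()
print(decode_change_record(sample))
```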


Amazon Kinesis is, simply put, an interface to the world of realtime stream processing.

With this solution we fulfill all of the objectives:

  • All traffic is encrypted at rest and in motion.
  • We did not pinch any holes in firewalls.
  • We did not put significant load on your core business backbones.
  • We decoupled the desire for IoT innovation from the need to protect your core business backbone.
  • We enable you to decide what is best for your company, whether it is the awesome SAP HANA engine or any of the other great technologies out there.

While I approached this topic through the lens of IoT, I hope you realize that this solution goes well beyond the IoT use case. How about a selective realtime backup stream for your most mission-critical data, storing the result e.g. in Amazon S3 with 99.999999999% durability?
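As a sketch of that backup idea (the key layout and names below are assumptions, not a prescribed scheme): a Kinesis consumer could write each change record to Amazon S3 under a time-partitioned key, so the bucket becomes a continuous, browsable change log of your most critical tables.

```python
def backup_object_key(table, op, sequence_number, captured_at):
    """Build a time-partitioned S3 key for one change record.

    The date/table/sequence layout is an assumed convention; Kinesis
    sequence numbers are unique within a shard, which keeps the keys
    collision-free."""
    date = captured_at[:10]  # 'YYYY-MM-DD' prefix of an ISO timestamp
    return f"backup/{date}/{table}/{sequence_number}-{op}.json"

key = backup_object_key("SALES_ORDERS", "UPDATE",
                        "49590338271490256608",  # example sequence number
                        "2016-08-25T12:00:00Z")
print(key)
# In production, a consumer would then call
# boto3.client("s3").put_object(Bucket=..., Key=key, Body=record_json).
```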

(Kinesis is such an AWESOME service that it can be taken into so many directions.)

By enabling a core business backbone database to provide a realtime stream of changes and updates, we have essentially turned a relatively static asset (from a lifecycle perspective) into a core for business and IT innovation.

You are free to decide about the best solution for your Enterprise.

Ocean9 and our growing Partner Network are happy to assist you on your journey to and inside the cloud.

We hope you liked it. Please, let us know what you think.
