IIC Journal of Innovation 17th Edition Applying Solutions at the Digital Edge | Page 31

Driving Industry 4.0 at Distributed Edges with Cloud Orchestration
when connection is not available. Once connectivity is restored, the buffered data is sent to its destination in a manner that avoids distributed conflicts between edge and cloud systems.
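The store-and-forward behavior described above can be sketched as a local buffer that queues records while the cloud is unreachable and replays them in order on reconnect. The class, the idempotency-key scheme, and the in-memory queue are illustrative assumptions, not the paper's implementation:

```python
import time
from collections import deque

class StoreAndForwardBuffer:
    """Buffers edge records while the cloud is unreachable and flushes
    them oldest-first once connectivity returns. A unique idempotency
    key per record lets the cloud side deduplicate replays, which is
    one simple way to avoid edge/cloud conflicts."""

    def __init__(self, send_fn):
        self._send = send_fn     # callable(record) -> bool (True on cloud ack)
        self._pending = deque()  # in-memory stand-in for durable local storage

    def record(self, key, payload):
        # Each record carries its own key and timestamp so redelivery is safe.
        self._pending.append({"key": key, "ts": time.time(), "payload": payload})

    def flush(self):
        # Send oldest-first; stop at the first failure so ordering is preserved.
        sent = 0
        while self._pending:
            if not self._send(self._pending[0]):
                break
            self._pending.popleft()
            sent += 1
        return sent
```

A production variant would persist the queue to local disk so buffered data survives an edge-node restart.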
Relevant data for mission-critical applications at the edge. The business data required to run these business-critical and latency-sensitive applications must always be available locally. Decisions on what data is synchronized with the cloud, and at what intervals, may differ from one scenario to another. In the context of manufacturing processes, business application configuration data, master data, manufacturing orders, and execution orders are examples of data to be buffered on the corresponding edge nodes from the cloud. In the context of mission-critical execution, master data at the edge might consist of Bills of Material (BOMs), routings, operation activities, materials, resources, and work centers.
Depending on the complexity and criticality of the processes, enough business data should be sent from the cloud to the edge so that business applications can run “offline” for a pre-set time interval (e.g., n shifts or batches) without connectivity. Further requirements regarding compliance with data security, privacy, and sovereignty regulations (e.g., GDPR) affect the choice of what data must reside exclusively at the edge, be synchronized with the cloud, or reside exclusively in the cloud. A detailed discourse on these topics, however, requires further discussion in future work.
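The per-data-class decisions above (what to synchronize, how often, and where data may reside) can be sketched as a declarative policy. The data classes, field names, and the two-shift offline window below are hypothetical examples, not values from the paper:

```python
from dataclasses import dataclass

@dataclass
class SyncPolicy:
    data_class: str        # e.g. "BOM", "manufacturing_order" (illustrative names)
    interval_minutes: int  # how often the edge refreshes this data from the cloud
    residency: str         # "edge_only", "synchronized", or "cloud_only"

# Hypothetical sizing target: the edge must run offline for two 8-hour shifts.
OFFLINE_WINDOW_MINUTES = 2 * 8 * 60

def covers_offline_window(policy: SyncPolicy, window_minutes: int) -> bool:
    """True if data under this policy stays usable for the whole offline
    window: edge-resident data is always local, cloud-only data is
    unavailable offline, and synchronized data must be refreshed at
    least once per window so the edge copy is recent enough."""
    if policy.residency == "edge_only":
        return True
    if policy.residency == "cloud_only":
        return False
    return policy.interval_minutes <= window_minutes
```

Such a check could run at deployment time to verify that every data class a business-critical application depends on is pre-staged for the configured offline interval.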
Architecture – Extending the capabilities of the cloud to the edge. This section details the edge computing architecture for mission-critical and latency-sensitive scenarios based on [21][15]. The architecture is built from standardized, industrialized components that facilitate plug-and-play operation and easy deployment. One of the main innovations of the architectural approach is the combination of capabilities from cloud-native and on-premises solutions to provide standardized business applications and reliable data synchronization at the edge.
Modern edge computing approaches build on cloud-native principles while addressing the edge's distinguishing characteristics, such as constrained computing and storage resources, heterogeneous data ingestion, intermittent connectivity, and strict latency requirements. Business applications and the orchestration infrastructure are developed and realized following modern paradigms such as containerization and de-facto industry standards such as Kubernetes.
Some benefits of cloud-native principles are elasticity (the ability to automatically scale computing resources with the current load to ensure the availability and performance of services), resilience (the ability to provide a service even when certain components of that service fail), performance, modularity and reusability, resource efficiency, automation, and observability. The result is high availability of services and adherence to designed response times. All of these characteristics are relevant for distributed edge computing deployments because they address the specific requirements of business-critical manufacturing processes.
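One common way the resilience property mentioned above is realized for intermittent edge-to-cloud links is retry with exponential backoff and jitter. The following is a generic sketch; the function name, parameters, and the use of `ConnectionError` as the transient failure signal are illustrative assumptions:

```python
import random
import time

def send_with_backoff(send_fn, record, max_attempts=5, base_delay=0.5,
                      sleep=time.sleep):
    """Call send_fn(record), retrying transient connection failures.
    Delays grow exponentially per attempt, with full jitter so many
    edge nodes reconnecting at once do not retry in lockstep."""
    for attempt in range(max_attempts):
        try:
            return send_fn(record)
        except ConnectionError:
            if attempt == max_attempts - 1:
                raise  # give up after the final attempt
            # Full jitter: sleep a random time in [0, base_delay * 2^attempt).
            sleep(random.uniform(0, base_delay * 2 ** attempt))
```

Injecting `sleep` as a parameter keeps the sketch testable; in production the default `time.sleep` would be used.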
At the same time, the edge computing approach presented in this paper provides flexibility, extensibility, and individualization regarding the deployment and configuration of business application