Concept: Apache NiFi offers a revolutionary data flow management system with extensive integration into existing ecosystems of data production, consumption, and analysis, all secured by robust data delivery and data provenance infrastructure. Learn about the sub-project Apache MiNiFi, which extends the reach of NiFi's power to the furthest edges. MiNiFi is a lightweight application that can run on hardware an order of magnitude smaller and less powerful than standard data collection platforms. Available as both JVM-based and native agents, MiNiFi enables data collection in brand-new environments: small-footprint sensors, distributed systems with intermittent or limited bandwidth, and disposable or short-lived hardware. You can prioritize this data or perform initial analysis at the edge, as well as immediately encrypt and protect it. Regional governance and regulatory policies can be applied along geopolitical boundaries to comply with legal requirements. And all of this can be configured from an existing NiFi deployment, under central control, using the stable data flow UI that flow administrators already know and trust.
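To give a flavor of what edge collection looks like in practice, below is a minimal sketch of the kind of config.yml a MiNiFi Java agent consumes: tail a local sensor log and ship it back to a central NiFi instance over Site-to-Site. The processor name, file path, hostname, and port names are illustrative, not from the talk.

```yaml
MiNiFi Config Version: 3
Flow Controller:
  name: edge-sensor-collection

Processors:
  # Illustrative: tail a local sensor log at the edge
  - name: TailSensorLog
    class: org.apache.nifi.processors.standard.TailFile
    scheduling strategy: TIMER_DRIVEN
    scheduling period: 1 sec
    Properties:
      File to Tail: /var/log/sensor.log

Connections:
  # Route tailed data to the remote NiFi input port
  - name: TailSensorLog/success/to-nifi
    source name: TailSensorLog
    source relationship names:
      - success
    destination name: From MiNiFi

Remote Process Groups:
  # Central NiFi instance (hypothetical URL) reached via Site-to-Site
  - name: Central NiFi
    url: https://nifi.example.com:8443/nifi
    Input Ports:
      - id: from-minifi-port
        name: From MiNiFi
```

In the workflow the talk describes, a flow administrator would design this flow in the NiFi UI and export it for the agent, rather than hand-writing the YAML.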
Required prior knowledge / target audience: Developers and data flow administrators. Some familiarity with Apache NiFi as a platform for routing, transforming, and delivering data through systems is helpful (a brief overview is provided). This talk focuses on extending NiFi's data collection, routing, data provenance, and command-and-control capabilities to IoT and edge integration via MiNiFi.
Key Points: Participants will learn about the opportunity to collect and capture data flows close to their source, at the "edge": IoT devices, vehicles, machinery, and the like. They will understand how to prioritize, filter, protect, and manipulate this data early in its lifecycle, and the potential improvements in data visibility and performance.