The solution architecture is illustrated below. During deployment, 90 days of historical events were generated and stored in the Fabric EventHouse database. After deployment, the Telemetry Data Simulator can be invoked at any time to generate real-time events; see the Event Simulator Guide for usage details. The real-time data is ingested into Event Hub, which forwards it to a Fabric EventStream, and the EventStream ingests the data into EventHouse. The events table in the EventHouse database holds the combined historical and real-time data, which the real-time intelligence dashboard queries with Kusto Query Language (KQL). The EventStream data also feeds the Activator, which has built-in intelligence to detect anomalies and send notifications via email; it can also be configured to post alerts to a specific Teams channel. For more information, see the Activator Guide.
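As a rough sketch of what the Telemetry Data Simulator emits, the snippet below builds one telemetry event. The event schema, field names, and value ranges here are assumptions for illustration, not the accelerator's actual event format; in the real solution the serialized payload would be sent to Event Hub.

```python
import json
import random
import uuid
from datetime import datetime, timezone

def make_event(device_id: str) -> dict:
    """Build one hypothetical telemetry event.
    The schema is illustrative, not the simulator's actual format."""
    return {
        "eventId": str(uuid.uuid4()),
        "deviceId": device_id,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "temperature": round(random.uniform(18.0, 30.0), 2),
        "humidity": round(random.uniform(30.0, 70.0), 2),
    }

# Serialized body that would be published to Event Hub,
# then flow through EventStream into the EventHouse events table.
payload = json.dumps(make_event("device-001"))
print(payload)
```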
Since the 90 days of historical data were generated during deployment, they will become stale over time. To address this, a Fabric Data Ingester (not shown in the diagram) can be used to refresh the events data, replacing the old data with a fresh 90-day window. See the Fabric Data Ingestion Guide for usage details.
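The refresh idea can be sketched as regenerating a rolling 90-day window of events ending at the current time. The function name, event rate, and schema below are hypothetical; the actual refresh mechanism is described in the Fabric Data Ingestion Guide.

```python
import random
from datetime import datetime, timedelta, timezone

def generate_history(days: int = 90, events_per_hour: int = 1) -> list[dict]:
    """Generate a fresh rolling window of historical events ending now.
    Hypothetical stand-in for the Fabric Data Ingester's refresh step."""
    now = datetime.now(timezone.utc)
    step = timedelta(hours=1) / events_per_hour
    ts = now - timedelta(days=days)
    events = []
    while ts < now:
        events.append({
            "timestamp": ts.isoformat(),
            "temperature": round(random.uniform(18.0, 30.0), 2),
        })
        ts += step
    return events

# These rows would replace the stale rows in the events table,
# so the dashboard always shows a current 90-day window.
history = generate_history()
```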
Once data is ingested into the Fabric EventHouse KQL database, you can use the Fabric Data Agent to chat with your operational data. You can ask business questions without writing any KQL: the Fabric Data Agent translates your question into the appropriate KQL queries, executes them, and translates the technical results back into business terms to answer your question. For more information, see the Fabric Data Agent Guide.
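To make the translation step concrete, here is an illustration of the kind of KQL the agent might generate for a business question against the events table. The column names and the exact query are assumptions; the agent's actual output will depend on your schema.

```python
# Hypothetical pairing of a business question with the KQL the
# Fabric Data Agent might generate for it; columns are assumed.
question = "What was the average temperature per device over the last 24 hours?"

kql = """
events
| where timestamp > ago(24h)
| summarize avg_temperature = avg(temperature) by deviceId
| order by avg_temperature desc
""".strip()

print(kql)
```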
In summary, this solution accelerator provides a working solution with data simulators, giving you a live environment in which to observe or demo its capabilities.
