IoT — the Internet of Things — has moved from buzzword to business reality for many UK SMEs. Whether it's temperature sensors in a warehouse, telematics on a delivery fleet, energy monitors across a manufacturing site, or environmental sensors in a retail estate, the devices are already deployed. The problem is the data.
Most SMEs with IoT estates have the same experience: data is being collected, but it's siloed in the vendor's platform, difficult to combine with other business data, and largely unreported. The promise of IoT — real-time visibility, predictive maintenance, operational efficiency — remains unfulfilled.
Microsoft Fabric changes that equation significantly. This article explains how IoT data flows into Fabric, what the architecture looks like, and what it realistically takes to get it working for a business your size.
Why IoT data is different
Before getting into Fabric, it's worth understanding what makes IoT data distinctive — because the architecture decisions follow from the data characteristics.
Volume: A modest IoT deployment of 100 sensors emitting readings every 30 seconds generates roughly 8.6 million data points per month. This isn't transactional data; it's streaming data, and it needs infrastructure that can handle continuous ingestion without choking.
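As a back-of-the-envelope check (assuming a 30-day month):

```python
# Back-of-the-envelope volume estimate for a modest IoT estate.
sensors = 100
readings_per_hour = 3600 // 30   # one reading every 30 seconds
monthly_points = sensors * readings_per_hour * 24 * 30

print(monthly_points)  # 8640000 data points in a 30-day month
```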
Velocity: Many IoT use cases depend on near-real-time processing. A temperature alert that arrives 4 hours late is useless. The pipeline needs to move fast.
Variety: IoT data rarely arrives in the clean, structured format of a CRM export. Timestamps may be inconsistent, devices may go offline and back-fill, sensor drift introduces noise. Data quality is a first-class problem.
Value density: Raw IoT data has low value density. A single temperature reading is meaningless; patterns across thousands of readings over time are where the insight lives. Processing and aggregation are essential.
Where Microsoft Fabric fits
Microsoft Fabric is not an IoT platform. It doesn't replace Azure IoT Hub, Event Hubs, or the device connectivity layer. What it does is provide the destination — and increasingly, the processing layer — for IoT data once it's been collected.
The typical architecture for an SME IoT and Fabric integration looks like this:
1. Device layer: Your physical sensors, machines, or devices. These communicate via MQTT, HTTP, or proprietary protocols to a connectivity layer.
2. Ingestion layer: Azure IoT Hub or Azure Event Hubs. IoT Hub is purpose-built for device management and bidirectional communication; Event Hubs is simpler and better suited to pure data streaming. For most SME use cases, Event Hubs is sufficient and more cost-effective.
3. Fabric Real-Time Intelligence: Fabric's real-time capability — built on KQL (Kusto Query Language) and Eventstream — ingests the streaming data from Event Hubs, applies transformations, and routes it to storage and downstream consumers.
4. OneLake: The processed data lands in OneLake — Fabric's unified storage layer — where it becomes available to the rest of the Fabric platform: data pipelines, notebooks, warehouses, and Power BI.
5. Power BI: Real-time dashboards, trend analysis, alert visualisations. The layer your operations team actually interacts with.
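To make the device layer concrete, here is the kind of JSON payload a temperature sensor might publish into the ingestion layer. The field names and values are illustrative assumptions, not a standard schema — real devices vary widely:

```python
import json
from datetime import datetime, timezone

# Illustrative payload a temperature sensor might publish via MQTT or
# HTTP to Event Hubs. Field names are assumptions, not a standard.
reading = {
    "device_id": "cold-store-07",  # hypothetical sensor identifier
    "timestamp": datetime.now(timezone.utc).isoformat(),
    "temperature_c": 3.8,
    "battery_pct": 91,
}

payload = json.dumps(reading)
print(payload)
```

Whatever the exact schema, this is the shape Eventstream parses on ingestion before routing readings onward.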
Fabric's Real-Time Intelligence capability
Fabric's Real-Time Intelligence (RTI) workload is the component most directly relevant to IoT. It's worth understanding what it provides:
Eventstream is a no-code/low-code tool for ingesting streaming data from sources including Azure Event Hubs, IoT Hub, Kafka, and custom endpoints. You define the source, apply optional transformations (filtering, parsing, enrichment), and route the output to one or more destinations — a KQL database, a Lakehouse, or Power BI directly.
KQL Database (built on the same engine as Azure Data Explorer) is an extremely fast time-series database optimised for the kind of queries IoT data demands: "show me all temperature readings above 80°C in the last 6 hours, by sensor location." KQL is not SQL, but it's learnable in a day for anyone with SQL experience.
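A full KQL tutorial is out of scope here, but the shape of that query — filter by threshold and time window, group by location — can be mirrored in plain Python over synthetic data (location names and values are illustrative):

```python
from collections import defaultdict
from datetime import datetime, timedelta, timezone

now = datetime.now(timezone.utc)

# Synthetic readings: (location, timestamp, temperature in °C).
readings = [
    ("oven-hall", now - timedelta(hours=1), 85.2),
    ("oven-hall", now - timedelta(hours=8), 90.1),  # outside the window
    ("dispatch",  now - timedelta(hours=2), 81.5),
    ("dispatch",  now - timedelta(hours=3), 74.0),  # below threshold
]

# "All temperature readings above 80°C in the last 6 hours, by location."
cutoff = now - timedelta(hours=6)
by_location = defaultdict(list)
for location, ts, temp in readings:
    if ts >= cutoff and temp > 80.0:
        by_location[location].append(temp)

print(dict(by_location))  # {'oven-hall': [85.2], 'dispatch': [81.5]}
```

In a KQL database the equivalent runs as a `where` filter plus a `summarize` over billions of rows; the point of the engine is that this stays fast at IoT scale.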
Real-Time Dashboards in Power BI can refresh every few seconds, surfacing live operational data to your team without manual refresh cycles or scheduled reports.
A practical SME example
Consider a food manufacturing business with temperature monitoring across cold storage units, production lines, and dispatch areas. Currently, the temperature data lives in the sensor vendor's cloud portal — accessible but isolated. It can't be correlated with production runs, customer orders, or maintenance schedules.
With Fabric in place:
- Temperature events stream from the sensors via MQTT to Azure Event Hubs
- Fabric Eventstream ingests the stream, parses the JSON payload, and routes readings to a KQL Database
- A KQL query detects anomalies — readings outside tolerance — and writes alerts to a Lakehouse table
- A Power BI dashboard shows real-time temperature across all units, with colour-coded alerts for out-of-tolerance readings
- A daily pipeline joins temperature data with production run records to calculate cold-chain compliance per batch
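The anomaly-detection step in that flow amounts to a tolerance check. A minimal sketch in Python — the thresholds, unit names, and field names are illustrative, and in practice this logic would live in a KQL query or an Eventstream transformation rather than application code:

```python
# Tolerance band for a cold storage unit (illustrative values).
MIN_C, MAX_C = 0.0, 5.0

readings = [
    {"unit": "cold-store-01", "temperature_c": 3.2},
    {"unit": "cold-store-02", "temperature_c": 7.9},   # out of tolerance
    {"unit": "dispatch-03",   "temperature_c": -1.4},  # out of tolerance
]

# Flag readings outside tolerance; in the pipeline described above,
# these rows would be written to a Lakehouse alerts table.
alerts = [
    r for r in readings
    if not (MIN_C <= r["temperature_c"] <= MAX_C)
]

for a in alerts:
    print(f"ALERT: {a['unit']} at {a['temperature_c']}°C")
```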
The operations manager sees a live dashboard. The quality team gets automated compliance reports. The maintenance team gets predictive alerts before equipment failure. All from data that was already being collected — just not being used.
What it actually takes to implement
The architecture described above is achievable for an SME. The honest answer on complexity is: it depends on your starting point.
If your IoT data already flows into Azure Event Hubs or IoT Hub, connecting Fabric is relatively straightforward: Eventstream configuration, KQL database setup, and a Power BI dashboard build. A competent Fabric practitioner can have a working prototype running in days.
If your IoT data is currently in a vendor silo, the first question is whether the vendor provides an API or Event Hub integration. Most modern IoT platforms do. The integration work becomes a data extraction exercise before the Fabric piece begins.
If your devices communicate over proprietary protocols with no standard connectivity, you may need a gateway layer — typically a small edge device or Azure IoT Edge deployment — to bridge the protocol to MQTT or HTTPS. This adds complexity and cost, but is well-understood and solvable.
Data quality work is almost always needed. Raw IoT data rarely arrives clean. Timestamp normalisation, handling offline periods, filtering sensor noise, and dealing with duplicate messages are standard pre-processing tasks that need to be built into the pipeline.
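Two of those pre-processing tasks — timestamp normalisation and de-duplication — can be sketched as follows. The message shapes and formats are assumptions; real estates mix whatever formats their devices emit:

```python
from datetime import datetime, timezone

# Raw messages as they might arrive: mixed timestamp formats and a
# duplicate delivery (all values are illustrative).
raw = [
    {"device_id": "s-01", "ts": "2025-03-01T09:00:00Z",      "temp": 4.1},
    {"device_id": "s-01", "ts": "2025-03-01T09:00:00Z",      "temp": 4.1},  # duplicate
    {"device_id": "s-02", "ts": "2025-03-01 09:00:30+00:00", "temp": 3.7},
]

def normalise_ts(value: str) -> datetime:
    """Parse mixed ISO-ish timestamp formats into timezone-aware UTC."""
    value = value.replace(" ", "T").replace("Z", "+00:00")
    return datetime.fromisoformat(value).astimezone(timezone.utc)

seen = set()
clean = []
for msg in raw:
    ts = normalise_ts(msg["ts"])
    key = (msg["device_id"], ts)  # de-dupe on (device, timestamp)
    if key not in seen:
        seen.add(key)
        clean.append({**msg, "ts": ts})

print(len(clean))  # 2 rows survive de-duplication
```

In a Fabric pipeline the same logic typically sits in an Eventstream transformation or a notebook step, but the tasks themselves don't change.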
Costs to expect
The Azure infrastructure costs for a modest SME IoT deployment — 100–500 sensors, sub-second to minute-level frequency — are typically in the range of £100–£400/month, depending on message volume and retention requirements. Fabric capacity (F2 or F4) adds £200–£400/month. These are operational costs once the platform is built.
Implementation costs depend on complexity, but a focused engagement to connect an existing IoT estate to Fabric and deliver operational dashboards typically runs 4–8 weeks for a senior practitioner. That's a meaningful but one-off investment against ongoing operational value.
Is it worth it?
The honest answer: it depends on what you're trying to achieve and how much value is currently locked in your IoT data.
If your IoT deployment is already generating data that informs decisions — and you just need better access to it — Fabric integration has a clear and calculable ROI. If your IoT deployment is still proving out its business case, integrating with Fabric before you've validated the use case adds cost without clarity.
The right starting point is usually the same: understand what decisions you want to make with this data, and work backwards from there. Fabric can support almost any IoT analytics use case — but the architecture should follow the business need, not the other way around.
If you're working through an IoT data challenge and want an honest view of whether Fabric is the right solution, book a free assessment. We'll tell you what's realistic — and what it would cost to get there.
