
Azure Stream Analytics for Real-Time Fraud Detection

The following is a guest blog post by Dmitri Riz from the InfoStrat Tech Blog.

Microsoft Azure Logic Apps is a service that allows you to design and build scalable solutions for app integration, data integration, system integration, enterprise application integration (EAI), and business-to-business (B2B) communication, whether in the cloud, on premises, or both.

One of the features I wanted to explore in more detail is the support for real-time fraud detection for transactions originating from Dynamics 365 or, speaking more broadly, the ability to detect and flag irregularities in incoming data streams in real time.

The tool to support this is Azure Stream Analytics, a fully managed, real-time analytics service designed to help you analyze and process fast-moving streams of data to derive insights, build reports, or trigger alerts and actions.

The plan is to set up an Azure Event Hub that receives real-time Dynamics 365 transaction data, and a Stream Analytics job that scans this Event Hub with SQL-like queries to detect suspicious data.

Note that this approach uses deterministic SQL queries to discover fraud and other abnormalities in the data stream, and relies on pre-created logic to describe fraudulent patterns. I will look at how to apply the magic sauce of AI / machine learning to the same problem in a follow-up post.

Azure Event Hub

The stream analytics job in this example will use data ingested into an Azure Event Hub. Event Hubs is a highly scalable data ingestion service that integrates with many Azure services and serves as the front end for many real-time processing solutions.

To set it up, you need to create an event hub namespace, and add a new event hub to it (see walkthrough here). The event hub will ingest our event data and will provide the input stream for the stream analytics job.

I have created a namespace dr-fd-post and a single event hub in this namespace called dr-fd-posthub.

Azure Logic App

I want to capture a transaction data stream from a Dynamics 365-based system. I'm going to use an Azure Logic App to send new transactions to my event hub.

To keep things simple for this post, my Logic App sends the entire new record to the event hub.

The Logic App is triggered by the creation of a new record of type Payment in the connected Common Data Service (CDS) environment.

The next step copies the record into a JSON object that can be used for additional manipulation. The final Send Event step uses the new record's data to create a new event in my event hub (my event hub is configured to ingest events as JSON objects).
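For reference, an event posted by the Logic App might look something like the JSON below. The field names are hypothetical; the actual shape depends on the fields of your CDS Payment entity:

```json
{
  "paymentid": "b9c2f1e0-0000-0000-0000-000000000001",
  "payeeid": "a1d4e7c2-0000-0000-0000-000000000042",
  "amount": 1250.00,
  "averageamount": 310.50,
  "createdon": "2019-10-01T14:32:07Z"
}
```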

At this point, I want to make sure that my CDS data is correctly ingested into the Azure Event Hub. I'm going to create a few Payment records in Dynamics and use the EventHub Viewer Windows app to verify that it's working. I can see my new Payment records in Dynamics.

Next, I'm going to fire up the EventHub Viewer and select the correct Azure login and subscription.

The next validation step is to connect to my event hub. I see that new records do indeed get ingested by the Azure Event Hub.

Stream Analytics
With the ingestion part of the pipeline configured, I am ready to set up a new Stream Analytics job to read the incoming Payments data collected in my event hub. A Stream Analytics pipeline consists of an input, a query, and an output. The diagram below illustrates potential Stream Analytics pipelines (see this article for more details).

Navigate to Create a resource -> Internet of Things -> Stream Analytics job in the Azure Portal to add it.
After a new job is created, add an Event Hub stream input to the job (a single job may have multiple inputs). 
I've chosen the dr-fd-posthub as the data source for my single input stream.

Transformation Query
The core of the solution is a Stream Analytics query that detects anomalous data in the ingested real-time event stream and flags it for downstream processing. I'm going to use the query builder included in the Azure management interface for Stream Analytics.

To build the query, I need a good idea of the types of irregularities that might appear in my data stream. For example, to detect suspicious phone calls, I might look for the same calling number making calls from different locations within a short window of time.

In my example of scanning payment transactions, I want to find all transactions where the payment amount significantly deviates from the historical average transaction amount that my system calculates and maintains for each payee. While this example is simplistic, it is enough to illustrate the potential of the tool.

In the screenshot below, the query reads a subset of fields from posted events and calculates a simple deviation of the payment amount from the average amount.
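A sketch of such a query in Stream Analytics Query Language might look like the following. The input alias PaymentsInput, the field names, and the 50% threshold are assumptions; adjust them to your own job configuration and entity schema (FraudSink is the output alias configured later in this post):

```sql
-- Flag payments that deviate more than 50% from the payee's
-- historical average. Aliases and field names are illustrative.
SELECT
    paymentid,
    payeeid,
    amount,
    averageamount,
    (amount - averageamount) / averageamount AS deviation
INTO
    FraudSink
FROM
    PaymentsInput
WHERE
    averageamount > 0
    AND ABS((amount - averageamount) / averageamount) > 0.5
```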

With the filter applied, I see some records that fit my detection criteria and merit additional attention.

Stream Analytics Query Language provides many specialized facilities for advanced stream analysis, including geospatial, windowing and time management analysis functions. 

For example, I could have used a TUMBLINGWINDOW function to organize my event stream into a sequence of time-based windows, and group transactions by payee within each window to detect payees who post transactions suspiciously often.
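A windowed query of that kind might be sketched as follows. The five-minute window, the threshold of ten transactions, and the field names are illustrative assumptions:

```sql
-- Count payments per payee in five-minute tumbling windows and
-- flag payees posting more than 10 transactions per window.
SELECT
    payeeid,
    COUNT(*) AS paymentcount,
    System.Timestamp() AS windowend
FROM
    PaymentsInput TIMESTAMP BY createdon
GROUP BY
    payeeid,
    TUMBLINGWINDOW(minute, 5)
HAVING
    COUNT(*) > 10
```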

Anomaly detection functions such as AnomalyDetection_SpikeAndDip use machine learning models to compute a p-value score indicating how anomalous a given event is relative to the recent events in a sliding window over the input stream.
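Applied to the payment stream, such a function might be used along these lines. The parameters (95% confidence, a 120-event history, a 30-minute sliding window) and the field names are illustrative:

```sql
-- Score each payment amount against the payee's recent history.
-- The confidence level (95), history size (120 events) and
-- window duration are illustrative parameters.
WITH AnomalyDetectionStep AS
(
    SELECT
        payeeid,
        amount,
        AnomalyDetection_SpikeAndDip(amount, 95, 120, 'spikesanddips')
            OVER (PARTITION BY payeeid LIMIT DURATION(minute, 30)) AS scores
    FROM PaymentsInput TIMESTAMP BY createdon
)
SELECT
    payeeid,
    amount,
    CAST(GetRecordPropertyValue(scores, 'Score') AS float) AS spikeanddipscore,
    CAST(GetRecordPropertyValue(scores, 'IsAnomaly') AS bigint) AS isanomaly
FROM AnomalyDetectionStep
WHERE CAST(GetRecordPropertyValue(scores, 'IsAnomaly') AS bigint) = 1
```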

Job Output
The last step is to define an output for my job. I could send the query results to a SQL Database or even to an Azure Function (if I wanted to update a suspicious transaction record in Dynamics 365) but for the purposes of this post, I just want to store the results of stream analysis. The simplest way to do it is to configure an Azure Blob Storage container.

To do this, I have created a new Storage account and a new container dr-fraudsink in it and added a single job output pointing to this container using FraudSink as an output alias.

My job now has an input, a transformation query, and an output. I can start the job and monitor Payment event creation and detection. After a few Payment records are created in Dynamics 365, I switch to my Azure Blob container. In the Azure portal I can see a new blob created by my stream analytics job, with two records flagged as potentially fraudulent.

This approach can be used for a wide variety of tasks, including real-time alerts, self-updating data visualizations and dashboards, and various IoT applications.

About the author: Dmitri Riz leads InfoStrat's Dynamics practice.  He has served as technical lead and architect for multiple enterprise-wide projects. 
