
Monday, November 4, 2019

Azure Cognitive Services: Computer Vision Image Analysis using Python

Computer Vision using Python -


There are scenarios in which you might need to obtain information from images, such as:
1) Tag visual features
2) Detect objects in the image
3) Detect brands
4) Categorize an image
5) Describe an image
6) Detect faces
7) Detect image types
8) Detect domain-specific content
9) Detect the color scheme
10) Generate a thumbnail
11) Get the area of interest

This blog covers using the Computer Vision service from Azure to describe an image, running it locally based on the example Microsoft provides in Python. Given below are the steps to do this on Windows 10 -

My Environment:
OS: Windows 10
Python Version: 3.7.4
This assumes you have an Azure account; if not, you can sign up for 12 months free (https://azure.microsoft.com/en-in/free/).

Things might have changed by the time you implement this, as APIs get deprecated.

Actions on the Azure Portal -

Step 1 - Log in to the portal, search for Cognitive Services and click on Add; you should see the content shown in the snapshot below -




Step 2 - Search for Computer Vision; you should see the service provided by Microsoft. Click on Create.




Step 3 - Once you click Create, you will be shown the screen below. Fill in the details and make sure to choose F0 (free tier) if you have signed up for a free account.




Step 4 - Once it is created, please make sure you note down the Key and Endpoint by clicking on Quick start, as shown in the image below.


Once you have these two values, let's start preparing the environment for running the code.


Actions on the workstation -

1) Install python on Windows 10: https://www.python.org/downloads/windows/

To validate the installation open command prompt and use the command python --version

I have used Python 3.7.4 for this.

2) Install below libraries -
matplotlib - pip install matplotlib
pillow - pip install pillow

3) Create two environment variables called COMPUTER_VISION_ENDPOINT, COMPUTER_VISION_SUBSCRIPTION_KEY on your machine with the values you got from Azure portal Step 4.
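To quickly confirm Python can see them, here is a minimal sanity check (open a new command prompt after setting the variables so they are visible to the process):

import os

# Both of these should print the values you copied from the portal;
# None means the variable is not visible to this process yet.
print(os.environ.get('COMPUTER_VISION_ENDPOINT'))
print(os.environ.get('COMPUTER_VISION_SUBSCRIPTION_KEY'))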

4) Use an image of your choice and place it at a path on your computer; in my case it is C:\Users\Bharathrs\Documents\Images\


5) Create a file, call it imageanalysis.py, and add the code below -
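Below is a minimal sketch, based on Microsoft's Python quickstart for the Analyze Image REST endpoint. Besides matplotlib and pillow, it also needs the requests package (pip install requests); the file name sample.jpg is just a placeholder for your own image.

# imageanalysis.py - a minimal sketch of calling the Computer Vision
# Analyze Image REST endpoint and showing the generated caption.
import os

import requests
import matplotlib.pyplot as plt
from PIL import Image

# Values created in "Actions on the Azure Portal" Step 4; the endpoint
# is assumed to end with a trailing slash.
endpoint = os.environ['COMPUTER_VISION_ENDPOINT']
subscription_key = os.environ['COMPUTER_VISION_SUBSCRIPTION_KEY']
analyze_url = endpoint + "vision/v2.0/analyze"

# Placeholder image path - point this at your own file.
image_path = r"C:\Users\Bharathrs\Documents\Images\sample.jpg"
with open(image_path, "rb") as image_file:
    image_data = image_file.read()

headers = {"Ocp-Apim-Subscription-Key": subscription_key,
           "Content-Type": "application/octet-stream"}
params = {"visualFeatures": "Categories,Description,Color"}

response = requests.post(analyze_url, headers=headers, params=params, data=image_data)
response.raise_for_status()
analysis = response.json()
print(analysis)

# Display the image with the first generated caption as the title.
caption = analysis["description"]["captions"][0]["text"]
plt.imshow(Image.open(image_path))
plt.axis("off")
plt.title(caption)
plt.show()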


Once it is done, run the program using the command python imageanalysis.py and you should see a JSON response as shown in the image below, containing all the descriptions regarding the image.

Sunday, August 11, 2019

Power BI in Windows application using Graph UI

We have all embedded Power BI on a web page, which is the most widely used way of exposing dashboards or reports to end users. With Windows 10 improving, app usage has grown as well, so there has been an increase in requests from end users to have the reports as an app instead of typing a URL. The way some teams did this was to embed Power BI in a web page and then host that page inside a UWP app, because Microsoft did not have a native control that allowed you to add Power BI reports to an app. That is an unnecessary overhead when clients are looking only for an app, especially in the case of remote site operations. To overcome this, Microsoft has released a control in Graph UI.

How to do this?
Well, it is a very simple process and requires hardly 30 minutes to get up and running on a dev environment by following the steps below -
1) Have a Power BI Pro account; otherwise you need an Azure Active Directory tenant and a master user.
2) Fill in the onboarding document.
    This requires your Step 1 details.
    Then enter a name for your application and click on Select All. This step grants the app access to the APIs it needs; then click on Register.


    Please copy the application ID which you will obtain after this step.
    Create a workspace.


    
Import the Power BI report which you want to embed in the UWP app; if you do not have one, then import a sample Power BI report.

    The last step will be to grant permissions, which can be done by clicking on the 'Grant Permission' button.




    Once you do this copy the details as shown in the image below.
   



3) Creating UWP App -
    Now to create your application, download Visual Studio Community edition 2019 if you do not have Visual Studio 2015 or above.
    Once you have it installed, create a new UWP project.



   
    Enter the project name and click on Create.


   
    Then VS will ask you to choose the target version and minimum version for the app; you can leave the defaults and click on OK.
   
    Once it is done you should be able to see the project structure as shown in the image below. 


   
   
    You will not have the Graph UI controls by default. To add them, right-click on 'References' and click on 'Manage NuGet Packages' as shown in the image below.


   
   
    Search for 'Microsoft.Toolkit.Uwp.UI.Controls' in the browse window and click on install.


   
   
    Now if you browse the toolbox you should be able to see the Graph UI control and also see Power BI embedded. 


   
    Click on MainPage.xaml and you should see a sample screen along with the XAML code.

   
    Add the reference of graph UI into this by adding the below code -
   

   
    Add the below code with the App ID and GroupID (WorkspaceID) which was obtained in the above step.
   
   
   
   
    Your final code should look like below -
  
   
    Once it is done, you can click on the Run button; the app will load in debug mode and you should see the app as below.

   
    You will be shown a login page, enter your credentials and then it should load the sample app / report which would have been added based on the selection done while registering.
   
   
   
    There can be possible blocking points, such as -
   
    1) If you have not enabled developer settings on your system -
   
    In this case, search for Developer settings in the Start menu and click on Developer Mode. Once done, it should install and enable the mode; if it requires admin privileges, request the concerned team to enable it for you.


   
   
    2) If you have an exception stating debugger is not attached, then change the debug environment to 64 bit and it should start working for you as shown in the image below.
   
    before -  

   
    after - 

   
    Else right click on the project and go to the debug option and change Debugger type to Mixed mode.
   
   
    As you can see, it is pretty easy and can be used in a lot of cases along with your UWP app. Hope it was informative; do let me know if you have any questions on the same.

Saturday, August 10, 2019

To add Highcharts in Power BI as a Custom Visualization


One week back....

I have been working on Power BI for a long time now. Although Power BI is easy to use and has monthly releases, what frustrates me is the lack of charts that other reporting tools have; for example, Highcharts and D3 have so many visualizations. So I was looking for an option to add custom charts into a dashboard, as we are working on reports for streaming data. I came across documents describing how to integrate third-party charts into Power BI. This is a very nice feature, considering the wide variety of charts coming up in the market, until Power BI catches up with the competition.


In this blog I will be adding Highcharts into a Power BI report. Why Highcharts? Of all the charting libraries I have come across, Highcharts has the best repository of charts, and you can easily incorporate the charts into your web page. You can also see that they are actively evolving the library to provide rich visualizations to end users.

Now, let's see how to add Highcharts into Power BI. It will involve three main sections:
1) Environment setup
2) Coding to add Highcharts
3) Testing the visualization on the Power BI report

1) Environment setup:
To set up the environment for custom visualization you can refer to the Microsoft site <https://docs.microsoft.com/en-us/power-bi/developer/custom-visual-develop-tutorial#setting-up-the-developer-environment> or you can follow along here.
This requires the below tools -
a) Installation of nodejs
b) pbiviz library installation
c) Installing certificate
d) creating project and opening it via visual studio code

    a) Installation of nodejs -
    To install Node.js, go to the link - nodejs - and download either the latest build or the LTS, as shown in the image below. In my case I am going for the LTS for the Windows 64-bit environment.

    Once you install this, open a command prompt on your Windows machine and execute the command "node -v". I have installed version 10.16.1; you might have a different version if you are installing this at a later point in time.


    Ensure npm is added to the PATH variable in your system environment variables.


    Once this is done open a new command prompt and enter the command 'npm' and you should be able to see the below output on usage. If you do not get this, please ensure the PATH variable is correct, in some cases restart of windows resolves this issue. 


b) Installation of the pbiviz library -
    Open command prompt and execute the below command -

    npm i -g powerbi-visuals-tools
 
In this command: i stands for install and -g stands for global which ensures it is available throughout the system and not restricted to the folder in which you have downloaded it.
 
Once this is done, execute 'pbiviz' in the command prompt and you should see the output shown in the image below.






c) Installation of certificate - 
We need to install the certificate. To do this follow the steps below -
 
Step 1 - Open command prompt and execute the command 'pbiviz --install-cert', copy the passphrase, and you should see a window open for importing the certificate.


Step 2 - Import the certificate for the current user and click on Next.



Step 3 - The certificate path would automatically be taken care of in the latest version of pbiviz.

        

Step 4 - Paste the passphrase which we had noted down in step 1 and click on next.


        
 Step 5 - Leave the default selection and click on next.

       
        
 Step 6 - You will be shown the summary, click on Finish and you should get a message stating import was successful.


 
d) creating project and opening it via visual studio code
    
Step 1 - We will use Visual Studio Code to do our coding. You can download Visual Studio Code from this link.

        
Step 2 - Create a folder, open a command prompt at this path and execute the command below; this will create a Node.js project with the required project structure.
       
        pbiviz new myHighChart


       
       
Step 3 - Now let us open the project in Visual Studio Code using the command 'code .'
       
        First execute - cd myHighChart
       
        Then execute - code . - which opens Visual Studio Code.
        

        
Step 4 - You should be able to see the folder structure as shown in the image below. Click on View -> Terminal.


Step 5 - Install the highcharts component in the terminal which we opened in the previous step.
       
        npm install highcharts --save
        

Step 6 - To verify the installation, open the package.json file and ensure highcharts appears under dependencies, as highlighted in the image below.


This completes the setup of environment. Now let's see how to enable the highcharts visualization in your tool.

2) Coding to add highcharts -
   
    Step 1 - Open src > visual.ts and you should see that Microsoft has already provided default code which renders a sample visualization, as shown in the image below. You can go ahead and check it out, but I will leave that to your interest.
   
   
    Step 2 - Code: 
 
    Add the below import statements:   
   


    In the constructor you only need the below code:
       
   
   
    The update function needs to look like this -
   
   

    
You can leave the rest as it is. 

The complete code is provided below.

   
    Once this is done, on the terminal type 'pbiviz start' and you should see a message stating 'Compiled successfully'


3) Testing the visualization on the powerbi report -

To test the visualization you have to:
a) Go to powerbi <https://app.powerbi.com> and enable the developer option in settings as shown in image below

step 1- 



Step 2- 


b) Go to any report and click on Edit. Once done, in the Visualizations pane you should see a custom chart icon with the name 'Custom Visual' on hover, as shown in the image below.



c) Once you are able to see the widget in this screen, click on it and add a column from your dataset into Category Data to trigger the update function which will draw the chart as shown in the image below. 



d) If you want to package the visualization for distribution, you need to execute 'pbiviz package'. This produces a .pbiviz file which you can import into Power BI Desktop / the portal using the import-from-file option.


This blog showed how to add the visualization into Power BI. We still need to configure properties like the x-axis, measure, series, legend etc. Do let me know if you want me to explain that; I can write subsequent blogs on the same.
       
       

Wednesday, June 26, 2019

Azure IoT

This is a post for people who are exploring Azure IoT for the first time. The idea is to stream data from code, push it into Azure, and see it via visualizations in Power BI.
To do this we will need the following Azure resources.
1. IoT Hub
2. Event Hub
3. Stream Analytics
4. Service Bus
5. Logic Apps
6. SQL Server Database

The data flow diagram will be as shown in the image below




Steps -

1. Configure IoT hub
2. Write a code in python which will simulate sending data to IoT Hub
3. Configure Event Hub and add it in the message routing
4. Configure Stream Analytics with Event hub as an input
5. Configure SQL Server
6. Configure Service Bus
7. Configure SQL Server and Service Bus as two outputs for Stream Analytics
8. Create Logic apps based out of SQL Server database tables and create a work order table


Step 1 - Create and Configure IoT Hub -
a) Choose the IoT Hub service in Azure and click on Create. Once you do this you will be taken to the screen shown in the screenshot below -



b) After the IoT Hub service is provisioned, you should be able to see something similar to the screenshot below -
  




c) Now we need to create a sample IoT Device as shown in the snapshot below. We will be streaming our data using the connection string of this IoT Device.















     
            

d) Configure the message routing once you create the Event Hub.


       





      
              

Step 2 - Write code in Python which will simulate sending data to IoT Hub:

a) Pick up the connection string into a variable, as stated in Step 1 c.
b) Initialize the IoT Hub connection.
c) Format the string to simulate sensor data.
d) Send the data to IoT Hub.
e) Check the status of the reply.
f) Check the status in IoT Hub.



import os
import datetime
import time
import random

import iothub_client
from iothub_client import IoTHubClient, IoTHubClientError, IoTHubTransportProvider, IoTHubClientResult
from iothub_client import IoTHubMessage, IoTHubMessageDispositionResult, IoTHubError, DeviceMethodReturnValue




CONNECTION_STRING = '<IoT Hub Connection String>'

sensorlist = ['Engine RPM']


AssetList = [
                'Device 1',
                'Device 2'
    ]

EngineVoltage = 233

sensorval_dict = {"Engine RPM": 5000}


PROTOCOL = IoTHubTransportProvider.HTTP
MESSAGE_TIMEOUT = 10000

def iothub_client_init():
    # Create an IoT Hub client
    client = IoTHubClient(CONNECTION_STRING, PROTOCOL)
    return client

def send_confirmation_callback(message, result, user_context):
    print ( "IoT Hub responded to message with status: %s" % (result) )
    
    
def sensorDataParser(SensorData):
    attributes = SensorData.split(',')
    DeviceID = attributes[0]
    Parameter = attributes[1]
    Datetimestamp = attributes[2]
    Datetime = datetime.datetime.fromtimestamp(int(Datetimestamp)/1000)
    Val = attributes[3]
    #TypeofParam= attributes[4] Commented as of today
    MSG_TXT="{\"DeviceID\" : \""+ DeviceID +"\", \"Parameter\" : \""+ Parameter +"\", \"Datetimestamp\" : \""+ str(Datetime) +"\", \"Val\" : \""+ Val +"\"}"
    return MSG_TXT     
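
# Note: an equivalent, less fragile way to build the payload (an
# alternative sketch, not the original approach) is the json module:
#   import json
#   return json.dumps({"DeviceID": DeviceID, "Parameter": Parameter,
#                      "Datetimestamp": str(Datetime), "Val": Val})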


# Handle direct method calls from IoT Hub
def device_method_callback(method_name, payload, user_context):
    global INTERVAL
    print ( "\nMethod callback called with:\nmethodName = %s\npayload = %s" % (method_name, payload) )
    device_method_return_value = DeviceMethodReturnValue()
    if method_name == "SetTelemetryInterval":
        try:
            INTERVAL = int(payload)
            # Build and send the acknowledgment.
            device_method_return_value.response = "{ \"Response\": \"Executed direct method %s\" }" % method_name
            device_method_return_value.status = 200
        except ValueError:
            # Build and send an error response.
            device_method_return_value.response = "{ \"Response\": \"Invalid parameter\" }"
            device_method_return_value.status = 400
    else:
        # Build and send an error response.
        device_method_return_value.response = "{ \"Response\": \"Direct method not defined: %s\" }" % method_name
        device_method_return_value.status = 404
    return device_method_return_value


def iothub_client_connector():

    try:
        client = iothub_client_init()
        print ( "Successfully connected with the IoT Hub" )
        #client.set_device_method_callback(device_method_callback, None)
        
        while True:
            # to be changed later when the streaming comes from sensor data generator
            
            #to obtain random values from the list above
            now = datetime.datetime.now()
            datetimeval = int(datetime.datetime.timestamp(now)) * 1000
            Randomasset = random.choice(AssetList)
            #print(Randomasset)
            Randomsensor =random.choice(sensorlist)
            #print(Randomsensor)
            randomsensorval = sensorval_dict.get(Randomsensor)
            #print("The value for the sensor "+Randomsensor+" is "+ str(randomsensorval))
            senseval = randomsensorval + (random.random() * 15)
            
            #formation of the string
            SampleString = Randomasset+","+Randomsensor+","+str(datetimeval)+","+str(senseval)+",Control Analog"
            
            #breaking of the string
            devicereading=sensorDataParser(SampleString)
            print("Able to generate the message "+ devicereading)
            
            #sending the string to IoTHub
            message=IoTHubMessage(devicereading)
            
            #Check for success
            client.send_event_async(message, send_confirmation_callback, None)
            time.sleep(1)
            
    except IoTHubError as iothub_error:
        print ( "Unexpected error %s from IoTHub" % iothub_error )
        return
    except KeyboardInterrupt:
        print ( "IoTHubClient sample stopped" )


if __name__ == '__main__':
    iothub_client_connector()
                 


Step 3 - Configure Event Hub and add it in the message routing

a) Search for Event Hubs and click on Add; you will be taken to the screen in the screenshot below.
Choose a name and Enable Kafka if you need it (please note this is available in the Standard tier and above; it is not available in the Basic tier).


                 


              


b) Once it is created, you should see the screen in the image below.


                      

                   


c) Click on the Event Hubs namespace and you can see its details. Click on Add Event Hub and provide the name; also choose the number of partitions and the number of days for which messages should be retained.






Step 4 - Configure Stream Analytics with Event hub as an input

a) Search for Stream Analytics Jobs service in Azure portal and click on Add, you should be looking at a screen as per the screenshot below -

                  



b) Once this is done, configure the Event Hub as the input of data. You will have other options such as IoT Hub, but in this case we will be using Event Hub as the input.


                                 


                      
    You need to configure the below -
    Input Alias: The name you want to provide for the input.
    You can choose the option "Select Event Hub from your subscriptions" - this will allow you to choose the Event Hub which we created in the previous step.
    Subscription: Choose the subscription under which the Event Hub is created.
    Event Hub Namespace: Choose the Event Hub namespace from the dropdown.
    Event Hub Name: Choose the option "Use Existing" and choose the name of the event hub you created in Step 3.
    Event Hub Policy Name: I have chosen RootManageSharedAccessKey.
    Event Hub Policy Key: This should get autopopulated.
    Event Hub Consumer Group: I will be leaving this blank so that it uses the $Default consumer group of the Event Hub.
    Event Serialization Format: I will be choosing JSON, as I have formatted the data as JSON in Step 2.
    Encoding: I will leave it as the default, which is UTF-8.
    Event Compression Type: None (since this is just an example, but you can use the gzip or deflate option in case you are compressing large streaming data).
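
Optionally, before relying on the IoT Hub routing end to end, you can push a test event straight into the Event Hub to confirm the input wiring. Here is a minimal sketch using the azure-eventhub package (pip install azure-eventhub, v5 or later); the connection string and hub name below are placeholders:

from azure.eventhub import EventHubProducerClient, EventData

# Namespace connection string (from Shared access policies) and the
# event hub created in Step 3 - both are placeholders here.
producer = EventHubProducerClient.from_connection_string(
    "<Event Hub namespace connection string>",
    eventhub_name="<event hub name>")

batch = producer.create_batch()
batch.add(EventData('{"DeviceID": "Device 1", "Parameter": "Engine RPM", '
                    '"Datetimestamp": "2019-06-19 11:53:26", "Val": "5010.5"}'))
producer.send_batch(batch)
producer.close()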
    

c)  Once this is done, we will need to configure the output and subsequently the query to obtain the data from Event hub and store it in SQL Server database.


Step 5 - Configure SQL Server
a) Search for SQL databases and then click on +Add, which will take you to a screen as shown in the image below:
The first tab is Basics:
Subscription: Choose the subscription under which you want the database to be created.
Resource Group: Choose the resource group.

Database Details:
Database Name: Enter the name of the database.
Server: If you do not have a SQL Server resource, click on Create new and fill in the 'Adding new Server' section described below. Once the server name is added, continue provisioning the Azure SQL database.
Want to use SQL elastic pool: Yes (if you expect to have multiple databases and you want to manage all of them within certain costs.)
Compute + storage: General Purpose

                 




Adding new Server:
Server name: Provide a name of your choice. Please note that .database.windows.net will be suffixed to the name you provide.
Server Admin Login: Provide the login username of your choice.
Password: <Strong password>
Confirm Password: <confirm your Strong password>
Location: The region in which your database should exist; please ensure it is the same region as your other services, else there will be a cost for transferring data across regions.
Allow Azure Services to access server: Check this, as we require Azure services to access this database.


                 






The second tab is Additional settings:
You can leave everything as default.
Use existing data: None (if you want to start fresh; if you have existing data and want to start with that, you can choose Backup.)
Database Collation: You can leave it as default unless there is a specific collation type which you use in your PoC or organisation.

                 



Once this is done click on Review + Create and then click on Create.

Open Management Studio v17 or above and connect to the cloud database with the server name, username and password you provided.

Once you are successfully able to open the database, create the tables using the script below. DemoIot will hold the streaming data from the devices in this PoC, and Fact_ExceptionRecords will hold the exception records created by the Logic App in Step 8.


SET ANSI_NULLS ON
GO

SET QUOTED_IDENTIFIER ON
GO

CREATE TABLE [dbo].[DemoIot](
    [DeviceID] [nvarchar](50) NULL,
    [Parameter] [nvarchar](50) NULL,
    [Datetimestamp] [datetime] NULL,
    [val] [float] NULL,
    [ROWID] [int] IDENTITY(1,1) NOT NULL,
 CONSTRAINT [PK_DemoIot_1] PRIMARY KEY CLUSTERED
(
    [ROWID] ASC
)WITH (STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF) ON [PRIMARY]
) ON [PRIMARY]
GO

SET ANSI_NULLS ON
GO

SET QUOTED_IDENTIFIER ON
GO

CREATE TABLE [dbo].[Fact_ExceptionRecords](
    [DeviceID] [nvarchar](100) NULL,
    [Date] [datetime] NULL,
    [Type] [nvarchar](100) NULL,
    [Description] [nvarchar](300) NULL
) ON [PRIMARY]
GO
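
Later, once the stream analytics job is running, you can also verify from Python that rows are landing in the table. Here is a minimal sketch using pyodbc (pip install pyodbc, with the ODBC Driver 17 for SQL Server installed); the server, database and credentials below are placeholders for the values you chose above:

import pyodbc

# Placeholders - use the server/database/login you created above.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=<yourserver>.database.windows.net;"
    "DATABASE=<yourdatabase>;UID=<username>;PWD=<password>")

cursor = conn.cursor()
cursor.execute("SELECT TOP 10 * FROM dbo.DemoIot ORDER BY ROWID DESC")
for row in cursor.fetchall():
    print(row)
conn.close()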



Step 6 - Configure Service Bus

a) Search for the Service Bus service in the Azure portal and click on +Add; once you do this you will see the screen as per the image below.

Provide the below details:
Name: The name of the service bus.
Pricing Tier: I will be choosing Basic
Subscription: Choose the subscription under which you want the resource to be present.
Resource Group: Choose the resource group under which you want the resource to be present or create a new one.
Location: Choose the region in which the Service Bus should be present. Please ensure the region is the same as the other services you have created; if not, there can be a cost associated with the transfer of data.


                  





b) Create a queue in the service bus:
Click on the service bus name and you should see the screen as per the screenshot below.

Name: The name of the queue.
Max Queue Size: Leave it at the default of 1 GB if you are using this for learning purposes; else modify it according to the volume of data you are expecting.
Message Time to Live: This determines the time frame for which messages will stay in the queue.
Lock Duration: The duration for which a message is locked so that only one receiver has access to it; once the time limit is reached, the lock is released and the message becomes available for other receivers to lock.
Enable Duplicate Detection: This checks whether the same message is already present in the queue and rejects the message if it is detected as a duplicate.
Enable Sessions: This ensures a first-in-first-out policy for the data, as sessions ensure ordering of the messages in the queue.
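
Once the pipeline is running, you can peek at the messages landing in this queue from Python. Here is a minimal sketch using the azure-servicebus package (pip install azure-servicebus, v7 or later); the connection string and queue name below are placeholders:

from azure.servicebus import ServiceBusClient

# Connection string from the Service Bus 'Shared access policies' blade
# and the queue name created above - both are placeholders here.
with ServiceBusClient.from_connection_string("<Service Bus connection string>") as client:
    with client.get_queue_receiver(queue_name="<queue name>", max_wait_time=5) as receiver:
        for message in receiver:
            print(str(message))
            receiver.complete_message(message)  # removes the message from the queue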


             



Step 7 - Configure SQL Server and Service Bus as two outputs for Stream Analytics
Go to stream analytics which you have created in Step 4 and click on the resource, then you should be able to see the Outputs. Click on this and you will be able to choose the list of all resources which you can choose as outputs. The various options when this article has been written are -

    1) Event Hub
    2) SQL Database
    3) Table Storage
    4) Service Bus Topic
    5) Service Bus Queue
    6) Cosmos DB
    7) Power BI
    8) Data Lake Store Gen 1
    9) Azure Function

For this post, let us consider SQL Server Database and Service Bus which have been created in Step 5 and 6 respectively.

a) Configure SQL Database as the output.
On click of SQL Server database, you will be shown a configuration screen as per the screenshot below -


                      



Output Alias: The name you want to provide for the SQL Database output in the stream analytics job.
"Select SQL Database from your Subscriptions": Choose this option, since you have already created the SQL database in Step 5.
Subscription: Choose the subscription under which you have created the SQL database.
Database: The name of the database which you have created.
Username: The username to access the database.
Password: The password to access the database.
Table: The table for which you executed the script above (dbo.DemoIot).
Merge all input partitions into a single writer: Default; you can leave it as it is.
Max Batch Count: 10000. You can leave the default unless you expect many more records per batch.

b) Configure Service Bus as the output -
For Service Bus, there are two options -
Service Bus Topic
Service Bus Queue
For this post, I will consider Service Bus Queue.

Below are the config details for the Service Bus output -
Output Alias: The name of the Service Bus Queue output in the stream analytics job.
Select queue from your subscription: Use this option so that you do not end up creating a new Service Bus queue, as we already created one in Step 6.
Subscription: The subscription under which you created the Service Bus.
Service Bus Namespace: The namespace of the Service Bus.
Queue Name: Use Existing - this ensures you use the Service Bus queue which you have already created.
Queue Policy Name: RootManageSharedAccessKey
Queue Policy Key: This will be autopopulated and is not editable.
Property Columns: If you want some custom values to go to the Service Bus, you can provide the names as comma-separated values. This is not required now, and hence we will keep it blank.
Event Serialization Format: JSON
Encoding: UTF-8
Format: Line Separated

                                  



c) Once this is done, go to the stream analytics job and you will see a query window. Click on Edit query and add the sample code below. Make sure the names in square brackets match the aliases you configured; here [eventhubinput] is the input from Step 4, and [SQLOutput] and [ServiceBusOutput] are the two outputs from this step -

SELECT
       cast([DeviceID] as nvarchar(MAX)) as DeviceID
      ,cast([Parameter] as nvarchar(MAX)) as Parameter
      ,cast([Datetimestamp] as datetime) as Datetimestamp
      ,cast([val] as float) as val
INTO
    [SQLOutput]
FROM
    [eventhubinput]


SELECT
       cast([DeviceID] as nvarchar(MAX)) as DeviceID
      ,cast([Parameter] as nvarchar(MAX)) as Parameter
      ,cast([Datetimestamp] as datetime) as Datetimestamp
      ,cast([val] as float) as val
INTO
    [ServiceBusOutput]
FROM
    [eventhubinput]
WHERE cast([val] as float) > 246.0

                                  



As you can see, there are two outputs in one single query: every record goes to the SQL output, and only records above the 246.0 threshold go to the Service Bus queue. To test whether it works fine, you can upload sample data and then check. To do this,

a) Add the below records into a text file
{"DeviceID" : "DEVICE1", "Parameter" : "Engine Voltage", "Datetimestamp" : "2019-06-19 11:53:26.629000", "Val" : "233.825983439"}
{"DeviceID" : "DEVICE1", "Parameter" : "Engine Voltage", "Datetimestamp" : "2019-06-19 11:53:26.629000", "Val" : "233.825983439"}
{"DeviceID" : "DEVICE1", "Parameter" : "Engine Voltage", "Datetimestamp" : "2019-06-19 11:53:26.629000", "Val" : "233.825983439"}
{"DeviceID" : "DEVICE1", "Parameter" : "Engine Voltage", "Datetimestamp" : "2019-06-19 11:53:26.629000", "Val" : "233.825983439"}
{"DeviceID" : "DEVICE1", "Parameter" : "Engine Voltage", "Datetimestamp" : "2019-06-19 11:53:26.629000", "Val" : "253.825983439"}
{"DeviceID" : "DEVICE1", "Parameter" : "Engine Voltage", "Datetimestamp" : "2019-06-19 11:53:26.629000", "Val" : "233.825983439"}
{"DeviceID" : "DEVICE1", "Parameter" : "Engine Voltage", "Datetimestamp" : "2019-06-19 11:53:26.629000", "Val" : "253.825983439"}
{"DeviceID" : "DEVICE1", "Parameter" : "Engine Voltage", "Datetimestamp" : "2019-06-19 11:53:26.629000", "Val" : "233.825983439"}
{"DeviceID" : "DEVICE1", "Parameter" : "Engine Voltage", "Datetimestamp" : "2019-06-19 11:53:26.629000", "Val" : "253.825983439"}


b) Once it is done click on the input and choose "Upload sample data from file"


                                     



                                                                       




c) Once it is done, click on Test and you should see two output tabs, one for SQL and the other for Service Bus, as shown in the image below -

                                             

                                          
                                      


Step 8 - Logic Apps to check the data in the SQL Server table and create a record if the value is greater than the threshold

a) Creation of Logic apps:
Search for the Logic Apps service in Azure, click on Add, and you should see the screen in the image below -
                            





Name: The name of the logic app.
Subscription: The name of the subscription you want.
Resource Group: Use existing (or create a new one).
Location: The region in which you want the resource to be present.
Log Analytics: In case you want to monitor the workflows you can turn it on; else it can be off.

Logic to be implemented -
Once a record is inserted into a table in the Azure SQL Database,
check if the value is greater than the threshold value;
if the condition is true, then insert the value into a new table.


b) Creation of flow:
Click on the Logic app designer and you should be taken to a screen where you can add the flow steps and implement your logic.

Once you do this you should be able to see the list of all inputs. We will choose SQL Server as an input.
              





                    

Once you select SQL Server, it shows you two triggers for starting the flow. (Please note you can only choose a trigger as the first step; you will not be allowed to use an action.)
The two options are -
1) When an item is created: This means the table should have a column with an auto-increment feature. If the table does not have this, then the table will not appear in the dropdown.
2) When an item is modified: This means the table should have a rowversion column so that it can detect whether a record has been modified, which will initiate the workflow.


                                                             




In our case the trigger will be when a new record is inserted into the table, as there is no scenario in which we will be updating a record. Once you choose this, you should see the screens below -
Change the interval from 3 minutes to 1 second; this denotes how often the service will check for new events.


                                                            




The connection in the snapshot is already present; if you need to add one, click on New connection and you should see the screen shown in the snapshot below -


                                                          




Enter the credentials for the database, and choose 'Connect via on-premises data gateway' if your database is not in the Azure environment or is on a virtual machine inside Azure.
                                                   
Once it is done, choose the name of the table from the dropdown. You can optionally add parameters to filter, order, or obtain specific columns from the table.

                            


                                                      

Once it is done, click on New Step and you will be provided with a lot of options and suggestions. Click on Condition, based on the logic which we are planning to implement.


                                                       



On clicking this, you should see a condition box come up. Leave the condition as And by default. Place the cursor on "Choose a value" and you should see a popup come up with dynamic content; it has the list of all the columns in the table. Choose the column "val" with "is greater than" in this case and enter the number 246.

                                                       
                                         



Go to the true branch below and click on "Add an action"; choose SQL Server and you will see a list of options. Choose "Insert row" and you will see a dropdown populated with tables based on the database connection string. Choose the table "Fact_ExceptionRecords".

                              

                                                            

                                                





In Add new parameter, choose the columns in the table as shown in the snapshot below.


                                                       



Once you choose the columns, click on them one by one and you should see the input columns in the dynamic content. Use the appropriate column names, and for the description a few words such as "This has been flagged because the val is " followed by the Val column, as shown in the image below.

 
                                                   


                            

Once this is done, click on save.

                                           


Once it starts running, you should see the records in Runs history, as shown in the snapshot below. A run should appear as Succeeded; if there is any error it will show Failed, and if the condition is not met it will show the condition as skipped.


                                                



Now if you run the code which generates the data, you should see spikes in the messages on IoT Hub, Event Hub and Stream Analytics, and you should see the data in both tables of the SQL Server database.
Do let me know what you think about this blog, whether it has helped you, or if you feel anything needs to be improved.

I will try to add more services of azure in future blogs.