Raspberry Pi Online Simulator for Azure IoT Hub

Have you thought about trying out Azure IoT Hub, but don't have an IoT device? Try the Raspberry Pi Online Simulator. This is exactly what it sounds like: a simulated Raspberry Pi device running a small Node.js program that sends data to your IoT hub. It sends simulated temperature data from a mock BME280 temperature sensor. All you do is plug in the connection string for your IoT hub and hit 'Run'. It's even got a little LED that blinks when sending data and receiving the callback. Send data with this, and you can view it in near real time using the Device Explorer for IoT Hub Devices. Happy coding!


Why Azure is so remarkable

Technology moves so fast these days that it's easy to take for granted the innovations of recent years, combined with the exponential accessibility of new technologies made available to developers at large. Azure continues to usher in new possibilities of virtually unlimited scale while removing the burden of overhead, giving us resources we could have only dreamed of just a short time ago. In that sense, Azure is the Robin Hood of technology, giving the common developer what only those with deep resources enjoyed previously! What I'm referring to here transcends the technical. It's about empowerment. In keeping with Microsoft's mission to empower every person and every organization on the planet to achieve more, Azure levels the playing field by giving us the power of supercomputers, AI, and the Internet of Things, all with global scale and ease of accessibility.

This ushers in a new approach to architecture. No longer is it necessary to write traditional models like pub/sub from scratch; you can stand up services for that. Take Event Hubs as an example: a globally available event ingestion service that can scale to millions of messages per second. Aha! You just did it, didn't you? That last sentence blew past, and perhaps it wasn't as astonishing as it would have been a few years ago. But today, it has become commonplace and accepted. It's worth repeating: millions of messages per second. Try standing up your own servers 10 years ago that could handle this volume of data, and at an affordable price. This alone makes it possible for consultants such as myself to offer solutions to small and medium-sized businesses with virtually no overhead or up-front investment. Where I would have (and have before) struggled with provisioning servers and hosting while writing my own message-based distributed system, I can now just spin up a hub and I'm good to go.

Indeed, the inspiring Microsoft commercials featuring Common speak truth to the times we currently enjoy, the future we look forward to, and the possibilities available to us. We're experiencing a time unlike any I've seen before. It's a great time to be a developer.

Monitor your APIs with Postman

Do you write and maintain APIs? Do you know how they're performing? If any of your endpoints return data, are you sure the schema of the data returned is correct? You could write your own monitoring solution from scratch, but who has time for that? I've found Postman to be a great solution for monitoring my APIs. Granted, using it is like a car mechanic taking their car to a service center for an oil change, but for me it's about saving time. Tons of time.

Postman has so many features that it would be tough to cover all of them in one blog entry, so I'll tackle my favorites for now. First, with Postman you can group requests into collections and fire off scheduled, automated requests to your APIs using monitors. Monitors not only let you call your endpoints with optional data parameters; you can also check the results of each call with configurable tests. Postman can also automatically generate API docs for you, complete with a curl call ready to copy and paste into your shell. You can even publish these docs with a paid-tier subscription.
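
To give a flavor (this example is mine, not from the post), a monitor test script in Postman's 'Tests' tab might look like this; the response fields being checked are hypothetical:

// Hypothetical Postman test script; runs after each monitored request
pm.test("Status code is 200", function () {
    pm.response.to.have.status(200);
});

pm.test("Response has the expected shape", function () {
    // Parse the JSON body and check the fields we expect (illustrative schema)
    var body = pm.response.json();
    pm.expect(body).to.have.property("temperature");
    pm.expect(body.temperature).to.be.a("number");
});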

Postman has an online dashboard as well as a desktop app component. Here's my dashboard containing a couple of monitors I have set up.

Postman online dashboard

With the desktop app you can set up, configure and call any of your endpoints, create collections and monitors, then sync them to the cloud. Collections are synchronized between the online dashboard and the desktop app when you are signed in. This is a great feature if you want to check things on the go or are switching between computers. Here's a view from the online dashboard of the call history for a monitor I have set up.

Postman Collection History

There are several usage plans available, including a free tier and paid Pro and Enterprise tiers, each with features you can read about.

Another amazing feature is the API Network. This allows you to pull ready-made API client configurations from published third parties right into your desktop app. Here's an example from the Bing Cognitive Services API. You can see my query is all set up for me; I just have to obtain a key and replace the placeholder value with it. This is super cool.

Postman Bing API

But wait, there's more. You can also set up mock servers for your endpoints before they are even live! This is great when you want to simulate the response from a call before you've written the code. Full disclosure: this is not a paid endorsement of Postman; it's just that neat of a tool.

Telemetry data with Raspberry Pi 3, Windows 10 IoT core and Azure IoT Hub

I had a great time getting the Raspberry Pi 3 set up with Windows 10 IoT Core and a BME280 temperature sensor, and sending temperature data to Azure IoT Hub. This was my first experience with each of these components, so I was learning as I went. Working in technology, if you're not used to, or not comfortable, learning as you go, you might want to take a step back and re-examine why. It's well known that change is constant in this field, and we should embrace the unknown as an opportunity to learn and grow, not merely stay in our comfort zones. Staying in your comfort zone will quickly get you left behind. I'm reminding myself of this as much as anyone else, but I digress.

First off, Windows 10 IoT Core. Mind blown. Extra points for exceeding expectations. This is a really useful operating system geared toward IoT devices, including the Raspberry Pi. I didn't know what to expect, and was super stoked about how easy it was to set up. I was expecting lots of trial and error, setup time, guessing, incompatibility and a steep learning curve. But with Windows 10 IoT Core, not only flashing my micro-SD card to install it, but setting it up and getting it working with my Raspberry Pi was a breeze. Here's a quick step-by-step to get started.

1. Download and install the Windows 10 IoT Core Dashboard
2. Insert your SD card into your computer and navigate to the 'Set up a new device' tab in the dashboard
3. Select the version of IoT Core you want to install, a name for your device, and an administrator password
4. Click 'Download and install'. The image is downloaded from the internets and installed on the card

You're good to go now. Slide the SD card into your Pi and boot it. The IoT Core Dashboard should recognize your Pi on the network, and it will appear on the 'My devices' tab. If it does not, try reading this. When you see the device in the IoT Dashboard, right-click it to open a browser window, sign in with the admin user name and password you provided, and you will be interacting with the OS through the Device Portal.

Windows 10 IoT Core has some pretty neat tools built in that you can navigate to through the Device Portal. You can view installed apps, see which apps are running, and start and stop apps. There are even sample apps you can deploy to your device and run.

Windows 10 IoT Core App Samples

View CPU, memory, I/O and network usage

Viewing Resource Usage in the Windows 10 IoT Core Device Portal

View available networks

Viewing Available Networks in Windows 10 IoT Core Device Portal

For this post, I'm going to demonstrate how I was able to use the BME280 with the Raspberry Pi and Windows 10 IoT Core to send temperature data to Azure IoT Hub. One of the things that attracted me to Windows 10 IoT Core was the ability to write C# code to access, interact with and control the various components of the Raspberry Pi. For this example, I was able to use Visual Studio Community Edition, and mix and match some samples I found online, either from the sensor vendor or MSDN.

So now to the fun part, and back to 'mind blown': I was actually writing C# (not Java, not Python, not Node.js) to interact with the Pi, I/O pins and everything. We've all got our favorite languages, and C# happens to be mine. Not only that, but I was able to send messages to the Hub.

The IoT Hub is interesting and revolutionary. I say revolutionary because I remember, not long ago, when such concepts were not available and easily accessible to the masses. There were few options to send data to a nebulous cloud endpoint with virtually unlimited scale, and plug in any number of listeners which can then take action via any number of workflows or triggers. There are solutions I've built in the past that could have benefited from the Hub; I wish I could go back and retrofit them with it now. Sending temperature data, or any other data, to the IoT Hub (and for that matter, the Azure Event Hub) implements 'set and forget' in a new way: a sort of send and forget. Get your data up to the cloud and handle it there. The Hub acts as an abstraction layer between ingestion and processing; more precisely, a "component or service that sits between event publishers and event consumers to decouple the production of an event stream from the consumption of those events". This is easy to understate and misunderstand. It's truly remarkable, and it's made easily available at scale with Azure.

Let's dive into the code. I'm using an open source library to access the temperature readings from the BME280 sensor. First, I just new up a timer to call the _timer_Tick method when the app starts and the entry point is hit (the OnNavigatedTo event of the main page).
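
Something along these lines, inside the main page's code-behind; this is a minimal sketch assuming a UWP DispatcherTimer, and the five-second interval is illustrative:

// Requires Windows.UI.Xaml and Windows.UI.Xaml.Navigation
private DispatcherTimer _timer;

protected override void OnNavigatedTo(NavigationEventArgs e)
{
    // Fire _timer_Tick on each interval (interval is illustrative)
    _timer = new DispatcherTimer { Interval = TimeSpan.FromSeconds(5) };
    _timer.Tick += _timer_Tick;
    _timer.Start();
}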

Then, in the timer tick event, I read the temperature from the sensor.
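
Something like the following, though the exact call depends on the BME280 library you use (the method name here is illustrative); the SendTemperatureAsync helper is sketched next:

private async void _timer_Tick(object sender, object e)
{
    // Read the current temperature (Celsius) from the BME280 sensor
    float tempC = await _bme280.ReadTemperature();

    // Forward the reading to the IoT hub (helper shown below)
    await SendTemperatureAsync(tempC);
}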

Then I send it to the hub using the Microsoft Azure Devices Client.
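
A minimal sketch, assuming the Microsoft.Azure.Devices.Client NuGet package (plus System.Text for the encoding); the helper name and payload shape are mine:

private async Task SendTemperatureAsync(float tempC)
{
    // Build a simple JSON telemetry payload (device ID is illustrative)
    var json = $"{{\"deviceId\": \"myPiDevice\", \"temperature\": {tempC}}}";
    var message = new Message(Encoding.UTF8.GetBytes(json));

    // Send the message to the IoT hub
    await _deviceClient.SendEventAsync(message);
}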

Don’t forget to new up your devices client with your hub connection string and device ID, which you get when you set up your hub and register your device.
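
Something like this, with the placeholder replaced by your own connection string:

// Create the client once; the device connection string from the Azure portal
// identifies both your hub and your registered device
private readonly DeviceClient _deviceClient =
    DeviceClient.CreateFromConnectionString("<your device connection string>");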

To deploy this app to your device, right-click the project in Visual Studio, then click Properties. On the properties page, select the Debug menu item, then set "Remote Machine" as your target device. Enter the IP address of your remote machine in the Remote Machine setting. This will deploy the app to the IoT device and allow you to debug it.

Deploying to Windows 10 IoT Core from Visual Studio

Now let's see the data in real time as it arrives in the hub. To do this, we'll download, install and run the Device Explorer. Enter your hub connection string on the Connection string tab. Your device will appear on the Management tab. Select that device, then click on the Data tab and click Monitor. You should see the telemetry messages appear as they are sent to the hub. Here's mine:

View device to cloud messages using Device Explorer for IoT Hub

So there you have it. Now imagine scaling this to 1,000 devices, then building a reporting and monitoring solution on top of it. Happy coding!

Say goodbye to debugging Xamarin iOS on a different machine with Remoted iOS Simulator (for Windows)

The Remoted iOS Simulator (for Windows) just came out for Visual Studio, and if you haven't tried it, I definitely recommend doing so if you are tired of remoting into your Mac build server to debug your Xamarin iOS apps.

The Remoted iOS Simulator allows you to "Test and debug iOS apps entirely within Visual Studio on Windows". I must say, having tried it out, I found it works as advertised! It's wonderful to be able to run a fully live, real iOS app in a simulator window locally. It saves time, since I don't have to remote into my Mac build server or be physically near it to interact with a running app.

Simply download the installer, run it, and you're good to go. Set your Xamarin iOS project as the startup project. When you hit 'Debug', a local iOS simulator window pops up right on your machine. You can interact with the app as it runs on the Mac.

The simulator has some basic but useful features like locking the home screen, setting the location, and rotating and shaking the device.

Here is a screenshot of a map view in a real app. Giddy! Best of all, the simulator stays running and connected to your Mac after you stop debugging, so you can use a live XAML previewer like Gorilla Player for design-time visuals. This is one breakthrough that deserves full applause from the masses.

An app running in the Remoted Simulator for Visual Studio on Windows


Calling all Datatables!

We use DataTables a lot, and a problem we face is column headers not adjusting when the user expands or collapses the sidebar menu in our UI template. The DataTables table body expands or shrinks according to the horizontal page space gained or lost when the sidebar menu expands or collapses, but for some reason the headers do not. The result is a misaligned table that looks messy.

I decided to address this once and for all, and went looking for a solution I could implement at least on a page-by-page basis. I quickly discovered this approach would be extremely inefficient at best, since I was dealing with a system with dozens and dozens of tables, if not hundreds. And that approach isn't very DRY. Let's face it, that approach is never really the best approach.

Lo and behold, a quick search turned up a call in the DataTables API where I could summon all visible DataTables instances on the page, whatever the page may be. Kind of like a bat signal! I put this in the shared layout of my ASP.NET MVC app, and it is invoked on each page when the sidebar menu expands or collapses. Voilà! If there are any DataTables instances on the page, the columns adjust. I had to delay the call to the DataTables API, because the adjustment was happening before the class change had taken place.
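
Here's a sketch of the approach; the sidebar toggle selector and the delay value are illustrative, but the DataTables call is the real API:

// When the sidebar is toggled, adjust the columns of every visible
// DataTables instance on the page (the bat signal)
$('.sidebar-toggle').on('click', function () {
    // Delay the call so the sidebar's class change (and the resulting
    // width change) takes effect before the columns are adjusted
    setTimeout(function () {
        $.fn.dataTable.tables({ visible: true, api: true }).columns.adjust();
    }, 300);
});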

Execute code in parallel using Parallel.For

If you've ever had code that executes repetitive tasks, consider executing them in parallel using the Parallel.For method in the System.Threading.Tasks namespace.

In this example, I've made a dinner decision maker that randomly decides what to eat for dinner for the next N days. In our program, we'll randomly select from dinner choices and add them to a dictionary.

It's best to use a ConcurrentDictionary as a thread-safe collection for adding items during a parallel loop. Additions to the collection are performed with TryAdd(). We'll add unique keys and values to the collection, potentially at the same time. The TryAdd() method returns true if the key/value pair was added successfully, and false if the key already exists. Keep in mind that executing code in parallel is not always the right choice, and can lead to some pitfalls.

In the code below, we're executing a code block in parallel. I've made the current thread sleep for a random amount of time inside the parallel block, as it seems to generate a wider variety of dinners. The code block could have been a named function instead. The point is that the runtime decides how (and on which threads) to execute the code. It may or may not improve performance.
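
Here's a minimal sketch of that program; the dinner choices, day count and sleep range are illustrative:

using System;
using System.Collections.Concurrent;
using System.Threading;
using System.Threading.Tasks;

class DinnerDecider
{
    static void Main()
    {
        string[] choices = { "Pizza", "Tacos", "Sushi", "Burgers", "Pasta" };
        var dinners = new ConcurrentDictionary<int, string>();

        // Decide dinner for the next 7 days; iterations may run in parallel
        Parallel.For(0, 7, day =>
        {
            // Seed per iteration so parallel threads don't share a Random
            var random = new Random(Guid.NewGuid().GetHashCode());

            // Sleep a random amount; this seemed to produce a wider variety of dinners
            Thread.Sleep(random.Next(10, 100));

            // TryAdd returns true if the pair was added, false if the key already exists
            dinners.TryAdd(day, choices[random.Next(choices.Length)]);
        });

        foreach (var dinner in dinners)
            Console.WriteLine($"Day {dinner.Key + 1}: {dinner.Value}");
    }
}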


Scheduled Tasks in Azure with Azure Automation

If you're like me, you've probably done some work with SQL Server Agent jobs to schedule backups or other tasks like truncating logs. Since I prefer to log virtually every action in the applications I build, I end up setting up jobs to truncate or otherwise archive logs after a certain period of time. As the forces of the cloud become ever more pervasive, this leads us to the question of how to schedule these same types of tasks in Azure. Luckily, as with most things, Microsoft has an answer: Azure Automation. Complete coverage of Azure Automation is outside the scope of this article. For my purposes here, suffice it to say that Azure Automation is a SaaS offering that provides the ability to run PowerShell scripts, called Runbooks, that execute tasks.

Runbooks can be authored, scheduled and managed all within the Azure portal, either through a graphical editor or by directly editing code. I'll cover the basics here, which will allow you to author your very own "SQL Server Agent job in the cloud"! We'll be using the non-graphical editing approach.

To get started, create your Automation account. There are several prerequisites and permissions that you can read about. Once on the home page for your Automation account, select 'Runbooks' from the main menu to begin authoring your first Runbook.


You'll then be taken to the landing blade for Runbooks. Click 'Add a runbook' and follow the steps to create a new Runbook. You also have the option of importing a Runbook here. I'll be creating a Runbook of type 'PowerShell'. You could instead select the graphical editor or a workflow type.


Once this is created, you'll be taken back to the Runbooks main blade. Select a Runbook and you'll be taken to that Runbook's details blade. It's here, from the details blade, that we can edit, schedule and publish our Runbook. Select 'Edit' to begin editing your Runbook code in your browser.


The PowerShell scripting language you'll use allows parameters to be passed in to your script. I simply modified one of the samples available here, which runs a SQL command, to delete from my log table where the log records are older than a certain date. The parameters I used include:

1. The name of the server I want to connect to
2. The port I’ll connect on, 1433
3. The database name I’ll run my query against
4. The table I will run my query against
5. A credential ‘Asset’ that you have to configure separately. This is a user name and password for authenticating against the database server and database.  To configure a credential, navigate to the Runbook main blade and follow the steps under the ‘Credentials’ menu.  It will allow you to create a saved user name and password.

Once you're done editing your Runbook code, you can either click 'Revert to Published', which discards your changes and rolls back to your published version (if you have already published one), or you can publish the script. Once published, the script can be run manually. You can see the results of the last job run in the Runbook's details blade by clicking 'Jobs'.

To attach a schedule to the job, go back to the Runbooks main blade and click ‘Schedules’.   Here I’ve created a schedule called ‘Weekly’.  I can set the schedule in the edit blade.


I then attach the schedule to the job in the details blade of the Runbook.

When you want to run the Runbook manually, click ‘Start’ from the Runbook details blade. You’ll be prompted to provide values for your variables.

That's it. Your job can now be scheduled and run. Again, the code I used is modified from the sample here. You could theoretically pass in a variable to determine how far back you want to delete the logs in your table; a sketch of that tweak follows the script below.

<#
.SYNOPSIS
Deletes records older than a set age from the specified SQL Server database table.

.DESCRIPTION
This runbook demonstrates how to communicate with a SQL Server. It is modified from a sample
that outputs the number of records in a table; this version deletes log records older than
180 days from the specified table.

In order for this runbook to work, the SQL Server must be accessible from the runbook worker
running this runbook. Make sure the SQL Server allows incoming connections from Azure services
by selecting 'Allow Windows Azure Services' on the SQL Server configuration page in Azure.

This runbook also requires an Automation Credential asset be created before the runbook is
run, which stores the username and password of an account with access to the SQL Server.
That credential should be referenced for the SqlCredential parameter of this runbook.

.PARAMETER SqlServer
String name of the SQL Server to connect to

.PARAMETER SqlServerPort
Integer port to connect to the SQL Server on. Default is 1433

.PARAMETER Database
String name of the SQL Server database to connect to

.PARAMETER Table
String name of the database table to delete records from

.PARAMETER SqlCredentialAsset
Credential asset name containing a username and password with access to the SQL Server

.EXAMPLE
Use-SqlCommandSample -SqlServer "somesqlserver.databases.windows.net" -SqlServerPort 1433 -Database "SomeDatabaseName" -Table "SomeTableName" -SqlCredentialAsset sqluserCredentialAsset

.NOTES
AUTHOR: System Center Automation Team (original author of the sample)
#>


param(
    [parameter(Mandatory=$True)]
    [string] $SqlServer = '',

    [parameter(Mandatory=$False)]
    [int] $SqlServerPort = 1433,

    [parameter(Mandatory=$True)]
    [string] $Database = '',

    [parameter(Mandatory=$True)]
    [string] $Table = 'log',

    [parameter(Mandatory=$True)]
    [string] $SqlCredentialAsset = ''
)

$SqlCredential = Get-AutomationPSCredential -Name $SqlCredentialAsset

if ($SqlCredential -eq $null)
{
    throw "Could not retrieve '$SqlCredentialAsset' credential asset. Check that you created this first in the Automation service."
}
# Get the username and password from the SQL Credential
$SqlUsername = $SqlCredential.UserName
$SqlPass = $SqlCredential.GetNetworkCredential().Password

# Define the connection to the SQL Database
$Conn = New-Object System.Data.SqlClient.SqlConnection("Server=tcp:$SqlServer,$SqlServerPort;Database=$Database;User ID=$SqlUsername;Password=$SqlPass;Trusted_Connection=False;Encrypt=True;Connection Timeout=30;")

# Open the SQL connection
$Conn.Open()

# Define the SQL command to run. In this case we are deleting log records older than 180 days
$Cmd = New-Object System.Data.SqlClient.SqlCommand("delete from $Table where date < DATEADD(day, -180, GETDATE())", $Conn)
$Cmd.CommandTimeout = 120

# Execute the SQL command and output the number of rows deleted
$RowsDeleted = $Cmd.ExecuteNonQuery()
Write-Output "Deleted $RowsDeleted record(s) from $Table."

# Close the SQL connection
$Conn.Close()
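
And here's a rough sketch of that tweak, using a hypothetical $RetentionDays parameter that is not part of the sample above:

# Add to the param() block: how many days of logs to keep
[parameter(Mandatory=$False)]
[int] $RetentionDays = 180

# ...then build the delete command from it
$Cmd = New-Object System.Data.SqlClient.SqlCommand("delete from $Table where date < DATEADD(day, -$RetentionDays, GETDATE())", $Conn)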

How to set the time zone of an Azure web app

To set the time zone of an Azure web app, navigate to the web app and click 'Application settings' from the Settings menu.

Azure Settings

Under the settings blade, add a new app setting with the key WEBSITE_TIME_ZONE and a value of your desired time zone (for example, Eastern Standard Time).
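
If you'd rather script it, the same setting can be applied with the Azure CLI; this is a sketch, so substitute your own app and resource group names:

az webapp config appsettings set --name <your-web-app> --resource-group <your-resource-group> --settings WEBSITE_TIME_ZONE="Eastern Standard Time"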


The values for the time zones can be found in the registry under HKLM\Software\Microsoft\Windows NT\CurrentVersion\Time Zones.

Multi-Line Code Editing in Visual Studio 2017

Here's a quick tip if you've ever wanted to edit multiple lines of code at once in Visual Studio. Simply position your cursor at a point in your code, then press and hold Shift and Alt. Next, press the up or down arrow key to select the lines you want to edit. When you begin typing, you will behold a gift from the gods: editing multiple lines at once!

Multi Select Visual Studio