Roundup #52: .NET Core and systemd, System.Threading.Channels, screencastR, Configuration Pitfalls

Here are the things that caught my eye recently in .NET.  I’d love to hear what you found most interesting this week.  Let me know in the comments or on Twitter.

.NET Core and systemd

In preview 7, a new package was added to the Microsoft.Extensions set of packages that enables integration with systemd. For the Windows-focused, systemd provides functionality similar to Windows Services; there is a separate post covering how to do what we discuss here with Windows Services. This work was contributed by Tom Deseyn from Red Hat. In this post we will create a .NET Core app that runs as a systemd service. The integration makes systemd aware when the application has started or is stopping, and configures logs to be sent in a way that journald (the logging system of systemd) understands, including log priorities.

Link: https://devblogs.microsoft.com/dotnet/net-core-and-systemd/
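As a quick taste of the integration, here is a minimal sketch of a worker service's Program.cs, assuming the UseSystemd() host builder extension from the new Microsoft.Extensions.Hosting.Systemd package (the Worker class here is just a placeholder hosted service):

```csharp
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;

public class Program
{
    public static void Main(string[] args) =>
        Host.CreateDefaultBuilder(args)
            .UseSystemd() // notifies systemd on start/stop and adapts logging for journald
            .ConfigureServices(services => services.AddHostedService<Worker>())
            .Build()
            .Run();
}

// Placeholder background service that does nothing but wait.
public class Worker : BackgroundService
{
    protected override async Task ExecuteAsync(CancellationToken stoppingToken)
    {
        while (!stoppingToken.IsCancellationRequested)
        {
            await Task.Delay(1000, stoppingToken);
        }
    }
}
```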

An Introduction to System.Threading.Channels

I’ve recently begun making use of a relatively new (well, it’s a little over a year old at the time of writing) feature called “Channels”. The current version number is 4.5.0 (with a 4.6.0 preview also available as pre-release) which makes it sound like it’s been around for a lot longer, but in fact, 4.5.0 was the first stable release of this package!

In this post, I want to provide a short introduction to this feature, which I will hopefully build upon in later posts with some real-world scenarios explaining how and where I have successfully applied it.

Link: https://www.stevejgordon.co.uk/an-introduction-to-system-threading-channels
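As a taste of the API, here is a minimal producer/consumer sketch using an unbounded channel (my own example, not taken from Steve's post):

```csharp
using System;
using System.Threading.Channels;
using System.Threading.Tasks;

class Program
{
    static async Task Main()
    {
        var channel = Channel.CreateUnbounded<int>();

        // Producer: write a few items, then signal completion.
        var producer = Task.Run(async () =>
        {
            for (var i = 0; i < 5; i++)
            {
                await channel.Writer.WriteAsync(i);
            }
            channel.Writer.Complete();
        });

        // Consumer: read until the channel is completed and drained.
        while (await channel.Reader.WaitToReadAsync())
        {
            while (channel.Reader.TryRead(out var item))
            {
                Console.WriteLine(item);
            }
        }

        await producer;
    }
}
```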

screencastR – Simple screen sharing app using SignalR streaming

In this article, we will see how to create a simple screen sharing app using SignalR streaming. SignalR supports both server-to-client and client-to-server streaming. In my previous article, I covered server-to-client streaming using ChannelReader and ChannelWriter for streaming support. That approach can make asynchronous streaming look complex, much like writing an asynchronous method without the async and await keywords. IAsyncEnumerable is the latest addition in .NET Core 3.0 and C# 8 for asynchronous streaming, and it is now super easy to implement asynchronous streaming with a few lines of clean code. In this example, we will use client-to-server streaming to stream the desktop content to all the connected remote viewers using a SignalR stream with the support of the IAsyncEnumerable API.

Link: https://jeevasubburaj.com/2019/08/13/screencastr-simple-screensharing-app-using-signalr-streaming/
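For a rough idea of the client-to-server streaming shape described above, a hub method can simply accept an IAsyncEnumerable<T>. This is a hypothetical sketch with my own hub, method, and frame type names, not code from the article:

```csharp
using System.Collections.Generic;
using System.Threading.Tasks;
using Microsoft.AspNetCore.SignalR;

// Hypothetical hub: a client streams screen frames up to the server,
// which relays each frame to the other connected viewers.
public class ScreenShareHub : Hub
{
    public async Task ShareScreen(IAsyncEnumerable<string> frames)
    {
        await foreach (var frame in frames)
        {
            await Clients.Others.SendAsync("ReceiveFrame", frame);
        }
    }
}
```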

Avoiding ASP.NET Core Configuration Pitfalls With Array Values

ASP.NET Core continues to improve on the legacy of the .NET Framework. Our team is impressed with its performance and excited about future possibilities, but change is seldom a smooth transition. In this post, I’ll explain a pitfall you may run into using the newest configuration model in .NET Core and options to mitigate the issue.

Link: https://rimdev.io/avoiding-aspnet-core-configuration-pitfalls-with-array-values/

Event Sourcing with SQL Stream Store

I’ve known about SQL Stream Store for a while (since back when it was called Cedar, I believe) but haven’t really looked into it much. The premise is to provide a stream store over SQL, currently supporting MS SQL Server, PostgreSQL, and MySQL. In other words, it’s simply a .NET library you can use on top of a SQL implementation. Let’s take a look at how you can implement Event Sourcing with SQL Stream Store.

SQL Stream Store Demo Application

For this demo/sample, I’m going to create a .NET Core 3 console app. The idea will be to create the stereotypical event-sourcing example of a bank account.

I wanted to explore the primary operations you need from an event stream:

  • Create a Stream
  • Append a new Event to a Stream
  • Read Events from a Stream
  • Subscribe to a Stream to receive appended Events

NuGet Package

As always, the first thing is to get the SqlStreamStore NuGet Package by adding it as a PackageReference in your csproj.

I’ve also included Newtonsoft.Json because I’m going to be (de)serializing events.

Events

I’m going to create two events to represent the bank account transactions: Deposited and Withdrawn. Both of these will extend the abstract AccountEvent, which contains the TransactionId, dollar Amount, and DateTime of the event.
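Here is a sketch of what those event types might look like (my own shape, based on the description above):

```csharp
using System;

// Base type holding the data shared by all account transactions.
public abstract class AccountEvent
{
    public Guid TransactionId { get; set; }
    public decimal Amount { get; set; }
    public DateTime DateTime { get; set; }
}

public class Deposited : AccountEvent { }

public class Withdrawn : AccountEvent { }
```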

Creating a Stream

There is no way of creating an empty stream, which is actually really nice. Instead, when appending an event to a stream, the stream is created if it does not already exist. This is the same way EventStore works, and the API should feel familiar in comparison.

Appending to a Stream

Appending a new event to a stream is pretty straightforward. On the IStreamStore there is an AppendToStream method that takes a few arguments.

The first is the StreamId. This generally would represent your aggregate root ID. In this demo, I’m setting it to Account:{GUID}.

The second argument is the ExpectedVersion. This is also similar to the EventStore API and is used for handling concurrency. You can specify an integer that represents the number of events already persisted in the stream. You can also use the ExpectedVersion constants: Any to not concern yourself with concurrency, or NoStream to verify it’s the first event.

Finally, the third parameter is an instance of NewStreamMessage. It contains the MessageId (a GUID), an event name, and the event JSON body.

An interesting takeaway here is that the event name is intentionally a string so you are not serializing/deserializing to the CLR type name. This is a great idea, since the CLR type name is more likely to change than a plain string, which you can keep constant.
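Putting that together, a minimal append might look roughly like this. This is a sketch based on my reading of the SqlStreamStore API, using the InMemoryStreamStore mentioned later in the post and Newtonsoft.Json for the event body:

```csharp
using System;
using System.Threading.Tasks;
using Newtonsoft.Json;
using SqlStreamStore;
using SqlStreamStore.Streams;

class Program
{
    static async Task Main()
    {
        var store = new InMemoryStreamStore();
        var streamId = new StreamId($"Account:{Guid.NewGuid()}");

        var deposited = new Deposited
        {
            TransactionId = Guid.NewGuid(),
            Amount = 100m,
            DateTime = DateTime.UtcNow
        };

        var message = new NewStreamMessage(
            Guid.NewGuid(),                          // message id
            "Deposited",                             // event name as a plain string, not the CLR type name
            JsonConvert.SerializeObject(deposited)); // JSON body

        // ExpectedVersion.Any skips the concurrency check; the stream is
        // created automatically if it doesn't already exist.
        await store.AppendToStream(streamId, ExpectedVersion.Any, new[] { message });
    }
}
```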

Reading a Stream

You can read a stream forward or backward. Forward meaning from the very first event until the last, which is what I’ll use in this example.

You simply specify the StreamId, the starting version (0) and how many events to pull from the stream. The result contains an IsEnd to indicate there are no more events left in the stream to read.
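Continuing the sketch above, reading the stream forwards from version 0 might look something like this:

```csharp
// Read up to 100 events from the start (version 0) of the stream.
var page = await store.ReadStreamForwards(streamId, 0, 100);

foreach (var streamMessage in page.Messages)
{
    var json = await streamMessage.GetJsonData();
    Console.WriteLine($"{streamMessage.StreamVersion} {streamMessage.Type}: {json}");
}

// page.IsEnd tells us whether there are any events left to read.
Console.WriteLine($"End of stream reached: {page.IsEnd}");
```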

Subscribing to a Stream

Subscribing to a stream is also pretty straightforward. Specify the StreamId, the starting version/position from which you want to be notified of new events, and a StreamMessageReceived handler for the newly appended events.
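Again continuing the same sketch, a subscription might look roughly like this (my approximation of the API; the comment on the position parameter reflects my understanding):

```csharp
// Subscribe to the stream; the handler is invoked for each appended message.
var subscription = store.SubscribeToStream(
    streamId,
    null, // position to continue after; null starts from the beginning, as I read the API
    async (sub, streamMessage, cancellationToken) =>
    {
        var json = await streamMessage.GetJsonData(cancellationToken);
        Console.WriteLine($"Received {streamMessage.Type}: {json}");
    });

// Dispose the subscription when you no longer want to receive events.
// subscription.Dispose();
```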

Wrapping it up

Now that I’ve covered all the primary aspects, it’s just a matter of adding some input/output to our console app, allowing the user to deposit to and withdraw from the account.

In the demo, I’m using the InMemoryStreamStore so there is no persisted data upon restarting the app. I’m also using a new GUID to represent the AccountId on each run.

Source Code

All the source code shown in this post is available on GitHub.

This was a quick look at just a few of the APIs in SQL Stream Store but should give you a feel for how it works.

If you’d like more detailed content related to various aspects of event sourcing, let me know in the comments or on Twitter.

https://github.com/dcomartin/SqlStreamStore.Demo

Custom Metrics to AWS CloudWatch from ASP.NET Core

CloudWatch Custom Metrics

I was playing around with AWS CloudWatch and was curious about sending custom metrics from ASP.NET Core. Specifically, the execution time of an HTTP request.

AWS SDK

I created a simple middleware that starts a Stopwatch before calling the next middleware in the pipeline. When it returns, the middleware stops the Stopwatch and sends the data to CloudWatch.

First is to add the relevant NuGet packages.

Configuration

If you’ve never used the AWS SDK/packages, I recommend checking out my post on Configuring AWS SDK in ASP.NET Core. It goes over creating a named profile to store your AWS credentials and using appsettings or environment variables to pass them through to ASP.NET Core.

Middleware

Next is to create a simple middleware using the SDK. The middleware has IAmazonCloudWatch injected into its constructor; it is registered in Startup’s ConfigureServices.

We’re simply calling PutMetricDataAsync with a list containing a single MetricDatum, which holds the metric name, value, unit, timestamp, and dimensions.
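Here’s a rough sketch of what that middleware could look like. This is my own version rather than the exact code from the post; the namespace value, metric name, and dimension are made up for illustration:

```csharp
using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.Threading.Tasks;
using Amazon.CloudWatch;
using Amazon.CloudWatch.Model;
using Microsoft.AspNetCore.Http;

public class CloudWatchMetricsMiddleware
{
    private readonly RequestDelegate _next;
    private readonly IAmazonCloudWatch _cloudWatch;

    public CloudWatchMetricsMiddleware(RequestDelegate next, IAmazonCloudWatch cloudWatch)
    {
        _next = next;
        _cloudWatch = cloudWatch;
    }

    public async Task InvokeAsync(HttpContext context)
    {
        var stopwatch = Stopwatch.StartNew();

        await _next(context);

        stopwatch.Stop();

        // One datum per request: the elapsed time in milliseconds,
        // with the request path as a dimension.
        await _cloudWatch.PutMetricDataAsync(new PutMetricDataRequest
        {
            Namespace = "MyApp",
            MetricData = new List<MetricDatum>
            {
                new MetricDatum
                {
                    MetricName = "RequestExecutionTime",
                    Value = stopwatch.ElapsedMilliseconds,
                    Unit = StandardUnit.Milliseconds,
                    TimestampUtc = DateTime.UtcNow,
                    Dimensions = new List<Dimension>
                    {
                        new Dimension { Name = "Path", Value = context.Request.Path.Value }
                    }
                }
            }
        });
    }
}
```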

Startup

As mentioned, in order to use the AWS SDK and IAmazonCloudWatch in the middleware, you can call AddDefaultAWSOptions and AddAWSService&lt;IAmazonCloudWatch&gt; in ConfigureServices to register the appropriate types.
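And a minimal sketch of the Startup wiring, assuming the AWSSDK.Extensions.NETCore.Setup and AWSSDK.CloudWatch packages plus the hypothetical middleware above:

```csharp
using Amazon.CloudWatch;
using Microsoft.AspNetCore.Builder;
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.DependencyInjection;

public class Startup
{
    private readonly IConfiguration _configuration;

    public Startup(IConfiguration configuration) => _configuration = configuration;

    public void ConfigureServices(IServiceCollection services)
    {
        // Reads the AWS profile/region settings from configuration.
        services.AddDefaultAWSOptions(_configuration.GetAWSOptions());
        services.AddAWSService<IAmazonCloudWatch>();

        services.AddControllers();
    }

    public void Configure(IApplicationBuilder app)
    {
        // Register the metrics middleware early so it times the whole pipeline.
        app.UseMiddleware<CloudWatchMetricsMiddleware>();

        app.UseRouting();
        app.UseEndpoints(endpoints => endpoints.MapControllers());
    }
}
```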

Results

If you look in CloudWatch, you can now see the new custom metrics we’ve published.

Next

Before you start using this in a production environment, there are a couple of glaring issues.

First, sending the metrics to AWS, although incredibly fast, should be handled outside of the actual HTTP request. This can be done completely asynchronously from the request itself; there is no point in delaying the HTTP response to the client. To solve this, we’ll add a queue and do it in a background service.

Secondly, some of the cost associated with CloudWatch is based on the number of API calls. This is why there is a List<MetricDatum> in the PutMetricDataRequest. It allows you to send/batch multiple metrics together to limit the number of API calls.

A better solution is to collect the metric data separately and send it in batches from a background service.

More on that coming soon.

Enjoy this post? Subscribe!

Subscribe to our weekly Newsletter and stay tuned.