Each new .NET version brings performance enhancements, optimizations and new features to elevate productivity and application efficiency.
The release of .NET 9 is no exception, bringing a variety of enhancements, including performance improvements, new types and additional methods.
In today’s post, we’ll dive into the key advancements it offers.
LINQ Performance Improvements
When it comes to performance, .NET 9 truly raises the bar.
One of the standout enhancements is the optimization of LINQ methods. Not only have three new methods been introduced (covered in a previous blog post), but existing methods have also been significantly improved.
Fun fact: LINQ used to be significantly slower, which made it less popular, but after so many improvements over the years, it’s impressive to see the team take yet another step forward.
In this blog post, I’ve highlighted a few improvements that I’m excited to share:
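// NOTE: Excerpt from a BenchmarkDotNet class; the backing fields
// (_list, _values, _arrayDistinct, _appendSelect, _rangeReverse,
// _listDefaultIfEmptySelect, _listSkipTake, _rangeUnion) are assumed
// to be populated in a [GlobalSetup] method.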
[Benchmark]
public bool Any() => _list.Any(i => i == 1000);
[Benchmark]
public bool All() => _list.All(i => i >= 0);
[Benchmark]
public int Count() => _list.Count(i => i == 0);
[Benchmark]
public int First() => _list.First(i => i == 999);
[Benchmark]
public int Single() => _list.Single(i => i == 0);
[Benchmark]
public object Chunk() => _values.Chunk(10);
[Benchmark]
public object Distinct() => _values.Distinct();
[Benchmark]
public object GroupJoin() => _values.GroupJoin(_values, i => i, i => i, (i, j) => i);
[Benchmark]
public object Join() => _values.Join(_values, i => i, i => i, (i, j) => i);
[Benchmark]
public object ToLookup() => _values.ToLookup(i => i);
[Benchmark]
public object Reverse() => _values.Reverse();
[Benchmark]
public object SelectIndex() => _values.Select((s, i) => i);
[Benchmark]
public object SelectMany() => _values.SelectMany(i => i);
[Benchmark]
public object SkipWhile() => _values.SkipWhile(i => true);
[Benchmark]
public object TakeWhile() => _values.TakeWhile(i => true);
[Benchmark]
public object WhereIndex() => _values.Where((s, i) => true);
[Benchmark]
public int DistinctFirst() => _arrayDistinct.First();
[Benchmark]
public int AppendSelectLast() => _appendSelect.Last();
[Benchmark]
public int RangeReverseCount() => _rangeReverse.Count();
[Benchmark]
public int DefaultIfEmptySelectElementAt() => _listDefaultIfEmptySelect.ElementAt(999);
[Benchmark]
public int ListSkipTakeElementAt() => _listSkipTake.ElementAt(99);
[Benchmark]
public int RangeUnionFirst() => _rangeUnion.First();
Some of the examples and insights were derived from a blog post by Stephen Toub, so be sure to check it out!
The results are truly remarkable:
Interestingly, a 50% performance boost, while substantial, almost feels routine compared to optimizations exceeding 1000x.
The craziest part? No memory allocation.
But how were these incredible improvements achieved?
There have been many improvements, but one that stands out is the enhancement to the internal Iterator&lt;T&gt; abstraction. It's now used more efficiently: in some cases a single iterator instance is created where the previous implementation needed several.
Another key improvement in LINQ is the increased use of Spans in internal methods, resulting in better performance without triggering memory allocations.
You can also take advantage of this in your own code with the CollectionsMarshal.AsSpan method, which exposes the internal array backing a List&lt;T&gt; and lets you create a Span directly over it.
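Here's a minimal sketch of what that looks like (note that the list must not be resized while the span is in use):

using System.Runtime.InteropServices;

var numbers = new List<int> { 1, 2, 3, 4, 5 };

// AsSpan exposes the list's backing array directly as a Span<int>,
// so iterating it doesn't allocate an enumerator.
Span<int> span = CollectionsMarshal.AsSpan(numbers);

var sum = 0;
foreach (var number in span)
{
    sum += number;
}

Console.WriteLine(sum); // 15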
Exceptions Performance Improvements
.NET 9 has also made significant performance improvements in handling exceptions.
Since large applications often use exceptions for flow control and may throw millions of them, this enhancement can lead to a substantial performance boost.
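A benchmark for this boils down to a tight throw/catch loop; here's a minimal sketch (the ThrowAndCatch method is illustrative, not necessarily the exact code behind the numbers below):

[Benchmark]
public int ThrowAndCatch()
{
    var caught = 0;
    for (var i = 0; i < 1_000; i++)
    {
        try
        {
            // Throwing and catching exercises the exception-handling machinery.
            throw new InvalidOperationException();
        }
        catch (InvalidOperationException)
        {
            caught++;
        }
    }
    return caught;
}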
Here are the results I've got:
NOTE: This is not an encouragement to use exceptions more than necessary, but rather a presentation of the improvements that have been made.
For an alternative approach, take a look at my blog on the Result Pattern.
P.S. If you're interested in learning how to conduct your own benchmarks, check out this blog post.
Lock Type
.NET 9 introduces a dedicated type for locking: instead of locking on a plain object, we can now use the new System.Threading.Lock type.
private static readonly Lock Lock = new();
public void LockingInDotNet9()
{
lock (Lock)
{
// PS: You can't use await inside!
}
}
The lock statement now detects when the target is a Lock object. In such cases, it leverages the updated API instead of the traditional System.Threading.Monitor API.
Additionally, the compiler identifies scenarios where a Lock object is converted to another type, generating Monitor-based code accordingly.
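In other words, locking on a Lock instance is lowered to the new EnterScope API, roughly like this (a simplified sketch, not the exact code the compiler generates):

public void LockingInDotNet9Lowered()
{
    // Scope is a ref struct whose Dispose releases the lock.
    using (Lock.EnterScope())
    {
        // Critical section...
    }
}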
Time-Ordered GUIDs
Version 7 GUIDs can now be generated using the new Guid.CreateVersion7() and Guid.CreateVersion7(DateTimeOffset) methods.
Additionally, the Version property provides access to the version field of a GUID object.
// To learn more checkout: https://www.nikolatech.net/blogs/guids-vs-ulids
var guidV7 = Guid.CreateVersion7();
var guidV7WithTimestamp = Guid.CreateVersion7(DateTimeOffset.UtcNow);
01934991-207a-7f49-99f7-23fe10eb07d3
01934991-207a-7f5a-a673-71a48445b3ed
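And the Version property in action:

Console.WriteLine(guidV7.Version); // 7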
I’ve also covered this topic in a dedicated blog post, check it out!
Feature Switches
Feature switches are an excellent addition, enabling conditional inclusion or exclusion of functionality during builds.
This approach enhances app performance and reduces size, especially when leveraging trimming or Native AOT compilation.
FeatureSwitchDefinitionAttribute: Treats a feature switch property as a constant during trimming. Code protected by the switch is removed if the feature is disabled.
FeatureGuardAttribute: Marks a feature-switch property as a safeguard for code annotated with RequiresUnreferencedCodeAttribute, RequiresAssemblyFilesAttribute or RequiresDynamicCodeAttribute (a sketch of this appears after the example below).
To define switches in your project, you need to update your .csproj file:
<ItemGroup>
<RuntimeHostConfigurationOption Include="Feature.IsSupported" Value="true" Trim="true" />
</ItemGroup>
internal sealed class Feature
{
[FeatureSwitchDefinition("Feature.IsSupported")]
internal static bool IsSupported =>
AppContext.TryGetSwitch("Feature.IsSupported", out bool isEnabled) ? isEnabled : true;
internal static void Implementation() =>
Console.WriteLine("Feature is supported");
}
if (Feature.IsSupported)
{
Feature.Implementation();
}
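FeatureGuardAttribute works in a similar spirit. Here's a sketch of how it might be applied (ReflectionFeature is a made-up example; the attributes live in System.Diagnostics.CodeAnalysis):

internal sealed class ReflectionFeature
{
    // Hypothetical guard: when dynamic code isn't supported (e.g. Native AOT),
    // the trimmer treats this as false, removes the guarded code and
    // suppresses the RequiresDynamicCode warning.
    [FeatureGuard(typeof(RequiresDynamicCodeAttribute))]
    internal static bool IsSupported =>
        System.Runtime.CompilerServices.RuntimeFeature.IsDynamicCodeSupported;

    [RequiresDynamicCode("Uses runtime code generation.")]
    internal static void Implementation() =>
        Console.WriteLine("Reflection-based feature is supported");
}

if (ReflectionFeature.IsSupported)
{
    ReflectionFeature.Implementation();
}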
This leads to several benefits:
- Reduce Application Size: Exclude unused code, resulting in smaller binaries.
- Enhance Performance: Minimize the application's footprint for faster load times and improved efficiency.
- Customize Builds: Tailor applications to include only necessary features for specific deployment scenarios.
Task.WhenEach
Task.WhenEach is definitely one of my favorite new features.
Now, we can conveniently handle tasks as soon as they finish. When tasks run independently, the most efficient approach is to start processing them immediately after completion.
WhenEach returns an IAsyncEnumerable, enabling you to use await foreach to handle tasks as they complete.
Using WhenEach for this scenario is as simple as it gets:
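Assuming a handful of independent tasks that complete at different times (a hypothetical setup):

var tasks = Enumerable.Range(1, 5)
    .Select(async i =>
    {
        // Simulate work that finishes after a random delay.
        await Task.Delay(Random.Shared.Next(100, 1_000));
        return i;
    })
    .ToList();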
// With .NET 9 utilizing Task.WhenEach really simplifies logic
await foreach (var task in Task.WhenEach(tasks))
{
Console.WriteLine(await task);
}
Before .NET 9, you had to repeatedly call Task.WhenAny in a loop to retrieve the next completed task, or rely on third-party libraries. Constantly removing tasks from the list and re-scanning the rest, as you can imagine, wasn't an optimal approach for performance.
// Equivalent code before Task.WhenEach
while (tasks.Any())
{
var readyTask = await Task.WhenAny(tasks);
tasks.Remove(readyTask);
Console.WriteLine(await readyTask);
}
Json Schema Exporter
The JsonSchemaExporter class enables you to generate JSON schema documents from .NET types using a JsonSerializerOptions or JsonTypeInfo instance.
The resulting schema specifies the JSON serialization contract for the .NET type, detailing the structure of the data that can be serialized and deserialized.
You can customize the schema output by configuring the JsonSerializerOptions or JsonTypeInfo instance passed to the GetJsonSchemaAsNode method.
var schema = JsonSchemaExporter.GetJsonSchemaAsNode(
JsonSerializerOptions.Web, typeof(Order));
Console.WriteLine(schema.ToJsonString());
public class Order
{
public Guid Id { get; set; }
public string SerialNumber { get; set; }
public DateTime CreatedAt { get; set; }
public DateTime ModifiedAt { get; set; }
}
{
  "type": [
    "object",
    "null"
  ],
  "properties": {
    "id": {
      "type": "string",
      "format": "uuid"
    },
    "serialNumber": {
      "type": "string"
    },
    "createdAt": {
      "type": "string",
      "format": "date-time"
    },
    "modifiedAt": {
      "type": "string",
      "format": "date-time"
    }
  }
}
Params
Starting with .NET 9, the params modifier is no longer limited to array types.
void ProcessOrders(params IEnumerable<Order> orders)
{
// Processing orders...
}
You can now use params with any recognized collection type, including Span<T>, ReadOnlySpan<T>, and types that implement IEnumerable<T> and have an Add method.
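For example, a span-based params parameter lets the compiler pass the arguments without allocating an array (the Log helper below is just an illustration):

static void Log(params ReadOnlySpan<string> messages)
{
    foreach (var message in messages)
    {
        Console.WriteLine(message);
    }
}

Log("Order received", "Order validated", "Order shipped");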
Conclusion
In conclusion, .NET 9 introduces significant improvements in both performance and functionality.
The optimizations in LINQ, such as faster iteration and reduced memory allocations, greatly enhance the performance of common operations.
Additionally, improvements in exception handling and the introduction of a new Lock type further enhance efficiency.
The ability to generate Time-Ordered GUIDs, utilize feature switches, leverage Task.WhenEach and more, certainly make migrating to .NET 9, including its STS, worthwhile.
If you want to conduct additional testing or check out other enhancements, you can find the source code here:
Source Code
I hope you enjoyed it, subscribe and get a notification when a new blog is up!