
Sunday, 7 September 2025

Show Nuget Dependency Graph


In .NET Framework and .NET solutions, NuGet packages are added to projects as a convenient way of pulling in libraries. Each project can have multiple NuGet package dependencies, and each NuGet package can itself reference other packages, which in turn reference yet more packages, and so on. An overview of all the NuGet packages a project actually uses therefore distinguishes between top-level and transitive dependencies. Transitive dependencies are those referenced indirectly through the top-level ones: if your project references PackageA directly, PackageA is a top-level dependency (depth 1 in the dependency graph), and if PackageA references PackageB, PackageB is a transitive dependency (depth greater than 1). Do not confuse top-level dependencies, as the term is used in this article, with the packages listed directly in the project file (for example as a PackageReference in a .csproj file); top-level here simply means the package sits at depth one in the dependency graph.



function Show-NugetDependencyGraph {
    [CmdletBinding()]
    param ()
    $tempHtmlPath = [System.IO.Path]::GetTempFileName() + ".html"
    $assetFiles = Get-ChildItem -Recurse -Filter "project.assets.json"

    $currentProjects = (gci -recurse -filter *.csproj | select-object -expandproperty name) -join ',' # project file names, shown in the page heading

    $script:mermaidGraph = @"
<!DOCTYPE html>
<html>
<head>
  <script type="module">
    import mermaid from 'https://cdn.jsdelivr.net/npm/mermaid@10/dist/mermaid.esm.min.mjs';
    mermaid.initialize({ startOnLoad: true });
  </script>
  <style>
    body {
      font-family: sans-serif;
      padding: 20px;
    }
    .mermaid {
      background: #f9f9f9;
      padding: 20px;
      border-radius: 8px;
      min-height: 800px;
      overflow: hidden;
    }
    .mermaid svg {
      min-height: 800px;
      width: 100%;
      height: auto;
    }
  </style>
  <meta charset="UTF-8">
</head>

<body>
<h2>Nuget Dependency Graph for '$currentProjects' (Max Depth: 3)</h2>
<div class="mermaid">


graph TD


"@
    $visited = @{}
    $nodes = @{}
    $edges = @{}
    $topLevelDeps = @{}
    $transitiveDeps = @{}
    function Escape-MermaidLabel {
        param ([string]$text)
        # Replace characters that can break Mermaid node labels with their HTML entity codes
        $text = $text -replace '\(', '&#40;'
        $text = $text -replace '\)', '&#41;'
        $text = $text -replace '\[', '&#91;'
        $text = $text -replace '\]', '&#93;'
        $text = $text -replace ',', '&#44;'
        return $text
    }
    function Normalize-NodeId {
        param ([string]$text)
        return ($text -replace '[^a-zA-Z0-9_]', '_')
    }
    function Add-Dependencies {
        param (
            [string]$pkgName,
            [object]$targets,
            [int]$depth,
            [string]$path = ""
        )
        if ($depth -gt 3 -or $visited.ContainsKey($pkgName)) { return }
        $visited[$pkgName] = $true
        $pkgVersion = $pkgName.Split('/')[1]
        $pkgId = $pkgName.Split('/')[0]
        $escapedVersion = Escape-MermaidLabel($pkgVersion)
        $nodeId = Normalize-NodeId ("{0}_{1}" -f $pkgId, $pkgVersion)
        $nodeLabel = "$nodeId[""$pkgId<br/>v$escapedVersion""]:::level$depth"
        if (-not $nodes.ContainsKey($nodeId)) {
            $script:mermaidGraph += "$nodeLabel`n"
            $nodes[$nodeId] = $true
        }
        $currentPath = if ($path) { "$path → $pkgId ($pkgVersion)" } else { "$pkgId ($pkgVersion)" }
        if ($depth -eq 1) {
            $topLevelDeps["$pkgId/$pkgVersion"] = $currentPath
        } else {
            $transitiveDeps["$pkgId/$pkgVersion"] = $currentPath
        }
        foreach ($target in $targets.PSObject.Properties) {
            $pkg = $target.Value.$pkgName
            if ($pkg -and $pkg.dependencies) {
                foreach ($dep in $pkg.dependencies.PSObject.Properties) {
                    $depName = $dep.Name
                    $depVersion = $dep.Value
                    $escapedDepVersion = Escape-MermaidLabel($depVersion)
                    $depNodeId = Normalize-NodeId ("{0}_{1}" -f $depName, $depVersion)
                    $depNodeLabel = "$depNodeId[""$depName<br/>v$escapedDepVersion""]:::level$($depth+1)"
                    if (-not $nodes.ContainsKey($depNodeId)) {
                        $script:mermaidGraph += "$depNodeLabel`n"
                        $nodes[$depNodeId] = $true
                    }
                    $edge = "$nodeId --> $depNodeId"
                    if (-not $edges.ContainsKey($edge)) {
                        $script:mermaidGraph += "$edge`n"
                        $edges[$edge] = $true
                    }
                    Add-Dependencies ("$depName/$depVersion") $targets ($depth + 1) $currentPath
                }
            }
        }
    }
    foreach ($file in $assetFiles) {
        $json = Get-Content $file.FullName -Raw | ConvertFrom-Json # -Raw reads the whole file as a single string, needed for ConvertFrom-Json on Windows PowerShell
        $targets = $json.targets
        foreach ($target in $targets.PSObject.Properties) {
            $targetPackages = $target.Value
            foreach ($package in $targetPackages.PSObject.Properties) {
                Add-Dependencies $package.Name $targets 1
            }
        }
    }

    $topLevelDepsCount = $topLevelDeps.Count #number of top-level dependencies
    $transitiveDepsCount = $transitiveDeps.Count #number of transitive dependencies

    $script:mermaidGraph += @"
classDef level1 fill:#cce5ff,stroke:#004085,stroke-width:2px;
classDef level2 fill:#d4edda,stroke:#155724,stroke-width:1.5px;
classDef level3 fill:#fff3cd,stroke:#856404,stroke-width:1px;
classDef level4 fill:#f8d7da,stroke:#721c24,stroke-width:1px;
</div>
<script>
  function enablePanZoom(svg) {
    let isPanning = false;
    let startX, startY;
    let viewBox = svg.viewBox.baseVal;
    let zoomFactor = 1.1;
    // Initial zoom: scale to 200%
    const initialZoom = 2.0;
    const newWidth = viewBox.width / initialZoom;
    const newHeight = viewBox.height / initialZoom;
    viewBox.x += (viewBox.width - newWidth) / 2;
    viewBox.y += (viewBox.height - newHeight) / 2;
    viewBox.width = newWidth;
    viewBox.height = newHeight;
    svg.addEventListener("mousedown", (e) => {
      isPanning = true;
      startX = e.clientX;
      startY = e.clientY;
      svg.style.cursor = "grabbing";
    });
    svg.addEventListener("mousemove", (e) => {
      if (!isPanning) return;
      const dx = (e.clientX - startX) * (viewBox.width / svg.clientWidth);
      const dy = (e.clientY - startY) * (viewBox.height / svg.clientHeight);
      viewBox.x -= dx;
      viewBox.y -= dy;
      startX = e.clientX;
      startY = e.clientY;
    });
    svg.addEventListener("mouseup", () => {
      isPanning = false;
      svg.style.cursor = "grab";
    });
    svg.addEventListener("mouseleave", () => {
      isPanning = false;
      svg.style.cursor = "grab";
    });
    svg.addEventListener("wheel", (e) => {
      e.preventDefault();
      const { x, y, width, height } = viewBox;
      const mx = e.offsetX / svg.clientWidth;
      const my = e.offsetY / svg.clientHeight;
      const zoom = e.deltaY < 0 ? 1 / zoomFactor : zoomFactor;
      const newWidth = width * zoom;
      const newHeight = height * zoom;
      viewBox.x += (width - newWidth) * mx;
      viewBox.y += (height - newHeight) * my;
      viewBox.width = newWidth;
      viewBox.height = newHeight;
    });
    svg.style.cursor = "grab";
  }
  document.addEventListener("DOMContentLoaded", () => {
    setTimeout(() => {
      const svg = document.querySelector(".mermaid svg");
      if (svg) {
        enablePanZoom(svg);
      } else {
        console.warn("SVG not found after 1.5s.");
      }
    }, 1500);
  });
</script>
<h3>πŸ”Ž Filter Dependencies (Total Count: $($transitiveDepsCount + $topLevelDepsCount))</h3>
<input type="text" id="searchInput" onkeyup="filterTables()" placeholder="Search for NuGet package..." style="width: 100%; padding: 8px; margin-bottom: 20px; font-size: 16px;">
<style>
  table {
    border-collapse: collapse;
    width: 100%;
    margin-bottom: 40px;
    font-size: 14px;
  }
  th, td {
    border: 1px solid #ccc;
    padding: 8px;
    text-align: left;
  }
  tr:nth-child(even) {
    background-color: #f9f9f9;
  }
  tr:hover {
    background-color: #e2f0fb;
  }
  th {
    background-color: #007bff;
    color: white;
  }
</style>
<h3>πŸ“¦ Top-Level Dependencies (Count: $topLevelDepsCount)</h3>
<em>Note: Top-level dependencies are NuGet packages with a dependency path of length 1. To check which NuGet packages are actually listed in the project file(s), open the .csproj file(s) directly.</em>
<table id="topTable">
  <thead><tr><th>Package</th><th>Dependency Path</th></tr></thead>
  <tbody>
"@
    $sortedTopLevel = $topLevelDeps.GetEnumerator() | Sort-Object Name
    foreach ($dep in $sortedTopLevel) {
        $script:mermaidGraph += "<tr><td>$($dep.Key)</td><td>$($dep.Value)</td></tr>`n"
    }
    $script:mermaidGraph += @"
  </tbody>
</table>
<h3>πŸ“š Transitive Dependencies (Count: $transitiveDepsCount)</h3>
<table id="transitiveTable">
  <thead><tr><th>Package</th><th>Dependency Path</th></tr></thead>
  <tbody>
"@
    $sortedTransitive = $transitiveDeps.GetEnumerator() | Sort-Object Name
    foreach ($dep in $sortedTransitive) {
        $script:mermaidGraph += "<tr><td>$($dep.Key)</td><td>$($dep.Value)</td></tr>`n"
    }
    $script:mermaidGraph += @"
  </tbody>
</table>
<script>
function filterTables() {
  const input = document.getElementById('searchInput').value.toLowerCase();
  ['topTable', 'transitiveTable'].forEach(id => {
    const rows = document.getElementById(id).getElementsByTagName('tr');
    for (let i = 1; i < rows.length; i++) {
      const cells = rows[i].getElementsByTagName('td');
      const match = Array.from(cells).some(cell => cell.textContent.toLowerCase().includes(input));
      rows[i].style.display = match ? '' : 'none';
    }
  });
}
</script>
</body>
</html>
"@
    [System.IO.File]::WriteAllText($tempHtmlPath, $script:mermaidGraph, [System.Text.Encoding]::UTF8)
    Start-Process $tempHtmlPath
}

# Run the function
Show-NugetDependencyGraph


The Show-NugetDependencyGraph function above can be added to the $profile file of the user you are logged in as. Usage: navigate to the folder of the .NET Framework or .NET solution (or project) whose dependency graph you want to see, then run Show-NugetDependencyGraph. The script recursively looks for project.assets.json files in the subfolders (usually in the obj folder) and uses the ConvertFrom-Json cmdlet to inspect the file(s) found. Note that the script does support multiple projects, but the graph rendering has limitations when too many NuGet packages are drawn into one and the same graph, so the best analysis is done per project. The Add-Dependencies function is called recursively to fill the script-level hash tables that keep track of the top-level and transitive dependencies. The script also builds up the HTML template that presents the Mermaid-based dependency graph, along with a small amount of vanilla JavaScript that adds pan and zoom support for an interactive graph, plus tables for the top-level and transitive NuGet packages. Note the use of the script-scoped variable $script:mermaidGraph: because the graph string is appended to inside a recursive function, the variable must live at script scope rather than in a local scope. The script supports both .NET and .NET Framework. The screenshots below show examples after running the PowerShell script.
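For reference, the structure the script walks looks roughly like this (a trimmed, illustrative excerpt of a project.assets.json file; the package names and versions are placeholders):

{
  "targets": {
    ".NETFramework,Version=v4.8": {
      "SomePackage/1.2.3": {
        "type": "package",
        "dependencies": {
          "SomeTransitivePackage": "4.5.6"
        }
      }
    }
  }
}

Each key under a target framework has the form "PackageId/Version", which is why Add-Dependencies splits the package name on '/', and each entry's dependencies object is what drives the recursion.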

Table showing the transitive dependencies:
Example of a dependency graph of NuGet libraries:

Sunday, 1 June 2025

Using SqlBulkCopy with EntityFramework

This article tests out various ways of doing bulk inserts with Entity Framework. The code shown in this article uses .NET Framework 4.8
and Entity Framework 6.5.0. Support is better in Entity Framework Core, so the SqlBulkCopy code shown here must be adjusted a little to also work with Entity Framework Core. A GitHub repo has been added here:

https://github.com/toreaurstadboss/BulkOperationsEntityFramework

A benchmark has been run to test the different approaches. Not surprisingly, SqlBulkCopy is the fastest and most economical way of performing the bulk inserts (it also uses the least memory). The code for handling SqlBulkCopy is provided via the extension methods listed below:

BulkInsertExtensions.cs




using System.Collections.Generic;
using System.Data;
using System.Data.Entity;
using System.Data.SqlClient;
using System.Threading.Tasks;

namespace BulkOperationsEntityFramework.Lib.Extensions
{

    /// <summary>
    /// Provides extension methods for performing bulk insert operations using SqlBulkCopy.
    /// </summary>
    public static class BulkInsertExtensions
    {
        /// <summary>
        /// Performs a bulk insert of the specified entities into the database.
        /// </summary>
        /// <typeparam name="T">The type of the entity.</typeparam>
        /// <param name="context">The DbContext instance.</param>
        /// <param name="entities">The collection of entities to insert.</param>
        /// <param name="tableName">
        /// Optional: The name of the destination table. If null, the entity type name is used.
        /// </param>
        /// <param name="columnMappings">
        /// Optional: Custom column mappings (propertyName → columnName).
        /// 
        /// <example>
        /// Example 1: Rename columns
        /// <code>
        /// var mappings = new Dictionary<string, string>
        /// {
        ///     { "Name", "ProductName" },
        ///     { "Price", "UnitPrice" }
        /// };
        /// await context.BulkInsertAsync(products, "Products", mappings);
        /// </code>
        /// </example>
        /// 
        /// <example>
        /// Example 2: Ignore properties using attributes
        /// <code>
        /// public class Customer
        /// {
        ///     public int Id { get; set; }
        ///     public string FullName { get; set; }
        /// 
        ///     [BulkIgnore]
        ///     public string TempNotes { get; set; }
        /// 
        ///     [NotMapped]
        ///     public string CalculatedField { get; set; }
        /// }
        /// 
        /// var mappings = new Dictionary<string, string>
        /// {
        ///     { "FullName", "CustomerName" }
        /// };
        /// await context.BulkInsertAsync(customers, "Customers", mappings);
        /// </code>
        /// </example>
        /// 
        /// <example>
        /// Example 3: Snake_case mapping
        /// <code>
        /// var mappings = new Dictionary<string, string>
        /// {
        ///     { "FirstName", "first_name" },
        ///     { "LastName", "last_name" },
        ///     { "Email", "email_address" }
        /// };
        /// await context.BulkInsertAsync(users, "user_accounts", mappings);
        /// </code>
        /// </example>
        /// </param>
        /// <param name="bulkCopyTimout">The bulk copy timeout in seconds. Default is set to 30.</param>
        public static void BulkInsert<T>(this DbContext context, IEnumerable<T> entities, string tableName = null,
            Dictionary<string, string> columnMappings = null, int bulkCopyTimeout = 30)
            where T : class
        {
            var dataTable = entities.ToBulkDataTable(columnMappings, out var finalMappings);
            BulkCopy(context, dataTable, tableName ?? typeof(T).Name, finalMappings, bulkCopyTimeout);
        }

        /// <summary>
        /// Asynchronously performs a bulk insert of the specified entities into the database.
        /// </summary>
        /// <typeparam name="T">The type of the entity.</typeparam>
        /// <param name="context">The DbContext instance.</param>
        /// <param name="entities">The collection of entities to insert.</param>
        /// <param name="tableName">
        /// Optional: The name of the destination table. If null, the entity type name is used.
        /// </param>
        /// <param name="columnMappings">
        /// Optional: Custom column mappings (propertyName → columnName).
        /// 
        /// <example>
        /// Example 1: Rename columns
        /// <code>
        /// var mappings = new Dictionary<string, string>
        /// {
        ///     { "Name", "ProductName" },
        ///     { "Price", "UnitPrice" }
        /// };
        /// await context.BulkInsertAsync(products, "Products", mappings);
        /// </code>
        /// </example>
        /// 
        /// <example>
        /// Example 2: Ignore properties using attributes
        /// <code>
        /// public class Customer
        /// {
        ///     public int Id { get; set; }
        ///     public string FullName { get; set; }
        /// 
        ///     [BulkIgnore]
        ///     public string TempNotes { get; set; }
        /// 
        ///     [NotMapped]
        ///     public string CalculatedField { get; set; }
        /// }
        /// 
        /// var mappings = new Dictionary<string, string>
        /// {
        ///     { "FullName", "CustomerName" }
        /// };
        /// await context.BulkInsertAsync(customers, "Customers", mappings);
        /// </code>
        /// </example>
        /// 
        /// <example>
        /// Example 3: Snake_case mapping
        /// <code>
        /// var mappings = new Dictionary<string, string>
        /// {
        ///     { "FirstName", "first_name" },
        ///     { "LastName", "last_name" },
        ///     { "Email", "email_address" }
        /// };
        /// await context.BulkInsertAsync(users, "user_accounts", mappings);
        /// </code>
        /// </example>
        /// </param>
        /// <param name="bulkCopyTimout">The bulk copy timeout in seconds. Default is set to 30.</param>
        public static async Task BulkInsertAsync<T>(this DbContext context, IEnumerable<T> entities, string tableName = null,
            Dictionary<string, string> columnMappings = null, int bulkCopyTimout = 30)
            where T : class
        {
            var dataTable = entities.ToBulkDataTable(columnMappings, out var finalMappings);
            await BulkCopyAsync(context, dataTable, tableName ?? typeof(T).Name, finalMappings, 30);
        }

        private static void BulkCopy(DbContext context, DataTable table, string tableName,
            Dictionary<string, string> finalMappings, int bulkCopyTimeout = 30)
        {
            var connection = (SqlConnection)context.Database.Connection;
            var wasClosed = connection.State == ConnectionState.Closed;

            if (wasClosed)
                connection.Open();

            using (var bulkCopy = new SqlBulkCopy(connection))
            {
                bulkCopy.DestinationTableName = tableName;
                bulkCopy.BulkCopyTimeout = bulkCopyTimeout;

                foreach (var map in finalMappings)
                {
                    bulkCopy.ColumnMappings.Add(map.Key, map.Value);
                }
                bulkCopy.WriteToServer(table);
            }

            if (wasClosed)
                connection.Close(); // Ensure the connection is closed after the operation, if it was closed before
        }

        private static async Task BulkCopyAsync(DbContext context, DataTable table, string tableName, Dictionary<string, string> mappings,
            int bulkCopyTimeout = 30)
        {
            var connection = (SqlConnection)context.Database.Connection;
            var wasClosed = connection.State == ConnectionState.Closed;

            if (wasClosed)
                await connection.OpenAsync();

            using (var bulkCopy = new SqlBulkCopy(connection))
            {
                bulkCopy.DestinationTableName = tableName;
                bulkCopy.BulkCopyTimeout = bulkCopyTimeout;
                foreach (var map in mappings)
                {
                    bulkCopy.ColumnMappings.Add(map.Key, map.Value);
                }
                await bulkCopy.WriteToServerAsync(table);
            }

            if (wasClosed)
                connection.Close();  //Ensure the connection is closed after the operation, if it was closed before
        }
    }
}





As the code shows, we use the DbContext from Entity Framework and cast its Database.Connection property to SqlConnection to obtain the connection that SqlBulkCopy needs. SqlBulkCopy is part of ADO.NET and relies on column mappings. The code above ignores the properties marked with the BulkIgnore attribute or the NotMapped attribute.
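A minimal usage sketch, assuming an async method and the ApplicationDbContext and User entity from the benchmark further below (the sample values are placeholders):

using (var context = new ApplicationDbContext())
{
    var users = new List<User>
    {
        new User { Email = "ada@example.com", FirstName = "Ada", LastName = "Lovelace", PhoneNumber = "555-0100" }
    };
    // One round-trip: streams all rows to the Users table via SqlBulkCopy
    await context.BulkInsertAsync(users, "Users");
}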

BulkIgnoreAttribute.cs



using System;

namespace BulkOperationsEntityFramework.Lib.Attributes
{

    /// <summary>
    /// Indicates that a property should be ignored during bulk insert operations.
    /// </summary>
    [AttributeUsage(AttributeTargets.Property)]
    public class BulkIgnoreAttribute : Attribute
    {
    }

}


The following helper class creates a DataTable from a collection of entities (an IEnumerable) and uses reflection to build the property-to-column mapping. Note that column mappings can also be provided to the BulkInsertExtensions methods shown above to override the mapping, if desired. The default SQL bulk copy timeout is set to 30 seconds (which is also SqlBulkCopy's own default).

DataTableExtensions.cs


The following code creates a DataTable for a collection of objects (IEnumerable). BulkInsertExtensions uses the ToBulkDataTable method, which skips properties marked with the [NotMapped] or [BulkIgnore] attributes. Both ToDataTable methods shown here allow column mappings to customize the property-to-column mapping.

using BulkOperationsEntityFramework.Lib.Attributes;
using System;
using System.Collections.Generic;
using System.Data;
using System.Linq;
using System.Reflection;

namespace BulkOperationsEntityFramework.Lib.Extensions
{

    public static class DatatableExtensions
    {

        /// <summary>
        /// Converts an IEnumerable of type T to a DataTable.
        /// </summary>
        /// <typeparam name="T"></typeparam>
        /// <param name="data"></param>
        /// <param name="columnMappings"></param>
        /// <returns></returns>
        public static DataTable ToDataTable<T>(this IEnumerable<T> data, Dictionary<string, string> columnMappings = null)
        {
            var dataTable = new DataTable();
            var properties = typeof(T).GetProperties(BindingFlags.Public | BindingFlags.Instance);

            foreach (var prop in properties)
            {
                var columnName = columnMappings != null && columnMappings.ContainsKey(prop.Name) ? columnMappings[prop.Name] : prop.Name;
                dataTable.Columns.Add(columnName, Nullable.GetUnderlyingType(prop.PropertyType) ?? prop.PropertyType);
            }

            foreach (var item in data)
            {
                var values = properties.Select(p => p.GetValue(item) ?? DBNull.Value).ToArray();
                dataTable.Rows.Add(values);
            }

            return dataTable;
        }

        /// <summary>
        /// Converts an IEnumerable of type T to a DataTable with specified column mappings. Tailored for Bulk operations.
        /// </summary>
        /// <typeparam name="T"></typeparam>
        /// <param name="entities"></param>
        /// <param name="columnMappings"></param>
        /// <param name="finalMappings"></param>
        /// <returns></returns>
        public static DataTable ToBulkDataTable<T>(this IEnumerable<T> entities, Dictionary<string, string> columnMappings, out Dictionary<string, string> finalMappings)
        {
            var dataTable = new DataTable();
            var properties = typeof(T).GetProperties(BindingFlags.Public | BindingFlags.Instance)
                .Where(p =>
                    !Attribute.IsDefined(p, typeof(System.ComponentModel.DataAnnotations.Schema.NotMappedAttribute)) &&
                    !Attribute.IsDefined(p, typeof(BulkIgnoreAttribute)))
                .ToArray();

            finalMappings = new Dictionary<string, string>();

            foreach (var prop in properties)
            {
                var columnName = columnMappings != null && columnMappings.ContainsKey(prop.Name)
                    ? columnMappings[prop.Name]
                    : prop.Name;

                dataTable.Columns.Add(columnName, Nullable.GetUnderlyingType(prop.PropertyType) ?? prop.PropertyType);
                finalMappings[prop.Name] = columnName;
            }

            foreach (var entity in entities)
            {
                var values = properties.Select(p => p.GetValue(entity) ?? DBNull.Value).ToArray();
                dataTable.Rows.Add(values);
            }

            return dataTable;
        }

    }
}
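Given an IEnumerable<User> users, a short sketch of calling ToBulkDataTable directly (the snake_case mapping is just an illustration, assuming the User entity from the benchmark below):

var mappings = new Dictionary<string, string> { { "FirstName", "first_name" } };
DataTable table = users.ToBulkDataTable(mappings, out var finalMappings);
// finalMappings now maps every included property to its destination column,
// e.g. FirstName -> first_name; unmapped properties map to their own names

Properties marked [NotMapped] or [BulkIgnore] are absent from both the DataTable and finalMappings, so SqlBulkCopy never sees them.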




BulkInsertBenchmark

The following benchmark is available in the solution for benchmarking the different approaches to bulk copy with EntityFramework.
  • EF - add one entity and save inside a loop. Not surprisingly, this performs the worst due to the many round-trips to the database.
  • EF - add entities one by one and save once at the end. Better performance, since there is only one round-trip, although it can still struggle when very many items are added in one batch.
  • EF - AddRange and save at the end. Similar to the one above, one round-trip to the database.
  • Dapper - insert the batch in one statement. Slightly faster than the previous two. One round-trip to the database.
  • SqlBulkCopy - clearly the fastest way to insert a batch of entities into the database.



using BenchmarkDotNet.Attributes;
using Bogus;
using BulkOperationsEntityFramework.Lib.Extensions;
using BulkOperationsEntityFramework.Models;
using Dapper;
using System.Configuration;
using System.Data.SqlClient;
using System.Linq;
using System.Threading.Tasks;

namespace BulkOperationsEntityFramework.Benchmarks
{

    [MemoryDiagnoser]
    public class BulkInsertBenchmark
    {

        private static readonly Faker Faker = new Faker();

        [Params(100)]
        public int Size { get; set; }

        //First benchmark - naive approach: add one entity at a time and save changes every time, with a round-trip to the database per save

        [Benchmark(Description = "EFAddOneAndSave - Add one user and then save. Then add another one. Results in many round-trips to database.")]
        public async Task EFAddOneAndSave()
        {
            using (var context = new ApplicationDbContext())
            {
                foreach (var user in GetUsers())
                {
                    context.Users.Add(user);
                    await context.SaveChangesAsync();
                }
            }
        }

        [Benchmark(Description = "EFAddOneByOneAndSave - Add one by one user, but save once after adding them. Results in one round-trip to database")]
        public async Task EFAddOneByOneAndSave()
        {
            using (var context = new ApplicationDbContext())
            {
                foreach (var user in GetUsers())
                {
                    context.Users.Add(user);

                }
                await context.SaveChangesAsync();
            }
        }

        [Benchmark(Description = "EFAddRange - Adds the users, then does the save. Results in one round-trip to database")]
        public async Task EFAddRange()
        {
            using (var context = new ApplicationDbContext())
            {
                var users = GetUsers();
                context.Users.AddRange(users);
                await context.SaveChangesAsync();
            }
        }

        [Benchmark(Description = "DapperInsertRange - Uses Dapper to insert the users. Results in one round-trip to database")]
        public async Task DapperInsertRange()
        {
            var connectionString = ConfigurationManager.ConnectionStrings["App"].ConnectionString;

            string sql = @"
                INSERT INTO Users (Email, FirstName, LastName, PhoneNumber)
                VALUES (@Email, @FirstName, @LastName, @PhoneNumber)
            ".Trim();

            using (var connection = new SqlConnection(connectionString))
            {
                var users = GetUsers().Select(u => new 
                {
                    u.Email,
                    u.FirstName,
                    u.LastName,
                    u.PhoneNumber
                }).ToArray();

                await connection.ExecuteAsync(sql, users);
            }
        }

        [Benchmark(Description = "SqlBulkCopy - Uses SqlBulkCopy to insert the users. Results in one round-trip to database")]
        public async Task SqlBulkCopy()
        {
            using (var context = new ApplicationDbContext())
            {
                await context.BulkInsertAsync(GetUsers(), "Users");
            }
        }



        private User[] GetUsers() =>
            Enumerable.Range(1, Size).Select(i => new User
            {
                Email = Faker.Internet.Email(),
                FirstName = Faker.Name.FirstName(),
                LastName = Faker.Name.LastName(),
                PhoneNumber = Faker.Phone.PhoneNumber()
            }).ToArray();

    }

}


The following screenshots show the benchmark results after running BenchmarkDotNet for the mentioned approaches and comparing their performance.

Results

Benchmark for a given batch size of 100


Saturday, 19 March 2022

Using C# 9 language features in .NET Framework and .NET Standard projects

C# 7.0 came out in March 2017, and Microsoft has since shipped newer frameworks such as .NET Core and .NET 5, plus .NET 6. If you are working with a .NET Framework based solution (or .NET Standard 2.0), you can actually get support for the C# 8 and C# 9 language versions, enabling you to utilize more of the C# language features. The following steps can be used to enable C# language version 9 in, for example, .NET Framework 4.8 (tested and verified: I could use records, a C# 9 language feature).
  • Specify the <LangVersion> element set to 9.0 in the .csproj file(s) where you want to use it.
  • Consider using a file called Directory.Build.props at the root level of your solution (the file name is case sensitive on Linux) with this shared setting to enable C# 9.0 in all projects, as sketched right after this list.
  • Using C# language version 9 also requires you to include a small file in each project, listed further below; call it IsExternalInitPatch.cs for example.
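A minimal Directory.Build.props might look like this (a sketch; it assumes the file is placed next to the .sln file so MSBuild picks it up for every project in the solution):

<Project>
  <PropertyGroup>
    <LangVersion>9.0</LangVersion>
  </PropertyGroup>
</Project>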
The file IsExternalInitPatch.cs should contain the following:

  
namespace System.Runtime.CompilerServices
{
    internal static class IsExternalInit { }
}
  


Now you can start playing around with C# 9 in, for example, a .NET Framework 4.8 solution, which has earlier been limited to C# 7.3 and no later language version features of C#.

namespace SomeAcme.SomeProduct.Common.Test
{
    /// <summary>
    /// This is just a test of csharp 9 for SomeAcme.SomeProduct
    /// Note that Directory.Build.props in this branch uses LangVersion set to 9.0 and we need the file IsExternalInit.cs in every project
    /// </summary>
    /// <remarks>
    /// See these two urls: 
    /// https://btburnett.com/csharp/2020/12/11/csharp-9-records-and-init-only-setters-without-dotnet5.html
    /// https://blog.ndepend.com/using-c9-record-and-init-property-in-your-net-framework-4-x-net-standard-and-net-core-projects/
    /// </remarks>
    [TestFixture]
    public class TestOutCsharpNine
    {

        public record Operasjon (DateTime StartTid, bool ErElektiv, string PasientNavn);


        [Test]
        public void Test_Records_CsharpNine_And_Deconstruction_And_Discardable_Variables()
        {
            var op = new Operasjon(DateTime.Today.AddHours(8).AddMinutes(15), true, "Bjarne BrΓΈndbo");
            (_, _, string pasientNavn) = op;
            pasientNavn.Should().Be(op.PasientNavn);
        }

        [Test]
        public void Test_Init_Only_Props()
        {
            var op = new OperasjonWithInitOnlyProps
            {
                ErElektiv = true,
                PasientNavn = "Thomas BrΓΈndbo"
            };
            // op.PasientNavn = "foo"; 
            //uncommenting the line above demonstrates that an init-only property gives a compiler error if you try to mutate or alter it
            op.PasientNavn.Should().Contain("BrΓΈndbo");
        }

        [DataContract]
        public class OperasjonWithInitOnlyProps
        {
            [DataMember]
            public string PasientNavn { get; init; }
            [DataMember]
            public bool ErElektiv { get; init; }
        }
    }
}


The C# compiler determines the default language version based on the target framework, according to these rules:

Target framework    Version    C# language version default
.NET                6.x        C# 10
.NET                5.x        C# 9.0
.NET Core           3.x        C# 8.0
.NET Core           2.x        C# 7.3
.NET Standard       2.1        C# 8.0
.NET Standard       2.0        C# 7.3
.NET Standard       1.x        C# 7.3
.NET Framework      all        C# 7.3

So .NET Framework and .NET Standard based solutions have by default not received any modernization of C# language features since March 2017 (five years ago), but with some small modifications we can still use C# 9.0, which came out about 1.5 years ago. Of course, this C# language version is meant to be used with .NET 5, so do not expect everything to be supported. However, chances are high that much of the C# 8 and C# 9 feature set will be handy in many .NET Framework and .NET Standard based projects. For example, records, with their support for immutability, are definitely a big new thing compared to what is available in C# language version 8 or earlier. Lastly, you must also consider how to build C# 9 language features (which assume the .NET 5 toolchain) on a build server. For TeamCity, for example, you must install the .NET 5 SDK on the build agent.
Also, you most likely have an MSBuild step in TeamCity, so you should use MSBuild 16 (VS 2019 build tools) and install the Build Tools for VS 2019 on the build agent from this url, or search for Build Tools for VS 2019: https://visualstudio.microsoft.com/thank-you-downloading-visual-studio/?sku=BuildTools&rel=16&src=myvs&utm_medium=microsoft&utm_source=my.visualstudio.com&utm_campaign=download&utm_content=vs+buildtools+2019 For Azure DevOps, choose the VS 2022 hosted agent. I still had to add a "Use .NET Core" task with 'Package to install' set to 'SDK (contains runtime)'; the YAML looks like this:

steps:
- task: UseDotNet@2
  displayName: 'Use .NET Core sdk 5.0.100'
  inputs:
    version: 5.0.100
    includePreviewVersions: true

Also note this - although a project might target .NET Framework 4.8, a config file such as app.config might contain this:

<?xml version="1.0" encoding="utf-8" ?>
<configuration>
    <startup> 
        <supportedRuntime version="v4.0" sku=".NETFramework,Version=v4.6.1" />
    </startup>
	<appSettings>
 
 
The supportedRuntime element might force you to set LangVersion to a lower value in that specific project anyway. So you might, for example, need LangVersion 7.1 in one project while defaulting to LangVersion 9.0 elsewhere, as sketched below.
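A per-project override in the affected .csproj might look like this (a sketch; properties set in the project file win over Directory.Build.props, since that file is imported at the top of the project file):

<PropertyGroup>
  <LangVersion>7.1</LangVersion>
</PropertyGroup>

To sum up: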
  • .NET Framework and .NET Standard can still use C# language version 8 or 9. You need to make the adjustments described in this article.
  • C# language version 10 is only supported by .NET 6. To use that language version you have to upgrade the framework.
  • Test out new language features in one project first, and start with the basic features. Advanced features of C# language version 8 or 9 may have some glitches on older frameworks, but you should get a compiler warning or error for most problems you encounter.
  • Don't forget that your build agent must be able to build the solution too. You can use the VS 2022 hosted agent, and consider the Use .NET Core SDK task mentioned here if you build in Azure DevOps. If you use a self-hosted agent, such as a TeamCity on-premises build agent, you need to install the newest VS 2019 SDK / Build Tools to ensure the C# language version is available.
In the Developer Command Prompt on the build agent you can run this command:
 
  csc -langversion:? 
 
This should output the C# language versions your build agent supports. It also works on a developer PC (use the VS 2019 Developer Command Prompt). As noted, C# 10 is only supported in .NET 6. We might have a future situation where C# 11 is still supported in a .NET 6 solution - I am not sure what Microsoft is planning here. But for other and earlier frameworks, it looks like C# 9 is the end of the road for language versions - we have to upgrade to .NET 6 to utilize newer language features (or consider pulling in compiler NuGet packages).