
[.NET 10.0] dnx [Preview 6]


.NET 10.0 Preview 6 was released yesterday before I went to bed. I'll pick out a few points that caught my interest.

https://x.com/sator_imaging/status/1945290308951929168

dnx & dotnet tool exec

Just like npx (Node.js Package eXecute?), you can now execute dotnet tools without installing the package.

dnx PackageName --yes -- [options...]

If you omit --yes, it asks for permission on the first run, just like npx, but I wonder whether that's really necessary. Isn't the whole point being able to run without installing, for CI/CD use cases...?

At the moment, dnx doesn't resolve the command name from the NuGet package's ToolCommandName; it goes by the package ID. So I published the tool I made the other day 👇 as a NuGet package named static-import (!), so that I can run dnx static-import.
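For reference, both names are set in the tool's project file. A minimal sketch using the standard dotnet-tool properties (the values here just mirror my static-import example; the rest of the project file is omitted):

```xml
<!-- Minimal sketch of a dotnet tool project file -->
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <OutputType>Exe</OutputType>
    <TargetFramework>net10.0</TargetFramework>
    <PackAsTool>true</PackAsTool>
    <!-- dnx currently resolves by PackageId, not ToolCommandName,
         so keeping the two identical makes `dnx static-import` work -->
    <PackageId>static-import</PackageId>
    <ToolCommandName>static-import</ToolCommandName>
  </PropertyGroup>
</Project>
```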

https://zenn.dev/sator_imaging/articles/7b1df223d17d89

Random Thoughts

It's nice, but the concern is that dotnet isn't as widespread as npx (npm) yet. It's often not included in Docker images by default, so what should we do? That's the impression I get.

One of the strengths of .NET is that you can do almost anything with just the standard libraries. On the other hand, the large SDK installation size might make it unsuitable for small use cases. In that regard, Node.js is good because while node_modules ends up everywhere, the installation size is small (around 100MB all-inclusive).

Maybe we should just make it so you can go from SDK installation to execution with:

npx dotnet -- package@x.x.x [options...]

Though the package name might already be taken.

(In the future, maybe everything will be solved with a "power play" like distributing lightweight, native execution environments compiled for each environment?)

Native AOT

https://github.com/dotnet/core/blob/main/release-notes/10.0/preview/preview6/sdk.md#platform-specific-net-tools

.NET tools can now be published with support for multiple Runtime Identifiers (RIDs) in a single package. Tool authors can bundle binaries for all supported platforms, and the .NET CLI will select the correct one at installation or execution time. This makes creating and distributing cross-platform tools much easier.

thx: DeepL translation

It's a bit confusing because the ToolType option shown in the sample isn't actually a newly added option.

dotnet pack -p ToolType=<variation>  # This isn't universally applicable!

If you look at the sample repository, you'll see that it's just a hard-coded <ToolType> property in the C# project file, which is then used to change options.

https://github.com/baronfel/multi-rid-tool/blob/81e93cec8ba7aacdd1f3933fd594cf55f6b3e186/toolsay/toolsay.csproj#L28-L56

Specifying multiple targets in <RuntimeIdentifiers> and running dotnet pack doesn't batch them into one package or anything like that. You still need to package each runtime individually with -r or --use-current-runtime, as before.
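In other words, the packaging flow still looks like the usual per-RID loop. A hedged sketch (the RIDs and output folder are arbitrary, and this assumes the .NET 10 preview SDK):

```shell
# Hypothetical packaging flow: one .nupkg per RID, plus the RID-agnostic one.
for rid in linux-x64 win-x64 osx-arm64; do
  dotnet pack -r "$rid" -o ./nupkgs
done
dotnet pack -o ./nupkgs   # RID-agnostic "manifest" package
```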

The sample GitHub Action has .zip files packaged in the following format as artifacts:

  • all-aot-packages.zip
    • aot-package-agnostic/
      • <package>.<version>.nupkg
    • aot-package-ARCH/
      • <package>.<ARCH>.<version>.nupkg

It's unclear whether this structure lets dnx toolsay pick up and execute the correct AOT build. I couldn't test it because the actual package hasn't been uploaded to NuGet.

I tried testing it with static-import, but gave up because the native AOT compilation failed at the linker stage.

At any rate, the feature to:

  • automatically bundle all native AOT builds into a single .nupkg (ZIP archive) nicely via dotnet pack

has not been implemented. In short, dotnet pack hasn't changed. Most likely, it's just that the dotnet tool exec side has been updated to pick the best file from within the .nupkg.

Ultimately, since all the runtimes need to be bundled into a single .nupkg (.zip), I suppose we're waiting for an update to dotnet pack. A .nupkg that can run without library dependencies might result in a massive distribution size, so I wonder... where would we use it? It might be a bit heavy for dnx.

while Loop Optimization

This is an optimization that can be applied (manually) even in older environments like Unity.

https://x.com/sator_imaging/status/1945302891813589128

The compilation results are as follows (verified in Unity).

while (Traditional)

  • Unconditional jump to the while condition check
  • Code block inside the while loop
  • Check for while continuation condition (where the initial unconditional jump lands)
  • Code block after exiting the while loop

if...do...while (Optimized)

  • Check with the same condition as while; if false, jump to after the loop
  • Code block inside the do...while loop
  • Check for do...while continuation condition
  • Code block after exiting the do...while loop (where the initial conditional branch lands)
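The two layouts above can also be written by hand in C#, which is what the "manual" optimization for old environments like Unity amounts to. A minimal sketch (the method names are mine; both compute the same result, only the emitted branch layout differs):

```csharp
static class LoopDemo
{
    // Traditional layout: the compiler emits an unconditional jump
    // over the body to the condition check before the first iteration.
    public static int SumWhile(int[] values)
    {
        int sum = 0, i = 0;
        while (i < values.Length)
        {
            sum += values[i];
            i++;
        }
        return sum;
    }

    // Hand-optimized layout: guard once with `if`, then loop with
    // `do...while` so each iteration ends in a single backward branch.
    public static int SumIfDoWhile(int[] values)
    {
        int sum = 0, i = 0;
        if (i < values.Length)
        {
            do
            {
                sum += values[i];
                i++;
            } while (i < values.Length);
        }
        return sum;
    }
}
```

The guard condition is a copy of the loop condition, so the rewrite is mechanical; the payoff, if any, is in the generated IL, not in the C# semantics.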

With the traditional layout, every entry into the loop takes an unconditional jump over the body straight to the condition check, only to branch right back; the new layout removes that detour at the cost of one extra byte of code. Honestly, how much does that really change? Maybe it depends on the size of the code inside the block. Still, considering it will be applied throughout the BCL, it seems like a big deal.

It sounds like a case where an old C/C++ optimization, "do...while(true) beats while(true)", generally considered obsolete now that compilers optimize so well, turned out to still be effective in C#.

Even in old environments

As in 👇, if you tune thoroughly, you can pull performance on par with the latest .NET environments even out of old runtimes, so it's a technique worth keeping in mind. (See the List benchmark results.)

https://qiita.com/sator_imaging/items/c4ac0dab548a3e78878f#ベンチマーク

※ Note: I don't think it'll change anything unless you're working with a massive SDK lol

Conclusion

Single-file executable apps open up a lot of possibilities, but I feel like npx might be enough for now. After all, the .NET SDK isn't usually installed. Maybe in about two more versions.

That's all. Thank you for reading.
