Unexpected inconsistency in records

codeblog.jonskeet.uk

52 points by OptionOfT 3 days ago


andygocke - 2 hours ago

There are alternatives to this behavior, but unfortunately they all have other downsides. The biggest constraint was that the schedule didn't support a new version of the .NET IL format (and revving the IL format is an expensive change for compat purposes as well). There were two strong lowering contenders, each with its own problems.

The first is to use a `With` method and rely on "optional" parameters in some sense. When you write `with { x = 3 }` you're basically writing a `.With(x: 3)` call, and `With` presumably calls the constructor with the appropriate values. The problem here is that optional parameters are also kind of fake. The .NET IL format doesn't have a notion of optional parameters -- the C# compiler just fills in the parameters when lowering the call. So that means that adding a new field to a record would require adding a new parameter. But adding a new parameter means that you've broken binary backwards compatibility. One of the goals of records was to make these kinds of "simple" data updates possible, instead of the current situation with classes where they can be very challenging.
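The rejected lowering described above might have looked something like this. This is a sketch only, not how C# actually compiles `with`; the `With` method and its parameters are illustrative:

```csharp
// Sketch of the rejected "optional parameters" lowering (illustrative only).
public record Point(int X, int Y)
{
    // Hypothetical generated method: one optional parameter per field.
    public Point With(int? x = null, int? y = null)
        => new Point(x ?? X, y ?? Y);
}

var p = new Point(1, 2);
var q = p.With(x: 3);   // what `p with { X = 3 }` would have lowered to
// q is Point { X = 3, Y = 2 }
```

The binary-compat problem: the C# compiler substitutes default arguments at each call site, so adding a field means adding a parameter, and every already-compiled caller still targets the old signature.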

The second option is a `With` method for every field. A single `with { }` expression turns into a chain of N calls, `WithX(3).WithY(5)`, one for each field being set. The problem is that this produces a lot of dead intermediate copies that need to be unwound by the JIT. We didn't see that happening reliably, which was pretty concerning because it would also result in a lot of allocation garbage.
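A sketch of this second rejected lowering (again illustrative, not what the compiler actually emits):

```csharp
// Sketch of the per-field "WithX" lowering (illustrative only).
public record Point(int X, int Y)
{
    public Point WithX(int x) => new Point(x, Y);
    public Point WithY(int y) => new Point(X, y);
}

var p = new Point(1, 2);
// `p with { X = 3, Y = 5 }` would have become a chain of calls,
// allocating an intermediate Point that is immediately dead:
var q = p.WithX(3).WithY(5);
```

Each link in the chain allocates a new record; unless the JIT reliably proves the intermediates dead and elides them, an N-field update costs N allocations.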

So basically, this was a narrow decision that fit into the space we had. If I had the chance, I would completely rework dotnet/C# initialization for a reboot of the language.

One thing I proposed, but was not accepted, was to make records much simpler across the board. By forbidding a lot of the complex constructs, the footguns are also avoided. But that was seen as too limiting. Reading between the lines, I bet Jon wouldn't have liked this either, as some of the fancy things he's doing may not have been possible.

wpollock - 6 hours ago

As I read the post, I thought of relational data models. The behavior is expected. I believe the root issue is that your records should not have computed fields that depend on mutable fields. Change your record schemas to eliminate that and you should have no further problems using "with".

If changing the schema isn't reasonable, use a copy constructor instead.
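A minimal sketch of the pattern being warned against here, assuming it mirrors the one from the post (the `Rect` type is illustrative):

```csharp
// A derived value stored in backing storage at construction time.
public record Rect(int Width, int Height)
{
    public int Area { get; } = Width * Height; // computed once, then stored
}

var r = new Rect(2, 3);            // r.Area == 6
var wider = r with { Width = 10 }; // clone + assign Width; the constructor never reruns
// wider.Area is still 6, not 30: the stored derived value has gone stale
```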

wavemode - 6 hours ago

If Jon Skeet of all people is confused about something in C#, that probably means it's objectively confusing behavior.

OptionOfT - 6 hours ago

I think this stems from having properties, which are syntactic sugar for a backing field plus getter and setter functions.

This muddies the water between just setting a field vs executing a function that does work and then sets the field.

If I write a record with an explicit setFoo(foo: Foo) I wouldn't expect a clone & subsequent direct field assignment to execute the setFoo(foo: Foo) code.
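In C# terms (with illustrative names), the distinction is between members named in the `with` expression, whose init accessors do run, and everything else, which the clone copies field by field without going through any accessor:

```csharp
public record Person
{
    private readonly string _name = "";
    public string Name
    {
        get => _name;
        init => _name = value.Trim(); // a setter that does work, not just a field write
    }
}

var a = new Person { Name = "  Jon  " };   // init accessor runs: stores "Jon"
var b = a with { Name = "  Skeet  " };     // init accessor runs for Name here too
// Members *not* named in the `with` are copied raw by the clone,
// bypassing their accessors entirely.
```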

_old_dude_ - 7 hours ago

For the record (sorry), I believe C# uses the clone operation because records support inheritance.

For me, this is where the design flaw lies: trying to support both inheritance and immutability at the same time.

benmmurphy - 7 hours ago

Seems like enough of a gotcha that other people have stumbled upon the issue as well: https://blog.codingmilitia.com/2022/09/01/beware-of-records-...

harpiaharpyja - 2 hours ago

Ehh, I think the real footgun here is using a property with backing storage to store what is clearly a derived value. Using a computed property is what we really should be doing here, if we think our code should line up with our intentions.
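The fix being suggested, sketched against the same kind of record (the `Rect` type is illustrative):

```csharp
public record Rect(int Width, int Height)
{
    public int Area => Width * Height; // computed on every access, no backing field
}

var r = new Rect(2, 3);
var wider = r with { Width = 10 };
// wider.Area == 30: the derived value can never go stale
```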

I feel like what's happened here is that the author actually needed a system to cache their derived values, but didn't think to build that out explicitly.

louthy - 6 hours ago

I've hit this before and facepalmed when I realised what they had done. Records were supposed to work more like immutable product-types in F# and other functional languages, but with this approach it can't be seen as anything other than broken IMHO.

Sometimes the csharplang team make some utterly insane decisions. This is the kind of thing that would happen back-in-the-day™ in the name of performance, but in the modern world just adds to the endless list of quirks (like uninitialised structs). I suspect there are elements of the design team that are still stuck in C#1.0/Java mode and so holes like this don't even seem that bad in their minds. But it literally leads to inconsistent data structures which is where bugs live (and potential security issues in the worst cases).

uticus - 7 hours ago

It's been a while since I followed Jon Skeet, but his books on Manning were always worthwhile. Plus the Jon Skeet facts [0] are fun.

[0] https://meta.stackexchange.com/questions/9134/jon-skeet-fact...

high_na_euv - 7 hours ago

To be fair, I don't think this behaviour is unreasonable.

apwell23 - 7 hours ago

I wonder how much of Coding Agents/AIs are now just Jon Skeet.

croemer - 7 hours ago

The "– Jon Skeet's coding blog" in the title is not necessary as the URL shows in parentheses after the title. Adding C# however might be helpful.