Add support for generic types for params parameter. #5066
base: main
Conversation
@stevenaw I think I covered most cases. To be decided if we want to allow all C# built-in conversions or only the ones supported by NUnit's ParamAttributeTypeConversions. If we want to change it, then some NUnit tests in Range need changing, as the C# built-in rules allow more conversions. Alternatively we can call a separate method to convert params members.
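As a hedged illustration of where the two rule sets differ (the test names are made up; the byte case is one that NUnit's attribute conversions have historically accepted even though it is not a C# implicit conversion):

```csharp
using NUnit.Framework;

public class ConversionRuleExamples
{
    // int -> long is a C# built-in implicit (widening) conversion.
    [TestCase(42)]
    public void TakesLong(long x) => Assert.That(x, Is.EqualTo(42L));

    // int -> byte is not a C# implicit conversion (it narrows), but
    // NUnit's ParamAttributeTypeConversions has historically allowed it
    // when the value fits into the target type.
    [TestCase(42)]
    public void TakesByte(byte b) => Assert.That(b, Is.EqualTo((byte)42));
}
```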
Thanks @manfred-brands! I appreciate all the edge cases and coverage you've accounted for here. I admit, it was unintentional of me to add the special conversions for TestCaseSourceAttribute as part of my other PR.

My understanding is that those conversions were added historically as a convenience feature for TestCaseAttribute, because only a limited set of types can be specified in the attribute constructor at compile time. For example, you can specify a double but not a decimal, so NUnit internally supported converting between them. TestCaseSource, on the other hand, runs at runtime, where users have much more control over the types they pass into the test method. When faced with feature requests in the past to support implicit conversions for TestCaseSource (or even other scenarios in TestCase), we've tended to recommend that users simply pass the types they want to receive (e.g. #3385). I see my PR has muddied those waters now. Had I caught that, I certainly would've suggested a discussion before making such a change, but perhaps we could have the discussion now as a team. Personally, I'd prefer to keep TestCaseSource simple and undo the parameter conversion behaviour I'd inadvertently added to it. @nunit/framework-team what are your thoughts?

There is also another historical piece to this, to my understanding. Support for the framework to emulate the C# implicit conversion spec by performing a "preconversion" of the arguments was added in order to support certain runtimes which didn't do it themselves. I understand .NET Framework CE was one such example. I don't believe that is something that still needs to be supported, and it's likely that at least some of that conversion code could be removed. I had tried to start a discussion about it in #3761
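To make the decimal example concrete, a hedged sketch (method and source names are illustrative):

```csharp
using System.Collections.Generic;
using NUnit.Framework;

public class DecimalExamples
{
    // decimal is not a valid attribute argument type in C#, so the value
    // is written as a double and NUnit converts it for the parameter:
    [TestCase(1.5)]
    public void ViaAttribute(decimal d) => Assert.That(d, Is.EqualTo(1.5m));

    // A TestCaseSource runs at runtime and can yield a real decimal
    // directly, so no conversion by NUnit is needed:
    private static IEnumerable<decimal> Decimals()
    {
        yield return 1.5m;
    }

    [TestCaseSource(nameof(Decimals))]
    public void ViaSource(decimal d) => Assert.That(d, Is.EqualTo(1.5m));
}
```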
Thanks @stevenaw, I didn't know we have so many places doing conversions (#3219). Regarding the removal, I don't think we should. Even though there is no type safety in

From Built-in numeric conversions:
That's a fair example. I had been thinking differently, more along the lines of "if a user wants an array of decimals then they should pass an array of decimals" so that it just works. If the data passed to the test isn't implicitly compatible, then it might be a mistake by the user that they should be alerted about. If there are truly ambiguous cases for a generic params array, then there is the

In other words, I agree that adding these conversions here would improve consistency between
@stevenaw Can this PR be approved?
@OsirisTerje I had a few concerns about the argument conversion aspect. Historically we have avoided that in TestCaseSource and instead encouraged users to simply pass their intended types rather than NUnit taking over and converting anything. Parameter conversion was accidentally introduced in my previous PR #5057. My preference would actually be that we remove that parameter conversion from TestCaseSource instead of building upon it. I think @manfred-brands and I initially have differing opinions about it, and I'd welcome more thoughts on the long-term direction we want to take the attribute as a team.
At a glance, yes. If single is not implicitly convertible to int, then that example working would rely on NUnit doing the conversion internally. I'd hoped a quick discussion might be possible here to get clarity for this PR, but perhaps a dedicated discussion is better for visibility. I could spin one up in a few hours.

One argument for adding the conversions that I would also offer here is cross-attribute consistency.
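For reference, a quick sketch of why that reliance exists; the conversion is not implicit in C# itself:

```csharp
float f = 1f;
// int i = f;    // CS0266: cannot implicitly convert type 'float' to 'int'
int j = (int)f;  // an explicit cast (or an internal NUnit conversion) is required
```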
@OsirisTerje Just coming back to this with a quick question in lieu of starting a larger discussion. Despite my preference not to have added the conversions within my own PR, I've also felt that I would be alright keeping the conversions in if we felt as a team that we wanted to go in that direction. It sounds like you may like the idea of adding the conversions based on your changes for #5079. Would you prefer to keep it in?
Sorry for the delays here @manfred-brands
@manfred-brands I see now that the parameter discussion has been open for about two weeks. It doesn't seem like any objections have been raised, which I am inclined to take as agreement about the approach. It might be a day or two until I can review, but I'll take a close look at this when I'm able. Thanks again for your patience on this one.
```csharp
}
else
{
    // Should we replace with empty array or an array with one null object or leave as is?
```
question: Is this to account for a case such as:

```csharp
[TestCase(1, 2, null)]
[TestCase("a", "b", null)]
public static void ReferenceParams<T>(T a, T b, params T[] rest)
{
    if (rest is null) {
        Console.WriteLine("null");
    } else {
        Console.WriteLine(rest.Length);
    }
}
```

If so, then I think rest should be a null array, instead of an empty array or an array with a null inside, in order to mimic what the runtime seems to do in that case. Both of the below would output "null" instead of 0 or 1 if called like a regular C# method:

```csharp
ReferenceParams(1, 2, null);
ReferenceParams("a", "b", null);
```

If this is a change in behaviour from my first PR then thank you for catching it :)
Your PR, and versions before it including 3.14, do wrap null into [null].
My remark was to question this behaviour.
If you change it to have only a params parameter, you get different behaviour in NUnit:

```csharp
[TestCase("a", "b")]
[TestCase()]
[TestCase([null])]
[TestCase(null)]
public void ReferenceParams(params string?[]? rest)
{
    if (rest is null)
    {
        Console.WriteLine("null");
    }
    else
    {
        Console.WriteLine(rest?.Length);
    }
}
```

This results in the output:

```
2
0
1
1
```

The last two are considered the same, and in Test Explorer it only shows 3 tests (but it runs 4).
If I execute this in normal C#, I get:

```
2
0
1
null
```
I agree with you that we should emulate the same behaviour.
Changing that breaks the HandlesParamsArrayWithExplicitNullArgument test you added, as well as the GenericParams test I added, where I also questioned the behaviour.
@OsirisTerje This is a breaking change from 3.14 and 4.4.0, but one that wasn't previously tested.
The pre-existing behaviour is weird:
```csharp
[TestCase(null)]
public static void MethodWithParamsArrayNullParameter(params int[]? a)
{
}
```

This actually passes an array with 1 element, being the integer 0.
The cause (accidental?) is the check that tests whether the passed argument has the correct type, in case the user passes an array.
As null doesn't have a type, this check fails and the code creates a new array with 1 element.
This might have been an unexpected side-effect of the original code.
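A minimal sketch of how that fallback produces { 0 }, assuming the wrap-in-array branch quoted further down in this review (the variable names here are illustrative):

```csharp
object?[] arguments = { null };                     // from [TestCase(null)]
// null has no runtime type, so the "is this already an int[]?" check fails
// and the single argument gets wrapped into a fresh params array:
Array array = Array.CreateInstance(typeof(int), 1); // new int[1], i.e. { 0 }
array.SetValue(arguments[0], 0);                    // SetValue(null) keeps default(int) == 0
arguments[0] = array;                               // the test method receives new[] { 0 }
```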
I have updated the code to match normal C# conventions.
Yes, I do. I'm thinking that it should follow the C# implicit conversions. As we're untyped, we (imho) should provide that conversion ourselves. Otherwise I feel it is a bit of a "surprise" for the users. E.g. all of these are valid in C#:

```csharp
int x = 42;
float y = 42;
float z = x;
double w = z;
```

and the total table should be like this:
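That table, per the Microsoft documentation on built-in implicit numeric conversions, is:

| From | To |
| --- | --- |
| sbyte | short, int, long, float, double, decimal, nint |
| byte | short, ushort, int, uint, long, ulong, float, double, decimal, nint, nuint |
| short | int, long, float, double, decimal, nint |
| ushort | int, uint, long, ulong, float, double, decimal, nint, nuint |
| int | long, float, double, decimal, nint |
| uint | long, ulong, float, double, decimal, nuint |
| long | float, double, decimal |
| ulong | float, double, decimal |
| char | ushort, int, uint, long, ulong, float, double, decimal, nint, nuint |
| float | double |
| nint | long, float, double, decimal |
| nuint | ulong, float, double, decimal |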
Or expressed as @manfred-brands does at #5066 (comment), ref MS links at the bottom there. The best would of course be that we were typed too, so that we could take advantage of the implicit conversions.
@OsirisTerje For example, something like this will work in NUnit to allow a mechanism of defining a date at compile time in a TestCase:

```csharp
[TestCase("2025-10-10")]
public static void TestDate(DateTime d)
{
}
```

However, the following code would not compile, which is the root of my initial hesitation to propagate certain parameter conversion behaviour to TestCaseSource:

```csharp
DateTime x = "2025-10-10";
// or
TestDate("2025-10-10");
```

Looking closer at the code sample in #3125, I agree with your initial statement that that should be allowed, as passing
@stevenaw Hmm... I noted when doing the opposite (double into int) I got a NUnit1001 warning from the analyzer. I was not even aware that one could convert from string to DateTime, and frankly, that conversion is not global; it is only InvariantCulture, a kind of US Light. It is kind of cool, although I am very surprised to see that functionality, and it happens here:

Looking at the other code above, it seems I was a bit too fast here. I "believed" I saw the intrinsic conversions, but when looking further I think I understand your concern. These are ALSO converting from higher to lower (e.g. int to byte), meaning information is lost. But a test will still convert from e.g. float -> double, so why do we then do the intrinsic conversions?

And with that, I am pretty much lost here... That said: if I get what this issue is trying to do, it is to introduce type safety in parameters. Then we should not need any extra conversions for values that can be intrinsically converted, right?
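As a hedged illustration of that narrowing case (NUnit1001 is the analyzer rule requiring TestCase arguments to match the parameter types):

```csharp
using NUnit.Framework;

public class NarrowingExample
{
    // double -> int is a narrowing conversion, so the NUnit analyzer
    // raises NUnit1001 for this attribute/parameter combination:
    [TestCase(1.5)]
    public void TakesInt(int x)
    {
    }
}
```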
So, when looking at @manfred-brands' code, I really like the split of

(Not quite fully typesafe though, and still needing intrinsic conversions; are we really sure we need that?)

But I see the code with the hardcoded InvariantCulture is still there. I assume that can be handled in a separate PR later. No one outside the US can be using that anyway with their default culture. The other NUnitConversions, I assume that makes it easier to write
Agreed with @OsirisTerje, I like how we've separated that conversion split in this PR.
stevenaw left a comment
Thanks @manfred-brands
It's late here by the time I've been able to start this, but I've tried to leave a few comments and questions to at least get the ball rolling. I understand the broader "conversions" discussion may still be ongoing with @OsirisTerje
```csharp
/// Set of implicit conversion for built-in numeric types.
/// https://learn.microsoft.com/en-us/dotnet/csharp/language-reference/builtin-types/numeric-conversions#implicit-numeric-conversions
/// </summary>
private static readonly Dictionary<Type, HashSet<Type>> BuiltInNumericalConversions = new()
```
We have a similar mapping here:

```csharp
private static readonly Dictionary<Type, List<Type>> ConvertibleValueTypes = new()
```

Although that one is private, and yours seems to have been updated to reflect newer types like nuint, etc.

I don't know the "best" place for this to live long term, but I suspect we'll find that out as we work through #3219. For now, how would you feel about updating that mapping in Reflect.cs with the new types you've noticed and calling Reflect.CanImplicitlyConvertTo() until we can sort out the consolidation?
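As an aside, a minimal sketch of what such a dictionary-backed check could look like, assuming the Dictionary<Type, HashSet<Type>> shape quoted above (illustrative only, not the actual Reflect.CanImplicitlyConvertTo implementation):

```csharp
private static bool CanImplicitlyConvertTo(Type from, Type to) =>
    from == to ||
    (BuiltInNumericalConversions.TryGetValue(from, out HashSet<Type>? targets) &&
     targets.Contains(to));
```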
```csharp
bool convert = HasNUnitConversion(valueType, underlyingTargetType);

//if (convert is false)
//    convert = HasImplicitConversion(valueType, underlyingTargetType);
```
I'm not sure, was this accidentally left commented out?
```csharp
array.SetValue(Arguments[argsProvided - 1], 0);
Arguments[argsProvided - 1] = array;
Type lastArgumentType = lastArgument.GetType();
if (lastArgumentType.IsArray)
```
good catch
```csharp
else
{
    // Try to infer the type from the provided arguments
    elementType = DetermineBestElementType(Arguments, argsNeeded - 1);
```
We have a TypeHelper.TryGetBestCommonType() method already. Would that work here?
Thanks for pointing that out, I should have looked for it instead of implementing a new one.
But that method has yet another list of conversions.
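For reference, a hedged sketch of how a pairwise helper could be folded over the trailing params arguments; the TryGetBestCommonType signature shown here is an assumption, not necessarily the actual TypeHelper API:

```csharp
// Fold a pairwise "best common type" over the params slice of the arguments.
Type? elementType = Arguments[argsNeeded - 1]?.GetType();
for (int i = argsNeeded; i < Arguments.Length; i++)
{
    if (!TypeHelper.TryGetBestCommonType(elementType, Arguments[i]?.GetType(), out elementType))
    {
        elementType = typeof(object); // no common type found; fall back to object
        break;
    }
}
```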
```csharp
value is not null &&
!elementType.IsAssignableFrom(value.GetType()))
{
    value = ParamAttributeTypeConversions.Convert(value, elementType);
```
question: I can see you'd added a lot of type calculations before the code gets to this point. Do we still need to convert here or will the array type already be the maximally compatible type such that we wouldn't need to convert the individual arguments anymore?
I'm unsure if there's an edge case I'm still missing... maybe implicit conversions?
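One hedged illustration of such an edge case: C#'s implicit int -> decimal conversion is a user-defined op_Implicit on System.Decimal, which Array.SetValue does not apply at runtime, so a boxed int cannot be stored directly into a decimal[] even when the inferred element type is the "maximally compatible" one:

```csharp
object[] args = { 1, 2m };                     // a boxed int and a boxed decimal
var array = new decimal[2];
// array.SetValue(args[0], 0);                 // throws InvalidCastException at runtime
array.SetValue(Convert.ToDecimal(args[0]), 0); // per-element conversion is still needed
array.SetValue(args[1], 1);                    // a boxed decimal stores fine
```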
c7dd4a9 to 5bfa226
Fixes #5061