Is this handling of references with multiple encodings correct?
Submitted by Kristof Szabados
Link to original bug (#563159)
Description
Assuming these types:
type enumerated E1 { val1, val2, val3 }
with {
  encode "TEXT";
}

type record R1 {
  E1 f1
}
with {
  encode "XML";
  encode "abc";
  encode (f1) "XML";
  variant (f1) "XML"."untagged";
  encode (f1) "RAW";
  variant (f1) "RAW"."16 bit";
}
In the following testcase:
testcase tc_ttcn_encvalue() runs on CT {
  var template R1 x1 := { f1 := val1 };
  var octetstring exp := char2oct("abc");
  var octetstring enc := bit2oct(encvalue(x1, "", "abc"));
  if (exp != enc) {
    setverdict(fail, "Expected: ", exp, ", got: ", enc);
  }
  exp := char2oct("<f1>val1</f1>\n\n");
  enc := bit2oct(encvalue(x1.f1, "", "XML"));
  if (exp != enc) {
    setverdict(fail, "Expected: ", exp, ", got: ", enc);
  }
  var U2 x2 := { alt1 := 10 };
  exp := char2oct("10");
  enc := bit2oct(encvalue(x2, "dummy")); // encvalue knows which encoding to use, since U2 only has JSON encoding
  if (exp != enc) {
    setverdict(fail, "Expected: ", exp, ", got: ", enc);
  }
  setverdict(pass);
}
The designer reports "XML" as an unsupported coding. Is this the correct behaviour? The field reference x1.f1 might carry such an encoding through the encode (f1) "XML" attribute on R1, but the type of x1.f1, i.e. E1, really does not have it.
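If the designer's current behaviour (resolving the coding from the referenced type rather than from the field's attributes) is intentional, one way the testcase could be made to compile is to declare the extra encoding on E1 itself, so that the type of x1.f1 also legally carries "XML". A hedged sketch, reusing only the attribute syntax already shown above; whether this is the intended resolution rule is exactly what this report asks:

type enumerated E1 { val1, val2, val3 }
with {
  encode "TEXT";
  encode "XML"; // added: E1 itself now lists XML, so encvalue(x1.f1, "", "XML") refers to an encoding of the type, not only of the field
}

This is a workaround, not an answer: if field-level encode (f1) attributes are meant to be visible through the reference x1.f1, the original example should be accepted without this change.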
Version: 6.6.1