Fix reflection of structs containing 16-bit types. Closes #3605

* It is impossible to emit a true 16-bit type on fxc; we round the minXX types
  up internally to a 32-bit type, since that's how they are defined to appear
  in external resources like cbuffers and SRVs/UAVs (a rough sketch of this
  rounding follows the list below).
* The new 16-bit type enums that are shared between the fxc/dxc structs are
  never actually emitted by fxc for RDEF types.
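To make the first point concrete, here is a minimal C++ sketch of that
rounding. The enum subset and the PromoteMin16Type helper are hypothetical
illustrations of the idea, not code from this commit:

// Hypothetical sketch: fxc's minXX precision hints are laid out as full 32-bit
// values in external resources (cbuffers, SRVs/UAVs), so reflection rounds
// them up to the corresponding 32-bit type.
enum class SVT
{
  Float, Int, UInt,                   // 32-bit types
  Min16Float, Min16Int, Min16UInt,    // minimum-precision hints
};

inline SVT PromoteMin16Type(SVT t)
{
  switch(t)
  {
    case SVT::Min16Float: return SVT::Float;
    case SVT::Min16Int: return SVT::Int;
    case SVT::Min16UInt: return SVT::UInt;
    default: return t;
  }
}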
Author: baldurk
Date: 2025-05-05 14:49:24 +01:00
Parent: 62be3340a5
Commit: 4dbcfca343
@@ -61,12 +61,12 @@ static ShaderConstantType MakeShaderConstantType(bool cbufferPacking, DXBC::CBuf
      type.varClass == DXBC::CLASS_SCALAR)
     ret.flags |= ShaderVariableFlags::RowMajorMatrix;
-  uint32_t baseElemSize = (ret.baseType == VarType::Double) ? 8 : 4;
+  uint32_t baseElemSize = VarTypeByteSize(ret.baseType);
   // in D3D matrices in cbuffers always take up a float4 per row/column. Structured buffers in
   // SRVs/UAVs are tightly packed
   if(cbufferPacking)
-    ret.matrixByteStride = uint8_t(baseElemSize * 4);
+    ret.matrixByteStride = AlignUp16(uint8_t(baseElemSize * 4));
   else
     ret.matrixByteStride = uint8_t(baseElemSize * (ret.RowMajor() ? ret.columns : ret.rows));
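To see the effect of the AlignUp16 change, here is a small self-contained
sketch of the two packing paths for a 16-bit matrix. The AlignUp16 and
VarTypeByteSizeHalf stand-ins below are assumptions written for illustration,
not RenderDoc's real helpers:

#include <cstdint>
#include <cstdio>

// Assumed stand-ins for the helpers used in the diff, for illustration only.
static uint8_t AlignUp16(uint8_t x)
{
  return uint8_t((x + 15) & ~15);
}
static uint32_t VarTypeByteSizeHalf()
{
  return 2;    // e.g. a true 16-bit half emitted by dxc
}

int main()
{
  // a column-major half2x2 matrix
  const uint32_t baseElemSize = VarTypeByteSizeHalf();
  const uint32_t rows = 2;

  // cbuffer packing: each row/column occupies a full float4 register, so the
  // stride is padded to 16 bytes even though four halves are only 8 bytes.
  const uint8_t cbufferStride = AlignUp16(uint8_t(baseElemSize * 4));

  // structured buffer packing: tightly packed, two halves = 4 bytes.
  const uint8_t structuredStride = uint8_t(baseElemSize * rows);

  printf("cbuffer stride: %u, structured stride: %u\n", unsigned(cbufferStride),
         unsigned(structuredStride));
  return 0;
}

Without the AlignUp16 call, the cbuffer path would compute a stride of 8 for
the half case, underestimating the float4-per-row layout that D3D uses for
matrices in cbuffers.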