This is part of the work specified by GitHub issue #586, adding the
ability to save out the overlay in the TextureViewer. If no overlay is
enabled then there is no option to save it. Currently there is no
option to remap the overlay to a grayscale or absolute value range
before saving; this can be a future task.
NOTE: the overlay texture resource that's saved out is not the blended
texture that the user sees in the TextureViewer, it is just the
overlay itself. The ability to save out the blended texture would be a
future task.
* We search first in folders specified by the user (they can browse to
  the Android SDK and Java JDK).
* If the tools we want aren't found there, we look relative to the UI,
  as we now distribute the required tools with Windows builds.
* If we still don't find them, we prefer to look in PATH, since the
  user has 'opted in' to any tools found there. If the tool isn't in
  PATH either, then we look relative to known environment variables.
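The search order above can be sketched in plain Python. The function
name, parameters, and environment variable fallback here are
hypothetical illustrations, not the actual implementation:

```python
import os
import shutil

def locate_tool(tool, user_dirs, ui_dir, env_vars):
    """Hypothetical sketch of the lookup order described above."""
    # 1. Folders the user browsed to (e.g. Android SDK / Java JDK).
    for d in user_dirs:
        candidate = os.path.join(d, tool)
        if os.path.isfile(candidate):
            return candidate

    # 2. Relative to the UI, since required tools now ship with
    #    Windows builds.
    candidate = os.path.join(ui_dir, tool)
    if os.path.isfile(candidate):
        return candidate

    # 3. PATH - the user has 'opted in' to anything found there.
    found = shutil.which(tool)
    if found:
        return found

    # 4. Finally, fall back to known environment variables
    #    (e.g. a hypothetical ANDROID_HOME-style variable).
    for var in env_vars:
        base = os.environ.get(var)
        if base:
            candidate = os.path.join(base, tool)
            if os.path.isfile(candidate):
                return candidate

    return None
```

Each step returns as soon as a match is found, so an explicit user
choice always wins over anything picked up implicitly from PATH or the
environment.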
* This will enable the last few Python list emulation functions, like
  index (which needs operator== to find objects) and sort (which
  obviously needs operator< to sort).
* This is to support Python bindings - the PySide implementation of
  QVector, QString, etc. is not available to SWIG, so SWIG treats these
  all as opaque types.
* Rather than trying to set up bindings that work for rdcarray and
QList/QVector, or implementing separate bindings, we instead just say
that the public interface must use the rdc types. In most cases they
seamlessly convert to/from Qt types anyway.
* In a couple of places we use an array of pairs instead of a map. In
future we probably want an rdcdict or rdcmap with proper dict bindings
in python.
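The operator requirement mentioned above maps directly onto Python's
data model: list.index() needs equality and list.sort() needs
ordering. A minimal sketch with a toy value type (the Vec2 class here
is a hypothetical stand-in for an exposed C++ struct, not a real
binding):

```python
class Vec2:
    """Toy value type standing in for a C++ struct exposed to Python.

    Without __eq__ (operator==), list.index() cannot locate an equal
    object; without __lt__ (operator<), list.sort() cannot order the
    elements."""

    def __init__(self, x, y):
        self.x, self.y = x, y

    def __eq__(self, other):
        return (self.x, self.y) == (other.x, other.y)

    def __lt__(self, other):
        return (self.x, self.y) < (other.x, other.y)


vals = [Vec2(3, 1), Vec2(1, 2), Vec2(2, 0)]
idx = vals.index(Vec2(1, 2))  # relies on __eq__ / operator==
vals.sort()                   # relies on __lt__ / operator<
```

Removing either method makes the corresponding list operation raise
(index would never match, and sort would raise TypeError), which is
exactly why the C++ side needs both operators for full list emulation.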
* Previously we'd cache a copy of each command buffer at load time, and
submit it any time we're not partially re-recording. This has a couple
of drawbacks though:
  - Technically we do some things that invalidate those command buffers,
    like updating descriptor sets (with initial state application), so
    for 100% correctness we'd need to re-record.
  - It also means that any edits we apply, like modified shaders, don't
    properly apply to the whole frame; they only apply to whichever
    command buffer is currently being partially recorded.
* We refactor out the 're-record all commands' behaviour previously
reserved just for applying GPU counters, and use that for re-recording
any command buffers that are wholly or partially submitted. Note that
it's still true that only one primary and one secondary at most are
actually *partially* re-recorded. The others are re-recorded in their
entirety.
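The resulting behaviour can be summarised as a small decision sketch.
The function and names below are a hypothetical illustration of the
policy, not the actual replay code:

```python
def plan_rerecord(cmd_buffers, partial_primary, partial_secondary):
    """Every submitted command buffer is re-recorded, but at most one
    primary and one secondary are *partially* re-recorded; all the
    others are re-recorded in their entirety."""
    plan = {}
    for cb in cmd_buffers:
        if cb in (partial_primary, partial_secondary):
            plan[cb] = "partial"
        else:
            plan[cb] = "full"
    return plan
```

Because everything is re-recorded each replay, edits such as modified
shaders now take effect across the whole frame rather than only in the
partially recorded command buffer.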
* We almost always mean DXGI_FORMAT_R10G10B10A2_UNORM instead; the
  XR bias format is rarely used (so it's unlikely to be the right
  interpretation) and might not work at all.
* Technically the resolve doesn't allow format conversion at all, so
  instead of resolving directly to the swapchain image we need an
  intermediate image of the same format as we use internally for the
  MSAA target (RGBA8_SRGB) to resolve to, then blit from that to the
  actual backbuffer.
* Historically, resources created in the middle of a frame capture
  were replayed with their creation/destruction each time the frame
  was replayed. Likewise resources destroyed before the frame (but
  kept alive for a dependency) were also released on replay.
* This was faithful but unnecessary. Now we just create all resources
needed anywhere in the frame up front, and release them only on
shutdown.