Add GltfExtensionHandler interface for draco #22907
jiangheng90 wants to merge 14 commits into bevyengine:main
Conversation
ChristopherBiscardi
left a comment
Is this draco extension available to read somewhere? It would be nice to understand if the api usage here, especially passing the entire gltf json blob, is required by the DracoExtension or if it is just convenient.
Doing this as an extension makes sense. The additional hook to process primitives makes sense. Bevy doesn't currently support asset streaming, so processing the data in the extension like this into a fully formed Mesh makes sense.
Support for wasm would be nice since compression is good for network traffic, but seems to require upstream changes. This is another place where it would be nice to be able to evaluate the DracoExtension implementation and see what it is doing to inform the hook APIs. We do use IoTaskPool in processing the textures, for example, so evolving this to support async is a possibility.
The Document argument is a bit awkward, but is the only owned struct the gltf crate offers. The buffer_data handling here also seems a bit strange. Due to the index usage and the out_data: Option<Vec<Vec<u8>>> type signature. Making users construct a new Document with the Primitive that then lines up its buffer index with the position in the new buffers vec is awkward, but is something that we can refactor and isn't reflective of this PR working.
Overall, I think this handling is about as good as it can be in the current context.
There is really only one question I want to have an answer to for an approval on this:
- What validation specifically needed to be disabled for this to work?
I would also like to see the DracoExtension, and a working example if possible. It doesn't need to be included in this PR, a link to a git repo or similar would be good. This is more for future work on the hook APIs, such as potentially providing async versions.
The CI failure is just an import ordering/`cargo fmt` issue.
Co-authored-by: Chris Biscardi <chris@christopherbiscardi.com>
@ChristopherBiscardi I have made a plugin and open-sourced it. In the example, I have set up two models to test. bevy_gltf_draco is the Bevy-side implementation; it contains the primitive document build and the buffer insertion. The draco decode implementation is in this repo: it contains a native implementation using cxxbridge, and the wasm implementation is made in JavaScript using the draco wasm distribution. Here is the decoded model; animation works fine.
ChristopherBiscardi
left a comment
Thank you!
I've created #23026 to track the async'ing of the hooks.
Can you explain more why this needs to be a &Vec? In https://github.com/jiangheng90-opensource/bevy_gltf_draco/blob/main/src/khr_draco_mesh_compression.rs you could just ask for a &[Vec<u8>] instead (I know this means changing the API which is unfortunate, but I don't think we should degrade our API for this).
```rust
    reason = "on_gltf_primitive requires &Vec<Vec<u8>> for external API compatibility"
)]
buffer_data: &Vec<Vec<u8>>,
out_doc: &mut Option<gltf::Document>,
```
Should we just return these instead? Maybe it should also return Option<(gltf::Document, Vec<Vec<u8>>)>?
returning these would require passing them through to all the other extensions as owned data too, afaict. What do we do if we get back 3 document/vec pairs from 3 different extensions?
Ohhhh I understand now. Gross. I guess this is the best we can do...
```rust
    settings: &'b GltfLoaderSettings,
) -> Result<Gltf, GltfError> {
    let gltf = gltf::Gltf::from_slice(bytes)?;
    let gltf = if settings.validate {
```
I don't really understand what this validation is doing, and why turning this off would be ok. Could you explain?
In this case, validation checks whether an extension is registered in the required-extensions array in the glTF data and whether the gltf crate specifically supports that extension. The gltf crate does not currently support user-side extensions, so validation has to be disabled if an extension is required for rendering, which draco-compressed meshes would be.
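A minimal self-contained sketch of the settings-gated check described above (all types here are simplified placeholders, not Bevy's or the gltf crate's real ones; in the real loader the required-extensions check lives inside the gltf crate's parse):

```rust
// Stand-in sketch of the `settings.validate` gate discussed above.
// Placeholder types; the real loader parses with the `gltf` crate and
// skips its required-extensions validation when `validate` is false.
#[derive(Default)]
struct GltfLoaderSettings {
    validate: bool,
}

fn check_extensions(
    required_extensions: &[&str],
    supported: &[&str],
    settings: &GltfLoaderSettings,
) -> Result<(), String> {
    if settings.validate {
        // Mirrors the crate-side check: every required extension must be
        // one the parser itself knows about.
        for ext in required_extensions {
            if !supported.contains(ext) {
                return Err(format!("unsupported required extension: {ext}"));
            }
        }
    }
    // With validation off, user-side handlers (e.g. a draco decoder) are
    // trusted to deal with the extension instead.
    Ok(())
}

fn main() {
    let strict = GltfLoaderSettings { validate: true };
    let lax = GltfLoaderSettings { validate: false };
    // KHR_draco_mesh_compression is not in the crate's supported list,
    // so strict loading fails while lax loading succeeds.
    assert!(check_extensions(&["KHR_draco_mesh_compression"], &["KHR_materials_unlit"], &strict).is_err());
    assert!(check_extensions(&["KHR_draco_mesh_compression"], &["KHR_materials_unlit"], &lax).is_ok());
    println!("ok");
}
```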
```rust
    &gltf,
    &primitive,
    &buffer_data,
    &mut out_doc,
```
The way this is written right now, extensions can "overwrite" changes unknowingly (since later extensions don't get the "updated" doc/data).
Should we be updating the primitive we pass in to the next extension?
Maybe, but there are no other users here, and I can't think of a use case that requires draco decompression and then also more primitive processing that would result in more out_doc/out_data output. So I'd like to see a use case to design against first. There's also real data attached to the original primitive that we could lose (extras/extensions) if not handled properly.
also worth noting that future extensions called after this one do get access to the out_data/out_doc plus the original primitive currently, so they probably should be checking the output if they're going to write into it and warning anyway (instead of blindly overwriting the data).
Yup, I missed that, my bad! Following extensions can at least branch on if the previous data was set.
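The "check before overwriting" behavior suggested above could look like this sketch (placeholder types and a hypothetical helper name; this is not code from the PR):

```rust
// Hypothetical helper for a later extension: warn instead of silently
// clobbering output an earlier extension already produced.
fn set_out_data(out_data: &mut Option<Vec<Vec<u8>>>, produced: Vec<Vec<u8>>) {
    if out_data.is_some() {
        eprintln!("warning: overwriting out_data set by an earlier extension");
    }
    *out_data = Some(produced);
}

fn main() {
    let mut out_data = None;
    // First extension writes its decoded buffers.
    set_out_data(&mut out_data, vec![vec![1, 2, 3]]);
    // A second extension writing again takes the warning path.
    set_out_data(&mut out_data, vec![vec![4, 5, 6]]);
    assert_eq!(out_data, Some(vec![vec![4, 5, 6]]));
    println!("ok");
}
```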
```rust
        &mut out_data,
    );
}
let primitive = if let Some(doc) = &out_doc
```
This is surprising - A) we only ever access the first mesh+primitive here, so we are assuming that the extension always produces only one primitive, B) If one of these is missing, we silently do nothing.
We need to document that if you return Some for the doc, the document must contain one mesh and one primitive. Then here we should check that and return an error if that's not true. Let's avoid sneaky footguns!
This is surprising, and I intend to follow this PR up with additional documentation. Basically, this PR would have to rewrite the internal processing to avoid this, since the rest of the processing internally relies on the gltf types existing, and a Document is the only owned gltf type we can return here (everything else deals in lifetimes). This is also "one primitive in, one mesh out", since a Bevy mesh and a glTF primitive are the same thing.
I'm fine deferring the doc changes, but I'd like the check that there's only one mesh and one primitive here.
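The requested check could be sketched like this, with placeholder structs standing in for the gltf crate's Document/Mesh/Primitive types (the real ones are accessor-based, not plain fields):

```rust
// Hypothetical stand-ins for gltf::Document / gltf::Mesh / gltf::Primitive.
struct Primitive;
struct Mesh { primitives: Vec<Primitive> }
struct Document { meshes: Vec<Mesh> }

// Error unless the extension-produced document contains exactly one mesh
// with exactly one primitive, instead of silently taking `.next()`.
fn single_primitive(doc: &Document) -> Result<&Primitive, String> {
    let [mesh] = doc.meshes.as_slice() else {
        return Err(format!("extension output must contain exactly 1 mesh, found {}", doc.meshes.len()));
    };
    let [primitive] = mesh.primitives.as_slice() else {
        return Err(format!("extension output mesh must contain exactly 1 primitive, found {}", mesh.primitives.len()));
    };
    Ok(primitive)
}

fn main() {
    let good = Document { meshes: vec![Mesh { primitives: vec![Primitive] }] };
    let empty = Document { meshes: vec![] };
    assert!(single_primitive(&good).is_ok());
    assert!(single_primitive(&empty).is_err());
    println!("ok");
}
```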
```rust
) {
}
```

```rust
/// Called when an individual glTF primitive is processed
```
This needs a much better doc comment. What is buffer_data? What is out_data and out_doc and what does mutating them do?
@andriyDev I have changed the buffer_data type to &[Vec<u8>], synced the update to my implementation in bevy_gltf_draco, and added documentation for buffer_data, out_data, and out_doc.
I was wondering whether a struct wrapper would be proper, in case other extensions add params:

```rust
/// Input data for glTF primitive processing.
/// Contains read-only references to the glTF document and buffer data.
pub struct GltfPrimitiveInput<'a> {
    /// The glTF document being processed.
    pub document: &'a gltf::Gltf,
    /// The raw buffer data from the glTF file, where each `Vec<u8>` represents a buffer
    /// containing geometry data such as vertex attributes and indices.
    pub buffers: &'a [Vec<u8>],
}

/// Output data for glTF primitive processing.
/// Allows extensions to provide modified versions of the document or buffers.
#[derive(Default)]
pub struct GltfPrimitiveOutput {
    /// Optional output for a modified glTF document. If set, the loader will use this
    /// modified document for subsequent primitive processing.
    pub document: Option<gltf::Document>,
    /// Optional output for modified buffer data. If set, the loader will use this
    /// modified buffer data instead of the original input buffers.
    pub buffers: Option<Vec<Vec<u8>>>,
}
```

The hook would look like:

```rust
/// Called when an individual glTF primitive is processed.
/// glTF primitives are what become a Bevy `Mesh`.
///
/// `input` provides read-only access to the glTF document and buffer data needed for
/// primitive processing. Extensions can read the raw geometry data to process compressed
/// or encoded primitive data.
///
/// `output` allows extensions to provide modified versions of the document or buffers.
/// If set, the loader will use the modified data for subsequent primitive processing.
/// This is useful for extensions like `KHR_draco_mesh_compression` or `EXT_meshopt_compression`
/// that need to decompress or transform primitive data before it is converted to a Bevy `Mesh`.
#[expect(
    unused,
    reason = "default trait implementations do not use the arguments because they are no-ops"
)]
fn on_gltf_primitive(
    &mut self,
    load_context: &mut LoadContext<'_>,
    gltf_primitive: &gltf::Primitive,
    input: &GltfPrimitiveInput,
    output: &mut GltfPrimitiveOutput,
) {
}
```

And the call site would look like:

```rust
let input = GltfPrimitiveInput {
    document: &gltf,
    buffers: &buffer_data,
};
let mut output = GltfPrimitiveOutput::default();
for extension in extensions.iter_mut() {
    extension.on_gltf_primitive(load_context, &primitive, &input, &mut output);
}
let primitive = if let Some(doc) = &output.document
    && let Some(mesh) = doc.meshes().next()
    && let Some(primitive) = mesh.primitives().next()
{
    primitive
} else {
    primitive
};
let buffer_data = if let Some(data) = &output.buffers {
    data
} else {
    &buffer_data
};
```

Is this more reasonable? For other extensions that need support, if there is a need to add something, it can be added directly to these structs.
ChristopherBiscardi
left a comment
Commit 7bbd2bc isn't necessary in my opinion. Especially the GltfPrimitiveInput, which doesn't appear in any other hook and is named GltfPrimitiveInput, but doesn't actually include the gltf::Primitive.
> for other extensions that need support, such as EXT_meshopt_compression
It may turn out that we should add new features for other extensions, but this PR isn't responsible for that. I think enabling the draco extension you already built is enough here, and the PR state before 7bbd2bc achieves that.
With respect to EXT_meshopt_compression, my understanding is that it operates on buffer views, so wouldn't need to use this new on_gltf_primitive hook in theory (ie: This is not an on_buffer_view hook). Making changes to try to support theoretical future EXT_meshopt_compression changes to on_gltf_primitive doesn't make sense because that doesn't seem to be where the processing would take place anyway. The extension can also deal with animation data, which wouldn't appear in this hook either.
I'd ask you to revert the GltfPrimitiveInput/output changes in 7bbd2bc (and the commit after it), since the proposed reason for including it doesn't hold up (it doesn't seem like EXT_meshopt_compression will require it, and EXT_meshopt_compression is not part of this PR).
I think the PR is good to merge after that.
I have already reverted the last two commits.


Objective
Solution
Add a `GltfExtensionHandler` hook that is called before getting geometry data from the primitive buffer.
Showcase
I have finished an extension plugin for draco decoding on native. Since `GltfExtensionHandler` does not support async functions, the wasm part cannot be supported yet. The plugin will be something like:
I'm not sure about the direction Bevy is taking regarding Draco decoding support, nor am I certain whether this API design is appropriate. I've already fully implemented Draco decoding for both Wasm and Native, but the implementation is based on asynchronous APIs and cxx FFI bindings. However, the existing GltfExtensionHandler only provides a synchronous interface, which makes it impossible to execute the Wasm part synchronously on the main thread.
For now, I'm just writing down this idea as a discussion point: under the current API framework, Draco is capable of synchronous decoding on the Native side.
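As a rough illustration of the synchronous hook shape discussed in this thread, here is a self-contained stand-in sketch. All names and types are simplified placeholders (no `LoadContext`, no gltf types), and `decode` fakes the draco decompression step, so this is not Bevy's or bevy_gltf_draco's real API:

```rust
// Stand-in sketch of a synchronous extension handler in the spirit of the
// PR's `on_gltf_primitive` hook. Placeholder trait and types only.
trait GltfExtensionHandler {
    /// Called per primitive; a handler may populate `out_data` with
    /// replacement buffers for the loader to use.
    fn on_gltf_primitive(&mut self, buffer_data: &[Vec<u8>], out_data: &mut Option<Vec<Vec<u8>>>);
}

struct DracoLikeExtension;

impl DracoLikeExtension {
    // Placeholder for a real synchronous draco decode (e.g. via cxxbridge);
    // here it just shifts each byte so the transformation is observable.
    fn decode(&self, compressed: &[u8]) -> Vec<u8> {
        compressed.iter().map(|b| b.wrapping_add(1)).collect()
    }
}

impl GltfExtensionHandler for DracoLikeExtension {
    fn on_gltf_primitive(&mut self, buffer_data: &[Vec<u8>], out_data: &mut Option<Vec<Vec<u8>>>) {
        // "Decompress" every input buffer and hand the result back.
        let decoded = buffer_data.iter().map(|buf| self.decode(buf)).collect();
        *out_data = Some(decoded);
    }
}

fn main() {
    let mut ext = DracoLikeExtension;
    let buffers = vec![vec![0u8, 1, 2]];
    let mut out_data = None;
    ext.on_gltf_primitive(&buffers, &mut out_data);
    assert_eq!(out_data, Some(vec![vec![1u8, 2, 3]]));
    println!("ok");
}
```

The key property mirrored here is that the handler runs synchronously and communicates only through the `out_data` slot, which is why a wasm/async draco decoder cannot plug into the current interface.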