The current implementation does not allow for zero delay. The minimum delay is one render quantum because we have split the DelayNode up front into a DelayWriter and a DelayReader.
As discussed in #70:
The idea would be to create a dummy connection between the DelayReader and the DelayWriter in the DelayNode::new constructor. This connection would do almost nothing (basically, the writer does not use its output and the reader does not use its input) except guarantee the processing order when the delay is not part of a loop. During graph processing, if the node is found to be in a cycle, the graph could simply delete this connection and somehow flag the reader as "in_cycle", so that the latter knows it must clamp its minimum delay to the quantum duration. (I guess that's not far from what is described in the spec, and I don't even think this would need to be reversible, i.e. once in a cycle the node behaves like that forever, even if the cycle is broken later, which is such a weird edge case I can't even imagine it, and where users should anyway know what they are doing.)
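For illustration, here is a minimal sketch in Rust of the bookkeeping this would require. The names (`Graph`, `NodeId`, `break_cycle_at_delay`, `RENDER_QUANTUM_SIZE`) are made up for this example and do not reflect the actual crate internals:

```rust
// Hypothetical sketch only: the types and method names below are illustrative,
// not the crate's real internals.

use std::collections::HashSet;

const RENDER_QUANTUM_SIZE: usize = 128;

type NodeId = usize;

struct Graph {
    /// Dummy Reader -> Writer edges added in DelayNode::new to force ordering.
    dummy_edges: Vec<(NodeId, NodeId)>,
    /// Readers that were found inside a cycle; the flag is never cleared.
    in_cycle_readers: HashSet<NodeId>,
}

impl Graph {
    /// Called during graph ordering when a cycle containing `reader` is found:
    /// drop the dummy edge so the ordering constraint disappears, and remember
    /// that this reader must clamp its minimum delay to one render quantum.
    fn break_cycle_at_delay(&mut self, reader: NodeId) {
        self.dummy_edges.retain(|&(r, _)| r != reader);
        self.in_cycle_readers.insert(reader);
    }

    /// Effective minimum delay (in seconds) for a given reader.
    fn min_delay(&self, reader: NodeId, sample_rate: f64) -> f64 {
        if self.in_cycle_readers.contains(&reader) {
            RENDER_QUANTUM_SIZE as f64 / sample_rate
        } else {
            0.
        }
    }
}

fn main() {
    let mut graph = Graph {
        dummy_edges: vec![(1, 2)], // reader 1 -> writer 2
        in_cycle_readers: HashSet::new(),
    };
    assert_eq!(graph.min_delay(1, 48_000.), 0.);

    // A cycle through the delay is detected during ordering.
    graph.break_cycle_at_delay(1);
    assert!(graph.min_delay(1, 48_000.) > 0.); // clamped to one quantum (~2.67 ms at 48 kHz)
}
```

The important property in this sketch is that the "in_cycle" flag is sticky: once set during ordering it is never cleared, which matches the non-reversible behaviour described above.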