Use AllowSharedBufferSource for MLGraphBuilder.constant() #790

Merged 3 commits on Nov 28, 2024
index.bs: 51 changes (18 additions, 33 deletions)
@@ -1341,7 +1341,8 @@ interface MLGraphBuilder {
MLOperand input(USVString name, MLOperandDescriptor descriptor);

// Create an operand for a graph constant.
MLOperand constant(MLOperandDescriptor descriptor, ArrayBufferView bufferView);
MLOperand constant(MLOperandDescriptor descriptor,
AllowSharedBufferSource buffer);

// Create a scalar operand from the specified number of the specified type.
MLOperand constant(MLOperandDataType type, MLNumber value);
@@ -1352,7 +1353,7 @@ interface MLGraphBuilder {
</script>
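
For illustration, a minimal sketch of what the widened {{AllowSharedBufferSource}} argument permits; the context/builder setup and the descriptor values here are assumptions made for this example, not part of the change:

<pre highlight="js">
// Assumed setup, for illustration only.
const context = await navigator.ml.createContext();
const builder = new MLGraphBuilder(context);

// An ArrayBufferView still works, but a bare ArrayBuffer (or a view over a
// SharedArrayBuffer) may now be passed as well.
const bytes = new ArrayBuffer(4 * Float32Array.BYTES_PER_ELEMENT);
new Float32Array(bytes).set([1, 2, 3, 4]);
const weights = builder.constant({dataType: 'float32', shape: [2, 2]}, bytes);
</pre>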

<div class="note">
The {{MLGraphBuilder}}.{{MLGraphBuilder/build()}} method compiles the graph builder state up to the specified output operands into a compiled graph according to the type of {{MLContext}} that creates it. When the {{MLContext/[[contextType]]}} of the {{MLContext}} is set to "[=context type/default=]", the compiled graph is initialized right before the {{MLGraph}} is returned. This graph initialization stage is important for optimal performance of the subsequent graph executions. It typically involves a process known as "weight preprocessing" where all the constant inputs to the graph are preprocessed and cached at the operating system level for subsequent graph execution calls. The initializing inputs are typically the constant weight data specified through the {{MLGraphBuilder/constant(descriptor, bufferView)|constant()}} method as constant operands during graph construction time.
The {{MLGraphBuilder}}.{{MLGraphBuilder/build()}} method compiles the graph builder state up to the specified output operands into a compiled graph according to the type of {{MLContext}} that creates it. When the {{MLContext/[[contextType]]}} of the {{MLContext}} is set to "[=context type/default=]", the compiled graph is initialized right before the {{MLGraph}} is returned. This graph initialization stage is important for optimal performance of the subsequent graph executions. It typically involves a process known as "weight preprocessing" where all the constant inputs to the graph are preprocessed and cached at the operating system level for subsequent graph execution calls. The initializing inputs are typically the constant weight data specified through the {{MLGraphBuilder/constant(descriptor, buffer)|constant()}} method as constant operands during graph construction time.
</div>
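
As a sketch of the flow this note describes (the operand shapes and the choice of matmul() are illustrative assumptions, not requirements):

<pre highlight="js">
// Constant weight data is supplied while the graph is being constructed ...
const weights = builder.constant(
    {dataType: 'float32', shape: [3, 3]}, new Float32Array(9).fill(0.1));
const input = builder.input('input', {dataType: 'float32', shape: [1, 3]});
const output = builder.matmul(input, weights);

// ... and is typically preprocessed and cached when build() compiles the
// graph, before the MLGraph is returned.
const graph = await builder.build({output});
</pre>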

<div class=internal-slots>
@@ -1416,26 +1417,26 @@ Create a named {{MLOperand}} based on a descriptor, that can be used as an input
### constant operands ### {#api-mlgraphbuilder-constant}
Create a constant {{MLOperand}} that can be used in {{MLGraphBuilder}} methods.

#### {{MLGraphBuilder/constant(descriptor, bufferView)}} #### {#api-mlgraphbuilder-constant-bufferview}
#### {{MLGraphBuilder/constant(descriptor, buffer)}} #### {#api-mlgraphbuilder-constant-buffer}
Create a constant {{MLOperand}} of the specified data type and shape that contains the initializing data.

<div dfn-for="MLGraphBuilder/constant(descriptor, bufferView)" dfn-type=argument>
<div dfn-for="MLGraphBuilder/constant(descriptor, buffer)" dfn-type=argument>
**Arguments:**
- <dfn>descriptor</dfn>: an {{MLOperandDescriptor}}. The descriptor of the output tensor.
- <dfn>bufferView</dfn>: an {{ArrayBufferView}}. The view of the buffer containing the initializing data.
- <dfn>buffer</dfn>: an {{AllowSharedBufferSource}}. The buffer containing the initializing data.
**Returns:** an {{MLOperand}}. The constant output tensor.
</div>

<details open algorithm>
<summary>
The <dfn method for=MLGraphBuilder>constant(|descriptor|, |bufferView|)</dfn> method steps are:
The <dfn method for=MLGraphBuilder>constant(|descriptor|, |buffer|)</dfn> method steps are:
</summary>
1. If [=this=].{{MLGraphBuilder/[[hasBuilt]]}} is true, then [=exception/throw=] an "{{InvalidStateError}}" {{DOMException}}.
1. If [=MLOperandDescriptor/checking dimensions=] given |descriptor| returns false, then [=exception/throw=] a {{TypeError}}.
1. If [=validating buffer with descriptor=] given |bufferView| and |descriptor| returns false, then [=exception/throw=] a {{TypeError}}.
1. If [=validating buffer with descriptor=] given |buffer| and |descriptor| returns false, then [=exception/throw=] a {{TypeError}}.
1. *Make graph connections:*
1. Let |operand| be the result of [=creating an MLOperand=] given [=this=] and |descriptor|.
1. Let |bytes| be the result of [=getting a copy of the bytes held by the buffer source=] given |bufferView|.
1. Let |bytes| be the result of [=getting a copy of the bytes held by the buffer source=] given |buffer|.
1. Add |operand| to [=this=]'s [=MLGraphBuilder/graph=]'s [=computational graph/constants=] with |bytes| as value.
1. Return |operand|.
</details>
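
A short sketch of the copy semantics implied by the steps above; the concrete values are illustrative, and the SharedArrayBuffer case additionally assumes a cross-origin isolated page:

<pre highlight="js">
// The bytes are copied when constant() is called, so later writes to the
// source buffer do not affect the constant operand.
const source = new Float32Array([1, 2, 3, 4]);
const c1 = builder.constant({dataType: 'float32', shape: [4]}, source);
source[0] = 100;  // no effect on c1

// A view over a SharedArrayBuffer is also accepted.
const shared = new Float32Array(
    new SharedArrayBuffer(4 * Float32Array.BYTES_PER_ELEMENT));
const c2 = builder.constant({dataType: 'float32', shape: [4]}, shared);
</pre>
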
@@ -1503,9 +1504,6 @@ Build a composed graph up to a given output operand into a computational graph a
1. Set |graph|.{{MLGraph/[[context]]}} to [=this=].{{MLGraphBuilder/[[context]]}}.
1. [=set/For each=] |operand| in |inputs|:
1. Set |graph|.{{MLGraph/[[inputDescriptors]]}}[|operand|.{{MLOperand/[[name]]}}] to |operand|.{{MLOperand/[[descriptor]]}}.

Issue(566): If {{MLGraphBuilder/constant(descriptor, bufferView)|constants'}} {{ArrayBuffer}}s are not [=ArrayBuffer/transferred=], make copies for [=MLGraphBuilder/graph=]'s [=computational graph/constants=] here.
Review comment (Contributor Author): Note: this issue is closed and not relevant.


1. [=map/For each=] |name| → |operand| of |outputs|:
1. Set |graph|.{{MLGraph/[[outputDescriptors]]}}[|name|] to |operand|.{{MLOperand/[[descriptor]]}}.
1. Let |promise| be [=a new promise=].
@@ -2012,8 +2010,7 @@ partial dictionary MLOpSupportLimits {
input, builder.constant(input.dataType, options.minValue));
} else {
return builder.min(
builder.max(
input, builder.constant(input.dataType, options.minValue)),
builder.max(input, builder.constant(input.dataType, options.minValue)),
builder.constant(input.dataType, options.maxValue));
}
}
@@ -3421,8 +3418,8 @@ partial dictionary MLOpSupportLimits {
{shape: [4, 3]},
new Float32Array([0, 1, 2, 10, 11, 12, 20, 21, 22, 30, 31, 32]));

const indices1 = builder.constant(
{dataType: 'uint32', shape: [2]}, new Uint32Array([3, 1]));
const indices1 =
builder.constant({dataType: 'uint32', shape: [2]}, new Uint32Array([3, 1]));

const indices2 = builder.constant(
{dataType: 'uint32', shape: [3]}, new Uint32Array([2, 1, 1]));
@@ -3937,10 +3934,7 @@ partial dictionary MLOpSupportLimits {
let hiddenState = options.initialHiddenState;

if (!hiddenState) {
const desc = {
dataType: 'float32',
shape: [numDirections, 1, hiddenSize]
};
const desc = {dataType: 'float32', shape: [numDirections, 1, hiddenSize]};
const totalSize = numDirections * hiddenSize;
hiddenState = builder.constant(desc, new Float32Array(totalSize).fill(0));
}
@@ -4619,8 +4613,7 @@ partial dictionary MLOpSupportLimits {
const reduceOptions = {axes: [2, 3], keepDimensions: true};
const mean = builder.reduceMean(input, reduceOptions);
const variance = builder.reduceMean(
builder.pow(
builder.sub(input, mean), builder.constant(input.dataType, 2)),
builder.pow(builder.sub(input, mean), builder.constant(input.dataType, 2)),
reduceOptions);

// The scale and bias values are applied per input feature
@@ -4765,8 +4758,7 @@ partial dictionary MLOpSupportLimits {
const reduceOptions = {axes: [1, 2, 3], keepDimensions: true};
const mean = builder.reduceMean(input, reduceOptions);
const variance = builder.reduceMean(
builder.pow(
builder.sub(input, mean), builder.constant(input.dataType, 2)),
builder.pow(builder.sub(input, mean), builder.constant(input.dataType, 2)),
reduceOptions);

// The scale and bias tensors are of the shape of the input
@@ -5222,19 +5214,13 @@ partial dictionary MLOpSupportLimits {
let cellState = options.initialCellState;

if (!hiddenState) {
const desc = {
dataType: 'float32',
shape: [numDirections, 1, hiddenSize]
};
const desc = {dataType: 'float32', shape: [numDirections, 1, hiddenSize]};
const totalSize = numDirections * hiddenSize;
hiddenState = builder.constant(desc, new Float32Array(totalSize).fill(0));
}

if (!cellState) {
const desc = {
dataType: 'float32',
shape: [numDirections, 1, hiddenSize]
};
const desc = {dataType: 'float32', shape: [numDirections, 1, hiddenSize]};
const totalSize = numDirections * hiddenSize;
cellState = builder.constant(desc, new Float32Array(totalSize).fill(0));
}
@@ -5878,8 +5864,7 @@ partial dictionary MLOpSupportLimits {
<pre highlight="js">
// input: [[1,2,3], [4,5,6]]
const input = builder.constant(
{dataType: 'float32', shape: [2, 3]},
new Float32Array([1, 2, 3, 4, 5, 6]));
{dataType: 'float32', shape: [2, 3]}, new Float32Array([1, 2, 3, 4, 5, 6]));

const beginningPadding = [1, 2];
const endingPadding = [1, 2];