+
+ Examples of how scatterElements works in different slicing schemes.
+
+
+ // input of shape [4,3]:
+ // [[ 0, 1, 2],
+ // [10, 11, 12],
+ // [20, 21, 22],
+ // [30, 31, 32]]
+ // indices of shape [2,3]:
+ // [[3, 1, 1],
+ // [2, 0, 3]]
+ // updates of shape [2,3]:
+ // [[-1, -2, -3],
+ // [-4, -5, -6]]
+ // axis = 0 (default)
+ // output of shape [4,3]:
+ // [[ 0, -5, 2],
+ // [10, -2, -3],
+ // [-4, 21, 22],
+ // [-1, 31, -6]]
+
+ const input1 = builder.constant(
+ {dataType: 'float32', shape: [4, 3]},
+ new Float32Array([0, 1, 2, 10, 11, 12, 20, 21, 22, 30, 31, 32]));
-
- The specific sampling algorithms are based on those widely used in existing Machine Learning frameworks. For example, when performing {{MLInterpolationMode/linear}} resampling from the following *[4, 4]* input tensor (considering only spatial dimensions):
+ const indices1 = builder.constant(
+ {dataType: 'uint32', shape: [2, 3]},
+ new Uint32Array([3, 1, 1, 2, 0, 3]));
- ```
- [ 0 1 2 3 ]
- [ 0 1 2 3 ]
- [ 12 13 14 15 ]
- [ 12 13 14 15 ]
- ```
+ const updates1 = builder.constant(
+ {dataType: 'float32', shape: [2, 3]},
+ new Float32Array([-1, -2, -3, -4, -5, -6]));
- For an *[8, 8]* output tensor, the expected values are:
+ const output1 = builder.scatterElements(input1, indices1, updates1);
- ```
- [ 0 0.25 0.75 1.25 1.75 2.25 2.75 3 ]
- [ 0 0.25 0.75 1.25 1.75 2.25 2.75 3 ]
- [ 0 0.25 0.75 1.25 1.75 2.25 2.75 3 ]
- [ 3 3.25 3.75 4.25 4.75 5.25 5.75 6 ]
- [ 9 9.25 9.75 10.25 10.75 11.25 11.75 12 ]
- [ 12 12.25 12.75 13.25 13.75 14.25 14.75 15 ]
- [ 12 12.25 12.75 13.25 13.75 14.25 14.75 15 ]
- [ 12 12.25 12.75 13.25 13.75 14.25 14.75 15 ]
- ```
+ // input of shape [4,3]:
+ // [[ 0, 1, 2],
+ // [10, 11, 12],
+ // [20, 21, 22],
+ // [30, 31, 32]]
+ // indices of shape [4,1]:
+ // [[2],
+ // [1],
+ // [0],
+ // [2]],
+ // updates of shape [4,1]:
+ // [[-1],
+ // [-2],
+ // [-3],
+ // [-4]],
+ // axis = 1
+ // output of shape [4,3]:
+ // [[ 0, 1, -1],
+ // [10, -2, 12],
+ // [-3, 21, 22],
+ // [30, 31, -4]]
- This has the convenient properties that the sampling is evenly distributed, symmetric, robust to image mirroring, and the corner values are aligned.
+ const indices2 = builder.constant(
+ {dataType: 'uint32', shape: [4, 1]},
+ new Uint32Array([2, 1, 0, 2]));
+
+ const updates2 =
+ builder.constant({dataType: 'float32', shape: [4, 1]},
+ new Float32Array([-1, -2, -3, -4]));
+
+ const output2 = builder.scatterElements(input1, indices2, updates2, {axis: 1});
+
+ // input of shape [4,2,2]:
+ // [[[ 0, 1],
+ // [ 10, 11]],
+ // [[100, 101],
+ // [110, 111]],
+ // [[200, 201],
+ // [210, 211]],
+ // [[300, 301],
+ // [310, 311]]]
+ // indices of shape [1,2,2]:
+ // [[[0, 2],
+ // [1, 3]]],
+ // updates of shape [1,2,2]:
+ // [[[-1, -2],
+ // [-3, -4]]],
+ // axis = 0
+ // output of shape [4,2,2]:
+ // [[[ -1, 1],
+ // [ 10, 11]],
+ // [[100, 101],
+ // [ -3, 111]],
+ // [[200, -2],
+ // [210, 211]],
+ // [[300, 301],
+ // [310, -4]]]
+ const input2 = builder.constant(
+ {dataType: 'float32', shape: [4, 2, 2]},
+ new Float32Array([0, 1, 10, 11, 100, 101, 110, 111, 200, 201, 210, 211, 300, 301, 310, 311]));
+
+ const indices3 = builder.constant(
+ {dataType: 'uint32', shape: [1, 2, 2]},
+ new Uint32Array([0, 2, 1, 3]));
+
+ const updates3 =
+ builder.constant({dataType: 'float32', shape: [1, 2, 2]},
+ new Float32Array([-1, -2, -3, -4]));
+
+ const output3 = builder.scatterElements(input2, indices3, updates3, {axis: 0});
+
+
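The outputs in the comments above can be verified with a plain-JavaScript sketch of scatterElements semantics over flat row-major arrays (illustrative only; `scatterElementsRef` and its flat-array representation are not part of the WebNN API):

```javascript
// Sketch of scatterElements: for each position in `indices`,
// replace the coordinate along `axis` with the index value and
// write the corresponding update into a copy of the input.
function scatterElementsRef(input, inputShape, indices, indicesShape, updates, axis) {
  const strides = (shape) => {
    const s = new Array(shape.length).fill(1);
    for (let i = shape.length - 2; i >= 0; --i) s[i] = s[i + 1] * shape[i + 1];
    return s;
  };
  const inStrides = strides(inputShape);
  const idxStrides = strides(indicesShape);
  const output = input.slice();
  const total = indicesShape.reduce((a, b) => a * b, 1);
  for (let flat = 0; flat < total; ++flat) {
    // Decompose the flat offset into a coordinate within `indices`.
    const coord = idxStrides.map((s, d) => Math.floor(flat / s) % indicesShape[d]);
    coord[axis] = indices[flat]; // redirect along the scatter axis
    const outOffset = coord.reduce((acc, c, d) => acc + c * inStrides[d], 0);
    output[outOffset] = updates[flat];
  }
  return output;
}
```

For the first example above, `scatterElementsRef([0, 1, 2, 10, 11, 12, 20, 21, 22, 30, 31, 32], [4, 3], [3, 1, 1, 2, 0, 3], [2, 3], [-1, -2, -3, -4, -5, -6], 0)` yields the flattened output `[0, -5, 2, 10, -2, -3, -4, 21, 22, -1, 31, -6]`.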
-### reshape ### {#api-mlgraphbuilder-reshape-method}
-Alter the shape of a tensor to a new shape. Reshape does not copy or change the content of the tensor. It just changes the tensor's logical shape for the subsequent operations.
+### scatterNd ### {#api-mlgraphbuilder-scatternd}
+ Scatter slices of values from the updates tensor atop a copy of the input tensor according to the indices.
+
-
+
+
**Arguments:**
- - input: an {{MLOperand}}. The input tensor.
- - newShape: [=sequence=]<{{unsigned long}}>. The shape of the output tensor.
- The number of elements implied by {{MLGraphBuilder/reshape(input, newShape, options)/newShape}} must be the same as the
- number of elements in the input tensor.
- - options: an {{MLOperatorOptions}}. Specifies the optional parameters of the operation.
+ - input: an {{MLOperand}}. The input N-D tensor to initialize the output with.
+ - indices: an {{MLOperand}}. The indices tensor contains entire coordinates into the output tensor, with the size of the rightmost dimension giving the number of components per coordinate. So an indices tensor of shape [10,1] holds 10 single-axis indices, and a shape of [4,3] holds 4 indices of 3-D coordinates. The values must be of type {{MLOperandDataType/"int32"}}, {{MLOperandDataType/"uint32"}}, or {{MLOperandDataType/"int64"}}, and each must be in the range -N (inclusive) to N (exclusive) where N is the size of the corresponding output dimension. A negative index means indexing from the end of the corresponding dimension.
+ - updates: an {{MLOperand}}. New values to replace atop the input.
+ - options: an optional {{MLScatterOptions}}. The optional parameters of the operation.
- **Returns:** an {{MLOperand}}. The output tensor. The values of the output
- tensor are the same as values of the input tensor. The shape of the output
- tensor is specified by {{MLGraphBuilder/reshape(input, newShape, options)/newShape}}.
+ **Returns:** an {{MLOperand}}. The output N-D tensor of the same [=MLOperand/shape=] as *input*, with the slices at the coordinates given by *indices* replaced by the corresponding values from *updates*.
-
- Constraints for {{MLGraphBuilder/reshape()}}
+
+ Constraints for {{MLGraphBuilder/scatterNd()}}
operand |
@@ -7290,42 +8433,162 @@ partial dictionary MLOpSupportLimits {
{{input}} |
[=/any data type|any=] |
- [=/any rank|N=] |
+ 1 to [=/any rank|N=] |
+
+
+ {{indices}} |
+ {{MLOperandDataType/"int32"}}, {{MLOperandDataType/"uint32"}}, {{MLOperandDataType/"int64"}} |
+ 1 to [=/any rank|N=] |
+
+
+ {{updates}} |
+ [=/same type as|same as=] {{input}} |
+ *input*'s [=MLOperand/rank=] + *indices*'s [=MLOperand/rank=] - *indices*'s [=MLOperand/shape=][-1] - 1 |
*output* |
[=/same type as|same as=] {{input}} |
- {{newShape}}'s [=list/size=] |
+ [=/same rank as|same as=] {{input}} |
-{{MLOpSupportLimits}} has the following member for {{MLGraphBuilder/reshape()}}:
+{{MLOpSupportLimits}} has the following members for {{MLGraphBuilder/scatterNd()}}:
- : reshape
- :: Support limits for operator {{MLGraphBuilder/reshape()}}.
+ : scatterNd
+ :: Support limits for operator {{MLGraphBuilder/scatterNd()}}.
+
+ The {{MLGraphBuilder/scatterNd(input, indices, updates, options)/indices}} parameter to {{MLGraphBuilder/scatterNd()}} cannot be clamped to the allowed range when the graph is built because the index values are not known until execution. Implementations can introduce {{MLGraphBuilder/clamp()}} in the compiled graph if the required clamping behavior is not provided by the underlying platform. Similarly, if the underlying platform does not support negative indices, the implementation can introduce operations in the compiled graph to transform a negative index from the end of the dimension into a positive index.
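The index normalization an implementation might insert can be sketched as follows (a minimal sketch; the helper name and the clamping choice are assumptions, not spec-mandated behavior):

```javascript
// Map an index in [-N, N) to [0, N), then clamp out-of-range values,
// mirroring the clamp() an implementation may add to the compiled graph.
function normalizeIndex(index, dimSize) {
  const i = index < 0 ? index + dimSize : index;
  return Math.min(Math.max(i, 0), dimSize - 1);
}
```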
+
+
- The reshape(|input|, |newShape|, |options|) method steps are:
+ The scatterNd(|input|, |indices|, |updates|, |options|) method steps are:
- 1. If [=this=] [=MLGraphBuilder/can not build=], then [=exception/throw=] an "{{InvalidStateError}}" {{DOMException}}.
- 1. If [=MLGraphBuilder/validating operand=] with [=this=] and |input| returns false, then [=exception/throw=] a {{TypeError}}.
- 1. Let |outputShape| be an empty array of {{unsigned long}}.
- 1. If |newShape|'s [=list/size=] is 0, set |outputShape| to an empty [=/list=] for a scalar.
- 1. If any [=list/item=] in |newShape| is not a [=valid dimension=], then [=exception/throw=] a {{TypeError}}.
- 1. Let |inputElementCount| be the product of all elements in |input|'s [=MLOperand/shape=]. Empty dimensions yield an |inputElementCount| of 1.
- 1. If product of all values in |newShape| is not equal to |inputElementCount|, then [=exception/throw=] a {{TypeError}}.
- 1. Let |desc| be a copy of |input|.{{MLOperand/[[descriptor]]}}.
- 1. Set |desc|.{{MLOperandDescriptor/shape}} to |newShape|.
- 1. *Make graph connections:*
- 1. Let |output| be the result of [=creating an MLOperand=] given [=this=] and |desc|.
- 1. Let |operator| be an [=operator=] for the "reshape" operation, given |options|.
- 1. Set |output|.{{MLOperand/[[operator]]}} to |operator|.
- 1. Set |operator|'s [=operator/input=] to |input|.
- 1. Set |operator|'s [=operator/output=] to |output|.
- 1. Return |output|.
+ TODO:
+
+
+
+
+
+ Examples of how scatterNd works in different slicing schemes.
+
+
+ // input of shape [8]:
+ // [0, 1, 2, 3, 4, 5, 6, 7]
+ // indices of shape [4, 1]:
+ // [[4],
+ // [3],
+ // [1],
+ // [7]]
+ // updates of shape [4]:
+ // [-1, -2, -3, -4]
+ // output of shape [8]:
+ // [0, -3, 2, -2, -1, 5, 6, -4]
+
+ const input1 = builder.constant(
+ {dataType: 'float32', shape: [8]},
+ new Float32Array([0, 1, 2, 3, 4, 5, 6, 7]));
+
+ const indices1 = builder.constant(
+ {dataType: 'uint32', shape: [4, 1]},
+ new Uint32Array([4, 3, 1, 7]));
+
+ const updates1 = builder.constant(
+ {dataType: 'float32', shape: [4]},
+ new Float32Array([-1, -2, -3, -4]));
+
+ const output1 = builder.scatterNd(input1, indices1, updates1);
+
+ // input of shape [2,2]:
+ // [[0, 1],
+ // [2, 3]]
+ // indices of shape [2,2]:
+ // [[0, 0],
+ // [1, 1]]
+ // updates of shape [2]:
+ // [-1, -2]
+ // output of shape [2,2]:
+ // [[-1, 1], <= -1 written to element coordinate [0, 0]
+ // [ 2, -2]] <= -2 written to element coordinate [1, 1]
+
+ const input2 = builder.constant(
+ {dataType: 'float32', shape: [2, 2]},
+ new Float32Array([0, 1, 2, 3]));
+
+ const indices2 = builder.constant(
+ {dataType: 'uint32', shape: [2, 2]},
+ new Uint32Array([0, 0, 1, 1]));
+
+ const updates2 = builder.constant(
+ {dataType: 'float32', shape: [2]},
+ new Float32Array([-1, -2]));
+
+ const output2 = builder.scatterNd(input2, indices2, updates2);
+
+ // input of shape [3,2]:
+ // [[0, 1],
+ // [2, 3],
+ // [4, 5]]
+ // indices of shape [2,1]:
+ // [[2],
+ // [0]]
+ // updates of shape [2,2]:
+ // [[-1, -2],
+ // [-3, -4]]
+ // output of shape [3,2]:
+ // [[-3, -4], <= [-3, -4] written to element coordinate [0, *]
+ // [ 2, 3],
+ // [-1, -2]] <= [-1, -2] written to element coordinate [2, *]
+
+ const input3 = builder.constant(
+ {dataType: 'float32', shape: [3, 2]},
+ new Float32Array([0, 1, 2, 3, 4, 5]));
+
+ const indices3 = builder.constant(
+ {dataType: 'uint32', shape: [2, 1]},
+ new Uint32Array([2, 0]));
+
+ const updates3 = builder.constant(
+ {dataType: 'float32', shape: [2, 2]},
+ new Float32Array([-1, -2, -3, -4]));
+
+ const output3 = builder.scatterNd(input3, indices3, updates3);
+
+ // input of shape [2,2,2]:
+ // [[[0, 1],
+ // [2, 3]],
+ // [[4, 5],
+ // [6, 7]]]
+ // indices of shape [2,2]:
+ // [[0, 1],
+ // [1, 0]]
+ // updates of shape [2,2]:
+ // [[-1, -2],
+ // [-3, -4]]
+ // output of shape [2,2,2]:
+ // [[[ 0, 1],
+ // [-1, -2]], <= [-1, -2] written to element coordinate [0, 1, *]
+ // [[-3, -4], <= [-3, -4] written to element coordinate [1, 0, *]
+ // [ 6, 7]]]
+
+ const input4 = builder.constant(
+ {dataType: 'float32', shape: [2, 2, 2]},
+ new Float32Array([0, 1, 2, 3, 4, 5, 6, 7]));
+
+ const indices4 = builder.constant(
+ {dataType: 'uint32', shape: [2, 2]},
+ new Uint32Array([0, 1, 1, 0]));
+
+ const updates4 = builder.constant(
+ {dataType: 'float32', shape: [2, 2]},
+ new Float32Array([-1, -2, -3, -4]));
+
+ const output4 = builder.scatterNd(input4, indices4, updates4);
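The examples above can be reproduced with a plain-JavaScript sketch of scatterNd semantics over flat row-major arrays (illustrative only; `scatterNdRef` is not part of the WebNN API):

```javascript
// Sketch of scatterNd: each row of `indices` (k components) addresses a
// slice of the input; the corresponding slice of `updates` overwrites it.
function scatterNdRef(input, inputShape, indices, indicesShape, updates) {
  const output = input.slice();
  const k = indicesShape[indicesShape.length - 1];
  const numCoords = indicesShape.slice(0, -1).reduce((a, b) => a * b, 1);
  // Each scattered slice spans the remaining (unindexed) dimensions.
  const sliceSize = inputShape.slice(k).reduce((a, b) => a * b, 1);
  // Row-major strides of the input over its first k dimensions.
  const strides = new Array(k);
  for (let i = 0; i < k; ++i)
    strides[i] = inputShape.slice(i + 1).reduce((a, b) => a * b, 1);
  for (let n = 0; n < numCoords; ++n) {
    let offset = 0;
    for (let d = 0; d < k; ++d) offset += indices[n * k + d] * strides[d];
    for (let e = 0; e < sliceSize; ++e)
      output[offset + e] = updates[n * sliceSize + e];
  }
  return output;
}
```

For the last example, `scatterNdRef([0, 1, 2, 3, 4, 5, 6, 7], [2, 2, 2], [0, 1, 1, 0], [2, 2], [-1, -2, -3, -4])` yields `[0, 1, -1, -2, -3, -4, 6, 7]`.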
+
+
### sigmoid ### {#api-mlgraphbuilder-sigmoid-method}
@@ -7411,23 +8674,39 @@ partial dictionary MLOpSupportLimits {
### slice ### {#api-mlgraphbuilder-slice}
Produce a slice of the input tensor.
+
+{{MLSliceOptions}} has the following members:
+
+ : strides
+ ::
+ The stride to step over the input values along each axis.
+ The length of the strides array must equal the [=MLOperand/rank=] of the input tensor.
+ The default is an array of length [=MLOperand/rank=] consisting of all 1's,
+ e.g. [1, 1, 1] for a 3-D tensor.
+ Each stride must be greater than zero.
+
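Assuming the usual strided-slice semantics, each output dimension then holds `ceil(sizes[d] / strides[d])` elements; a hedged sketch of that shape computation (the helper name is illustrative):

```javascript
// Output extent per dimension when taking `sizes` elements with `strides`:
// elements at offsets 0, stride, 2*stride, ... within the sliced window.
function stridedSliceShape(sizes, strides) {
  return sizes.map((size, d) => Math.ceil(size / strides[d]));
}
```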
+
**Arguments:**
- input: an {{MLOperand}}. The input tensor.
- - starts: [=sequence=]<{{unsigned long}}>. The starting index to slice of each input dimension, of length N where N is the [=MLOperand/rank=] of the input tensor. For each dimension *d* of {{MLGraphBuilder/slice(input, starts, sizes, options)/input}}, {{MLGraphBuilder/slice(input, starts, sizes, options)/starts}}[*d*] indicates the starting index to slice in that dimension. The starting index must be in the range [0, input size - 1] in that dimension.
- - sizes: a [=sequence=]<{{unsigned long}}>. The number of elements to slice of each input dimension, of length N where N is the [=MLOperand/rank=] of the input tensor. For each dimension *d* of {{MLGraphBuilder/slice(input, starts, sizes, options)/input}}, {{MLGraphBuilder/slice(input, starts, sizes, options)/sizes}}[*d*] indicates the number of elements to slice in that dimension. The size must not be 0 and must satisfy the constraint `starting index + size <= input size` in that dimension.
- - options: an {{MLOperatorOptions}}. Specifies the optional parameters of the operation.
+ - starts: a [=sequence=]<{{unsigned long}}>. The starting index to slice of each input dimension, of length N where N is the [=MLOperand/rank=] of the input tensor. For each dimension *d* of {{MLGraphBuilder/slice(input, starts, sizes, options)/input}}, {{MLGraphBuilder/slice(input, starts, sizes, options)/starts}}[*d*] indicates the starting index to slice in that dimension. The starting index must be in the range [0, input size - 1] in that dimension.
+ - sizes: a [=sequence=]<{{unsigned long}}>. The number of elements to slice of each input dimension, of length N where N is the [=MLOperand/rank=] of the input tensor. For each dimension *d* of {{MLGraphBuilder/slice(input, starts, sizes, options)/input}}, {{MLGraphBuilder/slice(input, starts, sizes, options)/sizes}}[*d*] indicates the number of elements to slice in that dimension. The size must not be 0 and must satisfy the constraint `starting index + size <= input size` in that dimension.
+ - options: an {{MLSliceOptions}}. Specifies the optional parameters of the operation.
**Returns:** an {{MLOperand}}. The output tensor of the same rank as the input tensor with tensor values stripped to the specified starting and ending indices in each dimension.
@@ -7467,6 +8746,8 @@ partial dictionary MLOpSupportLimits {
1. If [=MLGraphBuilder/validating operand=] with [=this=] and |input| returns false, then [=exception/throw=] a {{TypeError}}.
1. If any of |sizes|'s [=list/items=] are 0, then [=exception/throw=] a {{TypeError}}.
1. If |starts|'s [=list/size=] and |sizes|'s [=list/size=] are not both equal to |input|'s [=MLOperand/rank=], then [=exception/throw=] a {{TypeError}}.
+ 1. If |options|.{{MLSliceOptions/strides}} [=map/exists=]:
+ 1. If |options|.{{MLSliceOptions/strides}}'s [=list/size=] is not equal to |input|'s [=MLOperand/rank=], then [=exception/throw=] a {{TypeError}}.
1. [=list/For each=] |index| in [=the range=] 0 to |input|'s [=MLOperand/rank=], exclusive:
1. If |sizes|[|index|] is 0, then [=exception/throw=] a {{TypeError}}.
@@ -7474,6 +8755,8 @@ partial dictionary MLOpSupportLimits {
1. If |starts|[|index|] is greater than or equal to |input|'s [=MLOperand/shape=][|index|], then [=exception/throw=] a {{TypeError}}.
1. If |starts|[|index|] + |sizes|[|index|] is greater than |input|'s [=MLOperand/shape=][|index|], then [=exception/throw=] a {{TypeError}}.
+ 1. If |options|.{{MLSliceOptions/strides}} [=map/exists=]:
+ 1. If |options|.{{MLSliceOptions/strides}}[|index|] is less than 1, then [=exception/throw=] a {{TypeError}}.
1. *Make graph connections:*
1. Let |output| be the result of [=copying an MLOperand=] given |input|.
1. Let |operator| be an [=operator=] for the "slice" operation, given |starts|, |sizes|, and |options|.
@@ -7949,6 +9232,77 @@ partial dictionary MLOpSupportLimits {
+### tile ### {#api-mlgraphbuilder-tile}
+Repeat a tensor the given number of times along each dimension.
+
+
+
+
+ **Arguments:**
+ - input: an {{MLOperand}}. The input N-D tensor.
+ - repetitions: a [=sequence=]<{{unsigned long}}>. A count per dimension of how many times to repeat that dimension. The |repetitions| [=list/size=] must match the |input|'s [=MLOperand/rank=], using 1's for any axis that should retain the same size.
+ - options: an optional {{MLOperatorOptions}}. The optional parameters of the operation.
+
+ **Returns:** an {{MLOperand}}. The tiled N-D tensor, whose shape is the input's shape with each dimension multiplied by the corresponding repetition count.
+
+
+
+ Constraints for {{MLGraphBuilder/tile()}}
+
+
+ operand |
+ [=/allowed data types=] |
+ [=/allowed ranks=] |
+
+
+
+ {{input}} |
+ [=/any data type|any=] |
+ [=/any rank|N=] |
+
+
+ *output* |
+ [=/same type as|same as=] {{input}} |
+ [=/same rank as|same as=] {{input}} |
+
+
+
+{{MLOpSupportLimits}} has the following members for {{MLGraphBuilder/tile()}}:
+
+ : tile
+ :: Support limits for operator {{MLGraphBuilder/tile()}}.
+
+
+
+
+ The tile(|input|, |repetitions|, |options|) method steps are:
+
+ 1. If [=this=] [=MLGraphBuilder/can not build=], then [=exception/throw=] an "{{InvalidStateError}}" {{DOMException}}.
+ 1. If [=MLGraphBuilder/validating operand=] with [=this=] and |input| returns false, then [=exception/throw=] a {{TypeError}}.
+ 1. If |repetitions|'s [=list/size=] is not equal to |input|'s [=MLOperand/rank=], then [=exception/throw=] a {{TypeError}}.
+ 1. If |repetitions|'s values contain 0's, then [=exception/throw=] a {{TypeError}}.
+
+ Issue(391): If 0-size dimensions are allowed, revise these steps.
+
+ 1. *Make graph connections:*
+ 1. Let |output| be the result of [=copying an MLOperand=] given |input|.
+ 1. Let |operator| be an [=operator=] for the "tile" operation, given |options|.
+ 1. Set |output|.{{MLOperand/[[operator]]}} to |operator|.
+ 1. Set |operator|'s [=operator/input=] to |input|.
+ 1. Set |operator|'s [=operator/output=] to |output|.
+ 1. Return |output|.
+
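The repetition semantics can be sketched in plain JavaScript over a flat row-major array (illustrative only; `tileRef` is not part of the API):

```javascript
// Sketch of tile: every output coordinate maps back to the input
// coordinate taken modulo the input extent along each dimension.
function tileRef(input, shape, repetitions) {
  const outShape = shape.map((s, d) => s * repetitions[d]);
  const rowMajorStrides = (sh) =>
    sh.map((_, i) => sh.slice(i + 1).reduce((a, b) => a * b, 1));
  const inStrides = rowMajorStrides(shape);
  const outStrides = rowMajorStrides(outShape);
  const total = outShape.reduce((a, b) => a * b, 1);
  const output = new Array(total);
  for (let flat = 0; flat < total; ++flat) {
    let offset = 0;
    for (let d = 0; d < shape.length; ++d) {
      const coord = Math.floor(flat / outStrides[d]) % outShape[d];
      offset += (coord % shape[d]) * inStrides[d];
    }
    output[flat] = input[offset];
  }
  return output;
}
```

For example, tiling a `[2, 2]` tensor `[[1, 2], [3, 4]]` with repetitions `[1, 2]` gives `[[1, 2, 1, 2], [3, 4, 3, 4]]`.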
+
### transpose ### {#api-mlgraphbuilder-transpose}
Permute the dimensions of the input tensor according to {{MLTransposeOptions/permutation}}.
@@ -8308,6 +9662,8 @@ The shapes of the input tensors must be compatible. A tensor is [=unidirectional
Two tensors are [=bidirectionally broadcastable=] if they can be mutually "stretched" (repeated) across their various dimensions, starting from the last dimension. For example, a *[5,1]* tensor can be bidirectionally broadcast with a *[1,6]* tensor by repeating the first tensor 6 times in the last dimension and the second tensor 5 times in preceding dimension. The result of the operation will be a *[5,6]* tensor. Bidirectional broadcasting is convenient for element-wise operations.
+A tensor is [=blockwise broadcastable=] if all of its dimensions can be upsampled by integer multiples to the target tensor's shape. For example, a *[4,5]* tensor can be blockwise broadcast up to a *[16,10]* tensor as it is an exact multiple (16 % 4 = 0, 10 % 5 = 0) by repeating every element 4 times in the first dimension and every element 2 times in the last dimension (e.g. values *[1,2,3,4,5]* in a single slice would be repeated to *[1,1,2,2,3,3,4,4,5,5]*). However, a *[4,5]* tensor would be incompatible with a *[9,3]* tensor since both dimensions have a nonzero remainder (9 % 4 = 1, 3 % 5 = 3). Blockwise broadcasting is useful for sharing common values in larger blocks to save memory. Both tensors are expected to have the same rank, and the output shape is simply the target tensor's shape which the smaller one is being upsampled to.
+
Some operations allow broadcasting with special semantics. For example, {{MLGraphBuilder/matmul()}} treats the last two dimensions of the input tensors as the rows and columns of the matrices, and the number of columns in the first matrix must be equal to the number of rows in the second matrix. The matrix multiplication is bidirectionally broadcast across any additional dimensions, treating the input tensors as stacks of matrices to multiply.
@@ -8360,6 +9716,22 @@ To bidirectionally broadcast the sha
|shapeA| is bidirectionally broadcastable to |shapeB| if [=bidirectionally broadcasting=] |shapeA| and |shapeB| does not result in failure.
+
+
+To blockwise broadcast the shapes |shapeFrom| and |shapeTo|, perform the following steps. |shapeFrom| and |shapeTo| are [=/lists=] of positive integers, representing the dimensions of tensors, and the steps return a new [=/list=] of positive integers, or failure.
+
+
+1. If |shapeFrom|'s [=list/size=] is not equal to |shapeTo|'s [=list/size=], then return failure.
+1. [=list/For each=] |index| in [=the range=] 0 to |shapeTo|'s [=list/size=], exclusive:
+ 1. If |shapeTo|[|index|] is not an exact multiple of |shapeFrom|[|index|], then return failure.
+1. Return |shapeTo|.
+
+
+
+
+|shapeFrom| is blockwise broadcastable to |shapeTo| if [=blockwise broadcasting=] |shapeFrom| and |shapeTo| does not result in failure.
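The steps above can be sketched as follows (returning `null` in place of failure; illustrative only):

```javascript
// Blockwise broadcast: every target dimension must be an exact
// integer multiple of the corresponding source dimension.
function blockwiseBroadcastShapes(shapeFrom, shapeTo) {
  if (shapeFrom.length !== shapeTo.length) return null; // rank mismatch: failure
  for (let d = 0; d < shapeTo.length; ++d)
    if (shapeTo[d] % shapeFrom[d] !== 0) return null; // nonzero remainder: failure
  return shapeTo.slice();
}
```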
+
+
## Casting ## {#algorithms-casting}
Explicit numeric casting is used in algorithms where parameters passed as {{MLNumber}} or {{double}} need to be converted to match the {{MLOperandDataType}} of input or output {{MLOperand}}s.
@@ -8569,8 +9941,8 @@ Operations present in other neural network inference APIs can often be emulated
function flatten(builder, input, axis) {
if (axis > input.shape.length)
return input;
- const before = axis.slice(0, axis).reduce((a, b) => a * b);
- const after = axis.slice(axis, input.shape.length).reduce((a, b) => a * b);
+ const before = input.shape.slice(0, axis).reduce((a, b) => a * b, 1);
+ const after = input.shape.slice(axis, input.shape.length).reduce((a, b) => a * b, 1);
return builder.reshape(input, [before, after]);
}
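The shape arithmetic in this emulation can be checked on its own (the helper name is illustrative):

```javascript
// Collapse a shape into [product of dims before axis, product of the rest].
function flattenShape(shape, axis) {
  const before = shape.slice(0, axis).reduce((a, b) => a * b, 1);
  const after = shape.slice(axis).reduce((a, b) => a * b, 1);
  return [before, after];
}
```

e.g. `flattenShape([2, 3, 4], 1)` gives `[2, 12]`, and `flattenShape([2, 3, 4], 0)` gives `[1, 24]`.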
@@ -9127,6 +10499,12 @@ Thanks to Feng Dai for his continuous contributions that keep web-platform-tests
"Thomas Scialom"
],
"date": "July 2023"
+ },
+ "Prefix-Sum": {
+ "href": "https://en.wikipedia.org/wiki/Prefix_sum",
+ "title": "Prefix Sum",
+ "authors": ["The Wikipedia community"],
+ "date": "January 2025"
}
}