import numpy as np
# Scalar: a single number
scalar_a = 3
# Vector: shape (5,)
vector_a = np.arange(5)
# Matrix: shape (6, 5)
matrix_a = np.random.random((6, 5))
# Tensor: shape (7, 6, 5)
tensor_a = np.random.random((7, 6, 5))
# Broadcasting: arrays of different shapes combined elementwise
matrix_b = matrix_a + vector_a   # (6, 5) + (5,)       -> (6, 5)
tensor_b = tensor_a + vector_a   # (7, 6, 5) + (5,)    -> (7, 6, 5)
tensor_c = tensor_a + matrix_a   # (7, 6, 5) + (6, 5)  -> (7, 6, 5)
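For a quick sanity check, continuing with the arrays defined above (the row index 2 is arbitrary, chosen here only for illustration):
# Each result takes the elementwise maximum of the input shapes.
assert matrix_b.shape == (6, 5)
assert tensor_b.shape == (7, 6, 5)
assert tensor_c.shape == (7, 6, 5)
# Adding a (5,) vector to a (6, 5) matrix adds it to every row.
assert np.allclose(matrix_b[2], matrix_a[2] + vector_a)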
What is broadcasting? Broadcasting follows these rules:
1. If the arrays do not have the same rank, prepend the shape of the lower-rank array with 1s until both shapes have the same length.
2. The two arrays are said to be compatible in a dimension if they have the same size in that dimension, or if one of the arrays has size 1 in that dimension.
3. The arrays can be broadcast together if they are compatible in all dimensions.
4. After broadcasting, each array behaves as if it had shape equal to the elementwise maximum of the shapes of the two input arrays.
5. In any dimension where one array had size 1 and the other array had size greater than 1, the first array behaves as if it were copied along that dimension, as the sketch after this list demonstrates.
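A minimal sketch of rules 1-4, using a hypothetical helper broadcast_shape (the name is mine, not NumPy's), checked against NumPy's own np.broadcast_shapes (available in NumPy 1.20+):
import numpy as np

def broadcast_shape(shape_a, shape_b):
    """Illustrative helper: apply the broadcasting rules to two shapes."""
    # Rule 1: prepend 1s to the lower-rank shape so both have the same length.
    rank = max(len(shape_a), len(shape_b))
    a = (1,) * (rank - len(shape_a)) + tuple(shape_a)
    b = (1,) * (rank - len(shape_b)) + tuple(shape_b)
    result = []
    for da, db in zip(a, b):
        # Rules 2-3: sizes must match, or one of them must be 1.
        if da != db and da != 1 and db != 1:
            raise ValueError(f"incompatible shapes {shape_a} and {shape_b}")
        # Rule 4: the output size is the maximum of the two sizes.
        result.append(max(da, db))
    return tuple(result)

print(broadcast_shape((7, 6, 5), (6, 5)))      # (7, 6, 5)
print(np.broadcast_shapes((7, 6, 5), (6, 5)))  # (7, 6, 5), NumPy agrees
print(broadcast_shape((6, 1), (1, 5)))         # (6, 5): both inputs are stretched
# broadcast_shape((6, 5), (6, 4)) would raise ValueError: sizes 5 and 4 differ.
Rule 5 describes the semantics only: NumPy does not physically copy data along the size-1 dimension; the broadcast view reuses the same memory via strides, so broadcasting itself is cheap.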