This is a learning note on Python practice and exercises, including but not limited to NumPy and PyTorch. Continuously updating…
The output of the same function name can differ after importing a library, because the import shadows the built-in name:

```python
print(sum(range(5), -1))   # output: 9  (built-in sum, start=-1)

from numpy import *
print(sum(range(5), -1))   # output: 10 (numpy.sum, axis=-1)
```
- built-in function `sum(iterable, start)`: `iterable` is an iterable (list, tuple, dict, etc.); `start` is a value added to the sum of the items of the iterable, and defaults to 0 if omitted.
- `numpy.sum(array, axis)`: `array` holds the elements to sum; `axis` is an integer indicating the axis or axes along which to sum. If `axis` is negative, it counts from the last to the first axis.
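To make the `axis` argument concrete, a small sketch on a 2-D array (the array values here are just illustrative):

```python
import numpy as np

a = np.array([[1, 2, 3],
              [4, 5, 6]])
print(np.sum(a, axis=0))   # sums down the columns -> [5 7 9]
print(np.sum(a, axis=-1))  # sums along the last axis (each row) -> [ 6 15]
```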
bitwise shift operators `<<` and `>>`

- `<<`: `x << y` returns `x` with the bits shifted to the left by `y` places. Same as multiplying `x` by `2**y`.
- `>>`: `x >> y` returns `x` with the bits shifted to the right by `y` places. Same as floor-dividing `x` by `2**y`.
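A quick check of both identities (the value 13 is just an illustrative choice):

```python
x = 13             # binary 0b1101
print(x << 2)      # 52, same as 13 * 2**2
print(x >> 2)      # 3,  same as 13 // 2**2 (low bits are discarded)
```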
generator function

A generator function doesn't return a single value; it returns an iterator object that yields a sequence of values. It uses `yield` rather than `return`. The difference between `yield` and `return` is that `yield` returns a value and pauses the execution while maintaining the internal state, whereas the `return` statement returns a value and terminates the execution of the function.

```python
import numpy as np

def generate():
    for x in range(10):
        yield x

a = np.fromiter(generate(), dtype=int)
print(a)  # [0 1 2 3 4 5 6 7 8 9]
```
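The pause-and-resume behavior can be seen by stepping a generator manually with `next()` (a minimal sketch; `countdown` is a hypothetical example name, not from the note):

```python
def countdown(n):
    # hypothetical example: yields n, n-1, ..., 1,
    # pausing after each yield with n preserved
    while n > 0:
        yield n
        n -= 1

g = countdown(3)
print(next(g))   # 3 -- execution pauses at the yield
print(next(g))   # 2 -- resumes with the internal state (n) intact
print(list(g))   # [1] -- remaining values
```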
broadcasting semantics

Two tensors are "broadcastable" if the following rules hold:

- Each tensor has at least one dimension.
- When iterating over the dimension sizes, starting at the trailing dimension, the dimension sizes must either be equal, one of them is 1, or one of them does not exist.
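The rules above can be checked with a small sketch; NumPy follows the same broadcasting rules as PyTorch tensors, so this uses `np.empty` (the shapes are illustrative assumptions):

```python
import numpy as np

x = np.empty((5, 3, 4, 1))
y = np.empty((   3, 1, 1))
# Compare trailing dimensions right to left:
#   1 vs 1 -> equal; 4 vs 1 -> one is 1;
#   3 vs 3 -> equal; 5 vs (missing) -> does not exist
# so x and y are broadcastable, and the result takes the larger size per dim.
print((x + y).shape)  # (5, 3, 4, 1)
```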