scipy.special.log_softmax — SciPy v1.7.1 Manual
docs.scipy.org › doc › scipy — scipy.special.log_softmax: Logarithm of the softmax function. Parameters: x, the input array, and axis, the axis to compute values along (default is None, in which case the softmax is computed over the entire array x). Returns an array with the same shape as x; the exponential of the result sums to 1 along the specified axis. If x is a scalar, a scalar is returned.
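A minimal usage sketch (illustrative, not part of the manual page): log_softmax stays finite where a naive log of softmax would underflow for widely separated inputs.

    import numpy as np
    from scipy.special import log_softmax

    x = np.array([1000.0, 1.0])
    # naive np.log(softmax(x)) would underflow to -inf for the second entry
    print(log_softmax(x))                 # [   0. -999.]
    print(np.exp(log_softmax(x)).sum())   # exponentials sum to 1 along the axis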
scipy.linalg.logm — SciPy v1.7.1 Manual
docs.scipy.org › doc › scipy — scipy.linalg.logm: Compute the matrix logarithm of a square matrix.
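A minimal usage sketch (illustrative, not taken from the page): logm is the functional inverse of expm, so expm(logm(A)) recovers A for a suitable matrix.

    import numpy as np
    from scipy.linalg import logm, expm

    A = np.array([[1.0, 1.0],
                  [0.0, 1.0]])
    L = logm(A)                     # matrix logarithm of A
    print(np.allclose(expm(L), A))  # True: expm undoes logm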
scipy.stats.loguniform — SciPy v1.7.1 Manual
docs.scipy.org › doc › scipy — scipy.stats.loguniform = <scipy.stats._continuous_distns.reciprocal_gen object>: A loguniform or reciprocal continuous random variable. As an instance of the rv_continuous class, the loguniform object inherits from it a collection of generic methods and completes them with details specific for this particular distribution.
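A minimal usage sketch (illustrative): like other rv_continuous instances, loguniform exposes generic methods such as rvs, pdf, and cdf; its shape parameters a and b bound the support.

    import numpy as np
    from scipy.stats import loguniform

    # samples whose logarithms are uniformly distributed on [log(0.01), log(10)]
    samples = loguniform.rvs(0.01, 10, size=1000, random_state=0)
    print(samples.min() >= 0.01, samples.max() <= 10)  # True True
    print(loguniform.pdf(1.0, 0.01, 10))               # density at x = 1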
scipy.stats.poisson — SciPy v1.7.1 Manual
docs.scipy.org › doc › scipy — scipy.stats.poisson = <scipy.stats._discrete_distns.poisson_gen object>: A Poisson discrete random variable. As an instance of the rv_discrete class, the poisson object inherits from it a collection of generic methods and completes them with details specific for this particular distribution.
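A minimal usage sketch (illustrative): the inherited generic methods (pmf, cdf, rvs, mean, var, ...) take the rate mu as the shape parameter.

    from scipy.stats import poisson

    mu = 3.0
    print(poisson.pmf(2, mu))                        # P(X = 2) for X ~ Poisson(mu)
    print(poisson.cdf(2, mu))                        # P(X <= 2)
    print(poisson.mean(mu), poisson.var(mu))         # mean and variance both equal mu
    print(poisson.rvs(mu, size=5, random_state=0))   # five random draws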
scipy.stats.lognorm — SciPy v1.7.1 Manual
docs.scipy.org › doc › scipy — scipy.stats.lognorm = <scipy.stats._continuous_distns.lognorm_gen object>: A lognormal continuous random variable. As an instance of the rv_continuous class, the lognorm object inherits from it a collection of generic methods and completes them with details specific for this particular distribution.
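A minimal usage sketch (illustrative): SciPy parameterizes lognorm by a shape s and a scale, so lognorm(s, scale=np.exp(mu)) corresponds to exp(N(mu, s**2)).

    import numpy as np
    from scipy.stats import lognorm

    s, mu = 0.5, 1.0
    dist = lognorm(s, scale=np.exp(mu))         # lognormal with log-mean mu, log-std s
    x = np.linspace(0.5, 10, 5)
    print(dist.pdf(x))                          # probability density at x
    print(dist.mean(), np.exp(mu + s**2 / 2))   # closed-form mean matches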
scipy.stats.entropy — SciPy v1.7.1 Manual
docs.scipy.org › doc › scipy — scipy.stats.entropy: Calculate the entropy of a distribution for given probability values. If only probabilities pk are given, the entropy is calculated as S = -sum(pk * log(pk), axis=axis). If qk is not None, then compute the Kullback-Leibler divergence S = sum(pk * log(pk / qk), axis=axis). This routine will normalize pk and qk if they don't sum to 1.
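A minimal usage sketch (illustrative): with one argument entropy returns the Shannon entropy in nats (use base to change the logarithm); with two arguments it returns the KL divergence.

    import numpy as np
    from scipy.stats import entropy

    pk = np.array([0.5, 0.25, 0.25])
    qk = np.array([1/3, 1/3, 1/3])

    print(entropy(pk))           # Shannon entropy, -sum(pk * log(pk)), in nats
    print(entropy(pk, base=2))   # same quantity in bits
    print(entropy(pk, qk))       # Kullback-Leibler divergence D(pk || qk)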