In the Wikipedia article, the predecessor $f$ on the final coalgebra is not defined at $0$; it is only defined at $n+1$ and $\infty$. In the $n$Lab article, $\operatorname{pred}(0) = *$, but $*$ is never defined. What is $*$? What is $\operatorname{pred
Let $\mathsf{C}$ be a symmetric monoidal category. An object $X \in \mathsf{C}$ has two operads "naturally" associated to it (this is not the mathematical term; the two constructions aren't functorial in general): the operad of endomorphisms $\m
Let $f(x) \in \mathbb Z[x]$. If $f(x)$ is reducible over $\mathbb Q$, then it is reducible over $\mathbb Z$. I went through the proof in the book I'm reading, which starts as follows: We're given that $f(x)$ is reducible over $\mathbb Q$, so we can write $f(x)$ as: $f(x) = g(x)h
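For context, the step such proofs are usually heading toward is completed with Gauss's lemma; a sketch (this may not match the book's exact wording):

```latex
f(x) = g(x)h(x),\ g,h \in \mathbb Q[x]
\;\Longrightarrow\;
f(x) = c\, g_0(x)\, h_0(x),\quad g_0, h_0 \in \mathbb Z[x] \text{ primitive},\ c \in \mathbb Q,
```

and since a product of primitive polynomials is primitive (Gauss's lemma), comparing contents forces $c \in \mathbb Z$, so $f$ factors over $\mathbb Z$ with factors of the same degrees.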
How do I prove that $$\lim_{(m,n) \to \infty} a_m^{b_n} = a^b$$ where $a,b \in \mathbb R$, $a_m, b_n \in \mathbb Q$, $a_m \to a$, $b_n \to b$, and $a$ and $b$ are not both zero? I can prove it with $n$ or $m$ held constant, but not when both are limiting at t
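One standard route, assuming $a > 0$ (so that $a_m > 0$ for large $m$ and $a^b$ is unambiguous), is to pass through the exponential:

```latex
a_m^{b_n} \;=\; e^{\,b_n \ln a_m}
\;\longrightarrow\; e^{\,b \ln a} \;=\; a^b \quad\text{as } (m,n)\to\infty,
```

since $\ln$ is continuous at $a$, the product $b_n \ln a_m$ is a genuine joint limit (each factor converges on its own index), and $\exp$ is continuous.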
I'm building a package for B-spline interpolation in Julia, and I've come across a boundary condition that I want to implement but can't wrap my head around how to do it (mathematically). Basically, it boils down to a more general problem, which I sup
Let's take a look at this proof. It is claimed: "It follows that both the $n$-th columns of $B$ and $C$ must contain leading $1$'s, for otherwise those columns would be free columns and we could arbitrarily choose the value of $x_n$." I'm not convinced by t
Suppose I'm given a sigmoid function $s=\frac{1}{1+e^{-t}}$ over $[a,b]$, and I want to approximate it with a linear equation $y=mx+b$. I want to minimize the error between my linear model and the sigmoid function, i.e., minimize the $L_\infty$ norm of $|
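A quick numerical sketch of the minimization (hypothetical function names; a coarse grid search around the secant line rather than a true Chebyshev/equioscillation solve, and the intercept is written `c` to avoid clashing with the endpoint $b$):

```python
import math

def sigmoid(t):
    return 1.0 / (1.0 + math.exp(-t))

def minimax_line(a, b, n_grid=40, n_samples=101):
    """Coarse grid search for a line y = m*t + c minimizing the maximum
    absolute error against the sigmoid on [a, b]."""
    ts = [a + (b - a) * i / (n_samples - 1) for i in range(n_samples)]
    sig = [sigmoid(t) for t in ts]
    # start from the secant line through the endpoints and search nearby
    m0 = (sigmoid(b) - sigmoid(a)) / (b - a)
    c0 = sigmoid(a) - m0 * a
    best_err, best_m, best_c = float("inf"), m0, c0
    for i in range(n_grid + 1):
        m = m0 * (0.5 + i / n_grid)            # slopes in [0.5*m0, 1.5*m0]
        for j in range(n_grid + 1):
            c = c0 + 0.5 * (j / n_grid - 0.5)  # intercepts in c0 +/- 0.25
            err = max(abs(s - (m * t + c)) for t, s in zip(ts, sig))
            if err < best_err:
                best_err, best_m, best_c = err, m, c
    return best_m, best_c, best_err

m, c, err = minimax_line(-2.0, 2.0)
```

On a symmetric interval the optimal line passes near $(0, \tfrac12)$ by the sigmoid's odd symmetry about that point, which is why the secant is a reasonable starting guess.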
This post has also been posted here. Please see the comment on the linked page; it contains useful information! Let $\Gamma\subset \mathbb R^N$ be $\mathcal H^{N-1}$-rectifiable. Then we know that $\mathcal H^{N-1}$-a.e. $x\in \Gamma$ has density $1$. In particul
I've been stuck on this one for a while now. It's problem 2.4 from Falconer's "The geometry of fractals". Given an $\mathcal{H}^{s}$-measurable subset $E\subset \mathbb{R}^n$ with $0<\mathcal{H}^{s}(E)<\infty$, we let $\overline{D}^{s}(E,x)$
I've been told that on an $n$-dimensional Riemannian manifold, the Hausdorff measure of dimension $n-1$ and the codimension-$1$ measure $v_{-1}$ (defined below) are mutually absolutely continuous. I've tried to prove it, but it is not obvious to me. I thi
I am working on finding the determinant of the following block matrix $$ \begin{pmatrix} C & D \\ D^* & C \\ \end{pmatrix}, $$ where $C$ and $D$ are $4\times 4$ matrices with complex entries. I've found a theorem that states $$ \det\begin{pmatrix} A & B \\
Let $(E,\pi,B)$ be a principal bundle with structure group $G$. The connection $1$-form can be thought of as a projection onto the vertical part. It then allows us to characterize the horizontal subspaces as $H_p E = \ker \omega_p$. Apart from that, the
What is a good geometric way of thinking of complex tangent vectors on a manifold? I can convince myself that I understand tangent vectors by thinking of them as paths on the manifold. Is there a nice way to visualize or think of complex vectors on a
What is the geometric interpretation of the Ricci and holomorphic bisectional curvatures in two-dimensional space, like an open ball in the real plane? Any intuitive idea or source would be helpful.
I've stumbled upon a problem that basically reduces to having two random variables $$X \sim N(\mu_X,\sigma_X)$$ $$Y \sim N(\mu_Y,\sigma_Y)$$ and defining the third as $$Z = \sqrt{X^2 + Y^2}$$ Although it would be convenient to have the exact expressi
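There is generally no simple closed form (for $\mu_X = \mu_Y = 0$ and equal variances, $Z$ is Rayleigh; other cases lead to Rice-type distributions), but a Monte Carlo estimate is easy to sanity-check against. Here `simulate_Z` is a hypothetical helper; the $\sigma$ arguments are taken as standard deviations, and $X, Y$ are assumed independent:

```python
import math
import random

random.seed(0)

def simulate_Z(mu_x, sd_x, mu_y, sd_y, n=200_000):
    """Monte Carlo estimate of the mean and std of Z = sqrt(X^2 + Y^2)
    for independent normals X and Y (independence is an assumption here)."""
    zs = [math.hypot(random.gauss(mu_x, sd_x), random.gauss(mu_y, sd_y))
          for _ in range(n)]
    mean = sum(zs) / n
    var = sum((z - mean) ** 2 for z in zs) / (n - 1)
    return mean, math.sqrt(var)

mean_z, sd_z = simulate_Z(0.0, 1.0, 0.0, 1.0)
# With zero means and unit variances, Z is Rayleigh(1):
# mean = sqrt(pi/2), variance = (4 - pi)/2
```

This gives a quick check on any approximate formula for the moments of $Z$ before trusting it in the general (nonzero-mean, unequal-variance) case.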
Let $X$ and $Y$ be two random variables with means $\mu_X$ and $\mu_Y$ respectively, as well as variances $\sigma_X^2$ and $\sigma_Y^2$ (all of which exist). I am interested in computing the following variance: $$\operatorname{Var}[\operatorname{sgn}(X-Y)]$$ where, of course, $\operatorname{sgn}$ de
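Assuming $P(X = Y) = 0$ (true whenever $X - Y$ has a continuous distribution), the variance reduces to a single probability: with $p = P(X > Y)$, the variable $\operatorname{sgn}(X-Y)$ takes values in $\{-1, 1\}$ almost surely, so

```latex
\operatorname{E}[\operatorname{sgn}(X-Y)] = 2p - 1,
\qquad
\operatorname{E}[\operatorname{sgn}(X-Y)^2] = 1,
\qquad\text{hence}\qquad
\operatorname{Var}[\operatorname{sgn}(X-Y)] = 1 - (2p-1)^2 = 4p(1-p).
```

The means and variances of $X$ and $Y$ enter only through $p$, which in general requires the joint distribution of $(X, Y)$, not just the marginal moments.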
I have been requested to solve this problem: compute the distance between the lines $L_{1}:\frac{x-2}{3}=\frac{y-5}{2}=\frac{z-1}{-1}$ and $L_{2}:\frac{x-4}{-4}=\frac{y-5}{4}=\frac{z+2}{1}$. This is my solution: I specify one point for each line: $P_{L_{1
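As a sanity check on the computation, the standard skew-line formula $d = \frac{|(P_2 - P_1)\cdot(\vec d_1 \times \vec d_2)|}{\lVert \vec d_1 \times \vec d_2 \rVert}$ can be evaluated numerically (function names hypothetical; points and direction vectors read off the symmetric equations):

```python
import math

def cross(u, v):
    return (u[1]*v[2] - u[2]*v[1],
            u[2]*v[0] - u[0]*v[2],
            u[0]*v[1] - u[1]*v[0])

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def skew_line_distance(p1, d1, p2, d2):
    """Distance between the skew lines p1 + t*d1 and p2 + s*d2:
    |(p2 - p1) . (d1 x d2)| / |d1 x d2|.  Requires d1 x d2 != 0."""
    n = cross(d1, d2)
    w = tuple(b - a for a, b in zip(p1, p2))
    return abs(dot(w, n)) / math.sqrt(dot(n, n))

# L1 passes through (2, 5, 1) with direction (3, 2, -1);
# L2 passes through (4, 5, -2) with direction (-4, 4, 1)
d = skew_line_distance((2, 5, 1), (3, 2, -1), (4, 5, -2), (-4, 4, 1))
```

Here $\vec d_1 \times \vec d_2 = (6, 1, 20) \neq \vec 0$, so the lines are not parallel and the formula applies, giving $d = \frac{48}{\sqrt{437}}$.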
Let $R$ be a commutative ring with identity. Consider the polynomial ring $R[x]$. Suppose $f \in R[x]$ and $a \in R$ are such that $f(a) = 0$. Is it true that $f(x) = (x - a)g(x)$ for some $g \in R[x]$? If $f(x)=\sum_{k=0}^n a_k x^k$, we can write $$f
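For reference, the telescoping identity that finishes this argument (valid over any commutative ring; no division is needed):

```latex
f(x) = f(x) - f(a) = \sum_{k=0}^n a_k\,(x^k - a^k),
\qquad
x^k - a^k = (x - a)\sum_{i=0}^{k-1} x^{i}\, a^{k-1-i},
```

so $g(x) = \sum_{k=1}^{n} a_k \sum_{i=0}^{k-1} x^{i} a^{k-1-i} \in R[x]$ and indeed $f(x) = (x-a)g(x)$.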