
🧵 Untitled Thread

Anonymous No. 16452449

What is the fastest way to check that I've orthonormalized a basis? I usually have to do O(n^2) dot products of size n to verify that the off-diagonal products are all 0 (and each vector's dot product with itself is 1). Is there a faster way, and what kind of math do I have to study to prove that there is or isn't?
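Not claiming this is OP's exact code, but the check being described looks roughly like this in numpy (the tolerance and the rows-as-vectors convention are my assumptions):

```python
import numpy as np

def check_orthonormal(vectors, tol=1e-10):
    # O(n^2) pairwise dot products, each costing O(n) multiplies
    n = len(vectors)
    for i in range(n):
        for j in range(i, n):
            d = np.dot(vectors[i], vectors[j])
            expected = 1.0 if i == j else 0.0  # 1 on the diagonal, 0 off it
            if abs(d - expected) > tol:
                return False
    return True
```

Note that even though there are O(n^2) dot products, each one is O(n), so the check is O(n^3) flops total.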

Anonymous No. 16452464

>>16452449
do it well so you don’t have to check


Anonymous No. 16452473

>>16452464
>Do it well
I want to do it fast. My method of checking being the fastest would imply that Gram-Schmidt is the fastest way to do it well, and that's O(n^3), not to mention all the fast inverse square roots I have to do.
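For reference, a minimal sketch of the classical Gram-Schmidt being discussed (assuming real vectors stored as rows; the function name is mine):

```python
import numpy as np

def gram_schmidt(V):
    # classical Gram-Schmidt: orthonormalize the rows of V
    # n vectors, each projected against up to n others of size n -> O(n^3)
    Q = []
    for v in V:
        w = np.asarray(v, dtype=float).copy()
        for q in Q:
            w -= np.dot(q, v) * q        # subtract projection onto earlier vectors
        w /= np.linalg.norm(w)           # one (inverse) square root per vector
        Q.append(w)
    return np.array(Q)
```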

Anonymous No. 16452481

>>16452473
https://en.wikipedia.org/wiki/Gram%E2%80%93Schmidt_process
just check wikipedia. It has all the references. Besides, don’t bother with all this crap when numerical linear algebra libraries like Eigen already exist. I can guarantee you that they have implementations that run faster than yours.

Anonymous No. 16452493

>>16452481
My pursuit is theoretical. I don't care about libraries. I just want to be sure there's no faster way.

Anonymous No. 16452539

>>16452449
>Is there a faster way, and what kind of math do I have to study to prove that there is or isn't?
No, there is no faster way. The kind of math is differential geometry. Consider the following. Orthonormality properties of a coordinate system are entirely contained in its metric tensor [math]g_{ij}=v_i\cdot v_j[/math]. If the metric tensor is the identity matrix, then you have an orthonormal basis. Since this is a rank-2 tensor, it requires O(N^2) operations to construct.
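In numpy terms (a sketch, assuming the basis vectors are the rows of V), the metric tensor is just the Gram matrix, and the orthonormality test is comparing it against the identity:

```python
import numpy as np

def metric_tensor(V):
    # g_ij = v_i . v_j : the Gram matrix of the basis rows
    return V @ V.T

def is_orthonormal_basis(V, tol=1e-10):
    # orthonormal iff the metric tensor is the identity
    return np.allclose(metric_tensor(V), np.eye(len(V)), atol=tol)
```

Forming V @ V.T is exactly the O(N^2) dot products from the post: N^2 entries, one dot product each.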

Anonymous No. 16452550

>>16452493
>I just want to be sure there's no faster way
people still don't know the lower bound of the complexity of matrix multiplication
also, it's important to know how big the basis is, as that may influence which algorithm is best

>>16452473
this is the classical, numerically unstable formulation of GS; not something you want to run on a computer except for the smallest of bases
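The usual fix for that instability, for what it's worth, is modified Gram-Schmidt: project each new vector against its own running remainder instead of the original vector. A sketch (naming is mine):

```python
import numpy as np

def modified_gram_schmidt(V):
    # MGS: same O(n^3) cost as classical GS, but each projection uses the
    # *updated* vector w, which keeps rounding error from compounding
    Q = []
    for v in V:
        w = np.asarray(v, dtype=float).copy()
        for q in Q:
            w -= np.dot(q, w) * q   # note: w here, not the original v
        w /= np.linalg.norm(w)
        Q.append(w)
    return np.array(Q)
```

The only difference from the classical version is `np.dot(q, w)` versus `np.dot(q, v)`, but the loss of orthogonality in floating point is dramatically smaller for ill-conditioned bases.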

Anonymous No. 16452554

>>16452539
Ok, I'll look into it. I was honestly expecting something related to automata.

Anonymous No. 16452555

>>16452554
If you aren't convinced by my post, I can justify why the metric tensor must be of rank 2. The inner product of two arbitrary vectors is given by [math](a,b) = \sum_{i=1}^N\sum_{j=1}^N g_{ij}a_ib_j[/math]. The inner product is invariant under coordinate transformations. If the metric were given by a rank-1 tensor instead, the resulting expression for the inner product would be a vector, which isn't invariant. Contradiction.
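That inner-product formula is a one-liner to sketch in numpy (function name mine); with the identity metric it collapses to the ordinary dot product:

```python
import numpy as np

def inner(a, b, g):
    # (a, b) = sum_ij g_ij a_i b_j = a^T g b
    return a @ g @ b
```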

Anonymous No. 16452562

>>16452539
no one determines if they have an orthonormal basis this way on a computer

Anonymous No. 16452563

>>16452562
I don't care and neither does OP (>>16452493). This is a mathematical proof. Fuck off to >>>/g/ if you don't like it.

Anonymous No. 16452565

>>16452555
So I have to take O(n^2) dot products, and there's no transformation that will do any better. Got it. But if I'm working in a finite field, I bet there's a fast way.

Anonymous No. 16452582

>>16452565
Oh, very interesting. I believe vector spaces over finite fields aren't metric spaces, no? I'm not even sure one can come up with a well-defined notion of an inner product on such spaces. Such definitions fail even for the rationals, because the space isn't complete in the topological sense. So it seems that this problem is ill-posed, because there can be no notions of orthogonality and normality in such vector spaces.
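To make the finite-field problem concrete: you can write down the obvious "dot product" mod p as a symmetric bilinear form, but it isn't positive definite, so normality breaks down. A quick illustration over GF(2) (my example, not from the thread):

```python
def dot_gf2(a, b):
    # "dot product" mod 2: a symmetric bilinear form, but NOT an inner
    # product -- there is no ordering on GF(2), so no positive definiteness
    return sum(x * y for x, y in zip(a, b)) % 2

v = (1, 1)  # nonzero vector over GF(2)
# v is orthogonal to itself (an isotropic vector), so "unit length"
# can't be defined the way it is over the reals
```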

Anonymous No. 16453609

>>16452550
>people still don't know the lower bound of the complexity of matrix multiplication
We practically do, it's O(irrelevant) because the algorithms are galactic anyway.