🧵 Untitled Thread
Anonymous at Sun, 27 Oct 2024 17:39:15 UTC No. 16452449
What is the fastest way to check that I've Orthonormalized a basis? I usually have to do O(n^2) dot products of size n to verify that they are all 0. Is there a faster way, and what kind of math do I have to study to prove that there is or isn't?
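[A minimal sketch of the check OP describes, in pure Python; the function name and tolerance are my own choices, and symmetry is used so only the upper triangle is tested:]

```python
def is_orthonormal(basis, tol=1e-10):
    """Pairwise dot-product check: 1 on the diagonal, 0 off it.

    For n vectors of size n this is n*(n+1)/2 dot products,
    i.e. O(n^2) dots and O(n^3) scalar operations total.
    """
    n = len(basis)
    for i in range(n):
        for j in range(i, n):  # symmetry: upper triangle only
            dot = sum(a * b for a, b in zip(basis[i], basis[j]))
            target = 1.0 if i == j else 0.0
            if abs(dot - target) > tol:
                return False
    return True
```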
Anonymous at Sun, 27 Oct 2024 17:44:57 UTC No. 16452464
>>16452449
do it well so you don’t have to check
Anonymous at Sun, 27 Oct 2024 17:48:55 UTC No. 16452473
>>16452464
>Do it well
I want to do it fast. If my way of checking were already the fastest, that would imply Gram-Schmidt is the fastest way to orthonormalize in the first place. But Gram-Schmidt is O(n^3), not to mention all the fast inverse square roots I have to do for the normalizations.
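[For concreteness, a sketch of the classical Gram-Schmidt procedure being discussed; pure Python, my own naming, with a small cutoff of my choosing to drop numerically dependent inputs:]

```python
def gram_schmidt(vectors):
    """Classical Gram-Schmidt: O(n^3) scalar ops for n vectors of size n,
    plus one reciprocal square root per vector for the normalization."""
    basis = []
    for v in vectors:
        w = list(v)
        for q in basis:
            # classical variant: project the ORIGINAL v onto q
            proj = sum(a * b for a, b in zip(v, q))
            w = [wi - proj * qi for wi, qi in zip(w, q)]
        norm = sum(wi * wi for wi in w) ** 0.5
        if norm > 1e-12:  # drop (numerically) dependent inputs
            basis.append([wi / norm for wi in w])
    return basis
```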
Anonymous at Sun, 27 Oct 2024 17:50:59 UTC No. 16452481
>>16452473
https://en.wikipedia.org/wiki/Gram%
just check wikipedia. It has all the references. Besides, don’t bother with all this crap when numerical linear algebra libraries like Eigen already exist. I can guarantee you that they have implementations that run faster than yours.
Anonymous at Sun, 27 Oct 2024 17:55:38 UTC No. 16452493
>>16452481
My pursuit is theoretical. I don't care about libraries. I just want to be sure there's no faster way.
Anonymous at Sun, 27 Oct 2024 18:29:29 UTC No. 16452539
>>16452449
>Is there a faster way, and what kind of math do I have to study to prove that there is or isn't?
No, there is no faster way. The kind of math is differential geometry. Consider the following. Orthonormality properties of a coordinate system are entirely contained in its metric tensor [math]g_{ij}=v_i\cdot v_j[/math]. If the metric tensor is the identity matrix, then you have an orthonormal basis. Since this is a rank-2 tensor, it requires O(N^2) operations to construct.
Anonymous at Sun, 27 Oct 2024 18:36:53 UTC No. 16452550
>>16452493
>I just want to be sure there's no faster way
nigga, people still don't know the lower bounds on the complexity of matrix multiplication
also, it's important to know how big the basis is, as that may influence which algorithm is best
>>16452473
this is the classical, numerically unstable form of GS; you don't want to run it in floating point except on the smallest of bases
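[For reference, the stabler variant this post alludes to (modified Gram-Schmidt) differs only in projecting the running remainder instead of the original vector; same O(n^3) cost. A sketch with my own naming and cutoff:]

```python
def modified_gram_schmidt(vectors):
    """Modified Gram-Schmidt: same O(n^3) cost as the classical form, but
    each projection uses the partially orthogonalized remainder w, which
    damps the growth of rounding errors."""
    basis = []
    for v in vectors:
        w = list(v)
        for q in basis:
            # modified variant: project the RUNNING remainder w, not v
            proj = sum(a * b for a, b in zip(w, q))
            w = [wi - proj * qi for wi, qi in zip(w, q)]
        norm = sum(wi * wi for wi in w) ** 0.5
        if norm > 1e-12:  # drop (numerically) dependent inputs
            basis.append([wi / norm for wi in w])
    return basis
```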
Anonymous at Sun, 27 Oct 2024 18:38:45 UTC No. 16452554
>>16452539
Ok, I'll look into it. I was honestly expecting something related to automata.
Anonymous at Sun, 27 Oct 2024 18:40:23 UTC No. 16452555
>>16452554
If you aren't convinced by my post, I can justify why the metric tensor must be of rank 2. The inner product of two arbitrary vectors is given by [math](a,b) = \sum_{i=1}^N\sum_{j=1}^N g_{ij}a_ib_j[/math]. The inner product is invariant under coordinate transformations. If the metric were given by a rank-1 tensor instead, the resulting expression for the inner product would be a vector, which isn't invariant. Contradiction.
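[A quick numeric sanity check on the formula in this post, with my own toy numbers: computing [math](a,b)=\sum_{ij}g_{ij}a_ib_j[/math] from the metric tensor of a basis reproduces the Euclidean inner product, and with g = identity it reduces to the plain dot product:]

```python
def gram_matrix(basis):
    """Metric tensor of a basis: g[i][j] = v_i . v_j."""
    return [[sum(x * y for x, y in zip(vi, vj)) for vj in basis]
            for vi in basis]

def inner(g, a, b):
    """(a, b) = sum_ij g_ij a_i b_j, with a, b given in components
    relative to the basis that produced g."""
    n = len(g)
    return sum(g[i][j] * a[i] * b[j] for i in range(n) for j in range(n))
```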
Anonymous at Sun, 27 Oct 2024 18:45:40 UTC No. 16452562
>>16452539
no one determines if they have an orthonormal basis this way on a computer
Anonymous at Sun, 27 Oct 2024 18:46:33 UTC No. 16452563
>>16452562
I don't care and neither does OP (>>16452493). This is a mathematical proof. Fuck off to >>>/g/ if you don't like it.
Anonymous at Sun, 27 Oct 2024 18:52:38 UTC No. 16452565
>>16452555
So I have to take O(n^2) dot products, and there's no transformation that will do any better. Got it. But if I'm working in a finite field, I bet there's a fast way.
Anonymous at Sun, 27 Oct 2024 19:11:20 UTC No. 16452582
>>16452565
Oh, very interesting. I don't think you get a genuine inner product over a finite field: the field isn't ordered, so positive-definiteness doesn't even make sense there. The best you can do is a symmetric bilinear form like the dot product mod p, and then nonzero vectors can be orthogonal to themselves, so "normalize" loses its usual meaning. So the problem as stated is ill-posed, because orthogonality and normality don't behave the way they do over the reals.
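[One concrete wrinkle, with my own toy example over GF(5): the dot product mod p is a perfectly good symmetric bilinear form, but a nonzero vector can be "orthogonal" to itself, so the usual notion of unit length breaks down:]

```python
def dot_mod(u, v, p):
    """Dot product of integer vectors over the finite field GF(p), p prime."""
    return sum(a * b for a, b in zip(u, v)) % p
```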
Anonymous at Mon, 28 Oct 2024 15:21:42 UTC No. 16453609
>>16452550
>nigga, people still don't know the low bounds of the complexity of matrix multiplication
We practically do: it's O(irrelevant), because the fast algorithms are galactic anyway.