Author Topic: TT's Maths Thread


TrueTears

Re: TT's Maths Thread
« Reply #1125 on: November 26, 2010, 12:03:31 am »


Anyway, so during my maths reading today I came across this; I just have a few questions:

Firstly, I don't get how (1) is equivalent to (2). Isn't it horizontally translated...? And furthermore, how is (3) equivalent to (2)...?

Thanks guys.
PhD @ MIT (Economics).

Interested in asset pricing, econometrics, and social choice theory.

/0

Re: TT's Maths Thread
« Reply #1126 on: November 26, 2010, 12:13:03 am »
By equivalent they mean they are all cubic polynomials with constant coefficients. If you expand (3) out you will get something similar to the first equation, and if you 'absorb' the constants together it will look just like the first equation, i.e. each cluster of constants in the expansion gets renamed as a single coefficient of the first equation.
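For concreteness (assuming (1) is the general cubic y = ax^3 + bx^2 + cx + d and (3) is a translated form like y = a(x - h)^3 + k), the expansion goes

y = a(x - h)^3 + k = ax^3 - 3ahx^2 + 3ah^2x + (k - ah^3),

so relabelling b = -3ah, c = 3ah^2 and d = k - ah^3 gives back exactly the form of (1): the translation gets absorbed into the coefficients.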

TrueTears

Re: TT's Maths Thread
« Reply #1127 on: November 26, 2010, 12:17:59 am »
oh so THAT'S what they mean, sigh, thanks man lol

onur369

Re: TT's Maths Thread
« Reply #1128 on: November 26, 2010, 10:44:29 am »
Can anyone find me the eBook for Essential Further Mathematics? Please PM me if you can.
2011:
Aims-
English 35, Further 45+, Methods 35, Physics 32, Turkish 33, Legal 28.

TrueTears

Re: TT's Maths Thread
« Reply #1129 on: November 29, 2010, 02:33:41 am »
So I was just reading and came across this in my book. It isn't a question from the book, just something that crossed my mind: are these the ONLY subspaces of R^2 and R^3? If so, how do I prove uniqueness? I know how to prove they are subspaces, but how can I show they are the ONLY ones (if what I hypothesised is true)? If not, do there exist infinitely many subspaces for every R^n, or are the subspaces finite for all R^n? Haha, all of these are just random questions off the top of my head. Maybe I'll find the answers further in the book, but if anyone would like to discuss them with me that'd be good.

[IMG]http://img80.imageshack.us/img80/7439/asdfasdfi.jpg[/img]


kamil9876

Re: TT's Maths Thread
« Reply #1130 on: November 29, 2010, 01:44:21 pm »
They are indeed the only "kinds" of subspaces. It is easily shown if you know what a basis is.

If you don't know what a basis is yet, you can prove it like this.

Suppose your subspace contains some non-zero vector v. Then by scalar multiplication it must contain the whole line through 0 and v. If your subspace also contains some vector w not on this line, show that it must contain the plane containing the line through 0 and v and the line through 0 and w. And if your subspace contains yet another vector z, not in that plane, show that it must be all of R^3.
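If you want to see the classification concretely, here is a rough numpy sketch (my own, not from any book): the rank of a matrix whose rows span the subspace is 0, 1, 2 or 3, matching the four kinds of subspaces (origin, line, plane, all of R^3).

[code]
import numpy as np

def classify_span(vectors):
    """Classify span(vectors) in R^3 via the rank of the matrix whose rows are the vectors."""
    if len(vectors) == 0:
        return "the zero subspace {0}"
    rank = np.linalg.matrix_rank(np.array(vectors, dtype=float))
    return ["the zero subspace {0}", "a line through the origin",
            "a plane through the origin", "all of R^3"][rank]

v = [1.0, 2.0, 3.0]
w = [2.0, 4.0, 6.0]   # a scalar multiple of v, so on the same line
z = [0.0, 1.0, 0.0]   # not on that line

print(classify_span([v]))                  # a line through the origin
print(classify_span([v, w]))               # still a line
print(classify_span([v, z]))               # a plane through the origin
print(classify_span([v, z, [0, 0, 1]]))    # all of R^3
[/code]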
Voltaire: "There is an astonishing imagination even in the science of mathematics ... We repeat, there is far more imagination in the head of Archimedes than in that of Homer."

TrueTears

Re: TT's Maths Thread
« Reply #1131 on: November 29, 2010, 03:17:47 pm »

Ahh yeah, I haven't read about bases yet in my book, but I kinda know what they are from watching MIT lectures. But yeah, your explanation makes sense.

TrueTears

Re: TT's Maths Thread
« Reply #1132 on: December 08, 2010, 11:42:05 pm »
Just with this question: I understand how they get the maximum value for the rank of A, but what's up with the linear dependence statements?

[IMG]http://img72.imageshack.us/img72/1843/linearalgebrarank.jpg[/img]

E.g., why must the 7 row vectors be linearly dependent if A is a 7x4 matrix? Can anyone show the argument? Maybe I am missing something obvious, but I don't see what the rank of a matrix has to do with linear independence/dependence.

Thanks

kamil9876

Re: TT's Maths Thread
« Reply #1133 on: December 09, 2010, 01:02:09 am »
Are you aware of "column rank = row rank"? What is your definition of rank, btw? Usually the definition of (column) rank is the size of a maximal linearly independent set of columns, and similarly for row rank. (Interesting theorem: column rank = row rank.)
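You can also sanity-check the theorem numerically; a quick numpy sketch (any matrix works, this one has a dependent row):

[code]
import numpy as np

A = np.array([[1, 2, 3, 4],
              [2, 4, 6, 8],    # = 2 * row 1, so dependent
              [0, 1, 0, 1]])

print(np.linalg.matrix_rank(A))    # 2
print(np.linalg.matrix_rank(A.T))  # 2 as well: column rank = row rank
[/code]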

TrueTears

Re: TT's Maths Thread
« Reply #1134 on: December 09, 2010, 11:53:33 am »
Yeah, I know that column rank = row rank. The definition of rank I use is that it is the common dimension of the row space and the column space.

kamil9876

Re: TT's Maths Thread
« Reply #1135 on: December 09, 2010, 12:52:17 pm »
Oh ok yeah that's the same as the definition I was referring to. So for the 7 by 4 matrix we look at the columns and see that:

dim(column space) ≤ 4

hence dim(row space) = dim(column space) ≤ 4.

So the 7 rows are living inside a space that has no more than 4 dimensions. Therefore they are linearly dependent.


That is their argument, but it's pretty stupid, because you can easily see that the 7 rows are dependent just by looking at the fact that they live in R^4, which is 4-dimensional. But I guess it serves the purpose of illustrating some things about the rank of a matrix.
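To make the 7x4 case concrete, a small numpy sketch (nothing assumed about the book's matrix, this one is random):

[code]
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((7, 4))   # a generic 7x4 matrix

rank = np.linalg.matrix_rank(A)
print(rank)       # at most 4 (a random matrix hits 4)
print(rank < 7)   # True: the 7 rows can never be linearly independent
[/code]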

TrueTears

Re: TT's Maths Thread
« Reply #1136 on: December 18, 2010, 12:59:27 am »
Okay, so I just wanna confirm the Gram-Schmidt process, as my book has like 2 pages missing from it and only has the intro to the process. Just wanna confirm if this is correct:

Let V be any nonzero finite-dimensional inner product space, and suppose that {u_1, u_2, ..., u_n} is any basis for V. It suffices to show that V has an orthogonal basis. Let {v_1, v_2, ..., v_n} be the orthogonal basis for V.

Step 1: Let v_1 = u_1.

Step 2: We can find a vector orthogonal to v_1 by computing the component of u_2 that is orthogonal to the space W_1 spanned by v_1. This is given by

v_2 = u_2 - (⟨u_2, v_1⟩ / ||v_1||^2) v_1

Step 3: We can find a vector orthogonal to both v_1 and v_2 by computing the component of u_3 that is orthogonal to the space W_2 spanned by v_1 and v_2. This is given by

v_3 = u_3 - (⟨u_3, v_1⟩ / ||v_1||^2) v_1 - (⟨u_3, v_2⟩ / ||v_2||^2) v_2

We keep going until we have found v_n.

Now is my step 3 correct? It's missing in my book lol
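For what it's worth, here is a minimal numpy sketch of the whole process (my own, assuming the standard dot product on R^n as the inner product):

[code]
import numpy as np

def gram_schmidt(U):
    """Orthogonalise the rows of U (assumed to be a basis) one at a time:
    v_k = u_k - sum over i < k of (<u_k, v_i> / ||v_i||^2) * v_i
    """
    V = []
    for u in U.astype(float):
        v = u.copy()
        for w in V:
            v -= (u @ w) / (w @ w) * w   # remove the component of u along w
        V.append(v)
    return np.array(V)

U = np.array([[1.0, 1.0, 0.0],
              [1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])
V = gram_schmidt(U)
print(np.round(V @ V.T, 10))   # off-diagonal zeros: the rows are orthogonal
[/code]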


kamil9876

Re: TT's Maths Thread
« Reply #1137 on: December 18, 2010, 01:36:36 am »
yes.

TrueTears

Re: TT's Maths Thread
« Reply #1138 on: December 19, 2010, 02:02:04 am »
thx kamil

Also, another query: my book says that the normal system A^T A x = A^T b associated with Ax = b may have infinitely many solutions, in which case all of its solutions are least squares solutions of Ax = b.

But then my book goes on to prove a theorem that states:

If A is an m x n matrix with linearly independent column vectors, then for every m x 1 matrix b the linear system Ax = b has a unique least squares solution, given by x = (A^T A)^{-1} A^T b.

Doesn't this theorem mean that every normal system will only ever have ONE unique least squares solution? Then why did my book say that the normal system could have infinitely many solutions...? Am I missing something obvious, coz it's quite late lol

kamil9876

Re: TT's Maths Thread
« Reply #1139 on: December 19, 2010, 02:01:00 pm »
I forgot what least squares are... haven't seen them since first year and it was quite boring. However, the system may have infinitely many solutions if the column vectors are linearly dependent. Your theorem applies only to matrices with linearly independent columns.
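A quick numpy sketch of the difference (with made-up matrices): independent columns give one least squares solution, while a dependent column makes A^T A singular, so the normal system has infinitely many solutions.

[code]
import numpy as np

b = np.array([1.0, 2.0, 3.0])

# Independent columns: A^T A is invertible, so the least squares solution is unique.
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
x = np.linalg.solve(A.T @ A, A.T @ b)
print(x)   # the unique solution x = (A^T A)^{-1} A^T b

# Dependent columns (column 2 = 2 * column 1): A^T A is singular,
# so the normal system A^T A x = A^T b has infinitely many solutions.
B = np.array([[1.0, 2.0],
              [1.0, 2.0],
              [1.0, 2.0]])
print(np.linalg.matrix_rank(B.T @ B))   # 1 < 2: no unique solution
[/code]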