
Author Topic: Abstract Algebra  (Read 5638 times)


/0

  • Victorian
  • ATAR Notes Legend
  • *******
  • Posts: 4124
  • Respect: +45
Re: Abstract Algebra
« Reply #15 on: July 15, 2010, 08:14:19 pm »
Another way you could prove that is to write A = I + B and then use the binomial theorem to expand that.


err could you explain how that works? I can't see how that explains it

zzdfa

  • Victorian
  • Forum Obsessive
  • ***
  • Posts: 328
  • Respect: +4
Re: Abstract Algebra
« Reply #16 on: July 15, 2010, 10:43:21 pm »
Well, after typing it up I see that it's not as quick as I thought it was; I thought the last step (*) would be more straightforward. I'll post it anyway since the technique is occasionally useful.

We want to show that A^n is never equal to the identity, which is the same as showing that (I+B)^n is never the identity, where B = A - I.

By the binomial theorem, (I+B)^n = I + nB + (n choose 2)B^2 + ... + B^n, so that if (I+B)^n = I then

nB + (n choose 2)B^2 + ... + B^n = 0

(*) and so the problem reduces to showing that this expression is never 0. I can't think of any quick way to do this for this particular B (apart from induction, which would make it too similar to your proof). Oh well, this would work with a more convenient B (say the matrix with a single 1 in the top left).
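For the record, the 'more convenient B' case is quick to check (a sketch in Python, not from the thread): with B the matrix with a single 1 in the top left, B^k = B for every k >= 1, so the binomial sum collapses to (I+B)^n = I + (2^n - 1)B, which is visibly never the identity.

```python
def matmul(X, Y):
    # 2x2 matrix product
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def matpow(X, n):
    R = [[1, 0], [0, 1]]  # start from the identity
    for _ in range(n):
        R = matmul(R, X)
    return R

I = [[1, 0], [0, 1]]
B = [[1, 0], [0, 0]]  # single 1 in the top left, so B^k = B for all k >= 1
A = [[1 + B[i][j] * (1 if i == j == 0 else 1) if False else I[i][j] + B[i][j]
      for j in range(2)] for i in range(2)]  # A = I + B

for n in range(1, 10):
    # binomial theorem with B^k = B: (I+B)^n = I + (2^n - 1) B
    assert matpow(A, n) == [[2**n, 0], [0, 1]]
    assert matpow(A, n) != I
```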






/0

  • Victorian
  • ATAR Notes Legend
  • *******
  • Posts: 4124
  • Respect: +45
Re: Abstract Algebra
« Reply #17 on: July 16, 2010, 01:28:20 am »
Ah, thanks zzdfa, that certainly looks useful

/0

  • Victorian
  • ATAR Notes Legend
  • *******
  • Posts: 4124
  • Respect: +45
Re: Abstract Algebra
« Reply #18 on: July 17, 2010, 09:33:27 am »
In the video lectures, the guy takes the determinant of a block diagonal matrix by multiplying together the determinants of the blocks on its diagonal.

Is there a general rule for this sort of stuff? I don't remember seeing it before.

(I know how to take the determinant by interchanging the top two rows and multiplying along the diagonal, but is he using a different principle?)



Also, how would you go about proving that the scalar multiples cI, for real c, are the ONLY centre of M_n(R)?

i.e. that AB = BA for all square matrices B of dimension n implies A = cI.

Thanks
« Last Edit: July 17, 2010, 12:08:09 pm by /0 »

zzdfa

  • Victorian
  • Forum Obsessive
  • ***
  • Posts: 328
  • Respect: +4
Re: Abstract Algebra
« Reply #19 on: July 17, 2010, 12:54:06 pm »
For your first question, you probably saw the example and realized that 'the det of a block diagonal matrix is the product of the dets of the blocks'. This is true and easy to prove (using Leibniz's formula).

The natural generalization is 'the det of a block matrix is the det of the matrix of the dets of the blocks' (see how the block diagonal case is a special case of this, because the det of a diagonal matrix is the product of its entries). Unfortunately this is not true. See
http://en.wikipedia.org/wiki/Determinant#Block_matrices

And since you were asking about 'general rules', another one you could use is that the matrix you gave above is a permutation matrix (this particular matrix swaps entries 1 and 2 when you let it act on a vector), and the det of a permutation matrix is just the sign of the permutation, which is -1 in this case since it swaps 2 elements.
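Both facts are easy to check numerically. A small sketch (not from the thread) using the Leibniz formula directly, with the swap block next to an arbitrary second block C of my choosing:

```python
from itertools import permutations
from math import prod

def sign(p):
    # sign of a permutation (tuple of 0..n-1), computed by cycle-sorting swaps
    p, s = list(p), 1
    for i in range(len(p)):
        while p[i] != i:
            j = p[i]
            p[i], p[j] = p[j], p[i]
            s = -s
    return s

def det(M):
    # Leibniz formula: det M = sum over permutations sigma of sign(sigma) * prod M[i][sigma(i)]
    n = len(M)
    return sum(sign(p) * prod(M[i][p[i]] for i in range(n))
               for p in permutations(range(n)))

P = [[0, 1], [1, 0]]   # permutation matrix swapping entries 1 and 2
C = [[3, 1], [2, 5]]   # arbitrary second diagonal block
D = [[0, 1, 0, 0],     # the block diagonal matrix diag(P, C)
     [1, 0, 0, 0],
     [0, 0, 3, 1],
     [0, 0, 2, 5]]

assert det(P) == -1                # sign of a single transposition
assert det(D) == det(P) * det(C)   # det of block diagonal = product of block dets
```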





« Last Edit: July 17, 2010, 12:58:52 pm by zzdfa »

/0

  • Victorian
  • ATAR Notes Legend
  • *******
  • Posts: 4124
  • Respect: +45
Re: Abstract Algebra
« Reply #20 on: July 17, 2010, 01:03:26 pm »
thanks zzdfa

/0

  • Victorian
  • ATAR Notes Legend
  • *******
  • Posts: 4124
  • Respect: +45
Re: Abstract Algebra
« Reply #21 on: July 17, 2010, 01:22:47 pm »
A homomorphism is proposed in the lecture:

f: G -> Aut(G), where f(g)(h) = ghg^{-1} and Aut(G) is a group under composition.

But isn't the homomorphism meant to be a function of g alone? How come we have elements of G like h here?

The lecture then proceeds to show it's a homomorphism by doing this:

f(g_1 g_2)(h) = (g_1 g_2) h (g_1 g_2)^{-1} = g_1 (g_2 h g_2^{-1}) g_1^{-1} = f(g_1)(f(g_2)(h))

?

Thanks

zzdfa

  • Victorian
  • Forum Obsessive
  • ***
  • Posts: 328
  • Respect: +4
Re: Abstract Algebra
« Reply #22 on: July 17, 2010, 02:30:44 pm »
0
here's a sketch of how you'd do the 2nd question. The idea is to, given an nonidentity matrix A, construct a matrix B that doesn't commute with A.
Given an A, since it is not the scalar multiple of an identity,
1. there is a vector v such that Av is not the scalar multiple of v. In particular Av is linearly independent to v.
2. choose a basis of R^n that contains the 2 vectors {v,Av}
3. let B be a matrix that takes v to v, i.e. Bv=v, and takes Av to 0, B(Av)=0.
4. then verify that ABv \neq BAv and so AB \neq BA.
I haven't justified each step rigorously , but these are just direct applications of the basic theorems in LA. except for step 1, can't think of a simple reason for that.
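The four steps can be run for a concrete choice of A (A, v, and B below are my own hypothetical picks; B was solved by hand from Bv = v, B(Av) = 0):

```python
def matvec(M, x):
    # 2x2 matrix times vector
    return [sum(M[i][j] * x[j] for j in range(2)) for i in range(2)]

def matmul(X, Y):
    # 2x2 matrix product
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[1, 1], [0, 1]]   # hypothetical A, not a scalar multiple of I
v = [0, 1]
Av = matvec(A, v)      # = [1, 1], not a scalar multiple of v (step 1)
# {v, Av} is already a basis of R^2 (step 2); solving Bv = v, B(Av) = 0
# for this basis gives (step 3):
B = [[0, 0], [-1, 1]]
assert matvec(B, v) == v
assert matvec(B, Av) == [0, 0]
# step 4: ABv = A(Bv) = Av != 0 = B(Av) = BAv, so AB != BA
assert matmul(A, B) != matmul(B, A)
```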


---


If g is an element of G, then f(g) is an element of Aut(G); that is, f(g) is an automorphism. Remember, automorphisms are functions.

The formula f(g)(h) = ghg^{-1} is telling you that f(g) is the automorphism on G that takes an h in G to ghg^{-1}.

If the notation is confusing (as it often is when you have a function that outputs another function), then try writing it as: "to every g in G there is an associated automorphism f_g that acts on G according to the following rule: f_g(h) = ghg^{-1}".

Don't get confused: there are 2 'levels' of morphisms here. The first level is the fact that the map f: G -> Aut(G) is a homomorphism, and the 2nd is the fact that each map f(g): G -> G is an automorphism.
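Both levels can be checked by brute force on a small concrete group; this sketch takes G = S_3 (my choice, permutations stored as tuples), and verifies that f respects composition and that each f(g) is a bijection of G:

```python
from itertools import permutations

def compose(p, q):
    # (p . q)(i) = p(q(i)); permutations as tuples over {0, 1, 2}
    return tuple(p[q[i]] for i in range(len(p)))

def inverse(p):
    inv = [0] * len(p)
    for i, pi in enumerate(p):
        inv[pi] = i
    return tuple(inv)

def f(g):
    # f(g) is itself a function: the automorphism h -> g h g^{-1}
    return lambda h: compose(compose(g, h), inverse(g))

G = list(permutations(range(3)))  # the group S_3

for g1 in G:
    for g2 in G:
        for h in G:
            # level 1: f is a homomorphism, f(g1 g2) = f(g1) composed with f(g2)
            assert f(compose(g1, g2))(h) == f(g1)(f(g2)(h))

for g in G:
    # level 2: each f(g) is a bijection G -> G
    assert sorted(f(g)(h) for h in G) == sorted(G)
```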





kamil9876

  • Victorian
  • Part of the furniture
  • *****
  • Posts: 1943
  • Respect: +109
Re: Abstract Algebra
« Reply #23 on: July 18, 2010, 04:37:23 am »
Another way you could prove that is to write A = I + B and then use the binomial theorem to expand that.


Does the binomial theorem hold for matrix multiplication? You need commutativity to prove it.

also for fun:

Are R and R^2 (with addition only) isomorphic? I'll give 10 quid to anyone who proves it without the continuum hypothesis.

Voltaire: "There is an astonishing imagination even in the science of mathematics ... We repeat, there is far more imagination in the head of Archimedes than in that of Homer."

zzdfa

  • Victorian
  • Forum Obsessive
  • ***
  • Posts: 328
  • Respect: +4
Re: Abstract Algebra
« Reply #24 on: July 19, 2010, 08:22:22 pm »
yes, because I commutes with everything

also, nfi how to approach your question. hints?
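The commutativity point can be seen concretely (a sketch, with an arbitrary A and a non-commuting C of my choosing): (A+C)^2 always equals A^2 + AC + CA + C^2, but this collapses to the binomial form A^2 + 2AC + C^2 only when AC = CA, which does hold for C = I.

```python
def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def matadd(*Ms):
    return [[sum(M[i][j] for M in Ms) for j in range(2)] for i in range(2)]

def scale(c, X):
    return [[c * X[i][j] for j in range(2)] for i in range(2)]

A = [[1, 2], [3, 4]]   # arbitrary
C = [[0, 1], [0, 0]]   # does NOT commute with A
I = [[1, 0], [0, 1]]

lhs = matmul(matadd(A, C), matadd(A, C))
assert matmul(A, C) != matmul(C, A)
# the expansion with both cross terms always holds...
assert lhs == matadd(matmul(A, A), matmul(A, C), matmul(C, A), matmul(C, C))
# ...but the binomial form fails without commutativity
assert lhs != matadd(matmul(A, A), scale(2, matmul(A, C)), matmul(C, C))

# with C = I the binomial form (A+I)^2 = A^2 + 2A + I does hold
assert matmul(matadd(A, I), matadd(A, I)) == matadd(matmul(A, A), scale(2, A), I)
```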

kamil9876

  • Victorian
  • Part of the furniture
  • *****
  • Posts: 1943
  • Respect: +109
Re: Abstract Algebra
« Reply #25 on: July 22, 2010, 03:13:06 am »
Ahh yes, sorry, I was high on English beer. I made the problem up myself; I tried in vain to prove that they are not isomorphic (I first proved that Z and Z^2, and Q and Q^2, are not isomorphic). When I come back home on Saturday I will post in more detail. I'm not even sure if it is decidable without the continuum hypothesis or axiom of choice. A clue is to think of R and R^2 as vector spaces over Q.

zzdfa

  • Victorian
  • Forum Obsessive
  • ***
  • Posts: 328
  • Respect: +4
Re: Abstract Algebra
« Reply #26 on: July 24, 2010, 12:59:59 pm »
Yeah, I thought about it a little more:

The Z -> Z^2 case follows from the fact that any homomorphism p satisfies p(z) = z p(1).

The Q -> Q^2 case follows from the fact that any homomorphism is in fact a linear transformation (by fiddling around with the defn of homomorphism), and we know that there is no linear isomorphism Q -> Q^2 (different dimensions).


For the R -> R^2 case, assuming that bases for R and R^2 (as vector spaces over Q) exist and have the same cardinality, find a bijection between the bases and we're done.
And it seems like you need AC to find the bases and CH to prove that they have the same cardinality.
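The 'fiddling around with the defn of homomorphism' for the Q case can be written out; a sketch, using only additivity of phi:

```latex
% An additive map \phi : \mathbb{Q} \to \mathbb{Q}^2 is automatically Q-linear.
% Integer scalars, by repeated addition (and \phi(0)=0 gives \phi(-x)=-\phi(x)):
\phi(nx) = \phi(\underbrace{x + \cdots + x}_{n}) = n\,\phi(x),
\qquad n \in \mathbb{Z}_{\ge 0}.

% Dividing by a positive integer m:
\phi(x) = \phi\!\left(m \cdot \tfrac{x}{m}\right) = m\,\phi\!\left(\tfrac{x}{m}\right)
\;\Longrightarrow\; \phi\!\left(\tfrac{x}{m}\right) = \tfrac{1}{m}\,\phi(x).

% Combining, for any rational q = n/m:
\phi(qx) = \tfrac{n}{m}\,\phi(x) = q\,\phi(x).
```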


Ahmad

  • Victorian
  • Part of the furniture
  • *****
  • Posts: 1296
  • *dreamy sigh*
  • Respect: +15
Re: Abstract Algebra
« Reply #27 on: July 24, 2010, 06:12:59 pm »
An idea for showing R and R^2, as Q vector spaces, have dimension of the continuum: if R had a countable basis B, then we could express any element of R uniquely as a finite linear combination of elements of B, which means R is the union of all elements expressible by a linear combination of 1 basis element, those expressible as a linear combination of 2 basis elements, 3 basis elements, and so on. This is a countable union of countable sets (why are they countable?), which is countable; but R is uncountable.

So the basis has size more than aleph_0 and at most the continuum (R is a spanning set for itself, of course); now you need CH. Same goes for R^2. Intriguing result.
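Filling in the parenthetical '(why are they countable?)', a sketch in the notation above:

```latex
% Suppose B = \{b_1, b_2, \dots\} were a countable basis of R over Q. Then
\mathbb{R} = \bigcup_{k \ge 1} S_k,
\qquad
S_k = \Bigl\{ \textstyle\sum_{i=1}^{k} q_i\, b_{j_i} \;:\; q_i \in \mathbb{Q},\; b_{j_i} \in B \Bigr\}.

% Each S_k is the image of the countable set \mathbb{Q}^k \times B^k under
(q_1, \dots, q_k,\; b_{j_1}, \dots, b_{j_k}) \;\longmapsto\; \sum_{i=1}^{k} q_i\, b_{j_i},

% so each S_k is countable; a countable union of countable sets is countable,
% contradicting the uncountability of R.
```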
« Last Edit: July 24, 2010, 06:15:15 pm by Ahmad »
Mandark: Please, oh please, set me up on a date with that golden-haired angel who graces our undeserving school with her infinite beauty!

The collage of ideas. The music of reason. The poetry of thought. The canvas of logic.


kamil9876

  • Victorian
  • Part of the furniture
  • *****
  • Posts: 1943
  • Respect: +109
Re: Abstract Algebra
« Reply #28 on: July 24, 2010, 10:11:40 pm »
yep good

/0

  • Victorian
  • ATAR Notes Legend
  • *******
  • Posts: 4124
  • Respect: +45
Re: Abstract Algebra
« Reply #29 on: August 21, 2010, 11:15:28 am »
Let G be a cyclic group of order mn. Can someone please explain how, if m and n are coprime, then G is isomorphic to Z_m x Z_n? I arrived late to one of the lectures and must have missed the explanation. Thanks
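Not the missing explanation, but the statement can at least be sanity-checked computationally. Here m = 4, n = 9 are arbitrary coprime choices, and the candidate isomorphism is the usual CRT map x -> (x mod m, x mod n):

```python
m, n = 4, 9                      # arbitrary coprime pair (hypothetical choice)
phi = lambda x: (x % m, x % n)   # candidate isomorphism Z_mn -> Z_m x Z_n

# injective, hence bijective (both sides have m*n elements)
assert len({phi(x) for x in range(m * n)}) == m * n

# respects addition: phi((x + y) mod mn) = phi(x) + phi(y) componentwise
for x in range(m * n):
    for y in range(m * n):
        a1, a2 = phi(x)
        b1, b2 = phi(y)
        assert phi((x + y) % (m * n)) == ((a1 + b1) % m, (a2 + b2) % n)
```

This is exactly where coprimality enters: for m, n not coprime (say 2 and 4), the map is no longer injective.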