Linear Algebra for Engineers, Semester A, 5768

Important notices

The announcements below may change. Check here regularly for updates.
  1. First meeting of enhancement lectures (shi'urey ha'ashara): Wednesday, 26 March, usual place & time.
  2. Second (and probably last) meeting of enhancement lectures (shi'urey ha'ashara): Monday, 31 March, usual place & time.

Exams from previous years in Linear Algebra for mathematicians. Note: Not all questions there are relevant, and moreover you should look at exams of Linear Algebra 2 as well, for questions on determinants, inner products, Gram-Schmidt, etc. Unfortunately, these exams have no questions on Engineering material like QR and LU. Look in Beck's exams (link given below) for these.

The lectures and summaries are from Lay's Linear Algebra Webpage.

A summary of most (but not all) of the material covered in lecture 1: Part 1 Part 2

A summary of most (but not all) of the material covered in lecture 2: Part 1 Part 2 Part 3
Please read at least pages 4 and 5 of Part 3.

A summary of most (but not all) of the material covered in lecture 3: Part 1 Part 2

A summary of most (but not all) of the material covered in lecture 4: Part 1 Part 2 Part 3

A summary of most (but not all) of the material covered in lecture 5: Part 1 Part 2

A summary of most (but not all) of the material covered in lecture 6: Part 1 Part 2 Part 3

A summary of most (but not all) of the material covered in lecture 7: Part 1 Part 2 Part 3

A summary of most (but not all) of the material covered in lecture 8: Part 1 Part 2 Part 3 (until page 6).

At the Wednesday (March 12) meeting, we will try to cover the following: Part 1 (from page 7) Part 2 Part 3.

Older stuff

A copy of the Moed A exam and its solution follows:
Moed A Exam.
Correct answers (read from left to right): 2 3 2 1 4 4 4 1 4 4 4 1 3 2 3.

Finding a line close to passing through three points
Summary on eigenvalues
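The line-fitting handout above is a least-squares problem. As a minimal sketch of the idea (the three points below are made up for illustration, not taken from the handout), one can fit a line y = a + b·x to three points by solving the 2×2 normal equations directly:

```python
# Least-squares line y = a + b*x through three (non-collinear) points,
# solved via the 2x2 normal equations.
points = [(0.0, 1.0), (1.0, 2.0), (2.0, 2.0)]  # hypothetical sample points

n = len(points)
sx = sum(x for x, _ in points)
sy = sum(y for _, y in points)
sxx = sum(x * x for x, _ in points)
sxy = sum(x * y for x, y in points)

# Normal equations:  n*a + sx*b = sy,   sx*a + sxx*b = sxy
det = n * sxx - sx * sx
a = (sy * sxx - sx * sxy) / det
b = (n * sxy - sx * sy) / det
```

The same system is what one obtains by writing the three equations a + b·x_i = y_i as an overdetermined linear system and multiplying by the transpose of the coefficient matrix.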

Old exams in Linear Algebra for Engineers

5764 Moed A * 5765 Moed A * 5765 Moed B * 5766 Moed A * 5766 Moed B

Homework and solutions

By Meital Eliyahu and Oshrit Ovrutzki

Assignment 1 Solution 1 * Assignment 2 Solution 2 * Assignment 3 (part A) Solution 3a * Assignment 3 (part B) Solution 3b * Assignment 4 Solution 4 * Assignment 5 Solution 5 * Assignment 6 * Assignment 7 Solution 7 * Assignment 8 Solution 8 Solution 8 (part b) * Assignment 9 Solution 9

Of interest

Matrix calculator: Computes Determinant, LU Decomposition, Rank, QR Decomposition, Matrix Inverse, Singular Value Decomposition, Eigenvalues/eigenvectors, etc.
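For a feel of what such a calculator does internally, here is a minimal Doolittle LU factorization in plain Python (a sketch only: no pivoting, so it assumes all leading principal minors are nonzero; the 3×3 matrix is a made-up example):

```python
def lu_doolittle(A):
    """Factor A = L*U, with L unit lower triangular and U upper triangular.
    Assumes no pivoting is needed (nonzero leading principal minors)."""
    n = len(A)
    L = [[0.0] * n for _ in range(n)]
    U = [[0.0] * n for _ in range(n)]
    for i in range(n):
        L[i][i] = 1.0
        for j in range(i, n):        # fill row i of U
            U[i][j] = A[i][j] - sum(L[i][k] * U[k][j] for k in range(i))
        for j in range(i + 1, n):    # fill column i of L
            L[j][i] = (A[j][i] - sum(L[j][k] * U[k][i] for k in range(i))) / U[i][i]
    return L, U

A = [[2.0, 1.0, 1.0],
     [4.0, 3.0, 3.0],
     [8.0, 7.0, 9.0]]
L, U = lu_doolittle(A)
```

Once L and U are known, a system Ax = b reduces to two triangular solves, which is why LU is the standard workhorse for solving square systems.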

Gram-Schmidt view
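As a companion to the Gram-Schmidt visualization linked above, a minimal classical Gram-Schmidt orthonormalization in plain Python (the input vectors are an arbitrary example and are assumed linearly independent):

```python
from math import sqrt

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def gram_schmidt(vectors):
    """Orthonormalize a list of linearly independent vectors (classical G-S)."""
    basis = []
    for v in vectors:
        # subtract the projections of v onto the already-built orthonormal vectors
        w = list(v)
        for q in basis:
            c = dot(v, q)
            w = [wi - c * qi for wi, qi in zip(w, q)]
        norm = sqrt(dot(w, w))
        basis.append([wi / norm for wi in w])
    return basis

Q = gram_schmidt([[1.0, 1.0, 0.0], [1.0, 0.0, 1.0]])
```

The columns produced here are exactly the Q of a QR factorization; in floating point one would normally use the modified Gram-Schmidt variant or Householder reflections for better numerical stability.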

Gilbert Strang's webpage: Problems, solutions, demos, videos

Exercises and solutions from previous years

Exercise 1 Solution 1 * Exercise 2 Solution 2 * Exercise 3 Solution 3 * Exercise 4 Solution 4 * Exercise 5 Solution 5 * Exercise 6 Solution 6 * Exercise 7 Solution 7 * Exercise 8 Solution 8 Solution 8b * Exercise 9

Linear algebra

From Wikipedia, the free encyclopedia

Linear algebra is the branch of mathematics concerned with the study of vectors, vector spaces (also called linear spaces), linear maps (also called linear transformations), and systems of linear equations. Vector spaces are a central theme in modern mathematics; thus, linear algebra is widely used in both abstract algebra and functional analysis. Linear algebra also has a concrete representation in analytic geometry and it is generalized in operator theory. It has extensive applications in the natural sciences and the social sciences, since nonlinear models can often be approximated by linear ones.

History

The history of modern linear algebra dates back to the early 1840s. In 1843, William Rowan Hamilton introduced quaternions, which describe mechanics in three-dimensional space. In 1844, Hermann Grassmann published his book Die lineale Ausdehnungslehre (see References). Arthur Cayley introduced matrices, one of the most fundamental linear algebraic ideas, in 1857. Despite these early developments, linear algebra was developed primarily in the twentieth century.

Matrices were poorly defined before the development of ring theory within abstract algebra. With the coming of special relativity, many practitioners gained an appreciation of the subtleties of linear algebra. Furthermore, the routine application of Cramer's rule to solve partial differential equations led to the inclusion of linear algebra in standard coursework at universities. E.T. Copson wrote, for instance,

When I went to Edinburgh as a young lecturer in 1922, I was surprised to find how different the curriculum was from that at Oxford. It included topics such as Lebesgue integration, matrix theory, numerical analysis, Riemannian geometry, of which I knew nothing...

—E.T. Copson, Preface to Partial Differential Equations, 1973

Francis Galton initiated the use of correlation coefficients in 1888. Often more than one random variable is in play and may be cross-correlated. In statistical analysis of multivariate random variables the correlation matrix is a natural tool. Thus, statistical study of such random vectors helped establish matrix usage.
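Galton's correlation coefficient is itself a small piece of linear algebra: the cosine of the angle between two mean-centered data vectors. A sketch with made-up data:

```python
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation: normalized inner product of the centered vectors."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cx = [x - mx for x in xs]          # center each variable at its mean
    cy = [y - my for y in ys]
    num = sum(a * b for a, b in zip(cx, cy))
    den = sqrt(sum(a * a for a in cx)) * sqrt(sum(b * b for b in cy))
    return num / den

# Perfectly linear (hypothetical) data, so r should be exactly 1:
r = pearson_r([1.0, 2.0, 3.0, 4.0], [2.0, 4.0, 6.0, 8.0])
```

A correlation matrix is just this quantity computed for every pair of variables in a multivariate data set.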

Elementary introduction

Linear algebra had its beginnings in the study of vectors in Cartesian 2-space and 3-space. A vector, here, is a directed line segment, characterized by both its magnitude, represented by length, and its direction. Vectors can be used to represent physical entities such as forces, and they can be added to each other and multiplied with scalars, thus forming the first example of a real vector space.

Modern linear algebra has been extended to consider spaces of arbitrary or infinite dimension. A vector space of dimension n is called an n-space. Most of the useful results from 2- and 3-space can be extended to these higher-dimensional spaces. Although people cannot easily visualize vectors in n-space, such vectors or n-tuples are useful in representing data. Since vectors, as n-tuples, are ordered lists of n components, it is possible to summarize and manipulate data efficiently in this framework. For example, in economics, one can use an 8-dimensional vector (an 8-tuple) to represent the Gross National Product of 8 countries. Once the countries' order is fixed, say (United States, United Kingdom, France, Germany, Spain, India, Japan, Australia), the vector (v1, v2, v3, v4, v5, v6, v7, v8) records each country's GNP for a particular year in its respective position.
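The n-tuple view of data translates directly into code; a small sketch (the GNP figures below are placeholders, not real data):

```python
# An 8-tuple of (hypothetical) GNP values, one per country in a fixed order.
countries = ("United States", "United Kingdom", "France", "Germany",
             "Spain", "India", "Japan", "Australia")
gnp_2005 = (13.0, 2.3, 2.1, 2.8, 1.1, 0.8, 4.6, 0.7)   # placeholder numbers
gnp_2006 = (13.8, 2.4, 2.2, 2.9, 1.2, 0.9, 4.4, 0.8)

# Vector addition and scalar multiplication act componentwise:
total = tuple(a + b for a, b in zip(gnp_2005, gnp_2006))
average = tuple(t / 2 for t in total)
```

Componentwise addition and scaling are exactly the vector-space operations, so any theorem about R^n applies to such data vectors unchanged.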

A vector space (or linear space), as a purely abstract concept about which theorems are proved, is part of abstract algebra, and is well integrated into this discipline. Some striking examples of this are the group of invertible linear maps or matrices, and the ring of linear maps of a vector space. Linear algebra also plays an important part in analysis, notably, in the description of higher order derivatives in vector analysis and the study of tensor products and alternating maps.

In this abstract setting, the scalars with which an element of a vector space can be multiplied need not be numbers. The only requirement is that the scalars form a mathematical structure, called a field. In applications, this field is usually the field of real numbers or the field of complex numbers. Linear maps take elements from a linear space to another (or to itself), in a manner that is compatible with the addition and scalar multiplication given on the vector space(s). The set of all such transformations is itself a vector space. If a basis for a vector space is fixed, every linear transform can be represented by a table of numbers called a matrix. The detailed study of the properties of and algorithms acting on matrices, including determinants and eigenvectors, is considered to be part of linear algebra.
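Once a basis is fixed, applying a linear map is matrix-vector multiplication. A minimal sketch, using rotation by 90 degrees in the plane as an arbitrary illustrative example:

```python
def apply(matrix, vector):
    """Apply the linear map represented by `matrix` to `vector`:
    component i of the result is the dot product of row i with the vector."""
    return [sum(m_ij * v_j for m_ij, v_j in zip(row, vector)) for row in matrix]

# Rotation by 90 degrees in the plane, written in the standard basis:
R = [[0.0, -1.0],
     [1.0,  0.0]]

image = apply(R, [1.0, 0.0])   # the first basis vector rotates onto the second
```

Compatibility with addition and scalar multiplication is what makes this work: apply(R, u + v) equals apply(R, u) + apply(R, v), so knowing the images of the basis vectors (the columns of R) determines the whole map.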

One can say quite simply that the linear problems of mathematics, those that exhibit linearity in their behavior, are the ones most likely to be solved. For example, differential calculus does a great deal with linear approximation to functions. The difference from nonlinear problems is very important in practice.
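A concrete instance of linear approximation: near a point a, f(x) ≈ f(a) + f′(a)(x − a). A quick sketch for f(x) = √x near a = 1:

```python
from math import sqrt

def tangent_line(x, a=1.0):
    """First-order (linear) approximation of sqrt at the point a."""
    f_a = sqrt(a)
    df_a = 1.0 / (2.0 * sqrt(a))   # derivative of sqrt evaluated at a
    return f_a + df_a * (x - a)

approx = tangent_line(1.1)   # 1 + 0.05 = 1.05
exact = sqrt(1.1)            # close to the tangent-line value
```

Replacing a function by its tangent line turns a nonlinear question into a linear one, which is the basic move behind Newton's method and much of applied analysis.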

The general method of finding a linear way to look at a problem, expressing this in terms of linear algebra, and solving it, if need be by matrix calculations, is one of the most generally applicable in mathematics.

Generalisations and related topics

Since linear algebra is a successful theory, its methods have been developed in other parts of mathematics. In module theory one replaces the field of scalars by a ring. In multilinear algebra one considers multivariable linear transformations, that is, mappings which are linear in each of a number of different variables. This line of inquiry naturally leads to the idea of the tensor product. In the spectral theory of operators, control of infinite-dimensional matrices is gained by applying mathematical analysis in a theory that is not purely algebraic. In all these cases the technical difficulties are much greater.

Note

  1. ^ The existence of a basis is straightforward for finitely generated vector spaces, but in full generality it is logically equivalent to the axiom of choice.

References

See also: List of linear algebra references
