Say we wish to solve the following equations:
4x - y + z = 7
4x - 8y + z = -21
-2x + y + 5z = 15

Then we can re-write them, solving each equation for one variable:

x = 7/4 + y/4 - z/4
y = 21/8 + x/2 + z/8
z = 15/5 + 2x/5 - y/5
We can solve these with Gauss-Seidel iteration simply by repeatedly plugging the current x, y, z values into these equations, beginning with an initial estimate.
Thus, the Gauss-Seidel method in C code looks something like:
#include <stdio.h>

int main() {
    // a compact way of representing the rearranged equations
    float eq[3][4];
    eq[0][0] = 7/4.0;  eq[0][1] = 0;     eq[0][2] = 1/4.0;  eq[0][3] = -1/4.0;
    eq[1][0] = 21/8.0; eq[1][1] = 4/8.0; eq[1][2] = 0;      eq[1][3] = 1/8.0;
    eq[2][0] = 15/5.0; eq[2][1] = 2/5.0; eq[2][2] = -1/5.0; eq[2][3] = 0;

    float x, y, z;
    x = 1; y = 1; z = 2; // initial guess

    // 10 iterations of Gauss-Seidel: each update uses the newest
    // available values of the other variables
    for (int i = 0; i < 10; i++) {
        x = eq[0][0] + eq[0][2]*y + eq[0][3]*z;
        y = eq[1][0] + eq[1][1]*x + eq[1][3]*z;
        z = eq[2][0] + eq[2][1]*x + eq[2][2]*y;
        printf("%f %f %f\n", x, y, z);
    }
    return 0;
}

Producing this output:
1.500000 3.625000 2.875000
1.937500 3.953125 2.984375
1.992188 3.994141 2.998047
1.999023 3.999268 2.999756
1.999878 3.999908 2.999969
1.999985 3.999989 2.999996
1.999998 3.999999 3.000000
2.000000 4.000000 3.000000
2.000000 4.000000 3.000000
2.000000 4.000000 3.000000
It converges nicely towards the solution (x, y, z) = (2, 4, 3). (Things don't always converge: for Ax = b, Gauss-Seidel is guaranteed to converge when A is strictly diagonally dominant, as it is here - but that is another story.)
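As a quick check of that condition, here is a minimal sketch (the function name is my own, not from the original post) that tests strict diagonal dominance of the 3x3 coefficient matrix above:

#include <math.h>
#include <stdio.h>

// returns 1 if the n x n matrix a (row-major, n <= 3) is strictly diagonally
// dominant, i.e. |a[i][i]| > sum of |a[i][j]| over j != i, on every row
int is_diagonally_dominant(const float a[3][3], int n) {
    for (int i = 0; i < n; i++) {
        float off = 0;
        for (int j = 0; j < n; j++)
            if (j != i) off += fabsf(a[i][j]);
        if (fabsf(a[i][i]) <= off) return 0;
    }
    return 1;
}

int main() {
    // coefficient matrix of the system above
    float a[3][3] = { { 4, -1,  1},
                      { 4, -8,  1},
                      {-2,  1,  5} };
    printf("diagonally dominant: %d\n", is_diagonally_dominant(a, 3));
    return 0;
}

For this system every diagonal entry (4, -8, 5) outweighs the sum of the other entries in its row, so the check prints 1.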
Jacobi iteration does not converge as quickly, but it is easy to execute in parallel. With Jacobi iteration you simply use the previous iteration's x, y, z values on the right-hand side instead of the freshly updated ones.
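For comparison, here is a minimal Jacobi sketch of the same system, reusing the eq table from above. The only change from the Gauss-Seidel loop is that each update reads a snapshot of the previous iteration's values:

#include <stdio.h>

int main() {
    // same rearranged equations as above
    float eq[3][4];
    eq[0][0] = 7/4.0;  eq[0][1] = 0;     eq[0][2] = 1/4.0;  eq[0][3] = -1/4.0;
    eq[1][0] = 21/8.0; eq[1][1] = 4/8.0; eq[1][2] = 0;      eq[1][3] = 1/8.0;
    eq[2][0] = 15/5.0; eq[2][1] = 2/5.0; eq[2][2] = -1/5.0; eq[2][3] = 0;

    float x = 1, y = 1, z = 2; // initial guess

    // 10 iterations of Jacobi: every update reads only the previous
    // iteration's values, so the three updates are independent
    for (int i = 0; i < 10; i++) {
        float xo = x, yo = y, zo = z; // snapshot of last iteration
        x = eq[0][0] + eq[0][2]*yo + eq[0][3]*zo;
        y = eq[1][0] + eq[1][1]*xo + eq[1][3]*zo;
        z = eq[2][0] + eq[2][1]*xo + eq[2][2]*yo;
        printf("%f %f %f\n", x, y, z);
    }
    return 0;
}

Because each update depends only on the snapshot, the three assignments in the loop body could be computed concurrently.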
See? Solving systems of equations is easy.