Post by tsh73 on Aug 12, 2019 16:53:21 GMT -5
[RC] Gradient descent
Gradient descent (also known as steepest descent) is a first-order iterative optimization algorithm for finding the minimum of a function; it is described in the Wikipedia article on gradient descent.
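(In plain terms: each iteration moves the current point a small step against the gradient, x_new = x_old - alpha * grad f(x_old), where alpha is the step size. In the code below the gradient is normalized first, so every step moves a distance alpha.)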
Task
Use this algorithm to search for minimum values of the bi-variate function:
f(x, y) = (x - 1)(x - 1)e^(-y^2) + y(y+2)e^(-2x^2)
around x = 0.1 and y = -1.
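Not required by the task, but if you want to sanity-check the finite-difference gradient that GradG estimates below, the partial derivatives of this f can also be coded directly. A minimal sketch in the same JB/LB style (dgdx and dgdy are names I made up, not part of the program below):

'analytic partial derivatives of f, for cross-checking GradG
function dgdx(x0, x1)
dgdx = 2 * (x0 - 1) * exp(0-x1 * x1) - 4 * x0 * x1 * (x1 + 2) * exp(-2 * x0 * x0)
end function
function dgdy(x0, x1)
dgdy = (2 * x1 + 2) * exp(-2 * x0 * x0) - 2 * x1 * (x0 - 1) * (x0 - 1) * exp(0-x1 * x1)
end function

Printing dgdx(x(0), x(1)) and dgdy(x(0), x(1)) after the run should give values close to zero at the reported minimum.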
output (the numbers more or less match results from other languages)
And you can plot the function online - it really does have a minimum there
Iterations: 23
Testing steepest descent method
The minimum is at x[0] = 0.10758686, x[1] = -1.22330304
Min value is -0.75006342
code
'[RC] gradient descent
'http://rosettacode.org/wiki/Gradient_descent
'TypeScript code converted to LB/JB
'tsh73 Aug 2019
global N
N=2 'array dimension. Arrays are global
dim x(N)
dim fi(N)
call gradientDescentMain
' supposed output:
' Testing steepest descent method
' The minimum is at x[0] = 0.10768224291553158, x[1] = -1.2233090211217854
end
sub gradientDescentMain
tolerance = 0.0000006
alpha = 0.1
'dim x(N)
x(0) = 0.1 '//Initial guesses
x(1) = -1 '//of location of minimums
call steepestDescent alpha, tolerance
print "Testing steepest descent method"
print "The minimum is at x[0] = " ; x(0) ; ", x[1] = " ; x(1)
print "Min value is " ;g(x(0), x(1))
end sub
' // Using the steepest-descent method to search
' // for minimum values of a multi-variable function
sub steepestDescent alpha, tolerance 'x() is global
h = 0.0000006 '//Tolerance factor
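'h also serves as the step for the finite differences in GradG; it is halved on every pass of the loop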
g0 = g(x(0), x(1)) '//Initial estimate of result
'//Calculate initial gradient
'dim fi(N) 'let fi: number[] = [n];
call GradG h
'// console.log("fi:"+fi);
'print "fi ";fi(0);" ";fi(1)
'//Calculate initial norm
DelG = 0.0
for i = 0 to N-1
DelG = DelG + fi(i) * fi(i)
next
DelG = sqr(DelG)
b= alpha / DelG
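'b scales the gradient so that the update x(i) - b*fi(i) below moves the point a distance alpha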
'//Iterate until value is <= tolerance limit
while (DelG > tolerance)
'//Calculate next value
for i = 0 to N-1
x(i) = x(i)-b * fi(i)
next
h = h/2
'//Calculate next gradient
call GradG h
'//Calculate next norm
DelG = 0
for i = 0 to N-1
DelG = DelG + fi(i) * fi(i)
next
DelG = sqr(DelG)
'LB/JB fix:
if DelG <= tolerance then exit while
'... or it divides by 0 here
b= alpha / DelG
k=k+1
'//Calculate next value
g1 = g(x(0), x(1))
'//Adjust parameter
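'if the last step made g worse, halve the step length alpha; otherwise accept g1 as the new best value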
if (g1 > g0) then
alpha = alpha/2
else
g0 = g1
end if
wend
print "Iterations: ";k
end sub
'// Provides a rough (finite-difference) calculation of the gradient of g(x).
sub GradG h
'in: global x(), h. Out: global fi()
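'forward difference: fi(i) is approximately (g(x with h added to coordinate i) - g(x)) / h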
'fi would be z()
'let n: number = x.length; 'have global N
redim fi(N) 'let z: number[] = [n];
dim y(N)
y(0)= x(0):y(1)= x(1)
g0 = g(x(0), x(1))
'// console.log("y:" + y);
for i = 0 to N-1
y(0)= x(0):y(1)= x(1) 'not in ref code
y(i) = y(i)+h
fi(i) = (g(y(0), y(1)) - g0) / h
next
'// console.log("z:"+z);
'return z; 'Out: global fi()
end sub
'// Method to provide function g(x).
function g(x0, x1) 'could not pass arrays in JB/LB
g = (x0 - 1) * (x0 - 1) * exp(0-x1 * x1) + x1 * (x1 + 2) * exp(-2 * x0 * x0)
end function