<html><head></head><body style="word-wrap: break-word; -webkit-nbsp-mode: space; -webkit-line-break: after-white-space; "><div>All,</div><div><br></div><div>Is there any way to access the final gradient information from the function_minimizer class? Some people I support are interested in doing this from the report section.</div><div>Thanks.</div><div><br></div><br><div>
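<div>The closest thing we have found so far is objective_function_value::gmax, which holds the maximum gradient component written to the .par file. For the full vector, the sketch below is roughly what they would like to do from the report section (untested — it assumes gradcalc() and initial_params::nvarcalc() can be called there, and it recomputes the gradient rather than retrieving the stored final one):</div><div><br></div><pre>REPORT_SECTION
  // Hypothetical, untested sketch: recompute the gradient at the
  // current parameter values. Assumes gradcalc() and nvarcalc()
  // are usable here; they may not be valid outside the minimizer.
  int nvar = initial_params::nvarcalc();  // number of active parameters
  dvector g(1, nvar);                     // gradient vector
  gradcalc(nvar, g);                      // fill g with df/dx
  report &lt;&lt; "final gradient: " &lt;&lt; g &lt;&lt; endl;
</pre><div><br></div>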
<span class="Apple-style-span" style="border-collapse: separate; color: rgb(0, 0, 0); font-family: Helvetica; font-style: normal; font-variant: normal; font-weight: normal; letter-spacing: normal; line-height: normal; orphans: 2; text-align: -webkit-auto; text-indent: 0px; text-transform: none; white-space: normal; widows: 2; word-spacing: 0px; -webkit-border-horizontal-spacing: 0px; -webkit-border-vertical-spacing: 0px; -webkit-text-decorations-in-effect: none; -webkit-text-size-adjust: auto; -webkit-text-stroke-width: 0px; font-size: medium; "><div><div>Matthew Supernaw</div><div>Scientific Programmer</div><div>National Oceanic and Atmospheric Administration</div><div>National Marine Fisheries Service</div><div>Sustainable Fisheries Division</div><div>St. Petersburg, FL, 33701</div><div>Office 727-551-5606</div><div>Fax 727-824-5300</div></div></span>
</div>
<br><div><div>On Sep 17, 2012, at 3:00 PM, <a href="mailto:developers-request@admb-project.org">developers-request@admb-project.org</a> wrote:</div><br class="Apple-interchange-newline"><blockquote type="cite"><div>Today's Topics:<br><br> 1. 
Fwd: [New post] Julia, Matlab, and C (John Sibert)<br><br><br>----------------------------------------------------------------------<br><br>Message: 1<br>Date: Mon, 17 Sep 2012 07:18:38 -1000<br>From: John Sibert <sibert@hawaii.edu><br>To: ADMB Developers <developers@admb-project.org><br>Subject: [Developers] Fwd: [New post] Julia, Matlab, and C<br>Message-ID: <50575B6E.8090606@hawaii.edu><br>Content-Type: text/plain; charset=UTF-8; format=flowed<br><br><br><br><br>-------- Original Message --------<br>Subject: <span class="Apple-tab-span" style="white-space:pre"> </span>[New post] Julia, Matlab, and C<br>Date: <span class="Apple-tab-span" style="white-space:pre"> </span>Mon, 17 Sep 2012 02:37:37 +0000<br>From: <span class="Apple-tab-span" style="white-space:pre"> </span>Justin Domke's Weblog <comment-reply@wordpress.com><br>To: <span class="Apple-tab-span" style="white-space:pre"> </span>sibert@hawaii.edu<br><br><br> Julia, Matlab, and C<br> <http://justindomke.wordpress.com/2012/09/17/julia-matlab-and-c/><br><br>by justindomke <http://justindomke.wordpress.com/author/justindomke/><br><br>Julia <http://julialang.org/> is a new language in the same arena as <br>Matlab or R. 
I've had failed attempts <br><http://justindomke.wordpress.com/2008/11/29/mandelbrot-in-scala/> to <br>quit the Matlab addiction in the past, making me generally quite <br>conservative about new platforms. However, I've recently been <br>particularly annoyed by Matlab's slow speed, evil license manager <br>errors, restrictions on parallel processes, C++ .mex file pain, etc., <br>and so I decided to check it out. It seems inevitable that Matlab will <br>eventually be displaced by /something/. The question is: is that something <br>Julia?<br><br> "We want a language that's open source, with a liberal license. We<br> want the speed of C with the dynamism of Ruby. We want a language<br> that's homoiconic, with true macros like Lisp, but with obvious,<br> familiar mathematical notation like Matlab. We want something as<br> usable for general programming as Python, as easy for statistics as<br> R, as natural for string processing as Perl, as powerful for linear<br> algebra as Matlab, as good at gluing programs together as the shell.<br> Something that is dirt simple to learn, yet keeps the most serious<br> hackers happy. We want it interactive and we want it compiled."<br><br>Essentially, the goal seems to be a faster, freer Matlab that treats <br>users like adults (macros!) and /doesn't require writing any .mex files <br>in C++ or Fortran/. Sounds too good to be true? I decided to try it out. <br>My comparisons are to Matlab (R2012a) and C (gcc 4.2 with -O2 and <br>-fno-builtin to prevent compile-time computations), all on a recent <br>MacBook Pro (2.3 GHz Intel Core i7 w/ 8GB RAM).<br><br>Installation was trivial: I just grabbed a pre-compiled binary <br><https://github.com/JuliaLang/julia/downloads> and started it up.<br><br>I deliberately used naive algorithms, since I am just testing raw speed. <br>It should be a fair comparison, as long as the algorithm is constant. 
<br>Please let me know about any bugs, though.<br><br>Click here to skip to the results <#results>.<br><br><br> First benchmark: Fibonacci<br><br>% Matlab<br>function f=fib(n)<br> if n <= 2<br> f=1.0;<br> else<br> f=fib(n-1)+fib(n-2);<br> end<br>end<br><br>// C<br>double fib(int n){<br> if(n<=2)<br> return(1.0);<br> else<br> return(fib(n-2)+fib(n-1));<br>}<br><br># julia<br>function fib(n)<br> if n <= 2<br> 1.0<br> else<br> fib(n-1)+fib(n-2);<br> end<br>end<br><br>Clarity is basically a tie. Running them for n=30 we get:<br><br>time in matlab (fib): 14.344231<br>time in c (fib): 0.005795<br>time in julia (fib): 0.281221<br><br><br> Second benchmark: Matrix Multiplication<br><br>This is a test of the naive O(N^3) matrix multiplication algorithm.<br><br>% matlab<br>function C=mmult(A,B,C)<br> [M,N] = size(A);<br> for i=1:M<br> for j=1:M<br> for k=1:M<br> C(i,j) = C(i,j) + A(i,k)*B(k,j);<br> end<br> end<br> end<br>end<br><br>// C<br>#define M 500<br>void mmult(double A[M][M],double B[M][M], double C[M][M]){<br> //double C[M][M];<br> int i,j,k;<br> for(i=0; i<M; i++)<br> for(j=0; j<M; j++){<br> C[i][j] = 0;<br> for(k=0; k<M; k++)<br><span class="Apple-tab-span" style="white-space:pre"> </span>C[i][j] += A[i][k]*B[k][j];<br> }<br>}<br><br># julia<br>function mmult(A,B)<br> (M,N) = size(A);<br> C = zeros(M,M);<br> for i=1:M<br> for j=1:M<br> for k=1:M<br> C[i,j] += A[i,k]*B[k,j];<br> end<br> end<br> end<br> C;<br>end<br><br>Here, I think that Matlab and Julia are a bit clearer, and Julia wins <br>through the wonders of having "+=". 
The timing results on 500x500 <br>matrices are:<br><br>time in matlab (matmult): 1.229571<br>time in c (matmult): 0.168296<br>time in julia (matmult): 0.505222<br><br><br> Third Benchmark: numerical quadrature<br><br>Here, we attempt to calculate the integral \int_{x=5}^{15} sin(x) dx by <br>numerical quadrature, using a simple midpoint rule with computations at <br>10^7 points.<br><br>% matlab<br>function val=numquad(lb,ub,npoints)<br> val = 0.0;<br> for x=lb:(ub-lb)/npoints:ub<br> val = val + sin(x)/npoints;<br> end<br>end<br><br>// C<br>double numquad(double lb,double ub,int npoints){<br> double val = 0.0;<br> int i;<br> for(i=0; i<=npoints; i++){<br> double x = lb + (ub-lb)*i/npoints;<br> val += sin(x)/npoints;<br> }<br> return(val);<br>}<br><br># julia<br>function numquad(lb,ub,npoints)<br> val = 0.0<br> for x=lb:(ub-lb)/npoints:ub<br> val += sin(x)/npoints<br> end<br> val<br>end<br><br>The timings are:<br><br>time in matlab (numquad): 0.446151<br>time in c (numquad): 0.201071<br>time in julia (numquad): 0.273863<br><br><br> Fourth Benchmark: Belief Propagation<br><br>Finally, I decided to try a little algorithm similar to what I actually <br>tend to implement for my research. 
Roughly speaking, Belief Propagation <br><http://en.wikipedia.org/wiki/Belief_propagation> is a repeated sequence <br>of matrix multiplications, followed by normalization.<br><br>% matlab<br>function x=beliefprop(A,x,N)<br> for i=1:N<br> x = A*x;<br> x = x/sum(x);<br> end<br>end<br><br>// C<br>void beliefprop(double A[25][25], double x[25], int N){<br> int i, n, j;<br> double x2[25];<br> for(n=0; n<N; n++){<br> for(i=0; i<25; i++){<br> x2[i]=0;<br> for(j=0; j<25; j++)<br><span class="Apple-tab-span" style="white-space:pre"> </span>x2[i] += A[i][j]*x[j];<br> }<br> for(i=0; i<25; i++)<br> x[i]=x2[i];<br> double mysum = 0;<br> for(i=0; i<25; i++)<br> mysum += x[i];<br> for(i=0; i<25; i++)<br> x[i] /= mysum;<br> }<br> return;<br>}<br><br># julia<br>function beliefprop(A,x,N)<br> for i=1:N<br> x = A*x;<br> x /= sum(x);<br> end<br> x<br>end<br><br>Here, I think we can agree that Matlab and Julia are clearer. (Please <br>don't make fun of me for hardcoding the 25 dimensions in C.) Using a <br>matrix package for C would probably improve clarity, but perhaps also <br>slow things down. The results are:<br><br>time in matlab (beliefprop): 0.627478<br>time in c (beliefprop): 0.073564<br>time in julia (beliefprop): 0.489665<br><br><br> Fifth Benchmark: BP in log-space<br><br>In practice, Belief Propagation is often implemented in log-space (to <br>help avoid numerical under/over-flow). 
To simulate an algorithm like <br>this, I tried changing the propagation to take an exponent before <br>multiplication, and a logarithm before storage.<br><br>% matlab<br>function x=beliefprop2(A,x,N)<br> for i=1:N<br> x = log(A*exp(x));<br> x = x - log(sum(exp(x)));<br> end<br>end<br><br>// C<br>void beliefprop2(double A[25][25], double x[25], int N){<br> int i, n, j;<br> double x2[25];<br> for(n=0; n<N; n++){<br> for(i=0; i<25; i++){<br> x2[i]=0;<br> for(j=0; j<25; j++)<br><span class="Apple-tab-span" style="white-space:pre"> </span>x2[i] += A[i][j]*exp(x[j]);<br> }<br> for(i=0; i<25; i++)<br> x[i]=log(x2[i]);<br> double mysum = 0;<br> for(i=0; i<25; i++)<br> mysum += exp(x[i]);<br> double mynorm = log(mysum);<br> for(i=0; i<25; i++)<br> x[i] -= mynorm;<br> }<br> return;<br>}<br><br># julia<br>function beliefprop2(A,x,N)<br> for i=1:N<br> x = log(A*exp(x));<br> x -= log(sum(exp(x)));<br> end<br> x<br>end<br><br>Life is too short to write C code like that when not necessary. But how <br>about the speed, you ask?<br><br>time in matlab (beliefprop2): 0.662761<br>time in c (beliefprop2): 0.646153<br>time in julia (beliefprop2): 0.615113<br><br><br> Sixth Benchmark: Markov Chain Monte Carlo<br><br>Here, I implement a simple Metropolis algorithm. 
For no particular <br>reason, I use the two-dimensional distribution:<br><br>p(x) \propto \exp(\sin(5 x_1) - x_1^2 - x_2^2)<br><br><http://justindomke.files.wordpress.com/2012/09/dist.jpg><br><br>% matlab<br>function mcmc(x,N)<br>f = @(x) exp(sin(x(1)*5) - x(1)^2 - x(2)^2);<br>p = f(x);<br>for n=1:N<br> x2 = x + .01*randn(size(x));<br> p2 = f(x2);<br> if rand < p2/p<br> x = x2;<br> p = p2;<br> end<br>end<br>end<br><br>// C<br>double f(double *x){<br> return exp(sin(x[0]*5) - x[0]*x[0] - x[1]*x[1]);<br>}<br>#define pi 3.141592653589793<br>void mcmc(double *x,int N){<br> double p = f(x);<br> int n;<br> double x2[2];<br> for(n=0; n<N; n++){<br> // run Box-Muller to get 2 normal random variables<br> double U1 = ((double)rand())/RAND_MAX;<br> double U2 = ((double)rand())/RAND_MAX;<br> double R1 = sqrt(-2*log(U1))*cos(2*pi*U2);<br> double R2 = sqrt(-2*log(U1))*sin(2*pi*U2);<br> x2[0] = x[0] + .01*R1;<br> x2[1] = x[1] + .01*R2;<br> double p2 = f(x2);<br> if(((double)rand())/RAND_MAX< p2/p){<br> x[0] = x2[0];<br> x[1] = x2[1];<br> p = p2;<br> }<br> }<br>}<br><br># julia<br>function mcmc(x,N)<br>f(x) = exp(sin(x[1]*5) - x[1]^2 - x[2]^2);<br>p = f(x);<br>for n=1:N<br> x2 = x + .01*randn(size(x));<br> p2 = f(x2);<br> if rand() < p2/p<br> x = x2;<br> p = p2;<br> end<br>end<br>end<br><br>Again, I think that C is far less clear than Matlab or Julia. The <br>timings are:<br><br>time in matlab (mcmc): 7.747716<br>time in c (mcmc): 0.646153<br>time in julia (mcmc): 0.613223<br><br><br> Table<br><br> Matlab C Julia<br>fib 14.344 0.005 0.281<br>matmult 1.229 0.168 0.505<br>numquad 0.446 0.201 0.273<br>bp 0.627 0.073 0.489<br>bp2 0.662 0.646 0.615<br>mcmc 7.747 0.646 0.613<br><br><br> Conclusions<br><br>I'm sure all these programs can be sped up. In particular, I'd bet that <br>an expert could optimize the C code to beat Julia on |bp2| and |mcmc|. <br>These are a test of "how fast can Justin Domke make these programs", not <br>the intrinsic capabilities of the languages. 
That said, Julia allows for <br>optional type declarations. I did experiment with these but found <br>absolutely no speed improvement. (Which is a good or a bad thing, <br>depending on how you look at life.)<br><br>Another surprise to me was how often Matlab's JIT managed a speed within <br>a reasonable factor of C. (Except when it didn't...)<br><br>The main thing that a Matlab programmer will miss in Julia is <br>undoubtedly plotting. The Julia designers seem to understand the <br>importance of this ("non-negotiable"). If Julia equalled Matlab's <br>plotting facilities, Matlab would be in real trouble!<br><br>Overall, I think that the killer features of freedom, kinda-sorta-C-like <br>speed, and ease of use make Julia more likely as a Matlab-killer than <br>other projects such as R, Sage, Octave, Scipy, etc. (Not to say that <br>those projects have not succeeded in other ways!) Though Julia's <br>designers also seem to be targeting current R users, my guess is that <br>they will have more success with Matlab folks in the short term, since <br>most Matlab functionality (other than plotting) already exists, while <br>reproducing R's statistical libraries will be quite difficult. I also <br>think that Julia would be very attractive to current users of languages <br>like Lush <http://lush.sourceforge.net/>. Just to never write another <br>.mex file, I'll very seriously consider Julia for new projects. Other <br>benefits such as macros and better parallelism support are just bonuses. 
As <br>Julia continues to develop, it will become yet more attractive.<br><br>There was an interesting discussion on Lambda the Ultimate <br><http://lambda-the-ultimate.org/node/4452> about Julia back when it was <br>announced.<br><br><br>------------------------------<br><br>_______________________________________________<br>Developers mailing list<br>Developers@admb-project.org<br>http://lists.admb-project.org/mailman/listinfo/developers<br><br><br>End of Developers Digest, Vol 43, Issue 5<br>*****************************************<br></div></blockquote></div><br></body></html>