Adventures in Julia
by Willie Wong
Recently I have been playing around with the Julia programming language as a way to run cheap simulations for some geometric analysis problems I am working on. So far the experience has been awesome.
A few random things …
Julia has a decent IDE in JunoLab, which is built on top of Atom. In terms of functionality it captures most of what I used to use Spyder for in Python, so it is very convenient.
Julia interfaces with Jupyter notebooks through the IJulia kernel. I am a fan of Jupyter (I will be using it with the MATLAB kernel for a class I am teaching this fall).
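For the curious, getting the IJulia kernel running is a one-liner from the Julia REPL. This is a minimal sketch using the package-manager syntax of the Julia 0.4 era (the period of this post); newer Julia versions use a different `Pkg` interface:

```julia
# Install the IJulia kernel and launch the Jupyter notebook interface.
Pkg.add("IJulia")
using IJulia
notebook()    # opens Jupyter in the browser, with Julia available as a kernel
```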
For plotting, right now one of the most convenient ways is through Plots.jl, a plotting front-end that bridges between your code and various backends, which can be swapped in and out almost on the fly. The actual plotting is powered by engines like matplotlib or PlotlyJS, but for the most part you can ignore the backend. This drastically simplifies the production of visualizations. (At least compared to what I remember from my previous simulations in Python.)
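To illustrate the backend swapping, here is a small sketch: the same `plot` call is rendered first by the matplotlib-powered backend and then by the Plotly-powered one, using the standard Plots.jl backend-selector functions:

```julia
using Plots

pyplot()                          # select the matplotlib (PyPlot) backend
plot(sin, 0, 2pi, label="sin")    # render with matplotlib

plotly()                          # swap to the Plotly backend on the fly
plot(sin, 0, 2pi, label="sin")    # the identical call, now rendered by Plotly
```

The plotting code itself never changes; only the one-line backend selector does.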
I just learned very recently about automatic differentiation. At some cost in running time, it can greatly simplify the writing of my scripts. For example, we can have a black-box root finder using Newton iteration that does not require pre-computing the Jacobian by hand:
module NewtonIteration

using ForwardDiff
export RootFind

function RootFind(f, init_guess::Vector, accuracy::Float64, cache::ForwardDiffCache;
                  method="Newton", max_iters=100, chunk_size=0)
    ### Takes an input function f(x::Vector) → y::Vector of the same dimension and
    ### an initial guess init_guess. Apply Newton iteration to find a solution of
    ### f(x) = 0. Stop when accuracy is better than prescribed, or when max_iters
    ### is reached, at which point a warning is raised.
    ### Setting chunk_size=0 deactivates chunking. But for large dimensional
    ### functions, chunk_size=5 or 10 improves performance drastically. Note that
    ### chunk_size must evenly divide the dimension of the input vector.
    ### Available methods are Newton or Chord.

    # First check if we are already within the accuracy bounds
    error_term = f(init_guess)
    if norm(error_term) < accuracy
        info("Initial guess accurate.")
        return init_guess
    end

    # Different solution methods
    i = 1
    current_guess = init_guess
    if method == "Chord"
        # Chord method: compute the Jacobian once at the initial guess and reuse it
        df = jacobian(f, current_guess, chunk_size=chunk_size)
        while norm(error_term) >= accuracy && i <= max_iters
            current_guess -= df \ error_term
            error_term = f(current_guess)
            i += 1
        end
    elseif method == "Newton"
        # Newton method: recompute the Jacobian at every step, reusing the cache
        jake = jacobian(f, ForwardDiff.AllResults, chunk_size=chunk_size, cache=cache)
        df, lower_order = jake(init_guess)
        while norm(value(lower_order)) >= accuracy && i <= max_iters
            current_guess -= df \ value(lower_order)
            df, lower_order = jake(current_guess)
            i += 1
        end
        error_term = value(lower_order)
    else
        warn("Unknown method: ", method, ", returning initial guess.")
        return init_guess
    end

    # Check if converged
    if norm(error_term) >= accuracy
        warn("Did not converge, check initial guess or try increasing max_iters (currently: ", max_iters, ").")
    end
    info("Used ", i, " iterations; remaining error=", norm(error_term))
    return current_guess
end

end
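A minimal usage sketch, assuming the ForwardDiff 0.1-era API used in the module above (where the cache object is constructed with `ForwardDiffCache()`); the example system and its initial guess are my own illustration, not from the module:

```julia
using NewtonIteration
using ForwardDiff

# Intersect the circle x^2 + y^2 = 4 with the line x = y:
# the roots are at (±√2, ±√2).
f(x) = [x[1]^2 + x[2]^2 - 4, x[1] - x[2]]

cache = ForwardDiff.ForwardDiffCache()
root = RootFind(f, [1.0, 1.0], 1e-10, cache)
# root should be approximately [√2, √2] if the iteration converges
```

Note that nowhere did we write down the Jacobian of f; automatic differentiation supplies it.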
This can then be wrapped in finite difference code for solving nonlinear PDEs!
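As a sketch of what that wrapping might look like (my own illustration, again assuming the 0.1-era ForwardDiff API): discretize a nonlinear two-point boundary value problem, here the Bratu-type equation u'' + exp(u) = 0 on [0,1] with u(0) = u(1) = 0, and hand the residual of the finite-difference system to RootFind:

```julia
using NewtonIteration
using ForwardDiff

N = 20            # number of interior grid points
h = 1.0 / (N + 1) # grid spacing

# Residual of the centered finite-difference discretization:
# (u[j-1] - 2u[j] + u[j+1]) / h^2 + exp(u[j]) = 0 at each interior node.
function residual(u::Vector)
    v = [0.0; u; 0.0]   # pad with the homogeneous boundary values
    [(v[j-1] - 2v[j] + v[j+1]) / h^2 + exp(v[j]) for j in 2:N+1]
end

cache = ForwardDiff.ForwardDiffCache()
u = RootFind(residual, zeros(N), 1e-8, cache)
```

The Jacobian of the discretized system, a tridiagonal-plus-diagonal matrix that one would normally derive by hand, is again produced automatically.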