To a packed hall at University College London last month, a group of computer scientists announced the new version of Julia, a programming language that’s been gaining traction and is among the top 50 programming languages in the world. We’d covered the origin story of Julia a couple of years ago in an interview with Viral Shah, one of the co-founders of the computing language.
The August release of Julia is what programmers call a long-term stable release, which is a promise to the developer community that every program written on Julia 1.0 will also work on subsequent versions of Julia. This means the language can now move from largely experimental use to an enterprise-ready state.
“This makes it better suited for enterprise releases. You want long-term support for production enterprise use cases. Until now, while there were early customers, the majority of users were data scientists or one-off project users like researchers,” said Shah in an interview with FactorDaily. Edited excerpts from the interview:
For those who aren’t familiar with programming, what does this release mean?
We’ve just launched Julia 1.0. It was a nights-and-weekends project we started nine years ago, and we publicly announced it about six and a half years ago. Until now, we have had six major releases. This is the first major release that will be a long-term stable release. Until this, we would release a new version roughly every year, and it would make breaking changes. For instance, code written for Julia 0.3 wouldn’t work on Julia 0.5, because the language was evolving and changing. As a community, we now feel we can guarantee support. So even when we release future versions, we will continue to support 1.0 for the long term. This makes it better suited for enterprise releases. You want long-term support for production enterprise use cases. Until now, while there were early customers, the majority of users were data scientists or one-off project users like researchers. If I’m a company putting Julia into fraud detection, I’d want long-term stability.
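To make the stability guarantee concrete, here is a minimal sketch of the kind of breaking change Shah describes. It uses `linspace`, a function that existed in the 0.x series but was removed in Julia 1.0 in favour of `range`; the example and its function choice are my own illustration, not from the interview.

```julia
# Pre-1.0 Julia allowed this, but it errors on Julia 1.0:
# xs = linspace(0, 1, 5)             # worked in Julia 0.x, removed in 1.0

# The Julia 1.0 equivalent, which the long-term stability promise
# guarantees will keep working on later 1.x releases:
xs = range(0, stop=1, length=5)
collect(xs)                          # [0.0, 0.25, 0.5, 0.75, 1.0]
```

Under the 1.0 promise, code like the second form no longer needs a yearly rewrite when a new Julia version ships.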
Julia has also had a lot of contributors from India…
This is one of the few projects that has a massive India contribution from an open source point of view. We have at least another 20 major contributors from India. Some of the people who have contributed to it big time are Ranjan Anantharaman, Shashi Gowda, Amit Murthy, Tanmay Mohapatra. There’s another person who goes by the name Svaksha who keeps a list of all Julia packages. Half of our Google Summer of Code students come from India.
What are some of the projects where Julia has been used?
One of the most exciting use cases has been with the Federal Aviation Administration (FAA). They are using Julia to write their collision avoidance system. They decided to use Julia back when it was at version 0.3. Such a mission-critical system is a huge deal for us. It’s a marquee application that shows the real power of the language.
In finance, it has been used by Aviva, the insurance company in the UK. They use Julia for regulatory safeguards: they have to figure out whether they are holding enough assets to cover all their liabilities in the long term, and they implement all these safeguards and regulatory reporting in production. Use cases will grow 10- or 100-fold now that we have announced 1.0.
People have even put Julia on the Raspberry Pi, which enables all kinds of interesting projects. At JuliaCon, we had a talk about how to control Minecraft using Julia. We also ran Julia for an astronomy application on a supercomputer at 1.5 petaflops to map out the entire universe; that was to analyse 60 terabytes of data. We never imagined we’d be on both ends. Julia has also been ported to run on the GPU, so apart from CUDA it’s the only language for AI that runs on GPUs.
How do you think programming languages will evolve?
We started Julia not just to create a programming language but to have a real impact on the world around us. Julia was used to digitally catalogue the universe in a project called Celeste. The project was computationally very intense and used a large supercomputer, all in pure Julia. This is the kind of thing programming languages will be used for more and more. That telescope generated 60 terabytes of data over the last 15 years; the next telescope is going to generate that much data every three days. So this is the challenge we are going to face.
How big is the Julia community now?
We are now at over two million downloads. From some of the data at Julia Computing, we’ve seen downloads from 700 universities and over 1,000 companies. The actual community is larger than that. There’s also a huge surge of interest from China. Until recently, India used to show up in our top five for downloads; India has started falling now, and others have started scaling up. Julia is also taught in over 100 classes. JuliaCon drew 350 people this year, and it was held in Europe for the first time.
What are the major milestones to look forward to?
We want to keep making the compiler faster and improve the tooling around it. So far, the focus was on the language; now we’ll focus on integrated development environments (IDEs). We want to add multi-threading. A rudimentary version of multi-threading already works in Julia, but the usual challenge is to keep programs from clashing over resources. If you do it right, all programs can be written multi-threaded and the system decides how resources are allocated. So we will work on nested and composable multi-threading. It’s something we don’t have yet. That will take about one to two years or more, and then there are always changing priorities.
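The rudimentary multi-threading Shah mentions can be sketched with `Threads.@threads`, which was available in Julia 1.0. This is a minimal illustration, not code from the interview; the function name is my own, and Julia must be started with multiple threads (e.g. `JULIA_NUM_THREADS=4`) for it to actually parallelise. One per-thread accumulator is used so the loop iterations don’t clash over a shared variable — the resource-contention problem he describes.

```julia
using Base.Threads

# Sum the squares 1^2 + 2^2 + ... + n^2 across the available threads.
function threaded_sum_of_squares(n)
    partials = zeros(Int, nthreads())   # one accumulator per thread: no races
    @threads for i in 1:n               # iterations are split across threads
        partials[threadid()] += i^2
    end
    return sum(partials)                # combine the per-thread results
end

threaded_sum_of_squares(10)             # 385
```

The nested, composable threading he describes as future work would let such loops spawn further threaded work inside them, with the runtime deciding how resources are shared.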
How are you funding Julia?
There are three pillars of Julia contribution. There are over 700 open-source contributors; that’s the largest contribution base. Then there’s a research arm of Julia at MIT (Alan Edelman’s Julia Lab), funded by government and research grants. And Julia Computing is the commercial arm, which runs on a combination of contracts, investor funding and product revenue.
Viral was also featured in our Outliers podcast. Listen in.