for Can Do More

Image by Annie Ruygt

We’re Fly.io. We run apps for our users on hardware we host around the world. Fly.io happens to be a great place to run Phoenix applications. Check out how to get started!

Elixir has a for special form called a “list comprehension” and not enough people know what it can do. It can do so much more than a simple for loop; it’s kind of like if the Enum and Stream modules got together and had a macro. And I want you to know about it!

Here is a standard Elixir example: reading a file, splitting it into lines, filtering out the empty ones, mapping each line into a key/value pair, and collecting it all into a map:

File.read!("my_lines.txt") # key=val\n
|> String.split("\n")
|> Enum.filter(fn str -> str != "" end)
|> Enum.map(fn line ->
  [key, val] = String.split(line, "=", trim: true)
  {key, val}
end)
|> Enum.into(%{})

This is standard Elixir, clearly breaking each step into a pipeline flow, doing what you expect. One downside is that each step of this pipeline creates a new copy of the list with the changes applied. In most cases this is totally cool; the Erlang VM will garbage collect it with ease.

But if my_lines.txt ends up being a huge file, we might have a problem with memory. So next we convert the pipeline to a Stream:

File.stream!("my_lines.txt") # key=val\n
|> Stream.map(&String.trim/1) # streamed lines keep their trailing newline
|> Stream.filter(fn str -> str != "" end)
|> Stream.map(fn line ->
  [key, val] = String.split(line, "=", trim: true)
  {key, val}
end)
|> Enum.into(%{})

In programming parlance a Stream is what we call a “lazy” container, meaning that until we call Enum.into(%{}) at the end, none of the work is executed. The Enum functions are eager and execute immediately. A Stream is simply a struct that builds up a list of operations to apply; when executed, it iterates once over its input, which can be a list, a Range, any Enumerable, or a function that creates a stream.
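To see the laziness in action, here’s a tiny sketch (with a made-up side effect, not from the example above) showing that nothing runs until the stream is forced:

```elixir
# Nothing is printed when we build the stream; `doubled` is just a
# %Stream{} struct describing work to be done later.
doubled =
  [1, 2, 3]
  |> Stream.map(fn x ->
    IO.puts("doubling #{x}") # the side effect reveals when work happens
    x * 2
  end)

# Only now, when an eager Enum function forces it, do the prints appear:
Enum.to_list(doubled)
# => [2, 4, 6]
```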

If we were to reimplement using for:

for line <- File.stream!("my_lines.txt"), line = String.trim(line), line != "", into: %{} do
  [key, val] = String.split(line, "=", trim: true)
  {key, val}
end

The first line accomplishes most of the work in this code, so let’s break it down:

  • line <- File.stream!(..) is the source of the iteration; you can tell because it has that funky left-pointing arrow. In our case it streams the file. for is eager, so it will fully execute the stream, just like Enum does.
  • The clauses after the generator, like line != "", act as filters; together they’re equivalent to our Enum.filter above. Any expression can appear here, and bindings it creates carry into the body.
  • The final option, into: %{}, lets us use the Collectable protocol to roll the result into a map. It’s functionally equivalent to Enum.into above.
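To play with these pieces without touching the filesystem, here’s the same shape over an in-memory string (a made-up input, not a real file):

```elixir
input = "a=1\n\nb=2\n"

# Generator, filter, and into: in one comprehension.
result =
  for line <- String.split(input, "\n"), line != "", into: %{} do
    [key, val] = String.split(line, "=", trim: true)
    {key, val}
  end

# result is %{"a" => "1", "b" => "2"}
```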

The beauty of this statement is that, like the Stream version, it makes a single pass over the data, and we don’t need to mix and match Enum/Stream functions. We can even use a Stream as the source! One major downside is clarity: that first line is dense and can be less clear than our pipeline of operations.

Let’s reduce

We could also have used the reduce option here instead of into, like so:

for line <- File.stream!("my_lines.txt"), line = String.trim(line), line != "", reduce: %{} do
  acc ->
    [key, val] = String.split(line, "=", trim: true)
    Map.put(acc, key, val)
end

The key change is that inside the do block we need an acc -> clause: a right-hand arrow that names the accumulator, with the body returning the new accumulator. This example isn’t an amazing use of reduce, but it gives us fine-grained control over how the accumulator is updated.
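Where reduce earns its keep is when updating the accumulator takes more than a plain insert. A quick sketch with made-up data: counting occurrences, which into: can’t express:

```elixir
# Map.update/4 inserts 1 for a new word, or bumps the existing count.
counts =
  for word <- ~w(red blue red green blue red), reduce: %{} do
    acc -> Map.update(acc, word, 1, &(&1 + 1))
  end

# counts is %{"blue" => 2, "green" => 1, "red" => 3}
```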

More?

We can also use multiple generators, like so:

for x <- [1, 2], y <- [2, 3] do
  x * y
end
# [2, 3, 4, 6]

This one is straight from the Elixir docs, which is a testament to how infrequently I use this feature. I suspect it might be handy when doing code challenges or to implement an advanced FizzBuzz! Let me know if you have a great example!
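Multiple generators also compose with filters, which makes short work of things like “every distinct pair” (a small invented example):

```elixir
# The filter x < y keeps each unordered pair exactly once.
pairs = for x <- 1..3, y <- 1..3, x < y, do: {x, y}
# pairs is [{1, 2}, {1, 3}, {2, 3}]
```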

Discussion

The for special form is an invaluable tool for iteration and should not be overlooked! That said, it’s also a bit of a sharp blade, able to produce truly unreadable lines of code for future coders while making you feel like a genius.

Bonus: Recursion

Elixir is also superb at recursion; in fact, the Erlang community uses it much more often than we do, though it does require rethinking how we write things. So here it is:

def parse(), do: parse(String.split(File.read!("my_lines.txt"), "\n"), %{})
def parse([], acc), do: acc # End condition
def parse(["" | rest], acc), do: parse(rest, acc) # Skip empty lines
def parse([line | rest], acc) do # Default case
  [key, val] = String.split(line, "=", trim: true)
  parse(rest, Map.put(acc, key, val))
end

We lean pretty hard on pattern matching here, and the best way to read this is from top to bottom:

  • Read the file into a list of lines and set up an empty map as our accumulator.
  • If the list is empty, we’re done, and we return the accumulator.
  • If the line is empty, skip it and continue parsing. (We could also have used a guard here.)
  • Finally, parse the line, add it to the accumulator, and continue parsing.

This is called “tail recursion” because the recursive call happens at the “tail,” or end, of the function. The VM optimizes this into what is effectively a loop, so the stack doesn’t grow without bound. You might be wondering why the Erlang community prefers this style:

  • For one thing it’s cultural: people learn it that way, write it that way, and teach it that way, as with any norm.
  • It gives fine-grained control over what gets allocated and over how you filter/map/reduce; you don’t need any of those concepts, you just write code that does the thing.
  • Finally, Erlang doesn’t have do/end or a def keyword, so multi-clause function heads are less verbose there.
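To make the tail-call shape concrete outside of parsing, here’s the classic minimal example, a sum where the accumulator rides along so the recursive call is the very last thing each clause does:

```elixir
defmodule TailSum do
  def sum(list), do: sum(list, 0)

  # The recursive call is in tail position, so the VM reuses the
  # current stack frame: effectively a loop, however long the list is.
  defp sum([], acc), do: acc
  defp sum([head | tail], acc), do: sum(tail, acc + head)
end

TailSum.sum([1, 2, 3, 4])
# => 10
```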

Fly.io ❤️ Elixir

Fly.io is a great way to run your Phoenix LiveView app close to your users. It’s really easy to get started. You can be running in minutes.

Deploy a Phoenix app today!