In the second of a two-part series, we continue to explore how we can import ideas from the functional programming methodology into Python to get the best of both worlds.
In the previous post, we covered immutable data structures. Those allow us to write "pure" functions, or functions that have no side effects, merely accepting some arguments and returning a result, while maintaining decent performance.
In this post, we build on that using the
toolz library. This library has functions that manipulate other functions, and they work particularly well with pure functions. In the functional programming world, these are often called "higher-order functions" since they take functions as arguments and return functions as results.
Let's start with this:
def add_one_word(words, word):
    return words.set(word, words.get(word, 0) + 1)
This function assumes that its first argument is an immutable dict-like object, and it returns a new dict-like object with the relevant entry incremented: it's a simple frequency counter.
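To see it in action, here is a quick sketch (assuming the pyrsistent library from the previous post is installed):

```python
import pyrsistent

def add_one_word(words, word):
    # Return a new mapping with this word's count bumped by one.
    return words.set(word, words.get(word, 0) + 1)

counts = add_one_word(pyrsistent.m(), "hello")
counts = add_one_word(counts, "hello")
counts = add_one_word(counts, "world")
print(counts["hello"], counts["world"])  # 2 1
```

Each call leaves its input untouched and returns a fresh mapping, which is what makes the function pure.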
However, it is useful only if we apply it to a stream of words and reduce. We have access to a reducer in the built-in functools module:
functools.reduce(function, stream, initializer).
We want a function that, applied to a stream, will return a frequency count.
We start by using
add_all_words = curry(functools.reduce, add_one_word)
With this version, we will need to supply the initializer. However, we can't just add
pyrsistent.m to the
curry; it would be in the wrong order.
add_all_words_flipped = flip(add_all_words)
The
flip higher-order function returns a function that calls the original with its arguments flipped.
get_all_words = add_all_words_flipped(pyrsistent.m())
We take advantage of the fact that
flip auto-curries its argument, to give it a starting value: an empty dictionary.
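Putting the pieces so far together, a self-contained sketch (using curry and flip from toolz, and the names from the text):

```python
import functools
import pyrsistent
from toolz import curry, flip

def add_one_word(words, word):
    return words.set(word, words.get(word, 0) + 1)

# reduce, with the reducing function already supplied
add_all_words = curry(functools.reduce, add_one_word)
# flip swaps the remaining (stream, initializer) arguments...
add_all_words_flipped = flip(add_all_words)
# ...so we can fix the initializer first: an empty immutable map
get_all_words = add_all_words_flipped(pyrsistent.m())

print(get_all_words(["spam", "eggs", "spam"]))
```

Because flip auto-curries, passing it only the initializer returns a function still waiting for the stream.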
Now we can call
get_all_words(word_stream) and get a frequency dictionary. However, how do we get a word stream? Python files are streams of lines.
def to_words(lines):
    for line in lines:
        yield from line.split()
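A quick check of the generator (redefined here so the snippet stands alone):

```python
def to_words(lines):
    # Flatten a stream of lines into a stream of words.
    for line in lines:
        yield from line.split()

print(list(to_words(["the quick brown", "fox"])))  # ['the', 'quick', 'brown', 'fox']
```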
After testing each function on its own, we can combine them:
words_from_file = toolz.compose(get_all_words, to_words)
In this case, the composition, being of just two functions, was easy to read: apply
to_words first, then apply
get_all_words to the result. The prose, it seems, is the inverse of the code.
This matters when we start taking composability seriously. It is sometimes possible to write the code as a sequence of units, test each one individually, and finally, compose them all. If there are several elements, the ordering of
compose can get hard to follow.
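For example, with three hypothetical steps, compose applies them right to left, which is the opposite of reading order:

```python
import toolz

def double(x):
    return x * 2

def increment(x):
    return x + 1

def stringify(x):
    return str(x)

# Reads right to left: double runs first, then increment, then stringify.
f = toolz.compose(stringify, increment, double)
print(f(3))  # '7'
```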
The
toolz library borrows from the Unix command line and offers
pipe, a function that threads a value through a sequence of functions, applied in the order they are listed:
all_words = toolz.pipe(lines, to_words, get_all_words)
Now it reads more intuitively: pipe the lines into
to_words, and pipe the results into
get_all_words. On a command line, the equivalent would look like this:
$ cat data | to_words | get_all_words
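Here is the whole pipeline end to end, as a self-contained sketch (with a list of strings standing in for a file's lines):

```python
import functools
import pyrsistent
import toolz

def add_one_word(words, word):
    return words.set(word, words.get(word, 0) + 1)

def to_words(lines):
    for line in lines:
        yield from line.split()

def get_all_words(words):
    return functools.reduce(add_one_word, words, pyrsistent.m())

lines = ["the quick brown fox", "jumps over the lazy dog"]
counts = toolz.pipe(lines, to_words, get_all_words)
print(counts["the"])  # 2
```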
The
toolz library lets us manipulate functions, slicing, dicing, and composing them, to make our code easier to understand and to test.