Func-Oriented Code or Francis' Odd Collection.

`foc` is a no-frills and seamlessly integrated functional Python tool.
- provides a collection of higher-order functions and placeholder lambda syntax (`_`)
- provides an easy way to compose functions with symbols (`^` and `|`)
The collection of utilities contained in previous versions has been separated into a new project.
$ pip install -U foc
For more examples, see the documentation provided with each function.
>>> from foc import *
>>> (_ + 7)(3) # (lambda x: x + 7)(3)
10
>>> 3 | _ + 4 | _ * 6 # (3 + 4) * 6
42
>>> (length ^ range)(10) # length(range(10))
10
>>> cf_(rev, filter(even), range)(10) # rev(filter(even)(range(10)))
[8, 6, 4, 2, 0]
>>> ((_ * 5) ^ nth(3) ^ range)(5) # nth(3) is 1-indexed: range(5)[2] * 5
10
>>> cf_(sum, map(_ + 1), range)(10) # sum(map(_ + 1, range(10)))
55
>>> range(5) | map((_ * 3) ^ (_ + 2)) | sum # sum(map(lambda x: (x + 2) * 3, range(5)))
60
>>> range(73, 82) | map(chr) | unchars # unchars(map(chr, range(73, 82)))
'IJKLMNOPQ'
`fx` (Function eXtension) is the backbone of `foc` and provides a new syntax for composing functions. Technically, `fx` maps every function in Python to a monadic function in the `fx` monad. In fact, `fx` is a lift function, but here the functions generated by `fx` are also referred to as `fx`.
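Conceptually, `fx` is a thin wrapper that overloads the composition symbols. Below is a minimal, hypothetical sketch of that lifting idea; the name `fx_sketch` is made up and this is not foc's actual implementation.

# A minimal, hypothetical sketch of the 'lift' idea behind fx.
# Not foc's actual implementation.
class fx_sketch:
    def __init__(self, f):
        self.f = f

    def __call__(self, *args, **kwargs):
        return self.f(*args, **kwargs)

    def __xor__(self, g):  # (self ^ g)(x) == self(g(x)), right-to-left
        return fx_sketch(lambda *a, **k: self.f(g(*a, **k)))

    def __ror__(self, x):  # x | self == self(x), left-to-right
        return self.f(x)

>>> (fx_sketch(len) ^ range)(10)
10
>>> 3 | fx_sketch(lambda x: x + 4)
7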
There are two ways to compose functions with symbols as shown in the previous section.
| Symbol | Description | Evaluation Order |
|---|---|---|
| `^` (caret) | same as the mathematical composition symbol (`.`) | Right-to-Left |
| `\|` (pipeline) | in the Unix pipeline manner | Left-to-Right |
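For example, the same computation from the examples above reads in either direction:

>>> (unchars ^ map(chr))(range(73, 82))  # right-to-left with '^'
'IJKLMNOPQ'
>>> range(73, 82) | map(chr) | unchars   # left-to-right with '|'
'IJKLMNOPQ'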
If you don't like composing functions with symbols, use `cf_`. In fact, it's the most reliable and safest way to compose any functions.
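For example, the pipeline above written with `cf_` alone:

>>> cf_(unchars, map(chr))(range(73, 82))  # unchars(map(chr)(range(73, 82)))
'IJKLMNOPQ'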
`fx` is just a function decorated by `@fx`. Wrap any function in `fx` when you need function composition on the fly.
>>> [1, 2, 3] | sum | (lambda x: x * 7) # error, lambda is not a 'fx'
TypeError: unsupported operand ...
>>> [1, 2, 3] | sum | fx(lambda x: x * 7) # just wrap it in 'fx'.
42
>>> @fx
... def func(arg): # place @fx above the definition or bind 'g = fx(func)'
... ... # 'func' is now 'composable' with symbols
Most of the functions provided by `foc` are `fx` functions. If a function you need isn't one, you can simply create one and use it.
# map := map(function, iterable)
# currying 'map' -> map(function)(iterable)
>>> map(_ * 8)(seq(1,...)) | take(5) # seq(1,...) == [1,2,3,..], 'infinite' sequence
[8, 16, 24, 32, 40] # seq(1,3,...) == [1,3,5,..]
# seq(1,4,...,11) == [1,4,7,10]
# bimap := bimap(f, g, tuple)
# bimap(f, g) := first(f) ^ second(g) # map over both 'first' and 'second' argument
>>> bimap(_ + 3)(_ * 7)((5, 7))
(8, 49)
>>> (5, 7) | bimap(_ + 3)(_ * 7)
(8, 49)
>>> filterl(_ == "f")("fun-on-functions") # filterl == (filter | collect)
['f', 'f']
>>> foldl(op.sub)(10)(range(1, 5)) # (((10 - 1) - 2) - 3) - 4
0
@fx
def args(a, b, c, d):
    return f"{a}-{b}-{c}-{d}"
>>> args(1)(2)(3)(4) == args(1,2)(3,4) == args(1,2,3)(4) == args(1)(2,3,4) == args(1,2,3,4)
True
You can get the curried version of `g` with `fx(g)`. But if you want a plain curried function rather than an `fx`, use `curry(g)`.
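A quick sketch of the difference, using a throwaway function `g` (hypothetical, for illustration only):

>>> g = lambda a, b, c: (a + b) * c
>>> fx(g)(1)(2)(3)     # curried 'fx', composable with '^' and '|'
9
>>> curry(g)(1)(2)(3)  # curried, but a plain Python function
9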
>>> [1, 2, 3] | sum | (_ * 7) # Use '_' lambda instead.
42
>>> ((_ * 6) ^ (_ + 4))(3) # (3 + 4) * 6
42
>>> 2 | (_ * 7) | (60 % _) | (_ // 3) # (60 % (2 * 7)) // 3
1
Partial application driven by `_` is also possible when accessing a `dict`, an object, or an iterable, or even when calling functions. How about using `_(_)` as a curried function caller?
| Operator | Equivalent Function |
|---|---|
| `_[_]` | `op.getitem` |
| `_[item]` | `op.itemgetter(item)` |
| `_._` | `getattr` |
| `_.attr` | `op.attrgetter(attr)` |
| `_(_)` | `apply` |
| `_(*a, **k)` | `lambda f: f(*a, **k)` |
# dict
>>> d = dict(one=1, two=2, three="three")
>>> _[_](d)("two") # curry(lambda a, b: a[b])(d)("two")
2
>>> _["one"](d) # (lambda x: x["one"])(d)
1
>>> cf_(_[2:4], _["three"])(d) # d["three"][2:4]
're'
# iterable
>>> r = range(5)
>>> _[_](r)(3) # curry(lambda a, b: a[b])(r)(3)
3
>>> _[3](r) # (lambda x: x[3])(r)
3
# object
>>> o = type('', (), {"one": 1, "two": 2, "three": "three"})()
>>> _._(o)("two") # curry(lambda a, b: getattr(a, b))(o)("two")
2
>>> _.one(o) # (lambda x: x.one)(o)
1
>>> o | _.three | _[2:4] # o.three[2:4]
're'
# function caller
>>> _(_)(foldl)(op.add)(0)(range(5))
10
>>> _(7 * _)(mapl)(range(1, 10))
[7, 14, 21, 28, 35, 42, 49, 56, 63]
# Not serious, but this creates a multiplication table.
>>> [ mapl(f)(range(1, 10)) for f in _(_ * _)(map)(range(1, 10)) ]
To see all the functions provided by `foc`, run `catalog()`.
- `fx` pure basic functions like `id`, `const`, `take`, `drop`, `repeat`, `replicate`, ..
- higher-order functions like `f_`, `g_`, `curry`, `uncurry`, `flip`, `map`, `filter`, `zip`, ..
- function composition tools like `cf_`, `cfd_`, ..
- useful yet very fundamental functions like `seq`, `force`, `trap`, `error`, `guard`, ..
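A quick, hedged sketch using a few of these (assuming `drop` is curried the same way as `take` in the earlier examples):

>>> seq(1,...) | drop(3) | take(5)  # drop the first 3 of [1,2,3,..], then take 5
[4, 5, 6, 7, 8]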
The causal self-attention of a transformer model, based on PyTorch, can be described as follows. Somebody insists that this helps to follow the process flow without distraction (plus a 3-5% speed-up).
def forward(self, x):
    B, S, E = x.size()  # size_batch, size_block (sequence length), size_embed
    N, H = self.config.num_heads, E // self.config.num_heads  # E == (N * H)
    q, k, v = self.c_attn(x).split(self.config.size_embed, dim=2)
    q = q.view(B, S, N, H).transpose(1, 2)  # (B, N, S, H)
    k = k.view(B, S, N, H).transpose(1, 2)  # (B, N, S, H)
    v = v.view(B, S, N, H).transpose(1, 2)  # (B, N, S, H)
    # Attention(Q, K, V)
    #   = softmax( Q*K^T / sqrt(d_k) ) * V
    #     // q*k^T: (B, N, S, H) x (B, N, H, S) -> (B, N, S, S)
    #   = attention-prob-matrix * V
    #     // prob @ v: (B, N, S, S) x (B, N, S, H) -> (B, N, S, H)
    #   = attention-weighted value (attention score)
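    # NOTE: 'cf_' composes right-to-left, so read the pipeline below
    # bottom-up: it starts from 'q' (passed in at the end) and finishes
    # with the output dropout at the top.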
    return cf_(
        self.dropout,  # dropout of layer's output
        self.c_proj,  # linear projection
        g_(_.view)(B, S, E),  # (B, S, N, H) -> (B, S, E)
        torch.Tensor.contiguous,  # contiguous in-memory tensor
        g_(_.transpose)(1, 2),  # (B, S, N, H)
        _ @ v,  # (B, N, S, S) x (B, N, S, H) -> (B, N, S, H)
        self.dropout_attn,  # attention dropout
        f_(F.softmax, dim=-1),  # softmax
        g_(_.masked_fill)(mask == 0, float("-inf")),  # no-look-ahead
        _ / math.sqrt(k.size(-1)),  # / sqrt(d_k)
        _ @ k.transpose(-2, -1),  # Q @ K^T -> (B, N, S, S)
    )(q)